AI is finding its way into every corner of our lives—including how we grieve.

So-called “griefbots” are digital recreations of people who’ve passed away, built from their messages, voice notes, and online data. You can talk to them, text them, even hear their voice. It sounds comforting, and sometimes it is. However, it also raises a hard question: are these bots helping us process grief, or just letting us avoid it? Here are some things to keep in mind as this strange new form of mourning becomes more real than most of us expected.
1. They can feel comforting, but that feeling is easy to get stuck in.

When you’re in pain, hearing a familiar voice again can be soothing. It gives you something to hold onto. In the early days after a loss, that might be exactly what someone needs to get through the shock and numbness.
The thing is, comfort isn’t the same as healing. Talking to a griefbot might help you feel close, but it can also keep you from facing what actually happened. The person is gone, but the illusion that they’re still here can quietly delay the reckoning that grief demands.
2. They offer connection, but it’s not real closure.

Grieving takes time, and part of that process involves learning how to live without the person you lost. Griefbots can make it easier to bypass that pain by offering simulated conversations that feel familiar but go nowhere emotionally. Instead of helping you move through it, they might keep you circling the same emotional block. It’s not memory; it’s mimicry. That might feel safe, but it can also stall real progress.
3. They can confuse the brain’s ability to adapt.

When we lose someone, our minds start to adjust to that absence, slowly and painfully. But if you’re still interacting with their voice or texting their digital double, it sends mixed signals. You might logically know they’re gone, but your emotions don’t get the memo. This makes it harder to move forward. Your brain doesn’t have the chance to fully process the loss, and that can leave you stuck in a weird in-between space where nothing quite feels real.
4. They blur the line between remembering and re-creating.

There’s a big difference between honouring someone’s memory and trying to bring them back. A griefbot doesn’t just remind you of them; it creates new, artificial moments. Things they never said. Conversations you never had. It feels like connection, but it’s not. As time goes on, it gets harder to separate what’s real from what’s made up. That can subtly change how you remember the person you loved.
5. They’re built from data, not from depth.

Griefbots are made from what someone left behind: texts, voice clips, old posts. That can capture tone, habits, maybe even their sense of humour. But it can’t capture their contradictions, their growth, or the things they never put into words. You’re not talking to them; you’re talking to an algorithm trained to sound like them. It’s a clever imitation, not a person. Deep down, you’ll probably always feel the difference.
6. They can create emotional dependence.

If a griefbot becomes your go-to for comfort, it’s easy to rely on it, especially if you’re struggling to connect with other people or to move through your grief. Over time, that comfort can turn into emotional avoidance. Instead of adjusting to life without that person, you might end up clinging to a version of them that’s not really them. It keeps you in a loop, one that can feel safe but slowly isolates you from real healing and connection.
7. They raise serious questions about consent.

Would your loved one have wanted to be turned into a bot? Did they ever agree to that? Also, who gets to decide how their digital self shows up—what they say, how they respond, what parts of their personality are used? These are messy, emotional questions. Grief makes people want to hold on. However, there’s a difference between honouring someone’s memory and recreating their voice without their permission.
8. They change how we think about death.

Death used to be final. It hurt, but it had boundaries. Griefbots eat away at that. If someone can still “talk” to you after they’ve died, even through AI, it makes it harder to let go. It also changes what we expect from grief itself. Instead of finding closure, we might start chasing a connection we’re not meant to have. That doesn’t make us wrong, but it does complicate how we heal, and what it even means to “move on.”
9. They treat grief like a tech problem, not a human one.

Grief isn’t something to fix; it’s something to move through. It’s messy, unpredictable, and deeply personal. A bot might help manage some of the loneliness or shock, but it can’t sit with you in the real work of mourning. The danger is in thinking there’s a shortcut, that if we just build smarter tools, we won’t have to feel the pain. But that pain has a purpose. It reshapes us. Avoiding it might feel good in the moment, but it comes at a cost.
10. They force us to rethink how we remember people.

At their core, griefbots challenge how we carry someone forward. Is it through simulation? Or through meaning—how we live, love, and show up in ways that reflect the person we lost?
Talking to a bot might feel easier than sitting with silence. But memory is deeper than dialogue. It’s in the values, stories, and moments that shaped you. Honouring someone doesn’t mean digitally recreating them. It means carrying them with you, in ways a machine never could.