There’s this new public transport stop near the Lychakiv cemetery, named ‘Memorial of the Heroes of Ukraine.’ Buried there are the people who protected the country and paid the highest price: their lives.
There is a sea of flags. It’s very emotional; it’s difficult even to drive by. I get carried away each and every time I’m near that place. It’s one of those cases when a photo or video is just so different from the reality. It’s very emotional to see this sea of flags. This sea of dead boys and girls, who fought tooth and nail to give us this opportunity to live, even the slightest bit longer than them.
I saw a girl with flowers, clearly visiting someone there. She sat near us on the tram and was texting with someone.
That’s when this thought visited me.
One day (and this is already a reality) there might be this Black Mirror-ish product, an AI chatbot. It would imitate the person who was killed.
Or a person who died, not necessarily killed. Not even necessarily suddenly.
Certainly, it’s possible with the various AI chatbots we already have: ChatGPT, Gemini, DeepSeek, maybe Llama, and others too.
But what I mean here is that it could be a separate app for the grieving person. Or just a separate chat (or bot) in your favourite messenger, or even email. Or FaceTime, or a similar app, with audio or even video calls.
It’s like having a distant friend who’s difficult to catch in person, but you keep in touch online. You might ask them what they’re up to, send them something, and the chatbot wouldn’t be a know-it-all, help-them-all assistant. It wouldn’t imitate what current AI helpers do; it would imitate the person. It would have its own way of learning the person thoroughly. Many people could contribute their experience of that person, helping to build this digital persona of theirs. Theoretically, this process could start long before the death, sudden or not. The person themselves could start training the bot, which could simply optimise communication, like some (fancy or not) auto-complete system does these days. The bot could analyse the other content I produce: my YouTube blog, various text posts; it could listen to my conversations…
Privacy nightmare, I know. But that’s not the point.
This isn’t AI as we know it today. It’s just someone you could talk to, with the personal qualities of the (now dead) person. It could continue living. Ideally, it could evolve over time, reconsider some beliefs, as we people do, and change its behaviour over a prolonged period of time.
However, I believe this thing would make things even worse for us, the living. That’s why I think of it as Black Mirror-ish.