Grief once meant silence, photographs, memories preserved in old letters. Now some people hear their lost relatives speak again – through artificial intelligence. Voices cloned from voice notes. Avatars built from chat histories. It's comfort, yes. But it's also difficult territory.
What's going on
Diego Felix dos Santos, 39, was left with something simple after his father's death: his voice. A voice note from the hospital survived, and he sent that sample to ElevenLabs, a voice-cloning service, to generate new messages in his father's voice.
Now he hears greetings like “Hi son, how are you?” – familiar in tone, new in content, in a voice he thought he had lost forever.
Companies such as StoryFile, HereAfter AI, Eternos and others have entered the space. They offer “grief tech”: services that let you create avatars, voice clones, and digital twins of relatives who have passed away. For some it is healing. For others it raises eyebrows.
Sweet spot and sharp edges
People who have used these tools often say they don't replace mourning – they add something to it. Something delicate.
Anett Bommer, whose husband used Eternos before his death, calls it “part of my life now” – a project he built for his family. She didn't turn to the avatar during the hardest period of grief, but later it became something valuable.
But experts warn: this comfort is not without costs. What about consent (especially posthumous consent)?
And what about emotional dependence – could someone get stuck in mourning, clinging to these digital echoes? And then there is the mess of data privacy. Who owns the voice? Can it be used later?
Researchers at the University of Cambridge call for ongoing consent, transparency, and safeguards for sensitive data. The capabilities are evolving quickly, but law and emotional readiness may lag behind.
Why it matters – more than you think
This isn't science fiction. It's real life. Here are some of the bigger ripple effects:
- Mental health and grief: Therapists are cautious. AI voice clones can help some people find closure, but for others they risk delaying acceptance or complicating the natural grieving process.
- Ethical precedents: If digital doubles become more common, societies will need clear frameworks. How do you define consent before death? What about rights over a person's voice or likeness after they're gone?
- Regulation and commercial vs. personal use: Companies charge subscriptions and sell “legacy accounts” – fine if handled carefully. But commercialization can push toward cutting corners: looser consent, data leaks, voices reused without proper oversight.
- Cultural, religious, and personal variety: Not everyone accepts voice clones or avatars. For some, relics, rituals, and faith carry the weight of memory. For others, this technology opens new paths to healing. There is no one universal answer.
What to watch for, what to ask yourself
Before you try something like this (if you ever consider it after a loss), here are some questions worth asking:
| Question | Why it matters |
| --- | --- |
| Did the deceased consent before death (to voice recordings, use of their likeness)? | It affects the legality of, and the moral right to, creating an AI version. |
| Can you control what happens to their digital data later? | It ensures the voice or likeness isn't misused commercially or manipulated. |
| Is there a plan to “turn off” or retire the avatar if needed? | This relates to the problem of emotional dependence. |
| How might this affect your grieving process over time? | It can help some people, but stall emotional acceptance for others. |
My opinion
I can feel what these tools offer: relief, closeness, something to hold on to. Grief is brutal, unpredictable. When something gives you “one more chance” to connect – even virtually – there is something sacred about it.
But I'm worried too. There is a thin line between comfort and illusion. Between remembering and postponing farewell. Between a tool and a crutch.
As this technology grows, it needs guardrails: ethical design, honest marketing, clear user education. Because nothing should exploit grief for profit or promise more than it can deliver.
This feels like the beginning of a deeper conversation – about loss, legacy, and what presence means when someone is gone. AI voices are not ghosts. They are echoes. Use them wisely.