Artificial intelligence has been advancing at breakneck speed, but some aspects of our lives remained untouched by robots and machines. Everyone knows that one of the hardest things to come to terms with is the death of a loved one. People have long wished they could bring the deceased back. This is exactly what they mean when they say “careful what you wish for.”
Apparently, Amazon’s Alexa will soon be able to mimic the voices of dead people. If this sounds morbid and insane to you, you’re not alone. Tech companies have, by now, monetized most aspects of modern life. Some people argue that monetizing people’s loss and grief is a step too far.
What we know so far
At the company’s re:MARS conference, Amazon representatives shared information about a new feature capable of synthesizing short audio clips and turning them into longer speech. In one demonstration, a video segment was shown in which a child asks Alexa if his grandma can finish reading The Wizard of Oz to him, and Alexa swaps into the voice of the grandmother in a manner so realistic that it is eerie.
This development comes in response to the COVID-19 pandemic, during which many people lost loved ones, at times prematurely.
Rohit Prasad, Senior Vice President and Head Scientist for Alexa, shared:
“This required inventions where we had to learn to produce a high-quality voice with less than a minute of recording versus hours of recording in the studio. The way we made it happen is by framing the problem as a voice conversion task and not a speech generation path. We are unquestionably living in the golden era of AI, where our dreams and science fiction are becoming a reality.”
Of course, this kind of technology poses many moral and ethical questions about capitalizing on grief to create a profitable service that promises some emotional reprieve from severe loss. Another concern is the AI-generated voice being used while a person is still alive, making it easier to impersonate someone and fake their consent.
Though this news may strike some (myself included) as morbid, it gives others hope that they could hold onto some part of their lost loved ones.
Photo via Elaine Thompson/AP