Alexa can mimic the voices of the dead


Alexa’s new skill is perhaps its most unsettling yet: the ability to imitate anyone’s voice after listening to about a minute of sample audio. To demonstrate the feature, Rohit Prasad, senior vice president of the team behind Amazon’s voice assistant, played a video in which Alexa reads a bedtime story to a child in the voice of his grandmother. The grandmother had died a short time earlier, bringing to the fore the age-old ethical question of how far technology should go in trying to keep the dead with us in some form.

At the Amazon re:MARS conference in Las Vegas, Amazon previewed a number of upcoming Alexa features, and the one that grabbed the headlines is undoubtedly voice imitation. According to what a company spokesperson told Engadget, training requires only about a minute of audio of the person to be replicated to produce convincing results, where previous approaches required hours and hours of recordings. It is a very fast system, in other words, one that adapts the output of the text-to-speech engine, modulating its tones to reproduce the most varied timbres. “We live in a golden age of artificial intelligence,” Prasad commented, “and while AI cannot eliminate the pain of a loss, it can definitely make the memories last.” It is hard to disagree with him, but there are two essential aspects to underline.
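To give a concrete sense of how this kind of few-shot voice cloning works in general, the sketch below uses the open-source Coqui TTS library, whose YourTTS model can condition speech synthesis on a short reference recording. This is only an illustration of the broad technique, not Amazon’s system; the file names and sample text are assumptions made for the example.

    # Minimal sketch of few-shot voice cloning with the open-source
    # Coqui TTS library (pip install TTS). Illustrative only: this is
    # not Amazon's system, and the reference clip and text are invented.
    from TTS.api import TTS

    # YourTTS is a multilingual, multi-speaker model that supports
    # zero-shot voice cloning from a short reference recording.
    tts = TTS(model_name="tts_models/multilingual/multi-dataset/your_tts")

    # Condition synthesis on roughly a minute of the target speaker's
    # voice (speaker_wav) and render arbitrary new text in that voice.
    tts.tts_to_file(
        text="Once upon a time, in a land far away...",
        speaker_wav="grandmother_sample.wav",  # hypothetical reference clip
        language="en",
        file_path="bedtime_story.wav",
    )

The key design point, and what makes the one-minute claim plausible, is that models of this kind do not retrain on the target speaker at all: they extract a compact speaker embedding from the reference clip and condition an already-trained synthesizer on it, which is why only a short sample is needed.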

The first point concerns ethics: imitating a voice, any voice, is a kind of audio deepfake, reproducing the timbre of a person who may never have consented to the procedure. This applies to the living, whose voices could be cloned from any short video or audio clip pulled off the web, but also to the dead. As in the well-known case of the virtual reunion between a mother and her deceased daughter, recreating a lost loved one could cause serious harm, particularly during the grieving process. The second point is that it could accelerate the evolution of scams that already use voice imitation for illicit purposes, such as last year’s case in which over 30 million euros were transferred from a UAE bank to criminals who had impersonated a company executive.
