Amazon is teaching Alexa to mimic the voices of dead relatives

A hot potato: Amazon is developing capabilities that will allow its Alexa voice assistant to mimic any human voice after hearing the person speak for less than a minute. Setting aside the potential creepiness of the feature, some are concerned about the potential for abuse.

Rohit Prasad, who leads the Alexa team at Amazon, said the goal of the project is to “make the memories last” after “so many of us have lost someone we love” as a result of the pandemic.

Alexa could be trained to imitate a voice using pre-recorded audio, meaning the person doesn’t have to be present – or even alive – to serve as a source. In a video segment shown during a conference this week, a child asked Alexa if grandma could finish reading The Wizard of Oz. Sure enough, Alexa switches voices to mimic the child’s grandmother and finishes reading the story.

Prasad said during the presentation that Alexa now receives billions of requests per week from hundreds of millions of Alexa-enabled devices across 17 languages in more than 70 countries around the globe.

The potential for abuse seems high. For example, the tool could be used to create convincing deepfakes for misinformation campaigns or political propaganda. Fraudsters could also leverage the capability for financial gain, as in 2020 when scammers tricked a bank manager into transferring $35 million to fund an acquisition that didn’t exist.

What are your thoughts on the matter? Is Amazon taking the concept of voice cloning a bit too far here, or are you intrigued by the idea of having a “conversation” with someone from the grave?

Image credit: Jan Antonin Kolar
