Why AI-Cloning Your Own Voice Might Be a Pretty Great Idea After All

Just make sure to read the small print

  • A new AI tool, Augie, makes it easy to clone your own voice to record video voiceovers.
  • Voice cloning is really here, and it can be very useful.
  • Cloned voices are bad news for voice actors and great news for scammers.


Voice actor recording in-studio.

Smile/Getty

Augie is a new AI tool for making social media and promo videos, but it has one feature that makes it stand out: Augie can clone your voice and create video voiceovers for you. 

Voice cloning is popping up in more and more places, and while it has some serious downsides, as a tool it can be extremely useful. It can be used any time you need to record a voice but don't have a recording studio handy, or the resources to hire a professional. Of course, this means those voiceover professionals will suffer, though not in every case. Augie looks like one of the better tools, letting computers do what they do best: eliminating tedious tasks so you can focus on more interesting work. 

"I am a professional voice actor, so, as you can imagine, the topic of AI voice cloning has been top of discussions with our industry and legal teams. From what we have experienced and learned, even a 'normal' person should be aware of the risks associated with vocal cloning," professional voice actor Kira Gurnee told Lifewire via email.

AI Voice Over

Augie creates videos, letting you edit together clips, add photos, and so on. But the neat part is that you can record a snippet of your voice, and Augie will create a clone. From then on, you just need to type in your voiceover script, and it'll generate the audio for you, with no lawnmowers, barking dogs, or screaming babies to ruin the recording. But not everybody is happy.

The threat to voiceover artists is real. There are already tools that spit out entire audiobooks in minutes. Even if a voice actor takes control of their own voice clone to help out, it still devalues their work. On the other hand, voice-cloning can be used for quick fixes that would be way harder with live voices.

"I use AI tools daily in my role and use a cloned version of my voice when editing my videos. Often, there will be a word or two that I want to change in my final recording," Fergal O'Shea, AI researcher and co-founder of AI tool company Aiifi, told Lifewire via email. "Rather than re-recording a single word or sentence, I now use an AI voice clone tool built into my video editor. The AI tool matches the tone in the sentence so the new word sounds natural when added. However, I never use my cloned voice to record entire voiceovers. My cloned voice is less expressive and emotive than my real one."

Trash Talk

The downside of cloned voices is that they can be used to scam people. Aug X, the company behind Augie, gives access to a clone only to its creator, which means that in a corporate setting, other people cannot grab and use it. 


But that's just one product. The proliferation of AI voice clones poses very real dangers.

"I have real concerns over using voice cloning to deceive people into believing they are receiving voice notes from someone when it is, in fact, a clone they are hearing. I sent cloned voice notes to several friends and family as a test. None realized it was a clone!" says O'Shea. 

This story neatly encapsulates the entire promise and menace of AI voice cloning. It's miraculous, and AI tools can be genuinely transformative. But there's a definite letting-the-genie-out-of-the-bottle aspect to it all. 

Voice actor recording in a studio.

andresr/Getty

Joshua Spencer, founder of AI healthcare company BastionGPT, summed up this dichotomy perfectly. 

"I have seen that voice cloning can be used to help those with speech impairments or to create voiceovers for content, among other useful uses. However, it has also been used dishonestly, raising issues with credibility and trust in various contexts," Spencer told Lifewire via email. "Easy voice cloning can result in significant privacy and security risks. I have seen instances where it has been used to pose as someone else for unlawful purposes, such as phishing schemes or the production of false audio content. This gives rise to moral questions about consent and how this technology might be used for malicious purposes."

For now, you might consider keeping any recordings of your voice away from the internet, just in case. 
