AI & Deepfake Voice Clones

#CyberDivision #Cybersecurity #AI #MachineLearning #DeepFake #Infosec

The Potential and Drawbacks of Machine Learning Voice Clones

When my former employee told me he had tried to clone my voice without my consent, I was annoyed but not surprised. Deepfakes have existed for years, but the software to create them is now popping up in abundance.

What Are Deepfakes?

A deepfake is video or audio that uses deep learning, a type of artificial intelligence, to depict content that isn't real. Deepfake technology takes something that exists, such as a video or audio recording, and manipulates it to create something that appears real but is not.

Eleven Labs' website touts its Prime Voice AI as "The most realistic and versatile AI speech software, ever. Eleven brings the most compelling, rich, and lifelike voices to creators and publishers seeking the ultimate tools for storytelling." When I visited the site, I was elated! The voiceovers were richer, more human, and more accurate than I had imagined. I was immediately inspired by the opportunities to use pre-made and custom voices to accomplish my strategic business goals, such as voiceovers for commercials, websites, and audiobooks. I signed up immediately and can't wait to get started.

There are other examples of machine learning voice technology being used for good. In 2021, with the blessing of Anthony Bourdain's estate, Hollywood director Morgan Neville used audio deepfake technology to recreate Bourdain's voice in the documentary Roadrunner. Although this sparked a debate on social media over the ethics of simulating Bourdain's voice, the intent was good.

The first time I heard of voice deepfakes was several years ago, while watching a video in which software was used to show how recorded speech could be altered in near real time. The demonstrator recorded the interviewer's voice, then typed text into a computer that changed what the interviewer had said within seconds of him speaking, putting something completely ridiculous and compromising into his own voice. While this was done with the interviewer's consent and in jest, it showed the perils of this technology falling into the wrong hands.

In 2019, a voice deepfake was used to scam a company out of €220,000: the fraudster used audio deepfake technology to impersonate the voice of the firm's CEO. (1) In January 2020, the FTC held a workshop on voice cloning that included a live demonstration, a review of good and bad use cases, and discussions of ethics, authentication, detection, and mitigation. (2) Later that year, security consulting firm NISOS released a report analyzing a fraud attempt that used voice cloning. (3)

In April 2022, the FBI gave the following warning about voice cloning scams: "It's a very sophisticated program that is used medically, right, for people that have throat cancer, ALS, that they've lost their voice. Hollywood uses it, as well, to change voices for movies and so on and so forth. So it's a very expensive program. But again, some of these nefarious groups have the money, so they'll go out there and pay for it." "So this is a message, a clear message, to those individuals who think that it's a way to get a quick dollar that we are actively working to arrest them." (4)

If you believe you have been a victim of a voice cloning scam, contact the FBI immediately.

1. Vincent, James. "This is what a deepfake voice clone used in a failed fraud attempt sounds like / AI voice clones are getting more and more realistic." The Verge, July 27, 2020.

2. Fair, Leslie. "Voice cloning: Where WOW meets OMG." Federal Trade Commission, January 16, 2020.

3. Volkert, Robert & Badlu, Dev. "The Rise of Synthetic Audio Deepfakes." NISOS Consulting, July 19, 2020.

4. "FBI warns about voice cloning scam." News West 9, April 4, 2022.