The FBI has issued a warning to individuals and organizations about the growing number of cybercriminals using AI to clone voices and impersonate other people in phishing and other social engineering attacks. Imagine getting a call and hearing your mother’s voice, a voice you know like the back of your hand, only it is not her; it is AI. You would be shocked and appalled. That is the growing trend: bad actors can obtain a recording of your voice or of someone you know, then place an “urgent” call to trick you into disclosing PII (personally identifiable information) or, worse, sending them money.
Sites like ElevenLabs and Descript offer voice cloning to anyone free of charge; all you need to do is create an account. While some people will use it to do bad things, there is also good that can come from this advancement. People in the voiceover and blogging industries can leverage this new AI to automate work and finish projects faster. But the technology raises hard questions: do we own our voices? Can someone use our voice for their own financial gain? Can we remake songs with well-known artists’ voices? Those questions will have to be settled as more people become aware of this new AI.
So how do we stay safe and catch someone posing as a person we know? There are a few things you can do. Bring up old memories that only the two of you would know. Ask personal questions that an impersonator could not answer. Ask to switch to a video chat to verify it really is your loved one on the other end of the line. Also check that the caller’s name and phone number display the way they normally do, matching the contact you saved in your phone. Stay safe.