
How AI voice mimicking is a security risk

AI voice-generating software is evolving quickly in sophistication. Some tools need only a few sentences of recorded audio to convincingly reproduce the sound and emotional tone of a speaker’s voice; others need as little as three seconds.


AI voice mimicking

Will advanced hackers use this to gain access to your IT systems? The real question is not whether, but how. Assume that criminals will use every technology available to them.

Imagine a “user” calling the service desk to ask for resources or a password reset. How are the support agents supposed to distinguish the real user from a very convincing AI-generated voice?

The headline of an article in The Washington Post:

They thought loved ones were calling for help. It was an AI scam.

If loved ones can be fooled, then surely a support agent with no close relationship to the caller can be fooled too.

Criminals only have to call an important employee, record the conversation, and feed the recording to the AI tool. Now they can speak with the user’s voice and convince a service desk agent or other staff member to hand over valuable assets: money, resources, passwords.

Some service desks even use voice recognition to verify a user’s identity. Can you still trust this when AI can generate convincing voices?

Unfortunately, the answer is not simple. If people cannot meet in person, secure identity verification has to combine multiple factors and take the context of the request into account.
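To make “combining multiple factors” concrete, here is a minimal sketch of how a service desk flow could treat the voice as just one signal among several. Everything in it is an assumption for illustration: the function names (voice_match_score, send_mfa_push, callback_to_registered_number), the MFA push, the callback step, and the 0.8 threshold are hypothetical placeholders, not any specific vendor’s product or API.

```python
# A minimal sketch of a help-desk verification flow that never treats a
# matching voice as proof on its own. All helper functions below are
# hypothetical placeholders for whatever systems an organization actually uses.

def voice_match_score(call_audio: bytes, enrolled_voiceprint: bytes) -> float:
    """Placeholder: return a 0.0-1.0 similarity score from a voice-biometry system."""
    raise NotImplementedError

def send_mfa_push(user_id: str) -> bool:
    """Placeholder: push an approval request to the user's enrolled authenticator app."""
    raise NotImplementedError

def callback_to_registered_number(user_id: str) -> bool:
    """Placeholder: call the user back on the phone number already on file."""
    raise NotImplementedError

def verify_caller(user_id: str, call_audio: bytes, enrolled_voiceprint: bytes) -> bool:
    """Approve the request only if the caller proves possession of something
    an AI-cloned voice cannot fake, in addition to (not instead of) the voice."""
    voice_ok = voice_match_score(call_audio, enrolled_voiceprint) >= 0.8  # illustrative threshold

    # A possession factor is mandatory; a matching voice alone is never enough.
    if not send_mfa_push(user_id):
        return False

    # If the voice is doubtful, require a second out-of-band check as well.
    if not voice_ok and not callback_to_registered_number(user_id):
        return False

    return True
```

The design choice the sketch illustrates is simply that the voice check can lower or raise friction, but approval always depends on at least one factor the caller must physically possess.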

The point is: you can no longer rely on the voice alone.

Finn Jensen | Founder, FastPasscorp
