Uncovering the Driving Factors Behind the Rise in Online Voice Scams

Experts have warned that artificial intelligence technologies are fueling an explosion of audio cloning scams.

Fraudsters can now imitate a victim’s voice using as little as a three-second snippet of audio, often stolen from social media profiles.

The cloned voice is then used to call a friend or family member and convince them that the victim is in trouble and desperately needs money.

According to cybersecurity firm McAfee, one in four Brits say they or someone they know has been targeted by an AI voice scam.

Most of those victims admitted to losing money as a result, with about a third losing more than £1,000.

The company’s report says that artificial intelligence has “really changed the game for cybercriminals,” as fraud tools are available for free on the Internet.

Experts, academics and executives from across the tech industry are calling for tighter regulation of AI, fearing the sector is spiraling out of control.

McAfee’s “Artificial Imposter” report says that human voice cloning has become “a powerful tool in the arsenal of cybercriminals” – and finding a victim is not difficult.

A survey of more than 1,000 adults in the UK found that half of them share their voice data online at least once a week, via social media or voice notes.

The researchers identified more than a dozen publicly available AI voice cloning tools on the Internet, many of them free and requiring only a basic level of expertise to use.

In one case, just three seconds of audio was enough to produce an 85% voice match, and the tools had no trouble replicating accents from around the world.

Although everyone’s voice is unique, the verbal equivalent of a biometric fingerprint, 65% of respondents admitted they were not confident they could tell a cloned voice from the real thing.

And more than three in 10 said they would respond to a voicemail or voice note purporting to be from a friend or loved one in need of money, especially if they thought it came from a partner, child or parent. The messages most likely to elicit a response were those claiming the sender had been in a car accident, been robbed, lost their phone or wallet, or needed help while traveling abroad.

One in 12 people said they had personally been the victim of some kind of AI voice scam, and another 16% said it had happened to someone they knew.

The cost of falling for an AI voice scam can be significant: 78% of victims admitted they lost money as a result.

Source: Daily Mail
