Artificial Intelligence Voice Scams on the Rise with 1 in 4 Adults Impacted

  • McAfee researchers find you can clone a voice from just three seconds of audio
  • 77% of AI voice scam victims lost money
  • More than half (53%) of all adults share their voice at least once a week online or on social media

Today, McAfee Corp., a global leader in online protection, published a report, The Artificial Imposter, on how artificial intelligence (AI) technology is fueling a rise in online voice scams, with just three seconds of audio required to clone a person’s voice.

McAfee surveyed 7,054 people from seven countries and found that a quarter of adults had previously experienced some kind of AI voice scam, with 1 in 10 targeted personally and 15% saying it happened to someone they know. 77% of victims said they had lost money as a result.

In addition, McAfee Labs security researchers have revealed their insights and analysis from an in-depth study of AI voice-cloning technology and cybercriminal use.

Artificial Intelligence Voice Cloning Scam

Everybody’s voice is unique, the spoken equivalent of a biometric fingerprint, which is why hearing somebody speak is such a widely accepted way of establishing trust. But with 53% of adults sharing their voice data online at least once a week (via social media, voice notes, and more) and 49% doing so up to 10 times a week, cloning how somebody sounds is now a powerful tool in the arsenal of a cybercriminal.

With the rise in popularity and adoption of artificial intelligence tools, it is easier than ever to manipulate images, videos, and, perhaps most disturbingly, the voices of friends and family members. McAfee’s research reveals scammers are using AI technology to clone voices and then send a fake voicemail or call the victim’s contacts pretending to be in distress – and with 70% of adults not confident that they could identify the cloned version from the real thing, it’s no surprise that this technique is gaining momentum.

Nearly half (45%) of respondents said they would reply to a voicemail or voice note purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their partner or spouse (40%), parent (31%), or child (20%). Parents aged 50 or over were the group most likely to respond if the message appeared to come from their child (41%). Messages most likely to elicit a response were those claiming that the sender had been involved in a car incident (48%), been robbed (47%), lost their phone or wallet (43%), or needed help while traveling abroad (41%).

But the cost of falling for an AI voice scam can be significant, with more than a third of people who’d lost money saying it had cost them over $1,000, while 7% were duped out of between $5,000 and $15,000.

The survey also found that the rise of deepfakes and disinformation has led to people being more wary of what they see online, with 32% of adults saying they’re now less trusting of social media than ever before.

“Artificial intelligence brings incredible opportunities, but with any technology there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” said Steve Grobman, McAfee CTO.

McAfee Labs Research Reveals Voice Cloning Requires Limited Expertise and Just Seconds of Audio

As part of McAfee’s review and assessment of this new trend, McAfee researchers spent three weeks investigating the accessibility, ease of use, and efficacy of AI voice-cloning tools, with the team finding more than a dozen freely available on the internet.

Both free and paid tools are available, with many requiring only a basic level of experience and expertise to use. In one instance, just three seconds of audio was enough to produce an 85%* match, but with more investment and effort, it’s possible to increase the accuracy. By training the data models, McAfee researchers were able to achieve a 95% voice match based on just a small number of audio files.
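For context on what a “voice match” percentage means in practice, the sketch below scores the similarity between an original recording and a cloned one by comparing speaker embeddings with cosine similarity. It uses the open-source resemblyzer speaker encoder purely as an illustrative stand-in; McAfee has not published the tools or scoring method behind the 85% and 95% figures above, and the file names shown are hypothetical.

    # Illustrative sketch only: one generic way to score voice similarity.
    # The resemblyzer speaker encoder is an open-source stand-in, not the
    # benchmarking method used by McAfee's researchers.
    import numpy as np
    from resemblyzer import VoiceEncoder, preprocess_wav

    def voice_match_score(original_path: str, clone_path: str) -> float:
        """Return a 0-100 'match' score between two voice recordings."""
        encoder = VoiceEncoder()  # pretrained speaker-embedding model
        emb_a = encoder.embed_utterance(preprocess_wav(original_path))
        emb_b = encoder.embed_utterance(preprocess_wav(clone_path))
        # Cosine similarity between the two speaker embeddings.
        cosine = float(np.dot(emb_a, emb_b) /
                       (np.linalg.norm(emb_a) * np.linalg.norm(emb_b)))
        return max(0.0, cosine) * 100.0

    if __name__ == "__main__":
        # Hypothetical file names, for illustration only.
        print(f"Voice match: {voice_match_score('original.wav', 'clone.wav'):.1f}%")

The higher such a similarity score, the harder it is for a listener to tell the clone from the real voice, which is the property the accuracy figures above describe.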

The more accurate the clone, the better chance a cybercriminal has of duping somebody into handing over their money or taking other requested action. Because these hoaxes exploit the emotional vulnerabilities inherent in close relationships, a scammer could net thousands of dollars in just a few hours.

“Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person’s voice and deceive a close contact into sending money,” said Grobman. “It’s important to remain vigilant and to take proactive steps to keep you and your loved ones safe. Should you receive a call from your spouse or a family member in distress and asking for money, verify the caller – use a previously agreed codeword, or ask a question only they would know. Identity and privacy protection services will also help limit the digital footprint of personal information that a criminal can use to develop a compelling narrative when creating a voice clone.”

Using the cloning tools they found, McAfee’s researchers had no trouble replicating accents from around the world, whether the speaker was from the US, UK, India, or Australia, but more distinctive voices were more challenging to copy. For example, the voice of a person who speaks with an unusual pace, rhythm or style requires more effort to clone accurately and is less likely to be targeted as a result.

The overriding feeling among the research team, though, was that artificial intelligence has already changed the game for cybercriminals. The barrier to entry has never been lower, which means it has never been easier to commit cybercrime.

How to Protect Yourself from AI Voice Cloning

  • Set a verbal ‘codeword’ with kids, family members or trusted close friends that only they could know. Make a plan to always ask for it if they call, text or email to ask for help, particularly if they’re older or more vulnerable.

  • Always question the source. If it’s a call, text or email from an unknown sender, or even if it’s from a number you recognize, stop, pause and think. Does that really sound like them? Would they ask this of you? Hang up and call the person directly or try to verify the information before responding and certainly before sending money.

  • Think before you click and share. Who is in your social media network? Do you really know and trust them? Be thoughtful about the friends and connections you have online. The wider your connections and the more you share, the greater the risk that your voice or identity could be cloned for malicious purposes.

  • Identity monitoring services can help make sure your personally identifiable information is not accessible, or notify you if your private information makes its way to the Dark Web. Take control of your personal data to avoid a cybercriminal being able to pose as you.

The full survey data, including results broken down by country, is available in McAfee’s report, The Artificial Imposter.

Survey Methodology

The survey was conducted by market research agency MSI Research via an online questionnaire between April 13 and April 19, 2023, among a sample of 7,054 adults aged 18 and over from seven countries. The sample size completed per country is as follows: 1,009 respondents in the US; 1,009 respondents in the UK; 1,007 respondents in France; 1,007 respondents in Germany; 1,004 respondents in Japan; 1,008 respondents in Australia; 1,010 respondents in India.

About McAfee

McAfee Corp. is a global leader in online protection for consumers. Focused on protecting people, not just devices, McAfee’s consumer solutions adapt to users’ needs in an always online world, empowering them to live securely through integrated, intuitive solutions that protect their families and communities with the right security at the right moment. For more information, please visit https://www.mcafee.com.

* Voice match accuracy levels indicated are based on the benchmarking and assessment of McAfee security researchers.
