Business

Scammers use voice-cloning AI to impersonate relatives

March 5, 2023

You could very well receive a call in the near future from a loved one who is in urgent need of help, asking you to send them some money quickly. And you might be convinced it’s them because, well, you know their voice.

Artificial intelligence is changing that. New generative AI tools can create all kinds of output from simple text prompts, including essays written in the style of a particular author, images worthy of art awards, and – with just a snippet of someone’s voice to work with – speech that sounds convincingly like a specific person.

In January, Microsoft researchers demonstrated a text-to-speech AI tool that, when given just a three-second audio sample, can closely simulate a person’s voice. They didn’t share the code for others to play with; instead, they warned that the tool, called VALL-E, “may carry potential risks in misuse…such as spoofing voice identification or impersonating a specific speaker”.

But similar technology is already in the wild and crooks are taking advantage of it. If they can find 30 seconds of your voice somewhere online, chances are they can clone it and make it say anything.

“Two years ago, even a year ago, it took a lot of audio to clone a person’s voice. Now… if you have a Facebook page…or if you recorded a TikTok and your voice is there for 30 seconds, people can clone your voice,” Hany Farid, professor of digital forensics at the University of California, Berkeley, told the Washington Post.

“The Money’s Gone”

The Washington Post reported this weekend on the peril, describing how a Canadian family fell victim to scammers using AI voice cloning and lost thousands of dollars. A “lawyer” told the elderly parents that their son had killed a US diplomat in a car accident, was in jail and needed money for legal fees.

The supposed lawyer then purportedly handed the phone over to the son, who told the parents he loved and appreciated them and needed the money. The cloned voice was “close enough that my parents actually believed they spoke to me,” the son, Benjamin Perkin, told the Post.

The parents sent over $15,000 through a bitcoin terminal to… well, crooks, not their son, as they thought.

“The money is gone,” Perkin told the newspaper. “There is no insurance. There’s no way to get it back. It’s gone.”

A company that offers a generative AI voice tool, ElevenLabs, tweeted on January 30 that it was seeing “an increasing number of cases of voice cloning misuse”. The next day it announced that voice cloning would no longer be available to users of the free version of its tool, VoiceLab.

Fortune contacted the company for comment, but did not receive an immediate response.

“Almost all of the malicious content was generated by free, anonymous accounts,” the company wrote. “Additional identity verification is required. For this reason, VoiceLab will only be available on paid tiers.” (Subscriptions start at $5 per month.)

Card verification won’t stop all bad actors, the company acknowledged, but it would make users less anonymous and “force them to think twice”.

