Voice cloning fraud


Source- This post is based on the article “How voice cloning through artificial intelligence is being used for scams” published in “The Hindu” on 8th January 2024.

Why in the News?

Voice cloning fraud is increasing in India: a May report found that 47% of surveyed individuals in the country have been affected by an AI-generated voice scam.

What are the Findings of the Report?

1) A May report titled 'The Artificial Imposter' disclosed that 47% of surveyed Indians, nearly double the global average of 25%, have either fallen victim to, or know someone affected by, an AI-generated voice scam.

2) India ranked highest globally in the number of victims of AI voice scams.

3) McAfee reported that 66% of Indians would respond to a call appearing to be from a friend or family member urgently seeking money.

4) The report found that messages claiming the sender was robbed (70%), in a car accident (69%), lost their phone or wallet (65%), or needed financial aid while traveling abroad (62%) were the most effective excuses.

5) The report revealed that 86% of Indians share their voice data online or via voice notes at least once a week, giving scammers ample raw material for these tools.

How is Voice Cloning Done?

1) Once a scammer acquires an audio clip of an individual, uploading it to online programs such as Murf, Resemble, and Speechify allows the voice to be replicated with high accuracy, though some intonations may be lost.

2) Voice cloning often employs deep learning techniques and neural networks, such as recurrent neural networks (RNNs) or convolutional neural networks (CNNs). These models can capture complex patterns in speech and generate realistic-sounding synthetic voices.
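To make the idea above concrete, the sketch below is a simplified illustration (not the pipeline of any specific tool named here): cloning systems first convert raw audio into spectral features, and a neural model then learns a speaker's characteristic "voiceprint" from those features. Here a sine wave stands in for a recorded voice clip, and plain NumPy computes the magnitude spectrogram that an RNN or CNN voice model would typically consume.

```python
# Illustrative sketch only: the feature-extraction step that precedes
# neural voice modelling. The sine wave and the averaged "embedding"
# are stand-ins for a real recording and a learned speaker vector.
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Split the signal into overlapping frames and take FFT magnitudes."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

sr = 8000                                # sample rate in Hz (assumed)
t = np.linspace(0, 1, sr, endpoint=False)
voice = np.sin(2 * np.pi * 220 * t)      # stand-in for a voice clip

spec = spectrogram(voice)                # time-frequency features
# A crude "speaker embedding": the mean spectral profile across frames.
# Real systems learn this with deep networks rather than averaging.
embedding = spec.mean(axis=0)
print(spec.shape, embedding.shape)
```

A real cloning model trains on many such feature frames so that, given new text, it can synthesise speech matching the extracted voiceprint.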

UPSC Syllabus- Science & Technology
