
Scammers are using voice cloning tech to trick people, creating fake voices of anyone in seconds

In the rapidly evolving world of artificial intelligence, the rise of voice cloning technology has sparked both fascination and fear among lawmakers and security experts. Scammers are already exploiting this technology to deceive unsuspecting individuals, while public figures like New York City Mayor Eric Adams have found innovative uses for AI-generated clones of their own voices.


White House Deputy Chief of Staff Bruce Reed, who leads the Biden administration’s AI strategy, has voiced his concerns about the technology, describing it as “frighteningly good.” He emphasized the potential societal impact, stating that people may soon be reluctant to answer phone calls if they cannot distinguish between real and fake voices, according to a report by Officenewz. Voice cloning technology can recreate anyone’s voice with just three or four seconds of audio input, achieving an 85 percent match, according to a report from security software company McAfee.

Scammers have been quick to exploit voice cloning advancements, using AI technology to enhance their fraudulent schemes. The Federal Trade Commission (FTC) has reported instances where scammers used voice cloning in family emergency scams, creating convincing replicas of relatives in distress. In one alarming case, a mother in Arizona received a call from a scammer who staged a fake kidnapping of her daughter, using voice cloning to reproduce her daughter's voice and inflection.


In addition to voice cloning, the emergence of deepfake technology has raised concerns about manipulated videos that appear genuine. Actress Rashmika Mandanna recently fell victim to a viral deepfake video that portrayed her entering an elevator. The video garnered millions of views, sparking calls for new legal and regulatory measures to combat the spread of fake content on the internet.

The original video was first shared on Instagram by a woman named Zara Patel, whose likeness was digitally replaced with Mandanna's; Patel appears to have had no involvement in creating the deepfake. The true origin and motivations behind the video remain a mystery. This incident is not isolated, as many celebrities have been targeted by similar fake videos in recent years.

Bollywood actor Amitabh Bachchan highlighted the urgency of legal action against deepfakes, further underscoring the need for measures to tackle the growing threat of AI-generated deception.


As voice cloning and deepfake technologies continue to advance, it is clear that these developments have far-reaching implications for security, privacy, and the potential for widespread deception. Lawmakers and regulators face the challenge of adapting to this new landscape to protect individuals from the threats posed by scammers and fake content.
