AI Voice Scam Alert: Fraudsters Cloning Voices to Trick Families
Cybercriminals are using AI voice cloning technology to impersonate family members and request urgent money transfers.
A dangerous new scam powered by artificial intelligence is on the rise: fraudsters clone individuals' voices to deceive their family members.
In this scam, attackers harvest voice samples from social media or other public content and use AI tools to generate realistic voice clones. They then call victims, posing as a relative in distress and urgently asking for money.
Because of the emotional pressure and the realistic imitation, many victims transfer money without verifying the situation. Several such cases have been reported across India and globally.
Cybersecurity professionals recommend verifying any such request through a second communication channel, such as calling the relative back on a known number, and withholding financial transfers until the situation is confirmed.
Authorities are closely monitoring the misuse of AI technologies in cybercrime.