Thanks to AI, Jamtara-type scammers get a new voice
By Biju Kumar
Just three seconds of audio is enough to create a voice clone that is virtually indistinguishable from the real voice, experts said. The technological advancement is as worrying as it is thrilling, because it can be put to malicious use to defraud people.
Voice clones, or audio deepfakes, have emerged as the latest tool for cyber scammers, they said, as artificial intelligence (AI)-related scams are increasingly being reported from different parts of the country.