Voice deepfakes are getting easier to spot
And it’s all because of our bodies
New research has shown that voice deepfakes are becoming easier to identify as synthetic recreations of real voices, thanks to the anatomy of our vocal tracts.
Researchers at the University of Florida have devised a method of simulating images of a human vocal tract's apparent movements while a voice clip, real or fake, is played back.
Professor of Computer and Information Science and Engineering Patrick Traynor and PhD student Logan Blue wrote that they and their colleagues found that simulations prompted by voice deepfakes weren’t constrained by “the same anatomical limitations humans have”, with some vocal tract measurements having “the same relative diameter and consistency as a drinking straw”.
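That anatomical mismatch suggests a simple detection idea: if a model can estimate the vocal tract's cross-sectional diameters from audio (assumed to happen upstream), estimates that look like a narrow, uniform tube rather than a varied human airway are suspicious. The sketch below is a hypothetical illustration of that idea only; the function names, per-frame data layout, and thresholds are assumptions, not the researchers' actual pipeline.

```python
# Hypothetical sketch of anatomy-based deepfake flagging. Assumes an
# upstream model has already produced per-segment diameter estimates
# (in millimetres) for each audio frame; thresholds are illustrative.

def looks_straw_like(frame_diameters_mm, min_spread_mm=5.0):
    """A human vocal tract varies widely in cross-section along its
    length, so a near-constant diameter profile (like a drinking straw)
    is suspect. min_spread_mm is an assumed threshold."""
    return max(frame_diameters_mm) - min(frame_diameters_mm) < min_spread_mm

def flag_clip(frames, tolerance=0.5):
    """Flag a clip as likely synthetic if more than `tolerance` of its
    frames have straw-like (implausibly uniform) diameter estimates.
    Each frame is a list of per-segment diameters in millimetres."""
    if not frames:
        return False
    straw_like = sum(1 for f in frames if looks_straw_like(f))
    return straw_like / len(frames) > tolerance

# ~6 mm everywhere (straw-like) vs. a varied, human-like profile:
fake_frames = [[6.0, 6.1, 5.9, 6.0]] * 10
real_frames = [[12.0, 25.0, 18.0, 30.0]] * 10

print(flag_clip(fake_frames))  # True: uniform, straw-like estimates
print(flag_clip(real_frames))  # False: human-like variation
```

The real system would derive those diameter estimates from the audio itself, which is the hard part; this sketch only shows why anatomically impossible estimates make a clip easy to flag once you have them.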
Deepfakes and the media literacy problem
Though scientists are starting to spot voice deepfakes with simulation and anatomical comparison, the risk of an ordinary person being tricked by any deepfake, which could lead to identity theft, remains a problem.
Ordinary people don’t yet have access to these tools, and even when they do, most will struggle to interpret the raw data until intuitive, widely available audio-based detection tools materialize.
Because it’s so hard for untrained eyes and ears to spot deepfakes, expert advice on doing so isn’t widely known or available yet. People are also less primed to be healthily critical of what they see and hear over the internet, the phone, or any other medium that puts a layer of disconnect between them and what’s really happening.
“Disbelief by default”, where people become skeptical about everything they see and hear that isn’t from a trusted source, is a useful tactic here. The problem within the problem is that not everyone will adopt that strategy, because some people don’t understand the threat while others refuse to engage with it.
Media literacy has been a vital skill for a number of years now, as anyone can come across election disinformation or baseless conspiracy theories, but schools aren’t interested in teaching it, and there is still the issue of bridging this skill gap in adults.
That skill gap is how fake news proliferated and embedded itself in our societies and our relationships with loved ones. For this reason, anyone concerned about the media literacy of those close to them should consider investing in identity theft protection for families.
The rise of the convincing audiovisual deepfake has once again raised the need for a structured, widespread program to educate users in media literacy, and the importance of applying critical thinking to anything with only the thinnest veil of authenticity around it.
Via The Conversation
Luke Hughes holds the role of Staff Writer at TechRadar Pro, producing news, features and deals content across topics ranging from computing to cloud services, cybersecurity, data privacy and business software.