Deep Fakes and the Year of Elections
Tom Everill | 6 February 2024
Summary
AI-generated calls mimicked President Biden's voice to mislead New Hampshire voters before the 2024 primary.
The hoax, described by its apparent target as election interference, falsely appeared to come from the phone of a former Democratic Party chair.
The incident raises alarms about the impact of AI disinformation on election security and trust in institutions.
Throughout the weekend preceding the New Hampshire Democratic primary on the 23rd of January 2024, some voters received a voice message sounding like President Joe Biden, urging them to save their votes for the November general election. However, voters who knew that voting in a state primary would not prevent them from participating in November’s presidential election quickly realised something was wrong. It turns out that the message was not real but was instead a fabrication using artificial intelligence (AI) to mimic the president’s voice, a technique known as a ‘deep fake.’
The call was confirmed as false by the White House on Monday the 22nd, and its origin is still unknown, but recipients report that the call appeared to come from the personal cell phone of former New Hampshire Democratic Party chair, Kathy Sullivan. Sullivan has since referred to the call as election interference and an attempt to harass New Hampshire voters.
‘Deep fake’ is a portmanteau of deep learning (a type of artificial intelligence where computers learn to recognise patterns from large datasets) and ‘fake,’ meaning not genuine. Typically, the technique is used to manipulate or replace auditory or visual aspects of someone’s likeness. As generative artificial intelligence approaches Turing Test levels (Alan Turing’s test of whether a machine can mimic human responses indistinguishably), and with over four billion people eligible to go to the polls this year, serious concerns arise about disinformation and election security.
There are several security implications associated with the emergence of widespread and high-quality generative AI. Most notable is the ability of malicious actors to create and spread disinformation surrounding elections, such as fabricating audio or video of candidates or sowing doubt about the electoral process more generally, as seen in New Hampshire. The impersonation of leaders can lead to serious consequences, including social unrest, diplomatic crises, or even military conflict in extreme cases. Furthermore, fabricated content, and an inability to immediately identify it as such, can erode trust in institutions, enable false claims in a legal context, or trigger market volatility based on fake news.
Forecast
Short-term: highly likely that similar occurrences will continue throughout US primaries and other elections.
Medium-term: likely that losing parties will attempt to blame their losses on AI election interference.
Long-term: highly likely that AI and deep fakes will become increasingly problematic for democratic systems.