Episode 84
So What?: Understanding Disinformation and Election Integrity with Hillary Coover
March 27th, 2024
40 mins 39 secs
About this Episode
Can you spot a deepfake? Will AI impact the election? What can we do individually to improve election security? Hillary Coover, one of the hosts of the It’s 5:05! Podcast, and Tracy Bannon join for another So What? episode of Tech Transforms to talk about all things election security. Listen in as the trio discusses cybersecurity stress tests, social engineering, combatting disinformation and much more.
Key Topics
- 04:21 Preconceived notions make it harder to fake.
- 06:25 AI exacerbates spread of misinformation in elections.
- 11:01 Be cautious and verify information from sources.
- 14:35 Receiving suspicious text messages on multiple phones.
- 18:14 Simulation exercises help plan for potential scenarios.
- 19:39 Various types of tests and simulations explained.
- 23:21 Deliberate disinformation aims to falsify; consider motivation.
- 27:44 India election, deepfakes, many parties, discerning reality.
- 32:04 Seeking out info, voting in person important.
- 34:18 Honest cybersecurity news from trusted source.
- 38:33 Addressing bias in AI models, historic nuance overlooked.
- 39:24 Consider understanding biased election information from generative AI.
Navigating the Disinformation Quagmire
Dissecting Misinformation and Disinformation
Hillary Coover brings attention to the pivotal distinction between misinformation and disinformation. Misinformation is the spread of false information without ill intent, often stemming from misunderstandings or mistakes. On the other hand, disinformation is a more insidious tactic involving the intentional fabrication and propagation of false information, aimed at deceiving the public. Hillary emphasizes that recognizing these differences is vital in order to effectively identify and combat these issues. She also warns about the role of external national entities that try to amplify societal divisions by manipulating online conversations to serve their own geopolitical aims.
Understanding Disinformation and Misinformation: "Disinformation is a deliberate attempt to falsify information, whereas misinformation is a little different." — Hillary Coover
The Challenges of Policing Social Media Content
The episode dives into the complexities of managing content on social media platforms, as Tracy Bannon and Hillary discuss the delicate balance required to combat harmful content without infringing on freedom of speech or accidentally suppressing valuable discourse. As part of this discussion, they mention their intention to revisit the book "The Ministry for the Future," suggesting that the novel offers insights into the intricate challenges of regulating social media. There is a shared concern that an overly aggressive censorship approach could hinder the dissemination of truth as much as it limits the spread of falsehoods.
The Erosion of Face-to-Face Political Dialogue
The conversation transitions to the broader societal implications of digital dependency, specifically how the decline of community engagement has led individuals to increasingly source news and discourse from digital platforms. This shift toward isolation, amplified by digital echo chambers, results in fewer in-person political discussions. As a result, there is growing apprehension about the future of political discourse and community bonds, with Hillary and Tracy reflecting on how rare the open, face-to-face political conversations that past generations routinely engaged in have become.
The Shadow of Foreign Influence and Election Integrity
Challenges in India’s Multiparty Electoral System
In the course of the discussion, the complexity of India's electoral system, with its multitude of political parties, is presented as an example that underlines the difficulty in verifying information. The expansive and diversified political landscape poses a formidable challenge in maintaining the sanctity of the electoral process. The capability of AI to produce deepfakes further amplifies the risks associated with distinguishing genuine content from fabricated misinformation. The podcast conversation indicates that voters, particularly in less urbanized areas with lower digital literacy levels, are especially vulnerable to deceptive content. This magnifies the potential for foreign entities to successfully disseminate propaganda and influence election outcomes.
Election Integrity and AI: "Misinformation and disinformation, they're not new. The spread of that is certainly not new in the context of elections. But the AI technology is exacerbating the problem, and we as a society are not keeping up with our adversaries and social media manipulation. Phishing and social engineering attacks enhanced by AI technologies are really, really stressing the system and stressing the election integrity." — Hillary Coover
Countering Foreign Disinformation Campaigns in the Digital Age
With a focus on the discreet yet potent role of foreign intervention in shaping narratives, Hillary spotlights an insidious aspect of contemporary political warfare: the exploitation of media and digital platforms to sway public perception. This influence is not limited to overt propaganda; it extends to subtler forms of manipulation that seed doubt and discord among the electorate. As the discussion suggests, the consequences of such foreign-backed campaigns could be significant, polarizing the public and undermining the foundational principles of democratic debate and decision-making. The weight these campaigns can carry in political discourse warrants vigilance and proactive measures to defend against such incursions into informational autonomy.
Addressing the Impact of Disinformation Through AI's Historical Representation Bias
Tackling Disinformation: AI Bias and the Misrepresentation of Historical Figures
The discussion on AI bias turns to concrete instances where AI struggles, as Tracy brings forth examples of the inaccuracies that can arise when AI models generate images of historical figures. Tracy references the recent episode in which Google paused Gemini's ability to generate images of people after it produced historically inaccurate depictions of German soldiers from World War II. Similar errors occurred when the model generated images of America's Founding Fathers whose racial backgrounds did not reflect the actual historical figures. These errors are attributed not to malicious intent by data scientists but to the data corpus used in training these models. This segment underscores the significant issues that can result when AI systems misinterpret or fail to account for historical context.
The Necessity of Addressing AI Bias
Continuing the conversation, Hillary emphasizes the importance of recognizing and addressing biases in AI, advocating for a firm grasp of historical nuance to circumvent such missteps. Both Hillary and Tracy discuss how biased news and misinformation can influence public opinion and election outcomes, bringing to light the critical role historical accuracy plays in the dissemination of information. They point out that preventing biased AI-generated content from misleading the public requires a combination of historical education and conscious efforts to identify and correct these biases. The recognition of potential AI bias leads to a deeper discussion about ensuring information accuracy, particularly with regard to historical facts that could sway voter perception during elections. Tracy and Hillary suggest that addressing these challenges is not just a technological issue but also an educational one, in which society must learn to critically evaluate AI-generated content.
The Challenge of Community Scale Versus Online Influence
Combating Disinformation: The Struggle to Scale Community Engagement Versus Digital Platforms' Reach
The dialogue acknowledges the difficulty of scaling community engagement in the shadow of digital platforms' expansive reach. Hillary and Tracy delve into the traditional benefits of personal interactions within local communities, which often contribute to a more nuanced and direct exchange of ideas. They compare this to the convenience and immediacy of online platforms, which, while enabling widespread dissemination of information, often lack the personal connection and accountability that face-to-face interactions foster. The challenge underscored is how to preserve the essence of community in an age where online presence has become overpowering and sometimes distancing.
Navigating the Truth in the Digital Age: "Don't get your news from social media. And then another way, like, I just do a gut check for myself. [...] I need to go validate." — Hillary Coover
Impact of Misinformation and Deepfakes on Political Discourse
The episode reiterates the disquieting ease with which political discourse can be manipulated through deepfakes and misinformation. Showcasing the capabilities of AI, Tracy recalls a deepfake scam involving fake professional meetings that led to financial fraud. These examples underscore the potential for significant damage when such technology is applied maliciously. Hillary emphasizes the critical need to approach online information with a keen eye, questioning the origins and credibility of what is presented. Both Tracy and Hillary stress the importance of developing a defensive posture toward unsolicited information, as the blurring line between authentic and engineered content could have severe repercussions for individual decisions and broader societal issues.
Stress Testing and Mitigating Disinformation in Election Security Strategies
The Role of Stress Tests in Election Security
Hillary and Tracy discuss the importance of conducting stress tests to preemptively identify and mitigate vulnerabilities within election systems. These tests, which include red teaming exercises and white hat hacking, are designed to replicate real-world attacks and assess how systems respond under duress. By simulating different attack vectors, election officials can understand how their infrastructure holds up against various cybersecurity threats and use that information to make the improvements needed to enhance security. The goal of these stress tests is to identify weaknesses before malicious actors can exploit them, thereby ensuring the integrity of the electoral process.
Mitigating the Impact of Disinformation
The conversation emphasizes the urgent need for preemptive measures against disinformation, which has grown more sophisticated with the advent of AI and deepfakes. As these technological advancements make discerning the truth increasingly difficult, it becomes even more crucial for election officials to prepare for inevitable attempts at spreading falsehoods. Through stress tests that incorporate potential disinformation campaigns, officials can evaluate their preparedness and response strategies, including public communication plans to counteract misinformation. By considering the psychological and social aspects of election interference, they aim to bolster defenses and ensure voters receive accurate information.
Election Security Concerns: "Other instances are going to happen where criminals are gonna be impersonating legitimate sources to try to suppress voters in that case, or steal credentials, spread malware." — Hillary Coover
Importance of Proactive Approaches to Election Safeguarding
The exchange between Tracy and Hillary reveals a clear consensus on the necessity of proactive strategies for protecting elections. Proactively identifying potential threats and securing electoral systems against known and hypothetical cyber attacks are central to defending democratic processes. By focusing on anticipation and mitigation, rather than simply responding to incidents after the fact, authorities can improve election security and reinforce public trust. This proactive stance is also crucial in dealing with the spread of disinformation, which may be specifically tailored to exploit localized vulnerabilities in the electoral infrastructure.
Reflecting on the Challenges of Election Security in the Digital Era
This episode serves as a thorough examination of the challenges that digital communication poses for modern democracies. Hillary and Tracy delve into the dangers of misinformation and the manipulation of public opinion, highlighting how biases in AI can affect the information individuals receive. They underscore the importance of stress-testing election systems against digital threats and recognize the complexities inherent in securing contemporary elections. The episode ultimately helps listeners better grasp the ever-evolving landscape of election security and the continued need for informed, strategic action to safeguard democratic processes.
About Our Guest
Hillary Coover is one of the hosts of It’s 5:05! Podcast, covering news from Washington, D.C. Hillary is a national security technology expert and accomplished sales leader currently leading product strategy at G2 Ops, Inc.
Episode Links
- Billy Joel - Turn the Lights Back On
- Deepfakes and AI: How a 200 Million Scam Highlights the Importance of Cybersecurity Vigilance
- The Ministry for the Future: A Novel
- It’s 5:05! Podcast