- Biden AI advisor Ben Buchanan said a way to clearly verify White House releases is "in the works."
- This year, Biden was the subject of an AI deepfake used to misinform voters.
- "We recognize the potential for harm," Buchanan told BI.
The White House is increasingly aware that, in the new age of easy-to-use generative artificial intelligence, the American public needs a way to confirm that statements from President Joe Biden and related information are real.
People in the White House have been looking into AI and generative AI since Joe Biden took office in 2021, but over the last year the use of generative AI has exploded following the release of OpenAI's ChatGPT. Big Tech players like Meta, Google, and Microsoft, along with a range of startups, have raced to release consumer-friendly AI tools, leading to a new wave of deepfakes — an AI-generated robocall last month used Biden's voice in an attempt to undermine voting efforts related to the 2024 presidential election.
The Federal Communications Commission on Thursday declared that such calls are illegal. Yet there is no end in sight to the spread of ever more sophisticated generative AI tools that make it easy for people with little to no technical know-how to create images, videos, and calls that seem authentic while being entirely fake.
That’s a problem for any government looking to be a trusted source of information. Ben Buchanan, Biden’s Special Advisor for Artificial Intelligence, told Business Insider that the White House is working on a way to verify all of its official communications due to the rise in fake generative AI content.
Buchanan said the aim is to “essentially cryptographically verify” everything that comes from the White House, be it a statement or a video.
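Buchanan did not describe the technical approach, and the White House has not published details. As a minimal sketch of what cryptographic verification of a release could look like in principle — assuming a digital-signature scheme such as Ed25519 and using Python's cryptography library, with a hypothetical key pair and statement text — content is signed with a private key, and anyone holding the matching public key can confirm it has not been altered:

```python
# Minimal sketch of cryptographic verification of an official release.
# The Ed25519 scheme, key names, and statement text are assumptions for
# illustration; the White House has not described its actual system.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The issuer holds the private key and publishes the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

statement = b"Official statement released by the White House"  # hypothetical content
signature = private_key.sign(statement)

# Anyone with the public key can check the statement is unaltered.
try:
    public_key.verify(signature, statement)
    print("Signature valid: content matches what was signed.")
except InvalidSignature:
    print("Signature invalid: content was altered or not signed with this key.")
```

In practice, such a system would only work if the public key and signatures were published alongside each release so that platforms, newsrooms, and browsers could check them automatically — but again, the administration has not said how its version would be implemented.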
While last year’s executive order on AI created an AI Safety Institute at the Department of Commerce, which is tasked with creating standards for watermarking content to show provenance, the effort to verify White House communications is separate. And Buchanan said it’s “a longer process,” although it is “in the works.”
Ultimately, the goal is to ensure that when anyone sees a video of Biden released by the White House, they can tell immediately it is authentic and unaltered by a third party.
“This is a case where we recognize the potential for harm,” Buchanan said. “We're trying to get ahead of it.”
Are you a tech employee or someone with a tip or insight to share? Contact Kali Hays at khays@insider.com or on secure messaging app Signal at 949-280-0267. Reach out using a non-work device.