A group of 20 tech companies announced on Friday that they have agreed to work together to prevent deceptive artificial intelligence content from interfering with elections across the globe this year.
Growing Concerns Over AI Influence in Elections:
The rapid growth of generative artificial intelligence (AI), which can create text, images, and video in seconds in response to prompts, has heightened fears that the new technology could be used to sway major elections this year, as more than half of the world’s population is set to head to the polls.
Signatories of the tech accord, which was announced at the Munich Security Conference, include companies building generative AI models used to create content, including OpenAI, Microsoft, and Adobe.
Other signatories include social media platforms that will face the challenge of keeping harmful content off their sites, such as Meta Platforms, TikTok, and X, formerly known as Twitter.
Commitments Under the Accord:
The agreement includes commitments to collaborate on developing tools to detect misleading AI-generated images, video, and audio; to create public awareness campaigns that educate voters about deceptive content; and to take action on such content on their services.
The companies said technology to identify AI-generated content or certify its origin could include watermarking or embedding metadata.
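The accord does not prescribe a particular technique, but the metadata approach can be illustrated with a minimal sketch. The Python example below, which assumes the Pillow library is installed and is only an illustrative stand-in for real provenance standards such as C2PA Content Credentials (not an implementation of them), writes a small provenance record, with a generator name, timestamp, and a hash of the pixel data, into a PNG's text chunks and reads it back to check the image has not been altered. The function names and record fields are hypothetical.

```python
import hashlib
import json
from datetime import datetime, timezone
from typing import Optional

from PIL import Image, PngImagePlugin


def embed_provenance(src_path: str, dst_path: str, generator: str) -> None:
    """Embed a simple provenance record in a PNG text chunk.

    Illustrative only; real systems would use signed credentials
    such as C2PA rather than plain, unsigned metadata.
    """
    img = Image.open(src_path)

    # Hash the pixel data so later edits to the image can be detected.
    pixel_hash = hashlib.sha256(img.tobytes()).hexdigest()

    record = {
        "generator": generator,  # e.g. the AI model that produced the image
        "created": datetime.now(timezone.utc).isoformat(),
        "pixel_sha256": pixel_hash,
    }

    meta = PngImagePlugin.PngInfo()
    meta.add_text("ai_provenance", json.dumps(record))
    img.save(dst_path, pnginfo=meta)


def read_provenance(path: str) -> Optional[dict]:
    """Return the embedded provenance record, plus a consistency check."""
    img = Image.open(path)
    raw = img.text.get("ai_provenance")  # PNG text chunks are exposed via .text
    if raw is None:
        return None

    record = json.loads(raw)
    # Re-hash the pixels to flag images altered after they were tagged.
    current_hash = hashlib.sha256(img.tobytes()).hexdigest()
    record["pixel_hash_matches"] = current_hash == record.get("pixel_sha256")
    return record
```

Because the record here is unsigned, it could be stripped or forged; production systems pair such metadata with cryptographic signatures or robust watermarks so the provenance claim survives tampering.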
Implementation Plans and Timeline:
The accord did not specify a timeline for meeting the commitments or how each company would implement them.
“I think the utility of this (accord) is the breadth of the companies signing up to it,” said Nick Clegg, president of global affairs at Meta Platforms.
Real-World Examples of Deceptive AI Influence:
Generative AI is already being used to influence politics and even to discourage people from voting.
Despite the popularity of text-generation tools like OpenAI’s ChatGPT, the tech companies will focus on preventing the harmful effects of AI-generated photos, videos, and audio, partly because people tend to be more skeptical of text, said Dana Rao, Adobe’s chief trust officer, in an interview.
“There’s an emotional connection to audio, video, and images,” he said. “Your brain is wired to believe that kind of media.”