‘Deepfakes’ could trigger election security alert, gov’t officials say
News | 07/09/2019 4:50 pm EST
Artificial intelligence technology allowing videos to be seamlessly manipulated for political purposes could potentially meet a national public interest threshold designated by a panel assigned to weed out election security concerns, federal officials said Tuesday.
The panel of senior civil servants assigned to make extraordinary announcements, if necessary, about possible manipulation attempts on the October federal election has been familiarizing itself with what could trigger the Critical Election Incident Public Protocol (CEIPP), government officials said in a technical briefing. The panel will be meeting during the summer for further consultation, before the writ period in September thrusts it into action if required.
The CEIPP was established in May and “sets out responsibilities for both the digital media platforms and the Government of Canada to help safeguard this fall’s election and to support healthy political discourse and open public debate online,” a government guide said.
Examples of what could trigger a “last resort” panel announcement were noted in a technical briefing on Tuesday.
“The threshold triggering the use of the CEIPP will be limited to addressing exceptional circumstances that could impair Canada’s ability to have a free and fair election, whether based on a single incident or an accumulation of incidents,” a press release said. “Determining whether this threshold has been met will require considerable judgement.”
The Critical Election Incident Public Protocol lays out a simple, clear and impartial process by which Canadians should be notified of a threat to the integrity of the 2019 General Election. https://t.co/c58SCR6O1R #ProtectingDemocracy pic.twitter.com/hXLG7tI1LF
— Canadian Democracy (@CdnDemocracy) July 9, 2019
One example used during the technical briefing was the use of machine learning technology known as deepfakes, which can make it appear that someone said something they in fact did not say. That technology, which rose to prominence in or around 2017, has been in the headlines in recent months after the surfacing of manipulated videos of Facebook Inc. CEO Mark Zuckerberg and U.S. Speaker of the House Nancy Pelosi. Experts this publication previously spoke to noted that social media, with its automated accounts that amplify certain messages and, in this case, spread deepfake videos, will challenge the federal election landscape.
“It is possible, yes,” an official said in an email about deepfakes rising to an election threat level, “however, it is entirely dependent on context and situation and subject to the panel’s judgement.”
Any panel announcement will be delivered to the Prime Minister, the major political parties, and Elections Canada, all of which will get the same briefing information. Once a decision has been made by the panel about what to announce, it cannot be vetoed by anybody, including the Prime Minister. The announcement will include what is known about the event and guidance for Canadians on how to protect themselves.
An extreme result of an interference case could mean the postponing of the election.
The panel is made up of the clerk of the privy council, the Prime Minister’s national security and intelligence advisor, the deputy minister of justice and deputy attorney general, the deputy minister of public safety, and the deputy minister of foreign affairs.
The government said the panel’s diverse background will help it capture a broad body of expertise on a wide array of threats.
The panel will be looking at issues of national importance, not something that can be contained in a single riding, which could include an actor’s covert attempt to misrepresent a party’s position, officials said.
Officials added that the panel, which must reach consensus before making any announcement, will also assess the scope of the harm: whether Canadians are taking the issue “to heart,” whether it is impacting political campaigns, whether the harm is being neutralized by civil actors, like journalists, and whether the panel could make a useful contribution to mitigating the problem.
The panel will draw on international empirical examples to help it with decision-making, such as the 2017 email leaks associated with the campaign of Emmanuel Macron ahead of the French presidential election and manipulation seen in the 2016 U.S. presidential election, both of which have been linked to Russian interference.
The cabinet directive said the panel’s focus will be on foreign interference, even though a threat “may emanate from domestic and/or foreign actors.”
In April, the Communications Security Establishment announced that it was “very likely” that there would be some foreign “interference” attempt on the October election. The next month, Facebook Inc. and Microsoft Corp. signed on to a declaration of principles around the promotion of electoral integrity online, aimed at combatting the spread of disinformation on social media sites in the run-up to the 2019 federal election. Twitter Inc. and Alphabet Inc. did not sign on at the time.
That announcement followed a fall CSE update that suggested the election landscape was still ripe for “opinion-shaping.”
Officials on Tuesday also said the government expects social media companies, as it has said in the past, to help mitigate any such interference and to abide by the CEIPP. That can include educating users on disinformation. The government also reiterated its hope that journalists would be able to correct any misinformation disseminated on such platforms.
On Tuesday, one official said those companies have been doing a good job taking down misinformation, though democratic institutions minister Karina Gould has said the government expects more proactive disclosure from them on what they are doing to combat malign actors.
There is no guarantee that the panel will ever make an announcement. Asked whether the government expects one, an official said they hope it never comes to that, “but we have to be ready.”