
Social media threats challenge gov’t, platforms ahead of election

News | 08/30/2018 3:12 pm EST
Photo via Pxhere.

In the three years since the last Canadian federal election, emerging technologies such as automated bots have rapidly evolved to make it easier to spread misinformation on social media platforms, which experts say has the potential to confuse and radicalize the voting public ahead of next year’s election.

These new forms of deceit can take the shape of targeted advertisements on social media designed to mislead Canadians, for example about where their polling station is, while new technologies like artificial intelligence and automated accounts, or bots, make it harder to separate good information from bad, experts agreed. Illustrating this advancing technology are “deepfakes,” artificial-intelligence-generated videos that first emerged in 2017 and that seamlessly superimpose a person’s likeness onto footage they never appeared in.

The 2019 general election will be Canada’s first since the Cambridge Analytica revelations unveiled an exploitation of social media data to influence elections around the world, and the first since it was uncovered that foreign interference plagued elections in a number of other countries.

“We know that people are using social media in different ways, and one of those tactics that we’ve seen in other countries is this attempt to confuse people and obfuscate reality,” Elizabeth Dubois, a professor at the University of Ottawa, said in a phone interview. “Even in the last election [in 2015], it was relatively difficult for the average citizen to create automated accounts on Twitter, Facebook, or other sites, but that’s getting easier and easier,” she added.

In fact, an annual report by the Commissioner of Canada Elections published in May this year noted that at the time of the 2015 election, “emerging technologies had posed relatively little concern to the enforcement of Canada’s election laws.”

The commissioner added in the report that “the shift towards the use of social media, both by political and non-political entities, would most likely give rise to issues that the Act is not currently designed to accommodate.” The Canada Elections Act has no specific rules about bots or social media platforms.

Dubois said that while she doesn’t have data or evidence suggesting voter suppression tactics of this kind happen in Canada, they have happened in other jurisdictions.

Erinn Atwater, research director at Vancouver-based non-profit Open Privacy Research Society, said in an email that such tactics could take the form of incorrect dates and locations for polling stations, or incorrect information about the identification required to vote.

Bret Schafer, social media analyst at United States-based advocacy group Alliance for Securing Democracy, said in an email that three years ago, “very few Canadians or Americans were aware of the tactics being deployed on social media by bad actors, and governments in both countries were doing little to address the problem and educate their publics.”

Back then, few were aware that automation was being used to manipulate trending algorithms, which determine what topics are seen and interacted with the most. Nor were many aware of the use of fake personas to infect public discourse, or of the number of fake websites and news outlets contaminating online conversation.
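To make the trending-manipulation tactic concrete, here is a minimal, hypothetical sketch of how a naive trending score based purely on recent mention counts can be inflated by a handful of automated accounts. The scoring rule, account names and hashtags are illustrative assumptions, not any platform’s actual algorithm.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical posts: (account, hashtag, timestamp). Real platforms weigh many
# more signals; this toy score simply counts mentions in the last hour.
now = datetime(2018, 8, 30, 15, 0)
window = timedelta(hours=1)

organic = [(f"user{i}", "#cdnpoli", now - timedelta(minutes=i)) for i in range(40)]
# Five bot accounts each posting the same hashtag 20 times within the window.
bots = [(f"bot{i}", "#fakepollinghours", now - timedelta(minutes=j))
        for i in range(5) for j in range(20)]

def trending_score(posts):
    """Count recent mentions per hashtag (a deliberately naive metric)."""
    recent = [(acct, tag) for acct, tag, ts in posts if now - ts <= window]
    return Counter(tag for _, tag in recent)

print(trending_score(organic + bots))
# Counter({'#fakepollinghours': 100, '#cdnpoli': 40}): the bot-driven hashtag
# outranks the organic one, which is why platforms now discount repetitive,
# automated activity.
```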

And while Schafer said we are now “far more aware” of the problem and better prepared to defend against such malevolent online influence, the tools used by malicious actors such as Russia to carry out these influence campaigns are now being adopted by many other actors, foreign and domestic. U.S. intelligence officials have said that Russia definitively meddled in the 2016 U.S. election.

“We haven’t figured out how to adequately defend ourselves and respond to the kinds of attacks that we saw in the 2016 U.S. presidential election, let alone threats associated with new and emerging technologies,” Schafer said. “That should be of great concern to the Canadian government.”

What’s also hazy is who could be behind some of these threats. Janis Sarts, director of the NATO Strategic Communications Centre of Excellence in Latvia, told The Hill Times this summer that while Russia might be the most obvious threat, there’s no way to determine who could be interested in targeting Canada. “I believe what we’re experiencing right now is that more and more countries are getting one or other capability in that area… it’s not necessarily they would choose to do it, but they would have some capacity,” he said. Sarts also noted that non-state actors, such as extremist groups, are also gaining these types of abilities, The Hill Times reported.

In acknowledging in its May report the increasing role social media will play in the 2019 election, the elections commissioner said it has urged, and will continue to urge, social media platforms to communicate with its office in order to “seek a firm commitment to doing everything in their power to facilitate the work” of that office, particularly as it relates to gathering evidence for its investigations.

The Wire Report reached out to Facebook Inc. and Twitter Inc. The former did not respond, but the latter sent a number of blog posts from its news page addressing these issues. Twitter’s initiatives include finding and eliminating more spam and malicious automated accounts while reducing their visibility. To illustrate, it said its systems in May “identified and challenged” more than 9.9 million “potentially spammy or automated accounts per week,” up from 6.4 million in December 2017 and 3.2 million in September 2017. It has also launched an ad transparency centre that allows anyone to search who is advertising what on the platform within the last seven days.
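As a rough illustration of how “potentially spammy or automated” accounts might be flagged in principle, the sketch below applies simple posting-rate and repetition heuristics. It is a simplified assumption for clarity, not Twitter’s actual system, and the thresholds and sample data are invented.

```python
from datetime import datetime, timedelta

def looks_automated(posts, max_per_hour=60, min_unique_ratio=0.2):
    """Flag accounts that post at inhuman rates or mostly repeat the same text.

    `posts` is a list of (timestamp, text) tuples; thresholds are illustrative.
    """
    if not posts:
        return False
    times = sorted(ts for ts, _ in posts)
    hours = max((times[-1] - times[0]).total_seconds() / 3600, 1.0)
    rate = len(posts) / hours
    unique_ratio = len({text for _, text in posts}) / len(posts)
    return rate > max_per_hour or unique_ratio < min_unique_ratio

start = datetime(2018, 8, 30, 9, 0)
bot_posts = [(start + timedelta(seconds=10 * i), "The vote is cancelled, stay home!")
             for i in range(500)]
human_posts = [(start + timedelta(minutes=30 * i), f"Post number {i}") for i in range(12)]

print(looks_automated(bot_posts))    # True: roughly 360 posts an hour, one repeated message
print(looks_automated(human_posts))  # False
```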

Similarly, Alphabet Inc. said in mid-August that it added a section to its twice-yearly transparency report to show who buys U.S. election ads and how much is spent on political ads on Google, according to a Reuters report.

Facebook has also launched a series of initiatives to combat what ails it, especially in the wake of the Cambridge Analytica revelations. It has removed a number of apps that were used to scrape data from users. Many apps use Facebook as a sign-up method, since it’s often more convenient than creating a separate log-in. To better protect personal information, Facebook now requires app developers to go through a review process before their apps are approved. It also said in August that users of its photo-sharing subsidiary Instagram will be able to check the authenticity of accounts, including which country they originate from and what ads they are running.

Facebook has also put in place initiatives to test more transparent ads, including political ones, and has previously added tools to combat false information on its platform. Most recently, the California-based company said it banned the accounts of Myanmar military officials, which were followed by 12 million people, for their role in what the United Nations is calling a genocide perpetrated against the minority Rohingya Muslim population. “We want to prevent them from using our service to further inflame ethnic and religious tensions,” Facebook said in the Aug. 27 blog post.

The Canadian government, as befits a democracy, has historically been agnostic toward content on the internet, which has led to the perception that it places more emphasis on cybersecurity than on information security.

The issue has therefore become a bit of a hot potato, because Facebook and Google deny being “arbiters of truth” — words they used before a parliamentary committee last year when addressing the issue of fake news.

But Nicky Cayer, spokeswoman for the Minister of Democratic Institutions Karina Gould, said in an email that social media and online platforms have become the “new arbiters of information,” and while they have “begun to take initial steps to address” issues such as foreign interference, data misuse, hate speech and misinformation, “it’s clear that much more needs to be done.”

She noted that Gould introduced Bill C-76 to modernize the elections process; among other measures, the proposed legislation eliminates foreign ad spending and makes it easier to prosecute players who try to influence the country’s elections. Third parties selling ad space would also not be allowed to knowingly run foreign-funded ads, though how the government would enforce this wasn’t made clear.

“We don’t really have good ways of defending against information operations,” Chris Parsons, managing director of the University of Toronto’s Citizen Lab, said in a phone interview. “One of the reasons is we are set up as democracies to be ill-prepared to respond to them,” because we value free speech, freedom of association and the ability to communicate freely.

Other departments within the government, for their part, have put out reports on both cyber threats and disinformation campaigns. The Communications Security Establishment (CSE) released one in June 2017 on cyber threats to the country’s democratic processes. That report concluded that while cyber threat activity against electoral events is increasing around the world, and Canada was targeted by “low-sophistication” cyber threat activity in 2015, the country did not suffer an infiltration that would have influenced the democratic process. The report added that the agency expects similar low-sophistication attempts next year.

It noted that adversaries spread disinformation and propaganda to shape the opinions of voters. Those strategies, the CSE added, include hijacking social media accounts, creating accounts that pose as trustworthy producers and disseminators of information, having groups of people, known as “troll farms,” spread propaganda in comment sections and on social media, and similarly using groups of computers controlled by one user, known as botnets, to spread that information.
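As an illustration of why the coordinated activity the CSE describes is detectable in principle, the sketch below groups accounts that post identical text within a short time window. It is a simplified assumption for explanatory purposes, not the CSE’s or any platform’s actual detection method, and the account names and messages are invented.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical posts: (account, text, timestamp).
posts = [
    ("acct_a", "Polls close at 5pm, no point going after work!", datetime(2018, 8, 30, 12, 0)),
    ("acct_b", "Polls close at 5pm, no point going after work!", datetime(2018, 8, 30, 12, 2)),
    ("acct_c", "Polls close at 5pm, no point going after work!", datetime(2018, 8, 30, 12, 3)),
    ("acct_d", "Looking forward to voting this fall.", datetime(2018, 8, 30, 12, 5)),
]

def coordinated_clusters(posts, window=timedelta(minutes=10), min_accounts=3):
    """Flag identical texts posted by several distinct accounts within the window."""
    by_text = defaultdict(list)
    for acct, text, ts in posts:
        by_text[text].append((acct, ts))
    flagged = {}
    for text, items in by_text.items():
        times = [ts for _, ts in items]
        accounts = {acct for acct, _ in items}
        if len(accounts) >= min_accounts and max(times) - min(times) <= window:
            flagged[text] = sorted(accounts)
    return flagged

print(coordinated_clusters(posts))
# {'Polls close at 5pm, no point going after work!': ['acct_a', 'acct_b', 'acct_c']}
```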

The agency said 51 per cent of Canadians receive news from digital sources first and 22.4 million Canadians access Facebook daily.

In its February 2018 report called Security Challenges of Modern Disinformation, the Canadian Security Intelligence Service (CSIS) said that the nature of the internet and social media has “partially displaced” conventional journalism with a “torrent of data from an infinite number of originators,” which features a “current of lies and distortions that threatens the integrity of public discourse, debate and democracy.”

The goal of these “adversaries,” as the CSE calls them, isn’t immediately to delegitimize the election, Parsons said, but rather to “make it very challenging for a governing party to be able to find a consensus and govern.” That, he added, makes it difficult for politicians to run as moderates, causing Parliament to be that much more chaotic. “You can imagine all the political parties, I would presume, are the targets of third-party adversaries on a regular basis.”

To be sure, the CSE notes in its report that part of discrediting a political party or politician is to flood them with “obscene or misleading information, which can fool voters and embarrass the politician.” The intention is that this information will make mainstream news and knock the party “off message, even if only temporarily.” The chilling effect on democracy, it added, is that this could discourage qualified candidates from running for office, as they seek to avoid the potential negative impact on their personal lives and reputations.

Depending on the timing, it said, that could become a “major turning point in a close election campaign.”

To illustrate, it is now widely considered that Russia meddled in Britain’s June 2016 referendum on leaving the European Union. A January 2018 minority report from the U.S. Senate foreign relations committee noted that the influence resembled what was seen in other elections thereafter, including the U.S.’s own later that year, with “common elements” such as “false or inflammatory stories circulated by bots and trolls [and] allegations of cyber hacking.” That can’t be a coincidence, the report suggested.

The U.S. National Security Agency (NSA) also said Russians meddled in the 2017 French election, according to a report from Wired.

“I don’t think the federal government is worried enough,” Nathan Cullen, MP and NDP critic for democratic reform, said in a phone interview. Cullen recalled that, at one point, people didn’t mind getting targeted ads, but said it has “all increased so much and it’s become so much more invasive.”

Cullen said more regulations need to be imposed on these platforms, similar to ones that are placed on radio stations and newspapers. Michael Pal, a University of Ottawa law professor, said earlier this year during a panel discussion about the 2019 election that social media companies should be treated like broadcasters — in that they should be held to similar campaign advertising standards.

CSE spokesman Ryan Foreman said in an email that the organization has seen an upward trend in threat activity against democratic processes around the world over the past five years, and that it is highly probable the volume and sophistication of that activity will increase in the next year and beyond. He said the agency has held “productive meetings” with political parties, parliamentarians and electoral officials to discuss the issues, and that it will release an updated public report on cyber threats to Canada’s democratic institutions early next year.

Meanwhile, Melanie Wise, assistant director at Elections Canada, said in an email that the independent agency will be running a public information campaign to “tell electors when, where, and how to register and vote, using TV and radio ads, web content, videos, social media, mailings, etc.” She said the agency will also be working with all the relevant government bodies domestically and around the world to tackle the issue.

But some are not so sure how serious an impact these campaigns can have on political discourse. Wilfrid Laurier University political science professor Jason Roy said in a phone interview that the impact of these kinds of influences is “fairly small, if there at all.” That’s in part because there’s either too much information, meaning the average citizen may simply block it out, or they will look for “reinforcing” messages. Dubois noted that these “reinforcing” messages lead to “echo chambers,” which rob people of a fuller view of issues.

“The more attention it receives in the mainstream, that certainly increases the chances that they receive it,” Roy added. “Whether they accept it, is something different.”

While it’s anyone’s guess whether foreign actors will affect next year’s election in a tangible way, from Parsons’ point of view, it doesn’t help that certain provisions in the government’s proposed new national security legislation could invite such interference. Bill C-59 proposes to enhance the CSE’s powers, the most relevant being the ability to engage in active cyber operations.

That, Parsons said, could be viewed by the Kremlin as authorization for governments to meddle in each other’s elections, which could make Canada “fair game” for interference.

— With reporting by Ahmad Hathout at ahathout@thewirereport.ca and editing by Anja Karadeglija at akarad@thewirereport.ca
