“We have 35,000 people working on safety and security”
“Britain needs to sort out its own electoral laws.”
That was the blunt message from Facebook this week, as it struggled to convince policy makers and pundits that it is doing enough to tackle fake news, misleading campaigning and opaquely funded political adverts.
“We shut down millions of fake accounts every day… We have 35,000 people working on [election] safety and security… every political ad is labelled,” said Rebecca Stimson, Facebook’s head of UK public policy, on a call.
“We have brought together an Elections Taskforce of people from our teams across the UK, EMEA and the US who are already working together every day to ensure election integrity on our platforms,” she added on Thursday.
“No Evidence of Widespread Foreign Operations”
Nathaniel Gleicher, head of cybersecurity policy, added: “My team has not seen evidence of widespread foreign operations aimed at the UK.”
(That comment came as former Conservative Party attorney general Dominic Grieve alleged that Prime Minister Boris Johnson had personally prevented publication of a parliamentary report into alleged Russian interference in the Brexit referendum, despite it having been signed off by the intelligence services.)
And when it comes to egregiously false claims made by candidates or parties on Facebook, that’s a matter for regulators, not the company, it said.
Detailing some of the changes Facebook has made to its own processes – with the upcoming election the first since they were made – the social media giant, which has over 36 million UK users, said that:
- Those running political ads need to undergo a verification process to prove who they are and that they are based in the UK;
- Every political ad is labelled: viewers can see who has paid for them;
- Anybody can click on any ad they see on Facebook and get more information on why they are seeing it, as well as block ads from particular advertisers;
- Political ads are kept in an Ad Library for seven years, so that everyone can see what ads are running, the types of people who saw them and how much was spent.
Comments Follow Parliament’s “Disinformation” Report
Yet as Facebook’s Stimson put it this week: “The UK has decided that there shouldn’t be rules about what political parties and candidates can and can’t say in their leaflets, direct mails, emails, billboards, newspaper ads or on the side of campaign buses.”
She added: “Questions around what constitutes a political ad, who can run them and when, what steps those who purchase political ads must take, how much they can spend on them and whether there should be any rules on what they can and can’t say – these are all matters that can only be properly decided by Parliament and regulators.”
“We believe UK electoral law needs to be brought into the 21st century to give clarity to everyone – political parties, candidates and the platforms they use to promote their campaigns.”
The call comes nine months after the Digital, Culture, Media and Sport (DCMS) Committee published its final report on Disinformation and ‘fake news’.
It identified a “disturbing disregard for voters’ personal privacy” in the use of data analytics in political campaigns, adding that evidence given to the Committee shows that current electoral law is not fit for purpose. “It has failed to reflect a move away from billboards and leaflets to online micro-targeted campaigning.”
As Damian Collins MP, Chair of the DCMS Committee, acknowledged at the time: “We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world.”
Fact-Checkers Have (Some) Influence…
To tackle misinformation and fake news on the platform, Facebook says it has enlisted the help of third-party fact-checkers, who are now reviewing content and rating its accuracy. In the UK this will be undertaken by Full Fact and FactCheckNI. These organisations will be sent content that has been flagged, both by people reporting suspicious posts and by machine learning systems.
However, these third-party groups are under no obligation to check this content, as Antonia Woodford, a Facebook product manager working on misinformation, notes: “These fact-checkers are independent organizations, so it is at their discretion what they choose to investigate.” And even if these groups do fact-check a post and find it deeply misleading, all Facebook has committed to do in that instance is heavily down-rank the content in the News Feed, so that “it’s seen by fewer people and far less likely to go viral.”