Facebook: Nick Clegg says 'no evidence' of Russian interference in Brexit vote


There is "absolutely no evidence" Russia influenced the Brexit result using Facebook, the company's vice-president, Sir Nick Clegg, has said.

The former deputy PM told the BBC the company had carried out analyses of its data and found no "significant attempt" by outside forces to sway the vote.

Instead, he argued that "the roots to British euroscepticism go very deep".

In a wide-ranging interview, Sir Nick also called for more regulation of Facebook and other tech giants.

In response, Damian Collins, chair of the Digital, Culture, Media and Sport select committee, tweeted that Sir Nick was wrong to suggest that there was no Russian interference on Facebook during the referendum, linking to research carried out by a communications agency.

Sir Nick, the former leader of the Liberal Democrats and deputy prime minister during the coalition government, was hired by Facebook in October last year.

In an interview with BBC Radio 4's Today programme, he said Facebook was now arguing for greater regulation of tech firms.

He said there was a "pressing need" for new "rules of the road" on privacy, election rules, the use of people's data and adjudicating on what constitutes hate speech.

It follows growing criticism of the tech giant and calls from MPs for far stricter regulation over issues including fake news, harmful content and the way user data is used.

Asked whether Facebook should not be fixing some of these issues itself, Sir Nick said it was not something big tech companies "can or should" do on their own.

"It's not for private companies, however big or small, to come up with those rules. It is for democratic politicians in the democratic world to do so," he said.

But he stressed companies like Facebook should play a "mature role" in advocating - rather than shunning - regulation.

Deputy Labour leader Tom Watson tweeted that Sir Nick's proposed remedy of an oversight committee was inadequate.


'Conspiracy'

In the interview, Sir Nick dismissed claims that data analytics firm Cambridge Analytica influenced people's decision to vote Leave in the EU referendum in 2016.

Media caption: Sir Nick Clegg tells Today that "the roots to British euroscepticism go very deep"

"Much though I understand why people want to sort of reduce that eruption in British politics to some kind of plot or conspiracy - or some use of new social media through opaque means - I'm afraid the roots to British Euroscepticism go very, very deep," he said.

Instead, he argued attitudes had been influenced far more by "traditional media" over the last 40 years than by new media.

The scandal around the way data was used by Cambridge Analytica was first exposed by Carole Cadwalladr, an investigative journalist at the Guardian newspaper.

Image caption: Guardian journalist Carole Cadwalladr exposed the Cambridge Analytica data scandal (Image source: Reuters)

Christchurch attack video

Sir Nick also claimed the company was getting better at removing harmful content, saying it was a "matter of minutes" before the first video of the Christchurch mosque shooting was removed.

A video of March's attack, in which 51 people were killed, was livestreamed on Facebook.

The issue, he said, was the huge number of people, including mainstream media outlets, who reposted that initial video afterwards.

"In the case of Facebook, I think 200 people saw the video as it was being livestreamed," he said.

But in the 24 hours following the shooting, Sir Nick said Facebook took down 1.5 million versions of the video. He said about 1.3 million of those were removed before they were reported.

Self-harm images

Sir Nick was also asked about how well Instagram - which is owned by Facebook - was responding to images of self-harm on the platform.

After 14-year-old Molly Russell took her own life in 2017, her family found distressing material about depression and suicide on her Instagram feed.

Sir Nick said Instagram had spent a lot of time with experts on teenage mental health and had been told it was "important to allow youngsters to express their anguish", including allowing them to post images of self-harm.

Image caption: Molly Russell died in November 2017 (Image source: PA Wire)

"We have now shifted things dramatically. We take down all forms of graphic content. The images that are still available on Instagram have a sort of filter, if you like, so they can't be clearly seen," he said.

On wider attitudes towards the sector, Sir Nick said there had been a shift in recent years from "tech utopia" - where people like Facebook's Mark Zuckerberg "could do no wrong" - to a culture of "tech phobia".

But he cautioned against any excessive backlash against technology: "I think we end up with the risk that we throw the baby out with the bathwater and make it almost impossible for tech to innovate properly."

"Technology is not good or bad," he said. "Technology down the ages is used by good and bad people for good and bad ends."

Facebook's recent enthusiasm for regulation marks a contrast with Mark Zuckerberg's previous refusals to meet UK politicians to discuss the spread of fake news and inappropriate content.

Having once dismissed the notion that Russia used Facebook to try to interfere with the 2016 US presidential election as "a pretty crazy idea", Zuckerberg was forced to backtrack when it became apparent that state actors had indeed been posting material deliberately designed to divide opinion.

The tech giant now realises that regulation is inevitably coming its way, and perhaps feels it's more strategic for it to be as involved as it can be in the creation of any new rules and whichever body would enforce them.

There are many examples of occasions when Facebook has failed to self-police - no simple task with 2.3 billion users posting their own material in real time - and by playing ball with national or international regulation it perhaps absolves itself of some of that heavy responsibility.