Facebook CEO Mark Zuckerberg has called for governments and regulators to take “a more active role” in policing content online following the New Zealand terror attacks.
In an open letter released yesterday, Zuckerberg suggested that regulating the internet is too big a job for firms alone, saying that if we were starting from scratch in policing the web, “we wouldn’t ask companies to make these judgements alone”.
The statement comes more than two weeks after the New Zealand terrorist attacks were live-streamed on Facebook, in which 50 people were murdered by white supremacist Brenton Tarrant as they worshipped.
The firm’s reputation and transparency have been challenged by governments and regulators for some time, with Zuckerberg famously declining to answer questions at a committee hearing last year, where parliamentary committees from five countries called on him to attend an inquiry into fake news and disinformation.
In a co-signed letter, the chairs of the five committees wrote: “We call on you… to take responsibility to Facebook users, and to speak in person to their elected representatives”, but despite this, he still did not appear.
But the 34-year-old boss has seemingly changed his tune.
“Lawmakers often tell me we have too much power over speech, and frankly I agree,” he wrote.
He suggested that regulation is needed in four areas: harmful content, election integrity, privacy and data portability, along with the creation of an independent body so people are able to appeal Facebook’s decisions.
Regarding harmful content, Zuckerberg said internet companies should be held accountable for enforcing standards, but stated that it is “impossible” to remove all such content from the web because of the number of sharing services that have their own policies and processes.
Nevertheless, Facebook still holds itself up as an example of how high standards are enforced online, despite its reluctance to address the Cambridge Analytica scandal and the dissemination of Russian disinformation during the 2016 US presidential campaign.
In the open letter released on Saturday, Zuckerberg said Facebook already publishes “transparency reports” on how effectively it removes harmful content, and suggested that every other major internet service should do the same, calling such reporting “just as important as financial reporting”.
He said: “Once we understand the prevalence of harmful content, we can see which companies are improving and where we should set the baselines.”
But notwithstanding the statement, Facebook was the platform on which Brenton Tarrant’s terrorist attack was live-streamed and shared 1.5 million times, and the firm clearly still has a way to go.