The Speaker
Saturday, 20 July 2024 – 07:59

Why the way Facebook can be utilised needs urgent addressing

We now live in a world where mass shootings can be live streamed and watched all over the globe.

The deaths of 50 people in Christchurch, New Zealand last week have reignited debate over whether social media firms do enough to regulate their sites.

Facebook has said the 17-minute video was viewed fewer than 200 times while it was live, and that it was not first reported until at least 12 minutes after the broadcast had ended.

The report came only after a user on 8chan, an alt-right sharing site, posted a link to a copy of the video there, giving it a wider audience.

In the first 24 hours after the shootings, 1.5 million videos of the attack were removed, but some edited copies and recordings proved harder for the company to censor.

Facebook has given billions of people the ability to broadcast live while remaining unable to moderate that content in real time.

The company, valued at nearly $140 billion, has rightly been heavily fined recently over data breaches and privacy violations, but it has yet to face comparable fines for the hateful content that plagues its platform.

Cloaked in memes and other niche forms of internet humour, hate speech perpetrated by the extreme far right can be hard for Facebook to spot, presenting a growing problem as this political sect has become adept at using social media to spread its ideology.

New Zealand Prime Minister Jacinda Ardern has criticised Facebook’s handling of the footage, and her criticism has been echoed by other prominent politicians.

She is right to point out that the attacker sought notoriety from his actions, and even more so in declaring “you won’t hear me speak his name” to avoid giving him that satisfaction.

In a letter to Shinzo Abe, Prime Minister of Japan and chair of the 2019 G20 Summit, Australia’s Prime Minister, Scott Morrison, said: “it is unacceptable to treat the internet as an ungoverned space.”

Home Secretary Sajid Javid said in the Daily Express that social media platforms “have a responsibility not to do the terrorists’ work for them.”

Perhaps there are some routes the UK government could take here, with inspiration from what is happening elsewhere.

Germany caused a stir with the implementation of its Network Enforcement Act (NetzDG) in January 2018 that requires social media firms to remove “obviously illegal” hate speech within 24 hours of being notified of it, or face fines of up to £44 million.

Commonly known as Germany’s ‘Facebook Law’, the act has come under scrutiny from free speech advocates who consider its requirements too restrictive.

To others, it is seen as a test of whether sites like Facebook can be counted on to separate free speech from hate speech on their platforms.

Tweets and posts from far-right politicians in Germany have been taken down since the law was implemented, but it has also had the effect of censoring commentary on, and satire of, such speech.

One German tabloid, Bild, said in an op-ed the NetzDG should be scrapped as it was turning some far-right politicians into “opinion martyrs”.

Reuters reported that, six months after the NetzDG came into force, Facebook had employed 65 people to handle content under the newly imposed act.

According to the same report, 1,704 complaints were made under the law between January and June 2018, with 362 posts removed.

The UK may not want an exact copy of this legislation, but it shows how these tech giants can be held to account for what is on their platforms through the threat of significant fines.

If safeguards can be put in place that protect free speech and legitimate commentary on hateful content still online, legislation along these lines in the UK would restrict the spread of hateful, inflammatory and inciting content.

Last week’s tragic events in New Zealand showed how dangerously this content can manifest itself. At least some responsibility lies with our government to ensure something similar doesn’t happen here, and even more rests on Facebook to stop hateful content becoming widely accessible in any form.

This article was originally posted by Gurjeet Nanrah, a student journalist at Nottingham Trent University.
