Facebook: the cyber police?

Joyce Wong, Renaissance College

Social network faces dilemma over the filtering of content on website

With more than 500 million members, Facebook would be the third-largest nation if it were a country. The website has become the world's largest social network and although it brings people together, it has its downsides, such as the dilemmas associated with free speech.

Recently, the Facebook team responsible for taking down content that is either illegal or against the rules came into the public eye when the WikiLeaks controversy caught the world in a whirlwind. It blocked WikiLeaks supporters from trying to organise attacks on corporate groups, but it did not take down WikiLeaks' own Facebook pages.

This stirred debate on whether Facebook is rightly policing the site's content and it revealed the network's difficulties in defining what is acceptable and what is not. Facebook's "police unit" monitors everything from controversies to religious prejudice to cyber bullying. And with the power to remove any content deemed inappropriate, it appears to have the ultimate control on "free speech" inside the network.

This puts a huge responsibility on the organisation in that it is accountable for preventing cyber aggression and abuse among its users. It is impossible to please everyone and despite efforts, complaints about the site's efficiency at eradicating offensive messages still arise. But is it really fair to hold the network responsible for all the messages posted?

It is undeniable that Facebook has the responsibility of keeping the network a safe and tolerant place for users, but ultimately, it cannot be accountable for the behaviour of everyone. It is impossible to combat intolerance completely in the real world, let alone online.

There have been cases of parents complaining about their children's safety on the network. Parents have also complained that the website has not tried to improve its tools to protect young users.

Internet safety is crucial, but to eliminate online threats to children, or anyone else for that matter, self-regulation is required. Strengthening privacy settings on Facebook alone does not solve the problem.

With free speech comes inevitable problems concerning hate and harassment. It is impossible to dictate what can or cannot be said online, and what is perceived as acceptable or offensive differs for everyone. The fairest approach perhaps would be to either allow everything or ban everything that reveals even the slightest trace of bigotry. But is that realistic?

Facebook did not ask to become the cyber police; it can only do so much to make the network a utopia for all.