With more than 500 million members, Facebook would be the third-largest nation if it were a country. It has become the world's largest social network, and although it brings people together, it has its downsides, chief among them the dilemmas surrounding free speech.
Recently, the Facebook team responsible for taking down content that is either illegal or against the rules came into the public eye when the WikiLeaks controversy caught the world in a whirlwind. The team blocked WikiLeaks supporters who were trying to organise attacks on corporate groups, but it did not take down WikiLeaks' own Facebook pages.
This stirred debate over whether Facebook is policing the site's content appropriately, and it revealed the network's difficulties in defining what is acceptable and what is not. Facebook's 'police unit' monitors everything from controversies to religious prejudice to cyberbullying. And with the power to remove any content deemed inappropriate, it appears to have ultimate control over 'free speech' inside the network.
This places a huge responsibility on the organisation, as it is held accountable for preventing cyber aggression and abuse among its users. It is impossible to please everyone, and despite its efforts, complaints about the site's effectiveness at eradicating offensive messages still arise. But is it really fair to hold the network responsible for every message posted?
It is undeniable that Facebook has a responsibility to keep the network a safe and tolerant place for its users, but ultimately it cannot be accountable for everyone's behaviour. It is impossible to combat intolerance completely in the real world, let alone online.
There have been cases of parents complaining about their children's safety on the network. Parents have also complained that the website has not tried to improve its tools to protect young users.