Brain Game: If you could add a new rule to all social media platforms, what would it be?
- Each week, our readers vote for their favourite answer, and the contestant with the fewest votes is eliminated
- This week, contestants say what rules should exist for Facebook, Instagram, Twitter and more
Social media companies should hire psychologists to review all posts, photos and videos that discuss mental illness before users can upload them.
This would help stop the spread of misinformation about mental health, especially posts that make mental illness sound like something to aspire towards rather than a condition to properly address. The psychologists would instead approve posts that offer resources and support for working through these problems.
Regulating content about mental health would especially help young people, who are the main users of social media, because they lack the judgement to filter out harmful content.
For example, there have been cases in which people saw and reposted content about suicide on their social media accounts before ending their own lives. Had this rule been in place, perhaps they would instead have found helpful information directing them to a hotline or a professional therapist.
Social media companies have a responsibility to provide their users with content that helps their mental well-being, instead of causing more damage.
Have you ever been exposed to mean, violent or inappropriate content on the internet? Well, 56 per cent of 11- to 16-year-olds have seen explicit and worrying material online. Oftentimes, this exposure happens on social media.
To fight this, social media needs a system where users can vote (after verifying that they are not a bot) to remove inappropriate content. After a post reaches a certain number of votes, it will be removed, and the person who posted it will receive a warning encouraging them to be more careful with their words and to reflect on how they can improve.
There will be different ways of making sure this rule is working: users will read the rule announcement and any warnings they receive, and social media companies will work with the government to ensure the rule is keeping harmful content off their platforms.
Putting this into place will create a safe and inclusive environment for all!
I would get rid of the existing algorithms, which are programmes that decide what individual users see on their social media platforms.
It is scary how much these algorithms have changed humanity today. Because social networks use algorithms to show people content they are more likely to pay attention to, they end up feeding people one-sided information. This causes users to see only posts and videos that repeat what they already believe.
In the end, these social networks cause people to ignore other voices and perspectives, and society becomes more and more divided.
In light of this, social media companies need to create a system that balances different points of view, and that values truth and unity, instead of division.