
Singapore seeks ‘online safety’ rules to remove or block harmful social media content
- Examples of content that could be blocked reportedly include viral social media challenges that encourage young people to perform dangerous stunts
- The rules would push platforms to take greater responsibility for user safety and shield users from harmful content, said communications minister Josephine Teo
Singapore is seeking new online safety rules that would require social media platforms to implement community standards and content moderation processes to shield users from harmful content, communications minister Josephine Teo said in a Facebook post on Monday. The guidelines, which will undergo public consultation starting next month, would push social media companies to take greater responsibility for user safety, she said.

Singapore has long defended the need for laws to police content on the internet, saying the island nation is especially vulnerable to fake news and misinformation campaigns given that it’s a financial hub with a multi-ethnic population that enjoys widespread internet access.
The Southeast Asian nation joins countries such as Australia, Germany and Britain, which have enacted or proposed online content and safety laws.
Governments are concerned about the platforms' influence in an era when more and more people get their news online and through social media.
Under the proposed rules, examples of content that could be blocked include live-streamed videos of mass shootings and viral social media challenges that encourage young people to perform dangerous stunts, according to The Straits Times. The rules would also take into account sensitive issues such as race and religion, the newspaper added.
“Online safety is a growing concern and Singapore is not alone in seeking stronger safeguards for our people,” Teo said in her post. “There is a growing global movement pushing to enhance online safety, recognising harms come along with the good when people engage on social media.”
