Facebook on Monday said a human rights report it commissioned on its presence in Myanmar showed it had not done enough to prevent its social network from being used to incite violence.

The report by San Francisco-based non-profit Business for Social Responsibility (BSR) recommended that Facebook more strictly enforce its content policies, increase engagement with both Myanmar officials and civil society groups, and regularly release additional data about its progress in the country.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” Alex Warofka, a Facebook product policy manager, wrote in a blog post.

BSR also warned that Facebook must be prepared to handle a likely onslaught of misinformation during Myanmar’s 2020 elections, as well as new problems as use of its WhatsApp messaging service grows in the country, according to the report, which Facebook released.

A Reuters special report in August found that Facebook had failed to promptly heed numerous warnings from organisations in Myanmar about social media posts fuelling attacks on minority groups such as the Rohingya.

In August 2017 the military led a crackdown in Myanmar’s Rakhine State in response to attacks by Rohingya insurgents, pushing more than 700,000 Muslims into neighbouring Bangladesh, according to UN agencies.

Facebook in August removed several Myanmar military officials from the platform to prevent the spread of “hate and misinformation”, the first time it had banned a country’s military or political leaders. It also removed dozens of accounts for engaging in a campaign that “used seemingly independent news and opinion pages to covertly push the messages of the Myanmar military”.
The move came hours after United Nations investigators said the army had carried out mass killings and gang rapes of Muslim Rohingya with “genocidal intent”.

Facebook said it has begun correcting its shortcomings. The company now has 99 Myanmar-language specialists reviewing potentially questionable content, and it has expanded its use of automated tools to reduce the distribution of violent and dehumanising posts while they undergo review.

In the third quarter, the company said it “took action” on about 64,000 pieces of content that violated its hate speech policies. About 63 per cent were identified by automated software, up from 52 per cent in the prior quarter.

Facebook has roughly 20 million users in Myanmar, according to BSR, which warned that the company faces several unresolved challenges there. Locating staff in the country, for example, could aid Facebook’s understanding of how its services are used locally, but BSR said those workers could be targeted by the military, which the UN has accused of ethnic cleansing of the Rohingya.