Facebook is taking action in Myanmar, the Southeast Asian country where the social network has been used to incite ethnic tension and violence, by banning four armed groups from its service.
The U.S. company said in a blog post that it has booted the rebel and insurgent groups — the Arakan Army (AA), the Myanmar National Democratic Alliance Army (MNDAA), the Kachin Independence Army (KIA) and the Ta’ang National Liberation Army (TNLA) — and that “all related praise, support and representation” will be removed.
The groups are among the many Ethnic Armed Organizations (EAOs) that exist in Myanmar, which was under military rule until elections in 2015 partially opened the country.
Explaining that the groups have violated its terms of service, Facebook said:
There is clear evidence that these organizations have been responsible for attacks against civilians and have engaged in violence in Myanmar, and we want to prevent them from using our services to further inflame tensions on the ground.
…
We recognize that the sources of ethnic violence in Myanmar are incredibly complex and cannot be resolved by a social media company, but we also want to do the best we can to limit incitement and hate that furthers an already deadly conflict.
The MNDAA was blamed for at least 30 deaths, including civilians, during a 2017 skirmish with security forces near the northern border with China, and the group responded to the incident through a Facebook group. In January of this year, the AA was said to have killed 13 government troops in an attack on Independence Day.
The internet only became widely available and affordable in Myanmar after 2015, and swathes of the country have since flocked to Facebook, which is widely regarded as ‘the internet’ there. Facebook has some 20 million users in Myanmar, which amounts to nearly all of the country’s internet users and nearly 40 percent of the population.
That digital revolution has allowed anyone to sign up, reach an audience and spread a message. That is true across the world, but the effect has been particularly extreme in Myanmar, where the country’s religious conflict has spilled into the digital arena in a short space of time and exacerbated existing problems.
Since August 2017, an estimated 700,000 Rohingya Muslims have fled Myanmar following the destruction of their homes and persecution in northern Rakhine State. Much of the violence is reported to have been state-led, and a UN fact-finding mission last year concluded that Facebook played a “determining role” in inciting the genocide.
In response, Facebook has introduced new security features and announced plans to increase its team of Burmese-language content reviewers to 100 people. It doesn’t intend to open an office in Myanmar, citing security concerns, but it has ramped up its efforts to expel bad actors. Beyond today’s removals, it has taken down scores of accounts and groups and has removed, among others, the commander-in-chief of the armed forces and the military-owned TV network Myawady from its platform.
Still, those on the ground are critical. Facebook, they say, is not fully committed to Myanmar, is moving too slowly and is asking too much of the volunteers and civic groups that have offered their assistance.
For one thing, the social network was roundly criticized after CEO Mark Zuckerberg suggested that it was using AI to remove hate speech in Myanmar. In reality, the company relied on locals reporting the content before it took action, a process that took days.
To skeptics, the latest removals look like more whack-a-mole tactics that address problems only once they have become problems. In banning EAOs, Facebook has also set a precedent that could sweep up otherwise legitimate organizations based on the actions of a minority of their members, government claims or other circumstances.