
Hate Speech in Myanmar Continues to Thrive on Facebook

Internal papers obtained by The Associated Press indicate that, years after being criticized for contributing to ethnic and religious bloodshed in Myanmar, Facebook is still having trouble recognizing and regulating hate speech and disinformation on its platform in the Southeast Asian country.

The company commissioned an investigation three years ago that found Facebook was being used in the country to “foment division and incite offline violence.” It promised to do better and developed a number of tools and policies to combat hate speech.

Those vulnerabilities have persisted since the military takeover on Feb. 1, 2021, which led to grave human rights abuses across the country, and have even been exploited by hostile actors.

Even now, posts threatening death and rape in Myanmar are not difficult to find on Facebook.

More than 56,000 people have viewed a 2 1/2-minute video posted on Oct. 24 by a military supporter calling for violence against opposition groups.

“So, from now on, we are the gods of death for all (of them),” the man says in Burmese as he stares into the camera. “Come back tomorrow and we’ll see if you’re genuine guys or gays.”

One account posted a military defector’s home address and a photo of his wife. Another post, from Oct. 29, shows troops escorting bound and blindfolded prisoners down a dirt path. “Don’t capture them alive,” the Burmese caption reads.

Despite the ongoing problems, Facebook internally treated Myanmar both as a model to export around the world and as an evolving, thorny problem. According to documents obtained by The Associated Press, Myanmar served as a testing ground for new content moderation technology, with Facebook trying to automate the detection of hate speech and disinformation with varying degrees of success.

Facebook’s internal discussions on Myanmar were disclosed in redacted filings with the Securities and Exchange Commission by the legal counsel of Frances Haugen, the former Facebook employee turned whistleblower, and provided to Congress. A consortium of news organizations, including The Associated Press, obtained the redacted versions received by Congress.

Facebook’s history in Myanmar is shorter but more turbulent than in most other countries. After decades of censorship under military rule, Myanmar was connected to the internet in 2000.

Soon after, Facebook partnered with local telecom carriers to let customers access the platform without paying for data, which was still expensive at the time. The platform’s popularity exploded. For many people in Myanmar, Facebook became the internet itself.

Around 2013, it also became “a hub for extremism,” according to Htaike Htaike Aung, a Myanmar internet policy advocate, coinciding with religious violence between Buddhists and Muslims across the country. It’s unclear how much, if any, human or automated content moderation took place at the time.

Htaike Htaike Aung said she met with Facebook that year to discuss problems such as how local groups were seeing an explosion of hate speech on the platform and how its preventive mechanisms, such as content reporting, didn’t work in Myanmar.

One example she cited was a photo of a pile of bamboo sticks shared with the caption, “Let us be prepared because there will be a riot that will happen within the Muslim community.”

The photo was reported to Facebook, but the company did not take it down because it did not violate any of its community standards, according to Htaike Htaike Aung.

“Which is absurd, since it was actually advocating violence. But Facebook didn’t think so,” she explained.

Years later, the international community took note of the lack of moderation. In March 2018, human rights experts investigating atrocities against Myanmar’s Muslim Rohingya minority said Facebook had played a role in spreading hate speech.

When asked about Myanmar at a U.S. Senate hearing a month later, Facebook CEO Mark Zuckerberg said the company planned to hire “dozens” of Burmese speakers to moderate content, work with civil society groups to identify hate figures, and develop new technology to combat hate speech.

Internal Facebook documents show that while the company stepped up its efforts to combat hate speech in the country, the tools and staffing to do so never fully materialized, and employees repeatedly raised the alarm. In a document dated May 2020, one employee reported that a hate speech text classifier was available but wasn’t being used or maintained. A month later, another document noted “serious holes” in Myanmar’s disinformation detection.

“I believe Facebook made symbolic moves to reassure policymakers that something was being done and that they didn’t need to go any further,” Ronan Lee, a visiting scholar at Queen Mary University of London’s International State Crime Initiative, said.

Rafael Frankel, Facebook’s director of policy for APAC emerging countries, said in an emailed response to The Associated Press that the platform “has developed a dedicated staff of over 100 Burmese speakers.” He declined to say exactly how many people it employs. Myanmar has roughly 28.7 million Facebook users, according to online marketing firm NapoleonCat.

On Nov. 8, Haugen testified before the European Parliament, criticizing Facebook for relying on automated systems to flag harmful content rather than investing in third-party fact-checking.

“If you focus on these automatic systems, they will not work for the world’s most ethnically diverse countries, with the world’s most linguistically varied regions, which are frequently the most vulnerable,” she said of Myanmar.

Following Zuckerberg’s congressional testimony in 2018, the company built digital tools to curb hate speech and disinformation, and set up a new organizational structure to handle global crises like Myanmar’s.

Facebook compiled a list of “at-risk countries” with ranked tiers for a “critical countries team” to focus on, along with a list of languages needing additional content moderation. Myanmar was designated a “Tier 1” at-risk country, with Burmese deemed a “priority language” alongside Ethiopian languages, Bengali, Arabic, and Urdu.
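Such a setup can be pictured as a simple routing configuration. The sketch below is purely illustrative and assumes nothing about Facebook’s actual systems; the specific Ethiopian languages and the `needs_extra_review` helper are hypothetical placeholders.

```python
# Illustrative sketch of a tiered risk configuration like the one
# described above. Structure and names are assumptions, not Facebook's.
AT_RISK_COUNTRIES = {
    "Myanmar": 1,   # "Tier 1" per the documents
    "Ethiopia": 1,
}

# Languages flagged for extra content moderation. Amharic and Oromo are
# stand-ins for the unnamed "Ethiopian languages."
PRIORITY_LANGUAGES = {"Burmese", "Amharic", "Oromo", "Bengali", "Arabic", "Urdu"}

def needs_extra_review(country: str, language: str) -> bool:
    """Route content for additional moderation if it originates in a
    Tier 1 country or is written in a priority language."""
    return AT_RISK_COUNTRIES.get(country) == 1 or language in PRIORITY_LANGUAGES
```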

Facebook engineers taught the company’s automated systems Burmese slang terms for “Muslims” and “Rohingya.” They also trained systems to detect “coordinated inauthentic behavior,” such as a single person posting from multiple accounts, or different accounts coordinating to share the same content.
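One way to surface that second signal is to look for near-identical text posted by many distinct accounts within a short window. This is a minimal sketch under that assumption, not Facebook’s actual detector; all thresholds and names are invented.

```python
# Minimal sketch of one "coordinated inauthentic behavior" signal: many
# distinct accounts posting near-identical text within a short window.
from collections import defaultdict
from datetime import timedelta

def normalize(text: str) -> str:
    """Crude normalization so trivially edited copies group together."""
    return " ".join(text.lower().split())

def find_coordinated_texts(posts, min_accounts=5, window=timedelta(hours=1)):
    """posts: iterable of (account_id, timestamp, text) tuples.
    Returns texts posted by >= min_accounts accounts within `window`."""
    by_text = defaultdict(list)          # normalized text -> [(ts, account)]
    for account_id, ts, text in posts:
        by_text[normalize(text)].append((ts, account_id))

    flagged = []
    for text, events in by_text.items():
        events.sort()                    # order by timestamp
        for start_ts, _ in events:
            accounts = {acct for ts, acct in events
                        if start_ts <= ts <= start_ts + window}
            if len(accounts) >= min_accounts:
                flagged.append(text)
                break
    return flagged
```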

The company also experimented with “repeat offender” demotion, which reduced the impact of posts from users who frequently broke the rules. According to a 2020 memo included in the documents, in a test in two of the world’s most volatile countries, the demotion worked well in Ethiopia but poorly in Myanmar, a disparity that perplexed engineers.

“We’re not sure why… However, this data serves as a starting point for future analysis and user study,” the document said. Facebook declined to say whether the issue has been resolved in the year since it was identified, or whether the two tools have been effective in Myanmar.
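The documents don’t spell out how the demotion worked. One plausible shape, sketched here purely as an assumption, is a ranking multiplier that shrinks with an account’s recent violation count.

```python
# A minimal sketch, assuming (not documented) that "repeat offender"
# demotion acts as a ranking multiplier: each recent violation shaves
# off a fraction of a post's distribution, down to a floor, so content
# is demoted rather than removed. All parameters are illustrative.
def demotion_multiplier(violations_90d: int,
                        penalty: float = 0.25,
                        floor: float = 0.1) -> float:
    return max((1.0 - penalty) ** violations_90d, floor)

def ranked_score(base_score: float, violations_90d: int) -> float:
    return base_score * demotion_multiplier(violations_90d)

# An account with 3 recent strikes keeps about 42% of its normal reach:
# ranked_score(1.0, 3) == 0.421875
```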

According to an internal 2020 document, the company also rolled out a method dubbed “reshare depth promotion” to curb the virality of content by favoring posts shared by a user’s direct connections. The study found this “content-agnostic” approach reduced the prevalence of viral inflammatory content by 25% and photo misinformation by 48.5%.
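In ranking terms, such an intervention can be read as a discount that grows with how far a post has traveled from its original author. The sketch below is an illustrative reading of that idea, not Facebook’s implementation; the decay rate is an invented parameter.

```python
# Illustrative sketch of a reshare-depth intervention: the approach is
# "content-agnostic" because only the length of the reshare chain
# matters, not what the post says.
def reshare_depth_weight(depth: int, decay: float = 0.5) -> float:
    """depth 0 = original post, 1 = a friend's share, 2 = a
    friend-of-a-friend's reshare, and so on."""
    return decay ** depth

def feed_score(base_score: float, reshare_depth: int) -> float:
    return base_score * reshare_depth_weight(reshare_depth)

# A direct connection's share keeps half its base score, while a post
# arriving through a 4-hop chain keeps 1/16: feed_score(1.0, 4) == 0.0625
```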

Staffers shared the Myanmar experience as part of a “playbook” for intervening in other at-risk nations including Ethiopia, Syria, Yemen, Pakistan, India, Russia, the Philippines, and Egypt.

Yet even as these new approaches, honed amid Myanmar’s strife, were being exported around the world, documents reveal that by June 2020 Facebook knew that gaps remained in its Myanmar safety efforts.

An internal assessment of the company’s “integrity coverage” discovered “substantial gaps in our coverage (particularly in Myanmar and Ethiopia), demonstrating that our existing signals may be inadequate.” Myanmar was colored red, with less than 55% coverage, which was worse than Syria but better than Ethiopia.

Haugen questioned the company’s internal practice of intervening “only once a crisis has developed.”

Facebook should “slow the platform down rather than observing as the temperature rises,” making it safer as a result, she told the British Parliament on Oct. 25.

Frankel, the Facebook spokesman, said the company has taken proactive measures.

“Facebook’s attitude in Myanmar now is substantially different from what it was in 2017,” Frankel said. “Accusations that we haven’t invested in safety and security in the nation are false.”

Nonetheless, a September 2021 study by the Myanmar Social Media Insights Project found that Facebook posts included organized targeting of activists, ethnic minorities, and journalists, a tactic with a long history in the military. The investigation also found the military laundering propaganda through public pages posing as media outlets.

According to an October report shared with the AP by Myanmar Witness, a U.K.-based organization that archives social media posts related to the conflict, opposition and pro-military groups have used the encrypted messaging app Telegram to organize two types of propaganda campaigns on Facebook and Twitter.

Myanmar is a “highly contested information environment,” the research found, in which users work together to overwhelm Facebook’s reporting system to get others’ posts removed, and to spread coordinated disinformation and hate speech.

According to Benjamin Strick, head of investigations at Myanmar Witness, the coordinated networks took footage of victims killed by the Sinaloa cartel in Mexico in 2018 and falsely presented it as evidence of the opposition killing Myanmar soldiers on June 28, 2021.

“Some of these platforms are so huge, and the teams looking for this content are so small, that it’s very tough to collect water when it’s coming out of a fire hydrant,” he said.

In late October, the group traced one soldier’s digital trail to the burning of 160 homes in the town of Thantlang. In one post, he posed in body armor on a ledge, blaming opposition forces for the destruction in a litany of aggressive language.

According to Facebook spokesman Frankel, the company “conducted human rights due diligence to evaluate and address the dangers in Myanmar,” and “blocked the military and employed technology to decrease the volume of infringing content.”

However, Myanmar digital rights advocates and academics believe Facebook can still improve by being more transparent about its content moderation, demotion, and removal procedures, as well as acknowledging its responsibility to Myanmar’s people.

“We need to start looking at the harm that platforms like Facebook have done to our communities. They present themselves as a virtual platform, which allows for less control,” said Lee, the visiting scholar. “The reality is, there are real-world ramifications.”

Cedric Blackwater
Cedric is a journalist with over a decade of experience reporting on local US news and a range of global topics. He is currently the lead writer for Bulletin News.
