ETHNIC VIOLENCE: There have been repeated outbreaks of communal violence in Myanmar. In March, a United Nations investigator said Facebook had been used to incite hatred against the Rohingya. REUTERS/Soe Zeya Tun
Inside Facebook’s Myanmar operation
Hatebook
A REUTERS SPECIAL REPORT
Why Facebook is losing the war on hate speech in Myanmar
https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/
Reuters found more than 1,000 examples of posts, comments and pornographic images attacking the Rohingya and other Muslims on Facebook. A secretive operation set up by the social media giant to combat the hate speech is failing to end the problem.
By STEVE STECKLOW
Filed Aug. 15, 2018, 3 p.m. GMT
YANGON, Myanmar – In April, Facebook founder Mark Zuckerberg told U.S. senators that the social media site was hiring dozens more Burmese speakers to review hate speech posted in Myanmar. The situation was dire.
Some 700,000 members of the Rohingya community had recently fled the country amid a military crackdown and ethnic violence. In March, a United Nations investigator said Facebook was used to incite violence and hatred against the Muslim minority group. The platform, she said, had “turned into a beast.”
Four months after Zuckerberg’s pledge to act, here is a sampling of posts from Myanmar that were viewable this month on Facebook:
One user posted a restaurant advertisement featuring Rohingya-style food. “We must fight them the way Hitler did the Jews, damn kalars!” the person wrote, using a pejorative for the Rohingya. That post went up in December 2013.
Another post showed a news article from an army-controlled publication about attacks on police stations by Rohingya militants. “These non-human kalar dogs, the Bengalis, are killing and destroying our land, our water and our ethnic people,” the user wrote. “We need to destroy their race.” That post went up last September, as the violence against the Rohingya peaked.
A third user shared a blog item picturing a boatload of Rohingya refugees landing in Indonesia. “Pour fuel and set fire so that they can meet Allah faster,” a commenter wrote. The post appeared 11 days after Zuckerberg’s Senate testimony.
The remarks are among more than 1,000 examples Reuters found of posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook as of last week. Almost all are in the main local language, Burmese. The anti-Rohingya and anti-Muslim invective analyzed for this article – which was collected by Reuters and the Human Rights Center at UC Berkeley School of Law – includes material that’s been up on Facebook for as long as six years.
The poisonous posts call the Rohingya or other Muslims dogs, maggots and rapists, suggest they be fed to pigs, and urge that they be shot or exterminated. The material also includes crudely pornographic anti-Muslim images. The company’s rules specifically prohibit attacking ethnic groups with “violent or dehumanising speech” or comparing them to animals. Facebook has also long had a strict policy against pornographic content.
The use of Facebook to spread hate speech against the Rohingya in the Buddhist-majority country has been widely reported by the U.N. and others. Now, a Reuters investigation gives an inside look at why the company has failed to stop the problem.
For years, Facebook – which reported net income of $15.9 billion in 2017 – devoted scant resources to combating hate speech in Myanmar, a market it dominates and one with regular outbreaks of ethnic violence. In early 2015, only two of the people reviewing problematic posts for Facebook could speak Burmese. Before that, most of the people reviewing Burmese content spoke only English.
To this day, the company continues to rely heavily on users reporting hate speech in part because its systems struggle to interpret Burmese text.
Even now, Facebook doesn’t have a single employee in the country of some 50 million people. Instead, it monitors hate speech from abroad. This is mainly done through a secretive operation in Kuala Lumpur that’s outsourced to Accenture, the professional services firm, and codenamed “Project Honey Badger.”
According to people familiar with the matter, the project, which handles many Asian countries, hired its first two Burmese speakers, who were based in Manila, just three years ago. As of June, Honey Badger had about 60 people reviewing reports of hate speech and other content posted by Myanmar’s 18 million active Facebook users. Facebook itself in April had three full-time Burmese speakers at a separate monitoring operation at its international headquarters in Dublin, according to a former employee.
Honey Badger employees typically sign one-year renewable contracts and agree not to divulge that the client is Facebook. Reuters interviewed more than a half-dozen former monitors who reviewed Southeast Asian content.
A Facebook official said outsourcing its content monitoring is more efficient because the companies it uses are specialists in ramping up such operations. He declined to disclose how many Burmese speakers the company has on the job worldwide, saying it was “impossible to know, to be definitive on that.”
“It’s not enough,” he added.
HATE MONITORS: Facebook reviews hate speech in Myanmar from an outsourced operation run by Accenture and codenamed Project Honey Badger. The poster refers to employees as “silent heroes.” Source: Via Facebook
For many people in this emerging economy, Facebook is the internet: It’s so dominant, it’s the only site they use online. Yet, the company ignored repeated warnings as far back as 2013 that it faced trouble.
Researchers and human rights activists say they cautioned Facebook for years that its platform was being used in Myanmar to promote racism and hatred of Muslims, in particular the Rohingya.
“They were warned so many times,” said David Madden, a tech entrepreneur who worked in Myanmar. In a talk he gave in 2015 at Facebook’s headquarters in Menlo Park, California, he said, he told company officials that the platform was being exploited to foment hatred. About a dozen Facebook people attended the meeting in person, including Mia Garlick, now the company’s director of Asia Pacific policy, he said. Others joined via video. “It couldn’t have been presented to them more clearly, and they didn’t take the necessary steps,” Madden said.
In a statement, Garlick told Reuters: “We were too slow to respond to the concerns raised by civil society, academics and other groups in Myanmar. We don’t want Facebook to be used to spread hatred and incite violence. This is true around the world, but it is especially true in Myanmar where our services can be used to amplify hate or exacerbate harm against the Rohingya.”
She added that Facebook is focused on addressing challenges that are unique to Myanmar “through a combination of people, technology, policies and programs.” The company also said it has banned several “hate figures and organizations” on Facebook in Myanmar.
Facebook’s struggles in Myanmar are among much broader problems it faces. Zuckerberg’s congressional testimony in April primarily focused on the company’s mishandling of user data, whether it censors conservative views and Russia’s exploitation of Facebook to meddle in the 2016 U.S. presidential election.
“Cut off those necks of the sons of the dog and kick them into the water”
April 2018
“Stuff pig’s fat inside the damn kalar’s mouth”
September 2017
“Pour fuel and set fire so that they can meet Allah faster”
April 2018
Of all of Facebook’s travails, though, Myanmar may be the bloodiest. The Myanmar military stands accused by the U.N. of having conducted a brutal campaign of killings, mass rape, arson and ethnic cleansing against the Rohingya. The government denies the allegations.
The social media giant doesn’t make public its data on hate speech in Myanmar. It says it has 2.2 billion global users and each week receives millions of user reports from around the world about problematic content.
In compiling examples of hate speech for this article, Reuters found some that Facebook subsequently removed. But the vast majority remained online as of early August.
After Reuters alerted Facebook to some of the derogatory posts included in this story, the company said it removed them. “All of it violated our policies,” it said.
Reuters itself sometimes flags to Facebook threats posted on the platform against its reporters. These include the Burmese journalists Wa Lone and Kyaw Soe Oo, who are on trial in Myanmar on charges of violating a state secrets law. The two were arrested in December while reporting on the massacre of 10 Rohingya men and have received a deluge of death threats on social media over their story. Facebook has removed such content several times at the news agency’s request.
‘Sending flowers’
Myanmar emerged from decades of military rule in 2011, but religious violence has marred its transition to democracy. In 2012, clashes in Rakhine State between ethnic Rakhine, who are Buddhists, and the Rohingya killed scores of people and left 140,000 displaced – mostly Muslims.
Facebook’s extraordinary dominance in Myanmar began taking root around the same time. But not by design.
As recently as six years ago, Myanmar was one of the least connected countries on earth. In 2012, only 1.1 percent of the population used the internet and few people had telephones, according to the International Telecommunication Union, a U.N. agency. The junta that had ruled the country for decades kept citizens isolated.
That all changed in 2013, when a quasi-civilian government oversaw the deregulation of telecommunications. The state-owned phone company suddenly faced competition from two foreign mobile-phone entrants from Norway and Qatar.
UBIQUITOUS: A cellphone user looks at a Facebook page in a shop in downtown Yangon in early August. For many in Myanmar, Facebook is the internet. REUTERS/Ann Wang
RAPID GROWTH: Facebook’s active users in Myanmar rose from 1.2 million in 2014 to 7.3 million in 2015, 11 million in 2016, 15 million in 2017 and 18 million in 2018. Sources: We Are Social, Hootsuite, Kepios and Facebook
The price of SIM cards dropped from more than $200 to as little as $2 and people purchased them in droves. By 2016, nearly half the population had mobile phone subscriptions, according to GSMA Intelligence, the research arm of the industry’s trade association. Most purchased smartphones with internet access.
One app went viral: Facebook. Many saw it as an all-in-one solution, offering messaging, news, videos and other entertainment. It also became a status symbol, said Chris Tun, a former Deloitte consultant who advised the government. “If you don’t use Facebook, you’re behind,” he said. “Even grandmas, everyone was on Facebook.”
To capture customers, Myanmar’s mobile phone operators began offering a sweet deal: use Facebook without paying any data charges.
“Facebook should be sending flowers to me, because we have been an accelerator for bringing the penetration,” said Lars Erik Tellmann, who until July was chief executive of Telenor Myanmar, part of Norway’s Telenor Group. “This was an initiative we took fully on our own. And this was extremely popular.”
In Myanmar today, the government itself uses Facebook to make major announcements, including the resignation of the president in March.
‘Genocide all of the Muslims’
In the fall of 2013, Aela Callan, an Australian documentary filmmaker studying at Stanford University, began a project on hate speech and false reports that had spread online during conflicts between Buddhists and Rohingya Muslims the prior year. In June 2012, at least 80 people had died in riots and thousands of Rohingya were moved into squalid internment camps. Anti-Rohingya diatribes appeared on Facebook. One Buddhist nationalist group set up a page called the “Kalar Beheading Gang.”
In November 2013, she met at Facebook’s California headquarters with Elliott Schrage, vice president of communications and public policy. “I was trying to alert him to the problems,” she said.
Emails between the two show that Schrage put Callan in touch with internet.org, a Facebook initiative to bring the internet to developing countries, and with two Facebook officials, including one who worked with civil-society organizations to assist the company in coping with hate speech.
“He didn’t connect me to anyone inside Facebook who could deal with the actual problem,” she said.
Asked for comment, Schrage referred Reuters to a press person at Facebook. The company didn’t comment on the meeting.
Matt Schissler, a doctoral student at the University of Michigan, said that between March and December 2014, he held discussions with Facebook officials in a series of calls and online communications. He told them how the platform was being used to spread hate speech and false rumors in Myanmar, he said, including via fake accounts. He and other activists provided the company with specific examples, including a Facebook page in Burmese that was called, “We will genocide all of the Muslims and feed them to the dogs.” The page was removed.
ARMY CRACKDOWN: A car stands next to a house that was burned down in Maungdaw in Rakhine State last year during a military campaign that the United States has denounced as ethnic cleansing. Human rights activists and researchers say they warned Facebook for years that its platform was being used to spread hate speech against Muslims in Myanmar. REUTERS/Soe Zeya Tun
Schissler belonged to a private Facebook group that was set up so that Myanmar human rights activists, researchers and company employees such as Asia Pacific policy chief Garlick could discuss how to cope with hate speech and other issues. The activists brought up numerous problems with Facebook’s multi-step reporting system for problematic content. As one example, they cited a photograph of an aid worker in Rakhine State in a post that called him “a traitor to the nation.” It had been shared 229 times, according to messages reviewed by Reuters.
One of the private group’s members had reported it to Facebook as harassment of an individual but later received a message back: “We reviewed the photo you reported for containing hate speech or symbols and found it doesn’t violate our Community Standards.” After multiple complaints over six weeks, a Facebook employee finally explained that the takedown request had been rejected because only the photo had been reported, not the comment above it. The post eventually was taken down.
In March 2015, Schissler gave a talk at Facebook’s California headquarters about new media, particularly Facebook, and anti-Muslim violence in Myanmar. More than a dozen Facebook employees attended, he said.
Two months later, Madden, the tech entrepreneur, gave a talk at Facebook headquarters about tensions and violence between Buddhists and Muslims. He said he showed a doctored picture, which had spread on Facebook, of the country’s de facto leader, Aung San Suu Kyi, who is Buddhist, wearing a Muslim head scarf. The image, Madden said, was meant to imply she was sympathetic to Muslims – a “very negative message” in Myanmar.
“The whole point of this presentation was really just to sound the alarm, to show very vividly the context in which Facebook was operating, and already the evidence of how it was being misused,” he said. He left the meeting thinking his audience took the talk seriously and would take action.
“May the Rakhine people … and all Myanmar citizens be free from the dangers of sons of a dog, grandchildren of a pig kalar, and rapists”
September 2013
“Just feed them to the pigs”
October 2016
“If it’s kalar, get rid of the whole race”
October 2016
Madden had founded a technology hub and start-up accelerator in Yangon called Phandeeyar. He said he and others involved with the venture interacted with Facebook “many dozens” of times over the next several years, including via email, in the private Facebook group and in person, showing how the network’s systems for detecting and removing dangerous content were ineffective. He isn’t sure what steps the company took in response. “The central problem is that the mechanisms that they have to pull down hate speech in a timely way, before it does real world harm, they don’t work,” he said.
Madden and Jes Kaliebe Petersen, Phandeeyar’s chief executive, said Facebook was still relying too much on their group and other volunteers to report dangerous posts. “It shouldn’t be incumbent on an organization like ours or people who happen to be well-connected with folks inside Facebook to report these things,” Petersen said.
In April, shortly before Zuckerberg’s Senate testimony, Phandeeyar and five other Myanmar groups blasted him for claiming in an interview with Vox that Facebook’s systems had detected and removed incendiary messages in September last year. “We believe your system, in this case, was us,” they wrote. Zuckerberg apologized.
Back in 2014, tech organizations and researchers weren’t the only ones sounding alarms with Facebook. So was the Myanmar government.
In July of that year, riots broke out in the central city of Mandalay after false rumors spread online, on Facebook and elsewhere, that a Muslim man had raped a Buddhist woman. A Buddhist man and a Muslim man were killed in the fighting.
The Myanmar government asked Tun, then a Deloitte consultant, to contact the company. He said he didn’t succeed at first, and the government briefly blocked Facebook.
Tun said he eventually helped to arrange meetings between the government and Facebook. “What they promised to do was, when you spot fake news, you could contact them via email,” Tun said of Facebook. “And they would take action – they were willing to take down pages after their own verification process.”
The government began reporting cases to Facebook, but Tun said he quickly realized the company couldn’t deal with Burmese text. “Honestly, Facebook had no clue about Burmese content. They were totally unprepared,” he said. “We had to translate it into English for them.”
‘I don’t know the language’
In August 2013, Zuckerberg announced a plan to make the internet available for the first time to billions of people in developing countries.
“Everything Facebook has done has been about giving all people around the world the power to connect,” he said. The company would now work, he added, to make “internet access available to those who cannot currently afford it.”
But in Myanmar, the language barrier would cause trouble. Most people here don’t speak English. Although Myanmar users at the time could post on Facebook in Burmese, the platform’s interface – including its system for reporting problematic posts – was in English.
FACING SCRUTINY: Facebook CEO Mark Zuckerberg, seen testifying here on Capitol Hill in April, was quizzed over Facebook’s failure to stem hate speech on its platform in Myanmar. For years, the social media giant invested scant resources in combating hate speech in the country. REUTERS/Leah Millis
Making matters worse, the company’s operation for monitoring content in Burmese was meagre.
In 2014, the social media behemoth had just one content reviewer who spoke Burmese: a local contractor in Dublin, according to messages sent by Facebook employees in the private Facebook chat group. A second Burmese speaker began working in early 2015, the messages show.
In Manila – the original site of the outsourced Project Honey Badger – there were no content reviewers who spoke Burmese. People who reviewed Myanmar content there spoke English.
“In cases like hate speech where we didn’t understand the language, we would say, ‘I don’t know the language,’” said a person who worked there. “So the client had to solve that,” the person said, referring to Facebook.
By 2015, Facebook had around four Burmese speakers reviewing Myanmar content in Manila and Dublin. They were stretched thin: that year Facebook had 7.3 million active users in Myanmar.
Accenture slowly began to hire more Burmese speakers. With the help of volunteer translators, Facebook also introduced a Burmese-language interface.
By 2016, the Honey Badger project had moved to Kuala Lumpur after Accenture convinced Facebook it would be easier to recruit Burmese speakers and others to work in Malaysia’s capital than in Manila, according to a person familiar with the matter.
In an office tower in Kuala Lumpur, teams of content monitors are assigned to handle different Asian countries, not just Myanmar. They are paid around $850 to $1,000 a month and are often employed by temporary staffing agencies, according to ex-employees and online recruitment ads.
Facebook said in a statement: “We’ve chosen to work only with highly reputable, global partners that take care of their employees, pay them well and provide robust benefits – this includes Accenture in Asia Pacific.”
A spokesperson for Accenture confirmed it partners with Facebook. “The characterization of our operations as ‘secretive’ is misleading and confidentiality is in place primarily to protect the privacy and security of our people and the clients we serve,” the spokesperson said.
The communications man
Former content monitors said they often each had to make judgments on 1,000 or more potentially problematic content items a day, although that number is now understood to be lower.

Facebook’s complete rules about what is and isn’t allowed on its platform are spelled out in its internal community standards enforcement guidelines, which the company made public for the first time in April. The guidelines define hate speech as “violent or dehumanising speech, statements of inferiority, or calls for exclusion or segregation” against people based on their race, ethnicity, religious affiliation and other characteristics.
In response, Facebook said: “Content reviewers aren’t required to evaluate any set number of posts … We encourage reviewers to take the time they need.”
A Facebook official also told Reuters the community standards policy is global, “but there are local nuances,” such as slurs, that content reviewers who are native speakers can consider when making decisions. But former content monitors told Reuters the rules were inconsistent; sometimes they could make exceptions and sometimes they couldn’t.
Former content monitors also said they were trained to err on the side of keeping content on Facebook. “Most of the time, you try to give the user the benefit of the doubt,” said one former Facebook employee.
The ex-monitors said they sometimes had as little as a few seconds to decide if a post constituted hate speech or violated Facebook’s community standards in some other way. They said they didn’t actually search for hate speech themselves; instead, they reviewed a giant queue of posts mostly reported by Facebook users.
MARKET PENETRATION: The sun rises behind the entrance sign to the Facebook headquarters in Menlo Park, California, in 2012, a time when only 1.1 percent of people in Myanmar used the internet. Six years on, the company has 18 million users in the country, about a third of the population. REUTERS/Beck Diefenbach/File Photo
Many of the millions of items flagged globally each week – including violent diatribes and lurid sexual imagery – are detected by automated systems, Facebook says. But a company official acknowledged to Reuters that its systems have difficulty interpreting Burmese script because of the way the fonts are often rendered on computer screens, making it difficult to identify racial slurs and other hate speech.
Facebook’s troubles are evident in a new feature that allows users to translate Burmese content into English. Consider a post Reuters found from August of last year.
In Burmese, the post says: “Kill all the kalars that you see in Myanmar; none of them should be left alive.”
Facebook’s translation into English: “I shouldn’t have a rainbow in Myanmar.”
In response, Facebook said: “Our translations team is actively working on new ways to ensure that translations are accurate.” The company said it uses a different system to detect hate speech.
Guy Rosen, vice president of product management, wrote in a blog post on Facebook in May about the problems the company faced in identifying hate speech. “Our technology still doesn’t work that well and so it needs to be checked by our review teams,” he wrote.
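The script problem the company describes is consistent with a well-documented quirk of Burmese computing: much of Myanmar types in the Zawgyi font encoding, which reuses the standard Unicode code points for Burmese with different meanings, so text filters built for one encoding can silently fail on the other. As an illustration only – a minimal sketch assuming Google’s open-source myanmar-tools project and its “myanmartools” Python package, neither of which is mentioned in this article – here is how a moderation pipeline might screen incoming text for its encoding before attempting any slur matching:

```python
# Illustrative sketch only, not Facebook's actual pipeline. Assumes the
# open-source "myanmartools" package (from Google's myanmar-tools
# project), which is not mentioned in the article: pip install myanmartools
from myanmartools import ZawgyiDetector

detector = ZawgyiDetector()

def screen_encoding(text: str, threshold: float = 0.95) -> str:
    """Label Burmese text as likely Zawgyi-encoded or likely Unicode.

    Zawgyi reuses Unicode's Myanmar code points with different glyph
    meanings, so a slur list or classifier built on Unicode Burmese can
    miss the same words in Zawgyi text unless the encoding is detected
    and the text converted first.
    """
    score = detector.get_zawgyi_probability(text)
    if score > threshold:
        # A real pipeline would transliterate to Unicode here (for
        # example with ICU's "Zawgyi-my" transform) before classifying.
        return f"likely-zawgyi (p={score:.2f})"
    return "likely-unicode"
```

The function name, threshold and labels are hypothetical; the point is only that a system has to identify the encoding before keyword matching or classification of Burmese text can work reliably.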
Facebook officials say they have no immediate plans to hire any employees in Myanmar itself. But the company does contract with local agencies for tasks unrelated to content monitoring. One is Echo Myanmar, a communications firm whose managing director is Anthony Larmon, an American.
Larmon has expressed strong opinions on the Rohingya. Toward the end of 2016, the Myanmar army launched an onslaught across some 10 villages after Rohingya militants attacked border posts. At the time, a U.N. official accused the government of seeking “ethnic cleansing” of the Rohingya.
In November 2016, Larmon wrote that an article about the U.N. allegation was “misleading.” He cited what he said were claims by multiple “local journalists” that the ethnic minority “purposely exaggerate (lie about)” their situation to “get more foreign aid and attention.”
He also wrote: “No, they aren’t facing ethnic cleansing or anything remotely close to what that incendiary term suggests.” He said he later removed the post.
A Facebook spokesperson said that Larmon’s post “does not represent Facebook’s view.”
Larmon told Reuters: “It was overly-emotional, under-informed commentary on a highly nuanced subject that I do regret. My view on the Rohingya, same today as then, is that they should be safely repatriated and protected.”
The platform on which he aired his views about the Rohingya? Facebook.
Additional reporting by Tin Htet Paing, Simon Lewis, Shoon Naing and Aye Min Thant in Yangon.
FLEEING: Rohingya refugees who fled an army crackdown in western Myanmar last year are seen here after crossing the border into Bangladesh in October. REUTERS/Jorge Silva
Facebook isn’t alone
By STEVE STECKLOW
Facebook isn’t the only social-media platform that contains hate speech against Rohingya Muslims. It also has proliferated on Twitter.
In Myanmar, Twitter is far less popular than Facebook. But after Rohingya insurgents attacked police stations in August 2017, sparking an army crackdown that forced 700,000 people to abandon their homes, hundreds of new Twitter accounts suddenly sprang up in Myanmar.
Many of the tweets on these accounts appeared to be attempts to counter sympathetic portrayals of the Rohingya by the Western news media and human rights activists. They portrayed the ethnic minority as illegal immigrants from neighboring Bangladesh, or “Bengalis.” Members of the ethnic group regard themselves as native to Rakhine State in western Myanmar, but the country has denied most of them citizenship.
Some tweets on these accounts were in broken English:
“There is no Rohingya in Myanmar they are only illegal immigrant and terrorists.”
“They are Originally Bangalis, Illegally migrants and Land Robbers”
These and similar tweets could still be found online this month. Twitter’s “Hateful conduct policy” forbids attacking groups of people on the basis of race, ethnicity or national origin, or engaging in “behavior that incites fear about a protected group.”
Twitter removed a number of tweets flagged to it by Reuters.
Matthew Smith, co-founder of Fortify Rights, a Southeast Asia-based human rights group, said that after the attacks by Rohingya insurgents in August last year he noticed “suspicious” accounts suddenly following him on Twitter.
Reuters analyzed Smith’s new Twitter followers in the aftermath of the attacks with assistance from two Twitter analytics services, ExportTweet.com and Mentionmapp Analytics.
The analysis showed that more than 1,200 of the new Twitter accounts following Smith were created between August 27 and August 31, at the height of the military crackdown against the Rohingya in western Rakhine State. A review of 564 of those accounts found that 349, or 62 percent, were anti-Rohingya, according to John Gray, Mentionmapp’s co-founder. Mentionmapp did not analyze the viewpoints of the other accounts.
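For readers curious how such a tally can be produced: a hypothetical sketch along the following lines – not the ExportTweet.com or Mentionmapp tooling Reuters actually used – could count one account’s followers by the date each follower account was created, using Twitter’s v1.1 API as it existed in 2018 via the tweepy library (the screen name and credentials are placeholders):

```python
# Hypothetical sketch, not Reuters' method. Uses Twitter's old v1.1 API
# via tweepy (as of 2018); credentials and screen name are placeholders.
from collections import Counter
from datetime import date

import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth, wait_on_rate_limit=True)

# Tally followers of one account by the date each follower account
# was created, to expose bursts of newly created accounts.
created = Counter(
    user.created_at.date()
    for user in tweepy.Cursor(api.followers, screen_name="example_user").items()
)

start, end = date(2017, 8, 27), date(2017, 8, 31)
burst = sum(n for d, n in created.items() if start <= d <= end)
print(f"Follower accounts created Aug. 27-31, 2017: {burst}")
```

Any concentration of creation dates inside a narrow window, like the August 27-31 burst described above, would stand out in such a tally.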
A report by Mentionmapp found that the emergence of these accounts probably wasn’t automated – meaning they weren’t bots – but appeared to be mainly a short-lived, “orchestrated” anti-Rohingya campaign designed to resemble a grassroots movement. Thirty-one percent of the 1,239 new accounts stopped tweeting by the end of September and became dormant.
Gray said he couldn’t “confirm or determine there’s a central organization/operator behind all of the behavior.” Mentionmapp’s report also stated: “It’s fair to say Facebook wasn’t the only home to ‘hate speech’ directed at the Rohingya.”
Hatebook
By Steve Stecklow
Photo editing: Dharmasari Haroun
Design: Troy Dunkley
Visual editor: Sarah Slobin
Edited by Peter Hirschberg