Facebook’s recommendation algorithm amplified military propaganda and other material that breached the company’s own policies in Myanmar following the military takeover in February, rights group Global Witness said in a report.
One month after the Burmese military seized power in the country and imprisoned elected leaders, Facebook’s algorithms were still prompting users to view and “like” pro-military pages with posts that incited and threatened violence, pushed misinformation that could lead to physical harm, praised the military, and glorified its abuses, Global Witness said.
The group filtered its search results to show pages and selected the top result — a military fan page whose name translates as “a gathering of military lovers.”
Older posts on the page showed sympathy for Myanmar’s soldiers, and at least two urged young people to join the military — but none of the posts made since the coup contravened Facebook’s policies.
However, when Global Witness’ account “liked” the page, Facebook began recommending related pages carrying material that incited violence, made false claims of interference in last year’s election, and endorsed violence against civilians.