Associated Press Coverage of the Facebook Papers – Alana Bonfiglio 11/18

In the first Associated Press article addressing the Facebook Papers, Amid the Capitol riot, Facebook faced its own insurrection, Alan Suderman and Joshua Goodman show that Facebook missed critical signs of the planning of the Jan. 6 Capitol riot on its platform, causing employee frustration. On Jan. 6, one employee wrote on an internal message board, “Haven’t we had enough time to figure out how to manage discourse without enabling violence?” Suderman and Goodman report that new internal documents showed that “Facebook put its growth and profits ahead of public safety.” According to Suderman and Goodman, Facebook had procedures in place to stop the spread of dangerous or violent content that were rolled back after the 2020 election. The company said it is not responsible for the actions of the rioters and that having stricter controls in place prior to that day wouldn’t have helped. In 2019, internal research was done using a fake Facebook profile. Researchers indicated a preference for conservative media such as Fox News and followed humor groups that mocked liberals. The profile embraced Christianity and was a fan of Melania Trump. Within a single day, Facebook’s algorithm was recommending extremist content such as QAnon. A week later, the test subject’s feed contained posts claiming Obama wasn’t born in the U.S. and linking the Clintons to the murder of a former Arkansas state senator. The researcher recommended safety measures such as removing content with known conspiracy references, disabling “top contributor” badges for misinformation commenters, and lowering the threshold number of followers required before Facebook verifies a page administrator’s identity. Despite being widely supported by Facebook employees, it is unclear whether these suggestions were implemented.

Two days later, AP published Facebook dithered in curbing divisive user content in India. In this piece, Sheikh Saaliq and Krutika Pathi show that Facebook in India has “been selective” in preventing the spread of hate speech, misinformation and inflammatory posts, particularly anti-Muslim content. Documents show that Facebook has been aware of this content for years, which has led people to question whether the company has done enough to address it. Saaliq and Pathi reported that Facebook documents identified India as one of the most “at risk countries” in the world for hate speech, yet the company didn’t have enough local-language moderators or content-flagging in place to stop misinformation that at times led to real-world violence. Similarly to the research done in the U.S., a Facebook employee in India created a page that did nothing but follow pages and groups recommended by the platform itself. After three weeks, the employee said they were shocked at how the newsfeed had “become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”

In its next piece, People or profit? Facebook papers show deep conflict within, Barbara Ortutay argues that Facebook the company is losing control of Facebook the product. According to Ortutay, thousands of pages of documents provided to Congress have shown that Facebook, despite good intentions of connecting the world, has “sidelined” problems it has exacerbated or even created. One former Facebook employee said Mark Zuckerberg had dictatorial power over the company by holding the majority of voting shares, controlling its board of directors and surrounding himself with executives who don’t question him. Ortutay also argues that young people see Facebook as a place for old people, which limits the company’s future growth. Because of the loss of the younger demographic in the U.S. and Western Europe, Facebook has begun pushing growth in other countries. However, where it went wrong is its failure to provide systems to limit the spread of hate speech, misinformation and violence among networks of millions of people across the world, such as in Afghanistan and Myanmar. Internal efforts to address these problems, as well as algorithms that facilitate human trafficking, teen suicide and more, have been pushed aside when solutions conflict with growth. Facebook claims it does not prioritize engagement over safety and has invested $13 billion and devoted over 40,000 people to keeping people safe on Facebook. According to one former employee, Facebook surveys what percentage of employees believe that Facebook is making the world a better place. Over the two years this employee spent at Facebook, that number dropped from 70% to 50%.

In a piece published the same day, Apple once threatened Facebook ban over Mideast maid abuse, Jon Gambrell and Jim Gomez report that concerns that the company’s products were being used to trade and sell women in the Middle East led Apple to threaten in 2019 to pull Facebook and Instagram from its app store. Despite public promises of increased vigilance, Facebook acknowledged in internal documents that it was “under-enforcing on confirmed abusive activity.” Facebook said it has been combating human trafficking on its platform for many years, yet the problem persists today. Gambrell and Gomez write that a quick search for “khadima,” or “maids” in Arabic, brings up accounts with posed photographs of African and South Asian women, ages and prices listed next to their pictures. Many women go abroad expecting legitimate work, only to be met with horrific conditions. Gambrell and Gomez write that many people believe Facebook has an obligation to combat this issue. Facebook proposed a pilot program to begin in 2021 that targeted Filipinas with pop-up messages warning them about the dangers of working overseas, but it is unclear whether it was implemented.

In another piece published the same day, Facebook language gaps weaken screening of hate, terrorism, Isabel Debre and Fares Akram show Facebook’s systemic failings in moderating political speech in the Middle East, which have at times “muzzled” it. In May, Instagram banned the hashtag #AlAqsa, which refers to a mosque in Jerusalem’s Old City. Facebook later apologized and said its algorithm had mistaken the landmark for the militant group Al-Aqsa Martyrs Brigade. Internal documents have shown there are many more instances like this one, even outside of the Arabic language. The director of the Middle East Institute’s Cyber Program said the problem is that Facebook was not built to mediate the political speech of everyone in the world. According to Debre and Akram, Facebook does not have enough moderators who speak local languages. In the Facebook Papers, the company acknowledged it failed to stop the spread of hate speech in Myanmar. Facebook pledged to recruit more staff who speak local dialects, including 100 native Myanmar language speakers, to police its platforms. However, it is unclear how many content moderators were actually hired.

On October 26, AP published Facebook froze as anti-vaccine comments swarmed users, in which David Klepper and Amanda Seitz show that Facebook’s delayed action in fighting misinformation about the COVID-19 vaccine may have been due to concerns about profit. In March, Facebook employees presented research that suggested altering how vaccine posts were ranked in newsfeeds to reduce exposure to misleading information and direct users to “legitimate” sources like the WHO. However, Facebook did not make some of the suggested changes and delayed others for a month. According to critics, this delay was about not wanting to lose engagement on controversial posts, which ultimately creates profit. According to Klepper and Seitz, internal documents reveal that Facebook investigated the spread of vaccine misinformation on its platform and acknowledged unanswered employee proposals for mitigation. Facebook also did very little to combat misinformation in comments, which on vaccine-related content were largely anti-vaccine.

The Associated Press’ coverage of the Facebook Papers seemed very sound. All of the reporting appeared credible, and Facebook was given a chance to respond to every criticism. I find it interesting that no two articles were written by the same journalists. I think it is a massive benefit that such a well-established news outlet has the resources to engage so many journalists on the project. I imagine it reduces bias, as opposed to one journalist, who inherently has one perspective, working on all of these pieces. I found all of the articles to be well-written, well-researched and extremely interesting.

One thought on “Associated Press Coverage of the Facebook Papers – Alana Bonfiglio 11/18”

  1. Reading a different news source, I find it interesting to see how different writers discuss viewpoints on the same topics. I agree with you that, from a New York Times coverage perspective, the arguments seemed very sound. However, unlike the Associated Press, I did not see the NYT give Facebook much room to respond to these points. I think doing so would have made the NYT’s arguments more effective.
