11/18 Reports by CNBC and CBS on the Facebook Papers – Deirdre Kelshaw

Reporting on the Facebook Papers, internal company files made public by whistleblower and former Facebook product manager Frances Haugen, CBS and CNBC collectively describe internal concerns about Facebook's responsibility for controlling the dangerous content that circulates among its users.

Reading the articles and watching the videos on CBS and CNBC, it's clear that Facebook has consistently prioritized the platform's growth over the safety of its users. For example, as described in CBS's article "Facebook internal documents show execs knew platform spread misinformation and failed to act at times," after the 2020 election Facebook researchers found that, despite the company's awareness of the growing number of Stop the Steal groups, whose members used the platform itself to plan the storming of the U.S. Capitol, and of their increasingly hateful and violence-inciting content, its response was lacking. In fact, the report goes on to say that Mark Zuckerberg shared feedback on a proposal to "reduce bad content in News Feed," saying that he did not want to stick with parts of the plan "if there was material tradeoff" with engagement numbers.

I would think that such a high prevalence of misinformation would only hinder the platform's growth. I personally don't use Facebook very much because of the bad reputation it has developed. However, I suppose the platform's primary demographic consists of older users, who may be more susceptible to believing what they see online from their friends. Regardless, the misinformation has, if anything, tarnished the platform's reputation for many users.

Other interesting CBS reports describe work in which Facebook researchers created three dummy accounts to study how the platform recommends content in users' News Feeds. One account was characterized as a conservative, one as a liberal, and one as a user from India. Overall, the researchers found that users are shown posts catered to their political views. I believe this can cause many problems, since users will not be exposed to differing opinions, and I think it can be damaging to be surrounded solely by people who think the same way you do. In a statement to CBS News, a Facebook spokesperson said the company wants to improve people's experiences on the platform by "prioritizing posts that inspire interactions, particularly conversations between family and friends." Yet, in my opinion, the algorithm seems to do the complete opposite.
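
To make that intuition concrete, here is a minimal, hypothetical sketch of engagement-weighted feed ranking. The Post class, the weights, and the scoring formula are all invented for illustration and are not Facebook's actual algorithm; the point is only that when reactions signaling strong emotion count for more than a plain like, polarizing posts tend to rise to the top of the feed.

```python
# A minimal, hypothetical sketch of engagement-weighted feed ranking.
# The Post class, weights, and scoring formula are invented for
# illustration; they are not Facebook's actual algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    angry_reactions: int

def engagement_score(post: Post) -> float:
    # If reactions that signal strong emotion (e.g., "angry") are
    # weighted more heavily than a plain like, posts that provoke
    # outrage will tend to rank higher in the feed.
    return (1.0 * post.likes
            + 2.0 * post.comments          # comments imply conversation
            + 3.0 * post.shares            # shares spread the post further
            + 5.0 * post.angry_reactions)  # hypothetical heavier weight

feed = [
    Post("Local bake sale this weekend", likes=120, comments=5, shares=2, angry_reactions=0),
    Post("Outrageous claim about the election", likes=40, comments=60, shares=30, angry_reactions=80),
]

# Rank the feed by engagement score, highest first.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.text}")
```

Under these made-up weights, the polarizing post outranks the bake sale despite having far fewer likes, which mirrors the dynamic the researchers describe.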

Lastly, I will talk about CNBC's article "The Facebook Papers: Documents reveal internal fury and dissent over site's policies." Here, CNBC reveals how Facebook employees feel about the platform's ultimate impact on society. More specifically, the documents show employees debating whether Facebook needed to be steered in a new direction, while others defended management, one calling Facebook executives "brilliant, data-driven futurists." Another employee, however, said they were "struggling to match [their] values to [their] employment [at Facebook]." The implications of these debates go beyond the company itself. Facebook affects its users' lives outside the platform, and even the lives of the people those users interact with, so the discussion surrounding its impact is bigger than the platform. While people have been hired to improve the Facebook experience, their efforts often seem to fall short. Thus, I believe real change needs to take place.

In terms of Facebook's algorithm, why do you think the most engagement on Facebook comes from hateful and polarizing content? Perhaps because it is easier to provoke anger in people than other emotions. Why might that be?
