In “Facebook Wrestles With the Features It Used to Define Social Networking,” the author describes an experiment in which the like button was removed from the platform. This was done to test whether people would feel less pressure and engage in less social comparison while using the platform. Once the change was made, it was found that people actually interacted with posts much less frequently than they did when the like button was present.
In the next piece, “The Facebook Papers and Their Fallout,” the author focuses on the disturbing data Facebook hides about its users’ worsening body image. In addition, Facebook spreads misinformation, hate speech, and celebrations of violence. Whistleblower Frances Haugen says that users deserve more transparency from the platforms they engage with, noting that “they [Facebook] are very good at dancing with data.”
In “In India, Facebook Grapples With an Amplified Version of Its Problems,” the author describes a Facebook researcher who created an account based in India to see what social media is like in another country. The researcher was confronted with a disturbing reality: misinformation, celebrations of violence, and hate speech. The author also shared a troubling statistic, noting that “87% of the company’s global budget for time spent on classifying misinformation is earmarked for the US while only 13% is set for the rest of the world.” This under-investment outside the United States helps explain why so many fake accounts are able to spread misinformation, which has even impacted Indian elections.
Lastly, in “Internal Alarm, Public Shrugs: Facebook’s Employees,” the author reinforces the points of the previously referenced articles, underscoring the misinformation that riddles the Facebook platform. This article goes into further detail on the negative real-world impacts, outlining the platform’s effects on the election and the Capitol riot. While Facebook has tried some preventative measures, such as slowing the creation of Facebook groups, the article argues that not enough is being done to moderate what is posted publicly on the platform.
I found the NYT coverage of the Facebook Papers very informative – I do not believe many Facebook users know much of this information. It seems Facebook has a lot of work to do in preventing misinformation, celebrations of violence, and hate speech on its platform. I also believe the least Facebook can do is be transparent with its users about these problems, and beyond that, take action to counteract them.
I read the articles from the Washington Post, and one of them was similar to the one you described about the fake account of an Indian woman created to see what her feed would look like. The article explained that over time the algorithm presented her account with more and more inappropriate content, which I found seriously disturbing. It is interesting that India has more Facebook users than the United States, yet the platform doesn’t fully adapt to Indian users – for example, the app’s terms and conditions are not translated for the country and remain in English.
The video I watched on Fox News mentioned that the NYT was the first outlet among all the media companies to obtain the report. I also agree with you that these outlets covered a lot of things many Facebook users had no idea about. I am curious how users will feel after reading these articles. Would some of them abandon the platform?
Hi Talia! Great summary of the NYT coverage of the Facebook Papers! I am eager to hear more about this in class. I totally agree with you: Facebook should be as transparent as possible. More and more consumers are prioritizing social responsibility in the companies they support, so I think transparency would only benefit Facebook in the long run.