In the Facebook Papers published by The Washington Post, the newspaper discusses the new discoveries and repercussions triggered by whistleblower Frances Haugen, a former Facebook product manager. This group of articles analyzes the actions and consequences Facebook now faces, and the specific aspects of the platform that are to blame for allowing an enormous amount of misinformation, violence, hate, and illegal activity to go unchecked.
The key takeaways from these papers include Zuckerberg’s public claims conflicting with internal research, the platform dropping its guard before the January 6th insurrection, failing to effectively police content in many parts of the world, choosing maximum engagement over user safety, and taking years to implement a simple fix for anger and misinformation. Across these articles, there is discussion of how the five reaction buttons created a shift in the Facebook algorithm, with the angry reaction being used the most, as well as the accusation that Zuckerberg chose company growth over public good and safety. We also learn how Facebook has neglected certain countries, such as India, where researchers tracked a dummy account of a 21-year-old woman from North India. They watched the account’s feed turn more and more violent, starting with some inappropriate posts and escalating into pictures and videos of murder and dead bodies. This lack of regulation and adaptation to certain countries has now been exposed to the public. Lastly, it is significant to note that Facebook is said to have been partially responsible for the January 6th insurrection because it did not regulate or forcefully shut down the Stop the Steal movement pushed by Trump’s political allies.
After reading these papers and articles, I think it is reasonable to say that the problem is not necessarily what Facebook has done, but rather what it hasn’t done to stop the spread of misinformation, violence, and hate, and to adapt to the ways different countries use the platform. Facebook was and has been revolutionary in terms of technology, but with that come many unknowns and differing views on how to handle situations, including the ones it is being exposed for now. I think these platforms need to be checked constantly in order to prevent the harmful consequences that people all over the world are paying for now.
Leah, I watched Frances Haugen’s testimony and I was shocked that Facebook blatantly cared more about its growth than the good of the individuals on the platform. It’s crazy to me that this was internally known and no one spoke up until Haugen. I would assume someone would think of this situation from a reversed role and would never want to be on a platform that promotes mental health issues and negativity. I wonder how long it will take for action to occur and for stricter measures to be put in place on the platform.
Leah, I loved your analysis of this article. I find Facebook’s stance on this whole thing baffling; they claim they did so much, but they didn’t. There are so many ways they could optimize the platform for a better, safer environment, or care more about safety than growth. You’d think these people would care, but I think it shows how messed up the company is when its own employees come out to share their fears and stories. We now live in a time when the Internet and social media platforms wield so much power and influence over the entire world, and just because they aren’t mentioned in the Constitution doesn’t mean there shouldn’t be restrictions and laws in place to moderate them.