In The Atlantic’s article, “Facebook Papers: ‘History Will Not Judge Us Kindly,’” LaFrance discusses the events leading up to January 6th and Facebook’s influence throughout the presidential election.
LaFrance said the Facebook documents were astonishing for two reasons: 1) their sheer volume is unbelievable, and 2) they leave little room for doubt about Facebook’s crucial role in advancing the cause of authoritarianism in America and around the world. The papers reveal truths about Facebook’s influence on the widespread distribution of information and harmful content. LaFrance argues that Facebook amplifies extremism and misinformation, incites violence, and encourages radicalization and political polarization. For example, the Facebook group “Stop the Steal” took off on the platform, with 100 new members joining every two seconds. Within 24 hours, the group had 333,000 members. The group played a role in spreading misinformation about the election and inciting violence. Even though the group was dismantled, copycat groups formed on the platform, and Facebook remained the central hub for planning the attack on Jan. 6. Some believe Facebook had no role in the insurrection, yet it was the catalyst. LaFrance argues that Facebook “made people’s efforts at coordination highly visible at a global scale” and “helped recruit people and offered people a sense of strength in numbers.”
However, Mark Zuckerberg’s stance on Facebook’s role in the insurrection is troubling. Zuckerberg believes Facebook “did more and did better than journalism outlets in its response to Jan. 6.” But LaFrance counters this by pointing out that journalism outlets are in no position to help investigators because insurrectionists don’t organize through newspapers or magazines. Yet Facebook is aware of the platform’s issues. The company acknowledges the platform’s power and influence, and many employees have come forward to share their knowledge of the harm. They know there are ways to create a safer and healthier environment for users and to reduce and prevent violence, hate crimes, misinformation, and harmful content. LaFrance offers a few fixes Facebook could adopt, including banning reshares, optimizing its platform for safety and quality rather than growth, creating a transparent dashboard to view viral content in real time, and fixing the algorithm to prevent the widespread distribution of harmful content. But Facebook chooses not to.
LaFrance compares Facebook to a Doomsday Machine because it is “technologically simple and unbelievably dangerous – a black box of sensors designed to suck in environmental cues and deliver mutually assured destruction.” She calls us to recognize a disturbing reality: Facebook is untouchable. We’re watching employees come out and share their fears about the dangers of Facebook, yet the platform’s sheer size gives it tremendous power. So her advice is to be more vigilant and attentive to our feeds and to what we’re being targeted with, and to be skeptical of the social scenarios we see on Facebook.
Grayson, you wrote a really interesting summary of The Atlantic’s coverage of the Facebook Papers. The last paragraph was unsettling and really elucidates the power and scope of Facebook. Even though employees are speaking out about their fears and the dangers associated with Facebook, their warnings are often minimized or simply normalized. This is concerning because Facebook affects our daily lives.
Reading about the January 6th insurrection and Facebook’s role in it is unbelievable to me. I find that it is more a matter of what the platform didn’t do than what it did. Common to all of these articles is that there were so many ways the spread of misinformation and violence could have been halted, and yet Facebook did the bare minimum, which was clearly not enough. The platform, along with others, needs to be held accountable to prevent future incidents like these.