This video featured an interview between a TED Talk interviewer and the head of WhatsApp, discussing how the platform has changed due to the pandemic and the consequences of that heightened usage for privacy. The app has seen a lot of change as the world moved remote, including a surge in usage (around 100 billion messages are sent every day) and heavier use of its video and audio chat features. People use WhatsApp because they trust it the way they trust an in-person conversation, and the interviewer asked why people have this trust. Cathcart, the WhatsApp representative, pointed to the safety measures within the app that build this trust: for example, WhatsApp itself cannot see the messages sent on the platform. When asked what WhatsApp does to prevent the spread of misinformation, Cathcart said there is a policy limiting message forwarding to a maximum of five people, and, in addition, when the app detects that a message has been forwarded frequently, users can only send that message on to one person at a time. Another measure he described is that journalists in a given country can see what is circulating there (while WhatsApp cannot) and can therefore help limit the misinformation being spread.
I found this TED Talk very informative – I did not know everything WhatsApp does to protect the privacy of its users, and I found that very admirable. Compared to other platforms we have studied, it is also apparent that WhatsApp is not as profit-focused: Cathcart himself said they do not make much profit, and the app's main goal is to reduce the cost for people in developing countries of talking to their loved ones elsewhere. This TED Talk made me think that other platforms could do more to protect their users' information. In addition, because Facebook and WhatsApp are connected, Facebook could be taking the same measures that WhatsApp is and is choosing not to, which suggests that profit-driven companies care less about their users.
I am a frequent user of WhatsApp, and I was unaware of all of these safety features on the platform. I've always enjoyed using the app for its other features, but reading about its safety features makes me enjoy using the platform even more. It feels rare to use a platform that isn't exploiting your private information – or at least, most of the apps I use don't take your privacy into as much consideration. Additionally, its misinformation feature is a great tool that I'd like to see elsewhere. I know that WhatsApp is owned by Facebook, so I wonder why Facebook isn't making as much of an effort.
I didn’t read this article and, like you, was surprised at the safety measures WhatsApp has in place. I also see a major discrepancy between these safety measures and those of Facebook, which seems ironic given that Facebook owns WhatsApp. It makes me wonder: who is implementing these policies? What control does Facebook have over WhatsApp's privacy policies? I think it's a bad look for Facebook that one of its own subsidiaries seems to be excelling in the very area that has contributed to Facebook's recent poor reputation.