Facebook-owned WhatsApp ships with end-to-end encryption, which promises to keep chats private and secure, and the company has now extended end-to-end encryption to chat backups as well.
However, a new report claims that WhatsApp messages are not fully private in practice, and that Facebook can see the content of some messages on the platform.
ProPublica made this explosive claim in which it talks about how Facebook has been aggressively marketing end-to-end encryption for WhatsApp since 2016.
The claims in the ProPublica report are based on observations of more than 1,000 contract workers who examine millions of pieces of user content for WhatsApp. According to the report, these workers use special Facebook software to review private WhatsApp messages, images, and videos.
“These contractors pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute,” the report added.
The report further revealed that these workers are based in Austin, Texas; Dublin; and Singapore, and are tasked with examining chats that users have reported.
According to WABetaInfo, a screenshot shared by WhatsApp showed a pop-up that read: “The most recent messages from this contact will be forwarded to WhatsApp. This contact will not be notified.”
A second pop-up read: “The last 5 messages from this contact will be forwarded to WhatsApp. If you block this contact and delete the chat, it will be deleted from this device only. The contact will not be notified.”
The ProPublica report also noted that Facebook has invested heavily in promoting WhatsApp’s privacy credentials, and claims that the company has compared the app’s brand character to that of an “Immigrant Mother.”
The report further said, “Because WhatsApp’s content is encrypted, artificial intelligence systems can’t automatically scan all chats, images and videos, as they do on Facebook and Instagram. Instead, WhatsApp reviewers gain access to private content when users hit the “report” button on the app, identifying a message as allegedly violating the platform’s terms of service.”
In response to these allegations, Facebook said, “We build WhatsApp in a manner that limits the data we collect while providing us tools to prevent spam, investigate threats, and ban those engaged in abuse, including based on user reports we receive. This work takes extraordinary effort from security experts and a valued trust and safety team that works tirelessly to help provide the world with private communication.”
“Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp, we receive the content they send us,” the company added.