Owing to the dire, sometimes fatal, consequences of the spread of disinformation on WhatsApp, the instant messaging service introduced new limits on the platform. Originally, users could forward a message to up to 256 groups at once, but that cap has since been reduced to just five. This change has reportedly worked as intended, according to a new study.
There’s misinformation and then there’s disinformation. Misinformation is simply factually incorrect information spread without malicious intent; disinformation, on the other hand, is intentionally falsified information, such as government propaganda, meant to deceive the receiver. Facebook, Twitter, and WhatsApp have all been caught up in this issue, with WhatsApp in particular apparently used to influence elections in Brazil and India.
According to new research from MIT, the limit was successful in curbing the spread of viral content, and ultimately disinformation, on WhatsApp. Coauthor Kiran Garimella reports that about 80 percent of messages stopped spreading within two days, while the remaining 20 percent continued to circulate.
The research is based on public data, since WhatsApp is a private tool with end-to-end encryption enabled; there was no way the researchers could have accessed private chats. The only way to take the research forward was to join public groups where political candidates stay in contact with their voters. The researchers joined thousands of such groups across Brazil, India, and Indonesia. After scraping and analysing over six million public messages, they were able to simulate how messages were forwarded.
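The intuition behind such a simulation can be sketched as a simple branching process. The model below is purely illustrative and not the researchers' actual method: every name, parameter, and probability here is an assumption, chosen only to show why shrinking the per-user forwarding cap slows spread.

```python
# Hypothetical sketch (NOT the MIT model): expected spread of a message
# under a per-user forwarding cap, modelled as a simple branching process.

def expected_reach(forward_cap, group_size, p_forward, hops):
    """Expected number of users reached after `hops` forwarding rounds.

    Assumes each recipient forwards the message with probability
    `p_forward` to `forward_cap` groups of `group_size` members each.
    All parameter values are illustrative, not measured.
    """
    reached = 1.0    # the original sender
    frontier = 1.0   # users who received the message in the latest round
    for _ in range(hops):
        # Each frontier user seeds forward_cap * group_size new users
        # with probability p_forward.
        frontier = frontier * p_forward * forward_cap * group_size
        reached += frontier
    return reached

# Lowering the cap from 256 groups to 5 shrinks the branching factor,
# so the same forwarding behaviour reaches far fewer users.
wide = expected_reach(forward_cap=256, group_size=10, p_forward=0.01, hops=3)
capped = expected_reach(forward_cap=5, group_size=10, p_forward=0.01, hops=3)
```

In this toy setup the uncapped branching factor (0.01 × 256 × 10 = 25.6) is well above 1, so the message keeps growing each hop, while the capped factor (0.5) is below 1 and the cascade dies out, which is consistent with most messages stopping within days.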
According to WhatsApp, the total number of forwarded messages fell by 25 percent after the forwarding limit was implemented. New labels now indicate whether a forwarded message is part of a chain and how many times a particular message has been forwarded.