Report shows how messaging apps are used to spread propaganda
A new study examines the evolving role of messaging platforms in political discourse, revealing their potential as tools for propaganda and manipulation of public sentiment. Researchers from New York University (NYU) surveyed 4,500 users across nine countries and interviewed political strategists from 17 nations to explore how platforms like WhatsApp and Telegram are being exploited by malicious actors.
The survey found that 62% of respondents had received political content through these messaging apps, and that 55% of that content came from unknown sources. The report attributes the trend to the lack of traditional content moderation mechanisms on platforms such as WhatsApp, Viber, and Telegram, which promote secure communication but also monetize features that can inadvertently facilitate the spread of disinformation.
Paid features boost audiences for disinformation
The report further reveals that political operatives are leveraging paid features on these messaging platforms to broaden their reach. WhatsApp’s Business Platform, for instance, offers a verification badge, automated messaging, and unlimited outreach, amplifying the visibility of their content. Users must opt in to receive messages from verified businesses, and the platform’s policies explicitly prohibit political parties and campaigns from using its services. Despite these restrictions, some operatives circumvent the rules by impersonating legitimate entities or creating fictitious business profiles to gain access to WhatsApp’s business features.
Viber operates similarly, allowing users to opt out of receiving messages, yet the report highlights instances in Ukraine where political consultants obtained verified accounts through third-party vendors. These actors then ran social media campaigns encouraging users to subscribe to mailing lists via QR codes, often without their informed consent.
On Telegram, users can access a suite of additional features for a nominal fee, enabling political operatives to present themselves as legitimate accounts without undergoing a verification process. The platform also allows for the purchase of ad placements in high-subscriber channels, which collectively generate approximately 1 trillion views monthly.
Rakuten, Viber’s parent company, said in a statement that its policies aim to empower users to make informed decisions about the content they engage with. Meta, WhatsApp’s parent company, and Telegram did not respond to requests for comment.
Platforms also amplify disinformation
The report underscores how the paid features of these messaging apps can exacerbate existing disinformation strategies. A common tactic involves infiltrating pre-existing social media groups, with Viber’s lack of participant caps facilitating this approach. Even groups that appear apolitical can be exploited by propagandists who tailor political messages to resonate with group members. The presence of “sock puppet” accounts—fake profiles created to represent specific viewpoints—adds another layer of complexity, as these accounts can operate with greater anonymity on messaging platforms compared to traditional social media.
Cross-posting is another prevalent tactic, in which users share identical content across multiple platforms simultaneously. For instance, Telegram users can create bots that automate content sharing to X, while an Indian app called ShareChat enables cross-posting between Telegram and Meta-owned platforms like Facebook and Instagram. This interconnectedness fosters what the researchers describe as “feedback loops,” in which the same content circulates across various segments of the platform ecosystem.
Recommendations for messaging apps
In light of these findings, the researchers propose several recommendations for messaging app developers. They note the double-edged nature of encryption: while it provides privacy for activists at risk of surveillance, it can also shield propagandists. Suggested measures include stricter limits on account creation and more rigorous vetting of business accounts.
For policymakers, the report advocates for the inclusion of encrypted messaging platforms within existing regulatory frameworks, emphasizing the need to preserve their value for human rights defenders while addressing the threat of disinformation. One potential approach is to mandate transparency regarding the effectiveness of policies and enforcement mechanisms aimed at combating disinformation.