Internal revelations have surfaced regarding Meta’s handling of its messaging application, WhatsApp, centering on significant security vulnerabilities the company allegedly chose to overlook. According to a report by the New York Times on the 8th, former WhatsApp security director Attaullah Baig has filed suit in federal court in Northern California. Baig asserts that numerous employees at Meta and WhatsApp had unrestricted access to sensitive user data, including profile photos, locations, group membership records, and contacts. He further claims that the company failed to act despite an average of more than 100,000 account-hacking incidents per day.

In his lawsuit, Baig alleges that he raised these concerns directly with Meta CEO Mark Zuckerberg and other senior executives, and that he faced retaliation as a result, culminating in his termination in February. Meta has firmly denied the allegations, with WhatsApp spokesperson Carl Woog stating, “An employee who was dismissed for underperformance is distorting the company’s efforts.”

Concerns Over Child Safety and Internal Research

This lawsuit arrives amid a wave of internal accusations against Meta. On the 7th, four current and former employees testified before the U.S. Congress, alleging that the company minimized or deleted research into the risks children and adolescents face on its virtual reality platform, “Horizon Worlds.” According to the Washington Post, Meta’s legal team instructed employees either not to collect research data involving minors or to revise their findings so as to downplay potential risks.

In a related episode from 2021, former Meta employee Frances Haugen disclosed that Facebook prioritized profit over user safety even when it knew its algorithms could harm adolescents’ mental health. More recently, internal Meta AI documents were uncovered revealing provisions that permitted “flirting” (expressions of romantic feelings) and “romantic role-play” directed at 8-year-old children. This prompted a collective response from 44 U.S. attorneys general, who stated, “We will use every power at our disposal to prevent children from being exploited by predatory AI.”

As these revelations continue to unfold, Meta appears to be grappling with a profound trust crisis, not only regarding security and privacy but also concerning the protection of children. This situation holds significant implications for WhatsApp, one of the largest messaging platforms globally, as it faces scrutiny over its reliability and commitment to user safety.
