Direct Messages

AppWizard
May 1, 2025
TikTok has enhanced its Promote ads feature by allowing brands to drive direct messages (DMs) through third-party messaging applications, including WhatsApp, Facebook Messenger, LINE, and Zalo. Advertisers can now send message traffic to these external platforms instead of only encouraging interactions within the TikTok app, which is particularly useful for lead generation campaigns targeting users who have a compatible messaging app installed. The update aims to improve customer engagement and simplify the management of interactions across different messaging channels.
AppWizard
April 23, 2025
New Jersey law enforcement officials have filed a lawsuit against Discord, claiming the company misled parents about its safety controls and failed to protect children from sexual and violent content on its platform. New Jersey Attorney General Matthew Platkin stated that Discord violated consumer protection laws by exposing children to risks from online predators. The lawsuit alleges that Discord's safety features were inadequate and that the app's claims of protecting young users were misleading; in particular, it criticizes the "Safe Direct Messaging" feature for failing to effectively detect or delete explicit content. The suit seeks remedies including an injunction against further violations, civil penalties, and the forfeiture of profits generated in New Jersey.
AppWizard
April 11, 2025
Meta Platforms Inc. is implementing new safety and privacy measures for users under 16 across its platforms, including Instagram, Facebook, and Messenger. The measures prohibit teens from hosting live videos without parental consent and require parental permission to disable the blurring of images suspected to contain nudity in direct messages. Additionally, Meta is introducing "Teen Accounts" with stricter privacy settings on Facebook and Messenger, following a similar rollout on Instagram.
AppWizard
April 9, 2025
Meta is expanding its Teen Accounts protections, initially launched on Instagram, to Facebook and Messenger for users aged 13-15. These accounts limit messaging capabilities, filter out inappropriate content, and remind users to take breaks from screen time. Teens under 16 need parental permission to change the default settings, and 97% have chosen to keep the protections in place. The rollout of Teen Accounts on Facebook and Messenger is beginning in the US, UK, Australia, and Canada, with global expansion planned. New features for Instagram Teen Accounts will require parental approval for going Live and for disabling the feature that blurs suspected nude images in direct messages. An Ipsos survey indicated that 94% of US parents find Teen Accounts beneficial, and 85% believe they promote positive online experiences. Meta reports more than 54 million active Teen Accounts globally and says it is committed to enhancing safety for young users.
AppWizard
April 8, 2025
Meta is expanding its “Teen Accounts” initiative, initially introduced on Instagram, to Facebook and Messenger. The feature automatically applies privacy settings, content restrictions, and parental controls for users under 18. The initial rollout targets users in the United States, the United Kingdom, Australia, and Canada, with plans for global availability. Teen Accounts aim to reduce exposure to harmful content and enhance parental oversight by limiting interactions to friends or previously contacted users. Meta reports that 54 million teens worldwide have adopted Teen Accounts on Instagram, with 97% of teens aged 13-15 opting to retain the built-in protections. A study showed that 94% of parents found Teen Accounts helpful, and 85% believed they fostered a positive online experience. Additional protections for teens under 16 on Instagram include requiring parental approval for live broadcasts and for disabling nudity protection in direct messages. Meta has also introduced features to encourage breaks from device usage, such as daily reminders and the activation of “Quiet Mode” at night.
AppWizard
April 8, 2025
Meta is expanding its Teen Accounts initiative to Facebook and Messenger, enhancing safety measures for young users in the United States, United Kingdom, Australia, and Canada, with plans to reach more regions soon. The Teen Accounts feature, launched on Instagram, provides a more secure environment for adolescents by limiting their exposure to inappropriate content and interactions. Users under 16 will not receive messages from unknown individuals, and only friends can view and respond to their stories. Teens will receive reminders to take breaks after one hour of use, and their accounts will enter "Quiet mode" at night. Parental consent is required for users under 16 to modify safety settings, go live on Instagram, or disable nudity blurring in direct messages. Since its launch, over 54 million teens have used Teen Accounts, with 97% of users aged 13 to 15 keeping the protections active. Research shows 94% of parents find the initiative helpful, and 85% believe it creates positive experiences for their teens. The expansion reflects Meta's response to regulatory scrutiny and its commitment to providing safer experiences for younger users.