Meta has implemented protections for teens since 2017, including suicide prevention tools, restrictions on adult-initiated chats with teens, default private accounts for users under 16, and time-limit prompts, and it expanded Teen Accounts across Instagram, Facebook, and Messenger in 2024 and 2025.
Recent updates include content filters modeled on 13+ movie ratings, applied by default to Teen Accounts and impossible to turn off without parental permission, along with an optional stricter setting.
The post highlighted collaborations with organizations such as the National Center for Missing and Exploited Children on tools to prevent non-consensual image sharing, awareness campaigns about sextortion, and industry programs like Lantern for sharing signals about predatory accounts.
Meta reported responding to over 9,000 emergency law enforcement requests in 2024, averaging 67 minutes per case, with faster handling for child safety matters.
Meta argued that recent lawsuits misrepresent its record by selectively citing internal documents to claim prioritization of growth over safety.
The company asserted that teen mental health involves multiple factors, including academic pressure and school safety, and that social media provides benefits like community building and opportunities, supported by reports from the National Academies of Sciences, Engineering, and Medicine.
Why This Matters Today
The statement arrives as Meta faces multidistrict litigation and state lawsuits alleging its platforms contribute to teen mental health harms through addictive design and inadequate protections.
Court proceedings continue, with appellate rulings indicating the cases will move forward and trials scheduled for 2026. Meta’s post positions the company to defend against these claims by emphasizing proactive changes and pointing to evidence of declining teen depression rates.
According to the 2024 National Survey on Drug Use and Health from the U.S. Department of Health and Human Services, major depressive episodes among 12-to-17-year-olds dropped from 21% in 2021 to 15% in 2024, and serious suicidal thoughts fell from nearly 13% to 10%.
Meta cited this data to argue that social media cannot be singled out as the primary cause of teen well-being trends.
The post underscores Meta’s push for parental controls, such as supervision tools that allow daily time limits as low as 15 minutes and message monitoring, while noting that some restrictions may reduce engagement but were implemented for safety.
Our Key Takeaways:
- Meta detailed over a decade of teen safety features, from 2017 suicide tools to 2025 content restrictions modeled on 13+ ratings and expanded Teen Accounts.
- The company contended that lawsuits oversimplify complex teen mental health issues and ignore scientific evidence of social media benefits alongside declining U.S. teen depression and suicide thought rates from 2021 to 2024.
- Ongoing litigation against Meta on youth mental health grounds will continue into 2026, with potential trials testing the company’s internal decisions and safety implementations.
You may also want to check out some of our other tech news updates.
Wanna know what’s trending online every day? Subscribe to Vavoza Insider to access the latest business and marketing insights, news, and trends daily with unmatched speed and conciseness. 🗞️