Meta Launches Teen Accounts for Facebook and Messenger

  • Writer: paolo bibat
  • Apr 8

Meta has launched enhanced safety features for adolescent users on Facebook and Messenger, automatically enrolling individuals under 18 into restricted “Teen Accounts” designed to minimize exposure to inappropriate content and unwanted interactions.


The initiative, now active in the United States, the United Kingdom, Australia, and Canada, builds on similar protections introduced for Instagram in 2024 and represents Meta’s latest effort to address growing regulatory and parental concerns about youth mental health and online safety.

Under the new framework, teens will receive direct messages only from accounts they follow or have previously contacted, while stories, tags, mentions, and comments will be restricted to friends or followed users.


Live streaming on Instagram remains prohibited for users under 16 unless explicit parental consent is granted via Meta’s Family Center tools. Additionally, a default nudity-blurring feature for direct messages cannot be disabled by teens under 16 without guardian approval.


Teens will encounter automated reminders to log off after one hour of daily use and will be enrolled in “Quiet Mode” overnight, silencing notifications during designated hours. Parents can monitor activity and adjust settings through Meta’s Family Center, which provides insights into screen time, privacy configurations, and account interactions.


The company reported that 97% of teens aged 13–15 retain these default protections on Instagram, where 54 million accounts have already transitioned to the restricted experience.


The expansion follows lawsuits from 41 U.S. states alleging Meta prioritized engagement over youth safety, alongside warnings from the U.S. Surgeon General about social media’s mental health risks.


A survey commissioned by Meta and conducted by Ipsos found that 94% of parents view Teen Accounts as beneficial, with 85% saying the accounts improve guidance for their children’s online experiences.


Meta plans to extend these protections to additional regions, positioning the updates as part of a broader industry shift toward age-appropriate digital ecosystems.


The move aligns with legislative efforts in states like Florida and Texas, which now mandate parental consent for minors accessing social platforms, and reflects intensifying scrutiny of tech giants’ roles in adolescent well-being.


By integrating parental controls with automated safeguards, Meta aims to balance user autonomy with protective measures, though critics argue persistent design elements—such as algorithmic content feeds—still pose risks.


The initiative underscores the company’s attempt to preempt stricter regulations while navigating complex debates about digital responsibility and youth autonomy.