Social media giant Meta has entered the mandatory phase of complying with new Australian government legislation, announcing it will deactivate the accounts of users under 16 across its primary platforms starting next month. The move, which affects Facebook, Instagram, and Threads, marks the first concrete action by a major tech company ahead of the national deadline aimed at improving online safety for children.
The staggered deactivation process begins this week. Australian users identified as underage will receive 14 days' notice via in-app alerts, email, and SMS, informing them of the impending loss of access. Access to affected accounts will cease on December 4, with all identified underage profiles fully removed by December 10. While the primary social networks are covered, the standalone Messenger service remains exempt; Meta is developing a new system to let teens continue private messaging without an active Facebook profile.
Navigating Deactivation and Identity Verification
Meta, which uses proprietary methods to identify underage accounts, is giving users clear options before their access is blocked. Teens facing deactivation can download and preserve all previously created content, including private messages, posts, and Reels, before the final cutoff date.
Mia Garlick, Meta’s Regional Policy Director, confirmed that content is not immediately lost upon deactivation. When the user reaches the age of 16, they will be permitted to reactivate their account with all content restored. Alternatively, users may request the permanent deletion of their data at any point.
The enforcement process is not infallible, and users who believe they have been incorrectly flagged as underage can contest the decision. Meta offers two methods of age verification: a video selfie processed through facial recognition technology, or official government identification submitted through the third-party age assurance provider, Yoti.
Industry Debate Over Compliance Strategy
While adhering to the government’s mandate, Meta remains critical of the blanket ban. The company argues that its existing safety settings, including parental supervision tools, limits on contact requests from non-friends, and restrictions on targeted advertising for younger users, offer a more nuanced and, in its view, more effective safety solution than an outright ban.
Furthermore, Meta has publicly advocated that the responsibility for age verification should ideally be shifted to the point of download, specifically at the app store level, rather than being placed solely on individual social media platforms.
Meta is the first of the major platforms covered by the legislation to publicly detail its compliance timeline. The ban also applies to other widely used services, including TikTok, Snapchat, and X (formerly Twitter), all of which are expected to outline their respective strategies, or potential legal challenges, in the coming weeks.
This large-scale deactivation underscores the increasing global pressure on tech companies to implement robust age verification and safety protocols. For Australian parents and guardians, the upcoming December deadline necessitates a review of their children’s online presence and a discussion about transitioning to the new messaging-only access for younger teens.