**A recent report indicates that over 80% of Australian children aged 12 or under violated age restrictions on popular social media platforms in 2022, prompting regulatory scrutiny and upcoming bans for users under 16.**
**Australia's Youth and Social Media: 80% Exceed Age Limits, Report Finds**

**Australia's eSafety regulator reveals alarming data on children's social media usage amid new policy considerations.**
In a striking revelation, Australia's eSafety regulator has disclosed that last year more than 80% of children aged 12 and under actively used social media or messaging platforms designed for users aged 13 and older. The report highlighted YouTube, TikTok, and Snapchat as the platforms most frequented by this young demographic.
In response to growing concerns, Australia is anticipated to enforce a ban on social media access for those under 16 by the year's end. The companies under scrutiny—including Discord, Google (YouTube), Meta (Facebook and Instagram), Reddit, Snap, TikTok, and Twitch—have yet to provide statements regarding this significant finding.
While platforms typically restrict account creation to users aged 13 and older, exceptions do exist. YouTube, for instance, allows parents to set up supervised accounts for younger children through Family Link and offers a dedicated YouTube Kids app. The report's figures, however, exclude YouTube Kids, meaning the usage it measured occurred on the main, age-restricted platforms.
Julie Inman Grant, the eSafety commissioner, emphasized that these findings could serve as vital groundwork for future action. She asserted that ensuring online safety for children is a collective responsibility involving social media companies, device manufacturers, educators, and parents.
In an extensive survey of more than 1,500 children aged eight to 12, researchers found that 84% had engaged with social media or messaging platforms since the start of 2022. Notably, over half accessed these services through a parent's or guardian's account, while approximately one-third possessed their own accounts. Alarmingly, only 13% of children with their own accounts had them deactivated by the social media companies for being underage.
The report's authors pointed out significant inconsistencies across the industry concerning age verification during the user sign-up process, indicating a major loophole that facilitates access for underage users. The analysis also questioned the efficacy of age verification measures cited by platforms such as Snapchat, TikTok, Twitch, and YouTube, which rely heavily on user interaction to identify younger users, leaving children vulnerable to risks during initial engagement.