Unraveling the Implications of the Under-16 Social Media Ban

Australia's federal parliament has passed legislation prohibiting people under 16 from holding accounts on certain social media platforms. The decision has sparked considerable debate and raised several important questions.
Details of the Legislation
The legislation amends the Online Safety Act 2021 and defines an "age-restricted user" as anyone under 16. It doesn't name specific platforms, but it covers services whose "sole purpose or a significant purpose" is to enable "online social interaction" between people, where users can "link to or interact with" others and post material, or that meet other conditions set out in the law. Some services, such as those providing "online business interaction", are excluded, but the exact list of affected platforms remains unclear. Tech companies face fines of up to A$50 million if they don't take "reasonable steps" to prevent under-16s from having accounts.

While YouTube is reported to be exempt, this isn't explicitly confirmed. Messaging apps and gaming platforms such as Minecraft are also not specifically mentioned, but news reports suggest they may be excluded, along with "services with the primary purpose of supporting the health and education of end-users".

In passing the final legislation, parliament added further amendments. Tech companies cannot collect government-issued identification, such as passports and driver's licenses, as the only means of confirming age; they may do so only if they also offer users alternative age assurance methods. There must also be an "independent review" after two years to assess privacy protections and other issues.
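To make the statutory test more concrete, here is a minimal sketch of one way the definition above might be read in code. The field names, the exclusions, and the all-conditions-must-hold logic are our paraphrase of the reported wording, not an authoritative interpretation of the Act.

```python
# A rough, non-authoritative reading of the "age-restricted social media
# platform" test, based only on the wording reported above.
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    significant_purpose_is_social_interaction: bool  # "sole or significant purpose"
    users_can_link_or_interact: bool                 # "link to or interact with" others
    users_can_post_material: bool
    is_business_interaction_service: bool            # excluded category
    primary_purpose_health_or_education: bool        # reportedly excluded

def is_age_restricted_platform(s: Service) -> bool:
    """Return True if the service appears to fall under the ban, per this reading."""
    if s.is_business_interaction_service or s.primary_purpose_health_or_education:
        return False
    return (s.significant_purpose_is_social_interaction
            and s.users_can_link_or_interact
            and s.users_can_post_material)

# Example: a hypothetical photo-sharing app with all three features.
app = Service(
    name="ExamplePics",
    significant_purpose_is_social_interaction=True,
    users_can_link_or_interact=True,
    users_can_post_material=True,
    is_business_interaction_service=False,
    primary_purpose_health_or_education=False,
)
print(is_age_restricted_platform(app))  # True under this reading
```

Of course, no such checklist can settle the hard question: whether social interaction is a "significant purpose" of a given service is a judgment call, which is precisely why the list of covered platforms remains unclear.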
Challenges for Tech Companies

Tech companies now face the significant logistical challenge of verifying the ages of both new and existing account holders. There are several options they might pursue. One is using credit cards linked to app store accounts as a proxy for age, as suggested by Communications Minister Michelle Rowland. However, this would also exclude people over 16 who don't hold credit cards. Another option is facial recognition technology, which is being trialled by a consortium led by the UK-based Age Check Certification Scheme. But facial recognition systems have significant biases and inaccuracies: commercially available systems have been found to have an error rate of 0.8% for light-skinned men but nearly 35% for dark-skinned women. Even some of the best-performing age-estimation systems, such as Yoti, have an average error of almost two years for 13 to 16-year-olds.
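To see why an average error of almost two years matters at a hard 16-year cut-off, here is a small simulation sketch. The normal error distribution, the sigma derived from it, and the sample ages are illustrative assumptions of ours, not Yoti's published methodology.

```python
# Illustrative sketch: how an average age-estimation error of ~2 years
# plays out at a hard 16-year cut-off. The Gaussian error model and all
# numbers below are our assumptions, not any vendor's published figures.
import random

random.seed(0)

MAE = 2.0                                # assumed mean absolute error, in years
SIGMA = MAE * (3.14159265 / 2) ** 0.5    # Gaussian sigma that yields this MAE
CUTOFF = 16.0
TRIALS = 100_000

def estimated_age(true_age: float) -> float:
    """True age plus normally distributed estimation error."""
    return true_age + random.gauss(0.0, SIGMA)

def misclassification_rate(true_age: float) -> float:
    """Share of users whose estimate falls on the wrong side of the cut-off."""
    wrong = 0
    for _ in range(TRIALS):
        est = estimated_age(true_age)
        if (true_age >= CUTOFF) != (est >= CUTOFF):
            wrong += 1
    return wrong / TRIALS

for age in (14.0, 15.0, 16.0, 17.0):
    print(f"true age {age:4.1f}: misclassified {misclassification_rate(age):.1%} of the time")
```

Under these assumptions, a user exactly at the cut-off is misclassified about half the time, and 15- and 17-year-olds roughly a third of the time: a reminder that any hard age threshold built on estimation will produce large numbers of errors in both directions.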
The Digital Duty of Care

Earlier this month, the government promised to impose a "digital duty of care" on tech companies. This would require them to conduct regular, thorough risk assessments of the content on their platforms and to respond to consumer complaints by removing harmful content. Experts, including the Human Rights Law Centre, support such a duty, and a parliamentary inquiry has recommended it. However, it's unclear when the government will fulfill this promise.

Even if the duty is legislated, more investment in digital literacy is needed. Parents, teachers, and children need support to navigate social media safely.

In the end, social media platforms should be safe spaces for all users. They offer valuable information and community engagement. But the work to keep us all safe and hold tech companies accountable is just beginning.