Tech giants face eSafety fines
The eSafety Commissioner is demanding that major tech companies take action against online child sexual abuse.
Legal notices have been issued under Australia's Online Safety Act to major technology companies including Apple, Google, Meta, and Microsoft, requiring them to report every six months on their measures to combat online child sexual abuse.
The scope of these notices extends to services such as Discord, Snap, Skype, and WhatsApp.
All recipients must explain their strategies to address child abuse material, livestreamed abuse, online grooming, sexual extortion, and the production of synthetic or deepfaked child abuse material created using generative AI.
For the first time, the Commissioner says, the companies will be obligated to provide regular reports to eSafety over the next two years.
The eSafety Commissioner will publish summaries of the findings to enhance transparency, highlight safety weaknesses, and incentivise improvements.
Commissioner Julie Inman Grant says the selection of companies was based partly on their previous responses to eSafety inquiries in 2022 and 2023.
These responses revealed numerous safety concerns related to child protection.
“We’re stepping up the pressure on these companies to lift their game,” Ms Inman Grant said.
“They’ll be required to report to us every six months and show us they are making improvements. When we sent notices to these companies back in 2022 and 2023, some of their answers were alarming but not surprising, as we had long suspected that there were significant gaps and differences across services’ practices.
“In our subsequent conversations with these companies, we still haven’t seen meaningful changes or improvements to these identified safety shortcomings.”
Ms Inman Grant said the agency would be following up on some specific concerns.
“Apple and Microsoft said in 2022 that they do not attempt to proactively detect child abuse material stored in their widely used iCloud and OneDrive services,” she said.
“This is despite the fact that these file-storage services are well known to serve as a haven where child sexual abuse and pro-terror content can persist and thrive in the dark.
“We also learnt that Skype, Microsoft Teams, FaceTime, and Discord did not use any technology to detect live-streaming of child sexual abuse in video chats. This is despite evidence of the extensive use of Skype, in particular, for this long-standing and proliferating crime.”
Meta was also criticised for not sharing information between its services when an account is banned for child abuse, potentially allowing offenders to continue their activities on its other platforms, such as Instagram and WhatsApp.
eSafety found that Google services, including YouTube, were not blocking links to known child abuse websites, despite the availability of databases of such sites that could be used for this purpose.
Snapchat was identified as another platform with significant safety gaps. Despite being regularly observed in use for grooming and sexual extortion, the service was found not to be using any tools to detect grooming in chats.
Compliance with the notices is mandatory, and non-compliance can attract financial penalties of up to $782,500 a day. The companies must provide their first responses by 15 February 2025.