An Australian regulator, after using new powers to make the tech giants share information about their methods, accused Apple and Microsoft of not doing enough to stop child exploitation content on their platforms.
The e-Safety Commissioner, an office set up to protect internet users, said that after sending legal demands for information to some of the world’s biggest internet firms, the responses showed Apple and Microsoft did not proactively screen for child abuse material in their storage services, iCloud and OneDrive.
Our use of world-leading transparency powers found some of the world’s biggest tech companies aren’t doing enough to tackle child sexual exploitation on their platforms, with inadequate & inconsistent use of tech to detect child abuse material & grooming: https://t.co/ssjjVcmirD pic.twitter.com/onfi3Ujt85
— eSafety Commissioner (@eSafetyOffice) December 14, 2022
The two firms also confirmed they did not use any technology to detect live-streaming of child sexual abuse on Microsoft-owned video services Skype and Microsoft Teams, or on Apple's FaceTime, the commissioner said in a report published on Thursday.
A Microsoft spokesperson said the company was committed to combatting the proliferation of abuse material but "as threats to children’s safety continue to evolve and bad actors become more sophisticated in their tactics, we continue to challenge ourselves to adapt our response".
Apple was not immediately available for comment.
The disclosure confirms gaps in the child protection measures of some of the world’s biggest tech firms, building public pressure on them to do more, according to the commissioner. Meta, which owns Facebook, Instagram and WhatsApp, and Snapchat owner Snap also received demands for information.
The responses overall were “alarming” and raised concerns of “clearly inadequate and inconsistent use of widely available technology to detect child abuse material and grooming”, commissioner Julie Inman Grant said in a statement.
Microsoft and Apple “do not even attempt to proactively detect previously confirmed child abuse material” on their storage services, although a Microsoft-developed detection product is used by law enforcement agencies.
An Apple announcement a week ago that it would stop scanning iCloud accounts for child abuse material, following pressure from privacy advocates, was "a major step backwards from their responsibilities to help keep children safe", Inman Grant said.
The failure of both firms to detect live-streamed abuse amounted to “some of the biggest and richest technology companies in the world turning a blind eye and failing to take appropriate steps to protect the most vulnerable from the most predatory”, she added.
© Thomson Reuters 2022