An Australian regulator has sent legal notices to several Big Tech companies requiring them to explain what they are doing to prevent their platforms from being used for child sexual exploitation. The companies have 28 days to respond and face penalties of up to $555,000 a day if they fail to do so.
The notices were sent to Apple, Meta, Microsoft, Snap and Omegle, with more to follow to other tech companies, according to a press release issued Tuesday (Aug. 30) by Australia’s eSafety Commissioner.
“As more companies move towards encrypted messaging services and deploy features like livestreaming, the fear is that this horrific material will spread unchecked on these platforms,” eSafety Commissioner Julie Inman Grant said in the release.
The notices were issued under Australia’s Basic Online Safety Expectations, a framework established under the Online Safety Act 2021 that outlines the minimum online safety requirements expected of tech companies operating in the country, according to the press release.
“Industry must be upfront on the steps they are taking, so that we can get the full picture of online harms occurring and collectively focus on the real challenges before all of us,” Inman Grant said in the release.
As PYMNTS has reported in recent months, Australian regulators and courts have focused on other practices of Big Tech companies as well.
In June, an Australian court found Google liable for defamatory videos posted on its YouTube platform that targeted a senior politician.
In March, Australia announced planned laws that would allow the country’s media watchdog, the Australian Communications and Media Authority, to force Big Tech companies to turn over data on how they handle misinformation and to enforce an internet industry code against platforms that don’t cooperate.
Read more: Australia to Force Big Tech to Share Information Data
“Digital platforms must take responsibility for what is on their sites and take action when harmful or misleading content appears,” Communications Minister Paul Fletcher said in a statement at the time.