Australia’s online safety watchdog has fined X – the social media platform formerly known as Twitter – 610,500 Australian dollars (around £316,000) for failing to fully explain how it tackled child sexual exploitation content.

Australia’s eSafety Commission describes itself as the world’s first government agency dedicated to keeping people safe online.

The commission issued legal transparency notices early this year to X and other platforms questioning what they were doing to tackle a proliferation of child sexual exploitation, sexual extortion and the livestreaming of child sexual abuse.

Commissioner Julie Inman Grant said X and Google had not complied with the notices because both companies had failed to adequately respond to a number of questions.

The platform, renamed X by its new owner Elon Musk, was the worst offender, Ms Inman Grant said. It provided no answers to some questions, including how many staff had remained on the trust and safety team responsible for preventing harmful and illegal content since Mr Musk took over.

“I think there’s a degree of defiance there,” Ms Inman Grant said.

“If you’ve got a basic HR (human resources) system or payroll, you’ll know how many people are on each team,” she added.

X did not immediately respond to a request for comment.

After Mr Musk completed his acquisition of the company in October last year, he drastically cut costs and shed thousands of jobs.

X could challenge the fine in the Australian Federal Court, but the court could then impose a penalty of up to 780,000 Australian dollars (£404,000) per day, backdated to March, when the commission first found the platform had not complied with the transparency notice.

The commission would continue to issue transparency notices to pressure X into greater openness, Ms Inman Grant said.

“They can keep stonewalling and we’ll keep fining them,” she said.

The commission issued Google with a formal warning for providing “generic responses to specific questions”, a statement said.

Google regional director Lucinda Longcroft said the company had developed a range of technologies to proactively detect, remove and report child sexual abuse material.

“Protecting children on our platforms is the most important work we do,” Ms Longcroft said in a statement.

“Since our earliest days we have invested heavily in the industrywide fight to stop the spread of child sexual abuse material,” she added.
