TikTok fined £12.7m after it ‘did not do enough’ to keep under-13s off platform

Social media platform TikTok has been fined £12.7 million because it “did not do enough” to keep underage children off its platform and to ensure their data was handled correctly.

The Information Commissioner’s Office (ICO) said that TikTok allowed up to 1.4 million children under 13 to use its platform in 2020, which was against its terms of service.

However, the fine is less than half the £27 million that the ICO originally said it might impose on TikTok for the breaches.

TikTok nevertheless said it disagreed with the decision and would consider its options.

The regulator slashed the potential fine, which it first announced in September last year, after deciding not to pursue an initial finding that the company had unlawfully used “special category data”.

Special category data includes ethnic and racial origin, political opinions, religious beliefs, sexual orientation, trade union membership, genetic and biometric data, and health data.

But the regulator upheld its findings that TikTok failed to ensure that users under 13 had permission from their parents or carers to use the platform.

The company also failed to carry out adequate checks to identify and remove these children from its site, despite concerns being raised with senior members of staff.

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws,” said information commissioner John Edwards.

“As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data.

“That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had.

“They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

TikTok said: “TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.

“While we disagree with the ICO’s decision, which relates to May 2018 – July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year.

“We will continue to review the decision and are considering next steps.”

In 2019, US regulators hit the company with a $5.7 million (£4.5 million) fine for similar practices relating to the improper collection of data from children under 13.

Earlier on Tuesday, Australia became the latest country to ban the Chinese-owned app from its federal government’s devices.

Last month the UK Government said it would block TikTok from its devices and networks over safety concerns, with the Scottish Government following suit.

TikTok, owned by Chinese internet company ByteDance, has regularly said that it does not share data with China.

But Beijing’s intelligence legislation requires firms to help the Communist Party when requested.

In March, TikTok chief executive Shou Zi Chew made a rare public appearance to be questioned by the US Congress over data security and user safety.

“Let me state this unequivocally, ByteDance is not an agent of China or any other country,” he said at the hearing, making his case for why the popular app should not be banned.
