Video sharing app TikTok is set to face a legal challenge described as a “landmark case” over allegations that it illegally collected the personal information of underage users.
The claim alleges that TikTok takes personal information from children without adequate warning, transparency, or the level of informed consent required by law.
The case is being brought by Anne Longfield, the former Children’s Commissioner for England, who held the post until February 2021, on behalf of millions of children in the UK and the European Economic Area. Ms Longfield claims that the app is in breach of UK and EU laws on children’s data protection, and is fighting for it to delete all such existing data and to pay compensation that could reach billions of pounds.
TikTok’s official minimum age for an active user is 13; however, last year Ofcom found that 42% of 8-12 year olds in the UK used the video sharing platform. The regulator has also been monitoring and investigating the app’s handling of personal information. Just last year, TikTok’s parent company, ByteDance, was fined a record $5.7m in the US for illegally collecting personal information on children under 13.
Ms Longfield estimates that more than 3.5 million children in the UK alone could be affected by the mishandling or illegal storage of their personal data through TikTok.
One of the case’s main points is that, while TikTok is not unique in using personal user information to drive profits via advertising and marketing, its disproportionately young user base makes the issue of consent all the more contentious. As Ms Longfield puts it, “kids can’t give consent” to the handling of their user data, and they won’t be aware of the full context and consequences of agreeing to certain clauses in the app’s terms and conditions.
It is believed that this case could set a landmark precedent for the frameworks governing social media companies’ responsibilities to children using their platforms.