Profiling, Advertising, and Mental Health: Why is TikTok under scrutiny for how it uses children’s data?

Social media platforms have come under scrutiny for their impact on people’s mental health and for their questionable processing of children’s data. This scrutiny has led to increasingly frequent regulatory action against large social media and entertainment platforms that process children’s data. Most recently, on 5 September 2022, Instagram was fined €405m by the Irish regulator for violating children’s privacy in relation to their phone numbers and email addresses.

Notably, in 2020, TikTok’s recommendation algorithm, which suggests content based on viewing history, was reported to surface videos encouraging self-harm and suicide.[1] This is particularly dangerous because such recommendations can reach and influence already vulnerable users who suffer from mental health issues. The platform has also come under scrutiny from the European Commission and the European Consumer Organisation (BEUC)[2] over its reliance on consent for personalised advertising and its failure to comply with GDPR requirements.

Meanwhile, adverts on social media platforms encouraging self-diagnosis of mental health conditions such as ADHD, anxiety or other disorders are now commonplace. The problem is that the symptoms these ads describe are often misleading and can relate to a number of different health issues. Teenagers, children and users with mental health issues are commonly targeted by personalised medical advertising promoting health companies and products.[3]

So on what lawful basis does TikTok suggest content of this nature? In its legal section,[4] TikTok describes two categories of personal data the platform collects to personalise ads:

  • on-TikTok activity (the accounts the user follows, “liked” videos and user’s profile information)
  • off-TikTok activity (data about the user shared with TikTok by businesses to reach potential customers).

For personalised advertisements, TikTok uses consent as its lawful basis to access or use information. For “on-TikTok activity”, i.e. data about users’ activity on the platform, TikTok planned to switch the lawful basis from consent to legitimate interests from 13 July 2022. For “off-TikTok activity”, the company would continue to rely on consent. Under this change, users who had not provided consent to the use of their data for personalised ads would start receiving personalised ads without it. Privacy activist groups,[5] however, criticised TikTok’s intention to rely on legitimate interests and argued that the changes should be investigated by the EDPB. Until further notice, TikTok has paused[6] the changes to its personalised advertising settings in the EEA, UK and Switzerland.
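To make the two-category model concrete, here is a minimal, purely illustrative Python sketch of how a platform might map each data category to a lawful basis under the planned (now paused) change. All names here (AdSignal, lawful_basis_for, the LEGITIMATE_INTERESTS_ROLLOUT flag) are hypothetical and do not reflect TikTok’s actual systems:

```python
from dataclasses import dataclass

# Hypothetical flag mirroring the paused 13 July 2022 change: while False
# (the current, paused state), both categories still require consent.
LEGITIMATE_INTERESTS_ROLLOUT = False

@dataclass
class AdSignal:
    name: str
    category: str  # "on_platform" (follows, likes, profile) or "off_platform" (business-shared data)

def lawful_basis_for(signal: AdSignal, user_consented: bool) -> str:
    """Return the lawful basis under which this signal may feed personalised ads."""
    if signal.category == "off_platform":
        # Off-platform data would have continued to rely on consent.
        return "consent" if user_consented else "no_basis"
    if signal.category == "on_platform" and LEGITIMATE_INTERESTS_ROLLOUT:
        # The planned change: on-platform activity processed under legitimate interests,
        # meaning personalised ads without the user's consent.
        return "legitimate_interests"
    return "consent" if user_consented else "no_basis"

# Example: while the rollout is paused, a user who never consented
# provides no lawful basis for ad personalisation from liked videos.
liked_videos = AdSignal("liked_videos", "on_platform")
print(lawful_basis_for(liked_videos, user_consented=False))  # "no_basis"
```

Flipping the hypothetical flag to True reproduces the situation the activist groups objected to: on-platform signals feed personalised ads regardless of consent.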

In July 2022, the Italian Supervisory Authority[7] issued a formal warning against personalised ads based on legitimate interests, under Article 58(2)(a) GDPR and Section 154(1)(f) of Italian data protection law. The modification of the privacy policy concerned users above age 18, who would receive personalised ads based on “profiling” of their behaviour on the platform. The Italian Supervisory Authority (SA) concluded that the only legal basis for the storage of information, or access to information stored, in a user’s terminal equipment would be consent. The Italian SA also highlighted difficulties around the implementation of adequate age verification measures and the risk that children aged below 14 years could receive “personalised” ads with harmful and unsuitable content.

In its Terms of Service,[8] TikTok states that users must be 13 years of age or older to use the platform:

“You can only use the Platform if you are 13 years of age or older. We monitor for underage use and we will terminate your account if we reasonably suspect that you are underage. You can appeal our decision to terminate your account if you think we have made a mistake about your age.”

However, the reality is different: TikTok is a popular platform among 8-12 year olds in the UK, which makes them easy targets for profiling and advertising.

Issues regarding children’s data are not new. In February 2021, the Italian DPA[9] ordered TikTok to restrict the processing of personal data of users whose age it could not verify. The order followed the death of a 10-year-old girl from Palermo who had taken part in a “blackout challenge”, an online asphyxiation challenge.[10] The legal issue was the adequacy of TikTok’s age verification measures: the Italian DPA imposed a limitation on TikTok’s processing of the data of users whose age could not be established with certainty.

To consider both sides: in a reply to the Italian SA, TikTok stated that it would implement measures to block access by users under 13, including the deployment of AI-based systems for age verification, campaigns to improve children’s and parents’ awareness, a request from February 2021 for users to re-enter their age, the deletion of accounts of users aged below 13, and an in-app button to report suspected underage users. As TikTok has its main establishment in the EU in Ireland, it would work alongside the Irish Data Protection Commission (DPC) on age verification with the use of AI.
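As a rough illustration of the kind of age-assurance gate these measures imply, consider the following sketch. It is an assumption-laden toy, not TikTok’s implementation: the decision labels and the idea of a separate AI-derived age estimate are hypothetical, and the “restrict” branch simply mirrors the logic of the Italian DPA’s limitation on processing where age cannot be established with certainty:

```python
from enum import Enum
from typing import Optional

class ProcessingDecision(Enum):
    ALLOW = "allow"          # age established, user 13 or older
    RESTRICT = "restrict"    # age uncertain: limit processing (per the Italian DPA order)
    TERMINATE = "terminate"  # reasonably suspected under 13: delete the account

def gate_processing(declared_age: Optional[int],
                    estimated_age: Optional[int]) -> ProcessingDecision:
    """Decide what processing is permissible given declared and estimated age.

    `estimated_age` stands in for whatever AI-based age-estimation signal a
    platform might deploy; None means no confident estimate could be made.
    """
    if declared_age is not None and declared_age < 13:
        return ProcessingDecision.TERMINATE
    if estimated_age is not None and estimated_age < 13:
        # Declared age contradicted by the estimate: treat as suspected underage.
        return ProcessingDecision.TERMINATE
    if declared_age is None or estimated_age is None:
        # Age cannot be established with certainty: restrict processing.
        return ProcessingDecision.RESTRICT
    return ProcessingDecision.ALLOW

print(gate_processing(declared_age=15, estimated_age=None))  # ProcessingDecision.RESTRICT
```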

Following this, in September 2021 the Irish DPC opened an investigation into whether TikTok had complied with transparency obligations, its public-by-default processing for users under 18, and its age verification measures for users under 13. On 13 September 2022, a draft decision was submitted to the other Supervisory Authorities under Article 60 of the GDPR[11] for review and for any objections to be raised.

In March 2022, the UK High Court allowed a class action against TikTok to proceed in SMO (A Child) v TikTok and others,[12] following a representative action brought by the Children’s Commissioner for England concerning misuse of private information and unlawful processing of data. The claim alleged that TikTok had violated the EU GDPR and UK GDPR, and that it had failed to be transparent about children’s data and the purposes of its collection, including the collection of behavioural and content information. The claim had been stayed pending the outcome of Lloyd v Google, in which the Supreme Court declined to allow collective claims for privacy damages. One of the grounds in SMO v TikTok and others sought to distinguish the case from Lloyd v Google, as below:

“148. … The class is very different, comprising children with a TikTok account and who actually used TikTok while logged into that account in the Claim Period. The personal data and private information collected and processed not only includes all of the information required for setting up an account, device information and location, but also includes behavioural and content information (including the content viewed, how long the user views videos, what advertisements are viewed and for how long, how many times videos are viewed and search history), and inferred information such as age-range and gender. This extends well beyond the situation in Lloyd.” (SMO (A Child) v TikTok & Others)

In the UK, the Children’s Code refers, for example, to age appropriate design and to meeting standards fit for the context, under the idea that “organisations can not use personal data in ways that are detrimental to children or that go against industry codes of practice”.[13] Meeting the “privacy by default” standard means using privacy-enhancing methods to determine age. For profiling, the Code calls for content controls, algorithmic risk assessment and transparency about how content recommendation works.[14] TikTok will need to show that it took the necessary measures to align its services with these requirements. The platform currently publishes material about its security tools and “private” profile settings, and includes a shorter version of its privacy notice for young users, as indicated on the platform’s website.[15]
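By way of illustration, “privacy by default” can be pictured as age-banded default settings along the following lines. The bands and setting names below are hypothetical examples in the spirit of the Code, not the ICO’s requirements or TikTok’s actual configuration:

```python
def default_settings(age: int) -> dict:
    """Return hypothetical privacy-by-default account settings for an age band."""
    if age < 13:
        # Under the platform's own Terms of Service, no account at all.
        return {"account_allowed": False}
    return {
        "account_allowed": True,
        "profile_private": age < 18,           # private by default for minors
        "personalised_ads": age >= 18,         # no ad profiling by default for minors
        "recommendations_transparency": True,  # explain how content recommendation works
    }

# High privacy by default for a 14-year-old: ad profiling is off unless changed.
assert default_settings(14)["personalised_ads"] is False
```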

For the DPC, companies should demonstrate that their activities are in the best interests of the child, independently of commercial interests. For profiling, marketing and advertising activities, this amounts to “zero interference” with children’s interests. For platforms this is particularly difficult because profiling may be suitable for, and used for, purposes other than personalised advertising and marketing, such as estimating an individual’s age to deliver age-appropriate content, thereby helping to promote children’s rights.

TikTok and other online platforms have demonstrably worked on improvements to their services; however, some issues remain unresolved. Scrutiny of the inappropriate and unlawful use of children’s data by large tech companies, particularly for personalised advertising, is unlikely to subside. With the Digital Services Act in the EU and the Online Safety Bill in the UK, online platforms are expected to face even closer attention. Overall, this shows the need to ensure children’s online safety across a number of areas, from age assurance to the transparent determination of the lawful basis and purposes for processing. Although it is necessary to protect fundamental rights, this should not place a disproportionate burden on online services. A balanced approach, grounded in the proportionality principle and supported by technical measures, will be essential to consider both sides and to maintain the provision of services by online platforms.


[1] BBC. TikTok tries to remove widely shared suicide clip. 8 September 2020.

[2] BEUC. Press release. 21 June 2022. Source: https://www.beuc.eu/press-releases/investigation-tiktok-closed-important-questions-unresolved-consumers-left-dark

[3] Financial Times. Self-diagnosis ads on TikTok blur mental health fears with reality. Source: https://www.ft.com/content/dd63fb93-fa81-4a29-918e-93fa06fb8c4c

[4] TikTok. Changes to personalised advertising in the EEA. Source: https://www.tiktok.com/legal/changes-to-personalised-advertising-in-the-eea?lang=en

[5] Access Now. Intervening in TikTok’s upcoming changes to personalized advertising settings in the EEA, UK and Switzerland. 5 July 2022. Source: https://www.accessnow.org/cms/assets/uploads/2022/07/Access_Now_TikTok_EDPB_Personalised_Ads.pdf

[6] TikTok. Changes to our personalized advertising settings in the EEA, UK and Switzerland. Source: https://www.tiktok.com/legal/changes-to-personalised-advertising-in-the-eea?lang=en

[7] EDPB. TikTok: Italian SA warns against ‘personalised’ ads based on legitimate interest. Source: https://edpb.europa.eu/news/national-news/2022/tiktok-italian-sa-warns-against-personalised-ads-based-legitimate-interest_en

[8] TikTok. Terms of Service, Minimum age. Source: https://www.tiktok.com/legal/terms-of-service-eea?lang=en

[9] Garante per la Protezione dei Dati Personali (Italian SA). Source: https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9524224

[10] EDPB. Italian DPA imposes limitation on processing on TikTok after the death of a girl from Palermo. 26 January 2021. Source: https://edpb.europa.eu/news/national-news/2021/italian-dpa-imposes-limitation-processing-tiktok-after-death-girl-palermo_en; GPDP: https://www.gpdp.it/web/guest/home/docweb/-/docweb-display/docweb/9524194

[11] Irish DPC. Irish DPC submits Article 60 draft decision on inquiry into TikTok. 13 September 2022. Source: https://www.dataprotection.ie/en/news-media/irish-dpc-submits-article-60-draft-decision-inquiry-tiktok-0

[12] SMO (A Child) v TikTok & Others. Source: https://www.bailii.org/ew/cases/EWHC/QB/2022/489.html

[13] ICO. Detrimental use of data. Source: https://ico.org.uk/for-organisations/childrens-code-hub/faqs-on-the-15-standards-of-the-children-s-code/

[14] ICO. Children’s Code hub: profiling for content delivery and service personalisation. Source: https://ico.org.uk/for-organisations/childrens-code-hub/how-to-use-our-guidance-for-standard-one-best-interests-of-the-child/children-s-code-best-interests-framework/profiling-for-content-delivery-and-service-personalisation/

[15] TikTok. Privacy Policy for Younger Users. Source: https://www.tiktok.com/legal/privacy-policy-for-younger-users?lang=en
