Misapplied consent to justify FRT in schools: what is the legal basis?

On 31st January 2023, the Information Commissioner’s Office (ICO) issued a letter to North Ayrshire Council (NAC) regarding the use of Facial Recognition Technology (FRT) in schools. This article briefly discusses this specific use case and presents the European rulings on the topic to contest the ICO’s position.

Facial Recognition Technology (FRT)

FRT is a type of biometric recognition that consists of the “automated recognition of individuals based on their biological and behavioural characteristics”[1]. Fundamentally, FRT links one’s image to a pre-taken image stored in a database to verify one’s identity. However, this linkage can also be used to infer other characteristics, such as age, sex, gender or ethnicity, which, especially when the data subjects are children, raises privacy and data protection concerns.

European rulings

The use of FRT in schools was already ruled unlawful in 2019, when a school in Sweden and two French schools were sanctioned by their national data protection authorities (DPAs) after implementing FRT to track students’ attendance and control entry access.

The Swedish case highlighted the controversy of using ‘consent’ as a legal basis, given the inherent power dynamic between schools and pupils. The Swedish Authority for Privacy Protection considered that parents’ consent to monitoring the students was not a valid basis for collecting such sensitive personal data, and that there were less intrusive ways to monitor attendance. Consequently, the DPA found that sensitive biometric data had been unlawfully processed and that the school had failed to complete an adequate data protection impact assessment (DPIA)[2]. Had a DPIA been duly completed prior to the processing, the school would have found that the processing resulted in a high risk and would have been obliged to consult the supervisory authority under GDPR Article 36.

Weeks after this decision, the French case established that using FRT to track attendance violated data protection law, as FRT was a disproportionate measure for that purpose. As in the Swedish case, the school relied on consent as its lawful basis, and the French DPA likewise found that basis invalid: the parents’ signature on a consent form could not be considered freely given, since the targeted data subjects are in a relationship of authority with the school[3].

According to the CNIL (Commission Nationale de l’Informatique et des Libertés), using FRT in high schools is contrary to the GDPR’s proportionality and data minimisation principles. The CNIL recognised that FRT is an intrusive biometric mechanism that carries significant risks of infringing people’s privacy and individual freedoms. Additionally, streamlining student entrances can be achieved by less intrusive means, such as badge control. Such devices are also likely to create a feeling of reinforced surveillance, and these risks are heightened when they are applied to minors. Finally, the CNIL recalled that strict vigilance is required given the damage that could result from security incidents involving such biometric data.

Those decisions were acknowledged by the UK Information Commissioner in the June 2021 Opinion ‘The use of live facial recognition technology in public places’ (p. 22), which recognises the action taken by the DPAs against controllers using FRT in schools[4].

Although one might expect the ICO to take a similar approach, its latest letter to NAC raises eyebrows.

Case background: concerns from privacy groups and the media

In 2019, the first survey of public opinion on the use of FRT, conducted by the Ada Lovelace Institute (a research institute focusing on data and artificial intelligence), found that a majority of respondents (67%) were uncomfortable with the use of FRT in schools, compared with 61% for its use on public transport and 29% for its use by police forces[5].

In October 2021, concerns regarding the use of FRT in school canteens were raised in the media and directly to the ICO.

Defend Digital Me, a non-profit NGO focused on protecting children’s rights to privacy and data protection, also concluded that using FRT for such purposes is likely unlawful under the UK GDPR, in line with the French and Swedish 2019 decisions. It argued that the use of biometrics in schools is an excessive, unnecessary and disproportionate interference with children’s privacy rights, enshrined in Article 16 of the UN Convention on the Rights of the Child and Article 8 of the European Convention on Human Rights, and that no consent can be freely given where a power imbalance with authority makes refusal difficult. Moreover, it claimed that the notifications sent to parents were worded to make acceptance seem compulsory[6].

Additionally, the campaign group Big Brother Watch has argued that “no child should have to go through border-style identity checks just to get a school meal” and recognised that biometrics are “highly sensitive, personal data that children should be taught to protect”[7].

NAC schools, on the other hand, said the system would speed up queues and be more Covid-secure[8].

ICO (Lack of) Guidance

Because biometric data used for identification is special category data, processing it through FRT must be necessary for a specific purpose. It must also be “a targeted and proportionate way of achieving that purpose”, as the condition does not apply if the same purpose could be achieved by other, less intrusive means, especially where that would avoid using special category data.

Furthermore, when directing FRT at children, the standards for collecting and processing their data should be higher, as “they may be less aware of the risks involved” (Recital 38 UK GDPR).

On 4th November 2022, in the House of Lords, Lord Scriven accurately recognised that “If we leave it to individual schools, the unintended consequences and problems that will arise will be not just technical but deeply ethical and societal. There must be a balanced debate within this Parliament, and legislation must be brought forward”[9]. 

It is not reassuring, to say the least, that the authority responsible for promoting public awareness of personal data processing, rather than following the European rulings, recognised the possibility of lawfully deploying FRT in schools, provided there is a lawful basis and a DPIA has been completed.

The ICO did state that NAC could not demonstrate a valid lawful basis for the processing, as the consent statement was not specific and could apply to a broad range of processing activities. Additionally, given the power imbalance between the parties, individuals may have felt compelled to consent because of the way the information was worded and the way the introduction of FRT was presented[10].

Furthermore, NAC did not consult parents or children before the processing, which could have helped the controller identify mitigations and safeguards to address the data subjects’ risks and privacy concerns. The DPIA also did not contain enough detail on how the processing of personal data complied with the data minimisation and data accuracy principles[11].

All in all, far from acknowledging the lack of necessity or proportionality of FRT use in schools and setting its boundaries, the ICO recognised that FRT and similar technologies can potentially be used lawfully with appropriate assessment and care.

Unfortunately, in the absence of legislation covering the use of FRT that would give schools more guidance, the ICO missed the opportunity to warn of the risks FRT poses to children, focusing instead on the need for a DPIA as the golden rule that answers all questions.

[1] Biometrics Institute, ‘What is Biometrics?’. https://www.biometricsinstitute.org/what-is-biometrics/

[2] ‘Facial recognition: School ID checks lead to GDPR fine’. https://www.bbc.co.uk/news/technology-49489154

[3] ‘French privacy watchdog says facial recognition trial in high schools is illegal’. https://www.politico.eu/article/french-privacy-watchdog-says-facial-recognition-trial-in-high-schools-is-illegal-privacy/

[4] Information Commissioner’s Opinion, ‘The use of live facial recognition technology in public places’ (18 June 2021). https://ico.org.uk/media/2619985/ico-opinion-the-use-of-lfr-in-public-places-20210618.pdf

[5] ‘Beyond face value: public attitudes to facial recognition technology’. https://www.adalovelaceinstitute.org/report/beyond-face-value-public-attitudes-to-facial-recognition-technology/

[6] Defend Digital Me, ‘Facial recognition in schools’. https://defenddigitalme.org/2021/10/17/facial-recognition-in-schools/

[7] ‘ICO to step in after schools use facial recognition to speed up lunch queue’. https://www.theguardian.com/education/2021/oct/18/privacy-fears-as-schools-use-facial-recognition-to-speed-up-lunch-queue-ayrshire-technology-payments-uk

[8] ‘Should schools use facial recognition technology?’. https://www.tes.com/magazine/analysis/general/should-schools-use-facial-recognition-technology

[9] Ibid.

[10] ICO Letter, ‘North Ayrshire Council’s use of Facial Recognition Technology in its schools’. https://ico.org.uk/media/action-weve-taken/4023847/ico-letter-to-nac-appendix.pdf

[11] Ibid.
