‘Little-Known Start-Up’ to ‘World’s Largest Facial Network’: An Update on Clearview AI

The name ‘Clearview AI’ is becoming increasingly notorious in data protection circles. In January 2020, it was described by the New York Times as ‘[a] little-known start-up’. Two years later, it has acquired the status of ‘the world’s largest facial network’. This article explores a new development in the clash between Clearview AI and data protection.

What is Clearview AI?

Clearview AI is a New York-based company which provides access to facial recognition technology (FRT) in return for a subscription fee. The technology – which relies on a database of over 10 billion images that Clearview AI has ‘scraped’ from the internet – has proven highly controversial. For instance, in November 2021, the UK Information Commissioner’s Office (ICO) issued a provisional decision that Clearview AI had breached UK data protection legislation and should, as a result, be fined over £17 million. The ICO’s decision – which echoed the concerns of other European regulators – has been discussed in a previous article.

That development came just over a month after the ICO's provisional decision, when Clearview AI returned to the news. This time, it had not been accused of unlawful activity: instead, it had signed an $18,000 contract to provide a subscription licence to the FBI.

What is the significance of the FBI contract?

The FBI has an annual budget of $10 billion. In that context, an $18,000 contract may seem relatively minor. However, it is arguably extremely significant because of what it represents: the main federal law enforcement agency in the US formalising its relationship with an FRT company that has attracted alarm and sanction elsewhere in the world on data protection grounds.

Beyond this symbolic significance, it is unlikely that the FBI would have signed the contract unless it envisaged expanding its use of FRT in the future. In this sense, the contract can be read as an indication of what is to come: an increasingly widespread and normalised use of FRT in the pursuit of law enforcement objectives (at least in the US).

This indication as to the future of FRT is reinforced by two further points. First, other US federal agencies have also invested in FRT recently. These include US Immigration and Customs Enforcement (ICE), which has reportedly doubled its investment in Clearview AI since January 2021.

Second, the Covid-19 pandemic has accelerated both the development and the deployment of FRT. In terms of development, FRT companies such as Tech5 and Trueface have had to update their technology to account for people wearing face masks. In terms of deployment, FRT has increasingly been linked to CCTV cameras in large cities to monitor compliance with Covid-19 restrictions; Moscow is a widely reported example of this.

The benefits and risks of FRT in the law enforcement context

It is clear, then, that FRT is set to become more mainstream in the law enforcement context.

On the one hand, this could bring some very real benefits, which should not be overlooked. For example, if the police could use FRT to identify a murder or terrorism suspect quickly and easily, much of the general public would be likely to support that use.

However, as a matter of principle, the fact that FRT can be used to support law enforcement activities does not mean that a law enforcement agency should have a blank cheque to use it however it pleases. This has been recognised by the English courts in R (on the application of Edward Bridges) v The Chief Constable of South Wales Police [2020] EWCA Civ 1058.

Further, and more substantively, FRT raises a host of serious data protection issues, which must never be omitted from the discussion. As identified by the ICO, these include (amongst others):

· A lack of choice and control for individuals.
· Problems of transparency.
· Problems of effectiveness and statistical accuracy.
· The potential for bias and discrimination.
· The processing of children’s and vulnerable adults’ data.
· The potential for wider, unanticipated impacts for individuals and their communities.

It is well established that US law places less emphasis than other legal systems on privacy and data protection rights and obligations. However, as the use of FRT by agencies such as the FBI expands, it is to be hoped that these data protection issues will receive the serious attention they deserve.
