Computer Says ‘Discriminate’? Human Rights and Proposed Changes on Automated Decision-Making in the DPDI (No. 2) Bill

With the human population approaching 8.1 billion and after years of checking the daily coronavirus death statistics in our local areas, it is now easier than ever in 2024 to view our fellow humans as mere numbers. Statistical analysis, scientific advancement, and automation have so far served humanity well, and it is now on us to ensure they keep delivering for us.

The European data protection regime currently offers one of the world’s most robust and comprehensive protection mechanisms against the use of artificial intelligence to make important decisions about human lives. Significant changes are underway in the UK as the Data Protection and Digital Information (No. 2) Bill [1], currently at Committee Stage in the House of Lords, continues its progress through Parliament.

The current rules on automated decision-making

The Data Protection and Digital Information (“DPDI”) (No. 2) Bill aims to make numerous amendments to the UK’s current data protection regime, one of which is a relaxation of the prohibition against solely automated decision-making (“ADM”). The Article 29 Working Party guidelines, since endorsed by the European Data Protection Board, define solely automated decision-making as “the ability to make decisions by technological means without human involvement.” [2] The UK’s current data protection rules originate from the GDPR [3], which was retained following the UK’s withdrawal from the EU. Article 22 UK GDPR deals with the rules around ADM. As it stands, individuals have a general right not to be subject to a decision based solely on automated processing, unless one of three conditions applies:

  • The processing is necessary to perform a contract between the data subject and the controller.
  • The processing is required or authorised by UK law.
  • The individual has given explicit consent.

Similarly, such decisions cannot be based on special categories of data, which include sensitive information such as race, ethnicity, religion, and health, unless:

  • The individual has given explicit consent, or
  • The processing fulfils the substantial public interest conditions.

The data controller for a processing activity involving ADM is responsible for informing individuals about the use of ADM and the logic behind the automation.

What will the DPDI (No. 2) Bill change?

Clause 14 of the DPDI (No. 2) Bill will replace Article 22 UK GDPR with Articles 22A-D. The most controversial change to the current rules on ADM is the proposal to remove the general prohibition against ADM and replace it with a prohibition against ADM based, entirely or partly, on special categories of data. For processing non-special category data, relying on one of the Article 6 legal bases will now suffice; such a basis is already required for all processing activities, whether or not they involve ADM.

Article 22B(1)-(3) prohibits the use of special categories of data for ADM unless one of two conditions applies:

  • The individual has given explicit consent, or
  • The processing activity is both:
    • (i) necessary to perform a contract between the data subject and the controller, or (ii) required or authorised by law (the Explanatory Notes [4] for the Bill specify that this second limb includes the reasonable use of such processing to comply with legal obligations or where processing is necessary for the performance of a public task), and
    • in the substantial public interest.

Why does it matter?

One of the main criticisms of AI technologies is their tendency to make blanket assumptions about a subset of people and amplify them. In their Public Sector Equality Duty assessment for the DPDI (No. 2) Bill, the Government acknowledged that, historically, ADM has had a “disproportionately detrimental effect upon people with protected characteristics”. [5] There is a significant overlap between these protected characteristics under the Equality Act 2010 and special categories of personal data under UK GDPR, e.g., some health data, race, religion, and sexual orientation.

Shifting the general prohibition on ADM from all personal data to only special categories of personal data is a bold move, and there is more to it than meets the eye. [6] Natural intelligence, as opposed to artificial, can deploy empathy on command, comprehend nuance, and actively decide whether or not to discriminate. AI is much more susceptible to associating, purposely or accidentally, seemingly non-sensitive characteristics (proxies) with characteristics falling under special categories of data. [7] Non-sensitive factors used in automated decision-making, such as name and postcode, can correlate with more sensitive characteristics, such as race and ethnicity. If an algorithm makes decisions based on a non-sensitive data point, such as postcode, which incidentally happens to be interlinked with race or ethnicity, the organisation could argue that the decision was not made based on sensitive data when, in reality, it was an indirect factor.

Whereas Article 22 UK GDPR does not allow such a processing activity unless one of the exceptions applies, the proposed changes under the DPDI (No. 2) Bill would allow it on the simple condition that the organisation relies on one of the Article 6 legal bases, which is already required for all processing activities. This pitfall forms the rationale behind the GDPR’s general prohibition against the use of ADM for all personal data, unless the exceptions above apply: given the scientific progress made so far with AI technologies, algorithms cannot stay fully confined to the scope of “non-special category data”. Whether through deliberate masking of the real decision factor or through accidentally and indirectly factoring in other criteria, algorithms can make significant decisions based on special category data while organisations still argue that they are, in fact, not based on it.
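
To make the proxy mechanism concrete, below is a minimal illustrative sketch in Python. Everything in it is assumed for demonstration purposes: the synthetic population, the strength of the postcode-to-group correlation, and the historical approval rates are all invented, and no real system or dataset is implied.

```python
# Hypothetical sketch: how a "non-sensitive" proxy (postcode) can leak a
# special category characteristic into an automated decision. All numbers
# below are invented assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Assumption: membership of a protected group (special category data) is
# unevenly distributed across two postcode areas (80% correlation).
group = rng.integers(0, 2, n)
postcode = np.where(rng.random(n) < 0.8, group, 1 - group)

# Assumption: historical outcomes were biased against group 1,
# independently of any legitimate decision factor.
outcome = (rng.random(n) < np.where(group == 0, 0.7, 0.4)).astype(int)

# A "postcode-only" rule learned from those outcomes: approve if the
# historical approval rate for the applicant's postcode exceeds 50%.
rate_by_postcode = [outcome[postcode == p].mean() for p in (0, 1)]
decision = np.array([rate_by_postcode[p] > 0.5 for p in postcode])

# The rule never sees the special category data, yet its decisions
# reproduce the historical disparity.
for g in (0, 1):
    print(f"group {g}: approval rate {decision[group == g].mean():.2f}")
```

On a typical run, around 80% of group 0 is approved against around 20% of group 1, even though the rule only ever “processed” postcodes.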

Discriminatory algorithms making significant decisions based on seemingly innocuous data have already been subject to, albeit limited, judicial scrutiny in Europe. In 2020, the District Court of The Hague found the use of SyRI unlawful. SyRI was a social welfare fraud detection algorithm, and the Dutch Government conceded that it was only used to assess so-labelled “problem districts”. Because SyRI processed data on a very large scale, it inadvertently made biased links, and the algorithm ended up targeting individuals from less advantaged socio-economic groups and those with an immigration history. [8] The District Court found the use of SyRI in violation of the right to private life enshrined in Article 8 of the European Convention on Human Rights (“ECHR”). This is a real-life example of how non-sensitive proxies, such as postcode, can help algorithms make biased decisions that indirectly discriminate based on sensitive, protected characteristics, such as ethnicity.

The use of the “Gangs Matrix”, which ended up disproportionately targeting young Black men, [9] and the “Sham Marriage Triage Tool”, which flagged Albanian, Bulgarian, Romanian, and Greek couples at a higher rate, [10] are just two of the many examples of how the UK Government has relied, and continues to rely, on ADM to make critical decisions about human life. [11] Although these were found to be human rights violations, they still fell within what is permissible under Article 22 UK GDPR, as public organisations can more easily rely on one of the exemptions.

The real difference that the DPDI (No. 2) Bill will make concerns the use of such algorithms by private organisations. Consider the following scenario: a private employer develops an algorithm to filter out applicants who would likely not be shortlisted for an interview at the company. Under the DPDI (No. 2) Bill, the employer could rely on the Article 6(1)(f) ‘legitimate interest’ legal basis and use the applicants’ names and secondary schools for the initial screening. If the algorithm is trained on the relative success rates of the company’s past applicants and the company has historically favoured White applicants, similar names and secondary schools will also be favoured by the algorithm. The algorithm is technically not making any decisions based on special category data; however, racial and ethnic characteristics will undeniably be encoded within some applicants’ name and secondary school information.
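
The scenario can be sketched in a few lines of code. The following is a hypothetical illustration rather than a description of any real screening tool: the names, schools, and historical shortlisting labels are invented, and the “model” is deliberately simplistic (per-feature shortlisting rates) so that the bias baked into the labels is easy to see resurfacing in the scores.

```python
# Hypothetical sketch: a screening "model" trained only on non-sensitive
# features (name and secondary school) using historical shortlisting
# decisions. All names, schools, and labels are invented for illustration.
from collections import defaultdict

# Historical data: (name, school, shortlisted?). If past recruiters
# favoured one demographic, that bias is baked into these labels.
history = [
    ("Oliver", "St Hilda's", 1), ("Amelia", "St Hilda's", 1),
    ("Oliver", "Northgate High", 1), ("Mohammed", "Northgate High", 0),
    ("Amara", "Eastbrook Academy", 0), ("Mohammed", "Eastbrook Academy", 0),
]

def rates(index):
    """'Train': the shortlisting rate observed for each feature value."""
    totals, hits = defaultdict(int), defaultdict(int)
    for row in history:
        totals[row[index]] += 1
        hits[row[index]] += row[2]
    return {k: hits[k] / totals[k] for k in totals}

name_rate, school_rate = rates(0), rates(1)

def score(name, school):
    """'Predict': average the learned rates; unseen values default to 0.5."""
    return (name_rate.get(name, 0.5) + school_rate.get(school, 0.5)) / 2

# No special category data is processed, yet applicants whose names and
# schools resemble past favoured candidates score systematically higher.
print(score("Oliver", "St Hilda's"))           # 1.0
print(score("Mohammed", "Eastbrook Academy"))  # 0.0
```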

Was the written evidence of the Equality and Human Rights Commission sidelined?

The Equality and Human Rights Commission highlighted in their written evidence on the DPDI (No. 2) Bill that even the current version of Article 22 UK GDPR has shortcomings when it comes to avoiding the unfair and discriminatory effects of ADM. [12] Although their more pedantic, yet paternalistic, point on explicitly including the word ‘profiling’ in all relevant provisions was implemented, the two main issues they raised regarding ADM remain unaddressed:

First, the fact that public authorities will be able to rely on ADM under a wider set of circumstances poses an increased risk of interference with individuals’ rights under the ECHR. In the ECHR memorandum to the DPDI (No. 2) Bill, the Government concedes that the amendments to Article 22 UK GDPR relaxing restrictions on ADM can make the use of such technologies more prevalent. [13] This increased prevalence, in turn, can engage many citizens’ right to privacy under Article 8 ECHR, as well as Article 14 ECHR, which provides that the enjoyment of the rights outlined in the ECHR shall be secured without discrimination on any ground.

In the ECHR memorandum, the Government then goes on to state that public authorities can still rely on ADM under the current lawful bases outlined in Article 22, which is true, and that the main influx of additional processing will be by private bodies, whose decisions cannot engage Convention rights because they are not an emanation of the State. From a constitutional standpoint, this might not raise major concerns, as the Bill still proposes safeguards; however, from a data protection perspective, it seems self-defeating to allow private organisations to rely on one of the legal bases under Article 6 UK GDPR to carry out ADM that could intentionally or unintentionally factor in protected characteristics, only to underplay the risk of discrimination simply because such discrimination will be committed by a private organisation.

Second, the Equality and Human Rights Commission suggests that the definition of ADM be extended to cover partly automated decision-making as well, in line with the Information Commissioner’s Office’s (“ICO”) recommendation. The ICO, in their response to the Department for Digital, Culture, Media & Sport consultation in 2021, encouraged the Government to consider expanding the scope of Article 22 UK GDPR from solely ADM to partly ADM as well, to offer better protection of data subjects’ rights and dilute the opaqueness of ADM technologies. [14] Neither the GDPR nor the Bill in its current form contains any reference to how partly automated decision-making should be treated under data protection legislation.

And finally, what does this mean for adequacy?

Following Brexit, the European Commission adopted an adequacy decision under the EU GDPR on 28 June 2021. Although the adequacy decision allows personal data to flow freely between the European Economic Area and the UK, it does not come without caveats. To ensure data can continue to flow freely, the EU Commission will monitor developments in the UK on an ongoing basis to make sure the UK’s data protection regime continues to offer an ‘adequate’ level of data protection. The Commission is expected to start its assessment later in 2024 to decide whether the UK’s adequacy should be extended. [15]

It will not come as a shock that in July 2023, 28 civil society organisations and privacy experts, including Max Schrems, Amnesty International, Big Brother Watch, Open Rights Group, and Privacy International, wrote an open letter to the European Commission warning of the changes proposed in the DPDI (No. 2) Bill. [16] The open letter argued that the Bill would allow ‘private companies to seek shelter in the UK to circumvent European data protection standards, and turn the UK into a “test lab” for experimental and abusive uses of data’, which is one of the main criticisms the proposed changes to the rules on ADM have received. Similarly, both the IAPP [17] and Thomson Reuters Practical Law have described the proposed changes to the rules on ADM as one of the main areas of divergence from the EU GDPR. Last month, the European Commission published its report on the first review of the pre-GDPR adequacy decisions for 11 third countries and territories. [18] With the DPDI (No. 2) Bill progressing further through Parliament and the UK’s adequacy review date getting closer, this space will definitely be one to watch in 2024.

References

[1] Data Protection and Digital Information (No. 2) Bill, March 2024, https://bills.parliament.uk/bills/3430

[2] Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 by the Article 29 Data Protection Working Party, February 2018, https://ec.europa.eu/newsroom/article29/items/612053

[3] Regulation (EU) 2016/679 of the European Parliament and of the Council

[4] Data Protection and Digital Information (No. 2) Bill Explanatory Notes, December 2023, https://bills.parliament.uk/publications/53323/documents/4144

[5] Public Sector Equality Duty assessment for Data Protection and Digital Information (No.2) Bill, December 2023, https://www.gov.uk/government/publications/data-protection-and-digital-information-bill-impact-assessments/public-sector-equality-duty-assessment-for-data-protection-and-digital-information-no2-bill  

[6] Big Brother Watch Briefing on the Data Protection and Digital Information 2.0 Bill for House of Commons Committee Stage, May 2023, https://bills.parliament.uk/publications/51054/documents/3385

[7] Discrimination, Artificial Intelligence, and Algorithmic Decision-Making by Prof. Frederik Zuiderveen Borgesius, 2018, https://rm.coe.int/discrimination-artificial-intelligence-and-algorithmic-decision-making/1680925d73

[8] Judgment of the Hague District Court on SyRI, March 2020, ECLI:NL:RBDHA:2020:1878

[9] Met to Overhaul ‘Racist’ Gangs Matrix After Landmark Legal Challenge by LIBERTY, November 2022, https://www.libertyhumanrights.org.uk/issue/met-to-overhaul-racist-gangs-matrix-after-landmark-legal-challenge/

[10] Borders, Immigration, Citizenship Systems Equality Impact Assessment by Home Office, November 2020, https://www.whatdotheyknow.com/request/sham_marriages_8/response/1693677/attach/3/61422%20Maxwell%20Annex%20C%20Redacted.pdf

[11] Tracking Automated Government Register by Public Law Project, March 2024, https://trackautomatedgovernment.shinyapps.io/register/ 

[12] Written evidence submitted by the Equality and Human Rights Commission for the Data Protection and Digital Information (No. 2) Bill, May 2023, https://publications.parliament.uk/pa/cm5803/cmpublic/DataProtectionDigitalInformation/memo/DPDIB38.htm

[13] Data Protection and Digital Information (No. 2) Bill: European Convention on Human Rights Memorandum, March 2023, https://publications.parliament.uk/pa/bills/cbill/58-03/0265/echrmemo.pdf 

[14] Information Commissioner’s Office Response to DCMS consultation “Data: a new direction”, October 2021, https://ico.org.uk/media/about-the-ico/consultation-responses/4018588/dcms-consultation-response-20211006.pdf

[15] Information Commissioner’s Office’s Guidance on ‘Adequacy’, https://ico.org.uk/for-organisations/data-protection-and-the-eu/data-protection-and-the-eu-in-detail/adequacy/ 

[16] Open Letter to the EU Commission regarding UK’s data bill, July 2023, https://peoplevsbig.tech/open-letter-to-the-eu-commission-regarding-uk-s-data-bill 

[17] The International Association of Privacy Professionals’ Blog on UK data protection reform: An overview, April 2023, https://iapp.org/resources/article/uk-data-protection-reform-an-overview/

[18] European Commission’s Press Release ‘Commission finds that EU personal data flows can continue with 11 third countries and territories’, January 2024, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_161 
