Data Protection News Update 30 March 2026

MPs urge the UK Government to reconsider giving Palantir access to Financial Conduct Authority (FCA) data

  • The FCA, the UK’s financial services regulator, has contracted with Palantir to apply its AI systems to two years of internal intelligence data to help it tackle financial crime.
  • The Liberal Democrats this week called for a government investigation into the contract, while the Green Party has urged that the contract be blocked, citing Palantir’s alleged close connection with Donald Trump.
  • In the UK, Palantir holds more than £500 million in contracts with government organisations, including NHS bodies, police forces and the Ministry of Defence.
  • The FCA insists that it will be the data controller and Palantir the data processor. It also states that it will retain exclusive control over the encryption keys for the most sensitive files, with the data hosted and stored solely in the UK.
  • Palantir’s European head has stated that the company will not process the data for its own purposes as it is “something that we have no business interest in, and that we are legally and contractually prevented from doing”.

ICO and Ofcom align on age assurance rules in joint statement

  • The ICO and Ofcom issued a joint statement clarifying how age assurance measures must comply with both data protection and online safety laws, requiring organisations to choose age assurance methods that effectively protect children while minimising unnecessary data processing. 
  • The guidance targets platforms likely to be accessed by children that fall within scope of the Online Safety Act and UK GDPR. It emphasises lawful, fair, and proportionate processing of personal data in age checks: organisations may process only the data strictly necessary for the check, and must ensure clear user information, safeguards, and accountability.
  • Organisations must balance child safety with privacy, and use age assurance to prevent unlawful processing of children’s data (e.g. under minimum age thresholds) or apply Children’s Code protections by default where age cannot be reliably determined.

Government Digital Service (GDS) establishes Vulnerabilities Working Group (VWG) to promote data standards for public services

  • The VWG aims to develop common data standards to enable agencies to identify risks earlier and respond more effectively.
  • Fragmented public sector data could lead to delays in support, missed safeguarding risks, and inconsistent services for vulnerable groups. Therefore, creating shared definitions and standardised terminology could improve how data is exchanged securely across organisations.
  • The VWG includes representatives from central government, local authorities, NHS England, and external partners, who will develop common definitions and validate them through real-world use cases.
  • To ensure compliance with data protection legislation, GDS has published practical guidance, ‘Principles for Securing Personal Data in Government Services’. These principles emphasise privacy by design and set clear standards for data access, collection, and sharing to maintain public trust.

United States

US Online Privacy Bill seeks to give individuals broad rights over their personal data while preserving stronger state protections

  • U.S. Rep. Zoe Lofgren has introduced a federal privacy Bill, the Online Privacy Act, as Congress continues to consider adopting a national baseline for data privacy.
  • This new Bill aims to set nationwide rules for how personal data is collected, used, and shared, while shifting control back to individuals through enforceable privacy rights and a new federal enforcement body.
  • The Bill provides a comprehensive framework covering individual rights, corporate data duties, data security, breach notification, enforcement, and the creation of a standalone Digital Privacy Agency.
  • The Bill also addresses automated decision-making, requiring a covered entity to inform an individual what personal information is used in a decision made solely through automated processing “when that processing materially increases reasonably foreseeable significant privacy harms”.

Biometric ID developed by OpenAI’s CEO for AI commerce raises data protection risks 

  • Sam Altman’s initiative Tools for Humanity has launched AgentKit, a beta verification tool to support agentic commerce: it enables AI agents to make purchases on behalf of users while proving a real human is behind the transaction.
  • The system relies on World ID, generated from users’ biometric iris scans via the Orb device and converted into encrypted identifiers. This involves processing special category biometric data, raising significant GDPR concerns around lawful basis, security, and potential re-identification.
  • AgentKit integrates this identity layer into blockchain-based payment infrastructure, allowing automated transactions between systems while linking them to a verified individual, which introduces risks around traceability, profiling and potential misuse of identity-linked transaction data. 
  • While positioned as a safeguard against fraud and AI-driven abuse online, the model is based on sensitive identity data and expands its use across commercial ecosystems, creating heightened risks around function creep, consent validity and cross-platform data sharing. 

Europe

CJEU clarifies limits on abusive GDPR access requests

  • The Court of Justice of the European Union ruled in Brillen Rottler (C-526/24) that even a first Article 15 GDPR access request can be refused if demonstrably abusive, particularly where it is made solely to generate compensation claims rather than to verify the lawfulness of data processing. 
  • Controllers may rely on publicly available evidence such as systematic access requests and repeated compensation claims across organisations to show misuse of data subject rights, helping prevent exploitation of GDPR mechanisms. 
  • The judgment reinforces that individuals can still claim compensation for genuine GDPR breaches but must prove actual material or non-material damage rather than relying on procedural violations alone. 
  • Crucially, compensation may be denied where the data subject’s own conduct caused the damage, signalling a stricter approach to balancing data protection rights with the prevention of abusive or strategic litigation.

The Court of Justice of the European Union (CJEU) has ruled that biometric data may only be collected by police authorities if deemed ‘strictly necessary’

  • The CJEU has clarified that national police authorities do not have the power to collect biometric data such as fingerprints and photographs from suspects without first carrying out a case-by-case assessment under the ‘strictly necessary’ test.
  • This ruling comes after a French court sought clarification from the CJEU on whether an individual can be convicted for refusing to consent to biometric data collection when they were not prosecuted for, or convicted of, the original offence for which the data was requested.
  • The CJEU held that, as biometric data qualifies as sensitive data, its processing must satisfy the ‘strictly necessary’ test. Authorities must demonstrate that collecting the data is strictly necessary in the specific circumstances of a particular case, and that alternative measures with “less serious interference with the rights and freedoms” of the individual could not be pursued.

Ireland to bolster airport security checks using Passenger Name Record (PNR) data

  • Ireland’s Minister for Justice, Home Affairs and Migration has won Government support to introduce amendments through the Criminal Law and Civil Law (Miscellaneous Provisions) Bill 2026, expanding the use of Passenger Name Record (PNR) data.
  • PNR data is collected by airlines when passengers book flights and includes names, travel dates, payment information and contact details. It is shared with national Passenger Information Units (PIUs) for passengers travelling into the EU from other countries, to help spot suspicious travel patterns.
  • The new amendments proposed by the Irish Minister will extend the use of the PNR data to intra-EU flights.
  • The amendments, as drafted into the new Bill, are to include strong data protection safeguards, such as limits on data retention periods and an independent authority (separate from the PIU) to approve every request to disclose PNR data to police or other agencies.

International

Nigeria joins 60 other countries in a global pact to address AI privacy issues

  • The Nigeria Data Protection Commission (NDPC) has signed the “Joint Statement on AI-Generated Imagery and the Protection of Privacy”, established by the International Enforcement Cooperation Working Group of the Global Privacy Assembly.
  • The statement seeks to address AI privacy harms such as the creation of non-consensual imagery, defamatory content, and other harmful material, particularly where it affects children and vulnerable groups.
  • It further proposes implementing strong safeguards, ensuring transparency, providing effective content removal mechanisms, and fully complying with applicable data protection laws.

Brazil’s court limits credit data sharing without consent

  • Brazil’s Superior Court of Justice clarified that while credit protection can justify internal processing of personal data for risk analysis, it does not permit sharing identifiable consumer data with third parties without valid consent. 
  • The ruling draws a clear legal distinction between credit scoring, credit history, and identifiable registration data, emphasising that directly identifiable data requires stronger safeguards and cannot be freely circulated across the credit ecosystem. 
  • Unlawful sharing of identifiable data now gives rise to presumed moral damages, meaning individuals do not need to prove harm, significantly increasing litigation and compliance risk for organisations handling consumer data. 
  • The decision has major operational implications: organisations must reassess data-sharing practices, implement robust consent mechanisms, and redesign data governance frameworks, reinforcing privacy-by-design and limiting downstream data use without a clear legal basis.

For the latest updates on the Palantir–FCA data deal, the ICO and Ofcom age assurance rules, AI biometric privacy risks, CJEU rulings on the GDPR, and global data protection developments, visit our Data Protection News hub.
