Data Protection News Update 14 April 2025

United Kingdom

Britain to invest up to £767 million in new health data research

  • On April 7th, Prime Minister Keir Starmer announced that Britain will invest up to 767 million pounds in a new health data research service designed to accelerate scientific research and shorten the time it takes to conduct clinical trials.
  • Set to launch by the end of 2026, the research service reportedly aims to offer a single, secure, and accessible platform, making it easier for researchers to access data without having to navigate multiple systems.
  • Emma Walmsley, chief executive of British pharmaceutical company GSK, welcomed the announcement: “The UK has unique potential to bring health data securely together with an NHS system that recognises the value of innovation, to accelerate and deliver the next generation of medicines and vaccines for patients,” she said in a statement. “This offers value to society and to the economy.”

Use facial recognition for all crimes, police told

  • UK police forces have been directed to use facial recognition searches in all criminal investigations. The independent police inspectorate has urged forces to “fully exploit” the technology, after finding disparities in how frequently it is used across different forces.
  • A Telegraph investigation revealed that police forces are conducting facial recognition searches of members of the public at a rate of one every two minutes. Officers are encouraged to gather images of their targets, including witnesses and victims, from sources including social media platforms, doorbell footage, CCTV footage, and the Police National Database (PND).
  • Initially introduced to target serious and violent offenders, facial recognition technology is now being most frequently used in low-level investigations. This growing reliance on digital technology comes as police forces face cuts to staff to reduce costs.
  • Despite its increasing use, facial recognition technology is not subject to national guidelines from either the Home Office or the College of Policing, the latter of which typically provides advice on investigative practices.

United States

Meta whistleblower alleges company worked with China on censorship

  • On April 9th, Sarah Wynn-Williams, former Global Public Policy Director at Facebook, testified before US senators, claiming that Meta executives provided the Chinese Communist Party (CCP) with access to Meta users’ data, including that of Americans.
  • Meta disputed Wynn-Williams’ claims, with spokesperson Ryan Daniels calling her testimony “divorced from reality and riddled with falsehoods.” He acknowledged CEO Mark Zuckerberg’s interest in operating in China but emphasised, “[T]he fact is this: we do not operate our services in China today.” However, Meta continues to generate advertising revenue from Chinese advertisers.
  • Wynn-Williams also alleged that Meta worked “hand in glove” with Beijing to create censorship tools aimed at silencing CCP critics. Meta denied these allegations.
  • “One thing the Chinese Communist Party and Mark Zuckerberg share is that they want to silence their critics. I can say that from personal experience,” Ms Wynn-Williams said during her testimony.

Europe

Ireland’s data regulator investigates X’s use of European user data to train Grok

  • Ireland’s Data Protection Commission (DPC) has launched an investigation into Elon Musk’s X over the social media platform’s use of personal data from European users to train its AI chatbot, Grok.
  • The DPC will examine how X processes personal data “comprised” in publicly accessible posts by European users for generative AI training.
  • The Irish regulator has previously issued fines to Microsoft, TikTok, and Meta, with Meta’s total fines nearing €3 billion.
  • In 2024, X quietly opted users in to sharing data with Musk’s AI company, xAI, for Grok’s training. Last month, Musk announced that xAI acquired X.
  • Under the EU GDPR, the DPC can impose fines of up to 4% of a company’s global annual turnover for unlawful data processing. This investigation comes after the regulator sought a court order last year to restrict X from processing European user data for AI training.

International

Apple’s encryption row with UK should not be secret, court rules

  • In a ruling published on April 7th, the tribunal hearing Apple’s UK encryption legal challenge rejected the government’s request to keep the hearing secret, citing extensive media coverage and the legal principle of open justice.
  • The tribunal stated, “It would have been a truly extraordinary step to conduct a hearing entirely in secret without any public revelation of the fact that a hearing was taking place.”
  • It further noted, “For the reasons that are set out in our private judgement, we do not accept that the revelation of the bare details of the case would be damaging to the public interest or prejudicial to national security.”

Notes from the Asia-Pacific region: Regulatory developments in China, Hong Kong

  • On April 9th, the Cyberspace Administration of China (CAC) published a Q&A for multinational corporations, clarifying that general data—excluding personal or important data—can be freely transferred out of China. However, transfers of important and personal data above certain thresholds will require a security assessment or certification.
  • China’s National Information Security Standardisation Technical Committee (TC260) has introduced six new national standards, covering areas such as data security evaluations, automated decision-making with personal data, and organisational requirements for large internet corporations, set to take effect on 1 October 2025.
  • China’s Cybersecurity Law, one of its three cornerstone data protection laws, is set to undergo new changes. The revised Cybersecurity Law draft introduces stricter compliance requirements for Critical Information Infrastructure operators and increases penalties for non-compliance, while allowing leniency for first-time or minor violations that are promptly rectified.
  • The Hong Kong Privacy Commissioner’s Office has released new guidelines for the use of generative AI by employees, highlighting the importance of protecting personal data and ensuring lawful and ethical AI usage, with penalties for violations.
