United Kingdom
NHS pledges to ‘protect data’ as researchers in China access UK study data
- Following a report by The Guardian disclosing that researchers from China are to be allowed access to half a million GP records held by the research hub UK Biobank, NHS officials have insisted they will continue to protect patient information.
- The Guardian’s report echoed earlier concerns raised by MI5 about the possibility of Chinese intelligence agencies ordering Chinese organisations and individuals “to carry out work on their behalf” through access to UK data.
- Professor Sir Rory Collins, principal investigator and chief executive at UK Biobank, reiterated that all participants who joined the initiative gave explicit consent for their de-identified health data to be studied. In addition, an NHS England spokesperson said that “NHS England is working closely with the Government, the GP profession, privacy campaigners and patient representatives to allow GP data to be shared with specific approved research studies in cases where individual patient consent has been provided”.
- Nicola Perrin, chief executive of the Association of Medical Research Charities, stated that the press coverage made it sound “as though there will be a free-for-all on Chinese access to NHS data”, which is “simply not the case”. Any request for access goes through rigorous application and approval processes, and GP data will only be accessible within UK Biobank’s Research Analysis Platform, a secure data environment.
Law firm fined £60,000 following cyber attack
- The ICO has fined Merseyside-based DPP Law Ltd (DPP) £60,000 following a cyber-attack in June 2022, in which a brute-force attack gained access to an administrator account that lacked multi-factor authentication (MFA). The account was used to access a legacy case management system, enabling the attackers to move laterally across DPP’s network and exfiltrate over 32GB of data.
- DPP only became aware of the incident when the National Crime Agency contacted the firm to advise that information relating to its clients had been posted on the dark web. DPP did not consider that the loss of access to personal information constituted a personal data breach, and so did not report the incident to the ICO until 43 days after becoming aware of it.
- DPP specialises in law relating to crime, military, family, fraud, sexual offences and actions against the police. According to the ICO, the very nature of this work means the firm is responsible for highly sensitive and special category data, including legally privileged information.
- The ICO reiterated that organisations are required to take continual and proactive steps to protect themselves against cyber-attacks. This includes ensuring all IT systems have MFA or equivalent protection, regularly scanning for vulnerabilities and installing the latest security patches without delay.
United States
Shopify must face data privacy lawsuit in US
- On 21 April, a US appeals court revived a proposed data privacy class action against Shopify, a decision that could make it easier for American courts to assert jurisdiction over internet-based platforms.
- Brandon Briskin, a California resident, claimed Shopify installed cookies on his iPhone without his consent and used his data to create a profile it could sell to other merchants. Shopify said it should not be sued in California because it operates nationwide and did not aim its conduct toward that state.
- A lower court judge and a three-judge 9th Circuit panel had agreed the case should be dismissed, but the full 9th US Circuit Court of Appeals in San Francisco reversed in a 10-1 decision, holding that Shopify can be sued in California for collecting personal identifying data from people who make purchases on the websites of retailers in that state.
- According to Circuit Judge Kim McLane Wardlaw, who wrote for the majority, Shopify knowingly installed “tracking software onto unsuspecting Californians’ phones so that it could later sell the data it obtained, in a manner that was neither random, isolated, or fortuitous”.
- Matt McCrary, a lawyer for Briskin, said the court bolstered accountability for internet-based companies by rejecting the argument that “a company is jurisdictionally ‘nowhere’ because it does business ‘everywhere.'” Shopify’s next legal steps are unclear.
Europe
Meta says it will resume AI training with public content from European users
- Meta announced that it will start using publicly available content from European users to train its artificial intelligence models, resuming work put on hold last year after activists raised concerns about data privacy.
- The company said that it would train its AI systems using public posts and comments shared by adult users in the 27-nation European Union. Users’ interactions with Meta AI, like questions and queries, will also be used to train and improve the models.
- The company’s AI training efforts had been hampered by EU data privacy laws. Vienna-based group NOYB had complained to various national privacy watchdogs about Meta’s AI training plans and urged them to stop the company before it started training its next generation of AI models.
- Meta noted that a panel of EU privacy regulators in December “affirmed” that its original approach met legal obligations. The company said it won’t use private messages to train its AI model and repeated its point that it is merely following the example of rivals Google and OpenAI, “both of which have already used data from European users to train their AI models.”
EU standards bodies flag delays to work on AI Act
- Development of the technical standards companies will use to demonstrate compliance with the EU’s AI Act is behind schedule, according to CEN-CENELEC, the main standardisation bodies working with the European Commission.
- In 2023 the Commission asked the organisations to develop standards in support of the AI Act that would allow manufacturers to demonstrate that their products, services or processes comply with the rules and are safe and trustworthy.
- The standards were scheduled to be ready by August 2025, but “based on the current project plans, the work will extend into 2026,” said CEN-CENELEC, which consists of 34 national standardisation bodies of European countries.
- The AI Act, which aims to regulate high-risk applications, entered into force in August last year and is being implemented gradually; it will be fully in force in 2027. By August this year, member states should have set up national regulators to oversee companies’ compliance at a domestic level, and these will work together with the Commission’s AI Office, a unit within DG Connect.
International
Notes from the Asia-Pacific region: India strides ahead on the digital front
- In India, the digital governance landscape keeps evolving across different fronts.
- Over 120 members of Parliament came together on 26 March to urge repeal of Section 44(3) of the Digital Personal Data Protection Act (DPDPA). They contend this section allows authorities to withhold any “personal information” without applying a public interest test, even where that information is needed for public accountability, which they believe would upset the balance between privacy and transparency.
- In addition, the government has indicated plans to amend the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, which establishes the legal framework for India’s Aadhaar unique identification number system. The amendment is intended to harmonise the act with the DPDPA.
- The Aadhaar Act has drawn criticism over the years for conflicting with the requirements of the DPDPA. For example, while the DPDPA requires consent to be free, specific, informed and unambiguous, in practice several agencies mandate the use of Aadhaar regardless of consent. There are also several use cases in which Aadhaar data is used and reused for purposes the user is not aware of.
- Meanwhile, on the AI front, the government is considering requiring AI models to be stored locally. The stated objective is to reduce risks associated with the models and prevent the flow of data outside the country.