Data Protection News Update 16 December

United Kingdom

New app lets police identify suspects in the street

  • Police forces in Wales have become the first in the UK to launch a facial recognition app. The app, known as Operator Initiated Facial Recognition (OIFR), will allow officers to use their phones to confirm someone’s identity. It can be used on people who have died or are unconscious, as well as on those who are unable or unwilling to provide their details.
  • Police have said that photos taken using the app would not be retained, and those taken in private places (such as houses, schools, medical facilities and places of worship) would only be used in situations relating to a risk of significant harm.
  • Jake Hurfurt, of Big Brother Watch, said the app “creates a dangerous imbalance between the public’s rights and the police’s powers”, and that “in Britain, none of us has to identify ourselves to police without very good reason but this unregulated surveillance tech threatens to take that fundamental right away.”

Google defeats UK privacy lawsuit over medical data deal

  • The Royal Free London NHS Trust transferred patient data to Google’s artificial intelligence unit DeepMind Technologies in 2015, to help develop an app to detect kidney injuries. In 2017 the Information Commissioner’s Office (ICO) found that the Royal Free had misused patient data when it provided the information to DeepMind.
  • Google and DeepMind were sued in 2022 by a Royal Free patient, acting on behalf of 1.6m people, for alleged misuse of private information. The claim was thrown out by London’s High Court last year, on the basis that there was no realistic prospect of establishing misuse of private information, or a reasonable expectation of privacy, for every member of the claimant class.
  • The Court of Appeal dismissed the appeal last Wednesday.

United States

AI Generated Police Reports Raise Concerns Around Transparency, Bias

  • A growing number of police departments are adopting software products using artificial intelligence (AI) to draft police reports for officers. The American Civil Liberties Union (ACLU) has published a white paper explaining why police departments should not use this technology.
  • Police reports are used in criminal investigations and prosecutions, so introducing AI into them raises civil liberties and civil rights concerns. The concept behind these products is that an officer selects a body camera video and has its audio transcribed; a large language model then turns the transcript into a first-person narrative for the officer’s report, which the officer can edit and submit (a simplified sketch of this pipeline appears after this list).
  • Problems with using the AI include bias; inaccuracies in what the AI writes being carried into the official report; erosion of the accuracy of officers’ own memories; reduced transparency in how evidence is produced; and reduced accountability for officers.
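
To make the mechanism concrete, below is a minimal illustrative sketch (in Python) of the pipeline described above: body-camera audio is transcribed and a language model turns the transcript into a first-person draft for the officer to review. The helper functions, class and file names are hypothetical placeholders for illustration only, not any vendor’s actual product or API.

```python
# Illustrative sketch of the report-drafting pipeline described in the ACLU paper:
# body-camera video -> audio transcript -> LLM-drafted first-person narrative
# that the officer is expected to edit and sign off on.
# All helpers below are stubs, not a real vendor API.

from dataclasses import dataclass


@dataclass
class DraftReport:
    incident_id: str
    transcript: str
    narrative: str
    officer_edited: bool = False


def transcribe_audio(video_path: str) -> str:
    """Stub: extract audio from the body-camera file and run speech-to-text."""
    return "Transcript of everything captured on the selected body-camera video."


def draft_narrative(transcript: str) -> str:
    """Stub: prompt a large language model to rewrite the transcript as a
    first-person police narrative. In a real product this is the step where
    bias, transcription errors and hallucinations can enter the record."""
    return f"On the above date I responded to the scene. {transcript}"


def build_draft(incident_id: str, video_path: str) -> DraftReport:
    transcript = transcribe_audio(video_path)
    narrative = draft_narrative(transcript)
    # The officer is meant to review and edit before submission; the ACLU's
    # concern is that the AI draft may simply be accepted as written.
    return DraftReport(incident_id, transcript, narrative)


if __name__ == "__main__":
    report = build_draft("2024-001234", "bodycam_clip.mp4")
    print(report.narrative)
```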

Deleting Your 23andMe Genetic Data? There’s a Way, But Also a Catch

  • Over 15m people have used the genetic testing and ancestry tracking company 23andMe. The company previously suffered a major data leak affecting the data of 6.9m users, and it has faced ongoing financial and management struggles. More users are therefore considering closing their accounts and deleting their data from the platform.
  • Users have the option to delete their account and personal information, and the process of deletion begins “immediately and automatically”. Users additionally have the option to have the company discard their genetic sample.
  • However, whilst users can refuse to have their data used in the company’s research projects, their data cannot be removed from research that has already been conducted. Additionally, genotyping laboratories that worked on a 23andMe customer’s sample will hold on to the customer’s sex, date of birth and genetic information even after the account is “deleted.” The labs retain this information for a set period; the retained data is anonymised, and the genetic information is kept in raw, unprocessed form.

Europe

BeReal hit with privacy complaint over how it asks EU users to agree to tracking

  • NOYB, the European privacy rights nonprofit, is behind a complaint against BeReal concerning how users consent to tracking. NOYB has accused BeReal of using manipulative tactics (also known as “dark patterns”) to pressure users into consenting to ad tracking. This would be against the GDPR standard that consent must be “freely given.”
  • In the app, users are shown a consent banner offering the choice to accept or refuse tracking, but the two choices do not lead to the same experience. Users who refuse tracking are subjected to an aggressive “nudging tactic”: the banner reappears every day when they try to publish a post. This repeated prompting is the target of NOYB’s complaint (a minimal sketch of a consent flow that treats a refusal as final appears after this list).
  • A data protection lawyer at NOYB has said “BeReal’s nudging tactics are particularly absurd. When first confronted with the consent banner, users get the impression that the app actually respects their choice — only to find out that BeReal actually won’t take no for an answer. It is obvious that BeReal is trying to pressure users into consenting to tracking.”
  • The complaint has been filed with the French data protection watchdog (CNIL), as BeReal’s parent company is based in France. It asks the regulator to order the app to fix its consent flow and to delete any data processed since the dark pattern was introduced. The complaint also urges that a fine be imposed.
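
For contrast with the behaviour described in the complaint, here is a minimal illustrative sketch (in Python) of a consent flow that records a refusal once and never re-prompts, in line with the GDPR’s “freely given” standard. All function names and the local JSON storage are assumptions for illustration; this is not BeReal’s code, nor a statement of what CNIL requires.

```python
# Sketch of a consent flow that records a refusal once and does not re-prompt,
# unlike the daily re-prompting described in NOYB's complaint.
# Names and the JSON file storage are illustrative assumptions only.

import json
from pathlib import Path

CONSENT_FILE = Path("consent_state.json")


def load_choice() -> str | None:
    """Return 'accepted', 'refused', or None if the user has never been asked."""
    if CONSENT_FILE.exists():
        return json.loads(CONSENT_FILE.read_text()).get("tracking")
    return None


def save_choice(choice: str) -> None:
    CONSENT_FILE.write_text(json.dumps({"tracking": choice}))


def should_show_banner() -> bool:
    # Show the banner only if no choice has been recorded: a refusal is as
    # final as an acceptance, and publishing a post never re-triggers it.
    return load_choice() is None


def ask_user() -> str:
    answer = input("Allow ad tracking? [accept/refuse]: ").strip().lower()
    return "accepted" if answer == "accept" else "refused"


def publish(tracking_enabled: bool) -> None:
    print(f"Post published. Tracking enabled: {tracking_enabled}")


def on_publish_post() -> None:
    if should_show_banner():
        save_choice(ask_user())  # one prompt, both options equally easy
    # Tracking runs only on a recorded, freely given acceptance.
    publish(tracking_enabled=load_choice() == "accepted")


if __name__ == "__main__":
    on_publish_post()
```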

International

Türkiye fines Meta $330,000 over Instagram’s child privacy breach

  • Türkiye’s data protection authority has imposed a fine of 11.5 million Turkish lira (nearly $330,000) on Meta, Instagram’s parent company, over breaches of the privacy of accounts operated by minors on Instagram.
  • Accounts created by users under 18 could be converted into business accounts, rendering them publicly accessible. This change facilitated the exposure of children’s personal data, raising allegations of privacy violations and prompting the investigation.
  • Email addresses and phone numbers associated with these business accounts became publicly accessible, exposing child users to heightened risks.

Australia: The right to privacy – a new cause of action?

  • A County Court of Victoria decision, Lynn Waller (A Pseudonym) v Romy Barrett (A Pseudonym) [2024] VCC 962, suggests the existence of an Australian common law cause of action for invasion of privacy.
  • The trial judge treated the right to privacy as distinct from the right to keep information confidential, and accordingly considered that privacy required protection separate from breach of confidence claims.
  • The trial judge contrasted breach of confidence actions, which protect confidential information and trade secrets, with invasion of privacy actions, which focus “upon the protection of human autonomy and dignity—the right to control the dissemination of information about one’s private life and the right to the esteem and respect of other people.”
