Back in 2015, somewhere in the halls of the Royal Free London NHS Foundation Trust, a decision was made that would take almost ten years for the courts to fully unravel. Patients were not informed. Consent forms did not explain it. More than a million patients who had visited that hospital, who had undergone kidney tests and blood draws, stayed overnight, and sat in waiting rooms clutching appointment letters, had their private medical records discreetly handed to one of the world’s most powerful technology companies. Most of them learned about it through the news, just like everyone else.
As part of a clinical safety trial for an app called Streams, Google’s DeepMind was granted access to 1.6 million patients’ medical records spanning five years. The app itself, which was intended to identify patients at risk of acute kidney injury before their condition worsened, was genuinely useful. That portion of the narrative is true and significant. But the questions of what happened to the data, how it was obtained, who approved it, and whether any of those 1.6 million people had any say in the matter have occupied the British legal system ever since, and they remain unresolved.
| Detail | Information |
|---|---|
| Core Incident | Google’s DeepMind received health records of 1.6 million NHS patients without proper consent |
| NHS Trust Involved | Royal Free London NHS Foundation Trust |
| Year of Data Transfer | 2015 |
| AI Application | “Streams” — an alert and detection system for acute kidney injury |
| ICO Ruling | July 2017 — Royal Free Trust found in breach of the Data Protection Act 1998 |
| Legal Case | Prismall v Google — representative action led by Mishcon de Reya |
| Lead Plaintiff | Andrew Prismall |
| Law Firm | Mishcon de Reya |
| Supreme Court Ruling | November 2021 — struck down Lloyd v Google representative claim; individuals must show financial loss or distress |
| Number of Affected Patients | Approximately 1.6 million |
| Data Records Scope | Medical history going back five years |
| Information Commissioner | Elizabeth Denham (at time of 2017 ruling) |
| Reference Website | ICO — Google DeepMind and Class Action Lawsuit |
In July 2017, after conducting an investigation, the Information Commissioner’s Office found that the Royal Free Trust had violated the Data Protection Act 1998. Not because of a technicality. The ICO found that patients had not received sufficient information about how their records would be used, that the amount of data shared was neither necessary nor proportionate, and that no basic privacy impact assessment had been carried out before a single file was transferred. The Information Commissioner at the time, Elizabeth Denham, stated unequivocally that innovation does not have to come at the expense of basic privacy rights. It was a line that stuck, because it was both perfectly reasonable and a little too late in light of what had already happened.
DeepMind expressed regret. The Trust signed an undertaking. Both agreed to a third-party audit. For a while, it seemed the issue might fade into the background noise of tech-industry accountability: acknowledged, denounced, and forgotten. It didn’t. In 2021, the law firm Mishcon de Reya filed a representative action on behalf of more than a million affected patients. Andrew Prismall, the lead claimant, said he had been “greatly concerned” about what had been done with his data. That same year, in Lloyd v Google, the Supreme Court made matters considerably more complicated by holding that people cannot simply claim their data was mishandled and expect compensation; they must show financial loss or actual distress. The path to straightforward class-action justice was narrowed, though not completely closed.
The tension in that Supreme Court decision is difficult to ignore. On the one hand, it makes legal sense: damages require proof of harm, not just outrage. On the other, a framework that requires patients to demonstrate they were sufficiently harmed by a privacy violation they never consented to is unsettling. Liberty and Inclusion London both intervened in the Lloyd case, arguing that some people, including disabled people and young children, may not experience or express distress in ways that courts can readily recognize, and that access to justice should not depend on how loudly you can show your rights were violated. The Supreme Court did not directly address that argument; it dismissed the representative claim on other grounds. But the disagreement persisted.
As the Prismall case moves toward trial against Google, the British courts are now grappling with an issue larger than a single 2015 hospital deal. The National Health Service holds some of the most private information in the country: diagnoses, surgeries, mental health records, fertility treatments, chronic illnesses that people don’t share with their neighbors or employers. The NHS is also perpetually short of funds, always looking for ways to finance innovation, and perpetually attractive to tech firms that recognize the research value of health data at scale. That combination has always carried danger. DeepMind was that risk manifesting itself in the most obvious way imaginable.
DeepMind’s health work has since been absorbed into Google Health, and the Streams app has been decommissioned. The technology moved on. The legal ramifications did not. Ben Lasserson, a partner at Mishcon de Reya, said the case should help address fundamental questions about the handling of sensitive personal data, not only in this instance but for the future, as more NHS trusts contemplate data partnerships with private tech firms. If the trial runs to its conclusion, it could produce rulings that clearly define what consent means in a healthcare context and what responsibilities tech companies take on when they receive data from public institutions.
Watching all of this move slowly through the legal system, I find it hard to believe the original deal was ever really about a single app. In 2015, DeepMind was a young and celebrated AI startup that Google had recently acquired for a reported £400 million. The Streams project gave DeepMind access to real, messy, longitudinal health data from a large urban hospital, something almost no private company could otherwise obtain. Whether the kidney app worked well is now almost beside the point. What mattered, and what the courts are still debating, is whether any of the individuals whose records made it possible ever had a real choice. Most of them never got the chance to say no. No one asked.