Does caring mean sharing?

Anxieties about patient data confidentiality continue to simmer in the wake of the Information Commissioner’s report into DeepMind’s partnership with the Royal Free. If nothing else, it has served as a reminder that the promise of better care through technology in no way negates the need for hospitals to be crystal clear about their data-sharing responsibilities.

By Piers Ford

The controversy which erupted around the Royal Free NHS Foundation Trust’s collaboration with Google-owned AI pioneer DeepMind Health on the development of an app to help with the early detection of Acute Kidney Injury (AKI) laid bare deep-rooted concerns about privacy and the protection of patient data in healthcare.

To all intents and purposes, this was a ground-breaking and exciting project focused on making a practical difference to patients and clinicians across the NHS. An initial approach to DeepMind in 2015 from a team of kidney specialists exploring ways to improve the detection and diagnosis of AKI led to the development of Streams, a secure clinical app which does not use AI.

DeepMind has always been clear that Streams is a separate strand of technology from its long-term AI projects, which use depersonalised data.

Streams was implemented early last year, with impressive anecdotal results: some 26 doctors and nurses were using the app by the end of February, and it was alerting clinicians to an average of 11 patients at risk of AKI every day. Nursing staff estimated time savings of up to two hours.
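For context, the clinical logic behind this kind of alert is publicly documented: NHS England’s AKI detection algorithm compares a patient’s latest serum creatinine result against a baseline derived from earlier results, and flags a possible AKI when the ratio crosses defined, KDIGO-derived thresholds. The sketch below is a deliberately simplified Python illustration of that idea – it is not DeepMind’s code, and the function name, the use of a simple median baseline and the example values are assumptions made here for illustration only.

```python
# A minimal sketch of creatinine-ratio AKI alerting, loosely based on the
# publicly documented NHS England AKI detection algorithm. Illustration only:
# not DeepMind's Streams implementation. The real algorithm derives separate
# reference values from results in the last 7 days and the prior 8-365 days;
# here a simple median baseline stands in for both.

from statistics import median

def aki_stage(current_umol_l: float, prior_umol_l: list[float]) -> int | None:
    """Return a KDIGO-style AKI warning stage (1-3), or None if no alert,
    given the current serum creatinine and prior results (all in umol/L)."""
    if not prior_umol_l:
        return None  # no baseline available, so no ratio can be computed

    baseline = median(prior_umol_l)
    ratio = current_umol_l / baseline

    if ratio >= 3.0:
        return 3  # stage 3: creatinine at least tripled against baseline
    if ratio >= 2.0:
        return 2  # stage 2: at least doubled
    if ratio >= 1.5:
        return 1  # stage 1: raised by half or more
    return None   # below alerting thresholds

# Example: a result of 180 umol/L against a baseline of around 88 umol/L
print(aki_stage(180, [85, 90, 88]))  # -> 2
```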

The problem, however, was that during the development and pilot-testing phases of the app, the transfer of 1.6 million patient records – identifiable data – to DeepMind’s servers appeared to have taken place without any lawful basis.

Testing vs caring

That was certainly the perspective of the National Data Guardian (NDG), Dame Fiona Caldicott, who wrote to both the trust and DeepMind co-founder Mustafa Suleyman in December 2016, expressing her view that these records were transferred for the purposes of testing Streams rather than the provision of direct care to patients. 

“My considered opinion… remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose,” she wrote.

That the data had been used solely to deliver direct care services had been the defence of both parties since a New Scientist investigation in May 2016 first revealed that the agreement between the trust and DeepMind Health gave the company access to all admission, discharge and transfer data, as well as accident & emergency, pathology & radiology, and critical care records, at the trust’s three London hospitals involved in the project.

This claim was dismissed by the Information Commissioner, Elizabeth Denham, following a year-long investigation into the project by her office, the Information Commissioner’s Office (ICO). Her report, published in July 2017, found that the trust had failed to comply with the Data Protection Act (DPA), and agreed with the NDG that patients had been inadequately informed about the use of their data.

Whether this will eventually be enough to satisfy the critics remains to be seen: many remain deeply uneasy about a Google-owned company being given any kind of access to patient data, and unconvinced by assertions that the data is only being used to deliver direct care services.

But the case continues to fuel a debate which will only become more complex as patients’ own awareness of their data ownership in an increasingly tech-driven world begins to exert a stronger influence. And it has, at least, helped to establish some specific guidelines for the NHS around the use of patient data in app development and testing.

As Elizabeth Denham suggested in her blog post, ‘Four lessons NHS trusts can learn from the Royal Free case’, nobody should need to choose between privacy and innovation if they follow the law, prepare adequate privacy impact assessments, and make judicious use of enabling technologies.

The trust was required to sign an undertaking to comply with the act: to establish a proper legal basis under the DPA for its work with DeepMind and any future trials; to set out its plans for future compliance, reflecting its duty of confidence to patients in trials involving personal data; to complete a privacy impact assessment, including specific steps to ensure transparency; and to commission an audit of the trial, with the results to be shared with the ICO.

Cause for concern

For some commentators, the ICO report failed to hold DeepMind sufficiently to account for its part in the project’s DPA compliance failures – as, in their view, did DeepMind’s own Independent Review panel.

“Google DeepMind continues to receive excessive amounts of data in breach of four principles of the Data Protection Act, and the Independent Reviewers didn’t think this worth a mention,” said Phil Booth, Co-ordinator at health data privacy group MedConfidential, in a response to the report.

“DeepMind did something solely because they thought it might be a good idea, ignorant of the law, and are now incapable of admitting that this project has unresolvable flaws. The ICO has forced both parties to fix them within weeks having ignored them for approaching two years.”

MedConfidential had previously raised strong concerns about why the trust was sharing data streams for all patients rather than just those who had been given kidney function tests.

Another commentator, John Naughton, Professor of the Public Understanding of Technology at the Open University, thought little of the trust’s own response to the report, which expressed gratitude for being allowed to continue using the app.

The trust said: “We have signed up to all of the ICO’s undertakings and accept their findings. We have already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.”

The trust also referenced its full co-operation with the ICO investigation, and said, “It is helpful to receive some guidance on the issue about how patient information can be processed to test new technology” – something that the ICO indicated should have been addressed before Streams progressed beyond the drawing board.

This, wrote Naughton in the Guardian, bearing in mind that the trust was found to have broken the law, was “like a burglar claiming credit for co-operating with the cops and expressing gratitude for their advice on how to break and enter legally next time.”

The report appeared to accept DeepMind’s status as a processor, rather than a controller, of data on behalf of the NHS – an important distinction in the definition of data ‘sharing’ – and the company welcomed the ICO’s “thoughtful resolution of this case”.

It acknowledged that its initial legal agreement with the Royal Free could have been “much more detailed”, and pointed out that it had in fact been replaced with a more comprehensive contract in 2016. That approach is also reflected in its subsequent Streams contracts with NHS partners, including Taunton and Somerset NHS Foundation Trust and Imperial College Healthcare NHS Trust, all of which are published on its website.

DeepMind also committed itself to major improvements in transparency, oversight and engagement – areas for which it had attracted strong criticism from MedConfidential and other industry watchers.

Too hasty

“In our rush to collaborate with nurses and doctors to create products that addressed clinical need, we didn’t do enough to make patients and the public aware of our work or invite them to challenge and shape our priorities,” the company said. This has been addressed by the formulation of a patient and public engagement strategy.

The company also pointed out that it had formed an independent review panel “long before any regulatory or media criticism”. This panel produced its first annual report in July 2017, which amplified the recommendations raised by the ICO, as well as exploring the benefits and broader implications of the project in some detail.
