The UK’s health data watchdog, the National Data Guardian (NDG), has published correspondence between her office and the Information Commissioner’s Office (ICO), the national privacy watchdog, which informed the ICO’s 2017 finding that a data-sharing arrangement between an NHS trust and Google-owned DeepMind broke the law.
The exchange was published following a Freedom of Information request by TechCrunch.
In fall 2015, the Royal Free NHS Trust and DeepMind signed a data-sharing agreement which saw the medical records of 1.6 million people quietly passed to the AI company without patients being asked for their consent.
The scope of the data-sharing arrangement — ostensibly to develop a clinical task management app — was only brought to light by investigative journalism. That then triggered regulatory scrutiny — and the eventual finding by the ICO that there was no legal basis for the data to have been transferred in the first place.
Despite that, the app in question, Streams — which does not (currently) contain any AI but uses an NHS algorithm for detecting acute kidney injury — has continued being used in NHS hospitals.
DeepMind has also since announced plans to transfer its health division to Google, although, to our knowledge, no NHS trust has yet signed a new contract for Streams with the ad giant.
In parallel with releasing her historical correspondence with the ICO, Dame Fiona Caldicott, the NDG, has written a blog post in which she articulates a clear regulatory position: the “reasonable expectations” of patients must govern non-direct care uses of people’s health data, rather than healthcare providers relying on whether doctors think developing such and such an app is a great idea.
The ICO had asked for guidance from the NDG on how to apply the common law duty of confidentiality, as part of its investigation into the Royal Free NHS Trust’s data-sharing arrangement with DeepMind for Streams.
In a subsequent audit of Streams that was required by the regulator, the trust’s law firm, Linklaters, argued that whether a duty of confidentiality has been breached should be judged from the point of view of the clinician’s conscience, rather than the patient’s reasonable expectations.
Caldicott writes that she firmly disagrees with that “key argument”.
“It is my firm view that it is the patient’s perspective that is most important when judgements are being made about the use of their confidential information. My letter to the Information Commissioner sets out my thoughts on this matter in some detail,” she says, impressing the need for healthcare innovation to respect the trust and confidence of patients and the public.
“I do champion innovative technologies and new treatments that are powered by data. The mainstreaming of emerging fields such as genomics and artificial intelligence offer much promise and will change the face of medicine for patients and health professionals immeasurably… But my belief in innovation is coupled with an equally strong belief that these advancements must be introduced in a way that respects people’s confidentiality and delivers no surprises about how their data is used. In other words, the public’s reasonable expectations must be met.”
“Patients’ reasonable expectations are the touchstone of the common law duty of confidence,” she adds. “Providers who are introducing new, data-driven technologies, or partnering with third parties to help develop and test them, have called for clearer guidance about respecting data protection and confidentiality. I intend to work with the Information Commissioner and others to improve the advice available so that innovation can be undertaken safely: in compliance with the common law and the reasonable expectations of patients.
“The National Data Guardian is currently supporting the Health Research Authority in clarifying and updating guidance on the lawful use of patient data in the development of healthcare technologies.”
We reached out to the Royal Free NHS Trust and DeepMind for comment on the NDG’s opinion. At the time of writing neither had responded.
In parallel, Bloomberg reported this week that DeepMind co-founder, Mustafa Suleyman, is currently on leave from the company. (Suleyman has since tweeted that the break is temporary and for “personal” reasons, to “recharge”, and that he’s “looking forward to being back in the saddle at DeepMind soon”.)
The AI research company recently touted what it couched as a ‘breakthrough’ in predictive healthcare, saying it had developed an AI model for predicting the same condition that the Streams app is intended to alert for. The model was built using US data from the Department of Veterans Affairs, however, which skews overwhelmingly male.
As we wrote at the time, the episode underscores the potential value locked up in NHS data — which offers population-level clinical data that the NHS could use to develop AI models of its own. Indeed, a 2017 government-commissioned review of the life sciences sector called for a strategy to “capture for the UK the value in algorithms generated using NHS data”.
The UK government is also now pushing a ‘tech-first’ approach to NHS service delivery.
Earlier this month the government announced it’s rerouting £250M in public funds for the NHS to set up an artificial intelligence lab that will work to expand the use of AI technologies within the service.
Last fall, health secretary Matt Hancock set out his tech-first vision of future healthcare provision, saying he wanted “healthtech” apps and services to support “preventative, predictive and personalised care”.
So there are certainly growing opportunities for developing digital healthcare solutions to support the UK’s National Health Service.
There is also now clearer regulatory guidance that developers who want their apps to be informed by patient data must first win the trust and confidence of the people they hope to serve.