Source:
https://scmp.com/tech/article/2101197/googles-deepmind-made-illegal-health-data-deal-uk-watchdog-says

Google’s DeepMind made illegal health data deal in the UK, watchdog says

Google’s artificial intelligence firm was given access to the health information of 1.6 million patients to develop an app for monitoring kidney disease

Artwork used by DeepMind to illustrate the concepts of its continual learning project. Photo: DeepMind/Google

By Arjun Kharpal

A deal between Google’s artificial intelligence (AI) firm DeepMind and the U.K.’s National Health Service (NHS) “failed to comply with data protection law”, a key British regulator said on Monday.

DeepMind, which Google acquired in 2014, struck a deal in 2015 with the Royal Free NHS Foundation Trust, which runs a number of hospitals in Britain. The Google subsidiary got access to a wide range of health information on 1.6 million patients, according to the full agreement, which was revealed by New Scientist in April 2016.

The deal was made to help DeepMind develop an app called Streams, aimed at monitoring patients with kidney disease. Streams would alert the appropriate clinician when a patient’s condition deteriorated.

But New Scientist revealed that DeepMind would also get access to other health information, such as whether a patient had HIV or had suffered a drug overdose, which stirred considerable controversy.

Demis Hassabis, co-founder of Google’s artificial intelligence (AI) startup DeepMind.

The Information Commissioner’s Office (ICO), which is the U.K.’s data protection watchdog, launched its probe into the DeepMind-NHS deal in May 2016. On Monday, the ICO released its conclusion and found that the agreement “failed to comply with data protection law”.

“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening,” Information Commissioner Elizabeth Denham said in a statement.

“We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.”

In essence, the ICO has taken issue with the fact that patients were not informed about how their data would be used.

“The processing of patient records by DeepMind significantly differs from what data subjects might reasonably have expected to happen to their data when presenting at the Royal Free for treatment,” the ICO’s letter to the Trust said.

“For example, a patient presenting at accident and emergency within the last five years to receive treatment or a person who engages with radiology services and who has had little or no prior engagement with the Trust would not reasonably expect their data to be accessible to a third party for the testing of a new mobile application, however positive the aims of that application may be.”

What next?

The Trust, which runs the hospitals, must now “establish a proper legal basis” under the U.K.’s Data Protection Act for the Google DeepMind project and any future trials. It will also need to outline how it will comply with privacy laws in those trials and commission an audit of the trial to be shared with the ICO.

If the Trust complies with the ICO’s demands, the Streams app will not be shut down.

“We accept the ICO’s findings and have already made good progress to address the areas where they have concerns,” the Royal Free NHS Foundation Trust said in a statement on Monday.

“For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.”

‘We need to do better’

DeepMind also released a statement shortly after the ICO’s announcement and welcomed the “thoughtful resolution” of the case while admitting that it still has work to do.

“In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health,” DeepMind co-founder Mustafa Suleyman wrote in a blog post.

“We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as a technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better.”

Suleyman outlined steps DeepMind has already taken, including signing a more detailed agreement with the NHS in 2016 to replace the original 2015 version, publishing its contracts with different NHS parties, creating a public engagement strategy to be more transparent, and setting up an independent review panel.

“This is an amazing opportunity for us to prove what we have always believed: that if we get the ethics, accountability and engagement right, then new technology systems can have an incredible positive social impact,” Suleyman said.