DeepMind’s data deal with the NHS broke privacy law

Nick Summers, @nisummers

July 03, 2017
 
Bloomberg via Getty Images

An NHS Trust broke the law by sharing sensitive patient records with Google’s DeepMind division, the UK’s data watchdog has ruled. The long-awaited decision falls in line with the conclusion drawn in May by Dame Fiona Caldicott, the UK’s National Data Guardian. The pair’s agreement “failed to comply” with the Data Protection Act 1998, according to the Information Commissioner’s Office (ICO), because patients weren’t informed that their information was being used. The ICO also took issue with the size of the dataset (1.6 million partial patient records) leveraged by DeepMind to test Streams, an app for detecting acute kidney injury.

In April 2016, New Scientist revealed that DeepMind and Royal Free London NHS Trust were working together on a medical project. As the ICO notes in its letter to the Trust, their agreement was actually formalised in September 2015, with Royal Free serving as the data controller (owner) and DeepMind as the data processor (partner).

Initially, the deal was hashed out so DeepMind could further develop Streams and put it through clinical safety testing. During this period, the ICO says the Trust, which as data controller ultimately takes responsibility, broke four principles of the Data Protection Act. The first, which demands that processing be “fair, lawful and transparent,” was broken because the Trust and DeepMind didn’t go far enough in their efforts to inform patients. The Trust argued that, under common law, consent can be implied when data is used for “direct care,” but testing an app is not the same as full-blown medical use, so the ICO disregarded this defence.

“The Royal Free did not have a valid basis for satisfying the common law duty of confidence and therefore the processing of that data breached that duty,” the ICO said in its letter to the Royal Free NHS Trust. “In this light, the processing was not lawful under the Act.”

The second broken principle, which is actually the third in the Act, requires data processing to be “adequate, relevant and not excessive.” Following its investigation, the ICO concluded that 1.6 million patient records were too many for DeepMind’s purposes. The Commissioner considered the Trust’s defence, but couldn’t see why that volume of information was necessary for the company’s trials. “The Commissioner is not persuaded that it was necessary and proportionate to process 1.6 million partial patient records in order to test the clinical safety of the application. The processing of these records was, in the Commissioner’s view, excessive,” the ICO said.

The Trust’s failure to alert patients broke another principle, which states that data should be processed “in accordance with the rights of data subjects.” Under section 10 of the Data Protection Act, patients should have the opportunity to object to their information being processed. The lack of transparency meant there was no way for patients to know about this right, never mind act on it. “Put plainly, if the patients did not know that their information would be used in this way, they could not take steps to object.”

The final principle concerns data protection standards. The ICO notes that additional agreements, including a privacy impact assessment, were drawn up in January and November 2016. By this time, however, DeepMind had already processed patient data. The watchdog is content with how DeepMind handles data, but takes issue with the documentation drawn up back in September 2015. It didn’t go far enough, the ICO says, and could have resulted in a breach of the Data Protection Act. “The Commissioner does, however, recognise that the Royal Free has since improved the documentation in place between the Trust and DeepMind.”

The documentation drawn up last November means that Streams is now on a better legal footing. That’s important, given the app has passed the testing phase and is being used in hospitals. Nevertheless, the ICO has asked the Trust to commit to a series of changes. These include establishing “a proper legal basis” under the Act for the current DeepMind agreement and any future trials, completing a new privacy impact assessment, and commissioning an audit of the initial trial period.

The ICO has stressed that it doesn’t want to impede innovation. It recognises the work DeepMind has done and the impact Streams has already had on the UK’s National Health Service. Progress, however, should not come at the cost of privacy, Information Commissioner Elizabeth Denham argues. “What stood out to me [after] looking through the investigation is that the shortcomings we found were avoidable,” she said in a blog post. “The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights. I have every confidence the Trust can comply with the changes we’ve asked for and still continue its valuable work.”

Source: ICO
Origin: Engadget UK
 
 
