DeepMind starts working on Verifiable Data Audit
Article posted on: March 9, 2017
DeepMind starts developing a system with blockchain-like attributes to track any interference with data
[London, UK] DeepMind Health will now start working on a Verifiable Data Audit that will track any interference with information to increase transparency of their work, DeepMind Co-founder Mustafa Suleyman and Ben Laurie, the company’s Head of Security and Transparency, have revealed.
The system will complement the work of the Independent Reviewers, appointed to scrutinise use of patient data, who are expected to publish annual reports detailing their findings.
“With Verifiable Data Audit, we’ll build on this further. Each time there’s any interaction with data, we’ll begin to add an entry to a special digital ledger.
“That entry will record the fact that a particular piece of data has been used, and also the reason why – for example, that blood test data was checked against the NHS national algorithm to detect possible acute kidney injury,” the two DeepMind experts wrote in a blog today. In an interview for BJ-HC earlier this week, Suleyman emphasised easing NHS pressures through technology remains one of their main priorities.
‘Last move’ of a Jenga game
The digital ledger will reportedly share properties with blockchain, increasing transparency by removing any possibility of erasing evidence of unapproved data tampering.
However, it will also differ from blockchain in a number of ways: Suleyman and Laurie reveal they intend to replace ‘the chain part of blockchain’ with a ‘tree-like structure’.
“The overall effect is much the same. Every time we add an entry to the ledger, we’ll generate a value known as a ‘cryptographic hash’. This hash process is special because it summarises not only the latest entry, but all of the previous values in the ledger too.
“This makes it effectively impossible for someone to go back and quietly alter one of the entries, since that would not only change the hash value of that entry but also that of the whole tree,” they added, likening it to ‘the last move’ of a Jenga game.
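The tamper-evidence property Suleyman and Laurie describe can be illustrated with a minimal sketch. This is not DeepMind's implementation (which they say uses a tree-like structure rather than a linear chain), but a simple hash-chained ledger shows the same effect: each hash summarises the latest entry and, transitively, all previous ones, so quietly altering any record breaks every later hash.

```python
import hashlib

def entry_hash(prev_hash: str, record: str) -> str:
    # The hash covers the new record AND the previous hash,
    # so it transitively summarises every earlier entry.
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

class AuditLedger:
    """A toy append-only ledger; names here are illustrative only."""

    GENESIS = "0" * 64

    def __init__(self):
        self.records = []
        self.hashes = [self.GENESIS]

    def append(self, record: str) -> None:
        self.records.append(record)
        self.hashes.append(entry_hash(self.hashes[-1], record))

    def verify(self) -> bool:
        # Recompute the whole chain; any altered record
        # produces a mismatch at that entry and beyond.
        h = self.GENESIS
        for record, expected in zip(self.records, self.hashes[1:]):
            h = entry_hash(h, record)
            if h != expected:
                return False
        return True

ledger = AuditLedger()
ledger.append("blood test data checked against NHS AKI algorithm")
ledger.append("record accessed for direct care")
assert ledger.verify()

ledger.records[0] = "tampered entry"   # a quiet alteration...
assert not ledger.verify()             # ...is immediately detectable
```

In the tree-shaped variant the same idea applies per branch, which lets auditors verify individual entries without recomputing the entire history.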
Three ‘technical challenges’
Furthermore, DeepMind is set to develop a ‘dedicated online interface’ that staff at partner trusts will be able to access in real time, and which will trigger alarms if ‘anything unusual’ happens.
However, Suleyman and Laurie acknowledge the project will have to overcome a number of ‘technical challenges’: ensuring there are ‘no blind spots’ in the system; supporting ‘different uses for different groups’, since trusts may wish to give patient groups access to the interface, posing further ‘complex design questions’; and handling ‘federated data and logs, without gaps’, which focuses on interoperability.
“Enhancing such audit logs with high-integrity cryptographic controls, inspired by block chains, provides a higher level of assurance that mistakes or violations of policy will be found, and unauthorised parties cannot hide their trails,” commented George Danezis, UCL Professor of Security and Privacy Engineering.