Are clinicians willing to trust an algorithm’s predictions?
AI is transforming a number of industries at the moment. What potential does it have to transform healthcare?
Commentators describe artificial intelligence (AI) as being like 'the electricity or the internet' for its ability to transform different industries. AI has captured the interest of the general public and of governments around the world, who are exploring its role and how it can be safely deployed.
At DeepMind Health, we think AI could lead to substantial improvements across all areas of healthcare. Medicine and healthcare delivery are becoming increasingly complex, and new technology has the potential to address some of the issues this complexity creates. We have more elderly and frail patients than ever before, and, at the same time, more data is available to us than current systems can make effective use of. AI is very good at making sense of complex data and at making useful, accurate predictions, and I think it has a role in providing invaluable guidance to clinicians to support them in delivering expert care.
How will AI change the roles and responsibilities of stakeholders and who will drive the change?
A lot of articles focus on the threat of AI replacing clinicians, but I don’t think that this will be the case. I see the role of AI as supporting and enabling nurses, doctors and ultimately patients in making complex decisions on care.
Clinicians have trained for many years to understand the context of a patient’s illness and AI could help support decision making by helping make sure things haven’t been missed.
How do you rate ethical and privacy worries in connection with AI? There was a recent UK court ruling that London’s Royal Free Hospital had illegally provided DeepMind access to 1.6 million patient records. How is DeepMind addressing these issues and concerns?
Patients have the absolute right to know how their sensitive data is being processed and by whom. The Royal Free Hospital was criticised by UK regulators over whether patients knew how their data was being processed. We have learnt from this experience and are working with our partners to ensure people understand the work we are doing and how their data will be processed. There is a wider discussion in the UK and other countries about how we ensure that patients can see how their data is used and how their consent can be provided.
What are the current obstacles in the way to implementing and using AI in medicine? How long do you think it will take before healthcare fully embraces AI?
AI based approaches are, for the most part, not ready for clinical deployment and there are still many questions to answer before this is the case. ‘How do we provide clinicians with the ability to question what algorithms are doing?’ is one, for example. ‘How do we safely deploy this technology so that we don’t cause harm?’ is another.
There are also issues around ensuring we have public permission for AI in healthcare. Are patients willing to be on the receiving end of predictions from an algorithm? Are clinicians willing to trust an algorithm’s predictions? The clear message that we get from clinicians and patients is that they don’t simply want a black box sending out a prediction or a diagnosis.
There are numerous areas that all need to be resolved before we can see the widescale adoption of this technology: scientific progress and clinical applicability; clinician and patient acceptance; and regulatory, ethical and information governance.
Dr Dominic King will be delivering the opening keynote “AI-enabled healthcare: potential and challenges” at HIMSS Impact18, Potsdam, Germany, October 17-18, 2018.
To learn more about HIMSS Impact18 visit: www.himssimpact.eu
Dr. Dominic King is an Honorary Clinical Lecturer in Surgery at Imperial College London, where he previously worked as an academic general surgeon. His research interests lie in digital health, health policy and behavioural economics. As part of his PhD, he co-authored the Cabinet Office Mindspace report, which served as the initial operating framework for the Behavioural Insights Team, established by the UK Prime Minister.
He has published research in The Lancet, the BMJ and Health Affairs, and has contributed to sessions at the World Economic Forum and TechCrunch Disrupt. Hark, the digital health company Dominic founded, was acquired by DeepMind in 2016.