Today we launch the second project from the RSA’s Tech and Society programme’s Forum for Ethical AI series. In partnership with NHSX we deliberated with healthcare professionals and tech specialists on artificial intelligence in our NHS.
Healthy artificial intelligence
Artificial intelligence: miracle or mirage? There are certainly many reasons to suggest the former. From identifying drug combinations, to combating rare diseases, to distinguishing between cancerous and normal human tissue during surgery, to homing in on tumours using an augmented reality microscope – the possibilities are endless. The potential to both improve the healthcare of patients and to ease the load on NHS staff is immense.
The government and healthcare professionals alike are alive to the possibilities. The public too seem encouraging. In the first study of the RSA’s Forum for Ethical AI series, we learnt that there is public support for, and leeway to experiment with, radical technologies in healthcare – under certain specified conditions. The public know clinicians are stretched and understand that labour-saving technologies can help them do their jobs. There is a mandate for innovation.
Given this mandate, we wanted to understand whether our health system is currently set up to take advantage in the right way. We wanted to know how and whether that mandate is being realised by NHS professionals and their contractors on the ground. What is the culture of innovation in the NHS like? What sort of conversations about commissioning and contracting are happening? Who is in control? Whose interests dominate? Put another way: are our health and care systems in a position to realise the potential of artificial intelligence, while avoiding the perils of bad implementation in what should be the safest of spaces?
To the doctors
The RSA Tech and Society team undertook a range of designed conversations and structured interviews with actors across the NHS and the wider health system who are involved in the design, innovation, delivery, clinical management or implementation of AI. You can watch a video of some of our interviews, which was shown at the recent NHS Expo that launched the new NHSX AI Lab:
Our conversations surfaced a range of views and experiences of integrating radical technologies into the NHS, as well as key barriers and pain points. They also offered tantalising glimpses of how things work when they work well – and what needs to be done to make things work better. The experts we interviewed converged on both challenges and solutions with a remarkable degree of coherence. You can download our short report of this work if you’d like to read the recommendations in full and see our interview questions. We also include a structured appendix with detailed transcripts for those who really want to get into the meat of the conversations.
Download the report
For those who are in a rush, here is a slideshow which provides an overview of our method and recommendations:
See the slideshow
What we learned
In brief, we surfaced three major conditions for creating a human-centred culture of innovation and enabling the successful uptake of AI in the NHS.
1. Patient adoption
Disruption is great if you’re a Silicon Valley enthusiast – not so great if you’re managing a ward list or have several patients in intensive care. Our respondents wanted slow and steady adoption of technologies that integrate into the clinical workflow. For technology to spread successfully through the health system, conversations about where and why it is being used need to happen. Clinical need – not political priorities – needs to determine the ingress of AI into the system.
As well as acting patiently in the adoption of technology, we also need public consultation and patient consent. As we saw in our previous Forum for Ethical AI work, there is a real risk of public backlash against promising technologies without proper deliberation. The public are better informed than professionals often think. There are plenty of stories of AI failing to pick up skin cancers effectively in those with darker skin, or concerns about data security (to select only the negatives). We need to ensure that we are also deliberating the potential positives and having an honest debate about the risks and challenges in getting there.
2. Where’s the evidence?
Evidence matters. Clinical trials, putting a technology to work in the back office before the front line, and keeping a sense of scale and proportion were all part of this challenge.
NHS staff wanted to see evidence that the technology solves the problem it sets out to solve. It is also important to ensure that staff are well trained in the use of the technology, and that this training is ongoing.
Lastly, having enough evidence to prove that the danger of bias from the technologies has been mitigated is crucial.
3. Clinical champions
The NHS is a huge system and innovation can be held back by entrenched structural barriers. Misaligned incentives and antiquated procurement models are unhelpful; cultural resistance and a lack of know-how are perhaps even more damaging.
Across the health system, motivated professionals are already making a difference. They are trialling new ideas and working to overcome structural blocks. Through the conversations we had, it was clear that clinical champions should not be seen as ‘heroes’, but rather as individuals and teams working within a system to blueprint and share best practice. At the RSA we refer to these individuals as ‘system entrepreneurs’: they are crucial to the health of a system and its capacity to change and grow while keeping its human values front and centre. The NHS must nurture them in this age of radical technologies and unparalleled opportunities.