AI can help doctors spot anomalies in patient scans, but building a real-world AI medical imaging system at scale faces significant challenges.
LONDON — A new project is set to build the first AI platform for medical imaging to be used by the UK’s National Health Service (NHS). The project, run by King’s College London (KCL) in collaboration with Nvidia, aims to automate the interpretation of radiology data, one of the most time-consuming parts of the radiology process.
While AI and machine learning have long been touted as enabling technologies for medical imaging analysis and diagnostics, implementing such a system in the real world is significantly harder than the rapid pace of technology advances would suggest.
“A key challenge will be perception, within the clinical community and the public,” said M. Jorge Cardoso, CTO of the London Medical Imaging and Artificial Intelligence Centre for Value-Based Healthcare at KCL.
The AI system will not make patient care decisions; instead, it will support clinical teams by helping ensure the right patients are prioritized and by providing supporting information to assist decision-making.
“It is our responsibility to ensure that this [system] is seen as making radiologists more effective, supporting their decision-making process and ultimately improving patient care and patient throughput through the hospital,” Cardoso said.
Professor Sebastien Ourselin, Head of the School of Biomedical Engineering and Imaging Sciences at KCL, with Jaap Zuiderveld, vice president for EMEA at Nvidia. (Source: KCL)
Another big challenge of building a system like this is the need to preserve patient confidentiality. For the first time, the NHS will use federated learning techniques to tackle patient privacy and data governance issues.
Federated learning “outright avoids many of the classical issues of privacy in healthcare,” said Cardoso. “The key goal is to not move the clinical data from where it currently rests — this means that it stays within the hospitals’ protection, under [their] governance and security procedures.”
While most AI systems centralize data from end devices to train one central model in the cloud, federated learning lets each device download the model, train it locally on locally held data, and send back only a summary of the resulting changes to update the central model. This means individual NHS trusts can develop their own algorithms without the data ever leaving their domain. The trusts’ separate models can then be combined into an overall model that has effectively been trained on all demographics.
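The workflow described above — local training at each site, followed by aggregation of the model updates rather than the data — is the essence of federated averaging. The following is a minimal sketch of that idea using a toy linear model and made-up site sizes (real deployments would use a framework such as Nvidia’s Clara, and the "hospitals" here are purely illustrative):

```python
import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    """One round of local training at a single site.

    The 'model' is a linear least-squares fit and the update is a single
    gradient step -- a stand-in for real on-site training. Only the
    updated weights are returned; the raw data never leaves the site.
    """
    X, y = local_data
    grad = X.T @ (X @ global_weights - y) / len(y)  # mean-squared-error gradient
    return global_weights - lr * grad

def federated_average(global_weights, sites):
    """Combine per-site updates, weighted by local dataset size."""
    sizes = np.array([len(y) for _, y in sites], dtype=float)
    updates = np.stack([local_update(global_weights, d) for d in sites])
    return (updates * (sizes / sizes.sum())[:, None]).sum(axis=0)

# Three hypothetical trusts, each holding its own private dataset.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
sites = []
for n in (50, 80, 120):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    sites.append((X, y))

# The central model improves over rounds without ever seeing the data.
w = np.zeros(2)
for _ in range(100):
    w = federated_average(w, sites)
```

After enough rounds, `w` converges towards the weights that would have been learned from the pooled data, even though each site only ever shared weight updates.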
“Architecting a federated [learning] solution on this scale is challenging as you need to ensure that all these large technical and clinical systems interact correctly so that all the expert teams from all the consortiums can work together successfully,” Cardoso said.
Cardoso explained that there are also many ethical and governance challenges to overcome, from the agreement with clinicians around the clinical pathways, to the deployment of the system in the hospital IT environment, to making sure that patients, caregivers and clinicians see AI as a benefit to the clinical workflow.
Getting the algorithm ready for deployment “will be a slow process and will require a continuous increase in evidence supporting these systems,” Cardoso said. “It’s likely that algorithms will not be deployed directly for patient diagnosis; first, they will be used as safety systems, then maybe as prioritization systems, then triaging systems, then as secondary readers, and only then as primary readers. Every one of these steps will require substantial validation and continuous development to ensure these algorithms will not harm a patient.”
Researchers and engineers from KCL and Nvidia will join clinicians on site at several major London hospitals as part of the project.
Another huge challenge to making a system like this work is assembling a team with the right expertise, said Craig Rhodes, Nvidia Industry Business Development lead for AI in Pharma, Healthcare and Life Sciences in EMEA.
“It’s hard to assemble a team of computer scientists, systems engineers, algorithmic and AI researchers, and people with deep understanding of clinical data and clinical systems that can communicate and build this effectively,” he said.
Nvidia’s DGX-2 AI supercomputer. (Source: Nvidia)
The system will be based on Nvidia DGX-2 supercomputers, the two-petaflops AI accelerator that combines 16 Tesla V100 GPUs with 512 GB of GPU memory for AI training. The project will also use Nvidia’s Clara AI toolkit alongside the NiftyNet convolutional neural network platform co-developed by KCL, plus technologies from partners including Kheiron Medical, Mirada and Scan.
“Medical data is 3D, 4D, and in some situations even 5D. Memory becomes a huge bottleneck in these models, and systems that can sync parameters and memory across many GPUs become essential to train these state-of-the-art models,” said Rhodes.
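The memory pressure Rhodes describes is easy to see with a back-of-envelope calculation. The volume dimensions below are illustrative, not figures from the project:

```python
# Memory footprint of a single 3D medical volume in float32 (illustrative sizes).
voxels = 512 * 512 * 512                      # a high-resolution 3D scan
bytes_per_voxel = 4                           # float32 precision
volume_gb = voxels * bytes_per_voxel / 1024**3
print(f"{volume_gb:.1f} GB per 3D volume")    # 0.5 GB

# A 4D series (e.g. 30 time points) before counting any network activations,
# gradients, or optimizer state -- each of which multiplies the footprint.
time_points = 30
series_gb = volume_gb * time_points
print(f"{series_gb:.1f} GB per 4D series")    # 15.0 GB
```

A single training batch of such series, plus the intermediate activations of a deep network, quickly outgrows any single GPU, which is why pooled multi-GPU memory matters for these models.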
Once the initial project is complete, future rollout of the system to additional NHS sites may make use of inference-specific hardware.
“AI supercomputers such as the DGX-2, with its large memory pool, are ideal for key parts of the algorithms KCL is building,” Rhodes said. “[In the future], it will also be interesting to think about mixing DGXs for training, with Nvidia T4 inference specific cards for deployment, and introducing edge computing with Jetson Xavier and Nano system-on-modules in multiple parts of the hospital for data collection and simplification.”