How will artificial intelligence change radiology?

Artificial intelligence and cognitive computing are being heralded as the brave new frontier of clinical IT. Kim Thomas reports on how they are already beginning to reshape radiology imaging and diagnostics.

IBM chose December’s annual meeting of the Radiological Society of North America to showcase the ability of its Watson supercomputer to rapidly analyse medical images and suggest a diagnosis. Mark Griffiths, a clinical radiologist at University Hospital Southampton NHS Foundation Trust, who attended RSNA, says he saw some “stunning demonstrations” of the technology, including chest X-rays being “reported in milliseconds.”

Watson is an example of a technology that IBM refers to as “cognitive computing”. Using a form of artificial intelligence known as natural language processing, Watson, a cloud-based system, is able to analyse vast stores of scholarly articles, patient records and medical images. (When IBM acquired Merge Healthcare in 2015, it gained access to the company’s database of 30 billion images.) This ability to interpret written language is what marks Watson out as different from other computer-based tools used to aid diagnosis.

Not enough radiologists to meet demand

In England, the volume of radiology images taken has increased at the rate of 3.6% a year for 20 years, and there are not enough radiologists to meet demand. Could Watson – and other AI tools – provide a solution to the problem of overstretched radiology departments? And – as some fear – could it replace radiologists altogether?

Many radiologists are already comfortable with the idea of using computer-aided diagnosis. Neelam Dugar, consultant radiologist at Doncaster & Bassetlaw Hospital NHS Trust, has used Siemens’s CAD tool for detecting lung nodules, finding its detection abilities “far superior to radiologists”, though she adds that, while its sensitivity is high, its specificity is low, so that it also “picks up a lot of rubbish”.
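The trade-off Dugar describes can be made concrete with a small sketch. The counts below are entirely hypothetical, chosen only to illustrate how a tool can score high on sensitivity (few missed nodules) while scoring low on specificity (many false alarms); they are not figures for any real product.

```python
# Hypothetical confusion-matrix counts for a nodule-detection CAD tool
# screening 1,000 scans -- illustrative only, not real performance data.
tp, fn = 95, 5      # nodules present: detected vs missed
fp, tn = 270, 630   # nodules absent: false alarms vs correctly cleared

sensitivity = tp / (tp + fn)  # share of real nodules the tool catches
specificity = tn / (tn + fp)  # share of clear scans it correctly passes

print(f"sensitivity = {sensitivity:.2f}")  # high: it rarely misses a nodule
print(f"specificity = {specificity:.2f}")  # low: it "picks up a lot of rubbish"
```

With numbers like these, a radiologist still has to review every flagged scan to weed out the false positives, which is why a high-sensitivity, low-specificity tool aids rather than replaces the reader.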

The UK firm Blackford Analysis has had success in the US market with Smart Localizer, a tool that enables radiologists to compare multiple imaging studies simultaneously. CEO Ben Panter says: “It lines up scans to allow radiologists to read them more quickly. We do that by being able to link lesion locations of any sort between current and prior scans or across modalities. If a radiologist has to track the change in five or six different lesions there’s a huge amount of scrolling they have to do and we eliminate that time.”

AI tools ‘taught’ like a medical student and continually learn

But solutions such as Watson, and other similar AI tools being developed by Enlitic and Sectra, seem to offer a radical departure, mimicking the processes used by the human brain. Watson has been “taught” by experts how to interpret the reams of data it stores. Steve Tolle, global VP of imaging strategy at IBM Watson Health, explains: “We train this platform as if we’re training a medical student, through repetitive learning.”

As it acquires new information, Watson is able to improve its diagnostic capabilities. “What will evolve over time will be the medical literature,” says Tolle. “There’s going to be a new therapy, there’s going to be a new published article on different techniques or risk scoring for a specific disease. Watson will be able to surface those changes in the medical literature, and that’s where you’ll see the evolution of our technology. You will see accuracy and specificity of some of these algorithms improve.”

Dugar can certainly see the opportunities offered by cognitive computing. In radiology, she says, two of the most common errors are missed lung cancer and missed fractures: “I can see huge potential for improving patient care. Missed fractures and missed lung cancers will reduce substantially, and it will reduce the cost to the NHS from litigation.”

Misinterpretation remains key challenge

But she is also mindful of the technology’s limitations. While she believes CAD tools can reduce one kind of radiological error (“not being able to perceive something”), she is less sure that they can help with the problem of misinterpretation – spotting something on an image but failing to identify it correctly. “We do develop visual fatigue,” she says. “There are too many images there. But what we are good at is putting together clinical history, blood results, histopathology reports and giving a diagnosis, and I think that’s a very long way away for computers.” She also notes that the use of free text in an electronic patient record (EPR) might cause problems for an algorithm – in a phrase such as “no evidence of lung cancer”, the computer might only pick up “lung cancer”.
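The negation pitfall Dugar raises is a well-known problem in clinical natural language processing. The sketch below is a deliberately crude illustration (the cue list and both functions are hypothetical, not from any product mentioned in this article): a naive keyword matcher fires on “no evidence of lung cancer”, while a minimal negation-aware check does not.

```python
import re

# Hypothetical negation cues, loosely in the spirit of rule-based
# clinical NLP heuristics -- illustrative, not an exhaustive list.
NEGATION_CUES = ("no evidence of", "no ", "without", "negative for", "denies")

def naive_match(report: str, term: str) -> bool:
    """Keyword-only matcher: flags the term wherever it appears."""
    return term in report.lower()

def negation_aware_match(report: str, term: str) -> bool:
    """Flags the term only when no negation cue precedes it
    within the same sentence."""
    for sentence in re.split(r"[.;]", report.lower()):
        if term in sentence:
            prefix = sentence.split(term)[0]
            if not any(cue in prefix for cue in NEGATION_CUES):
                return True
    return False

report = "No evidence of lung cancer. Mild degenerative change."
print(naive_match(report, "lung cancer"))           # True: the false positive Dugar warns about
print(negation_aware_match(report, "lung cancer"))  # False: negation recognised
```

Real systems need far more than this (scoped negation, uncertainty, abbreviations, temporality), which is why free-text reports remain hard for algorithms.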

AI access to large validated imaging datasets vital

While optimistic about the potential of cognitive computing, Griffiths points to the importance of technologies such as Watson having access to validated datasets, so that they learn when a diagnosis has been correct.

In the UK, he notes, large imaging datasets have been accumulated in the 12 years since PACS systems were rolled out across the NHS: “If we can link that properly to our electronic medical records or even with the Cancer Registry, that’s where we’re going to get to win with these powerful tools.” He sees potential in the decision by NHS Scotland to hand over 30m studies (more than two billion images) from its national Carestream PACS to the Farr Institute, a research collaboration between academic institutions and health organisations that is applying data analytics to improve patient care.

AI diagnosis only as good as algorithms used

The effectiveness of AI tools in making diagnoses can only be as good as the algorithms used to inform them. It’s an issue that Sectra, also working on an AI solution for radiology, understands all too well, says Claes Lundstrom, the company’s research director: “How can we be sure that this algorithm is still going to work when we put in a new version of the CT scanner? Or when we try to apply it to another department down the street? That’s the very fundamental problem, and that’s where the work lies ahead. We have good core algorithms but we have to find out how to put them into a workflow context and organise the quality assurance measures that make sure we don’t arrive at these horrible mistakes.”

We may find out quite soon how successful Watson is when used in the real world. This year, IBM will launch two Watson-based imaging applications: a retrospective peer review tool and a patient summarisation tool. Other applications will follow in 2018. In the longer term, IBM’s plans for the technology are ambitious: Tolle says that when the imaging tool has helped diagnose a patient’s cancer, the Watson oncology tool can then identify the best treatment – and data on the effectiveness of the treatment will then be used to refine the diagnostic and treatment applications further.

For the present, at least, the consensus is that AI will aid radiologists, but not replace them. As Lundstrom says: “The role of the radiologist will change but it will be more like a Formula One driver kind of role. You have this extremely powerful machine but you need a very competent driver to know where to go and how to get there in the most effective and efficient way.”

This article originally appeared in Digital Health.
