The radiology reading room of a large NHS hospital looks much as it did five years ago: low light, two monitors, a half-drunk cup of coffee cooling on the desk, and the distinct, focused silence of someone closely examining something. What has changed is what happens before the radiologist sits down. In a growing number of hospitals, by the time a scan reaches a human set of eyes, an AI system has already evaluated it, flagged anything it deems urgent, ranked the case by severity, and, in many cases, drafted a preliminary report. The radiologist is no longer the first reader but the second opinion.
The figures behind this shift have begun to read less like a forecast and more like a ledger. In a large NHS breast cancer screening trial covering 115,000 women, Google’s AI system detected 24% more cancers than trained radiologists working alone. In independent studies, DeepMind’s HEAL and a model called AsymMirai have caught tumors that human eyes missed, cutting false negatives in certain screening scenarios by as much as 94%. These are not incremental gains. In any other industry, figures like these would prompt a fairly blunt conversation about staffing. Medicine is having that conversation too; it is just a more complicated one.
| Aspect | Detail |
|---|---|
| Topic | AI-assisted cancer diagnosis in radiology — how machine learning systems are reshaping screening, detection, and clinical workflow in 2026 |
| Scale of Impact | AI now routinely triages and analyses millions of scans annually across NHS and major health systems worldwide |
| Key NHS Study Finding | Google’s AI detected 24% more cancers than trained radiologists in a UK screening study covering 115,000 women |
| False Negative Reduction | Models including DeepMind’s HEAL and AsymMirai have reduced false negatives in some screenings by up to 94% |
| Report Turnaround Improvement | AI tools have cut reporting times by up to 83% — from 48 hours to under 9 hours in tested environments |
| Interval Cancer Detection | AI identifies roughly 25% of “interval cancers” — tumours that develop between regular screenings and are typically missed by human review |
| Notable AI Tools | Qure.ai (real-time flagging of lung nodules and intracranial haemorrhage), DeepMind HEAL, AsymMirai |
| Doctor Consensus | Radiologists shifting from image readers to diagnostic coordinators — integrating AI findings with patient history and clinical context |
| Outstanding Challenges | Liability when AI errs, “black box” decision-making, false positives in complex cases, trust gaps in clinical settings |
| Prevailing Industry View | “AI will not replace radiologists, but radiologists who use AI will replace those who don’t” |
Perhaps the most quietly startling finding in recent research concerns what doctors call interval cancers: tumors that form and become apparent in the window between scheduled screenings, usually going undetected until the patient’s next appointment. These cases haunt radiologists because they are, by definition, discovered too late. AI systems are now catching roughly 25% of these future cases at earlier stages, reading ahead from current imagery with a kind of pattern recognition that does not map neatly onto how human visual cognition works. What exactly the models see that people do not remains unknown, and that knowledge gap is itself part of the problem.
The workflow changes are easier to quantify but harder to romanticize. Tools like Qure.ai flag critical findings, such as lung nodules or intracranial hemorrhages, in real time, ensuring that the highest-risk patients are reviewed first rather than waiting their turn in order of arrival. In some test environments, report turnaround times have fallen by 83%, from 48 hours to under 9. That is more than an efficiency metric. In oncology, the interval between a scan and a treatment decision matters, and delays compound. Every day a report sits unread, the patient is waiting; the disease is not.
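The shift described above, from first-come-first-served reading to severity-ranked worklists, is at heart a priority-queue problem. The sketch below is a toy illustration of that idea, not the implementation of Qure.ai or any real clinical system; the severity scores, accession numbers, and findings are invented for the example.

```python
import heapq
from dataclasses import dataclass

@dataclass
class Study:
    """A scan awaiting review. `priority` is a hypothetical AI severity
    score: lower value = more urgent. Real systems expose richer outputs."""
    priority: int
    accession: str
    finding: str

def triage(studies):
    """Return studies in worklist order: AI-flagged critical findings first,
    with arrival order preserved as a tiebreaker within each severity tier."""
    heap = []
    for arrival, s in enumerate(studies):
        # Tuples compare element-wise, so (priority, arrival) gives a
        # stable ordering: severity first, then first-come-first-served.
        heapq.heappush(heap, (s.priority, arrival, s))
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

worklist = triage([
    Study(3, "ACC-1001", "routine chest x-ray"),
    Study(1, "ACC-1002", "suspected intracranial hemorrhage"),
    Study(2, "ACC-1003", "lung nodule flagged"),
    Study(3, "ACC-1004", "routine follow-up"),
])
# The hemorrhage case jumps the queue even though it arrived second.
```

The design point is the tiebreaker: without the arrival index, two routine cases with equal severity would have no defined order, and a busy worklist should degrade gracefully to plain FIFO when the AI has nothing urgent to flag.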
There’s a tension here that doesn’t settle cleanly into either jubilation or fear. The phrase “AI will not replace radiologists, but radiologists who use AI will replace those who don’t” sums up the majority opinion among radiologists, at least those willing to say it aloud. Repeated in enough department meetings and conference halls, it has practically become a professional motto. It’s a framing that acknowledges the change while preserving the profession’s central role, and it’s probably accurate, at least for now. But it sidesteps the harder question of what “using AI” really means when the system is making judgments you can’t always explain to the patient sitting across from you, and producing results you can’t always verify.

When physicians are candid about their reluctance, the word “liability” comes up. If an AI system marks a scan as clear and a tumor is discovered three months later, who is responsible? If a radiologist overrides an AI flag and turns out to be wrong, does the existence of the AI finding alter the legal calculus? There are currently no definitive answers to these questions, and the pace of clinical adoption is outrunning the frameworks that medical boards, insurers, and regulators are building to address them. The technology is moving fast. Governance is catching up.
A number of senior practitioners describe the radiologist’s role as evolving toward something closer to orchestration: integrating the AI output, the patient history, the clinical context, and everything a scan cannot reveal about the person inside the machine. That framing acknowledges the change without downplaying it. The reading room looks the same. The coffee still goes cold. But the pile of work on the desk has already been sorted by something that doesn’t need sleep, doesn’t lose focus in the sixth hour of a shift, and, for better or worse, doesn’t have to carry the memory of its own mistakes.