Generative AI in Radiology Is Finally Here. Now Can It Reach Everyone?
Earlier this month, something happened in radiology AI that hadn't happened before.
Cognita, the AI division of Mosaic Clinical Technologies, received FDA Breakthrough Device Designation for a vision-language model that interprets chest X-rays and drafts preliminary findings for radiologist review.1 Cognita CXR is still investigational and is working through the FDA review process this designation is designed to accelerate. The designation signals that the FDA has concluded there is a reasonable expectation this technology could offer more effective diagnosis of life-threatening conditions, and that it will receive prioritized interaction and review as a result. The agency is treating it as important enough to move to the front of the line.
That's a first for a generative AI model in radiology. And it matters, not just for Cognita, but for what it signals about where the field is finally heading.
A Decade of Detection
Radiology has been an AI success story by almost every conventional measure. The field now accounts for just over three-quarters of all FDA-authorized AI-enabled medical devices, with the total number surpassing 1,000 in 2025.2,3 Between 1995 and 2015, only 33 devices were authorized in total.2 In 2023 alone, 221 were authorized.2
Nearly all of those algorithms do one thing: analyze images and flag findings. A nodule here. A hemorrhage there. A suspected PE on a CTPA. They're detection tools, built to identify specific pathologies with high sensitivity and cleared for a narrow clinical purpose. A 2024 survey of 43 US health systems found that imaging and radiology were the most widely deployed clinical AI use cases, with 90% of organizations reporting at least partial deployment.4
Detection matters. Missing a PE or an intracranial hemorrhage has real consequences for real patients. But detection is only a part of what we do. A major component of our job is to see and to say: to review images and produce a structured, clinician-ready, actionable report that communicates what we found and what it means. Those are two very different tasks. And AI has spent the last decade almost entirely focused on the first one.
Here's what that looks like in practice. You're working through a busy list. A chest CT lands on your worklist, and an AI flag appears in the corner of your viewer. You click over, look at it, decide it's a lymph node and not a nodule, dismiss the flag, and return to your read. Then you open your report and dictate the study. The same way you would've dictated it before the AI was ever installed.
The AI did something. Whether it saved you any time is a different question.
Even the best detection tools couldn't close that gap. A study in the European Journal of Radiology found that deep learning detection of pulmonary embolism on CTPA, technically sound and correctly flagging positive studies, produced no significant improvement in report communication times or patient turnaround in the emergency department nine months after implementation.5 A prospective study in Radiology confirmed the pattern: AI triage reduced wait time for positive PE studies, a real benefit, but did not improve radiologist accuracy, miss rate, or overall report turnaround times.6 The bottleneck was never in the detection. It was in everything that came after.
A 2025 summary representing 27 radiology societies described "wide gaps between expectations, desires, needs, and the current state of emerging technology," with "broad concerns regarding the perception of limited impact of AI on radiologist workflows."7 This is our specialty saying, collectively, that the tools haven't delivered. A 2023 systematic review put it plainly: AI currently has "modest to moderate penetration in clinical practice, with many radiologists still being unconvinced of its value and the return on its investment."8
Why Generative Reporting Is Different
The report is our deliverable. It's what the referring physician acts on; it drives follow-up and guides treatment. Everything we do ultimately flows into that document. For a decade, that part of the workflow went untouched, not because anyone thought it was unimportant, but because the technology to address it wasn't ready.
Cognita's Breakthrough Device Designation is a meaningful signal that the regulatory path for generative reporting is beginning to take shape. A vision-language model that interprets images and drafts preliminary findings is solving a fundamentally different problem than a detection algorithm that flags and hands back. It's addressing the say half of our job, not just the see half. Getting AI output into an actual draft report, positioned for radiologist review and sign-off, is the right problem to be working on.
The Harder Question
But here's what I keep coming back to: who gets this, and when?
Large consolidated networks have structural advantages that are hard to overstate. When you have thousands of radiologists reading on a common platform, you can build tightly integrated AI, validate it at scale with real clinical data, and deploy it across an entire network in ways smaller groups can't easily replicate. That's not a criticism — it's a description of what scale makes possible.
Most of us don't practice that way. We practice in independent groups, academic medical centers, and community hospitals. We're reading on legacy reporting platforms that often take years to upgrade or replace. For us, the question isn't whether the AI is good. It's whether the AI works within the environment we already have. And interoperability is only part of that. Radiology groups also need confidence in accuracy, governance, and how draft output fits into clinical accountability.
A draft reporting solution that requires migrating to a new reporting platform faces adoption barriers unrelated to model quality. The technology doing the image interpretation and the system where the radiologist dictates and signs don't have to be the same product. Getting AI-generated draft content into the reporting workflows most radiologists already use, without asking them to rebuild their infrastructure, is the next problem the field needs to solve.
Generative AI in radiology is finally here. Bringing it to everyone is the work that remains.
William Boonn, MD, is a practicing cardiovascular radiologist at Penn and Chief Medical Officer at HOPPR, where he developed Presto along with John Paulett, Woojin Kim, MD, and Brandon Smith.
References
1. Mosaic Clinical Technologies. Mosaic Clinical Technologies Announces FDA Breakthrough Device Designation for Cognita's Generative AI Model for Radiology. BusinessWire. March 4, 2026. https://www.businesswire.com/news/home/20260304633206/en/Mosaic-Clinical-Technologies-Announces-FDA-Breakthrough-Device-Designation-for-Cognitas-Generative-AI-Model-for-Radiology
2. Sivakumar R, Lue B, Kundu S. FDA Approval of Artificial Intelligence and Machine Learning Devices in Radiology: A Systematic Review. JAMA Netw Open. 2025 Nov 3;8(11):e2542338. doi: 10.1001/jamanetworkopen.2025.42338. PMID: 41201805.
3. The Imaging Wire. FDA AI Approvals Surge Past 1k for Radiology. December 2025.
4. Poon EG, Lemak CH, Rojas JC, Guptill J, Classen D. Adoption of artificial intelligence in healthcare: survey of health system priorities, successes, and challenges. J Am Med Inform Assoc. 2025 Jul 1;32(7):1093-1100. doi: 10.1093/jamia/ocaf065. PMID: 40323320.
5. Schmuelling L, Franzeck FC, Nickel CH, Mansella G, Bingisser R, Schmidt N, Stieltjes B, Bremerich J, Sauter AW, Weikert T, Sommer G. Deep learning-based automated detection of pulmonary embolism on CT pulmonary angiograms: No significant effects on report communication times and patient turnaround in the emergency department nine months after technical implementation. Eur J Radiol. 2021 Aug;141:109816. doi: 10.1016/j.ejrad.2021.109816. PMID: 34157638.
6. Rothenberg SA, Savage CH, Abou Elkassem A, Singh S, Abozeed M, Hamki O, Junck K, Tridandapani S, Li M, Li Y, Smith AD. Prospective Evaluation of AI Triage of Pulmonary Emboli on CT Pulmonary Angiograms. Radiology. 2023 Oct;309(1):e230702. doi: 10.1148/radiol.230702. PMID: 37787676.
7. Bruno MA, Grimm L, Pang LJ, Bezold A, Fields BKK, Baird GL, Smetherman D, Coombs L, Rawson JV, Wald C, Hublall RV, Schwartz ES, Chang PJ, Lexa FJ. Artificial Intelligence and Its Impact on Radiology: Summary of the 2024 Intersociety Summer Conference. J Am Coll Radiol. 2025 Nov;22(11):1417-1422. doi: 10.1016/j.jacr.2025.07.025. PMID: 40754127.
8. Mello-Thoms C, Mello CAB. Clinical applications of artificial intelligence in radiology. Br J Radiol. 2023 Oct;96(1150):20221031. doi: 10.1259/bjr.20221031. PMID: 37099398.