CIRSE: Can you tell us about yourself, your hospital, and your personal interest in AI and how it affects your daily practice?
Wood: I am lucky to lead a team of smart, eclectic, multi-disciplinary IR MDs, PhDs, software engineers, chemical engineers, image processing and regulatory science experts, and trainees, with a clinical arm in IR and a research arm in the Center for Interventional Oncology in the Intramural Research Program of the National Institutes of Health (USA), the National Cancer Institute, the National Institute of Biomedical Imaging and Bioengineering, and the NIH Clinical Center, the world’s largest hospital dedicated solely to clinical and translational research. AI is a tool that is becoming integrated into every corner of what we do, in hidden ways as well as more obvious ones. Today, we use AI for image reconstruction, segmentation, registration, and optimization of pathways and endpoints. AI identifies and selects specific biopsy targets and classifies lesions on prostate MRI, and can actually predict a tumour margin or treatment plan that is not always visible to the naked eye. What is most exciting, though, is what is yet to come. These are super exciting times to be in IR.
Daye: I am currently an academic interventional radiologist at Massachusetts General Hospital and an Assistant Professor of Radiology at Harvard Medical School with a joint appointment in the MGH/HST Martinos Center for Biomedical Imaging where I lead the Precision Interventional and Medical Imaging (PIMI) Lab. My research centres on the applications of machine learning and computer vision for precision medicine. I am an engineer by training. I completed my MD/PhD at the University of Pennsylvania, where I obtained my PhD in Bioengineering. Part of my research there focused on using machine learning for breast cancer diagnosis, risk prediction, and prognostication. I have used the skills I gained during my PhD to focus my current research endeavours on the applications of AI in IR. I am very excited about all the opportunities this new technology will bring to advance the care of the patients we serve in interventional radiology.
We are currently in the very early stages of seeing AI affecting our daily practice in interventional radiology. While there are a number of applications on the market today to aid in care coordination and patient selection for IR procedures, we are just scratching the surface. I can’t wait to see how AI will change our practice over the coming decades.
Lévy-Boeken: Let me start by expressing how thrilled I am to be joining the editorial board. Being part of a new section is just like AI in IR: few landmarks, many blank pages, and hopefully a bright future! I joined Professor Sapoval’s team three years ago. Our team is fully dedicated to interventional radiology and consists of three (and soon four) interventional units. We are based in Paris, at the European Hospital Georges Pompidou, within APHP (greater Paris public hospitals).
I come from a mathematical background and later entered the medical world, which explains why I work both at the university hospital and in a maths lab. My main area of clinical practice is digestive oncology, and we are trying to bring AI into our interventions to enhance every step of our work.
To be completely honest, AI does not affect my daily practice yet. This is probably true for many physicians: we attend great congress presentations, read impressive publications, yet everyday cases seem like… everyday cases. I have to admit this is part of the fun, taking part in something that has yet to come true. Regarding the preclinical research we are conducting, I am personally interested in automation and in how AI might affect the core of image-guided interventions, where we basically perceive and act simultaneously. If you look at AI and radiology, there is a tremendous gap between diagnostic research (whether institutional or industrial) and interventional research. You can’t blame radiologists for focusing on images at first, but we now have the capability to go beyond, from personalized multimodal patient selection towards smart robotics.