During a night shift when she was a resident at Yale New Haven Hospital, Cristiana Baloescu was taking care of a patient with a complex presentation of symptoms. The patient was experiencing some back and leg pain. But they also looked really sick.
“Emergency physicians spend years developing the skill to recognize whether a patient looks sick or not,” said Baloescu, now assistant professor of emergency medicine at Yale School of Medicine. “With patients who look sick, we spend a lot of time thinking about what’s going on, because even if they’re doing well now, they probably won’t be doing very well in the next hour or two.”
Luckily, Baloescu figured out what was going on with the patient pretty quickly. Using what is known as a point-of-care ultrasound, she discovered that the patient had an aortic dissection, or a tear in the aorta — the body’s main artery that carries blood from the heart to the rest of the body.
“I saw what I needed to see with ultrasound, and we were able to get that patient the immediate medical care that they needed, and thankfully, the patient lived,” she said.
Point-of-care ultrasound is an ultrasound that doctors perform and interpret at the bedside, giving fast, clear answers to guide care; traditional scans, by contrast, are more detailed but require a specialist to perform and read them. Portable ultrasound machines provide detailed images to physicians like Baloescu in real time so they can make informed decisions quickly.
These days, artificial intelligence (AI)-guided ultrasounds could make care even more responsive to patients’ needs while also supporting clinicians in making faster, more confident decisions. At Yale, Baloescu has been studying the use of AI-guided ultrasounds, specifically lung ultrasound imaging, which is essential for identifying conditions such as pneumonia and pulmonary edema.
In an interview, Baloescu discusses her interest in point-of-care ultrasound, what AI could mean for this technology, and how that could improve patient care.
The interview has been edited for length and clarity.
How did you first get interested in point-of-care ultrasound?
Cristiana Baloescu: During my residency training, point-of-care ultrasound was a crucial bedside tool for me — it allowed me to quickly understand why patients had come to the emergency department and identify what was going on with a patient in a timely fashion, which is what emergency medicine is all about. Point-of-care ultrasound also helped me guide treatment once a diagnosis was made. For instance, with pulmonary edema from heart failure, lung ultrasound can help you determine whether the patient is at their diuretic goal [which relates to fluid or salt levels in the body] or whether you need to give them more based on what you see on ultrasound. Seeing it in action over several cases, I realized how much ultrasound could impact patient care. There’s something powerful about being able to see inside a patient just by holding a machine in your hands — nothing gives more immediate gratification, and it’s exactly the kind of thing emergency physicians thrive on.
When and why did you start thinking about the potential application of AI within the field of ultrasound medicine?
Baloescu: During my fellowship training, one of my projects was with my mentor, Christopher Moore, also at Yale School of Medicine. We were looking at the application of a software algorithm that could quantify ejection fraction — that’s the amount of blood that the left ventricle squeezes out — and I got involved in that project. I thought it was very interesting and had a lot of potential. I think AI can make point-of-care ultrasound even faster and more efficient, which is naturally something an emergency medicine physician gravitates toward just by the nature of our job.
Where’s the greatest potential for AI within point-of-care ultrasound medicine?
Baloescu: For point-of-care ultrasound, the skill sets that AI can help with are the acquisition and the interpretation of ultrasound clips. With acquisition, that’s when you need to hold the probe and get the best images you can. With interpretation, that’s when you have a clip and try to understand what that tells you. AI-assisted ultrasounds could help health care professionals who are not experts in ultrasound both capture high-quality images and then understand what they mean.
AI can also help with clinical workflow and education. It can help with research and administration as well. When I say administration, I am referring to quality assurance or help with completing documentation. Although practices differ across the United States, most academic institutions and hospitals that use point-of-care ultrasound spend a lot of time on documentation, image archival work, and quality assurance because we treat point-of-care ultrasound like a diagnostic test within emergency medicine. So, you need to make sure that the work you do is of high quality.
Are there any risks or challenges with using AI-guided ultrasounds?
Baloescu: AI does have some risks. If we use ultrasound as a diagnostic test, for example, there are the risks of false positives and false negatives — and those are still there with AI guidance and interpretation. So, you still need clinicians who are experienced with ultrasound, particularly at the start of integrating AI into everyday practice, to make sure that what we are relying on is accurate. Attention also needs to be paid to the training data sets we use to build AI algorithms, because the AI is really only as good as the training data you give it, especially for ultrasound.
What could AI-guided ultrasounds mean for the patients and their health journeys?
Baloescu: In an ideal future, it would help them get ultrasound imaging earlier in their clinical journey. For example, I can envision AI being used as part of triage. So, as part of taking your vital signs, if appropriate for the reason for your emergency visit, a technician or a nurse could do a quick ultrasound of your heart or your lungs, and that could give the physician more information sooner so they can treat you faster and perhaps more accurately.
For patients who aren’t in the hospital — for example, those being seen in community health programs or during house visits — a health care worker could perform the ultrasound, and an expert physician could review the clips either on site or through a telehealth platform and help make some immediate decisions.
AI guidance and interpretation could also help during ambulance transport. There are many areas in the United States where ambulance transport times are long. If paramedics can use AI-powered ultrasound, that might help them make faster decisions about starting treatment and facilitate more targeted, precise care tailored to what we think is going on with the patient.