New Technology: Using AI To Interpret Pelvic X-rays

Look out, radiologists! The computers are coming for you!

Radiologists use their extensive understanding of human anatomy and combine it with subtle findings they see on x-ray shadow pictures. In doing this, they can identify a wide variety of diseases, anomalies, and injuries. But as we have seen with vision systems and game playing (think chess), computers are getting pretty good at doing this as well.

Is it only a matter of time until artificial intelligence (AI) starts reading x-rays? Look at how good computers already are at interpreting EKGs. The trauma group at Stanford paired up with Chang Gung Memorial Hospital in Taiwan to test the use of AI for interpreting images to identify a specific set of common pelvic fractures.

The Stanford group used a deep learning neural network (Xception) to analyze source x-rays (standard A-P pelvis images) from Chang Gung. These x-rays were divided into training and testing cohorts. The authors also applied varying degrees of blurring, brightness, rotation, and contrast adjustment to the training set to help the AI overcome these issues when interpreting novel images.
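The augmentation step can be sketched in a few lines. This is a generic illustration, not the authors' code: the perturbation ranges and the 3×3 box blur are assumptions chosen for the example, and rotation is omitted since it requires an interpolation library.

```python
import numpy as np

def augment(image, rng):
    """Randomly perturb brightness, contrast, and sharpness of a grayscale
    image with values in [0, 1]. A simplified stand-in for the paper's
    augmentation; all parameter ranges here are assumptions."""
    img = image.astype(np.float32)

    # Brightness: add a small random offset
    img = img + rng.uniform(-0.1, 0.1)

    # Contrast: scale deviations from the mean intensity
    factor = rng.uniform(0.8, 1.2)
    img = img.mean() + factor * (img - img.mean())

    # Blur: 3x3 box filter, applied half the time
    if rng.random() < 0.5:
        h, w = img.shape
        padded = np.pad(img, 1, mode="edge")
        img = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0

    return np.clip(img, 0.0, 1.0)

rng = np.random.default_rng(42)
image = rng.uniform(0.0, 1.0, size=(8, 8))
augmented = augment(image, rng)
```

Applying several such randomized copies of each training image teaches the network to ignore exactly the kinds of image-quality variation it will see on real portable films.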

The AI interpreted the test images with a very high degree of sensitivity, specificity, accuracy, and predictive values, with all of them over 0.90. The algorithms generated a “heat map” that showed the areas that were suspicious for fracture. Here are some examples with the original x-ray on the left and the heat map on the right:
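Before the examples, note that the four performance figures above all come straight from confusion-matrix counts. The counts below are hypothetical, not from the paper:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic performance measures from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),           # fraction of fractures detected
        "specificity": tn / (tn + fp),           # fraction of normals correctly cleared
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "ppv": tp / (tp + fp),                   # positive predictive value
        "npv": tn / (tn + fn),                   # negative predictive value
    }

# Hypothetical counts for illustration only (not the study's data)
m = diagnostic_metrics(tp=95, fp=5, tn=92, fn=8)
# All five values land above 0.90, the threshold the authors reported
```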

The top row shows a femoral neck fracture, the middle row an intertrochanteric fracture, and the bottom row another femoral neck fracture with a contralateral implant. All were handily identified by the AI.
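Heat maps like these are typically produced by class activation mapping (CAM): the network's final convolutional feature maps are combined using the classifier's weights for the predicted class, then rectified and normalized. A generic numpy sketch of that weighted-sum step, with toy sizes; the paper's exact visualization method is an assumption here:

```python
import numpy as np

def class_activation_map(feature_maps, class_weights):
    """Combine final-conv feature maps (C, H, W) into one heat map using the
    classifier's weights for a single class, then normalize to [0, 1].
    A generic CAM-style sketch, not the authors' specific implementation."""
    cam = np.tensordot(class_weights, feature_maps, axes=1)  # (H, W)
    cam = np.maximum(cam, 0)                                 # keep positive evidence
    if cam.max() > 0:
        cam = cam / cam.max()                                # scale hottest pixel to 1
    return cam

rng = np.random.default_rng(0)
fmaps = rng.random((16, 7, 7))   # 16 channels of 7x7 activations (toy sizes)
weights = rng.random(16)         # classifier weights for the "fracture" class
heat = class_activation_map(fmaps, weights)
```

The low-resolution map is then upsampled and overlaid on the original radiograph, which is why the suspicious regions appear as smooth colored blobs.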

AI applications are usually only as good as their training sets. In general, the bigger the better so they can gain a broader experience for more accurate interpretation. So it is possible that uncommon, subtle fractures could be missed. But remember, artificial intelligence is meant to supplement the radiologist, not replace him or her. You can all breathe more easily now.

This technology has the potential for broader use in radiographic interpretation. In my mind, the best way to use it is to first let the radiologist read the images as they usually do. Once they have done this, then turn on the heat map so they can see any additional anomalies the AI has found. They can then use this information to supplement the initial interpretation.

Expect to see more work like this in the future. I predict that, ultimately, picture archiving and communication systems (PACS) software providers will build this into their products. As the digital images move from the imaging hardware to the storage media, the AI can intercept them and begin the augmented interpretation process. The radiologist will then be able to turn on the heat map as soon as the images arrive on their workstation.

Stay tuned! I’m sure there is more like this to come!

Reference: Practical computer vision application to detect hip fractures on pelvic X-rays: a bi-institutional study. Trauma Surg Acute Care Open 6(1):e000705, 2021. http://dx.doi.org/10.1136/tsaco-2021-000705.

The Value Of Reinterpreting Outside CT Scans

Okay, one of your referring hospitals has just transferred a patient to you. They diligently filled out the transfer checklist and made sure to either push the images to your PACS system or include a CD containing the imaging that they performed. For good measure, they also included a copy of the radiology report for those images.

Now what do you do?

  • Read the report and consider the results
  • Look at the images yourself and make decisions
  • Have your friendly neighborhood radiologist re-read the images and produce a new report

Correct answer: all of the above. But why? First, you can get a quick idea of what another professional thought about the images, which may help you think about the decisions you need to make.

And one of the few dogmas that I preach is: “read the images yourself!” You have the benefit of knowing the clinical details of your patient, which the outside radiologist did not. This may allow you to see things that they didn’t because they don’t have the same clinical suspicion. Besides, read the images often enough and you will get fairly good at it!

But why trouble your own radiologist to take a look? Isn’t it a waste of their time? Boston Children’s Hospital examined this practice in the context of taking care of pediatric trauma patients. This hospital accepts children from six hospitals in the New England states. In 2010, they made a policy change that mandated all outside images be reinterpreted once the patient arrived. They were interested in determining how often there were new or changed diagnoses, and what the clinical impact was to the patient. They focused their attention only on CT scans of the abdomen and pelvis performed at the referring hospital.

Here are the factoids:

  • 168 patients were identified over a 2-year period. 70 were excluded because there was no report from the outside hospital (!), and 2 did not include the pelvis.
  • Reinterpretations differed from the original report in 28% of studies (!!)
  • Newly identified injuries were noted in 12 patients, and included 7 solid organ injuries, 3 fractures, an adrenal hematoma, and a bowel injury. Three solid organ injuries had been undergraded.
  • Four patients with images interpreted as showing injury were re-read as normal
  • Twenty of the changed interpretations would have changed management

Bottom line: Reinterpretation of images obtained at the outside hospital is essential. Although this study was couched as pediatric research, the average age was 12 with an upper limit of 17. Many were teens with adult physiology and anatomy. There will be logistical hurdles that must be addressed in order to get buy-in from your radiologists, such as how they can get paid. But the critical additional clinical information obtained may change therapy in a significant number of cases.

Reference: The value of official reinterpretation of trauma computed tomography scans from referring hospitals. J Ped Surg 51:486-489, 2016.

But The Radiologist Made Me Do It!

The radiologist made me order that (unnecessary) test! I’ve heard this excuse many, many times. Do these phrases look familiar?

  1. … recommend clinical correlation
  2. … correlation with CT may be of value
  3. … recommend delayed CT imaging through the area
  4. … may represent thymus vs thoracic aortic injury (in a 2 year old who fell down stairs)

Some trauma professionals will read the radiology report and then immediately order more x-rays. Others will critically look at the report, the patient’s clinical status and mechanism of injury, and then decide they are not necessary. I am firmly in the latter camp.

But why do some just follow the rad’s suggestions? I believe there are two major camps:

  • Those who are afraid of being sued if they don’t do everything suggested, because then they’ve done everything and shouldn’t miss the diagnosis
  • Those who don’t completely understand what is known about trauma mechanisms and injury and believe that the radiologist does

Bottom line: The radiologist is your consultant. While they are good at reading images, they do not know the nuances of trauma. Plus, they didn’t get to see the patient, so they don’t have the full context for their read. First, talk to the rad so they know what happened to the patient and what you are looking for. Then critically review their read. If the mechanism doesn’t support the diagnosis, or they are requesting unusual or unneeded studies, don’t get them! Just document your rationale clearly in the record. This provides the best patient care and minimizes the potential complications (and radiation exposure) from unnecessary tests.

Reference: Pitfalls of the vague radiology report. AJR 174(6):1511-1518, 2000.

The Cost Of Duplicate Radiographic Studies

Duplicate radiographic studies are a continuing issue for trauma professionals, particularly after transfer from a smaller hospital to a trauma center. The incidence has been estimated anywhere from 25% to 60% of patients. A lot has been written about the radiation dangers, but what about cost?

A Level II trauma center retrospectively reviewed its experience with duplicate studies in orthopedic transfer patients over a one-year period. They looked at the usual demographics, but also included payor, cost information, and the reason for repeat imaging. Radiation dose information was also collected.

Here are the factoids:

  • 513 patients were accepted from 36 referring hospitals
  • 48% had at least one study repeated: 256 CT scans and 161 conventional imaging studies
  • Older patients and patients with low GCS were much more likely to receive repeat studies
  • There was no association with the size of the referring hospital or the ability of the patient to pay
  • Most transfers had commercial insurance; only 11% had Medicaid and 17% were uninsured
  • Additional radiation from repeat scans was 8 mSv. The average radiation dose from both hospitals was 38 mSv. This is 13 years of background radiation exposure!
  • The cost of all the repeat studies was over $96,000
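
The dose figure is easy to verify, assuming a commonly cited average natural background dose of about 3 mSv per year:

```python
BACKGROUND_MSV_PER_YEAR = 3.0   # assumed average annual background dose

repeat_dose_msv = 8.0           # additional dose from the repeated studies
total_dose_msv = 38.0           # average combined dose from both hospitals

years_of_background = total_dose_msv / BACKGROUND_MSV_PER_YEAR
# 38 / 3 works out to roughly 13 years of background exposure
```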

Bottom line: This is an eye-opening study, particularly regarding how often repeat imaging is needed, how much additional radiation is delivered, and now, the cost. And remember that these are orthopedic patients, many of whom had isolated bony injuries. I would expect that patients with multiple and multi-system injuries would require more repeat imaging and waste even more money. It is imperative that all centers that receive transfers look at adopting some kind of electronic data transfer for imaging, be it a VPN or some cloud-based service. With the implementation of the Orange Book by the American College of Surgeons, Level I and II centers will receive a deficiency if they do not have some reliable mechanism for this.

“Level I and II facilities must have a mechanism in place to view radiographic imaging from referring hospitals within their catchment area (CD 11–42).”

Reference: Clinical and Economic Impact of Duplicated Radiographic Studies in Trauma Patients Transferred to a Regional Trauma Center. J Ortho Trauma 29(7):e214-e218, 2015.