FAST is an integral component of major trauma evaluation. Unfortunately, although the exam is widely performed, quality control is inconsistent.
Researchers at the University of Pennsylvania studied the use of a standard checklist and its impact on exam quality. Detection of fluid in any of the 4 standard FAST locations was recorded for every exam performed. No attempt was made to grade the amount of fluid seen. Each exam was recorded in video format.
Reviewers credentialed in FAST later reviewed the study videos in a blinded fashion using a checklist. They were also not aware of any CT or OR findings. The checklist contained grading for quality (poor, fair, good), result (positive, negative, unclear), and initial interpretation (positive, negative) for each of the 4 areas scanned. The study was also graded for its educational value.
A total of 247 studies were reviewed. All study results were compared with CT (240) or OR (7) findings. There were 235 true negatives, 6 true positives, 4 false positives, and 2 false negatives. Sensitivity was 75%, specificity was 98%, and accuracy was 98%.
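The reported figures follow directly from the 2x2 counts above. A quick sketch in Python (using only the counts stated in the abstract) confirms the arithmetic:

```python
# 2x2 table counts as reported: 6 TP, 2 FN, 4 FP, 235 TN (total 247)
TP, FN, FP, TN = 6, 2, 4, 235

sensitivity = TP / (TP + FN)                 # fluid detected among those with fluid
specificity = TN / (TN + FP)                 # negative exams among those without fluid
accuracy = (TP + TN) / (TP + TN + FP + FN)   # correct calls overall

print(f"Sensitivity: {sensitivity:.0%}")  # 75%
print(f"Specificity: {specificity:.0%}")  # 98%
print(f"Accuracy:    {accuracy:.0%}")     # 98%
```

Note that specificity (98.3%) and accuracy (97.6%) both round to 98%, matching the reported values.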
Overall, 9% of exams were of good quality, 65% were fair, and 26% were poor. Despite the scarcity of good-quality exams, sensitivity, specificity, and accuracy met the usual literature standards. Overall quality was similar in both true and false exams.
Bottom line: This study reveals that we are doing an “okay” job with FAST exams in trauma patients. However, it also shows that there is room for improvement, and that FAST evaluation should be part of the Performance Improvement program of any trauma center that uses FAST.
Reference: Performance Improvement for FAST Exam. University of Pennsylvania. Presented at the Eastern Association for the Surgery of Trauma meeting, Poster #24, January 2010.