Recovery from severe traumatic brain injury (TBI) can be frustrating for families and trauma professionals alike. It occurs in fits and starts, and the longest wait is for the patient to wake up. We perform serial neurologic exams and monitor closely for any visible response to commands.
A group of medical and engineering researchers at SUNY Stony Brook theorized that muscle movements in response to commands might be too subtle for human detection during early emergence from coma. They developed an AI system called SeeMe that analyzes patients’ facial appearance down to the level of individual pores, and trained it to detect very fine motor movements in response to three commands:
- Stick out your tongue
- Open your eyes
- Show me a smile
Sixteen healthy volunteers and 37 TBI patients were then tested while being video recorded. Both SeeMe and trained experts judged the responses.
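To make the idea concrete, here is a toy sketch of command-locked movement detection. This is purely illustrative and is not the authors’ actual computer-vision pipeline; it simply compares frame-to-frame “motion energy” before and after a command cue, flagging a response when post-cue motion clearly exceeds the patient’s baseline.

```python
# Toy sketch (NOT the SeeMe pipeline): flag subtle, command-locked facial
# motion by comparing motion energy before vs. after a command cue.
# Frames are grayscale images represented as lists of lists of intensities.

def motion_energy(frame_a, frame_b):
    """Mean absolute pixel difference between two consecutive frames."""
    total = count = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

def command_locked_response(frames, cue_index, ratio=3.0):
    """Return True if post-cue motion exceeds baseline motion by `ratio`.

    `frames` is the video as a list of frames; the command is given at
    `cue_index`. Thresholds here are arbitrary illustrative choices.
    """
    baseline = [motion_energy(a, b)
                for a, b in zip(frames[:cue_index], frames[1:cue_index])]
    post = [motion_energy(a, b)
            for a, b in zip(frames[cue_index:], frames[cue_index + 1:])]
    base_level = max(sum(baseline) / len(baseline), 1e-6)  # avoid divide-by-zero
    return max(post) > ratio * base_level

# Synthetic demo: a still face, then a faint "twitch" after the cue.
still = [[100] * 8 for _ in range(8)]
twitch = [[100] * 8 for _ in range(8)]
twitch[3][3] = 104  # a few intensity levels -- easy to miss by eye
frames = [still] * 5 + [twitch, still]
print(command_locked_response(frames, cue_index=5))  # True
```

The real system works on high-resolution video of the whole face, but the core logic is the same: look for movement that is time-locked to the command and larger than the patient’s resting baseline.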
Here are the factoids:
- SeeMe detected facial responses 4 days earlier than the trained experts
- It detected eye opening in 86% of patients, versus 74% for human observers
- SeeMe was able to detect mouth movements in 94% of patients without endotracheal tubes
- The SeeMe-detected responses correlated with the clinical outcome at discharge

Bottom line: Severely brain-injured patients are able to respond to commands with subtle facial muscle movements before human observers can detect them. A specially trained AI like SeeMe can identify these movements and help predict recovery sooner than clinicians. Imagine being able to tell a family, who has seen their loved one making no progress, that improvement is actually occurring! And imagine the applications focused AI could have in other clinical areas where human senses lack the acuity of carefully trained machines!
Reference: Computer vision detects covert voluntary facial movements in unresponsive brain injury patients. Commun Med 5, 361 (2025). https://doi.org/10.1038/s43856-025-01042-y
