How the technology works
The AI-driven technology uses facial expression recognition to detect stroke by analysing facial symmetry and specific muscle movements, known as action units.
The Facial Action Coding System (FACS), initially developed in the 1970s, categorises facial movements by the contraction or relaxation of facial muscles, providing a detailed framework for analysing facial expressions.
“One of the key parameters that affects people with stroke is that their facial muscles typically become unilateral, so one side of the face behaves differently from the other side of the face,” de Oliveira said.
“We’ve got the AI tools and the image processing tools that can detect whether there is any change in the asymmetry of the smile – that is the key to detection in our case.”
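The asymmetry idea can be illustrated with a minimal sketch. This is not the researchers' implementation; it assumes we already have paired 2D facial landmarks (for example, the left and right mouth corners during a smile) and a vertical facial midline, and it simply mirrors the right-side points across the midline to measure how far they fall from their left-side counterparts. A higher score means a more asymmetric expression.

```python
# Illustrative sketch only (hypothetical landmarks, not the study's method):
# mirror right-side facial landmarks across the face midline and measure the
# distance to their left-side counterparts. A symmetric smile scores near
# zero; a drooping side raises the score.

def asymmetry_score(left_pts, right_pts, midline_x):
    """Mean distance between left landmarks and mirrored right landmarks.

    left_pts / right_pts: paired lists of (x, y) landmark coordinates.
    midline_x: x-coordinate of the vertical facial midline.
    """
    if len(left_pts) != len(right_pts):
        raise ValueError("landmark lists must be paired")
    total = 0.0
    for (lx, ly), (rx, ry) in zip(left_pts, right_pts):
        mirrored_x = 2 * midline_x - rx  # reflect the right point across the midline
        total += ((lx - mirrored_x) ** 2 + (ly - ry) ** 2) ** 0.5
    return total / len(left_pts)

# Hypothetical mouth-corner coordinates during a smile:
symmetric = asymmetry_score([(40, 60)], [(60, 60)], midline_x=50)  # both corners level
drooping = asymmetry_score([(40, 60)], [(60, 68)], midline_x=50)   # one corner sags
print(symmetric, drooping)  # 0.0 8.0
```

In practice a system like the one described would obtain these landmarks automatically from video frames and track action-unit activity over time, but the core symmetry comparison reduces to a measurement of this kind.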
Video recordings of facial expression examinations of 14 post-stroke patients and 11 healthy controls were used in the study.
Next steps
The team plans to develop the smartphone tool into an App in collaboration with healthcare providers, and to extend it to detect other neurological conditions that affect facial expressions.
“We want to be as sensitive and specific as possible. We are now working towards an AI tool with additional data and where we are going to be considering other diseases as well,” Kumar said.
“Collaboration with healthcare providers will be crucial to integrate this App into existing emergency response protocols, providing paramedics with an effective means of early stroke detection.”
The researchers from RMIT partnered with São Paulo State University in Brazil on this research.
‘Facial expressions to identify post-stroke: A pilot study’ is published in Computer Methods and Programs in Biomedicine (DOI: 10.1016/j.cmpb.2024.108195).
Story: Will Wright