Children with autism spectrum disorder (ASD) often find it difficult to gauge other people’s emotions based on their facial expressions – this can in turn lead to problems in communicating with those people. Scientists at Stanford University, however, are seeing new hope in an approach that utilizes Google Glass smart glasses.
Ordinarily, therapists use tools like flash cards of faces to teach ASD kids what different emotions look like. Unfortunately, this sort of training has to be performed by a professional in a clinic, and recognizing still images of faces on cards doesn't always transfer to taking part in real-life social interactions.
In the new Stanford study, children with autism wore Google Glass headsets that were wirelessly linked to a machine learning-based app on a nearby smartphone. That app analyzed the view from the glasses' forward-facing camera, gauging the facial expressions of the people with whom the children were interacting. It then determined which of eight core expressions those people were showing, and told the child via the glasses' speaker and in-lens display. Those expressions represented happiness, sadness, anger, disgust, surprise, fear, neutrality or contempt.
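To give a rough sense of how an app like this might settle on one of those eight labels, here is a minimal sketch in Python. It assumes a face-analysis model has already produced a raw score for each expression; the scores, function names and values below are purely illustrative and are not from the Stanford app.

```python
import math

# The eight core expressions reported in the study
EXPRESSIONS = ["happiness", "sadness", "anger", "disgust",
               "surprise", "fear", "neutrality", "contempt"]

def softmax(scores):
    """Convert raw model scores into probabilities that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify_expression(scores):
    """Return the most likely expression label and its probability."""
    probs = softmax(scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    return EXPRESSIONS[best], probs[best]

# Mock scores from a hypothetical face-analysis model (assumed values);
# the highest score here is in the first position, i.e. "happiness".
label, prob = classify_expression([2.1, 0.3, -1.0, 0.0, 1.2, -0.5, 0.8, -2.0])
print(label)
```

In a real system, a step like this would run on each camera frame, with the winning label forwarded to the glasses' speaker and display.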