Have you ever heard of face pareidolia? This common phenomenon, in which people see faces in inanimate objects, reflects how our brains are wired. Once considered a symptom of psychosis, it is now understood to arise from an error in visual perception.
According to new research led by David Alais of the University of Sydney, our brains detect and respond emotionally to these illusory faces the same way they do to real human faces.
In their paper, “A shared mechanism for facial expression in human faces and face pareidolia,” Alais and his colleagues suggest that human brains are evolutionarily hardwired to recognize faces, with highly specialized brain regions dedicated to facial detection and processing.
The Study
According to ScienceAlert, Alais and his colleagues asked 17 volunteers to view a series of dozens of illusory and real human faces, repeated several times over, and to rate the strength of emotion in each one using the same computer software.
The researchers found that participants largely agreed on the expressions the pareidolia faces were showing, and that a bias crept in based on the expression of the previous face, something that also happens with human faces. The same effect occurred when real and illusory faces were intermixed.
In other words, a succession of happy faces makes us more likely to see the next face as happy as well. That this bias appeared for both real and illusory faces suggests the brain processes them in a similar way, using similar neural networks.
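For readers who like to see the idea in miniature, here is a small Python sketch, purely illustrative and not the researchers' actual analysis or data, that simulates emotion ratings pulled toward the previous face's expression and then recovers that pull from the rating errors. The bias model, the bias strength, and the variable names are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration of serial dependence: each face has a "true" emotion
# value (negative = angry, positive = happy), and each rating is pulled slightly
# toward the expression of the previous face.
n_trials = 1000
true_emotion = rng.uniform(-1, 1, n_trials)   # objective expression of each face
bias_strength = 0.3                           # assumed pull toward the previous face
noise = rng.normal(0, 0.1, n_trials)          # rating noise

ratings = np.empty(n_trials)
ratings[0] = true_emotion[0] + noise[0]
for t in range(1, n_trials):
    # current rating = true expression + pull toward previous face + noise
    ratings[t] = true_emotion[t] + bias_strength * (true_emotion[t - 1] - true_emotion[t]) + noise[t]

# Serial dependence shows up as a positive relationship between the rating error
# (rating minus true expression) and the previous face's expression.
errors = ratings[1:] - true_emotion[1:]
slope = np.polyfit(true_emotion[:-1], errors, 1)[0]
print(f"estimated pull toward previous face: {slope:.2f}")  # should land near bias_strength
```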
Facial Perception
Facial perception involves more than just the features common to all human faces, like the placement of the mouth, nose, and eyes. Our brains may be evolutionarily attuned to those universal patterns, but reading social information requires being able to determine whether someone is happy, angry, or sad, or whether they are paying attention to us.
“What we found was that actually these pareidolia images are processed by the same mechanism that would normally process emotion in a real face,” Alais said. “You are somehow unable to totally turn off that face response and emotion response and see it as an object. It remains simultaneously an object and a face.”
Long-Term Implications
Alais has been interested in this and related topics for years. For instance, in a 2016 paper published in Scientific Reports, Alais and his colleagues built on prior research involving rapid sequences of faces, which demonstrated that perception of face identity, as well as attractiveness, is biased toward recently seen faces.
They designed a binary task that mimicked the selection interface of online dating websites and apps (such as Tinder), in which users swipe left or right depending on whether they deem the profile pictures of potential partners attractive or unattractive. Alais et al. found that the perception of many stimulus attributes, including orientation, facial expression, attractiveness, and perceived slimness of the online dating profiles, is systematically biased toward recent past experience.
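To give a rough sense of how such a bias can be quantified in a binary swipe task, the following Python sketch, again purely illustrative rather than the authors' method or data, simulates decisions nudged toward the previous choice and compares how often an "attractive" response follows an "attractive" versus an "unattractive" one. The baseline rate and carryover value are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration of serial dependence in a binary "attractive / unattractive"
# task: each decision is nudged toward the decision made on the previous profile.
n_profiles = 10_000
base_rate = 0.5    # assumed baseline probability of judging a profile attractive
carryover = 0.15   # assumed boost (or drop) based on the previous decision

choices = np.empty(n_profiles, dtype=int)
choices[0] = rng.random() < base_rate
for t in range(1, n_profiles):
    p = base_rate + carryover * (1 if choices[t - 1] == 1 else -1)
    choices[t] = rng.random() < p

# Serial dependence appears as a gap between these two conditional probabilities.
prev, curr = choices[:-1], choices[1:]
p_after_yes = curr[prev == 1].mean()
p_after_no = curr[prev == 0].mean()
print(f"P(attractive | previous attractive)   = {p_after_yes:.2f}")
print(f"P(attractive | previous unattractive) = {p_after_no:.2f}")
```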
Both the earlier study and the most recent one may help inform research in artificial intelligence as well as in disorders of facial processing such as prosopagnosia.