Recent research shows that people can read intention in one another’s gaze, reinforcing the idea that nonverbal communication is vital to human interaction. Understanding how people with conditions such as autism perceive these subtle cues could help improve social skills for many.

It’s no secret that our eyes say a lot; “the eyes are the windows to the soul” isn’t just a saying. Scientists have long studied how we interpret eye movements. How do we know when a glance carries meaning rather than being a casual look?
Dr. Jelena Ristic, a psychology professor at McGill University, led the study published in Communications Psychology. She states, “We want to know why our brains handle social information differently.”
The researchers focused on intentional versus instructed eye movements. Participants watched videos of people moving their eyes; sometimes the movements were self-chosen, and other times the person was told where to look.
Ristic explains, “The difference is between intentional and instructed movements.” In one part of the study, about 80 participants watched clips of people in the moment before they moved their eyes, and tried to predict the direction of the coming gaze: left or right.
Interestingly, viewers were quicker to anticipate intentional looks, hinting that the brain processes the two types differently. In follow-up experiments with about 70 additional participants, the researchers examined whether intention also affected how quickly viewers tracked the gaze itself. Surprisingly, it did not: intentional glances were tracked no faster than instructed ones.
This finding suggests our brains may first register a glance’s direction and only later evaluate its intentionality. Delving deeper into the video recordings, the team noticed subtle movements around the eye area just before intentional gazes; these tiny signals may be the telltale cues that give intention away.
“These little motion cues communicate quickly, allowing us to sense intention,” Ristic speculates. Future research will use advanced eye-tracking technology to pinpoint these signals more precisely. The team also plans to explore how specific intentions, such as helping or deceiving, shape viewers’ perceptions.
Importantly, Ristic’s team wants to include participants with autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD), who often struggle to pick up social cues. This research could help identify where their processing differs from that of neurotypical individuals.
“This question is crucial in the autism field,” Ristic comments. Learning how these systems operate differently can lead to better support for those who need it.
The study opens a door to understanding human connection beyond words. Further insight into these interactions could help improve social skills, especially for people who find subtle cues hard to read.