How looking away prevents pedestrian collisions

One day a friend and I were briskly strolling along a mall corridor, engaged in conversation, when something quite hilarious happened. A burly gentleman was quickly approaching my friend's direct line of trajectory. She and this man had one of two choices: move to the left or to the right to avoid a disastrous collision. Simple, no? So I thought. With about a foot between them, my tiny friend and this large stranger began a seemingly unending, surprisingly well-coordinated dance (or, if you're an avid sports fan, picture Spud Webb desperately trying to drive on Shaq), each mirroring the other's movements, swaying back and forth, side to side. Both were confused as to which direction to settle on, and for an estimated 5 seconds I stood there in utter disbelief, witnessing this extremely awkward, yet ridiculously entertaining, situation.

How do we keep catastrophes like this from happening on a daily basis? And if navigating a mall corridor without incident is THAT difficult, how do the multitudes of pedestrians somewhere like Manhattan manage to avoid such annoying or, in my friend's case, embarrassing encounters? Inattention and "mind-blindness", an inability to develop an awareness of what is in the mind of another human, would seem to be the main culprits. However, there's a bit more to it than that, and it involves eye gaze.

Nummenmaa, Hyönä, and Hietanen over at the University of Tampere in Finland have shed further light on the science of oculomotor activity while walking. In their paper, published in the most recent issue of Psychological Science, they investigated how participants predicted where an oncoming pedestrian was going to move by using information found in that pedestrian's gaze. More specifically, they wanted to test whether humans use others' gaze information to avoid collisions during locomotion, and to assess how one's own gaze direction is influenced by the approaching pedestrian's gaze behavior.

The researchers had 35 university students view a computer-generated male walking toward them while looking either to the participants' left or right. The scene also moved constantly forward, giving the participants the impression that they themselves were moving. Eye movements were recorded with a digital eye tracker, and the participants indicated whether they would skirt the pedestrian by moving to the left or the right using a differential button press.

Is it just me or does this guy look like he's about to do something extremely shady?


They found that the frequency of skirting to the left was higher when the oncoming pedestrian looked to the right and, conversely, that skirting to the right was higher when the pedestrian looked to the left. Additionally, they found that on average the participants looked in the direction opposite the pedestrian's gaze. It makes sense that one would "mind read" the oncoming pedestrian traffic and focus on walking toward the path of least resistance.
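That mirrored pattern reads like a simple symmetry-breaking protocol: one walker's gaze telegraphs a lane, and the other yields the complement. Here's a toy simulation of that idea (entirely my own illustration, not the study's model; the function name, lane labels, and round counts are all made up) contrasting gaze-signaled negotiation with the blind simultaneous guessing that produced my friend's mall dance:

```python
import random

def negotiate(use_gaze_cue, rng, max_rounds=20):
    """Toy model (my sketch, not the paper's): two walkers must pick
    different lanes of a shared corridor to pass cleanly. Returns the
    number of rounds until their choices are compatible."""
    for rounds in range(1, max_rounds + 1):
        if use_gaze_cue:
            # Walker A's gaze telegraphs the lane A intends to take;
            # B reads it and yields the other lane -- symmetry is
            # broken immediately, as in the study's skirting data.
            a = rng.choice(["east", "west"])
            b = "east" if a == "west" else "west"
        else:
            # No cue: both guess simultaneously and independently,
            # colliding whenever they happen to pick the same lane.
            a = rng.choice(["east", "west"])
            b = rng.choice(["east", "west"])
        if a != b:  # different lanes: they pass without incident
            return rounds
    return max_rounds

# With the gaze cue the standoff resolves in a single round; without
# it, the walkers need two rounds on average (and sometimes many more).
```

The point of the sketch is only that a one-bit, asymmetric signal (where the other person is looking) collapses an otherwise 50/50 coordination problem instantly, which is exactly the kind of information the participants appeared to exploit.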

Nummenmaa and her colleagues speculate that parallel mechanisms guide gaze following: one rapid and stimulus-driven, the other slower and governed by social cognition. Interestingly enough, they referenced prior research on autism spectrum disorders to support this dual-system model. Individuals with autism typically have no difficulty discriminating other people's gaze direction, but exhibit impairment in both joint attention and inferring other people's mental states through gaze (due to a lack of a theory of mind, or ToM).

I also found an fMRI study showing that autistic participants lack modulation in the superior temporal sulcus, a brain area important in the processing of intentional gaze shifts (Pelphrey, Morris, & McCarthy, 2005). There's also been research looking at the pedestrian patterns of individuals with autism using virtual reality. Parsons, Mitchell, and Leonard (2004) found that a subset of autistic participants were significantly more likely than a control group to be judged to have bumped into or walked between virtual characters. This strange tendency couldn't be explained by executive dysfunction or general motor difficulty, suggesting that judgments of personal space may be impaired in autism. Recently, efforts have been made to train autistic adolescents using the same virtual technology (Mitchell, Parsons, & Leonard, 2007).

However, for the rest of us who aren't diagnosed with an autism spectrum disorder, there's still a possibility we may not be using our fully intact eye-gazing mechanisms to "pedestrianize" properly. I can't count how many times, here in the Big Apple, I've seen the obnoxiously loud cellphone user or the texting pedestrian needlessly holding up foot traffic. (Seriously?! Find a dark narrow alleyway or something!)

Anyway... there's an iPhone app out called "Type n Walk". The New York Times has good coverage of it, explaining that "it's supposed to make it easier to text while strolling by providing a visual, on the phone, of what is happening on the street a few feet ahead". I don't really buy into it, because I don't think people are that good at multitasking just yet (unless you're a dual-tasking meditation master). This is a prime example of how our limited brain capacity fails to keep up with rapidly developing human-computer interaction technology (props to my older brother, Gene, for making this field more congruent with our real cognitive abilities).


Nummenmaa and company conclude that the results from their study "show that people are also aware of how and why others update their visual representations and use this information flexibly for their own movement planning and visual sampling of the environment". Unfortunately, we're unable to do this if we're too busy reading or texting while walking. Furthermore, other pedestrians won't be able to tell where we're planning on going, either.

So, for the 67% of respondents to my survey who answered that they definitely spend more face-to-face time with a screen than with a human face: get off that BlackBerry/iPhone and pay attention to where you're walking! Otherwise, CRASH.

The famous Shibuya, Japan crosswalk! It's not as tough as it looks.


Nummenmaa, L., Hyönä, J., & Hietanen, J. K. (2009). I'll walk this way: Eyes reveal the direction of locomotion and make passersby look and go the other way. Psychological Science. PMID: 19883491

Pelphrey, K. A., Morris, J. P., & McCarthy, G. (2005). Neural basis of eye gaze processing deficits in autism. Brain, 128(5), 1038-1048. DOI: 10.1093/brain/awh404

Mitchell, P., Parsons, S., & Leonard, A. (2007). Using virtual environments for teaching social understanding to 6 adolescents with autistic spectrum disorders. Journal of Autism and Developmental Disorders, 37(3), 589-600. PMID: 16900403

Parsons, S., Mitchell, P., & Leonard, A. (2004). The use and understanding of virtual environments by adolescents with autistic spectrum disorders. Journal of Autism and Developmental Disorders, 34(4), 449-466. DOI: 10.1023/B:JADD.0000037421.98517.8d


This post was chosen as an Editor's Selection for ResearchBlogging.org


