- The ability to visually perceive objects accurately enables effective interactions with the surrounding environment.
- The processing of visual information involves several areas of the brain, but how this happens is not fully understood.
- Recent research from George Washington University has shown that what a person knows about an object can influence which brain pathways are used in visual processing, shaping what they see.
- The study authors suggest their findings could have important implications for medical displays, product design and technologies, including augmented reality.
How people perceive what they see, hear, taste or smell is incredibly varied.
For example, looking at a cloud-filled sky, one person may see complex shapes that look like animals or objects, while another sees only clouds.
Research on why humans perceive visual inputs differently is limited, but scientists are gaining a deeper understanding of visual processing and its relationship to how an individual perceives and acts on visual stimuli.
Recently, researchers from the Attention and Cognition Laboratory at George Washington University discovered clues about how the brain processes an object in the visual system and where in the brain this occurs.
Specifically, they found that the purpose of the object influences where visual processing occurs in the brain, and that knowledge and experience of the object can impact how it is perceived.
The results suggest that what a person knows about an object directly influences their perception.
Their research appears in the journal Psychological Science.
The visual perception of objects can involve several areas of the brain.
Study authors Dick Dubbelde, Ph.D., a recent graduate and assistant professor at George Washington University, and Sarah Shomstein, Ph.D., professor of cognitive neuroscience in GWU’s Department of Psychological and Brain Sciences, told Medical News Today:
“Usually when we talk about vision, especially for more complex processes like object recognition, we’re talking about the occipital lobe, inferior temporal lobes, and parts of the parietal lobe.”
Additionally, previous research from 2016 suggests that the process of visual perception may involve two distinct but interacting brain pathways: the dorsal and ventral pathways.
The ventral pathway is believed to be responsible for identifying an object, while the dorsal pathway helps determine where and how to use the object. Yet it is less clear whether behavioral ramifications influence the pathway used to process specific items.
The study authors hypothesized that a manipulable object, such as a tool, is processed through the dorsal pathway with higher temporal resolution, while visual processing of a non-manipulable item, such as a potted plant, occurs in the ventral pathway with higher spatial resolution.
To test their theory, the researchers conducted five experiments on the spatial and temporal resolution of manipulable and non-manipulable objects in college-aged adults.
Participants viewed images of easily manipulated objects, including a snow shovel, coffee mug, and screwdriver, as well as non-manipulable objects such as a potted plant, water fountain, and fire hydrant.
The scientists used gap detection and object flicker discrimination tasks to determine processing pathways in study participants as they viewed the images.
After compiling the data, the researchers found that when participants recognized an object as a tool, it was perceived more quickly but with less detail. In contrast, when participants identified an item as a non-tool, it was perceived more slowly with more detail.
However, when the scientists made the objects less recognizable by turning them upside down, the differences in speed and detail disappeared.
The results suggest that what an individual knows or understands about an object determines where and how quickly it is visually processed in the brain.
For humans, quickly determining if an object is a tool can be critical for survival.
Dubbelde and Professor Shomstein explained:
“Tools are important to us as organisms. One of the most important things for us humans is how we can manipulate things with our hands, and so based on studies like this, it looks like we process objects that often occur near our hands in a different way than objects that do not often occur near our hands, in order to best facilitate interaction with these objects.”
Additionally, Dubbelde and Professor Shomstein believe their research “has real implications for how we display information in augmented reality displays.”
“There are real applications for augmented reality giving you real-time information as you need it, but as we start to incorporate this kind of technology into our lives, we need to keep in mind that different types of stimuli, like the difference we showed between tools and non-tools, can alter your perception in subtle ways,” the authors said.
“If you’re performing a high-risk task, like driving a car or even something like surgery, then something like the icon you choose to represent the scalpel site or drone position may slow down neural processing enough to cause a traffic accident or worse.”
— Dick Dubbelde and Professor Sarah Shomstein, study co-authors
In 2015, differences in individual perception came to the fore when a Twitter post questioning the color of a dress sparked intense attention and debate among viewers. The tweet showed an image of a blue and black dress with the caption, “my house is divided over this dress.”
What followed was a viral phenomenon.
“The Dress” received over 4.4 million tweets in 2 days, with viewers reporting very different perceptions of its color.
Another 2017 study investigated “the dress” and suggested that the difference in color perception of the dress may be due to viewer assumptions about lighting conditions.
MNT asked Dubbelde and Professor Shomstein whether beliefs about environmental factors can also affect the perception of an object, to which they replied:
“Absolutely. This study touches on a concept in cognitive psychology called ‘affordances’ which are the things you know you can do with an object.”
The study authors further explained:
“When you see a tool like a hammer, you not only see the colors and values that make up that image, you also begin to understand how you can interact with that object. Things like lighting can affect affordances, most obviously by making things harder to recognize, but it also changes how you relate to the hammer.”
In addition to assumptions about environmental factors, Dr. Julian C. Lagoy, a psychiatrist with Mindpath Health, told MNT:
“Our education and upbringing [have] a huge influence on how we perceive the objects around us. For example, an engineer will see the world differently from an artist. Our upbringing, education, and general knowledge [have] a huge influence on the way each human being perceives their environment.”
Emotions may also play a role in object perception. Shomstein and Dubbelde noted:
“There are known connections between these purely ‘visual’ regions and the parts of the brain that we tend to think of as emotional, like the amygdala. Most areas of the visual processing neural network are interconnected, and the amygdala plays a role, although perhaps not a primary one, in object recognition.”