Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

October 15, 2012

What you hear could depend on what your hands are doing [Interface Subtleties]

Filed under: Interface Research/Design — Patrick Durusau @ 4:17 am

Probably not ready for the front of the interface queue, but something you should keep in mind.

There are subtleties of information processing that are difficult to dig out, but that you can ignore only at the peril of an interface that doesn’t quite “work,” though no one can say why.

I will have to find the reference, but I remember work from years ago showing that poor word-spacing algorithms made text measurably harder to read, without readers being aware of the difference.

What if you had information you would prefer readers not pursue beyond a certain point? Could altering the typography make the cognitive load so high that they would “remember” reading a section but not recall that they quit before understanding it in detail?

How would you detect such a strategy if you encountered it?

From the post:

New research links motor skills and perception, specifically as it relates to a second finding—a new understanding of what the left and right brain hemispheres “hear.” Georgetown University Medical Center researchers say these findings may eventually point to strategies to help stroke patients recover their language abilities, and to improve speech recognition in children with dyslexia.

The study, presented at Neuroscience 2012, the annual meeting of the Society for Neuroscience, is the first to match human behavior with left brain/right brain auditory processing tasks. Before this research, neuroimaging tests had hinted at differences in such processing.

“Language is processed mainly in the left hemisphere, and some have suggested that this is because the left hemisphere specializes in analyzing very rapidly changing sounds,” says the study’s senior investigator, Peter E. Turkeltaub, M.D., Ph.D., a neurologist in the Center for Brain Plasticity and Recovery. This newly created center is a joint program of Georgetown University and MedStar National Rehabilitation Network.

Turkeltaub and his team hid rapidly and slowly changing sounds in background noise and asked 24 volunteers to simply indicate whether they heard the sounds by pressing a button.

“We asked the subjects to respond to sounds hidden in background noise,” Turkeltaub explained. “Each subject was told to use their right hand to respond during the first 20 sounds, then their left hand for the next 20 sounds, then right, then left, and so on.” He says when a subject was using their right hand, they heard the rapidly changing sounds more often than when they used their left hand, and vice versa for the slowly changing sounds.
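The alternating-hand protocol described above can be sketched in a few lines. This is an illustrative reconstruction, not the researchers’ actual code: the block size of 20 and the right-then-left order come from the quote, while the function name and everything else are assumptions.

```python
def hand_schedule(n_trials, block_size=20):
    """Return the responding hand for each trial: right hand for the
    first block of 20 sounds, left hand for the next 20, and so on."""
    return ["right" if (i // block_size) % 2 == 0 else "left"
            for i in range(n_trials)]

# A session of 80 sounds alternates right/left in blocks of 20:
schedule = hand_schedule(80)
print(schedule[0], schedule[19], schedule[20], schedule[79])
# right right left left
```

The point of the design is that the responding hand, not the stimulus, is what varies across blocks, so any difference in detection rates can be tied to which hemisphere is driving the motor response.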
