Recent research has seriously challenged preconceived notions about our five senses. Blind people can “see” with their ears, and teenagers can improve their visual acuity through video games. An in-depth look into the largely uncharted territory of perception.
Aug 25, 2014
It seems that auditory-visual multisensory events yield stronger memories, even when recalling exclusively visual aspects such as faces.
Neuroscientists at the University Hospital Center and University of Lausanne (CHUV-UNIL), Switzerland, have shown that your memory abilities can be predicted by how easily you put together auditory and visual information. This aptitude can be used not only to improve current teaching methods, but also to enhance training and rehabilitation strategies.
The researchers were able to measure individual differences in multisensory processing in healthy adults by non-invasively recording brain activity with electroencephalography (EEG) while participants performed a task that required them to indicate whether a given stimulus was new or had already been presented (similar to identifying robbers in a police lineup).
The later memory performance of some individuals was improved for objects previously presented in a multisensory context, while for others such context impaired memory performance. This could be predicted solely from how a person’s brain responded to the multisensory information when it was first presented.
Lead researcher Prof. Micah Murray, director of the Laboratory for Investigative Neurophysiology (The LINE) within the Departments of Radiology and Clinical Neurosciences at CHUV-UNIL and director of the EEG Brain Mapping Core of the Centre for Biomedical Imaging (CIBM), said: “We provided the first evidence for a direct link between brain activity in response to multisensory information at one point in time and later visual object discrimination abilities. These findings show the behavioural relevance and the ethological value of multisensory processes. Multisensory information may therefore constitute a particularly effective strategy for learning; something already suggested over 100 years ago by Maria Montessori but hitherto not neuroscientifically demonstrated.”
The study, entitled ‘Multisensory Context Portends Object Memory’, is published in the journal Current Biology and was authored by Antonia Thelen, Pawel Matusz and Micah Murray.
The mechanism of perception long remained a pure mystery. How do we perceive sensory input? How is that “outside” information sent to the brain? And how does the brain process these data? Is sensation the same as perception? For millennia, these complex questions stood within the confines of philosophical debate.
In his treatise On the Soul, Aristotle developed the first genuine theory on sense-perception, a concept which he differentiated from simple sensation. The Greek thinker “studies each of the five external senses – sight, hearing, smell, taste and touch – before introducing a unique and original notion of common sense, distinct from the external senses but inherent to their function,” says Michel Nodé-Langlois, professor of philosophy at the Fermat School in Toulouse, in the review Philopsis. Aristotle believed that this common sense included perception that went beyond external sensory functions. It encompasses the awareness of this perception, and of the different senses.
Nowadays, Aristotle and other philosophers no longer have a monopoly over the notion of perception. Researchers have invaded this area of study and have gained extensive experience in it. Neuroscientific techniques, such as magnetic resonance imaging (MRI), electroencephalography (EEG) and transcranial magnetic stimulation (TMS), have dispelled some of the mystery surrounding the mechanisms of perception and related disorders, including schizophrenia, autism, anorexia nervosa, dyslexia and even dyschromatopsia (a disorder of colour perception).
Some surprising discoveries have been made. “For example, we have noted that the brain processes information much faster than we thought,” says Professor Micah Murray, a neuroscientist specialised in perception at the CHUV and the University of Lausanne (UNIL). “Sensory stimuli can be followed down to the millisecond.” Unlike an MRI, which provides a static image, an EEG shows dynamic brain activity. “As a result, we can trace the information that reaches the brain in real time and determine, for example, what area of the brain perceives which information first.” Transcranial magnetic stimulation is used to “activate” or “deactivate” brain functions and sensory abilities.
These are not the only technologies used to repair or improve our perception capabilities. Today, innovative webcams and video games can also play a part. It is an entirely new field of research waiting to be explored.
In March 2014, scientists from the Hebrew University of Jerusalem announced a surprising innovation: “sensory substitution devices” that allow blind people to perceive space and make out shapes. This is achieved with an algorithm that converts a camera image, scanned from right to left, into music. The new tool gives blind people the possibility to “see”, or at least to better perceive, their environment.
How can they do that? Blind people adapt their perception: without sight, the sense of hearing becomes highly attuned. The device converts the image into sound, with the brightness of the pixels conveyed through volume and their position through the timing and duration of the sound. MRI studies have shown that the areas of the brain typically used to process vision can be activated by sound instead. “Even those who have been blind since birth can perceive their environment,” says Micah Murray. “Based on the criteria set out by the WHO, they’re no longer legally blind!”
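The principle of such a conversion can be pictured with a minimal sketch. The mapping below is a simplification for illustration only, not the actual algorithm of the Jerusalem team: the image is scanned column by column, so that horizontal position becomes timing, vertical position becomes pitch, and pixel brightness becomes volume.

```python
# Minimal sketch of image-to-sound "sensory substitution" (illustrative only):
# scan a grayscale image column by column, turning each bright pixel into a
# tone event whose pitch encodes its row and whose volume encodes its brightness.

def image_to_sound_events(image, f_low=220.0, f_high=880.0, threshold=0.1):
    """Convert a grayscale image (rows of floats in [0, 1]) into a list of
    (time_step, frequency_hz, volume) tuples, one per sufficiently bright pixel."""
    n_rows = len(image)
    events = []
    for col in range(len(image[0])):      # scan across the image: column = time step
        for row in range(n_rows):
            brightness = image[row][col]
            if brightness < threshold:    # skip dark pixels: silence
                continue
            # top rows map to high pitch, bottom rows to low pitch
            frac = 1 - row / max(n_rows - 1, 1)
            freq = f_low + frac * (f_high - f_low)
            events.append((col, freq, brightness))
    return events

# A tiny 3x3 "image" with a bright diagonal running from top-left to bottom-right:
img = [[1.0, 0.0, 0.0],
       [0.0, 1.0, 0.0],
       [0.0, 0.0, 1.0]]
events = image_to_sound_events(img)
# The diagonal produces one tone per time step, descending in pitch.
```

A real device would synthesise these events into audio in real time; hearing a descending melody here is what lets a trained listener “see” the diagonal.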
In an article in the French newspaper Le Figaro, one of the study’s co-authors, Dr. Ella Striem-Amit, said, “dividing the brain into areas defined by the type of sensory information processed – visual cortex, auditory cortex – is imprecise. The organ actually appears to be organised into areas based on the tasks accomplished rather than the senses typically used for that task.”
Micah Murray suggests that these conclusions also apply to healthy people. “All of our senses interact, and the information coming from the different senses is not processed separately,” he adds. “These ‘multisensory’ processes are reshaping our understanding of perception and, as a result, of the disorders that can arise.”
To be able to perceive surrounding shapes, the blind must first learn how the device works. During the experiments led by the Israeli scientists, participants’ perception of the shapes around them gradually improved each time they listened. Understanding this new language takes dozens of hours of practice.
Also in March, British artist Neil Harbisson, who has achromatopsia (he sees only in black and white), announced that he had had a chip implanted in his brain, connected to a camera. This “auditory prosthesis” analyses colour frequencies and converts them into sound vibrations. Harbisson confirms that he can now perceive colours. And, with his “eyeborg”, he even claims to be the world’s first cyborg!
To optimise visual acuity
What if video games, loathed by so many parents, weren’t so bad for their kids? Recent studies show that, under certain conditions, they can even improve gamers’ perception abilities. “In our studies, we noted positive effects on visual acuity and contrast sensitivity in subjects who play video games regularly, i.e. more than five hours per week,” says Daphné Bavelier from the University of Geneva.
But the games available on the market do not all produce these effects. “We observed them in first-person or third-person shooter games in which the player has to shoot at and destroy their enemies while keeping an eye on everything happening around them,” says the expert. “In this type of game, the player has to make quick decisions while assessing future actions. Attentional control is augmented.”
The scientist says that perception is also enhanced by dividing one’s attention across the screen and constantly re-evaluating what is important in order to better anticipate situations. “Playing 20 to 40 minutes per day over several weeks is enough to gain the initial positive effects on perception abilities.”
Video game training can be used to correct vision problems and could even help patients with disorders such as schizophrenia, depression or attention deficit disorder. For now, the most concrete effects are limited to vision. “Some patients have already corrected their amblyopia by playing video games. This disorder, also called ‘lazy eye’, is detected during childhood and results in poor depth perception.” Video games thus refute a theory dominant since the 1970s, which holds that vision develops during a “sensitive” period in childhood, after which development stops. “Actually, we can regain these abilities later by stimulating the visual cortex appropriately,” says Daphné Bavelier.
But video game training extends well beyond helping people with perception disorders. “There is also a whole educational side to our project. With video games, we can each enhance the precision of our visual system and our ability to detect objects within our field of vision. All that information is more rapidly integrated.”
Daphné Bavelier works with video game makers to conduct her research. But creating a video game adapted to patients is a long, expensive process. “Most of our work involves creating easy levels. Some of our patients are 85 years old!”
In the American television series Perception, the main character is a talented neuropsychiatrist who is also a paranoid schizophrenic. However, he has turned his pathology into an asset, being enlisted by the FBI as a consultant who uses his hallucinations to help him solve cases. Although the series in itself offers little scientific value, it does shed some light on this complex illness.
“People think that schizophrenia is a purely cognitive disorder,” says Micah Murray. “Thanks to the support of the Swiss National Fund and National Centre for Competence in Research – Synapsy, we have in fact discovered that the pathology also leads to vision and hearing disorders. But it is difficult to determine the cause and effect relationship,” he says. “Do the hallucinations occur because of the vision problems? Or do the hallucinations lead to the vision problems? In other words, where is the problem located? In the eye or in the brain?”
By also looking into the condition of patients’ vision or hearing, schizophrenia can more easily be diagnosed. The illness affects about one in one hundred people, with symptoms generally appearing in early adulthood. “Until now, the diagnosis of schizophrenia has been mostly based on subjective criteria. Psychiatrists have to rely on what they are told by patients and those close to them,” says Micah Murray. “After all, the disorder can’t be diagnosed with a blood test!”
The specialist believes that MRI and EEG now reveal an additional “risk factor” linking sensory disorders with schizophrenia, providing a firmer basis for diagnosis. But this has not yet taken root in clinical practice. “The use of EEG and MRI to diagnose schizophrenia will spread in hospitals over the next five years,” predicts Micah Murray. “These tools can also improve the treatment of the illness. The success of a given therapy becomes easier to gauge by evaluating its effects on sensory abilities.”
The increased use of EEG testing is also expected to improve the diagnosis and treatment of other pathologies. Micah Murray gives the example of the language disorder dysphasia. “Tests that measure how fast the brain processes information can be used to determine whether language is impaired or, for example, whether a child’s perception can be enhanced through a given therapy,” he says. “For dysphasia, we assess the child’s ability to differentiate between the sounds /ba/ and /ga/ in order to predict whether he or she has the disorder.”
Everything we sense, through the eyes, ears, mouth, skin or nose, is perception. But this perception is not an “objective” reality. It goes beyond the senses and is closely tied to our cognition. The outside information that reaches our senses is “processed” by mechanisms that give it meaning. Technically, the process involves sensory information being translated into neural responses: light is transformed by the photoreceptors in the eye, sound by the hair cells in the ear. And it is the neural responses from our sense organs that form the basis of perception. Disturbances, such as hallucinations, can occur that widen the gap between “reality” and perception.
Achromatopsia
This rare vision disorder affects the ability to distinguish colours: the achromat sees only in shades of grey. Achromatopsia can be either congenital or acquired through brain damage, and is characterised by the absence of the retinal photoreceptors used to see colour. Symptoms include reduced visual acuity, uncontrolled oscillatory movement of the eyeball and aversion to bright light.
Schizophrenia
The main symptoms of schizophrenia, a condition often confused with multiple personality disorder, include social withdrawal and a loss of contact with reality. The acute phase of this psychosis brings auditory hallucinations that can be deeply distressing for the individual. Factors associated with its onset range from social to psychological to genetic; its causes remain for the most part unknown.
Autism
Disorders along the autism spectrum are characterised by abnormal social interaction and communication, and by restricted and repetitive behaviour. Symptoms are generally noticed by parents within the first two years of a child’s life. Autistic disorders may have genetic or environmental origins, but their exact cause remains a matter of controversy.
Anorexia nervosa
This mental disorder, not to be confused with loss of appetite caused by another pathology, manifests itself as an exaggerated preoccupation with physical appearance, leading to a restrictive diet that can prove fatal. Anorexia nervosa mainly affects women in their teens, who may perceive themselves as overweight even when they are not.
Dysphasia
Etymologically, the word dysphasia means “poor language” or “difficult speech”. It denotes a disturbance in the learning and development of spoken language, affecting expression, comprehension, or both. The disorder is characterised by poorly structured oral expression and limited vocabulary, causing difficulty at school and disability in everyday life.