Monday, February 6, 2012

Seeing Sounds

Researchers interpret brain signals to decode words being heard by the subject

article and image from:

http://www.scientificamerican.com/article.cfm?id=word-of-mind-researchers-decode


Researchers in Berkeley and San Francisco measured electrical signals in the language-processing center of the brain, and these measurements allowed them to decipher the words a subject was hearing. The researchers hope that the knowledge gained from this study could help severely disabled individuals regain their ability to communicate.

Fifteen volunteers participated in the study. They listened to words, sounds, and sentences through a loudspeaker or headphones while the researchers recorded the activity occurring in the brain's auditory cortex. These volunteers were already being monitored for seizures due to epilepsy or brain tumors, and the researchers would not have been able to record from their brains if those measurements were not already underway for medical treatment.

In this study, made possible by that medical monitoring, the researchers created an algorithm that mapped the sounds a subject heard to the electrodes' measurements, so that sounds could be matched to signals in the brain. To test the algorithm, they attempted to recreate a word a listener had heard; the algorithm produced a sound that, after some processing, could be interpreted as that word. The study created two versions of the algorithm for different aspects of sound. One uses a linear representation of sound, tracking frequency over time, and conveys speech rhythms through oscillations in the brain; the other is nonlinear and conveys rhythms through overall brain activity. The nonlinear version proved more accurate than the linear model for faster speech rhythms. While this experiment focused purely on sounds a listener actually hears, future studies may discover whether similar regions of the brain are used to decipher the words we speak to ourselves, internally.
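To get a feel for the linear version of the idea, here is a minimal sketch in Python. Everything in it is a stand-in: the "electrode" responses are simulated as a noisy linear mixture of a made-up spectrogram, and a ridge-regularized linear map is fit to decode the spectrogram back from those responses. This is only an illustration of linear stimulus reconstruction in general, not the study's actual model or data.

```python
# Sketch of linear stimulus reconstruction with synthetic data.
# All sizes and signals here are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_times, n_freqs, n_electrodes = 500, 16, 32

# Hypothetical spectrogram of the sound being heard (time x frequency).
spectrogram = rng.random((n_times, n_freqs))

# Simulate electrode recordings as a noisy linear mixture of that sound.
mixing = rng.standard_normal((n_freqs, n_electrodes))
responses = spectrogram @ mixing + 0.01 * rng.standard_normal((n_times, n_electrodes))

# Fit a ridge-regularized linear decoder: brain responses -> spectrogram.
lam = 1e-3
W = np.linalg.solve(
    responses.T @ responses + lam * np.eye(n_electrodes),
    responses.T @ spectrogram,
)

# Reconstruct the sound's spectrogram from the simulated brain activity.
reconstruction = responses @ W

# How well does the reconstruction match the original?
corr = np.corrcoef(spectrogram.ravel(), reconstruction.ravel())[0, 1]
```

Because the simulated responses really are a linear function of the sound, the decoder recovers the spectrogram almost perfectly; with real cortical recordings the mapping is far messier, which is presumably why the researchers also needed a nonlinear version for fast speech rhythms.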

This study has provided further knowledge of the brain's auditory and language functions, which could be used both to help disabled individuals communicate and to improve speech-recognition technology. By looking at a person's brain activity, the researchers could tell what that person was hearing, and from that activity they could recreate the words themselves. This could be used to restore communication to people who are currently unable to send or receive messages to or from their loved ones; returning any ability to communicate to these individuals would greatly improve their lives. The study is also relevant to the general public in that the information gained could improve current speech-recognition technology. Siri, the assistant that accompanies the iPhone 4S, has amazed many of us with its ability to read text messages and answer questions. The new information provided by this study could make this technology even more impressive.

I am constantly amazed by how much we still have to learn about the human brain. With each new study, we discover a little bit more about this amazingly complex organ. The discovery made in this particular study will help us learn about how we are able to communicate with each other. As an English major, I’m interested in communication and language. These researchers were able to see physical evidence of spoken words. They watched as the brain interpreted sounds and transformed them into words. This in itself is amazing, but the study could lead to even more remarkable discoveries. I can’t wait to learn what our brains still have to teach us.
