How do congenitally deaf and mute people think?


If a person is born deaf and mute, how do they think? In "what language" do these people think? Do they develop their own inner language?

Unfortunately I have not found an answer, and I actually doubt it has ever been elucidated.

Short answer
The inner voice of congenitally (pre-lingually) deaf people who have not received treatment such as cochlear implantation is not sound-based. Instead, it is mainly built from visual imagery, such as sign language or printed words.

According to an anecdotal report in the Independent from a congenitally deaf person who deliberately refused cochlear implantation and other treatments, the inner voice is a visual entity: it takes the form of sign language, visual images, or sometimes printed words.

An interesting report from UCL investigated congenitally deaf people suffering from positive psychotic symptoms. In normal-hearing people, positive psychotic symptoms typically involve hearing voices (auditory hallucinations). In congenitally deaf folks, who obviously never had the chance to hear any voices in their lives, these auditory hallucinations were described as

[Not consisting of] sounds, but [… ] the gender and identity of the voice [were recognized. They were] image[s] of [… ] voice[s] signing [,] or lips moving in their mind.

- The Independent, December 21, 2013, UK


How and what does the deaf brain see? It is a question that has been asked for the last half century but which remains not fully answered. However, the studies examined here help reach important conclusions. Behavioral studies point to a visual advantage in the deaf that is acquired around 13 years of age and which is particularly evident for stimuli presented in the visual periphery (Megreya & Bindemann, 2017; Smittenaar et al., 2016; Codina et al., 2011; Stevens & Neville, 2006; Brozinsky & Bavelier, 2004; Bosworth & Dobkins, 1999, 2002; Bavelier et al., 2001). Although this advantage does not universally enhance perception of all visual stimulus features, the deaf are at least as good as normal-hearing individuals across all paradigms tested to date (Table 2). Overall, no visual decrements have been identified in the deaf, whereas many investigations have identified specific visual advantages. The deaf do not possess visual superpowers, like seeing through walls, but particular visual functions, such as object and facial discrimination, and peripheral visual functions, such as motion detection, visual localization, visuomotor synchronization, and Vernier acuity, are specifically enhanced compared with hearing subjects.

There is also strong evidence that recruitment of the sensory-deprived auditory cortex underlies enhancement of the remaining sensory modalities (Benetti et al., 2017; Bola et al., 2017; Twomey et al., 2017; Land et al., 2016; Almeida et al., 2015; Scott et al., 2014; Vachon et al., 2013; Karns et al., 2012; Meredith et al., 2011, 2012; Meredith & Lomber, 2011; Lomber et al., 2010; Hunt et al., 2006; Fine et al., 2005; Sadato et al., 2004; Finney et al., 2001; Rebillard et al., 1977, 1980). In contrast to the largely separable auditory and visual brain regions that arise through normal development, studies show activation of the deaf auditory cortex during visual stimulation in humans (Bola et al., 2017; Corina, Blau, LaMarr, Lawyer, & Coffey-Corina, 2017; Campbell & Sharma, 2016; Almeida et al., 2015; Bottari et al., 2014; Scott et al., 2014; Vachon et al., 2013; Karns et al., 2012; Fine et al., 2005; Sadato et al., 2004; Finney et al., 2001, 2003; Neville et al., 1983) and in animal models of hearing loss (Land et al., 2016; Meredith et al., 2011, 2012; Meredith & Lomber, 2011; Hunt et al., 2006; Kral et al., 2003; Rebillard et al., 1977, 1980). These findings should not be too surprising, as a multitude of studies have revealed that primary sensory cortices actually encode, or are influenced by, inputs from different sensory modalities (e.g., Karns et al., 2012; Kayser & Logothetis, 2007; Ghazanfar & Schroeder, 2006). Crossmodal plasticity exhibits region-dependent differences based on differences in underlying connectivity. Among the potential connectional sources for primary sensory cortex are other primary sensory areas. Although primary-to-primary connectivity has received a great deal of attention recently, the literature clearly shows that primary-to-primary cortical connectivity occurs in rodents, whereas there is little consistent evidence for it in nonrodents such as carnivores and nonhuman primates (Meredith & Lomber, 2017).

In some cases, visual enhancements in the deaf can be attributed to specific fields of the presumptive auditory cortex. The cortical loci identified to mediate these functions reside in deaf auditory cortex: BA 41, BA 42, and BA 22, in addition to areas R, PT, Te3, and TVA in humans; A1, AAF, DZ, FAES, and PAF in cats; and A1 and AAF in both ferrets and mice. For example, using a combination of a behavioral approach and reversible deactivation, the auditory cortical area PAF has been shown to mediate enhanced peripheral visual localization, whereas area DZ mediates enhanced movement detection, in congenitally deaf cats (Lomber et al., 2010). In each case, the reorganized region adopts a function similar to the one it performs in the normally developed brain, albeit contributing to a novel sensory modality. A similar conservation of function is beginning to emerge in studies of human reorganization following hearing loss.

Although a major restructuring of the neural projections between sensory brain regions was long thought to underlie functional reorganization following sensory loss, recent results from studies using retrograde tracers to quantify crossmodal projections in normal-hearing and deaf animals show limited or no changes in the patterns of projections (Butler, de la Rua, Ward-Able, & Lomber, 2018; Butler, Chabot, Kral, & Lomber, 2017; Butler et al., 2016; Meredith et al., 2016; Chabot et al., 2015; Wong et al., 2015; Kok et al., 2014). Thus, future studies should consider alternative hypotheses, including the idea, evidenced by Clemo et al. (2016, 2017), that synaptic-level changes on projections already present during development might give rise to functional differences in auditory cortex.

Overall, the findings from these studies show that crossmodal reorganization in auditory cortex of the deaf is responsible for the superior visual abilities of the deaf. Thus, although deafness may cause a general sensory impairment very early in life, behavioral advantages in the remaining sensory modalities emerge in adolescence and persist through adulthood. That visual perception is particularly enhanced for peripheral stimulation and stimuli in motion suggests compensatory function is related to optimizing survival in the absence of sound, a critical source of information about the world around us.

Research Prior to 1930

Pintner and Patterson (1915, 1916, 1917) were the first to administer intelligence tests to deaf children. They found that on the verbal IQ measures they were using, the deaf as a group were scoring in the mentally retarded range (Pintner & Patterson, 1915). Realizing that what they were measuring was not intelligence but the language deprivation concomitant with deafness (Pintner & Patterson, 1921), they developed the Pintner Non-language Test (Pintner & Patterson, 1924) in order to measure intelligence independent of the language variable. Although this instrument yielded findings indicating deaf youths to be nearer in intelligence to the normal population than had the verbal tests, Pintner and Patterson's results (1924) still yielded means on samples of deaf children that were significantly below those obtained on normal-hearing children.

During this same period, Reamer (1921) tested 2,500 deaf children using a battery of six nonlanguage tests, including the Pintner Drawing Completion Test and an imitation test based on the Knox Cubes. Results indicated a mental age retardation of about two years for the deaf sample. Later, Day, Fusfeld, and Pintner (1928) in a survey of 4,432 pupils ranging in age from 12 to 21 plus came to the same conclusion.

The first investigation to contradict the finding of below-average intelligence among the deaf was that of Drever and Collins (1928). They published results of their performance test administered to 200 deaf and 200 hearing children, from which they concluded that when language was not a factor, deaf and hearing children were approximately equal in mental ability.

These pre-1930 studies were pioneering efforts in a new field. From them came the recognition that verbal tests are inappropriate for measuring the intelligence of deaf children. These investigations also gave early indications of what was later confirmed to be the error of attempting group intelligence testing with deaf subjects.

In view of later findings using improved psychological measures and techniques, the Day, Fusfeld, Pintner, and Reamer conclusions of mental-age retardation ranging from 2 to 5 years among the deaf are no longer tenable. A contributing factor, aside from errors of test selection and administration, that would account for some of the retardation reported by these early studies could have been the practice (common in the early 1900s) of placing nondeaf mentally retarded children in schools for the deaf.

The Hearing World Must Stop Forcing Deaf Culture to Assimilate

The big summer action movie “Baby Driver” made waves in the Deaf community — CJ Jones, a Deaf actor, plays the deaf foster father of the film’s protagonist, Baby. It’s exciting for two reasons: deaf characters rarely appear in big mainstream films, and it’s even rarer that deaf people play themselves.

But the fight for authentic representation is far from over.

Many in the Deaf community now have their eyes on the new Todd Haynes film, “Wonderstruck,” which makes its mainstream theater debut today. It spotlights the Deaf community, yet stars Julianne Moore as one of the deaf protagonists. Moore is only the latest in a long line of hearing actors playing deaf roles — most recently Chris Heyerdahl in the new Syfy-turned-Netflix series “Van Helsing.”

Filmmakers have offered many excuses for not casting deaf actors — everything from not knowing where to find one, to asserting a deaf woman would get injured during the shooting of an action sequence.

More disappointing than Hollywood’s justifications, however, is the way the Deaf community’s critiques are portrayed as political correctness run amok.

Consider the way the public treats other Hollywood excuses: When the spotlight falls, for example, on the film industry’s lack of racial and gender diversity, consumers and journalists alike are rightfully outraged. When the same thing happens to deaf artists, the mainstream offers only silence.

The reason could lie in a fundamental misunderstanding of deafness: Hearing people view deafness as a deficiency rather than a separate linguistic context, worldview and culture. Conversely, Deaf people who identify with the Deaf community and use a signed language — here in the United States, American Sign Language (ASL) — consider their Deafhood their primary cultural identity. Those who identify this way use the capital “D” to mark the difference between the physicality of not hearing and the social, cultural and linguistic implications of thinking and communicating in a language other than English.

But what is Deaf culture? Like many others, it is rooted in its language.

The manual modality of signed language gives rise to common mannerisms and codes of behavior in Deaf settings. Because signing incorporates gestures, movement and facial expressions, Deaf people tend to be far blunter with one another than is considered appropriate in hearing company. Stomping on the floor, for example, or throwing something at a Deaf person (known as “beanbagging”) are accepted ways of getting someone’s attention.

What the hearing world calls “hearing loss,” the Deaf community counters with “Deaf gain.” It avoids terminology like “handicapped,” “hearing impaired” or “mute.”

I’m speaking from the North American Deaf cultural experience, which uses ASL. Sign languages are not universal; they sprout organically from communication within a deaf population and develop over time, like spoken languages.


American Sign Language likely evolved from a combination of home signed systems created by deaf people with individual families and friends, Martha’s Vineyard Sign Language and French Sign Language. Due to Martha’s Vineyard’s high rate of genetic deafness (1 in 155 versus 1 in 5,728 on the mainland), its sign language was used by deaf and hearing inhabitants alike from 1714 until the early 20th century. It connected with French Sign Language in early 19th-century Connecticut, with the 1817 establishment of America’s first deaf school — now the American School for the Deaf. Frenchman Laurent Clerc was one of its founders. These two languages entwined with home signs that the diverse student body brought from across the nation to become the ASL we use today.

As is often true with minority cultures, Deaf culture has been carried forward through its connection to a shared history — and a shared oppression.

Alexander Graham Bell offers a prime example. In addition to inventing the telephone, Bell was a prominent eugenicist with mommy issues. (Bell’s mother was deaf, as was his wife.) He sought to purify the human race of the congenitally defective, as laid out in his famous lecture, “Memoir upon the Formation of a Deaf Variety of the Human Race.”

Bell understood that deafness was a social identity as well as a physical attribute. So he campaigned for the closure of Deaf social clubs and schools, and sought to prevent deaf people from marrying each other.

Bell’s call for the eradication of deafness was in part implemented by a shift in deaf education practices. After oral-education resolutions were passed at the Milan Convention in 1880, deaf students were increasingly forbidden to sign; some reported having their hands tied to their desks to “encourage” speech. This had devastating linguistic effects on deaf children for generations — and is still propagated today through the Alexander Graham Bell Association.

The association routinely asserts that learning ASL can harm English acquisition. It says speaking and listening are the only way for a deaf child to be a productive member of society. Today’s science negates the idea that bilingualism has a deleterious effect on a child’s development, but with respect to sign language, the stigma remains.

Yet AG Bell, or organizations like it, don’t need to do much to sway public opinion. More than 90 percent of deaf children are born to hearing parents and, for many, their child is the first deaf person they’ve met. The relationship often begins with a doctor saying, “I’m sorry” when presenting the news, using terms like “treatment” and “cure.” Since parents are naturally inclined to want their children to be like them, it’s an easy sell to say that speaking and listening will make a child happy, healthy and a successful part of society.


Bell’s perspective on deafness also continues to affect U.S. educational legislation. The 1990 Individuals with Disabilities Education Act is designed to funnel students into “the least restrictive environment” wherever possible. For hearing people, this reads as mainstreaming deaf children into regular schools. In reality, it confuses the concepts of language and speech.

Despite technological advances, learning to speak and listen while deaf is a complex process that can take time. Meanwhile, the human brain’s critical period for language acquisition is birth to 5 years. Without sufficient exposure, a person may never be fluent in any language. This is to say nothing of the social and emotional impact of constantly being the only deaf person in one’s class or school. What if being in a Deaf school, with no communication barriers between one’s teachers and peers, is the least restrictive environment? What if bilingualism is the smoothest path to success?

Though the hearing community may view deafness as a hardship, having a common language and collective experience can foster a spirit of inclusivity. Race, class and gender-based discrimination are further amplified by disability. But because deafness can impact people of all races, religions and classes, American Sign Language often serves as a connection between people from otherwise disparate backgrounds.

Such diversity is a fertile breeding ground for rich artistic expression. ASL poets use visual rhythm and rhyme. ASL slam poetry, vlogs, and music videos are burgeoning genres, and the institution of Deaf theater is thriving, most recently in Deaf West’s Tony-nominated Broadway revival of “Spring Awakening,” and a coming 2018 revival of “Children of a Lesser God.”

But the Deaf community is increasingly endangered by education policy crafted without its input, and a scientific community racing toward a cure for deafness without considering the ethical ramifications. There seems little concern about, for example, what inherent value a language or culture can have, or what it might mean to knowingly pursue its extinction. In short, who gets to define normal?

To keep society’s definition of normalcy from becoming too narrow, the hearing mainstream must accept a cultural view of deafness, even when it is inconvenient. Because only when the hearing world respects deaf people as intellectual equals, when it parses out the difference between accessibility and forced assimilation — and yes, when it starts casting deaf actors in deaf roles — will Deaf culture be allowed to reach its full potential.

Sara Nović is the author of the novel “Girl at War” and an assistant professor of creative writing at Stockton University. Find more about her writing on her website.


Bell's Legacy

Bell applied his study of eugenics to his goal of preventing the creation of a deaf race and presented his paper, “Memoir Upon the Formation of a Deaf Variety of the Human Race,” to the National Academy of Sciences in 1883. Bell stated, "Those who believe as I do, that the production of a defective race of human beings would be a great calamity to the world, will examine carefully the causes that will lead to the intermarriage of the deaf with the object of applying a remedy." In this paper, he proposed to reduce the number of the deaf by discouraging deaf-mute to deaf-mute marriages, advocating speech reading and articulation training for an oral-only method of education, and removing deaf teachers and the use of sign language from the classroom.

Suggestions were made to enact legislation preventing the intermarriage of deaf-mute people, or forbidding marriage between families that had more than one deaf-mute member. His strategies for preventing deaf intermarriage also included removing barriers to communication and interaction with the hearing world.

In some respects, Alexander Graham Bell changed the way we look at education for the deaf for the better. Oral methods, the desegregation of education, and facilitating communication between deaf and hearing persons are positive outcomes. Some historians point to this as his legacy just as much as his inventions. However, his reasons behind those suggestions had their origin in a darker agenda, and his view of the deaf ushered in an era of seeing that population as less capable and stigmatized a valid method of communication and education.

Causes of deafblindness

There are many causes of deafblindness, including:

  • Medical complications during pregnancy and birth, including cerebral palsy.
  • A range of syndromes, including Usher syndrome, CHARGE syndrome, congenital rubella syndrome and Down syndrome.
  • Premature birth.
  • Illness and accidents.
  • Sensory loss as a result of ageing.

These causes mean you may have:

  • A mild to profound sight and hearing impairment, with or without other significant disabilities.
  • Changing conditions which cause impairment to sight and hearing.
  • Sight and hearing impairments caused by difficulties with the structure or function of the brain, e.g. Cerebral Visual Impairment (CVI).

Is It Possible To Think Without Language?

Language is so deeply embedded in almost every aspect of the way we interact with the world that it's hard to imagine what it would be like not to have it. What if we didn't have names for things? What if we didn't have experience making statements, asking questions, or talking about things that hadn't actually happened? Would we be able to think? What would our thoughts be like?

The answer to the question of whether thought is possible without language depends on what you mean by thought. Can you experience sensations, impressions, feelings without language? Yes, and very few would argue otherwise. But there is a difference between being able to experience, say, pain or light, and possessing the concepts "pain" and "light." Most would say true thought entails having the concepts.

Many artists and scientists, in describing their own inner processes while they work, say they do not use words to solve problems, but images. The autistic author Temple Grandin, in explaining how she thinks visually rather than linguistically, says that concepts for her are collections of images. Her concept of "dog," for example, "is inextricably linked to every dog I've ever known. It's as if I have a card catalog of dogs I have seen, complete with pictures, which continually grows as I add more examples to my video library." Of course, Grandin has language, and knows how to use it, so it is hard to say how much of her thinking has been influenced by it, but it is not unimaginable—and probably likely—that there are people who lack the ability to use language and think in the way she describes.

There is also evidence that deaf people cut off from language, spoken or signed, think in sophisticated ways before they have been exposed to language. When they later learn language, they can describe the experience of having had thoughts like those of the 15-year-old boy who wrote in 1836, after being educated at a school for the deaf, that he remembered thinking in his pre-language days "that perhaps the moon would strike me, and I thought that perhaps my parents were strong, and would fight the moon, and it would fail, and I mocked the moon." Also, the spontaneous sign languages developed by deaf students without language models, in places like Nicaragua, display the kind of thinking that goes far beyond mere sensory impression or practical problem solving.

However, while it appears that we can indeed think without language, it is also the case that there are certain kinds of thinking that are made possible by language. Language gives us symbols we can use to fix ideas, reflect on them and hold them up for observation. It allows for a level of abstract reasoning we wouldn't have otherwise. The philosopher Peter Carruthers has argued that there is a type of inner, explicitly linguistic thinking that allows us to bring our own thoughts into conscious awareness. We may be able to think without language, but language lets us know that we are thinking.

Study shows auditory cortex of hearing and deaf people are nearly identical

The neural architecture in the auditory cortex – the part of the brain that processes sound – of profoundly deaf and hearing people is virtually identical, a new study has found.

The study raises a host of new questions about the role of experience in processing sensory information, and could point the way toward potential new avenues for intervention in deafness. The study is described in a June 18 paper published in Scientific Reports.

The paper was authored by Ella Striem-Amit, a post-doctoral researcher in Alfonso Caramazza’s Cognitive Neuropsychology Laboratory at Harvard, Mario Belledonne from Harvard, Jorge Almeida from the University of Coimbra, Quanjing Chen, Yuxing Fang, Zaizhu Han and Yanchao Bi from Beijing Normal University.

“One reason this is interesting is because we don’t know what causes the brain to organize the way it does,” said Striem-Amit, the lead author of the study. “How important is each person’s experience for their brain development? In audition, a lot is known about (how it works) in hearing people, and in animals…but we don’t know whether the same organization is retained in congenitally deaf people.”

Those similarities between deaf and hearing brain architecture, Striem-Amit said, suggest that the organization of the auditory cortex doesn’t critically depend on experience, but is likely based on innate factors. So in a person who is born deaf, the brain is still organized in the same manner.

But that’s not to suggest experience plays no role in processing sensory information.



Evidence from other studies has shown that cochlear implants are far more successful when implanted in toddlers and young children, Striem-Amit said, suggesting that without sensory input during key periods of brain plasticity in early life, the brain may not process information appropriately.

To understand the organization of the auditory cortex, Striem-Amit and her collaborators first obtained what are called “tonotopic” maps showing how the auditory cortex responds to various tones.

To do that, they placed volunteers in an MRI scanner, played different tones – some high frequency, some low frequency – and tracked which regions in the auditory cortex were activated. They also asked groups of hearing and deaf subjects to simply relax in the scanner, and tracked their brain activity over several minutes. This allows mapping which areas are functionally connected – essentially those that show similar, correlated patterns of activation – to each other.

They then used the areas showing frequency preference in the tonotopic maps to study the functional connectivity profiles related to tone preference in the hearing and congenitally deaf groups and found them to be virtually identical.
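The analysis described above, correlating spontaneous activity between regions to derive a functional connectivity profile for a seed region, can be illustrated with a minimal, self-contained sketch. The time series are synthetic stand-ins for real fMRI data, and the region names are invented for illustration; this is not the study's actual pipeline.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connectivity_profile(seed_series, other_regions):
    """Correlate a seed region's time course with every other region,
    mirroring a seed-based functional connectivity analysis."""
    return {name: pearson(seed_series, series)
            for name, series in other_regions.items()}

random.seed(0)
t = range(200)
shared = [math.sin(i / 10) for i in t]  # slow fluctuation shared by connected regions

# Hypothetical region names; a real analysis would use anatomically defined ROIs.
regions = {
    "A1_high_freq": [s + random.gauss(0, 0.3) for s in shared],
    "A1_low_freq":  [s + random.gauss(0, 0.3) for s in shared],
    "control":      [random.gauss(0, 1) for _ in t],  # unrelated activity
}

seed = [s + random.gauss(0, 0.3) for s in shared]
profile = connectivity_profile(seed, regions)
```

Comparing such profiles between a hearing group and a deaf group (for instance, by correlating the two profile vectors region by region) is, in spirit, how one would test whether tone-preference connectivity is preserved.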

“There is a balance between change and typical organization in the auditory cortex of the deaf,” said the senior researcher, Prof. Yanchao Bi, “but even when the auditory cortex shows plasticity to processing vision, its typical auditory organization can still be found.”

The study also raises a host of questions that have yet to be answered.

“We know the architecture is in place – but does it serve a function?” Striem-Amit said. “We know, for example, that the auditory cortex of the deaf is also active when they view sign language and other visual information. The question is: What do these regions do in the deaf? Are they actually processing something similar to what they process in hearing people, only through vision?”

In addition to studies of deaf animals, the researchers’ previous studies of people born blind suggest clues to the puzzle.

In the blind, the topographical architecture of the visual cortex (the visual parallel of the tonotopic map, called “retinotopic”) is like that in the sighted. Importantly, beyond topographic organization, regions of the visual cortex that show specialization in processing certain categories of objects in sighted individuals show the same specialization in the congenitally blind when stimulated through other senses. For example, the blind reading Braille, or letters delivered through sound, process that information in the same area used by sighted subjects in processing visual letters.

“The principle that much of the brain’s organization develops largely regardless of experience is established in blindness,” Striem-Amit said. “Perhaps the same principle applies also to deafness”.



Do deaf schizophrenics hear voices?

I was reading a Cyanide and Happiness comic that jokes about this, and it raised the question: what happens with a deaf schizophrenic?

According to my Abnormal Psychology Textbook, "using SPECT (Single photon emission computed tomography) to study cerebral blood flow of men with Schizophrenia, researchers in London made a surprising discovery. They found that the part of the brain most active during hallucinations was Broca's area. This is surprising because Broca's area is known to be involved in speech production, rather than language comprehension. Because auditory hallucinations usually involve understanding the "speech" of others, you might expect more activity in Wernicke's area, which involves language comprehension. These observations support the metacognition theory that people that are experiencing hallucinations are not hearing the voices of others but are listening to their own thoughts or their own voices and cannot recognize the difference."

As far as I know, deaf people can talk to themselves in their own mind in some sort of way, so I would expect that deaf schizophrenics hear auditory hallucinations.

Wow - that's incredible and I didn't know that. Years ago I read The Origin of Consciousness in the Breakdown of the Bicameral Mind and found the theory to be fascinating even though it technically would be non-testable.

What you're saying supports the idea that conscious thought evolved post-speech development.

For those unfamiliar with Bicameralism, the idea is basically this: Humans evolved as social creatures, interacting and evolving the ability to help each other. Passing knowledge on to each other and to subsequent generations was key. So imagine this scenario: you're teaching your child how to make a fire, and you're talking through the steps to him. Next time, when you're alone, you find yourself talking through the steps to yourself because it's easier to remember.

In fact, back then maybe it was the only way to remember? Talking difficult problems out loud to ourselves is still something many people do today to help figure through the issue. Almost as if wiring internally in the brain didn't exist and so words have to go out your mouth and into your ears - the "long way around" so to speak.

Anyway, some day, you just don't speak the words out loud, but you hear them in your head instead. Whoa! What was that? Must be the gods talking to me directly.

In any event, the theory doesn't have a lot of supporting evidence beyond the writing styles of the earliest human writings. Julian Jaynes uses epics like the Iliad and the Odyssey to show that initially all the characters had gods talking directly to them for specific direction, which eventually gave way to people having their own will irrespective of the gods.

It's a fascinating theory that's totally unprovable, but in my heart it seems to explain so much: the origin of religions, how gods spoke to people directly, why talking to yourself helps you work through a particularly thorny problem, how schizophrenics hear voices today, and now your point that those hallucinations arise in the speech-production centers rather than in the language-comprehension areas.

Unlocking the Mysteries of the Deaf Brain

Hauser, a deaf clinical neuropsychologist and associate professor in the American Sign Language and Interpreting Education Department at NTID, is investigating how the brain adapts and takes on different functions based on new parameters. In other words, how does deafness itself change how the brain operates?

“We really understand so little about the human brain,” Hauser says. “Through my research I am seeking to uncover which cognitive processes are hard-wired, which are plastic, and how deafness or sign language may impact them.”

Hauser argues the difference between deaf and hearing brains can have significant clinical impacts that can affect diagnosis and treatment of numerous diseases.

“Suppose a deaf person has a stroke, which impacts his or her communication functions,” Hauser adds. “Because deaf people communicate differently and use different parts of the brain in that process, you can’t assume he or she will have the same symptoms or respond to the same therapies as a person who is hearing.”

Analyzing Visual Attention: Visual attention is the cognitive process by which a person selectively concentrates on an object or scene while ignoring other things and is a key component of learning. To analyze differences in attention between the deaf and hearing, participants were asked to discriminate a briefly presented face in the center of the display and to indicate the location of a peripheral target (a five-pointed star in a circle) via a touch screen.

Analyzing the Cognitive Process

“Peter is regarded nationally as one of the foremost experts in studies comparing deaf and hearing people’s brains and function,” says Daphne Bavelier, a professor of brain and cognitive sciences at the University of Rochester who has collaborated with Hauser for close to a decade. “In particular, he is leading the way in characterizing how growing up deaf or hard of hearing impacts executive functions—a set of skills that is central to academic achievements.”

Much of the previous clinical research involving deaf individuals focused on restoring hearing or adjusting learning style to mirror hearing peers. Instead, Hauser focuses on deaf individuals themselves, how they learn, how they think, and how deaf brains process and use information.

Through partnerships with Gallaudet University’s NSF Science of Learning Center on Visual Language and Visual Learning (VL2) and the University of Rochester’s Brain and Vision Laboratory, he has developed comprehensive testing procedures designed to analyze cognition in hearing and deaf individuals. His research includes studies of visual attention, the act of focusing on an object, and executive function, the set of cognitive processes that controls behavior regulation and metacognition.

Hauser’s team, which includes students and faculty through NTID’s Deaf Studies Laboratory as well as faculty and students at Gallaudet and the University of Rochester, collects data on research participants from all over the world and conducts assessments in multiple written and sign languages. More than 1,000 people have participated in this testing so far.

“We conduct tests when we go to schools and camps for deaf children and academic conferences all over the world—Israel, Turkey, Germany,” Hauser says.

“Seeing” Differently

Results garnered through the research, which has been funded primarily by the National Science Foundation and the National Institutes of Health, show clear differences between deaf and hearing individuals in how information is processed.

In one project, Hauser’s team studied spatial visual attention in elementary school-aged children and adults to compare differences between populations. They found that elementary-aged deaf children perform similarly to their hearing peers. However, as people age, differences in attention grow wider, with deaf adolescents and young adults more attentive to peripheral events. Hauser explains, “This seems to be an important adaptive ability that makes deaf individuals more aware of what is happening around them, to increase their incidental learning, and to protect them from dangers.”

Hauser says it has been generally understood that deaf people learn to pick up visual cues of what is happening peripherally more quickly than hearing individuals, because they have fewer senses to rely on.

“Attention is a key psychological indicator of how information is transmitted from the senses to the brain,” he adds. “By showing how this works differently in deaf people, we can assist in developing techniques that foster visual learning.”

Assessing Sign Language Proficiency: Hauser’s team created one of the first standardized tests to measure proficiency in ASL. The test utilizes standard linguistic techniques used in spoken language assessments to rate fluency among ASL users.

Hauser has further examined differences in visual processing by comparing reading comprehension between hearing and deaf people. His team tested children across five languages on letter recognition, word recognition, and how the reader handles semantics and sentence processing. Participants included deaf children of deaf parents, deaf children of hearing parents, hearing children, and hearing children with dyslexia.

The preliminary findings appear to suggest that early sign language acquisition and deaf parents’ indigenous knowledge of how to raise deaf children prepare students to become successful readers regardless of the language, written orthography type, or region. Deaf children raised by deaf parents are able to achieve the same basic reading skills as hearing individuals early in life, suggesting that deafness per se does not cause reading challenges; rather, the difficulty stems from being raised in impoverished visual language environments that do not foster visual learning.

Hauser’s neuroimaging research also suggests that skilled deaf readers use different parts of their brains for processing reading.

“Traditional methods for teaching reading and assessing comprehension are based on how hearing people learn and do not generally take into account the visual needs of deaf learners,” Hauser says. “Our research shows that deaf students do not necessarily learn to read more slowly than hearing students—just differently.”

Understanding Executive Function

“Attention control, emotional control, impulse control, memory, organizing your thoughts, planning your thoughts— these are all components of executive function that continue to develop in the brain until early adulthood,” Hauser says. “And language appears to be a necessary component of executive function development. But for the majority of deaf people growing up in hearing families, language development is delayed.”

Hauser argues that inefficient executive function development can have a negative impact on learning and academic achievement. His team is conducting a series of experiments, using both deaf and hearing participants, to investigate the impact of language learning on executive development. “The problem we encountered when beginning this research was that there are no standardized tests available to measure individuals’ sign language fluency,” he continues.

Given this, the team developed a highly sensitive test of competency in American Sign Language that can easily be administered in a short period of time. Hauser developed a Web-based administration protocol so the test can be administered remotely, with participant responses sent to his laboratory for analysis.

The test is currently being used in a number of psychological, linguistic, and cognitive neuroscience research studies at universities all over the country.

Deaf Cognition: Hauser is the co-editor of two books on the development of cognitive learning in deaf individuals, both published by Oxford University Press.

“The creation of this test has finally enabled researchers to test research questions related to the effect of sign language skills on learning and cognition,” Hauser adds.

The test has already been adapted to measure German and British sign languages and Hauser hopes to further expand its use in the future.

Promoting the Deaf Learner

On top of his basic research efforts, Hauser has sought to enhance understanding of deaf learners and promote educational and outreach opportunities in the deaf community. This includes efforts to disseminate information on deaf cognition to the broader scientific and education community as well as supporting the next generation of researchers.

Hauser has presented his research at numerous international conferences, served as a presenter/mentor for the Youth Leadership Conference of the National Association of the Deaf, and served as a delegate to the Test Equity Summit, which sought to ensure that educational testing better accounted for deaf learners. He also co-edited, with NTID Professor Marc Marschark, the 2008 book Deaf Cognition: Foundations and Outcomes, and the 2011 book How Deaf Children Learn: What Parents and Teachers Need to Know, both published by Oxford University Press.

Hauser has also worked with numerous students at RIT, NTID, and his partner institutions to promote their research efforts and enhance enthusiasm for the topic as a whole.

Mentoring the Next Generation: Erin Spurgeon (center) worked with Peter Hauser as a research associate before pursuing her Ph.D. in language and communicative disorders at the University of California at San Diego and San Diego State University.

Erin Spurgeon, who enjoyed Hauser’s enthusiasm for his subject matter when he taught a psychology class she was enrolled in while an RIT/NTID master’s student, ended up working as his research associate in the Deaf Studies Laboratory. She worked on several cognition projects and traveled with Hauser to the University of Haifa in Israel in 2009 and to Turkey in 2010 for his international research team meetings. Spurgeon is currently pursuing her Ph.D. in language and communicative disorders in a joint program at the University of California at San Diego and San Diego State University.

“The opportunity to work with Professor Hauser as a research associate was one of the most valuable experiences I had in preparation for this doctoral program,” she says. “Students who are interested in deaf research are fortunate to work with a knowledgeable and respected member of the scientific community.”

With continued research based at RIT/NTID, Hauser believes a legacy is being built here for deaf cognition, education, and outreach in deaf studies and sign language research.

“My hope is to bring more people into research, have junior faculty involved more, mentor them, create a deaf-friendly lab environment where people can come in and learn how to conduct research,” Hauser adds.

The Deaf Studies Laboratory at RIT: Peter Hauser (third from top left) and his team have worked with numerous students, collaborators, and educational and scientific agencies to promote better understanding of deaf cognition and the need to modify testing and assessment to better meet the need of deaf learners.
