Summary: Lipreading is the process by which an observer comprehends speech by watching the movements of the speaker's lips, without hearing the speaker's voice.

Top Publications

  1. Smith E, Bennetto L. Audiovisual speech integration and lipreading in autism. J Child Psychol Psychiatry. 2007;48:813-21 pubmed
    ..Speech reception thresholds were calculated for auditory only and audiovisual matched speech, and lipreading ability was measured...
  2. Soto-Faraco S, Navarra J, Weikum W, Vouloumanos A, Sebastián-Gallés N, Werker J. Discriminating languages by speech-reading. Percept Psychophys. 2007;69:218-31 pubmed
    ..The results obtained are in accord with recent proposals arguing that the visual speech signal is rich in informational content, above and beyond what traditional accounts based solely on visemic confusion matrices would predict. ..
  3. Baart M, Vroomen J. Do you see what you are hearing? Cross-modal effects of speech sounds on lipreading. Neurosci Lett. 2010;471:100-3 pubmed publisher
    It is well known that visual information derived from mouth movements (i.e., lipreading) can have profound effects on auditory speech identification (e.g., the McGurk effect)...
  4. Vroomen J, Baart M. Phonetic recalibration only occurs in speech mode. Cognition. 2009;110:254-9 pubmed publisher
    ..These results provide new evidence for the distinction between a speech and non-speech processing mode, and they demonstrate that different mechanisms underlie recalibration and selective speech adaptation. ..
  5. Alegria J, Lechat J. Phonological processing in deaf children: when lipreading and cues are incongruent. J Deaf Stud Deaf Educ. 2005;10:122-33 pubmed
    ..The main goal was to establish the way in which lipreading and CS combine to produce unitary percepts, similar to audiovisual integration in speech perception, when ..
  6. Thomas S, Jordan T. Contributions of oral and extraoral facial movement to visual and audiovisual speech perception. J Exp Psychol Hum Percept Perform. 2004;30:873-88 pubmed
    ..Experiments 2 and 3 demonstrated that these results are dependent on intact and upright facial contexts, but only with extraoral movement displays. ..
  7. Sekiyama K. Cultural and linguistic factors in audiovisual speech processing: the McGurk effect in Chinese subjects. Percept Psychophys. 1997;59:73-80 pubmed
    ..The results also showed that the magnitude of the McGurk effect depends on the length of time the Chinese subjects had lived in Japan. Factors that foster and alter the Chinese subjects' reliance on auditory information are discussed. ..
  8. Schwartz J, Berthommier F, Savariaux C. Seeing to hear better: evidence for early audio-visual interactions in speech identification. Cognition. 2004;93:B69-78 pubmed
    ..This early contribution to audio-visual speech identification is discussed in relationships with recent neurophysiological data on audio-visual perception. ..
  9. Mohammed T, Campbell R, MacSweeney M, Barry F, Coleman M. Speechreading and its association with reading among deaf, hearing and dyslexic individuals. Clin Linguist Phon. 2006;20:621-30 pubmed
    ..In particular, in the deaf and dyslexic groups, but not in the hearing controls, speechreading skill correlated with reading ability. ..

More Information

  1. Paulesu E, Perani D, Blasi V, Silani G, Borghese N, De Giovanni U, et al. A functional-anatomical model for lipreading. J Neurophysiol. 2003;90:2005-13 pubmed
    Regional cerebral blood flow (rCBF) PET scans were used to study the physiological bases of lipreading, a natural skill of extracting language from mouth movements, which contributes to speech perception in everyday life...
  2. Calvert G, Bullmore E, Brammer M, Campbell R, Williams S, McGuire P, et al. Activation of auditory cortex during silent lipreading. Science. 1997;276:593-6 pubmed
    Watching a speaker's lips during face-to-face conversation (lipreading) markedly improves speech perception, particularly in noisy conditions...
  3. Vroomen J, Baart M. Recalibration of phonetic categories by lipread speech: measuring aftereffects after a 24-hour delay. Lang Speech. 2009;52:341-50 pubmed
    ..This aftereffect dissipated quickly with prolonged testing and did not reappear after a 24-hour delay. Recalibration of phonetic categories is thus a fragile phenomenon. ..
  4. Auer E. The influence of the lexicon on speech read word recognition: contrasting segmental and lexical distinctiveness. Psychon Bull Rev. 2002;9:341-7 pubmed
    ..The present study provides evidence of a common spoken word recognition system for both auditory and visual speech that retains sensitivity to the phonetic properties of the input. ..
  5. Bernstein L, Auer E, Moore J, Ponton C, Don M, Singh M. Visual speech perception without primary auditory cortex activation. Neuroreport. 2002;13:311-5 pubmed
    ..These results suggest that visual speech perception is not critically dependent on the region of primary auditory cortex. ..
  6. Kauramäki J, Jääskeläinen I, Hari R, Möttönen R, Rauschecker J, Sams M. Lipreading and covert speech production similarly modulate human auditory-cortex responses to pure tones. J Neurosci. 2010;30:1314-21 pubmed publisher
    ..Here, we used whole-scalp 306-channel magnetoencephalography (MEG) to study whether lipreading modulates human auditory processing already at the level of the most elementary sound features, i.e., pure tones...
  7. Wang Y, Behne D, Jiang H. Linguistic experience and audio-visual perception of non-native fricatives. J Acoust Soc Am. 2008;124:1716-26 pubmed publisher
    ..These results point to an integrated network in AV speech processing as a function of linguistic background and provide evidence to extend auditory-based L2 speech learning theories to the visual domain. ..
  8. Grant K, Seitz P. The use of visible speech cues for improving auditory detection of spoken sentences. J Acoust Soc Am. 2000;108:1197-208 pubmed
    ..The amount of visual influence depends in part on the degree of correlation between acoustic envelopes and visible movement of the articulators. ..
  9. Ruytjens L, Albers F, van Dijk P, Wit H, Willemsen A. Activation in primary auditory cortex during silent lipreading is determined by sex. Audiol Neurootol. 2007;12:371-7 pubmed
    Recent studies investigating whether the primary auditory cortex (PAC) is involved in silent lipreading gave inconsistent results...
  10. Campbell R, MacSweeney M, Surguladze S, Calvert G, McGuire P, Suckling J, et al. Cortical substrates for the perception of face actions: an fMRI study of the specificity of activation for seen speech and for meaningless lower-face acts (gurning). Brain Res Cogn Brain Res. 2001;12:233-43 pubmed
    ..However, some temporal regions, such as the posterior part of the right superior temporal sulcus, appear to be common processing sites for processing both seen speech and gurns. ..
  11. Ludman C, Summerfield A, Hall D, Elliott M, Foster J, Hykin J, et al. Lip-reading ability and patterns of cortical activation studied using fMRI. Br J Audiol. 2000;34:225-30 pubmed
    ..These preliminary results justify more extensive investigations of the cortical basis of individual differences in lip-reading. ..
  12. Ruytjens L, Albers F, van Dijk P, Wit H, Willemsen A. Neural responses to silent lipreading in normal hearing male and female subjects. Eur J Neurosci. 2006;24:1835-44 pubmed
    In the past, researchers investigated silent lipreading in normal hearing subjects with functional neuroimaging tools and showed how the brain processes visual stimuli that are normally accompanied by an auditory counterpart...
  13. Murakami T, Restle J, Ziemann U. Observation-execution matching and action inhibition in human primary motor cortex during viewing of speech-related lip movements or listening to speech. Neuropsychologia. 2011;49:2045-54 pubmed publisher
    ..Furthermore, the SICI findings provide evidence that inhibitory mechanisms are recruited to prevent unwanted overt motor activation during action observation. ..
  14. Sadato N, Okada T, Honda M, Matsuki K, Yoshida M, Kashikura K, et al. Cross-modal integration and plastic changes revealed by lip movement, random-dot motion and sign languages in the hearing and deaf. Cereb Cortex. 2005;15:1113-22 pubmed
  15. Pekkola J, Ojanen V, Autti T, Jääskeläinen I, Möttönen R, Tarkiainen A, et al. Primary auditory cortex activation by visual speech: an fMRI study at 3 T. Neuroreport. 2005;16:125-8 pubmed
    ..Further, a significant hemisphere by stimulus interaction occurred, suggesting left Heschl's gyrus specialization for visual speech processing. ..
  16. Capek C, MacSweeney M, Woll B, Waters D, McGuire P, David A, et al. Cortical circuits for silent speechreading in deaf and hearing people. Neuropsychologia. 2008;46:1233-41 pubmed publisher
    ..Together, these findings indicate that activation in the left superior temporal regions for silent speechreading can be modulated by both hearing status and speechreading skill. ..
  17. Hall D, Fussell C, Summerfield A. Reading fluent speech from talking faces: typical brain networks and individual differences. J Cogn Neurosci. 2005;17:939-53 pubmed
  18. Pekkola J, Ojanen V, Autti T, Jääskeläinen I, Möttönen R, Sams M. Attention to visual speech gestures enhances hemodynamic activity in the left planum temporale. Hum Brain Mapp. 2006;27:471-7 pubmed
    ..These findings suggest that attention to visually perceived speech gestures modulates auditory cortex function and that this modulation takes place at a hierarchically relatively early processing level. ..
  19. Tye-Murray N, Sommers M, Spehar B. Audiovisual integration and lipreading abilities of older adults with normal and impaired hearing. Ear Hear. 2007;28:656-68 pubmed
    The purpose of the current study was to examine how age-related hearing impairment affects lipreading and auditory-visual integration...
  20. Jääskeläinen I, Kauramäki J, Tujunen J, Sams M. Formant transition-specific adaptation by lipreading of left auditory cortex N1m. Neuroreport. 2008;19:93-7 pubmed publisher
    ..feature specificity of adaptation of auditory-cortex magnetoencephalographic N1m responses to phonemes during lipreading, we presented eight healthy volunteers with a simplified sine-wave first-formant (F1) transition shared by /ba/, ..
  21. Mohammed T, Campbell R, MacSweeney M, Milne E, Hansen P, Coleman M. Speechreading skill and visual movement sensitivity are related in deaf speechreaders. Perception. 2005;34:205-16 pubmed
    ..A control task requiring the detection of visual form showed no such relationship. Additionally, people born deaf were better speechreaders than hearing people on a new test of silent speechreading. ..
  22. Fairhall S, Macaluso E. Spatial attention can modulate audiovisual integration at multiple cortical and subcortical sites. Eur J Neurosci. 2009;29:1247-57 pubmed publisher
  23. Bergeson T, Pisoni D, Davis R. Development of audiovisual comprehension skills in prelingually deaf children with cochlear implants. Ear Hear. 2005;26:149-64 pubmed
    ..The results suggest that lipreading skills and AV speech perception reflect a common source of variance associated with the development of ..
  24. Feld J, Sommers M. Lipreading, processing speed, and working memory in younger and older adults. J Speech Lang Hear Res. 2009;52:1555-65 pubmed publisher
    ..perceptual closure, and perceptual disembedding skill--as factors contributing to individual differences in lipreading performance and to examine how patterns in predictor variables change across age groups...
  25. Bernstein L, Demorest M, Tucker P. Speech perception without hearing. Percept Psychophys. 2000;62:233-52 pubmed
    ..The results suggest that the necessity to perceive speech without hearing can be associated with enhanced visual phonetic perception in some individuals. ..
  26. Capek C, Waters D, Woll B, MacSweeney M, Brammer M, McGuire P, et al. Hand and mouth: cortical correlates of lexical processing in British Sign Language and speechreading English. J Cogn Neurosci. 2008;20:1220-34 pubmed publisher
  27. Nicholls M, Searle D. Asymmetries for the visual expression and perception of speech. Brain Lang. 2006;97:322-31 pubmed
    ..The asymmetries are most likely driven by left hemisphere specialization for language, which causes a rightward motoric bias. ..
  28. Humes L, Wilson D, Humes A. Examination of differences between successful and unsuccessful elderly hearing aid candidates matched for age, hearing loss and gender. Int J Audiol. 2003;42:432-41 pubmed
    ..Discriminant analysis, however, indicated that those in the group who retained their (linear) hearing aids tended to have better finger dexterity and higher loudness discomfort levels than those who did not. ..
  29. Andersson G. Decreased use of hearing aids following training in hearing tactics. Percept Mot Skills. 1998;87:703-6 pubmed
    ..Analysis showed a significant, albeit weak, decrease in daily hearing aid use for those subjects who had received the treatment. The utility of amount of hearing aid use as an indicator of rehabilitation success is discussed. ..
  30. Lidestam B, Beskow J. Motivation and appraisal in perception of poorly specified speech. Scand J Psychol. 2006;47:93-101 pubmed
    ..Suggestions for further research are presented. ..
  31. Bratakos M, Reed C, Delhorne L, Denesvich G. A single-band envelope cue as a supplement to speechreading of segmentals: a comparison of auditory versus tactual presentation. Ear Hear. 2001;22:225-35 pubmed
    ..The benefits observed for segmentals appear to carry over into benefits for sentence reception under both modalities. ..
  32. Kerzel D, Bekkering H. Motor activation from visible speech: evidence from stimulus response compatibility. J Exp Psychol Hum Percept Perform. 2000;26:634-47 pubmed
    ..Rather, the present study provides evidence for the view that visible speech is processed up to a late, response-related processing stage, as predicted by the motor theory of speech perception. ..
  33. Tyler R, Fryauf-Bertschy H, Kelsay D, Gantz B, Woodworth G, Parkinson A. Speech perception by prelingually deaf children using cochlear implants. Otolaryngol Head Neck Surg. 1997;117:180-7 pubmed
    ..For this feature, no changes were observed. Vision-alone testing indicated that lipreading performance increased over time...
  34. Okada K, Hickok G. Two cortical mechanisms support the integration of visual and auditory speech: a hypothesis and preliminary data. Neurosci Lett. 2009;452:219-23 pubmed publisher
    ..Lip-reading also activated a much wider network in the superior temporal lobe than the sensory-motor task, possibly reflecting a more direct cross-sensory integration network. ..
  35. Valkenier B, Duyne J, Andringa T, Başkent D. Audiovisual perception of congruent and incongruent Dutch front vowels. J Speech Lang Hear Res. 2012;55:1788-801 pubmed publisher
    ..The findings stress the importance of audiovisual congruency in communication devices, such as cochlear implants and videoconferencing tools, where the auditory signal could be degraded. ..
  36. Tye-Murray N, Spehar B, Myerson J, Hale S, Sommers M. Reading your own lips: common-coding theory and visual speech perception. Psychon Bull Rev. 2013;20:115-9 pubmed publisher
    ..These results suggest that visual input activates speech motor activity that links to word representations in the mental lexicon. ..
  37. Campbell R. Speechreading: advances in understanding its cortical bases and implications for deafness and speech rehabilitation. Scand Audiol Suppl. 1998;49:80-6 pubmed
    ..In addition, primary auditory cortex can be activated by silent speechreading in hearing people. The implications of these findings for deafness and for issues of compensation and plasticity are outlined. ..
  38. Andersson U, Lidestam B. Bottom-up driven speechreading in a speechreading expert: the case of AA (JK023). Ear Hear. 2005;26:214-24 pubmed
    ..It is suggested that AA's extreme speechreading skill, which capitalizes on low-order functions in combination with efficient central executive functions, is due to early onset of hearing impairment. ..
  39. Hashimoto M, Kumashiro M. [Intermodal timing cues for audio-visual speech recognition]. J UOEH. 2004;26:215-25 pubmed
    ..Potential applications of this research include noisy workplaces in which a worker must extract relevant speech from all the other competing noises. ..
  40. Tobey E, Rekart D, Buckley K, Geers A. Mode of communication and classroom placement impact on speech intelligibility. Arch Otolaryngol Head Neck Surg. 2004;130:639-43 pubmed
    ..Educational environments that incorporate exposure to normal-hearing peers were also associated with higher speech intelligibility scores at 8 to 9 years of age. ..
  41. Auer E, Bernstein L, Tucker P. Is subjective word familiarity a meter of ambient language? A natural experiment on effects of perceptual experience. Mem Cognit. 2000;28:789-97 pubmed
    ..The results are discussed in relation to a simple sampling model of word experience and the language experience of the participant groups. ..
  42. van Linden S, Vroomen J. Recalibration of phonetic categories by lipread speech versus lexical information. J Exp Psychol Hum Percept Perform. 2007;33:1483-94 pubmed
    ..Despite the difference in nature (bottom-up vs. top-down information), lipread and lexical information thus appear to serve a similar role in phonetic adjustments. ..
  43. Lidestam B, Lyxell B, Andersson G. Speech-reading: cognitive predictors and displayed emotion. Scand Audiol. 1999;28:211-7 pubmed
    ..The results were discussed with respect to a model of face-processing (Bruce & Young 1986) and with respect to clinical implications. ..
  44. Meronen A, Tiippana K, Westerholm J, Ahonen T. Audiovisual speech perception in children with developmental language disorder in degraded listening conditions. J Speech Lang Hear Res. 2013;56:211-21 pubmed publisher
    ..The children with DLD were inaccurate at lipreading. Children with DLD have problems in perceiving spoken consonants presented audiovisually and visually...
  45. Yao B, Belin P, Scheepers C. Silent reading of direct versus indirect speech activates voice-selective areas in the auditory cortex. J Cogn Neurosci. 2011;23:3146-52 pubmed publisher
    ..Our results may be interpreted in line with embodied cognition and form a starting point for more sophisticated interdisciplinary research on the nature of auditory mental simulation during reading. ..
  46. Başkent D, Bazo D. Audiovisual asynchrony detection and speech intelligibility in noise with moderate to severe sensorineural hearing impairment. Ear Hear. 2011;32:582-92 pubmed publisher
    ..two opposing expectations were an increase in sensitivity, as hearing-impaired listeners heavily rely on lipreading in daily life, and a reduction in sensitivity, as hearing-impaired listeners tend to be elderly and advanced age ..
  47. Auer E. Investigating speechreading and deafness. J Am Acad Audiol. 2010;21:163-8 pubmed publisher
    ..The results to date are also consistent with the conclusion that deaf individuals, regardless of speechreading ability, recognize spoken words via a process similar to individuals with hearing. ..
  48. Jones J, Callan D. Brain activity during audiovisual speech perception: an fMRI study of the McGurk effect. Neuroreport. 2003;14:1129-33 pubmed
    ..This activation suggests that auditory information modulates visual processing to affect perception. ..
  49. Horn D, Davis R, Pisoni D, Miyamoto R. Development of visual attention skills in prelingually deaf children who use cochlear implants. Ear Hear. 2005;26:389-408 pubmed
    ..Theoretical accounts of these findings are discussed, including cross-modal reorganization of visual attention and enhanced phonological encoding of visually presented numbers. ..
  50. Leybaert J, Lechat J. Phonological similarity effects in memory for serial order of cued speech. J Speech Lang Hear Res. 2001;44:949-63 pubmed
    ..The recency effect was greater in the hearing group provided with sound, indicating that the traces left by auditory stimuli are perceptually more salient than those left by the visual stimuli encountered in CS. ..
  51. de Gelder B, Vroomen J, Annen L, Masthof E, Hodiamont P. Audio-visual integration in schizophrenia. Schizophr Res. 2003;59:211-8 pubmed
    ..control group on the sound localisation task, but in the audio-visual speech task, there was an impairment in lipreading as well as a smaller impact of lipreading on auditory speech information...
  52. Mitterer H. On the causes of compensation for coarticulation: evidence for phonological mediation. Percept Psychophys. 2006;68:1227-40 pubmed
    ..This result is discussed in the light of other compensation-for-coarticulation findings and general theories of speech perception. ..
  53. Davis L. Facemasks. Br Dent J. 2008;204:112 pubmed publisher