Auditory-Visual Speech Processing (AVSP) 2009
University of East Anglia, Norwich, UK
We have investigated whether subjects are aware of what natural tongue movements look like, by showing them animations based on either measurements or rule-based synthesis. The issue is of interest since a recent audiovisual speech perception study showed that the word recognition rate in sentences with degraded audio was significantly better with real tongue movements than with synthesized ones. The subjects in the current study could not, as a group, tell which movements were real: their classification score was at chance level. About half of the subjects discriminated between the two types of animations at a level significantly different from chance, but their classification scores were as often well below chance as above it. The correlation between classification score and word recognition rate for subjects who also participated in the perception study was very weak, suggesting that the higher recognition score for real tongue movements may be due to subconscious, rather than conscious, processes. This finding could potentially be interpreted as an indication that audiovisual speech perception is based on articulatory gestures.
Index Terms: augmented reality, tongue reading, visual speech synthesis, data-driven animation
Bibliographic reference. Engwall, Olov / Wik, Preben (2009): "Can you tell if tongue movements are real or synthesized?", In AVSP-2009, 96-101.