Speech Prosody 2006

Dresden, Germany
May 2-5, 2006

The Prosody of Pet Robot Directed Speech: Evidence from Children

Anton Batliner (1), Sonja Biersack (2), S. Steidl (1)

(1) Lehrstuhl für Mustererkennung (Chair for Pattern Recognition), University of Erlangen-Nuremberg, Erlangen, Germany
(2) Department of Psychology, University of Stirling, UK

In this paper, we present a database of emotional children's speech recorded in a human-robot scenario: the children gave instructions to Sony's pet robot dog AIBO, with AIBO showing both obedient and disobedient behaviour. In such a scenario, a specific type of partner-centered interaction can be observed. We aimed at finding prosodic correlates of children's emotional speech and were interested to see which speech registers children use when talking to AIBO. For interpretation, we left the weighting and categorization of prosodic features to a statistical classifier. The parameters found to be most important were word duration, average energy, variation in pitch and energy, and harmonics-to-noise ratio. The data moreover suggest that the children used a register that mostly resembled child-directed and pet-directed speech, and to some extent computer-directed speech.

Full Paper

Bibliographic reference.  Batliner, Anton / Biersack, Sonja / Steidl, S. (2006): "The prosody of pet robot directed speech: evidence from children", In SP-2006, paper 064.