Interacting with Robots - Tooling and Framework for Advanced Speech User Interfaces


In this paper we present an extension to our freely available modeling tool for specifying human-machine interfaces in automotive and non-automotive domains. The tool has been extended to control a robot for speech synthesis, speech recognition, and gestures, enabling linguists and human factors researchers to easily specify robot behavior for user studies or experiments on efficient human-robot interaction. The paper presents how to model such a human-robot interface and describes a use case for a dialog scenario. In the modeling tool, a UML-based state graph is added to specify robot interaction and gestures. At runtime this graph is executed in parallel to the state graphs for speech dialog and haptic interaction. We showcase the interaction of a human with a NAO robot connected to our system, based on a weather inquiry scenario: after some small talk, the user can request information from the robot, e.g. the weather forecast. While the robot queries a cloud-based recognition back end, it performs a thinking gesture and then responds with "Tomorrow it will be between 1 and 8 °C and rain in Berlin."
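The parallel execution described in the scenario, a thinking gesture performed while the back end is queried, can be sketched as two concurrent threads. This is a minimal illustrative sketch, not the paper's actual implementation: all function names are hypothetical, and on a real NAO the gesture loop would call the robot's motion API rather than print.

```python
import threading
import time


def query_weather_backend(city):
    # Hypothetical stand-in for the cloud back-end query.
    time.sleep(0.1)  # simulate network latency
    return f"Tomorrow it will be between 1 and 8 °C and rain in {city}."


def perform_thinking_gesture(stop_event):
    # Stand-in for the robot's gesture state graph; loops until the
    # back-end query completes and the event is set.
    while not stop_event.is_set():
        print("robot: *thinking gesture*")
        stop_event.wait(0.05)


def handle_weather_request(city):
    # Run the gesture in parallel with the back-end query, mirroring
    # the parallel state-graph execution described in the abstract.
    stop = threading.Event()
    gesture = threading.Thread(target=perform_thinking_gesture, args=(stop,))
    gesture.start()
    try:
        answer = query_weather_backend(city)
    finally:
        stop.set()   # end the gesture once the answer is available
        gesture.join()
    return answer


print(handle_weather_request("Berlin"))
```

The gesture thread is stopped via an event rather than a fixed duration, so the gesture lasts exactly as long as the back-end query, however long that takes.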

Year: 2017
In session: Poster
Pages: 99–106