Evaluating Data-Driven Co-Speech Gestures of Embodied Conversational Agents through Real-Time Interaction

Proceedings of the 22nd ACM International Conference on Intelligent Virtual Agents (IVA 2022), 2022

Abstract
Embodied Conversational Agents (ECAs) that make use of co-speech gestures can enhance human-machine interaction in many ways. In recent years, data-driven gesture generation approaches for ECAs have attracted considerable research attention, and related methods have continuously improved. Real-time interaction is typically used when researchers evaluate ECA systems that generate rule-based gestures. However, when evaluating the performance of ECAs based on data-driven methods, participants are often required only to watch pre-recorded videos, which cannot provide adequate information about what a person perceives during an interaction. To address this limitation, we explored the use of real-time interaction to assess data-driven gesturing ECAs. We provided a testbed framework and investigated whether gestures affect human perception of ECAs in the dimensions of human-likeness, animacy, perceived intelligence, and focused attention. Our user study required participants to interact with two ECAs: one with and one without hand gestures. We collected subjective data from the participants' self-report questionnaires and objective data from a gaze tracker. To our knowledge, this study represents the first attempt to evaluate data-driven gesturing ECAs through real-time interaction and the first experiment using gaze tracking to examine the effect of ECAs' gestures.
Keywords
embodied conversational agent, user study, evaluation instrument, data-driven, gesture generation, gaze tracking