Detecting Collaborative Dynamics Using Mobile Eye-Trackers

ICLS (2016)

Abstract: Prior work has successfully described how low- and high-performing dyads of students differ in terms of their visual synchronization (e.g., Barron, 2000; Jermann, Mullins, Nuessli & Dillenbourg, 2011). But there is far less work analyzing the diversity of ways that successful groups of students use to achieve visual coordination. The goal of this paper is to illustrate how well-coordinated groups establish and sustain joint visual attention by unpacking their different strategies and behaviors. Our data were collected in a dual eye-tracking setup where dyads of students (N=54) had to interact with a Tangible User Interface (TUI). We selected two groups of students displaying high levels of joint visual attention and compared them using cross-recurrence graphs displaying moments of joint attention from the eye-tracking data, speech data, and by qualitatively analyzing videos generated for that purpose. We found that greater insights can be found by augmenting cross-recurrence graphs with spatial and verbal data, and that high levels of joint visual attention can hide a free-rider effect (Salomon & Globerson, 1989). We conclude by discussing implications for automatically analyzing students' interactions using dual eye-trackers.
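The cross-recurrence analysis the abstract refers to can be sketched in a few lines: given two synchronized gaze streams, a binary cross-recurrence matrix marks every pair of time steps at which the two participants' fixations fell within some distance of each other, and its main diagonal captures simultaneous joint attention. The sketch below is illustrative only; the function names, the list-of-lists representation, and the pixel radius are assumptions, not the authors' implementation.

```python
import math

def cross_recurrence(gaze_a, gaze_b, radius=100.0):
    """Binary cross-recurrence matrix for two gaze streams.

    gaze_a, gaze_b: lists of (x, y) fixation coordinates, one per time step.
    radius: distance threshold (here in pixels, an assumed unit) under
            which two fixations count as landing on the same target.
    Returns R with R[i][j] == 1 iff A's fixation at time i and B's
    fixation at time j are within `radius` of each other.
    """
    return [
        [1 if math.dist(a, b) <= radius else 0 for b in gaze_b]
        for a in gaze_a
    ]

def joint_attention_ratio(recurrence):
    """Fraction of recurrent points on the main diagonal, i.e., the
    proportion of time steps where both participants looked at
    (approximately) the same spot at the same moment."""
    n = min(len(recurrence), len(recurrence[0]))
    return sum(recurrence[i][i] for i in range(n)) / n
```

A dyad with high joint visual attention would show a dense main diagonal in this matrix; the paper's point is that such a summary statistic alone can mask qualitatively different collaboration styles, including free-riding.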
Keywords: collaborative dynamics, mobile, eye-trackers