Learning Gaze Behaviors for Balancing Participation in Group Human-Robot Interactions

Sarah Gillet, Maria Teresa Parreira, Marynel Vázquez, Iolanda Leite (2022). Learning Gaze Behaviors for Balancing Participation in Group Human-Robot Interactions. In Proceedings of the 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI '22), New York, NY, USA. PDF
The virtual conference talk: Video


Our paper, Learning Gaze Behaviors for Balancing Participation in Group Human-Robot Interactions, explores how a robot can learn gaze behaviors that aim to balance participation in group interactions. To avoid potentially harmful and expensive online exploration, we learn from a previously collected group human-robot interaction dataset. We compare two learning methods: imitation learning (IL) in the form of behavioral cloning, and offline reinforcement learning (RL).
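To make the behavioral-cloning side of this concrete, here is a minimal sketch of learning a gaze policy from logged state-action pairs. It is not the paper's implementation: the feature layout, network size, the `GazePolicy` name, and the random stand-in data are all illustrative assumptions; an offline RL approach would instead optimize a participation-related reward over the same logged data.

```python
# Minimal behavioral-cloning sketch (hypothetical, not the authors' code).
# Assumes a logged dataset of interaction-state vectors (e.g., per-participant
# speaking activity, time since last addressed) paired with the gaze target
# the robot chose at that moment.
import torch
import torch.nn as nn

N_PARTICIPANTS = 3      # hypothetical group size
STATE_DIM = 12          # hypothetical feature-vector length

class GazePolicy(nn.Module):
    """Maps an interaction-state vector to a distribution over gaze targets."""
    def __init__(self, state_dim: int, n_targets: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, n_targets),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)  # logits over participants

# Stand-in for the pre-collected group HRI dataset (random data here).
states = torch.randn(1024, STATE_DIM)
gaze_targets = torch.randint(0, N_PARTICIPANTS, (1024,))

policy = GazePolicy(STATE_DIM, N_PARTICIPANTS)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Behavioral cloning: supervised learning on logged state-action pairs.
for epoch in range(20):
    logits = policy(states)
    loss = loss_fn(logits, gaze_targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At deployment time, the robot would gaze at the highest-probability target.
with torch.no_grad():
    next_target = policy(states[:1]).argmax(dim=-1).item()
```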

After an extensive offline evaluation of the learned behaviors, we deployed both in a user study. Our results show that the learned behaviors did not compromise the interaction and could improve the overall amount of participation (IL) and the amount of turn-taking (RL) compared to the original dataset.

The paper contains a detailed explanation of all techniques and methods used, including the handling of different random seeds and training epochs.