ABSTRACT
Socially assistive robots are designed to help people through interactions that are inherently social, such as tutoring, coaching, and therapy. Because they operate in social environments, these robots must be programmed to recognize, process, and communicate the social cues used by people. For example, non-verbal behaviors like eye gaze and gesture convey significant information in social interactions. However, identifying the correct non-verbal behavior to perform in a given context is a non-trivial problem for social robotics. One approach to designing robot behaviors is data-driven, that is, reliant on actual observations of human behavior rather than pre-coded heuristics. This approach involves collecting data from natural human-human interactions and then training a model on that data. From this model, we can begin to generate non-verbal robot behaviors for known contexts, as well as identify the context given observations of new non-verbal behaviors. In this talk, I outline my current research on designing data-driven generative behavior models for tutoring tasks. I also touch on the challenges of real-world robotics and how those challenges overlap with those faced by mobile augmented reality systems.
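The pipeline the abstract describes (collect annotated observations of human-human interaction, fit a model, then generate behavior for a known context or infer context from observed behavior) can be illustrated with a minimal frequency-based sketch. This is not the talk's actual model; the class name, the context/behavior labels, and the simple count-based estimation of P(behavior | context) are all illustrative assumptions.

```python
import random
from collections import Counter, defaultdict

class BehaviorModel:
    """Toy generative behavior model: P(behavior | context) estimated
    from counts of annotated (context, behavior) observations."""

    def __init__(self):
        self.counts = defaultdict(Counter)   # context -> Counter of behaviors
        self.context_counts = Counter()      # context -> total observations

    def observe(self, context, behavior):
        """Record one annotated observation from human-human interaction data."""
        self.counts[context][behavior] += 1
        self.context_counts[context] += 1

    def generate(self, context):
        """Sample a non-verbal behavior for a known context,
        proportionally to how often humans produced it there."""
        behaviors = self.counts[context]
        return random.choices(list(behaviors.keys()),
                              weights=list(behaviors.values()))[0]

    def infer_context(self, behavior):
        """Identify the most likely context given a newly observed behavior,
        via Bayes' rule with a frequency-based prior over contexts."""
        total = sum(self.context_counts.values())
        best, best_p = None, -1.0
        for ctx, n in self.context_counts.items():
            likelihood = self.counts[ctx][behavior] / n if n else 0.0
            prior = n / total
            p = likelihood * prior
            if p > best_p:
                best, best_p = ctx, p
        return best
```

A real system would replace the count tables with a learned model over multimodal features (gaze targets, gesture types, timing), but the two directions of use are the same: sample from the conditional to drive the robot, and invert it to recognize what is happening.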
Index Terms
- Toward a data-driven generative behavior model for human-robot interaction