Proceedings Article

Visual exploration and analysis of human-robot interaction rules

Hui Zhang, Michael J. Boyles

Indiana Univ. (United States)

Proc. SPIE 8654, Visualization and Data Analysis 2013, 86540E (February 4, 2013); doi:10.1117/12.2002536
From Conference Volume 8654

  • Visualization and Data Analysis 2013
  • Pak Chung Wong; David L. Kao; Ming C. Hao; Chaomei Chen; Christopher G. Healey
  • Burlingame, California, USA | February 03, 2013


We present a novel interaction paradigm for the visual exploration, manipulation, and analysis of human-robot interaction (HRI) rules. Our development is implemented as a visual programming interface and exploits key techniques drawn from both information visualization and visual data mining to facilitate interaction design and the knowledge discovery process. HRI is often concerned with manipulating multi-modal signals, events, and commands that form various kinds of interaction rules; depicting, manipulating, and sharing such design-level information is a compelling challenge. Furthermore, the closed loop between HRI programming and knowledge discovery from empirical data is a relatively long cycle, which makes design-level verification nearly impossible to perform in an early phase. In our work, we exploit a drag-and-drop user interface and visual languages to depict the responsive behaviors of social participants as they interact with their partners. For our principal test case of gaze-contingent HRI interfaces, this permits us to program and debug the robots' responsive behaviors through a graphical data-flow chart editor. We exploit additional program manipulation interfaces to further improve the programming experience: by simulating the interaction dynamics between a human and a robot behavior model, we allow researchers to generate, trace, and study perception-action dynamics within a social interaction simulation, verifying and refining their designs. Finally, we extend our visual manipulation environment with a visual data-mining tool that allows the user to investigate interesting phenomena, such as joint attention and sequential behavioral patterns, across multiple multi-modal data streams. We have created instances of HRI interfaces to evaluate and refine our development paradigm. To the best of our knowledge, this paper reports the first program manipulation paradigm that integrates visual programming interfaces, information visualization, and visual data mining methods to facilitate designing, comprehending, and evaluating HRI interfaces.

© (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
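To make the notion of an HRI "rule" concrete: a minimal, hypothetical sketch (not the authors' implementation) of a gaze-contingent rule as a simple event-to-action mapping, in the spirit of the data-flow rules the abstract describes. The event fields, thresholds, and action names here are illustrative assumptions.

```python
# Illustrative only: one gaze-contingent HRI rule expressed as an
# event -> robot-action mapping. Field names and thresholds are assumptions.

from dataclasses import dataclass

@dataclass
class GazeEvent:
    target: str       # what the human is looking at, e.g. "object_a", "robot_face"
    duration_ms: int  # how long the gaze has dwelled on that target

def responsive_behavior(event: GazeEvent) -> str:
    """Map a human gaze event to a robot action (toy rule set)."""
    if event.target == "robot_face":
        # Mutual gaze: respond with eye contact.
        return "make_eye_contact"
    if event.duration_ms >= 500:
        # Sustained gaze at an object: follow it to establish joint attention.
        return f"look_at:{event.target}"
    return "idle"
```

In a graphical data-flow editor of the kind described, each branch of such a rule would correspond to a node wiring a sensed event to a responsive behavior.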

Hui Zhang and Michael J. Boyles, "Visual exploration and analysis of human-robot interaction rules," Proc. SPIE 8654, Visualization and Data Analysis 2013, 86540E (February 4, 2013); doi:10.1117/12.2002536; http://dx.doi.org/10.1117/12.2002536

