Tracking of facial features to support human-robot interaction

Pateraki M., Baltzakis H., Kondaxakis P., Trahanias P., 2009. Tracking of facial features to support human-robot interaction. In: Proc. IEEE International Conference on Robotics and Automation (ICRA), 12-17 May, Kobe, Japan.

Abstract:

In this paper we present a novel methodology for the detection and tracking of facial features such as the eyes, nose, and mouth in image sequences. The proposed methodology is intended to support natural interaction with autonomously navigating robots that guide visitors in museums and exhibition centers and, more specifically, to provide input for the analysis of the facial expressions that humans use while engaged in various conversational states. For the detection and tracking of the face and facial feature regions, we propose a methodology that combines appearance-based methods for recognition with feature-based methods for tracking. The face tracking stage is based on Least Squares Matching (LSM), a matching technique able to effectively model radiometric and geometric differences between image patches in different images. Compared with previous approaches, LSM can thus cope with variable scene illumination and in-plane head rotation. Another significant characteristic of the proposed approach is that tracking is performed on the image plane only where laser range information suggests it is needed. The resulting computational efficiency meets the real-time demands of human-robot interaction applications and hence facilitates the development of such systems.
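The paper itself gives no implementation details in this abstract, but the classical LSM update it alludes to can be sketched. The Python/NumPy sketch below is illustrative only: the function name lsm_match, the eight-parameter model (an affine geometric transform plus a linear radiometric gain and offset), and the Gauss-Newton solver are assumptions of this sketch following the textbook LSM formulation, not details taken from the paper.

import numpy as np
from scipy.ndimage import map_coordinates


def lsm_match(template, image, p0, iters=20, tol=1e-4):
    # p = [a11, a12, a21, a22, tx, ty, rs, rt]: affine geometric
    # transform plus linear radiometric gain/offset (assumed model,
    # not taken from the paper).
    h, w = template.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs, ys = xs.ravel().astype(float), ys.ravel().astype(float)
    t = template.ravel().astype(float)
    img = image.astype(float)
    gy_img, gx_img = np.gradient(img)      # image gradients, computed once
    p = np.asarray(p0, dtype=float).copy()

    for _ in range(iters):
        a11, a12, a21, a22, tx, ty, rs, rt = p
        u = a11 * xs + a12 * ys + tx       # template grid mapped into image
        v = a21 * xs + a22 * ys + ty
        g = map_coordinates(img, [v, u], order=1)       # bilinear resampling
        gx = map_coordinates(gx_img, [v, u], order=1)
        gy = map_coordinates(gy_img, [v, u], order=1)

        r = rs * g + rt - t                # radiometrically corrected residual
        # Jacobian of the residual w.r.t. [a11, a12, a21, a22, tx, ty, rs, rt]
        A = np.column_stack([
            rs * gx * xs, rs * gx * ys,
            rs * gy * xs, rs * gy * ys,
            rs * gx, rs * gy,
            g, np.ones_like(g),
        ])
        dp, *_ = np.linalg.lstsq(A, -r, rcond=None)     # Gauss-Newton step
        p += dp
        if np.linalg.norm(dp) < tol:       # stop once the update is negligible
            break
    return p

A caller would initialize p0 = [1, 0, 0, 1, x0, y0, 1, 0] from the detected face location; after convergence, the translation components (tx, ty) give the refined patch position, while the gain and offset absorb illumination changes, which is why LSM tolerates variable scene lighting and in-plane rotation.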
