The University of Auckland

Project #100: Multi-modal social behaviour engine


Description:

This project will develop a multi-modal social behaviour engine that combines facial expression, verbal interaction, gaze using eye and neck movement, lip synchronization, and other modalities. We have a working system that detects humans and talks with them using two different robots: 1) EveR, a humanlike mechanical robot, and 2) Silbot, a humanoid robot with an emoticon-style facial display. This project will improve the current version, which will then be integrated into some of the ongoing projects in the Robotics lab. All necessary devices will be provided.

This project will:

o   use different social interaction methods, such as facial expression, verbal interaction, gaze using eye and neck movement, and lip synchronization,

o   use two different robots: EveR and Silbot,

o   improve the emoticon-style facial expressions, and

o   be evaluated with a focus on its expression skills.
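As a rough illustration of how the modalities listed above could share one behaviour description dispatched to either robot, here is a minimal sketch. It is not the lab's actual codebase: every class and method name (SocialBehaviour, BehaviourEngine, perform, and so on) is hypothetical, and the "commands" are just strings standing in for real actuator, display, and speech calls.

```python
from dataclasses import dataclass

@dataclass
class SocialBehaviour:
    """One multi-modal act, bundling the modalities listed above."""
    expression: str = "neutral"       # facial expression label
    utterance: str = ""               # text to speak (drives lip sync)
    gaze_target: tuple = (0.0, 0.0)   # (pan, tilt) for eyes/neck, radians

class RobotBackend:
    """Base class; each robot maps the same behaviour to its own hardware."""
    name = "generic"

    def perform(self, b: SocialBehaviour) -> list[str]:
        cmds = [f"{self.name}: face={b.expression}"]
        if b.utterance:
            cmds.append(f"{self.name}: say '{b.utterance}' (lip-synced)")
        cmds.append(
            f"{self.name}: gaze pan={b.gaze_target[0]:.2f} tilt={b.gaze_target[1]:.2f}"
        )
        return cmds

class EveRBackend(RobotBackend):
    name = "EveR"    # humanlike mechanical face: labels would map to actuators

class SilbotBackend(RobotBackend):
    name = "Silbot"  # emoticon-style face: labels would map to screen icons

class BehaviourEngine:
    """Dispatches one behaviour description to any registered robot."""
    def __init__(self):
        self.robots = {}

    def register(self, robot: RobotBackend):
        self.robots[robot.name] = robot

    def act(self, robot_name: str, behaviour: SocialBehaviour) -> list[str]:
        return self.robots[robot_name].perform(behaviour)

engine = BehaviourEngine()
engine.register(EveRBackend())
engine.register(SilbotBackend())
greeting = SocialBehaviour(expression="smile", utterance="Hello!",
                           gaze_target=(0.1, -0.05))
commands = engine.act("Silbot", greeting)
```

The design point this sketch is meant to convey: the behaviour description is robot-agnostic, so the same greeting can be sent to EveR's mechanical face or Silbot's emoticon display by swapping the backend.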

Project scope will be decided after a meeting with the supervisors. You will bring to the role a passion for research and engineering, excellent computing skills (including a high level of programming ability), and a strong sense of responsibility.

Prerequisites

None

Lab

Lab allocations have not been finalised