The University of Auckland

Project #64: Dynamic Facial Expression Generation for Head Robots with Adaptive Style Control

Description:

Generating dynamic facial expressions for head robots represents a cutting-edge intersection of robotics, artificial intelligence, and human-computer interaction. Traditional methods in facial expression synthesis for robots are often limited by predefined expression ranges or lack the ability to accurately mirror complex human emotions. This project invites students to explore and develop more sophisticated, adaptive techniques for creating facial expressions on head robots, aiming to bridge the gap between robotic capabilities and the nuanced expression of human emotions.

This student project is centered on developing innovative approaches that enable head robots to generate facial expressions conveying a broad spectrum of emotions with adaptive style control. Leveraging the principles of large-scale contrastive language-image pre-training models, students will investigate how to effectively interpret diverse inputs such as textual emotion descriptions, emotional cues from images, or direct human interactions to dynamically generate facial expressions on robots that are realistic, expressive, and emotionally resonant.
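One way to sketch the cue-interpretation step is zero-shot emotion scoring: embed the input and a set of emotion labels in a shared space, then turn cosine similarities into weights. The vectors below are toy stand-ins for what a pre-trained contrastive model such as CLIP would produce; the label names, embedding values, and temperature are all illustrative assumptions, not part of any specific library.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Toy stand-in label embeddings; a real system would embed prompts like
# "a photo of a happy face" with a pre-trained contrastive model.
LABEL_EMBEDDINGS = {
    "happy":     [0.9, 0.1, 0.0],
    "sad":       [-0.8, 0.2, 0.1],
    "surprised": [0.1, 0.9, 0.2],
}

def emotion_scores(input_embedding, temperature=0.1):
    """Softmax over temperature-scaled cosine similarities,
    giving a per-emotion weight that sums to 1."""
    sims = {label: cosine(input_embedding, emb) / temperature
            for label, emb in LABEL_EMBEDDINGS.items()}
    m = max(sims.values())  # subtract max for numerical stability
    exps = {label: math.exp(s - m) for label, s in sims.items()}
    z = sum(exps.values())
    return {label: e / z for label, e in exps.items()}
```

An input embedding aligned with the "happy" vector yields "happy" as the dominant weight; the same weights can then drive the expression-synthesis stage.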

Key Areas of Focus

● Interpreting Emotional and Stylistic Cues: Create algorithms capable of understanding complex emotional states and stylistic preferences from various inputs to guide the expression generation in robots.

● Robotic Expression Synthesis: Design and implement models that enable head robots to produce facial expressions that are not only mechanically feasible but also closely match the emotional and stylistic nuances conveyed by the input.

● Adaptive Mechanisms for Expression Generation: Employ adaptive strategies to adjust the robotic facial expressions in real-time, ensuring a high degree of accuracy and flexibility in reflecting intended emotions and styles.

● Alignment of Emotion and Style in Expressions: Develop methods to ensure that the robot's facial expressions are both emotionally authentic and stylistically appropriate, enhancing the robot's ability to engage with humans.

● Fine-Tuned Control of Robotic Expressions: Enable precise manipulation of the robot's facial features, allowing for detailed customization of expressions to better match human interactions and preferences.
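As a rough illustration of the adaptive-generation and fine-tuned-control ideas above, the sketch below blends per-emotion actuator poses by emotion weights and eases the robot's face toward the result one control tick at a time. The channel names, pose values, and smoothing factor are all assumptions for illustration, not a specific robot's API.

```python
# Hypothetical actuator channels for a head robot, each commanded in [0, 1].
CHANNELS = ["brow_raise", "brow_furrow", "eye_open", "mouth_corner", "jaw_open"]

# Illustrative per-emotion target poses (made-up values, one per channel).
EMOTION_POSES = {
    "neutral":   [0.3, 0.0, 0.6, 0.5, 0.1],
    "happy":     [0.5, 0.0, 0.7, 0.9, 0.3],
    "sad":       [0.2, 0.4, 0.4, 0.1, 0.0],
    "surprised": [0.9, 0.0, 1.0, 0.5, 0.7],
}

def blend_target(weights):
    """Blend emotion poses by normalized weights (e.g. classifier scores),
    yielding one target pose per actuator channel."""
    total = sum(weights.values()) or 1.0
    target = [0.0] * len(CHANNELS)
    for emotion, w in weights.items():
        for i, value in enumerate(EMOTION_POSES[emotion]):
            target[i] += (w / total) * value
    return target

def step_toward(current, target, alpha=0.25):
    """One control tick: exponential smoothing toward the target pose,
    keeping the motion mechanically gentle rather than instantaneous."""
    return [c + alpha * (t - c) for c, t in zip(current, target)]
```

Raising `alpha` makes expressions snap faster; per-channel offsets on the blended target would give the fine-tuned, style-specific control described above.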

Type:

Undergraduate

Prerequisites

None

Lab

Robotics (405.652, Lab)