This project involves iteratively refining an intelligent body-tracking agent within a Unity-based installation where users experience immersive sound and visual feedback via motion capture and a VR headset. The immersive application will display an intelligent agent that mimics variations of the visitor's movement. The practical goal is to integrate an intelligent movement avatar into a gallery installation experience. The research goal is to understand how the agent affects movement, sensory perception, embodiment, and subjective experience. Over the year, we would customise the agent based on stakeholder feedback.
This project is connected to three others that focus on: 1) the foundational visual and audio effects within the installation, 2) mixed-reality responsive Arduino controls, and 3) a Māori creative story (pūrākau) scene.
Undergraduate
developing an intelligent agent for an immersive art installation
user study: interviewing participants and measuring aspects of their behaviour in VR
SOFTENG350, SOFTENG702
HCI Lab (303.521, Lab)