This project involves iteratively refining a participative Unity-based installation in which users experience immersive sound and visual feedback through a motion capture suit and VR headset. In previous versions, the installation was completely in virtual reality; this year, we would like to use Arduino prototyping for responsive theatre lighting. The real-world lights and the immersive application direct gallery visitors’ attention in various ways (for example: https://www.youtube.com/watch?v=wb_jM2kbqTA). The practical goal is to refine the development of a complete gallery installation experience and performance. The research goal is to understand how mixed reality and Arduino-controlled lights affect the sensory perception, embodiment, and subjective experiences of dancers and audience members. Over the year, we would customise the application based on stakeholder feedback.
This project is connected to three others that focus on: 1) the foundational visual and audio effects within the installation, 2) an embodied AI agent, and 3) a Māori creative story (pūrākau) scene.
Undergraduate
developing a functional interactive Arduino prototype for an installation whose audio and visual effects respond to physical movement
user study interviewing participants and measuring aspects of their behaviour in VR
supporting dancers in creating a performance with the application
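As a rough sketch of what the Arduino side of the responsive-lighting prototype might look like (the serial protocol, value ranges, and pin assignment here are all assumptions, not part of the project specification), Unity could stream a normalised movement-intensity value over serial, which the Arduino maps to a PWM duty cycle for a stage light. The mapping itself is plain C++ and is shown here independently of the Arduino runtime:

```cpp
#include <cstdint>
#include <algorithm>

// Hypothetical mapping from a normalised movement intensity (0.0 to 1.0),
// as it might be streamed from Unity over serial, to an 8-bit PWM duty
// cycle. On the Arduino, the result would be passed to
// analogWrite(LIGHT_PIN, duty) to drive the theatre light.
uint8_t intensityToDuty(float intensity) {
    // Clamp out-of-range values so noisy mocap data cannot overflow the duty cycle.
    float clamped = std::min(1.0f, std::max(0.0f, intensity));
    return static_cast<uint8_t>(clamped * 255.0f + 0.5f);
}
```

In an actual sketch, this function would sit inside `loop()`, reading intensity values with `Serial.parseFloat()` and writing the duty cycle to a PWM-capable pin; keeping the mapping in a pure function like this makes it easy to test off-device before wiring up the lights.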
SOFTENG350, SOFTENG702
HCI Lab (303.521, Lab)