In electronic music, live performance is typically done with a hardware controller whose buttons and dials let the performer trigger tracks and adjust parameters such as volume and pitch on the fly. The aim of this project is to build a virtual reality user interface for live music performance, combined with a motion-tracking camera, removing the need for a dedicated hardware device. A VR interface also has the potential to offer richer visual feedback to the performer.
Undergraduate
Ableton -> VR link
This means streaming data from Ableton Live into VR and rendering it with a focus on high-quality visuals. The research component could lie in DSP, or in how to automatically generate visuals from Ableton / audio data.
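One plausible way to get data out of Ableton Live and into a VR application is over Open Sound Control (OSC), for example via a Max for Live device or a third-party remote script. The sketch below hand-encodes and decodes a minimal OSC message carrying a single float, using only the Python standard library; the address `/live/track/volume` is an illustrative assumption, not a fixed Ableton API.

```python
import struct

def _pad4(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def encode_osc(address: str, value: float) -> bytes:
    # Minimal OSC message: address string, type tag ",f", one float32 argument
    return _pad4(address.encode()) + _pad4(b",f") + struct.pack(">f", value)

def decode_osc(data: bytes):
    # Address: null-terminated string, padded to 4 bytes
    end = data.index(b"\x00")
    address = data[:end].decode()
    offset = (end // 4 + 1) * 4
    # Type tag string (this sketch only handles a single float, ",f")
    tag_end = data.index(b"\x00", offset)
    tags = data[offset:tag_end].decode()
    offset = (tag_end // 4 + 1) * 4
    assert tags == ",f", "sketch handles exactly one float argument"
    (value,) = struct.unpack_from(">f", data, offset)
    return address, value

# Example: a hypothetical track-volume message sent from Ableton to the VR layer
packet = encode_osc("/live/track/volume", 0.8)
addr, vol = decode_osc(packet)
print(addr, round(vol, 2))  # /live/track/volume 0.8
```

In practice these packets would travel over UDP between the Ableton host and the VR renderer; in a real project an OSC library would replace this hand-rolled codec, but the byte layout shown is what crosses the wire.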
None
Lab allocations have not been finalised