Developing a gesture set for a particular scenario requires a deep understanding of how people perform a particular job and what their needs and requirements are. We aim to support the consenting officers who inspect a building site to ensure that what is constructed matches the requirements specified by the council when the building was consented. This will require visualising and manipulating a 3D building model that matches what is being constructed on site, and checking that the required conditions are being adhered to. The project has a strong Human-Computer Interaction component: creating a usable gesture set based on the requirements of specialist consenting officers, and then testing the usability of the developed gesture set. A previous Part IV project developed a gesture set for navigating a 3D building environment, which may serve as a starting point for this project.
A gesture set that can be recognised in an AR/VR environment (e.g., through a LeapMotion controller affixed to an Oculus headset), together with a Unity-based software system that maps the recognised gestures to consenting functionality.
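One common way to structure the mapping from recognised gestures to application functionality is a dispatch table keyed by gesture label. The sketch below illustrates the idea in Python; the gesture names and consenting actions are hypothetical examples, not part of the project specification, and a real implementation would likely live in Unity C# on top of the LeapMotion SDK.

```python
# Hypothetical sketch of a gesture-to-action dispatch table.
# Gesture labels and actions are illustrative, not from the project spec.
from typing import Callable, Dict


def rotate_model(degrees: float) -> str:
    """Placeholder for rotating the 3D building model."""
    return f"rotated model by {degrees} degrees"


def toggle_condition_overlay() -> str:
    """Placeholder for showing/hiding consent-condition annotations."""
    return "condition overlay toggled"


# Dispatch table: recognised gesture label -> handler to invoke.
GESTURE_ACTIONS: Dict[str, Callable[[], str]] = {
    "swipe_right": lambda: rotate_model(15.0),
    "pinch": toggle_condition_overlay,
}


def handle_gesture(label: str) -> str:
    """Look up and run the action for a recognised gesture."""
    action = GESTURE_ACTIONS.get(label)
    if action is None:
        return f"unrecognised gesture: {label}"
    return action()
```

Keeping the mapping in a table like this makes it easy to swap gesture assignments during usability testing without touching the recognition or application code.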
SOFTENG 350 must have been taken by at least one student in the group.
Lab allocations have not been finalised.