The University of Auckland

Project #112: Augmented Reality Gesture-Based Interaction System


Description:

This project focuses on developing user interaction in Augmented Reality (AR) by integrating static and dynamic gesture recognition. Users initiate the experience by scanning their immediate environment with a mobile device, transforming it into a dynamic, interactive 3D space. Gesture recognition techniques interpret the user's hand movements, allowing virtual objects to be created and manipulated within the AR space and giving users an intuitive, natural way to engage with AR content. The project will also employ machine learning algorithms to interpret gestures more accurately, fostering an immersive experience in which users can place, resize, and interact with 3D virtual objects seamlessly integrated into their real-world surroundings. By creating a user-friendly, accessible, and captivating AR platform, the project aims to redefine how people engage with digital content through AR technology and intuitive gesture-based interactions.
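As an illustration of how a static gesture might be detected (a minimal sketch, not part of the project specification): MediaPipe's hand-tracking model reports 21 normalised landmarks per hand, and a pinch can be flagged by thresholding the distance between the thumb and index fingertips. The sketch below uses MediaPipe's Python Hands API and OpenCV; the camera index and the 0.05 threshold are placeholder assumptions.

    # Minimal pinch-detection sketch: MediaPipe Hands (Python) + OpenCV.
    # Camera index (0) and distance threshold (0.05) are illustrative choices.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            thumb_tip, index_tip = lm[4], lm[8]   # MediaPipe fingertip indices
            dist = ((thumb_tip.x - index_tip.x) ** 2 +
                    (thumb_tip.y - index_tip.y) ** 2) ** 0.5
            if dist < 0.05:                       # fingertips close -> pinch
                print("pinch detected")

    cap.release()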

Type:

Undergraduate

Outcome:

The successful realization of this project will culminate in an immersive Augmented Reality (AR) experience that transforms how users interact with digital content in their real-world surroundings. A gesture recognition system will let users interact intuitively with virtual objects, using natural hand movements to place, manipulate, and explore them within the AR environment. The application will be optimized for cross-platform accessibility, ensuring a consistent and engaging AR experience on smartphones and tablets. A user-friendly interface will let users navigate and customize their AR space through recognized gestures. Ultimately, the project aims to deliver an AR system that pushes the boundaries of immersive technology while prioritizing ethical considerations, fostering responsible practices, transparency, and user consent throughout the user experience.

Related Details:

1. Environment Scanning
Possible development platforms: ARCore (Android), ARKit (iOS)
Description: Use ARKit or ARCore to let the phone's camera scan and understand the real-world environment. ARCore provides several techniques that support environmental scanning, including environmental understanding, image tracking, light estimation, and motion tracking.

2. Gesture Recognition
Possible development platforms: hand-tracking libraries such as MediaPipe
Description: Implement gesture recognition to interpret the user's hand movements. MediaPipe offers cross-platform hand-tracking solutions for both iOS and Android. (An illustrative classifier sketch follows this list.)

3. 3D Object Placement
Possible development platforms: Unity3D or Unreal Engine for AR
Description: Use a game engine such as Unity3D or Unreal Engine to create and render 3D objects in the AR environment. ARCore integrates with both engines, streamlining AR development.

4. Interaction with Virtual Objects
Possible development platforms: Unity3D or Unreal Engine for AR
Description: Implement interaction mechanisms for virtual objects, such as tapping, swiping, or specific hand gestures that trigger actions or manipulate the virtual environment. (An illustrative dynamic-gesture sketch also follows this list.)

5. User Interface (UI) Design
Possible development platforms: Unity UI or native Android Studio development tools
Description: Design an intuitive, user-friendly interface for the AR app.

6. Testing and Optimization
Possible development platforms: real devices for testing
Description: Test the AR app on real devices to ensure good performance and user experience. Optimize for different screen sizes, resolutions, and device capabilities.
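As a sketch of the machine-learning side of gesture recognition mentioned above (illustrative only; the classifier choice, feature encoding, and training data are all assumptions, not project decisions): the 21 MediaPipe hand landmarks can be normalised into a pose feature and fed to a lightweight classifier such as k-nearest neighbours.

    # Illustrative static-gesture classifier over MediaPipe hand landmarks.
    # The training data below is random placeholder; a real system would
    # train on labelled gesture recordings.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def to_feature(landmarks):
        # Flatten 21 (x, y) landmarks into a wrist-relative, scale-normalised
        # vector so the classifier ignores hand position and size.
        pts = np.array([(p.x, p.y) for p in landmarks])
        pts -= pts[0]                            # wrist (landmark 0) as origin
        scale = np.linalg.norm(pts, axis=1).max() or 1.0
        return (pts / scale).flatten()           # 42-dimensional feature

    X = np.random.rand(100, 42)                  # placeholder training features
    y = np.random.choice(["open_palm", "fist", "point"], size=100)
    clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)

    # At runtime, per frame: gesture = clf.predict([to_feature(hand.landmark)])[0]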
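Dynamic gestures such as swipes (item 4) can be approximated by tracking a fingertip across frames. The sketch below flags a horizontal swipe when the index fingertip's net x-displacement over a short window exceeds a threshold; the window length and threshold are assumptions, not project parameters.

    # Illustrative swipe detector over the index fingertip's x-coordinate.
    from collections import deque

    WINDOW = 10        # frames of history to keep (assumed)
    THRESHOLD = 0.25   # net horizontal travel in normalised units (assumed)

    history = deque(maxlen=WINDOW)

    def update(index_tip_x):
        # Call once per frame with the normalised x of MediaPipe landmark 8.
        history.append(index_tip_x)
        if len(history) == WINDOW:
            travel = history[-1] - history[0]
            if abs(travel) > THRESHOLD:
                history.clear()
                return "swipe_right" if travel > 0 else "swipe_left"
        return None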

Prerequisites:

None

Specialisations:

Categories:

Supervisor:

Co-supervisor:

Team:

Lab:

Intelligent Room (405.760C, Lab)