Orchestra
VR CONDUCTOR
TIMELINE
TOOLS USED
ROLE
Oct - Dec 2024
Unity / Figma / Blender
VR Developer


"Become a professional conductor
Start from now"
Overview
The primary objective of this project is to develop a Mixed Reality (MR) prototype that gives aspiring conductors an interactive environment in which to practice their skills. By leveraging VR technology, it simulates a realistic orchestral setting and enhances the learning experience through hand gesture recognition, dynamic volume adjustment, and intuitive spatial interactions.
My role
VR Developer
Dynamics control: developed features that translate the players' hand movements into audio control, including volume, playback speed, play, and pause
Hand tracking & gesture detection: built on the Oculus OpenXR toolkit, used hand tracking and detection of specific hand gestures in gameplay development (see the sketch after this list)
UI system: collaborated with UI/UX designers on the UI guidance, from the main menu to interactive elements, and refined the button-press and page-slide interactions
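A minimal sketch of the hand-tracking layer, assuming the Meta/Oculus Integration's OVRHand component and the OVRCameraRig hand anchors (the project cites the Oculus OpenXR toolkit, so the exact APIs may differ); the class name, fields, and exposed properties are illustrative, not the project's actual code.

```csharp
using UnityEngine;

// Sketch: reads the tracked conducting hand from the Oculus Integration's
// OVRHand component and exposes its position/velocity to other systems
// (dynamics control, gesture checks). Names here are assumptions.
public class ConductorHandTracker : MonoBehaviour
{
    [SerializeField] private OVRHand rightHand;          // OVRHand on the right hand anchor
    [SerializeField] private Transform rightHandAnchor;  // hand anchor transform from the OVRCameraRig

    public Vector3 HandPosition { get; private set; }
    public Vector3 HandVelocity { get; private set; }
    public bool IsTracked { get; private set; }

    private Vector3 _lastPosition;

    private void Start()
    {
        if (rightHandAnchor != null) _lastPosition = rightHandAnchor.position;
    }

    private void Update()
    {
        // Only trust the pose when the runtime reports the hand as tracked.
        IsTracked = rightHand != null && rightHand.IsTracked;
        if (!IsTracked) return;

        HandPosition = rightHandAnchor.position;
        HandVelocity = (HandPosition - _lastPosition) / Mathf.Max(Time.deltaTime, 1e-5f);
        _lastPosition = HandPosition;
    }
}
```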
Feature Showcase

Dynamics control: the player's hand movements are translated into changes in the orchestra's dynamics
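As a rough illustration of that mapping, the sketch below ties hand height to orchestra volume and hand speed to playback rate via AudioSource.pitch. The thresholds, the smoothing, and the ConductorHandTracker reference (from the earlier sketch) are assumptions, not the project's actual values.

```csharp
using UnityEngine;

// Sketch: maps hand height to volume and hand speed to playback rate.
// All thresholds and smoothing factors are illustrative.
public class OrchestraDynamicsController : MonoBehaviour
{
    [SerializeField] private ConductorHandTracker hand;     // hypothetical tracker from the previous sketch
    [SerializeField] private AudioSource orchestraSource;   // the orchestra recording being conducted

    [SerializeField] private float minHandHeight = 0.8f;    // metres, roughly waist level
    [SerializeField] private float maxHandHeight = 1.8f;    // metres, roughly above the head
    [SerializeField] private float maxHandSpeed = 2.5f;     // m/s treated as the fastest conducting motion

    private void Update()
    {
        if (!hand.IsTracked) return;

        // Higher hand -> louder orchestra.
        float targetVolume = Mathf.InverseLerp(minHandHeight, maxHandHeight, hand.HandPosition.y);
        orchestraSource.volume = Mathf.Lerp(orchestraSource.volume, targetVolume, 5f * Time.deltaTime);

        // Faster hand -> faster playback (pitch doubles as a simple tempo control here).
        float speed01 = Mathf.Clamp01(hand.HandVelocity.magnitude / maxHandSpeed);
        orchestraSource.pitch = Mathf.Lerp(0.75f, 1.25f, speed01);
    }
}
```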

Hand gesture detection: specific hand gestures such as a thumbs-up or a fist are recognized through the headset and change the orchestra's behavior
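The Oculus runtime exposes pinch detection directly on OVRHand, while richer poses such as a fist or thumbs-up would typically be classified from hand-skeleton bone data. The sketch below uses an index-thumb pinch as a stand-in gesture to toggle play/pause, so the gesture choice and the orchestraSource field are assumptions rather than the project's actual recognition logic.

```csharp
using UnityEngine;

// Sketch: uses OVRHand's built-in pinch detection as a stand-in gesture.
// Fist / thumbs-up recognition in the actual prototype would be classified
// from hand-skeleton data; this example only shows the toggle pattern.
public class GesturePlaybackToggle : MonoBehaviour
{
    [SerializeField] private OVRHand hand;                 // OVRHand on the conducting hand
    [SerializeField] private AudioSource orchestraSource;  // orchestra audio to pause/resume

    private bool _wasPinching;

    private void Update()
    {
        if (hand == null || !hand.IsTracked) return;

        // Index-thumb pinch acts as the "gesture" in this sketch.
        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Toggle playback once per pinch (on the rising edge).
        if (isPinching && !_wasPinching)
        {
            if (orchestraSource.isPlaying) orchestraSource.Pause();
            else orchestraSource.UnPause();
        }
        _wasPinching = isPinching;
    }
}
```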

Conducting guidance: user interface interactions are driven by virtual hand collisions
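A button press driven by virtual-hand collision can be prototyped with a trigger collider on the button and a tagged collider on the fingertip. The "HandTip" tag and the UnityEvent wiring below are assumptions; the fingertip object is assumed to carry its own collider and Rigidbody so Unity raises the trigger event.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch: a pressable menu button that fires when a fingertip collider
// (tagged "HandTip" here, an assumed tag) enters its trigger volume.
[RequireComponent(typeof(Collider))]
public class HandPressButton : MonoBehaviour
{
    [SerializeField] private UnityEvent onPressed;   // hook up menu actions in the Inspector

    private void OnTriggerEnter(Collider other)
    {
        // Only react to the virtual hand's fingertip collider.
        if (other.CompareTag("HandTip"))
        {
            onPressed?.Invoke();
        }
    }
}
```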