MR DJ
A downloadable game for Android
Mixed Reality DJ rhythm game created as my Interaction Design & Prototyping final project. In this prototype, designed for the Quest 3, the user loads in and selects either the VR or MR gameplay mode. After selecting the environment and song, the user plays one round of a collider-based rhythm game. In the future, a tool for creating more levels/maps will be added for ease of use.
DevLog
This Mixed Reality DJ application was created while taking the Interaction Design & Prototyping course hosted by Circuit Stream. Throughout the course, I learned and applied new workflows and tools to expand my XR design toolkit.
Figma & ShapesXR
To begin the prototyping journey, we paired a familiar design tool, Figma, with the XR design application ShapesXR to iterate quickly on XR ideas. After designing layouts for UI and other interaction guidelines in Figma, ShapesXR's built-in Figma bridge made bringing them into Mixed Reality simple and precise. The layout can be found here: ShapesXR MR DJ
Unity & XR Interaction Toolkit
ShapesXR's most important feature is its seamless transition to Unity through importable stages. Each interaction layer in ShapesXR is brought into Unity as a collection of objects, letting you quickly add functionality to all the imported models inside the engine.
Although the model creation and placement workflows were expedited thanks to ShapesXR, animations had to be created and scripted manually inside Unity. This process was tedious: the animations were authored in a separate program (Blender) and then hooked up to animation controllers inside the engine. Once animated, the crowd assets felt much more alive, adding to the ambience of the cutout section.
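One cheap trick for making a looping crowd feel alive is to start each character's clip at a random offset so they don't bob in lockstep. This is an illustrative sketch, not necessarily what shipped; the state name "Dance" is a placeholder:

```csharp
using UnityEngine;

// Hypothetical sketch: desynchronize looping crowd animations.
// "Dance" stands in for whatever looping state the Animator Controller uses.
public class CrowdAnimationOffset : MonoBehaviour
{
    [SerializeField] private string stateName = "Dance";

    private void Start()
    {
        var animator = GetComponent<Animator>();
        // Start the loop at a random normalized time (0..1) so each
        // crowd member plays a different frame of the same clip.
        animator.Play(stateName, 0, Random.value);
    }
}
```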
During the course, we were introduced to the XR Interaction Toolkit (XRI), Unity's latest built-in solution for XR development. The package lets us develop both virtual and mixed reality applications, or easily combine the two.
Interaction setup was straightforward after demoing the example Bowling scene. First, we set up our XR Rig to handle input from the XR device. Once input was working, setting up the first interactable object with XRI was simple enough. Complexity appeared when learning how to switch between XR control schemes for things like XR raycasts, UI raycasts, Direct Interactors, and more.
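As a rough illustration of that switching, a common XRI pattern is to keep both a ray interactor and a direct interactor on each hand and toggle between them. This is a minimal sketch, not the project's actual code:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit; // XRI 2.x namespace; 3.x moved interactors into sub-namespaces

// Sketch: swap one hand between far ray interaction and near direct interaction.
public class InteractorSwitcher : MonoBehaviour
{
    [SerializeField] private XRRayInteractor rayInteractor;
    [SerializeField] private XRDirectInteractor directInteractor;

    public void UseRay(bool useRay)
    {
        // Disabling an interactor's GameObject cleanly cancels its hover/select state.
        rayInteractor.gameObject.SetActive(useRay);
        directInteractor.gameObject.SetActive(!useRay);
    }
}
```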
For my Mixed Reality app, all of the aforementioned control schemes were deployed across various features. UI interactions guide the user from beginning the game in the reality of their choosing to ending the level by exiting or returning to the main menu. Direct Interactors let me build a tactile, record-based song selection menu and provide a satisfying mechanic for destroying music notes (sketched below). XR raycasts let the user grab interactable objects from afar or teleport around the playable area.
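The note-destruction mechanic, reduced to a hedged sketch: a note that plays a one-shot SFX and removes itself when a direct interactor selects it. Field names and the component choice are illustrative, not the shipped code:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: a music note destroyed by direct interaction.
[RequireComponent(typeof(XRSimpleInteractable))]
public class DestroyableNote : MonoBehaviour
{
    [SerializeField] private AudioClip hitSfx; // instrumental one-shot

    private void Awake()
    {
        GetComponent<XRSimpleInteractable>().selectEntered.AddListener(OnHit);
    }

    private void OnHit(SelectEnterEventArgs args)
    {
        // Play the clip at the note's position, then remove the note.
        AudioSource.PlayClipAtPoint(hitSfx, transform.position);
        Destroy(gameObject);
    }
}
```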
All of these interaction methods took time and repetition to dial in the right interactive feel. Certain features, like the UI raycasts, had unique issues when debugging with XRI's Device Simulator and required in-headset testing to confirm proper functionality.
After interactions were added, setting up the scene flow was the last piece of overhead before tackling the main rhythm game features. Custom text and images were used to build the Main Menu, Scene Selection, and Song Selection UIs.
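Assuming each screen lives in its own Unity scene (the actual project layout may differ), the flow reduces to a few button handlers; scene names here are placeholders:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the menu flow, wired to UI buttons on each canvas.
public class MenuFlow : MonoBehaviour
{
    public void LoadSongSelection() => SceneManager.LoadScene("SongSelection");
    public void LoadLevel(string songScene) => SceneManager.LoadScene(songScene);
    public void ReturnToMainMenu() => SceneManager.LoadScene("MainMenu");
}
```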
For the rhythm game mechanics, all of the note traversal systems are driven by Unity's AudioSettings.dspTime. This avoids the desync from the music that accumulates when gameplay timing is driven by deltaTime. Notes are stored as lists of NoteObjects, which are spawned and travel toward the player on three rails. When a note reaches the goal hitbox, it is added to a list of destroyable notes. When the player interacts with a goal hitbox, an instrumental SFX plays and the note is destroyed. The interaction is scored based on how long the note has been waiting in the collider.
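A minimal sketch of that dspTime-driven timing, with illustrative names and timing windows (the project's actual data layout and score values may differ):

```csharp
using UnityEngine;

// Sketch: drive note travel and hit scoring from the audio clock.
public class NoteConductor : MonoBehaviour
{
    [SerializeField] private AudioSource music;
    [SerializeField] private float travelTime = 2f;  // seconds a note spends on its rail
    [SerializeField] private float railLength = 10f; // meters from spawn to goal hitbox

    private double songStartDsp;

    private void Start()
    {
        // Schedule playback slightly in the future so the start time is exact.
        songStartDsp = AudioSettings.dspTime + 0.5;
        music.PlayScheduled(songStartDsp);
    }

    // Position a note along its rail from the audio clock, not deltaTime,
    // so gameplay can never drift out of sync with the music.
    public float NoteDistance(double noteHitTime)
    {
        double songTime = AudioSettings.dspTime - songStartDsp;
        double t = 1.0 - (noteHitTime - songTime) / travelTime; // 0 = spawn, 1 = goal
        return (float)(t * railLength);
    }

    // Score a hit by how long the note has been waiting in the goal collider.
    public int ScoreHit(double noteHitTime)
    {
        double waited = (AudioSettings.dspTime - songStartDsp) - noteHitTime;
        if (waited < 0.05) return 300; // near-perfect
        if (waited < 0.15) return 100;
        return 50;                     // late, but still inside the collider
    }
}
```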
As the project progressed, I also ran into the need for optimization to hold steady performance on the Quest 3's onboard processor. Certain Particle System features had to be removed from the note-interaction effects to avoid slowdowns during crucial gameplay moments.
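The trimming looked roughly like capping particle counts and disabling expensive modules on the hit effect; the exact modules removed in the project may differ from this sketch:

```csharp
using UnityEngine;

// Sketch: rein in a note-hit effect's cost on mobile hardware.
public class ParticleBudget : MonoBehaviour
{
    private void Awake()
    {
        var ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.maxParticles = Mathf.Min(main.maxParticles, 64); // hard cap per effect

        var lights = ps.lights;
        lights.enabled = false; // per-particle lights are costly on mobile chipsets
    }
}
```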
If I were to continue work on the project past the deadline, there are a few features that were cut for scope reasons. Gameplay-wise, each map is essentially the same, rotating through note positions for each spawned beat. This would be made customizable per song, with each note assigned a track (Red, Yellow, or Blue) to travel down.
As for Mixed Reality features, I would like to add more room-tracking capability, with the play space placed on a living room/coffee table and the wall destruction attached to the user's scanned far wall. An option for seated or standing play would improve usability, along with other VR comfort settings. Finally, I attempted to add music-responsive objects but was unable to get the intended behavior. It would be great to see such objects spawn in the user's environment during gameplay to incentivize better scores.
That's all for MR DJ, thank you for reading!
Assets Used:
DJ Table by gamayunovantonsakh (@gamayunovantonsakh) [a395ec5] (sketchfab.com)
Status | Prototype
Platforms | Android
Author | brendanmillerxr
Genre | Rhythm
Tags | mixed-reality, Virtual Reality (VR)