RAD
Competitive Mixed-Reality Game

DESCRIPTION
After exploring several mixed-reality project ideas with the team, we ultimately developed RAD. RAD is a modern, single-player reimagining of Pong, designed for mixed reality (AR/VR).
The project served as a research initiative to explore the potential of mixed reality gaming without relying on a VR headset.
The resulting proof-of-concept prototype also demonstrates how simple gameplay mechanics combined with immersive AR/VR technology can create an engaging and entertaining experience.
The project is available as a GitLab repository.
Gameplay
Gameplay takes place inside a virtual cage with one open side behind the player. A ball continuously bounces around within the cage, and the player's goal is to keep it from escaping through the open rear.
Using virtual paddles controlled by real-time hand movements, the player must deflect the ball to keep it in play.
Each time the ball bounces off a paddle, the player scores a point and the ball's speed increases, gradually raising the difficulty.
If the ball escapes through the open side, the game is lost.
To win, the player must keep the ball in play for a full minute.
Once the timer expires, the game ends, and the final score is recorded on a high-score leaderboard for others to challenge.
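The rules above can be sketched as a small state object. This is an illustrative Python sketch, not the actual implementation (the game was built in Unity); the speed multiplier and variable names are assumptions.

```python
# Illustrative sketch of RAD's scoring rules. Names and the per-bounce
# speed multiplier are assumptions, not taken from the actual project.

GAME_DURATION = 60.0   # seconds the player must survive to win
SPEED_FACTOR = 1.05    # assumed per-bounce speed multiplier

class MatchState:
    def __init__(self, base_speed=1.0):
        self.score = 0
        self.ball_speed = base_speed
        self.elapsed = 0.0
        self.outcome = None  # None while running, then "won" or "lost"

    def on_paddle_bounce(self):
        # Each paddle hit scores a point and speeds the ball up,
        # gradually raising the difficulty.
        self.score += 1
        self.ball_speed *= SPEED_FACTOR

    def on_ball_escaped(self):
        # The ball left through the open rear side: the game is lost.
        self.outcome = "lost"

    def tick(self, dt):
        # Surviving the full minute wins the game.
        if self.outcome is None:
            self.elapsed += dt
            if self.elapsed >= GAME_DURATION:
                self.outcome = "won"
```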
The player wears a headset made from a mobile phone, which overlays the 3D virtual environment onto the real world.
The player's entire body is tracked in real time, and their movements are replicated in the game's environment, creating an immersive and responsive experience.
Spectators can view the action from outside the virtual world via an external screen, which displays a live representation of the player inside the cage.
Architecture
The game architecture follows a client-server model. The server manages the core gameplay logic and real-time tracking of the player's body.
It also transmits the positions of the player's body parts, paddles, and the ball to the client.
The client tracks the player's head orientation and, combined with the data received from the server, renders the scene from the player's point of view, as if they were looking through a VR headset.
This setup creates a convincing mixed-reality experience by synchronizing real-world physical movements with the virtual environment.
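To make the data flow concrete, a per-frame server update might carry the tracked joints, paddles, and ball, which the client merges with its locally sensed head orientation. This is a hypothetical sketch; the field names and JSON encoding are assumptions, not the project's actual wire format.

```python
# Hypothetical shape of one server-to-client update: the server streams
# tracked positions each frame, and the client combines them with the
# head orientation it measures locally. Field names are assumptions.
import json

def make_update(body_joints, paddles, ball):
    """Server side: serialize one frame of tracked state."""
    return json.dumps({
        "joints": {name: list(pos) for name, pos in body_joints.items()},
        "paddles": [list(p) for p in paddles],
        "ball": list(ball),
    })

def apply_update(raw, head_orientation):
    """Client side: merge server state with the locally tracked head pose."""
    state = json.loads(raw)
    state["head_orientation"] = head_orientation  # from phone sensors
    return state
```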
The game was developed using the Unity game engine.
The server runs on a Windows PC, with a connected Microsoft Kinect device for the body tracking.
The client runs on an Android mobile device (a Google Pixel 2 in our case), using the device's built-in sensors to track the player's head orientation.
Reception
The game was showcased at the 2019 Vietnam Festival Of Media & Design in Hanoi, where it attracted significant attention. Its intuitive controls, fast-paced sessions, and competitive high-score system made it an instant hit. Attendees lined up to take their turn and try to climb the leaderboard.
The positive reception highlighted the game's potential beyond a prototype, with many viewing it as a fresh and innovative take on interactive entertainment.
As part of the festival, the project was also featured on Vietnam's national television channel VTV1.
ROLE
Our four-member team collaborated closely on developing the concept and determining how best to bring it to life. After evaluating various gameplay components, hardware options, and implementation strategies, we agreed on a clear direction for the project.
As the sole developer, I was responsible for all software development tasks, including building the game in Unity.
The first step involved setting up and testing the Kinect SDK within Unity. The Kinect device conveniently provides body tracking data along with video and depth images, which had to be adapted for game logic, transmission and display purposes.
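The kind of adaptation involved can be sketched as a coordinate remapping: Kinect reports joint positions in meters in the sensor's frame, which must be brought into the virtual cage's space before driving the avatar and paddles. This is an illustrative Python sketch; the scale, offset, and axis conventions are assumptions, not the project's actual values.

```python
# Sketch of adapting Kinect body-tracking data for game use. Kinect
# reports joints in meters in the sensor's coordinate frame; the game
# needs them in the virtual cage's space. Scale/offset values and axis
# handling below are illustrative assumptions.

SENSOR_TO_GAME_SCALE = 1.0      # assumed: 1 Kinect meter = 1 game unit
CAGE_ORIGIN = (0.0, 0.0, 2.0)   # assumed offset of the cage in game space

def kinect_to_game(joint):
    """Map a Kinect joint position (x, y, z) into game coordinates.

    Kinect's z axis points from the sensor toward the player, while the
    game camera here is assumed to look down +z, so z is mirrored.
    """
    x, y, z = joint
    return (CAGE_ORIGIN[0] + x * SENSOR_TO_GAME_SCALE,
            CAGE_ORIGIN[1] + y * SENSOR_TO_GAME_SCALE,
            CAGE_ORIGIN[2] - z * SENSOR_TO_GAME_SCALE)
```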
Next, a multiplayer-like system was needed to support the client-server architecture of the game.
The SpaceBrew toolkit was chosen for server management and communication with the clients.
The SpaceBrew server was fully integrated into the project, allowing it to be launched and controlled directly from within Unity. I created a separate Unity template project that includes a customized version of SpaceBrew along with a built-in Node.js setup. On the client side, integration was based on the spacebrewUnity library.
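At the protocol level, SpaceBrew routes typed JSON messages between named publishers and subscribers over WebSockets. The sketch below shows the general shape of such messages; the exact field names reflect my reading of SpaceBrew's JSON protocol and should be treated as an approximation, not a verified schema.

```python
# Schematic sketch of SpaceBrew-style messaging: each client registers
# named publish/subscribe endpoints with the server, then exchanges
# typed JSON messages that the server routes over WebSockets. Field
# names are an approximation of SpaceBrew's protocol, not verified.
import json

def config_message(client_name, publishes, subscribes):
    """Registration payload announcing a client's endpoints."""
    return json.dumps({"config": {
        "name": client_name,
        "publish": {"messages": [{"name": n, "type": t} for n, t in publishes]},
        "subscribe": {"messages": [{"name": n, "type": t} for n, t in subscribes]},
    }})

def publish_message(client_name, endpoint, msg_type, value):
    """One published value, routed by the server to matching subscribers."""
    return json.dumps({"message": {
        "clientName": client_name,
        "name": endpoint,
        "type": msg_type,
        "value": value,
    }})
```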
To enhance the overall experience, I incorporated additional features such as a main title screen, settings menu, and screens for success, failure, and high scores. These were managed using a game-state-transition framework from a previous personal Unity project.
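Screen flows like this are commonly driven by a small state machine mapping (state, event) pairs to the next state. This is an illustrative Python sketch; the state and event names are assumptions, not the actual transitions used in the project.

```python
# Minimal sketch of a screen/state flow for the menus described above.
# State and event names are illustrative, not taken from the project.

TRANSITIONS = {
    "title":      {"start": "playing", "settings": "settings"},
    "settings":   {"back": "title"},
    "playing":    {"survived": "success", "ball_escaped": "failure"},
    "success":    {"continue": "highscores"},
    "failure":    {"continue": "highscores"},
    "highscores": {"back": "title"},
}

class GameStateMachine:
    def __init__(self):
        self.state = "title"

    def fire(self, event):
        # Unknown events leave the current state unchanged.
        self.state = TRANSITIONS[self.state].get(event, self.state)
        return self.state
```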
At the start of the project, we discovered that head tracking using the mobile device's built-in sensors was unreliable, suffering from noticeable drift during gameplay. This caused the in-game environment to fall out of sync with the real world, making the game unplayable.
We explored several potential solutions to address this issue.
However, after switching to a Google Pixel device, and given the short duration of gameplay sessions, drift was no longer noticeable, eliminating the need for additional fixes.
To accommodate both scenarios, an option was added to the client application to play with or without the drift-compensation solution.
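The writeup doesn't detail which compensation technique was used; one common remedy for gyroscope drift is a complementary filter that blends fast but drifting gyroscope integration with a slow but stable absolute reference (e.g. the gravity direction from the accelerometer). A one-axis sketch, with an assumed blend factor:

```python
# One common drift-compensation technique (not necessarily the one used
# in RAD): a complementary filter. The gyroscope gives smooth short-term
# rotation but drifts when integrated; an absolute reference (such as
# gravity from the accelerometer) is noisy but drift-free. Blending the
# two keeps the orientation anchored. ALPHA is an assumed tuning value.

ALPHA = 0.98  # trust the gyro short-term, the absolute reference long-term

def complementary_filter(angle, gyro_rate, reference_angle, dt):
    """Return the corrected angle (one axis) after one time step."""
    integrated = angle + gyro_rate * dt  # gyro integration: drifts over time
    return ALPHA * integrated + (1.0 - ALPHA) * reference_angle
```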