Four Software Design Patterns I Used In My VR Game Prototype

Neo Zhou
Published in Level Up Coding · 13 min read · Jan 9, 2024


Early Styleframe for KunaiVR

Intro

This article documents some of the programming involved in my VR game prototype, KunaiVR, made with Unity. I’ll walk you through what my game is about, the VR headset I’m developing for, the game mechanics I developed, the basics of Unity, and finally the software design patterns I used to improve the code of my game. When I get to the section on design patterns, it will be helpful to have some background in object-oriented programming.

What is KunaiVR?

KunaiVR is a VR multiplayer dodgeball experience with anime-inspired magic that is designed to push the boundaries of the Meta Quest 2. I currently have a single-player prototype of the game developed. The 3 key pillars of my game are:

  • Long and short range combat that encourages dynamic full body movement
  • Fully physics-based worlds and avatars to maintain immersion
  • Self expression via infinitely customizable avatars using the VRoid SDK

Before I show it off and explain some of my code, it’s important to understand what the Quest 2 is and what it can do, since many people don’t even know it exists yet.

What is the Quest 2?

The Quest 2 is a 6-degree-of-freedom virtual reality headset with hand controllers developed by Meta. “6DoF” means the headset tracks both its rotation and its position in space, so you can walk and look around a physical room to interact with virtual worlds. The hand controllers are also tracked with 6DoF, and have grip buttons for the middle finger, triggers for the index finger, and traditional game controller buttons for the thumbs.

This headset, and VR in general, opens up new possibilities for the apps that can be developed, adding features like more immersive 3D vision,

Starry Nights: VRChat World by ~Sam

the precise manipulation of virtual objects,

Auto Hand: VR Interaction Framework

enhanced bodily presence,

DeepMotion: 3 Point Tracking VR Avatar

and enhanced social presence.

The Virtual Reality Show: Best VRChat Worlds you NEED to visit!

KunaiVR Game Mechanics

30 seconds of footage of my prototype

Seeing the deep potential of VR, I set out to create a game that takes advantage of its unique methods of immersion. The goal of the game is just like dodgeball: don’t get hit and try to hit other players. The big difference is that instead of using balls, players use kunai (Japanese knives). The kunai are fully physics based: they can be thrown, used to deflect incoming kunai, stabbed into or ricocheted off of surfaces, and they can even be recalled back to the hand.

physics based kunai

In addition, players have access to magical movement. They can walk on walls and ceilings, or even teleport to the kunai that they have thrown.

walking up walls
teleporting to kunai

(Anime fans might be able to tell that this game was heavily inspired by Naruto. I hope to add more interesting mechanics from that series in the future.)

If you have a Quest 2 and know how to load apps with SideQuest, you can download my prototype here: https://kunaixr.itch.io/kunaivr-singleplayer-prototype.

A Brief Intro to Unity

My prototype was developed with the Unity game engine. Unity has many cool features, but I’ll explain the bare minimum necessary to understand my work.

A Brief Intro to Unity: Game Engines

Put simply, a game engine is a tool that abstracts away the complexity of a video game (rendering graphics, playing audio, performing physics simulation, compiling for cross-platform support, etc.) into a set of reusable systems so developers can focus on art and game mechanics from day 1. I chose to develop my prototype with Unity because it currently has the best support for developing VR applications.

The process of developing in a game engine is pretty different from standard software development. Normally, developers code, access shared libraries, compile, debug, view the console, and execute programs all in an IDE.

Game engines have a graphics rendering component to them. In Unity, developers organize “game objects” in a 3D graphical environment called a “scene”, then add functionality to those game objects via “components”, which are built with code. Unity ships with many default components for different systems. For example:

Example of a scene in Unity. It contains 4 game objects: a camera, a light, a cube, and a plane. The inspector window on the right shows the components attached to the cube.
  1. the transform component: stores the position, rotation and scale of the game object
  2. the mesh filter component: links a 3d mesh to a game object
  3. the mesh renderer component: enables the mesh filter to be rendered by a virtual camera
  4. the collider component: defines the physical boundaries of the objects for physics simulations
  5. the rigidbody component: makes the object act under physics at runtime
At runtime, the virtual camera renders the scene based on its own transform as well as all of the lights and mesh renderers in the scene. The cube is falling and stopping at the plane because of its rigidbody component.
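The game-object/component relationship described above can be sketched in plain C#. This is a simplified model for illustration only, not Unity's actual implementation; the `GameObject`, `Component`, and `Transform` names mirror Unity's, but the internals here are invented:

```csharp
using System;
using System.Collections.Generic;

// Simplified stand-ins for Unity's GameObject/Component relationship.
public abstract class Component
{
    public GameObject GameObject; // back-reference, like Unity's .gameObject
}

public class Transform : Component
{
    public float X, Y, Z; // position only, for brevity
}

public class GameObject
{
    public string Name;
    private readonly List<Component> components = new List<Component>();

    public GameObject(string name)
    {
        Name = name;
        AddComponent(new Transform()); // every game object has a transform
    }

    public T AddComponent<T>(T c) where T : Component
    {
        c.GameObject = this;
        components.Add(c);
        return c;
    }

    public T GetComponent<T>() where T : Component
    {
        foreach (var c in components)
            if (c is T t) return t;
        return null;
    }
}
```

With this model, the cube in the screenshot is just a `GameObject` holding a transform, a mesh filter, a mesh renderer, a collider, and a rigidbody, and other code finds those pieces through `GetComponent<T>()`.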

A Brief Intro to Unity: Components and MonoBehaviour

The role of a Unity developer is to write custom components that interact with these default components to create the behavior they want. These custom components are written in C#, an object-oriented programming language designed by Microsoft.

Knowing object-oriented programming is important for understanding how to code in Unity because most components extend Unity’s MonoBehaviour class. Firstly, the MonoBehaviour class contains important fields that expose key data of a game object to the developer, like references to the game object and its transform, and whether or not the game object is enabled. Secondly, it allows developers to hook custom code into the Unity game loop by defining overridable functions (describing the functions Unity lets you modify as “overridable” is a slight oversimplification, but if you think of them that way you’ll do the right things).

A Brief Intro to Unity: The Game Loop

As mentioned before, Unity handles a lot of behavior for the developer. To oversimplify things a bit, every frame it has to simulate the physics of objects with rigidbody components and then render them in relation to a virtual camera. At the same time, it has to allow the developer to add custom logic. It manages the ordering of all of these tasks via the Unity game loop. Once a Unity program is executed, it follows the flow chart below, going through each game object for each step.

Each user callback function is defined in the MonoBehaviour class, so for each game object, Unity checks all of its components and runs the specific callback function in each component depending on where it is in the game loop.

The skeleton of a component for the developer to fill in.

The number of callbacks Unity lets the developer inject custom code into can seem overwhelming, but most of the time we only have to worry about three: Start(), Update(), and FixedUpdate(). Start() is called once after the game object is initialized, Update() runs once per rendered frame, and FixedUpdate() runs once per physics step.
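The ordering of those three callbacks can be sketched as a toy loop. This is a drastically simplified model, not Unity's real scheduler (Unity's loop has many more stages, and FixedUpdate can run zero or several times per frame depending on timing); the `Behaviour`, `GameLoop`, and `Counter` names here are invented for illustration:

```csharp
using System;
using System.Collections.Generic;

// A toy MonoBehaviour-like base class with the three common callbacks.
public abstract class Behaviour
{
    public virtual void Start() { }        // once, after initialization
    public virtual void FixedUpdate() { }  // every physics step
    public virtual void Update() { }       // every rendered frame
}

public static class GameLoop
{
    // Runs a fixed number of frames over all behaviours, in a Unity-like order.
    public static void Run(List<Behaviour> behaviours, int frames)
    {
        foreach (var b in behaviours) b.Start();
        for (int frame = 0; frame < frames; frame++)
        {
            foreach (var b in behaviours) b.FixedUpdate(); // physics step
            foreach (var b in behaviours) b.Update();      // per-frame logic
        }
    }
}

// Example component: counts how often each callback fired.
public class Counter : Behaviour
{
    public int Starts, Fixeds, Updates;
    public override void Start() => Starts++;
    public override void FixedUpdate() => Fixeds++;
    public override void Update() => Updates++;
}
```

Running three frames of this loop calls Start() exactly once and the two update callbacks three times each, which is the mental model that matters when deciding where your code belongs.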

My Design Patterns

With all of that background out of the way, I can finally explain four of the design patterns I used in my project.

I organized each section by:

  1. Specifying a problem.
  2. Showing how a design pattern solves the problem.
  3. Concretely showing the structure of my project files with a UML diagram and explaining what they do.

Using the State Pattern for Physics-Defying Kunai

The Problem:

As I mentioned at the beginning, my game revolves around the manipulation of kunai. With Unity’s default components and a few packages specifically designed for VR, I got a lot of functionality for free (with a LOT of time reading documentation and bug fixing). Unity handles physics simulation and rendering, a package called HurricaneVR allows the user to grab and throw objects with their hand controllers as well as stab objects into surfaces, and a package called FinalIK syncs a virtual full body avatar with the headset and hand controllers.

However, as any game developer will tell you, realistic physics/interactions don’t always make great game mechanics. To make the experience of manipulating kunai feel great, I had to write a lot of custom code from scratch. To give you an idea of what I needed to code myself, below are some of the most important conditions:

  1. If the user throws hard enough, instead of following the laws of physics, the kunai should move at a constant velocity without acting under gravity until it hits an object.
  2. While the kunai is in flight, its tip needs to constantly point in the direction it’s moving.
  3. If the thrown kunai hits a metallic object, it should ricochet instead of sticking.
  4. The kunai should not accidentally stick to the hand it was thrown from.
  5. If the kunai sticks to a surface, it needs to store the point of collision on the surface so the user can teleport onto it.
  6. If the user makes a specific gesture at the kunai, it needs to come back to the hand, but while it’s coming back, the gesture can be cancelled, in which case the kunai should continue along its current trajectory as if it had been thrown.
  7. If the kunai is not being thrown and is not stuck to a surface, it should obey the normal laws of physics.

As you can see, the conditions for how the kunai act at any given moment are complex and depend on what has happened to it previously. At first, I tried to code this system only using if statements, but my code quickly became incredibly complex to modify and even understand.

The Solution:

To simplify the structure of my code, I used the state pattern. I organized the custom behavior of my kunai into five states: normal, grip, thrown, stuck, and recall. The graph below illustrates their relationship.

The kunai starts in the normal state, where it obeys the laws of physics. Each state can transition to another state only if there is a directional arrow pointing to it. The label above each arrow is the specific event that triggers the state change. For example, OnSelectEnter occurs when the user grips the kunai, and OnSelectExit occurs when they let go.

I coded the states and their behaviors myself, but with the exception of RecallToHand and LetGoOfGrip, I didn’t create the events that cause the state transitions. I listen for events raised by HurricaneVR, a package designed for VR interactions.

Class Structure:

The following UML diagram shows the structure of my classes:

The Kunai class extends the MonoBehaviour class so it can be attached to a kunai game object. The Kunai class has a KunaiState field that represents the current state, and contains instances of every possible KunaiState. The current KunaiState’s StateUpdate() method is called every fixed update so that the kunai can dynamically behave based on the state it is in.

When an event occurs that would cause a state change, the ChangeState() method is called, which triggers the current state’s OnExit() method, then sets the new current state, and then triggers the new state’s OnEnter() method.

Before using this cleaner method of transitioning between states, I had to keep track of every variable and function call that each state depended on, and toggle every variable that could possibly have changed each time I switched states. With this ChangeState system, I can simply edit variables on entry and roll back those exact edits when the state ends.
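Stripped of Unity specifics, the transition machinery can be sketched like this. It is a minimal illustration of the pattern under my own assumptions, not the actual KunaiVR code (the real states also touch rigidbody physics, hand references, and HurricaneVR events), and only two of the five states are shown:

```csharp
using System;

// Minimal state-pattern sketch: each state gets enter/exit/update hooks.
public abstract class KunaiState
{
    public virtual void OnEnter() { }
    public virtual void OnExit() { }
    public virtual void StateUpdate() { } // called every fixed update
}

public class NormalState : KunaiState { } // ordinary physics, nothing to toggle

public class ThrownState : KunaiState
{
    public bool GravityDisabled;
    public override void OnEnter() => GravityDisabled = true;  // fly straight
    public override void OnExit()  => GravityDisabled = false; // restore physics
}

public class Kunai
{
    // One instance of every possible state, plus the current one.
    public readonly KunaiState Normal = new NormalState();
    public readonly ThrownState Thrown = new ThrownState();
    public KunaiState Current;

    public Kunai() { Current = Normal; } // start out obeying normal physics

    public void ChangeState(KunaiState next)
    {
        Current.OnExit();   // roll back the old state's edits
        Current = next;
        Current.OnEnter();  // apply the new state's edits
    }

    public void FixedUpdate() => Current.StateUpdate();
}
```

The key property is that every edit a state makes in OnEnter() is undone in its own OnExit(), so no state needs to know what any other state changed.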

Using the Factory and Prototype Patterns for Kunai Summoning

The Problem:

Inspired by the summoning seals in Naruto, I wanted players to be able to summon a kunai by making a grabbing motion on their wrist.

The Solution:

The factory pattern was a perfect match for what I needed, where a separate factory class instantiates another class in order to decouple creation from usage.

Unlike the previous design pattern, though, here I’m not just working with classes but with game objects. To create more kunai, I store in my game files a specification for a kunai game object with the correct components attached and configured with the correct settings. This is called a prefab in Unity, and it happens to be an implementation of the prototype design pattern, where an object is cloned from a model prototype rather than being initialized from scratch.

Whenever I want to create another kunai, I just call Instantiate(kunaiPrefab), where kunaiPrefab is a reference to that file, and Unity will create a copy of that object.

Class Structure:

The following UML diagram shows the structure of my classes:

When the player performs the gesture for spawning a kunai, the GestureDetection class raises its OnKunaiSummonConditionsMet event. Events are a C# feature that lets classes communicate with each other without being tightly coupled. When the event is raised, the SummonKunai class triggers the SummonInHand() method, passing in which hand to spawn the kunai in and what direction the kunai should face. SummonInHand() then uses the Instantiate() method from Unity’s Object class to create a copy of my kunai prefab and place it in the correct position and orientation based on the info it received from the event.
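The factory and prototype roles can be sketched in plain C#. This is a hedged illustration, not the real project code: `KunaiPrefab.Clone()` stands in for Unity's `Instantiate()`, the fields are invented example settings, and the event signature is simplified:

```csharp
using System;

// Prototype: a kunai "prefab" that can be cloned rather than rebuilt.
public class KunaiPrefab
{
    public float Mass = 0.2f;       // example pre-configured settings
    public bool HasCollider = true;

    public KunaiPrefab Clone()      // stands in for Unity's Instantiate()
    {
        return (KunaiPrefab)MemberwiseClone(); // shallow copy of the prototype
    }
}

// Factory: decouples kunai creation from the code that asks for one.
public class SummonKunai
{
    private readonly KunaiPrefab prefab;
    public SummonKunai(KunaiPrefab prefab) { this.prefab = prefab; }

    // Creates a configured copy and "places" it in the given hand.
    public KunaiPrefab SummonInHand(string hand)
    {
        var kunai = prefab.Clone();
        Console.WriteLine($"Spawned kunai in {hand} hand");
        return kunai;
    }
}

// Gesture detection raises an event; the factory subscribes to it.
public class GestureDetection
{
    public event Action<string> OnKunaiSummonConditionsMet;
    public void Detect(string hand) => OnKunaiSummonConditionsMet?.Invoke(hand);
}
```

Wiring it up is one line: `gestureDetection.OnKunaiSummonConditionsMet += factory => summonKunai.SummonInHand(factory);` style subscription, so the gesture code never needs to know how kunai are built.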

Using Object Pooling and the Multiton Pattern to Improve Kunai Performance

The Problem:

I ran into various performance issues while playtesting my game on the Quest 2. The framerate would drop when too many kunai were spawned, and there was a slight dip in framerate while instantiating kunai. Framerate drops are problematic in any game, but especially in VR, because they can cause motion sickness. To even publish an app on the Quest app store, it needs to run consistently above 90 fps.

The Solution:

I tried to solve these framerate problems with object pooling, which involves creating a fixed number of objects on startup and reusing them instead of instantiating and destroying them at runtime. I also follow the multiton pattern, making it so that only a set number of objects can ever exist (though I don’t completely follow the exact spec of multiton, since the kunai that are created are not globally accessible).

In terms of my project, when I first initialize the kunai, they are set to inactive, which means they are invisible and their attached components aren’t running. Once they are summoned for the first time, they are set to active. Once all of the kunai in the pool are set to active, a summon will just teleport an already existing kunai on the field to the hand instead of revealing a new one.

Class Structure:

The following UML diagram shows the structure of my classes:

On Awake(), when the game first loads, the SummonKunai component fills a queue with MAX_KUNAI kunai. After this point, no new kunai are ever instantiated.

When the player tries to summon a kunai and triggers the OnKunaiSummonConditionsMet event, SummonInHand() will take a kunai from the start of the queue, teleport it to the player’s hand, and set it to active only if it’s being summoned for the first time. Once it does this, it will add that kunai back to the end of the queue for later summons.
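The queue-based pool can be sketched like this. Again a simplified model under my own assumptions, not the shipped code: `Active` stands in for Unity's `SetActive()`, `Position` stands in for a real transform, and MAX_KUNAI is an arbitrary example value:

```csharp
using System;
using System.Collections.Generic;

public class PooledKunai
{
    public bool Active;              // inactive kunai are hidden, like SetActive(false)
    public string Position = "pool"; // placeholder for a real transform
}

// Fixed-size pool: all kunai are created up front, then recycled forever.
public class KunaiPool
{
    private const int MAX_KUNAI = 5; // multiton-style cap on instances
    private readonly Queue<PooledKunai> queue = new Queue<PooledKunai>();

    public KunaiPool()
    {
        for (int i = 0; i < MAX_KUNAI; i++) // fill the pool on startup
            queue.Enqueue(new PooledKunai());
    }

    // Take the oldest kunai, move it to the hand, re-enqueue it for later.
    public PooledKunai Summon(string hand)
    {
        var kunai = queue.Dequeue();
        kunai.Active = true;   // no-op after the first summon
        kunai.Position = hand; // teleports an in-use kunai instead of spawning
        queue.Enqueue(kunai);
        return kunai;
    }

    public int Count => queue.Count;
}
```

Because a summoned kunai goes straight back to the end of the queue, summoning more than MAX_KUNAI times simply recycles the oldest kunai on the field, which matches the behavior described above.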

Using the Strategy Pattern to Organize the Different Ways to Teleport

The Problem:

When the player teleports to their kunai, the algorithm for calculating the teleportation target is different depending on the state of the kunai. If the kunai is stuck to a surface, I need to calculate the orientation and location of the player so that they land feet-first on the surface they are on. If it’s not, I just teleport them to a location behind the kunai and orient them to face it. I also had a deprecated third teleportation algorithm, designed for teleporting to teammates instead of kunai.

Initially, I put all of my code in a gigantic if-else-if block, but my code ended up becoming really hard to read. It didn’t help that I would also share variables between the teleportation methods, making it hard to understand which variables were actually being used for each algorithm.

The Solution:

Looking at my mess of code, I felt that it would be annoying to edit or scale for more algorithms, so I decided to make it a little cleaner with the strategy pattern, which involves encapsulating each behavior inside its own object so they can be swapped interchangeably.

Class Structure:

The following UML diagram shows the structure of my classes:

I define an interface that the Strategy classes should implement, and I have instances of each strategy stored in my teleport manager. Before the user teleports, the teleport manager checks the state of the kunai and sets the currentTeleportStrategy accordingly.

My Teleport() method in the teleport manager is much simpler than a giant if-else block now: I can just call .Teleport() on the current strategy and pass in the current teleportable.
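The shape of that arrangement can be sketched as follows. This is an illustrative reduction, not the actual project code: the real strategies compute positions and orientations from kunai transforms, while here each strategy just returns a description string, and the interface and class names are my own placeholders:

```csharp
using System;

// Strategy interface: each teleport algorithm lives in its own class.
public interface ITeleportStrategy
{
    string Teleport(string target); // returns a description, for illustration
}

// Kunai stuck to a surface: land feet-first on that surface.
public class StuckKunaiTeleport : ITeleportStrategy
{
    public string Teleport(string target) =>
        $"land feet-first on the surface holding {target}";
}

// Kunai in flight or at rest: appear behind it, facing it.
public class FlyingKunaiTeleport : ITeleportStrategy
{
    public string Teleport(string target) =>
        $"appear behind {target}, facing it";
}

public class TeleportManager
{
    public ITeleportStrategy CurrentTeleportStrategy;

    // Pick the algorithm from the kunai's state, then delegate to it.
    public string Teleport(string target, bool kunaiIsStuck)
    {
        CurrentTeleportStrategy = kunaiIsStuck
            ? (ITeleportStrategy)new StuckKunaiTeleport()
            : new FlyingKunaiTeleport();
        return CurrentTeleportStrategy.Teleport(target);
    }
}
```

Each algorithm's variables live inside its own class instead of being shared across a giant if-else block, which is exactly what made the original code hard to follow.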

Adding new teleportation strategies is also going to be much easier now. I can just create a new strategy class, define which variables to initialize in the constructor, and write my algorithm in a clean new file with a single method. Editing existing strategies is quicker too, since I can just visit a well-organized file instead of scrolling through tens of lines to find the correct if-else block with the teleportation algorithm I want to edit.

Conclusion

I have a lot of work cut out for me to finish KunaiVR. I still need to code an avatar loading system, add multiplayer functionality, develop an artistic style, create maps, improve the game design, and handle the business and marketing side of the game. At the same time, I’m really proud of what I’ve been able to do in a couple of months, and there’s nothing else I’d rather be doing right now. I’m looking forward to continuing to share my progress with everyone and eventually releasing the game!

For anyone who wants to offer advice or is just interested in the game’s development, feel free to join the discord server here: https://discord.gg/Nh6nVttQRF.


Computer Science major at Boston College, Class of 2024. Reach out to me on LinkedIn: @neo-zhou-xr