"The Art of Game Design" Book Note #4: On Immersiveness of VR Games

This blog post is written for an assignment in the GSND 5110 course. We are required to write down our thoughts about some reading materials.

I have no idea why the VR chapter is listed together with other entries related to serious games... But I'll just write a post about it, because it sparked a lot of thoughts.

Schell spends many words explaining one thing:

To make a VR game immersive, it must behave realistically and match the physical environment the player is in.

This can be expanded into smaller items:

  • When players interact with the game world "unexpectedly" (like the "using a knife as a screwdriver" example in the book), the game should be able to handle it.

  • The "visual force" the player perceives in game should match with reality, otherwise nausea.
    —Hence no additional acceleration is allowed for VR games.

  • When something hits another thing, it should make a collision sound.
    And this sound should vary depending on the acoustic environment.

  • Etc., etc. There are so many details you can work on.

One of my undergraduate professors, Shi Huang (he teaches Game Arts), summed this up in one sentence in his book 数字游戏设计 ("Digital Game Design"):

The bandwidth of video games, compared to real life, is limited.

There are so many rich details in real life. Even if you don't do anything and just sit quietly on a chair, there are still many things you can perceive with your senses—the ambient noise, the temperature, the wind that brushes over your skin, the rustling tree leaves, and most importantly, your presence.

Some of these could be recreated in a video game, and many couldn't; and even for the ones that could, it's still too hard for a dev team to recreate them all, because it would require too much manual work just to label all the implicit information of the virtual objects, not to mention implementing all the underlying systems that support these detailed behaviors.

I'm just thinking: is it possible to make a game where we deliberately break realism and let players find out what's weird? Totally out of context, please continue reading. :D

However, there is one game I'd like to mention that achieves this pretty well: Minecraft (not this again LOL XD ROFL dafuq LMFAO). Here are some aspects that reflect its completeness of detail, or you could say, its physical (not the Newtonian kind) mechanisms:

  • Which tool applies depends on the target object you're working on, and it's clear and intuitive.

  • The footstep sound always varies based on the material of the surface you're walking on.

  • Flexible crafting recipes still work when you replace an ingredient with another of the same kind, like an oak log with a spruce log.

  • All undead mobs take damage when splashed with a healing potion.

The key to achieving this, I think, is the underlying tier/tag-based system used throughout the game. Instead of implementing each type of block/entity individually, the dev team gave them tiers and tags that mark the properties they have. So when an interaction happens, the corresponding subsystem can use this information to dynamically generate the details. Also, Minecraft's game world is procedurally generated, so this one-time investment benefits every future playthrough. Great deal!
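To make this concrete, here's a minimal Python sketch of the idea. This is entirely my own toy illustration—the block names, tags, and file names are made up, not Minecraft's actual data or code. Blocks only declare tags; the subsystems (footstep sounds, preferred tools) look those tags up and fall back to a generic value when nothing matches.

    # A toy tag-based lookup -- my own illustration, not Minecraft internals.
    # Each block only declares data: the tags it carries.
    BLOCK_TAGS = {
        "oak_log":    {"wood", "log"},
        "spruce_log": {"wood", "log"},
        "stone":      {"mineral"},
        "grass":      {"ground"},
    }

    # Subsystems key their details off tags instead of concrete block types.
    FOOTSTEP_SOUND_BY_TAG = {"wood": "step_wood.ogg", "mineral": "step_stone.ogg", "ground": "step_grass.ogg"}
    PREFERRED_TOOL_BY_TAG = {"wood": "axe", "mineral": "pickaxe", "ground": "shovel"}

    def footstep_sound(block: str) -> str:
        """Pick a footstep sound from the block's tags, with a generic fallback."""
        for tag in BLOCK_TAGS.get(block, set()):
            if tag in FOOTSTEP_SOUND_BY_TAG:
                return FOOTSTEP_SOUND_BY_TAG[tag]
        return "step_generic.ogg"

    def preferred_tool(block: str) -> str:
        """Same trick for tool applicability."""
        for tag in BLOCK_TAGS.get(block, set()):
            if tag in PREFERRED_TOOL_BY_TAG:
                return PREFERRED_TOOL_BY_TAG[tag]
        return "hand"

    print(footstep_sound("spruce_log"))  # step_wood.ogg -- a new wood type works for free
    print(preferred_tool("stone"))       # pickaxe

The point is that adding a new block is just adding one data entry; every subsystem that understands the tags picks it up automatically.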

But what if we want to achieve this level of detail in our own game project—say, a course project? It's definitely impossible to design such a comprehensive system in advance. But here's my advice: composition is better than inheritance.

This is a long-standing saying in the programming field. In traditional OOP-style programming, you always think of things in terms of "what they are", because you're designing classes that represent a general template of that kind of thing; and if something gets specialized (say, gains new abilities), you derive a child class from an existing class. It turns out inheritance might not be the best approach for game logic.

Let's go back to the "using a knife as a screwdriver" example from the book (this is a darn good example, how did Schell come up with it?). Normally, knives are for cutting things. Taking the OOP approach, you might design a "Knife" class that represents a single knife object: it can be grabbed as a tool; it can be used to cut something else. So how would you integrate the screw-driving function into this design? There are a couple of possibilities:

  • Adding another controller class for screwdrivers.

    —This could be weird, because: a) There would be two controller classes controlling one object at the same time, which may or may not be problematic; b) I went to reply to some messages when I was writing this line and I forgor💀 when I came back.

  • Inherit a "ScrewDrivableKnife" class from the base "Knife" class.

    —Come on, what the freck is that?

So an OOP approach doesn't solve this very well. The better approach, I think, is to break down the functions of a knife into individual components. There would be:

  • A "Cutting" component bound to a mesh of the knife's edge that could cut other things when it hits something.

  • A "Grabbable" component with an anchor on the knife's handle, which makes it able to be grabbed by the player.

Given a knife's model, these make up a whole knife. And to make it also a screwdriver, just add a "ScrewDrivable" component to the edge mesh. You'll notice this is also compatible with the actual screwdrivers in the game: combine a "ScrewDrivable" edge with a "Grabbable" handle and it's a screwdriver!
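Here's a small Python sketch of what I mean, using a made-up toy component system—the class and method names are mine, not any real engine's API:

    # A made-up toy component system, just to illustrate composition.
    class Component:
        def __init__(self, owner):
            self.owner = owner

    class Grabbable(Component):
        """Lets the player grab the object by its handle anchor."""
        def __init__(self, owner, anchor):
            super().__init__(owner)
            self.anchor = anchor

    class Cutting(Component):
        """Bound to the edge mesh: cuts whatever it hits."""
        def on_hit(self, target):
            print(f"{self.owner.name} cuts {target}")

    class ScrewDrivable(Component):
        """Bound to the edge mesh: can turn screws."""
        def on_use_on(self, screw):
            print(f"{self.owner.name} drives {screw}")

    class GameObject:
        def __init__(self, name):
            self.name = name
            self.components = []
        def add(self, component_cls, **kwargs):
            self.components.append(component_cls(self, **kwargs))
            return self

    # A knife is just a bundle of behaviors...
    knife = GameObject("knife").add(Grabbable, anchor="handle").add(Cutting)
    # ...and making it also drive screws is one more component, no new subclass:
    knife.add(ScrewDrivable)

    # A "real" screwdriver reuses the same pieces.
    screwdriver = GameObject("screwdriver").add(Grabbable, anchor="handle").add(ScrewDrivable)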

With the compositional approach, you don't focus on "what things are", but on "what they do". When a thing's functionality gets extended, you just add more composable features to it. Very simple.

At the end of this post, I'd like to show you what I did in my graduation project, where I combined the tier/tag-based approach and the compositional approach to achieve physical realism. In this project, the designer created many interactable objects in the game world. Most of them are physical, which means that upon interaction or collision they should make sounds, and we wanted the sounds to vary depending on the interaction type and the collided bodies' physical materials. Done in an OOP style, this would mean a really complex sound controller class with imaginably giant branching/switching blocks. To avoid this, I designed a rigid-body sound system along with a tier-marking component. When a collision happens, the system automatically tries to fetch the tiers of the collided bodies (if a body has none, it uses a fallback value) to decide which audio clip to play. It also allows the designer to override the default interaction sound effect. As you can see, this system is completely compatible with all other systems, and even if it's not added to some objects, there's a fallback behavior. So I'd say it's a success as an exercise in compositional design!
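For the curious, here's a much-simplified Python sketch of the idea behind that sound system. The names, tiers, and clip files here are made up for this post; the real thing lived inside a game engine and looked quite different.

    DEFAULT_TIER = "generic"

    # One audio clip per pair of tiers; frozenset makes the lookup order-independent.
    COLLISION_CLIPS = {
        frozenset({"metal", "wood"}): "clank_wood.ogg",
        frozenset({"metal"}): "clang.ogg",    # metal vs metal
        frozenset({"generic"}): "thud.ogg",   # the catch-all pair
    }

    class PhysicalTier:
        """The tier-marking component: attach it to declare an object's material."""
        def __init__(self, tier, override_clip=None):
            self.tier = tier
            self.override_clip = override_clip  # lets the designer force a specific sound

    def tier_of(obj):
        comp = getattr(obj, "physical_tier", None)
        return comp.tier if comp else DEFAULT_TIER  # fallback when the component is missing

    def collision_clip(a, b):
        # A designer override on either body wins.
        for obj in (a, b):
            comp = getattr(obj, "physical_tier", None)
            if comp and comp.override_clip:
                return comp.override_clip
        key = frozenset({tier_of(a), tier_of(b)})
        return COLLISION_CLIPS.get(key, COLLISION_CLIPS[frozenset({"generic"})])

    class Body:  # stand-in for a rigid body in the scene
        pass

    crate, pipe, rock = Body(), Body(), Body()
    crate.physical_tier = PhysicalTier("wood")
    pipe.physical_tier = PhysicalTier("metal")
    # rock has no tier component at all -- it still works via the fallback.

    print(collision_clip(crate, pipe))  # clank_wood.ogg
    print(collision_clip(rock, pipe))   # thud.ogg (no metal+generic entry defined)

The nice property is the same one as in the Minecraft example: new objects only need a one-line tier marking (or nothing at all), and the sound system handles the rest.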