Light In The Dark is an advanced graphics and interaction project for the course DH2413 at KTH during the autumn of 2022. At its core, the game is a competitive survival experience in which one player tries to stay alive for as long as possible while the other tries to kill them as fast as possible. The gameplay experience of the two players is highly asymmetric: the player trying to stay alive views and interacts with the scene through a virtual reality (VR) headset, while the second player directly influences that scene and places enemies into it using a monitor and mouse. The scene used for the VR view is set in a dark underground arena and features highly detailed textures and modeling for the surrounding environment. There are three different types of enemies, each of which has been modeled, textured and scripted by the group. Additionally, the VR scene features advanced particle systems for fire and electricity: the fire effects are used for the player's flaming sword as well as a chargeable fireball, while the electricity particles emulate a chaotic electrical ray. As the VR player fights for their life, the second player interacts with the scene through a custom networking implementation, allowing their actions to directly impact the VR experience.
The game features two very distinct gameplay experiences: the two players interact and compete against each other, one through a hectic, action-packed perspective in a VR headset, and the other through a calculating, strategic perspective behind a monitor and mouse.
When setting out on the project, one of the group's main motivations was to experiment with high-fidelity lighting. This is also why the game is set in a dark mine: the darkness emphasizes and truly brings out the light and particle effects in the scene.
Working on the project presented several challenges to the group. At the start of the project, none of the group members had any experience working with VR, high-fidelity particle systems or advanced audio. To overcome these challenges, the group divided roles among its members early on, making each member responsible for a distinct set of features and areas, tracked on a continually updated Trello board. Discord was used for further communication, and Google Drive for file sharing and documentation.
During the project, the group also ran into a number of obstacles. These mainly arose in relation to optimization: since the group often strove for realistic-looking graphics, the performance of the game would sometimes take a hit. One example was lighting in the Unity Universal Render Pipeline (URP), where the renderer would automatically stop rendering light sources once the number of lights in the scene grew too high. We later solved this by switching to the ultimate quality preset in URP and increasing the maximum light limit.
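For illustration, the same cap can also be raised from a script at runtime. This is a minimal sketch assuming the active render pipeline is a URP asset; the class name and the light count are chosen for the example (we made the equivalent change through the quality preset in the editor rather than in code):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class LightBudget : MonoBehaviour
{
    void Start()
    {
        // The active quality level decides which URP asset is live,
        // so fetch the pipeline asset that is actually in use.
        var urp = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urp != null)
        {
            // Raise the cap on real-time additional lights so lights in a
            // dense scene are not silently culled. (Value is illustrative.)
            urp.maxAdditionalLightsCount = 32;
        }
    }
}
```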
Throughout the project, we also had occasional issues with networking. The asynchronous nature of the two clients made it hard to use existing network solutions such as Unity Netcode for GameObjects. Because of this, we designed and implemented our own network protocol for the game, which we had to revise multiple times to reduce the amount of data sent, as the throughput would otherwise be too high, overwhelming both clients and making them drop packets. The inherent asynchronous nature of networking also caused minor issues: each client had to be synchronized both with the other client and with its own networking thread, leading to bugs stemming from race conditions between the game logic thread and the networking thread. Towards the end of the project, the network issues were largely resolved, but we still had occasional connection issues, and both players needed to restart the game to reconnect.
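A minimal sketch of the kind of main-thread handoff that avoids such race conditions is shown below; the message struct and method names are hypothetical and simplified, not our actual protocol types:

```csharp
using System.Collections.Concurrent;
using UnityEngine;

// Hypothetical message type; a real protocol would carry more fields.
public struct NetMessage { public byte type; public Vector3 position; }

public class NetworkReceiver : MonoBehaviour
{
    // Thread-safe queue: the networking thread enqueues, the main thread
    // dequeues, so game logic never touches Unity objects off-thread.
    private readonly ConcurrentQueue<NetMessage> inbox = new ConcurrentQueue<NetMessage>();

    // Called from the networking thread when a packet is decoded.
    public void OnMessageReceived(NetMessage msg) => inbox.Enqueue(msg);

    // Drained once per frame on the main thread.
    void Update()
    {
        while (inbox.TryDequeue(out var msg))
        {
            // Apply msg to the scene here, e.g. spawn an enemy at msg.position.
        }
    }
}
```

Draining the queue once per frame keeps all scene mutations on the game logic thread, which is the property whose absence caused the race-condition bugs described above.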
The electric arc ray also held its share of issues, mainly around making the arc behave realistically, chain between enemies, and spawn smaller "child arcs" from the main arc, all without causing too much of a performance hit.
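As an illustration of the kind of work involved, below is a minimal sketch of midpoint-displacement arc generation, a common technique for jagged lightning shapes. The component and field names are invented for the example, and this is not necessarily our exact implementation:

```csharp
using System.Collections.Generic;
using UnityEngine;

[RequireComponent(typeof(LineRenderer))]
public class ElectricArc : MonoBehaviour
{
    public int subdivisions = 5;  // each pass roughly doubles the segment count
    public float jitter = 0.5f;   // how far midpoints are displaced

    public void Build(Vector3 from, Vector3 to)
    {
        var points = new List<Vector3> { from, to };
        float offset = jitter;
        for (int pass = 0; pass < subdivisions; pass++)
        {
            // Insert a randomly displaced midpoint between every pair of points.
            for (int i = points.Count - 1; i > 0; i--)
            {
                Vector3 mid = (points[i - 1] + points[i]) * 0.5f;
                mid += Random.insideUnitSphere * offset;
                points.Insert(i, mid);
            }
            offset *= 0.5f; // finer subdivisions get smaller displacements
        }
        var line = GetComponent<LineRenderer>();
        line.positionCount = points.Count;
        line.SetPositions(points.ToArray());
    }
}
```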
Another, broader obstacle that remains in the game is that it is not currently playable in single-player form (not counting simply gawking at the graphics in the VR headset). At one point in the project, the group planned to have enemies spawn from set monster spawners that would generate monsters over time, as opposed to having the non-VR player actively summon them. The idea was that the game could then be played by either one or two players, but we never had enough time to implement this, as we prioritized fixing the features we already had.
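Had we implemented it, such a spawner could have been as simple as the following sketch, with the prefab reference and interval as placeholder values:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical timed spawner for the unimplemented single-player mode.
public class MonsterSpawner : MonoBehaviour
{
    public GameObject enemyPrefab; // assigned in the inspector
    public float interval = 8f;    // seconds between spawns

    // In Unity, an IEnumerator Start() runs as a coroutine.
    IEnumerator Start()
    {
        while (true)
        {
            yield return new WaitForSeconds(interval);
            Instantiate(enemyPrefab, transform.position, Quaternion.identity);
        }
    }
}
```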
Throughout the project, the centerpiece platform used for development was the Unity game engine, in which all the game logic, scripting, particle system creation, lighting and overall scenes were assembled. Unity was chosen because multiple group members had prior experience working with it, creating a solid foundation for moving directly on to the VR implementation and other more advanced topics. Besides Unity, Maya, ZBrush and Blender were used for modeling the enemies and the arena, and the texturing was done in Adobe Substance 3D Painter. These tools were likewise chosen based on the group members' previous experience. For the audio, Steam Audio was used for rendering, as it seemed to give the best results compared to other high-fidelity audio options and worked nicely with the HTC Vive VR headset used (which already runs on SteamVR).
Combining different pieces of work takes time, and many bugs surface in the process, so leave enough time for merging.
It pays to establish a clear structure in the project before the files start to get messy.
Think about performance while creating your objects, and do not add more detail than necessary.
The view through the VR headset is very different from the desktop view, so when doing graphics programming for VR it is important to also test on the headset itself.
Audio is additive, so great care must be taken to balance multiple simultaneous sound sources into a meaningful experience.
The ones who made the game
Sound & 2nd Player Experience Director
Implemented the game logic for the non-VR player and designed and implemented the network protocol connecting the two clients. Set up the project to use 3D sound and spatialization via the Steam Audio Unity plugin, and configured the plugin to create a smooth, high-fidelity auditory experience. Selected and tweaked individual sound effects as well as their 3D sound properties, set up the audio geometry of the scenes, and implemented the game logic for spawning sound effects.
AI and Enemy Gameplay Director & (sadly enough) Web Designer
Implemented and created the entire gameplay system for the enemies. Also implemented boid behaviour for the bats, extending their behaviour and optimizing the basic algorithm. It was also he who balanced the game, so blame him if it is too hard. Was also forced to make this website.
Spell Director (and forced Jacob to Web Design)
Responsible for the VR controller input used for the spells, as well as the spell logic (scripting for each spell) and the underlying player mana system. Created the particle effects and graphics for the lightning spell and some smaller script-based parts of the fireball graphics, such as its size depending on the level of charge. Also did much of the underlying game logic, such as pausing and restarting the game.
Enemy Designer & Tablet UI Designer & Model Rigging Director
Responsible for all enemy designs, including concepts, sketches, modeling and animation. Joined the VR player to their virtual in-game body and developed the rigging and animation functionality. Designed the tablet gameplay mechanics and user interface for the summoner player.
Artist & Level Director & Rendering Director
Designed and created the game level together with the game level objects. Used ZBrush to sculpt rock surfaces, Maya to model different objects, and Substance Painter to create high-quality textures. Also took the overall graphics to the next level by adding anti-aliasing and real-time shadows to the rendering.
Proof that we actually did this. It is not Photoshop.
Credits to the authors who created materials (e.g. images, sounds, prefabs) used in this project.
Fire texture used for the fire graphics - website 80 Level, author Nadir Khabibullin
Noise texture used for the fire graphics - author Yoeri Luos Vleer
Fire textures used for the first version of the fire sword - YouTuber Sirhaian'Arts
Death text font - website DaFont, author Jonathan Harris
Tool used to edit sounds - software audioalter, license: https://audioalter.com/licenses.txt
Sound effects for fireball, explosion and background - website Pixabay
Sound effects for enemies and player - website Mixkit
Background music for the making-of documentary - "They Said I Can't" - website fesliyanstudios, author David Fesliyan
Steam® Audio SDK - website steam-audio, Copyright 2017-2022, Valve Corp. All rights reserved
Player body model in VR - website Mixamo