← back to Main Page

GIKO: Get In - Keep Out

Overview

GIKO is an asymmetrical three-player game in which players take on the roles of an agent, a hacker, and a guard within the GIKO Company. The agent and hacker must work together to navigate the level, while the guard and the drones they control try to stop them.

The AGENT is tasked with infiltrating a restricted facility, communicating with the Hacker to solve tasks and find the “exit” of the level, all while avoiding the security drones roaming around.
The HACKER, hidden away in their secure office, has access to classified information, like a map showing the locations of objectives and security bots. Their job is to guide the Agent through the level via verbal communication, while also counteracting the Guard's security measures.
The GUARD needs to protect and defend the facility, stopping the Agent from completing their objectives. To do so, they have control over the “drones”, NPCs that provide the Guard with video footage of the premises, helping them track the Agent down. Additionally, they can influence the level by manipulating doors, cutting off the Agent's escape routes, and luring them into traps.

While it initially started as a university project, we have plans to pursue this game idea further in the future, expanding it into a fully fleshed-out game that will eventually be released. Since this involves major reworks of the prototype we have created so far, we decided to no longer have the demo freely available to play for the time being.


Technical Aspects

The primary goal of our project was to create a functional multiplayer game in Unreal, including the networking of multiple computers, lobby functionality, and a host-client setup using the Steam API.
For our secondary goal, we implemented a level generation system, utilising grid-based, procedural generation of rooms and walls. We further increased the randomness of our game with generated tasks for the agent and hacker to complete.

My responsibilities in this project were primarily the implementation of the NPCs and any aspects related to them. This included, for example, the setup of the dynamic NavMesh, enabling the drones to move around the level while reacting to the changing layout of locked and unlocked doors.
Additionally, the NPCs had to be controllable by the Guard player via a map UI. This map not only had to display the real-time location of all NPCs in the scene, it also had to be interactable, allowing the Guard to select any drone and give it instructions.
The available commands are separated into three NPC states: in the GUARD state, the player can click any spot on the map to specify a location for the bot to walk to and hold. Clicking the map again deletes the old location and replaces it with the new position. The PATROL state allows the Guard to define a patrol route, which the drone will walk along, by placing multiple position markers on the map. Finally, STANDBY can be used to delete all previous commands, causing the drone to freeze in place.

To be able to detect the Agent running around the level, I fitted the droids with both sight- and sound-based sensors. This way, they not only react to visual contact, but are also able to detect the player's footsteps, even when the Agent is out of sight. Additionally, I added different sound levels that are emitted when the player runs or sneaks around.
As soon as a bot detects the Agent, its current state is overwritten and replaced with a new CHASE state. For as long as the drone is in pursuit, all previous and new instructions are ignored, and the player is followed instead. Whenever the bot loses track of them, however, it will continue towards the Agent's last known location, determined either by sight or sound. Having reached this position and still being unable to locate them, the drone will return to its previous, controllable state after a few seconds.

Besides being able to see the position of the NPCs on the map, the bots' visual input is also streamed to a RenderTarget texture, so that it can be displayed on the screens in the Guard office, allowing the Guard player to see more of the level.


Screens in the Guard office, displaying the map on top, and the current view of the individual drones on the smaller screens below

Finally, while we did not make the 3D model and animations ourselves, I was still responsible for integrating our assets and making sure they work together and fit our game's aesthetic. This required wrangling the Unreal Animation Retargeter to correctly apply the animations to our droid model, as well as reworking the model's materials to allow for matching recoloration and displaying of our logo.


Different recoloration options
