
Anna Borja


First-Person Shooter Prototype

An FPS prototype featuring a variety of playable characters, weapons, and abilities. Built from scratch in Unreal Engine 5 using C++ and Blueprints along with:

  • Unreal Engine Gameplay Ability System (GAS)
  • Unreal Engine Common UI
  • Behavior trees for enemy and companion AI

Overview

Inspired by the story-driven campaigns of games like Titanfall 2 and Uncharted, the weapon systems of games like Halo and Call of Duty, and the companion mechanics of games like Mass Effect and Gears of War, I created this project to practice the design and implementation of various interconnecting game systems and mechanics.

Asset Credits

To speed up development and let myself focus on design and coding, I used premade assets from external sources.

Controls

Using Unreal Engine’s Enhanced Input system, the game’s controls are implemented for both gamepads and mouse/keyboard. In designing this control scheme, my goal was to support a wide variety of actions while keeping the controls as simple and intuitive as possible.

Gamepad Controls

A diagram of a video game control scheme overlaid on top of a photo of an Xbox controller

Mouse/Keyboard Controls

Input       Action
W/A/S/D     Walk/Run
Shift       Sprint
Mouse X/Y   Adjust Camera
X           Melee Attack
Spacebar    Jump
C           Crouch
E           Interact
R (tap)     Reload
R (hold)    Open Special Ability Wheel
Escape      Pause Game
RMB         Aim Down Sights
RMB + F     Aim Grenade
Command     Special Ability
LMB         Fire Weapon/Throw Grenade
1           Switch to Squad Member 1
2           Switch to Squad Member 2
3           Primary Weapon
4           Secondary Weapon
5 (tap)     Quick Info
5 (hold)    Switch Camera Mode

Camera

Although the game is designed to be an FPS, I wanted to practice working with different camera systems, so I implemented a camera that can switch between first-person and third-person (over-the-shoulder) modes.

To support multiple camera perspectives, I used separate first-person and third-person skeletal meshes for each playable character. I shared animations between these meshes by creating an IK Retargeter for each unique skeleton in Unreal Engine and retargeting animations between them.

A screenshot of an IK Retargeter in Unreal Engine showcasing two different skeletal meshes arranged in similar T-poses.

I used a variety of techniques inside an Animation Blueprint to implement smooth animation transitions for each skeleton, including animation state machines, blended poses, Blend Spaces, and Aim Offsets.

A screenshot of an Animation Blueprint inside Unreal Engine

Scripting

I built a system that allows for triggering scripted events within a level in a data-driven way.

NPC barks can be triggered, which utilize Dialogue Waves to play audio (in this prototype, placeholder audio is used in lieu of recorded dialogue). Barks can trigger other barks, allowing NPCs to respond to each other.

An animated GIF of video game dialogue subtitles
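The bark-chaining idea can be sketched outside Unreal as a small data-driven lookup. In this simplified model, the `Bark` struct and `PlayBarkChain` helper are illustrative stand-ins for the project's data assets and Dialogue Wave playback:

```cpp
#include <map>
#include <optional>
#include <queue>
#include <set>
#include <string>
#include <vector>

// Hypothetical bark record; in the real project this data would live in
// data assets referencing Dialogue Waves for audio playback.
struct Bark {
    std::string Speaker;
    std::string Subtitle;
    std::optional<std::string> FollowUpId; // bark triggered in response
};

// Plays a bark and any follow-up barks it triggers, letting NPCs respond
// to each other. A visited set guards against accidental bark cycles.
std::vector<std::string> PlayBarkChain(
    const std::map<std::string, Bark>& Table, const std::string& StartId) {
    std::vector<std::string> Played;
    std::queue<std::string> Pending;
    std::set<std::string> Visited;
    Pending.push(StartId);
    while (!Pending.empty()) {
        std::string Id = Pending.front();
        Pending.pop();
        if (!Visited.insert(Id).second) continue; // already played this swing
        auto It = Table.find(Id);
        if (It == Table.end()) continue;
        Played.push_back(It->second.Speaker + ": " + It->second.Subtitle);
        if (It->second.FollowUpId) Pending.push(*It->second.FollowUpId);
    }
    return Played;
}
```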

Scripted behaviors for NPCs can be triggered, such as gesture animations and targeted movement.

An animated GIF of a video game character turning and pointing in the distance
An animated GIF of a video game character nodding

Mission objectives and tutorials can be triggered and displayed in the HUD.

An animated GIF of a video game tutorial notification being shown and then hidden
An animated GIF of a video game mission objective notification being shown and then hidden

Temporary player view targets can be triggered, which prompt the player to press a button to pan the camera in a certain direction and gain visual context on their objective.

Companion AI

Companion logic is implemented in a Behavior Tree.

Companions can be targeted at a particular location, encouraging the player to follow them.

When not in combat and not targeted at a location, companions will generally follow the player.

Nav Link Proxies are used to bridge separated areas within the level’s Navigation Mesh. Companions will use these links to follow the player, performing relevant movements like opening doors and mantling onto objects along the way.

To avoid impeding the player’s movement, companions will move out of the way if the player bumps into them. This behavior uses Environment Query System (EQS) queries to find a reasonable nearby location for the NPC to go.
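A pure-math stand-in for this kind of EQS query might score a handful of candidate side-step points and pick the one farthest from the player; real EQS tests would also check navmesh reachability. All names and the 150-unit offset below are illustrative:

```cpp
#include <array>
#include <cmath>

struct Vec2 { float X, Y; };

// Scores four candidate side-step locations the way an EQS query might:
// points a fixed distance from the companion, preferring the one farthest
// from the player so the companion clears the player's path.
Vec2 PickSideStep(Vec2 Companion, Vec2 Player) {
    const std::array<Vec2, 4> Offsets = {{{1.f, 0.f}, {-1.f, 0.f},
                                          {0.f, 1.f}, {0.f, -1.f}}};
    Vec2 Best = Companion;
    float BestScore = -1.f;
    for (Vec2 O : Offsets) {
        Vec2 P{Companion.X + O.X * 150.f, Companion.Y + O.Y * 150.f};
        float dx = P.X - Player.X, dy = P.Y - Player.Y;
        float Score = std::sqrt(dx * dx + dy * dy); // farther is better
        if (Score > BestScore) { BestScore = Score; Best = P; }
    }
    return Best;
}
```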

Enemy AI

Enemy logic is implemented in a Behavior Tree.

Enemies not engaged in combat can be posted at a particular location or walk along a patrol route.

Through an AI Perception Component, enemies can detect when the player is nearby.

When the player first enters an enemy’s sight line, the enemy turns toward the player and plays an audio bark. If other enemies are nearby, they will also be alerted to the player’s presence (this is accomplished through an instance-synced Blackboard key that is shared across related enemies).
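The shared-alert mechanic can be sketched with a plain shared state object standing in for the instance-synced Blackboard key; the names here are illustrative:

```cpp
#include <memory>
#include <string>

// Stand-in for an instance-synced Blackboard key: one value shared by
// every enemy in a group, so a single sighting alerts the whole squad.
struct SharedAlertState {
    bool bPlayerSpotted = false;
};

struct Enemy {
    std::string Name;
    std::shared_ptr<SharedAlertState> GroupState; // shared across the squad

    bool IsAlerted() const { return GroupState->bPlayerSpotted; }
    void SpotPlayer() { GroupState->bPlayerSpotted = true; }
};
```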

Enemy combat behavior is implemented using Environment Query System (EQS) queries. Enemies attack the nearest player or companion while maintaining a minimum distance and preferring to attack from behind cover. Enemies will also switch between weapons depending on their distance from their target.

Gameplay Abilities

Most significant actions in the game are implemented using Unreal Engine’s Gameplay Ability System (GAS). Using GAS allowed me to not only tap into a robust system of activation, blocking, and cancellation capabilities, but to do so in a clean and organized way.

Sprinting

I integrated GAS cost and cooldown capabilities with sprinting to build a stamina regeneration mechanic. How much stamina a character has, how quickly it is depleted, and the rate at which it regenerates can all be configured through gameplay attributes.

An animated GIF of a video game stamina bar emptying and refilling
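An engine-free sketch of that mechanic: sprinting drains a stamina pool each frame, and it refills otherwise. The attribute names and rates below are illustrative stand-ins for the GAS gameplay attributes:

```cpp
#include <algorithm>

// Illustrative stand-ins for the gameplay attribute set.
struct StaminaAttributes {
    float MaxStamina  = 100.f;
    float DrainPerSec = 25.f; // sprint cost
    float RegenPerSec = 15.f;
};

struct StaminaState {
    float Current = 100.f;
};

// Per-frame update: sprinting drains stamina, otherwise it regenerates.
// Returns whether the character actually sprints this frame; an empty
// pool blocks sprinting, mirroring a GAS cost check failing.
bool TickStamina(StaminaState& S, const StaminaAttributes& A,
                 bool bWantsSprint, float DeltaSec) {
    bool bSprinting = bWantsSprint && S.Current > 0.f;
    float Rate = bSprinting ? -A.DrainPerSec : A.RegenPerSec;
    S.Current = std::clamp(S.Current + Rate * DeltaSec, 0.f, A.MaxStamina);
    return bSprinting;
}
```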

Character Switching

In this prototype, I wanted to be able to play with a variety of character types and weapons, so I implemented a system in which every companion is also a playable character.

When the player switches between characters, C++ code executes teardown logic on the departing character to prepare it for AI control (switching from a first-person mesh to a third-person mesh, starting Behavior Tree logic, etc.). Similarly, setup logic is executed on the incoming character to prep it for player control.
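The handoff can be sketched with plain flags standing in for the actual Unreal calls (possession, mesh switching, starting and stopping the Behavior Tree); all names are illustrative:

```cpp
#include <string>

// Flags stand in for the real teardown/setup work done in C++.
struct Character {
    std::string Name;
    bool bPlayerControlled    = false;
    bool bFirstPersonMesh     = false;
    bool bBehaviorTreeRunning = false;
};

void PrepareForAI(Character& C) {      // teardown on the departing character
    C.bPlayerControlled = false;
    C.bFirstPersonMesh = false;        // swap to the third-person mesh
    C.bBehaviorTreeRunning = true;     // hand control to the Behavior Tree
}

void PrepareForPlayer(Character& C) {  // setup on the incoming character
    C.bBehaviorTreeRunning = false;
    C.bFirstPersonMesh = true;
    C.bPlayerControlled = true;
}

void SwitchCharacters(Character& From, Character& To) {
    PrepareForAI(From);
    PrepareForPlayer(To);
}
```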

Melee Attacks

During a melee attack, an Animation Montage is played, which triggers a Notify State that creates and then destroys a hitbox. Hostile characters that overlap with this hitbox play a hit reaction animation and are dealt damage.
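The notify-driven flow might look like the following engine-free sketch, where `NotifyBegin`/`NotifyEnd` stand in for the Notify State's begin and end events, and each hostile can only be hit once per swing:

```cpp
#include <set>
#include <string>

// Sketch of the melee hitbox lifecycle: the montage's Notify State opens
// the window, overlaps during the window register one hit per target, and
// the notify's end closes (destroys) the hitbox.
struct MeleeHitbox {
    bool bActive = false;
    std::set<std::string> AlreadyHit; // one hit reaction per swing per target

    void NotifyBegin() { bActive = true; AlreadyHit.clear(); }
    void NotifyEnd()   { bActive = false; }

    // Returns true if this overlap should deal damage and play a hit reaction.
    bool OnOverlap(const std::string& Target) {
        if (!bActive || AlreadyHit.count(Target)) return false;
        AlreadyHit.insert(Target);
        return true;
    }
};
```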

Since shooting is the main form of combat in this prototype, I kept the melee system simple. In a more complex melee combat system, unique attacks could be designed to use hitboxes of different sizes, shapes, and placements. Systems like soft locking could also be implemented using motion warping to aid the player’s accuracy.

Special Abilities

Although they aren’t fully fleshed out yet, I implemented a few different special abilities as a proof of concept. Special abilities can add variety and depth to combat by giving the player options other than just shooting.

Shooting

The three types of guns in this prototype—pistols, assault rifles, and shotguns—are implemented as hitscan weapons.

Shooting is implemented as a GAS gameplay ability. The ability code performs various line traces and calculations to execute each shot along with triggering relevant animations, SFX, VFX, hit reactions, and damage gameplay effects. It utilizes GAS costs to consume weapon ammo and GAS cooldowns to enforce a maximum rate of fire.
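The cost and cooldown checks can be modeled in isolation. In this simplified sketch (illustrative names and values), a shot activates only when ammo is available and the rate-of-fire window has elapsed:

```cpp
// Stand-in for the shooting ability's GAS cost (ammo) and cooldown
// (rate of fire); the field names and values are illustrative.
struct WeaponState {
    int   Ammo = 8;
    float CooldownSec = 0.2f;  // minimum time between shots
    float LastShotTime = -1.f; // negative means "never fired"
};

bool TryFire(WeaponState& W, float NowSec) {
    if (W.Ammo <= 0) return false;                 // cost check fails
    if (W.LastShotTime >= 0.f &&
        NowSec - W.LastShotTime < W.CooldownSec)
        return false;                              // still on cooldown
    W.Ammo -= 1;                                   // commit the cost
    W.LastShotTime = NowSec;                       // start the cooldown
    // ...line traces, SFX/VFX, hit reactions, and damage effects go here
    return true;
}
```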

Spread

To implement spread, a bullet’s trajectory is randomized within a cone originating from the gun’s muzzle. The cone’s half-angle could be calculated using gameplay attributes and factors like whether the character is running or standing still.

Weapons like shotguns can be configured to fire multiple projectiles at once.

A screenshot of a video game shotgun with debugging markings illustrating the trajectory of the gun's projectiles
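The cone randomization can be sketched as pure math: deflect the muzzle-forward direction (assumed +X here for simplicity) by a random angle up to the cone's half-angle, about a random axis around forward:

```cpp
#include <array>
#include <cmath>
#include <random>

// Returns a unit direction within a cone of the given half-angle around +X.
// sqrt(U) makes hits uniform over the cone's cross-section rather than
// clustering at the center.
std::array<float, 3> RandomSpreadDirection(float HalfAngleRad,
                                           std::mt19937& Rng) {
    std::uniform_real_distribution<float> U(0.f, 1.f);
    float Theta = HalfAngleRad * std::sqrt(U(Rng)); // deflection from forward
    float Phi = 6.2831853f * U(Rng);                // rotation around forward
    return { std::cos(Theta),
             std::sin(Theta) * std::cos(Phi),
             std::sin(Theta) * std::sin(Phi) };
}
```

A shotgun would call this once per pellet, giving each projectile its own trajectory within the cone.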

Recoil

After firing a weapon, viewkick is temporarily applied to the player’s view to simulate recoil. The direction, severity, and rate of recovery (centerspeed) of the viewkick could be configured for each unique weapon, allowing for a variety of designs.
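A minimal model of that recovery, with pitch only and illustrative per-weapon values:

```cpp
#include <algorithm>
#include <cmath>

// Sketch of viewkick: each shot adds a kick offset, and the view recovers
// toward center at a fixed rate ("centerspeed"). Values are illustrative
// stand-ins for per-weapon configuration.
struct Viewkick {
    float Pitch = 0.f;        // current kick offset in degrees
    float Centerspeed = 10.f; // degrees recovered per second

    void AddKick(float KickPitch) { Pitch += KickPitch; }

    void Tick(float DeltaSec) {
        Pitch = std::max(0.f, Pitch - Centerspeed * DeltaSec);
    }
};
```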

UI

The HUD displays key pieces of information related to the player’s vital stats, squad, weapons, and special abilities.

A screenshot of a video game HUD (heads-up display) UI

Thanks to Unreal Engine’s Common UI plugin, the game’s UI can dynamically display gamepad and mouse/keyboard icons depending on what type of controller the player is using.

The game’s main menu supports three levels of tabbing. It utilizes Common UI widgets like tab lists, carousels, and activatable widget switchers.

The HUD can display temporary UI components like notifications and info displays. These components use C++ timers and queues to display information at the right time.
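That timer-and-queue approach might look like this engine-free sketch, where notifications display one at a time before yielding to the next queued one (the class name and 3-second duration are illustrative):

```cpp
#include <optional>
#include <queue>
#include <string>

// Queues HUD notifications so they display one at a time, each for a
// fixed duration, mirroring the C++ timers and queues described above.
class NotificationQueue {
public:
    void Enqueue(std::string Message) { Pending.push(std::move(Message)); }

    // Called each frame; returns the notification currently on screen,
    // or nothing if the queue is exhausted.
    std::optional<std::string> Tick(float DeltaSec) {
        if (Current) {
            TimeLeft -= DeltaSec;
            if (TimeLeft <= 0.f) Current.reset(); // current display expired
        }
        if (!Current && !Pending.empty()) {       // promote the next one
            Current = Pending.front();
            Pending.pop();
            TimeLeft = DisplaySec;
        }
        return Current;
    }

private:
    std::queue<std::string> Pending;
    std::optional<std::string> Current;
    float TimeLeft = 0.f;
    const float DisplaySec = 3.f; // illustrative display duration
};
```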