Custom AI Perception System - Unreal C++
For the main AI in CH, I use a state machine along with perception functions that run on timers to check whether the patrolling AI can see or hear the player. If the AI perceives the player, its state changes.
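As a minimal sketch of that timer-driven loop (the state names and callback hooks here are illustrative, not the project's actual classes): each timer tick runs the perception checks, and a positive result moves the state machine out of patrol.

```cpp
#include <functional>

// Illustrative AI states; the real project's state set may differ.
enum class EAIState { Patrolling, Investigating, Chasing };

// Minimal sketch of the timer-driven perception loop. In Unreal this
// tick would be scheduled with FTimerManager; the callbacks stand in
// for the sight and hearing checks described in the text.
struct FPatrolAI {
    EAIState State = EAIState::Patrolling;
    std::function<bool()> CanSeePlayer; // hypothetical sight check
    std::function<bool()> HeardNoise;   // hypothetical hearing check

    // Called on a repeating timer (e.g. every 0.2 s).
    void PerceptionTick() {
        if (CanSeePlayer && CanSeePlayer()) {
            State = EAIState::Chasing;           // direct sight: chase
        } else if (HeardNoise && HeardNoise()) {
            if (State == EAIState::Patrolling)
                State = EAIState::Investigating; // noise: go investigate
        }
    }
};
```

Keeping the perception checks on a timer rather than in Tick means their frequency can be tuned independently of frame rate.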
See:
For the see function, three line traces run from the AI’s eye to the player’s camera and both hands. To save performance, I only take these traces into account if the dot product of the AI eye’s forward vector and the (PlayerLocation - AILocation) vector is over a certain threshold, meaning the player is inside the AI’s view cone. When that is the case, the AI perceives the player faster or slower depending on how close the player is to the center of the view cone and how many of the eye traces hit.
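The cone check and the perception-rate scaling can be sketched in plain C++ like this. The vector struct stands in for Unreal's FVector, and the scaling factors are my own illustrative choices, not the game's real tuning:

```cpp
#include <cmath>

// Minimal 3D vector stand-in for Unreal's FVector.
struct Vec3 { double X, Y, Z; };

double Dot(const Vec3& A, const Vec3& B) {
    return A.X * B.X + A.Y * B.Y + A.Z * B.Z;
}

Vec3 Normalized(const Vec3& V) {
    double Len = std::sqrt(Dot(V, V));
    return {V.X / Len, V.Y / Len, V.Z / Len};
}

// The player is inside the AI's view cone when the dot product of the
// eye's forward vector and the normalized AI-to-player vector exceeds
// the threshold (the cosine of the cone's half-angle).
bool IsInViewCone(const Vec3& AILocation, const Vec3& EyeForward,
                  const Vec3& PlayerLocation, double CosHalfAngle) {
    Vec3 ToPlayer = Normalized({PlayerLocation.X - AILocation.X,
                                PlayerLocation.Y - AILocation.Y,
                                PlayerLocation.Z - AILocation.Z});
    return Dot(Normalized(EyeForward), ToPlayer) > CosHalfAngle;
}

// Perception gain per check: larger when the player sits near the cone's
// center (dot close to 1) and when more of the three traces
// (eye -> camera, eye -> each hand) hit the player unobstructed.
double PerceptionGain(double DotToPlayer, int TracesHit,
                      double CosHalfAngle) {
    if (DotToPlayer <= CosHalfAngle || TracesHit == 0) return 0.0;
    double Centering = (DotToPlayer - CosHalfAngle) / (1.0 - CosHalfAngle);
    return Centering * (static_cast<double>(TracesHit) / 3.0);
}
```

Because the dot product is cheap compared to three line traces, running the cone check first filters out most frames where the player is behind or beside the AI.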
Hear:
In CH, the player has three ways to make noise that the AI can hear: sprinting, slamming doors, and breaking bottles.
When a noise is made, I get the navmesh path from the noise location to the AI’s location. The AI then moves to a random location inside a sphere whose radius is based on the path length and the number of path points: the farther away the noise, the larger the radius, so distant noises are investigated less precisely.
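A sketch of that radius mapping and the random-point pick, under assumed coefficients (the base radius and scaling factors below are placeholders, not CH's actual values; in Unreal the final point would additionally be projected onto the navmesh, e.g. via UNavigationSystemV1::GetRandomReachablePointInRadius):

```cpp
#include <random>

// Illustrative mapping from nav-path length and point count to a search
// radius. All three constants are assumptions for the sketch.
double InvestigateRadius(double PathLength, int NumPathPoints) {
    const double BaseRadius   = 200.0;  // assumed minimum radius (cm)
    const double LengthFactor = 0.25;   // assumed growth per unit of path
    const double PointFactor  = 50.0;   // assumed growth per path point
    return BaseRadius + LengthFactor * PathLength
                      + PointFactor * NumPathPoints;
}

struct Point3 { double X, Y, Z; };

// Uniform random point inside a sphere around the noise location,
// using rejection sampling to keep the distribution uniform.
Point3 RandomPointInSphere(const Point3& Center, double Radius,
                           std::mt19937& Rng) {
    std::uniform_real_distribution<double> Dist(-Radius, Radius);
    while (true) {
        double X = Dist(Rng), Y = Dist(Rng), Z = Dist(Rng);
        if (X * X + Y * Y + Z * Z <= Radius * Radius)
            return {Center.X + X, Center.Y + Y, Center.Z + Z};
    }
}
```

Basing the radius on the navmesh path rather than straight-line distance means a noise that is close as the crow flies but far by walkable route still reads as "far" to the AI.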