Police recruits must train to respond to a variety of high-risk situations, including the Immediate Rapid Deployment (IRD) tactics used to respond to an active deadly threat, such as an active shooter. This VR simulation supplements other IRD training by immersing recruits in a complex virtual environment to apply these skills without the need for actors, facilities, and equipment required for their non-VR simulation-based training. After an onboarding component to become familiar with the VR system, recruits move through the scenario environment and interact with non-playable characters to find and stop the deadly threat. An instructor can see what the recruit sees as they go through the simulation and then debrief the recruit on their performance to maximize learning.
AUTHORING TOOL(S) USED
WHY WAS THIS PROJECT NEEDED?
This VR training supplements recruits' in-person training, letting them build on the skills they have already learned and practice in a complex virtual environment under high-stress but low-risk conditions. This IRD VR simulation has several benefits, including:
Supplementary training with fewer personnel, equipment, and facility requirements:
Compared to in-person IRD scenarios, which require actors, equipment (e.g., simulation ammunition), and a large multi-room building where training can be conducted without disturbing others (e.g., an empty school building), this virtual reality simulation lets instructors conduct training with fewer logistical hurdles. This supplementary training tool will help further hone recruits' stress-management and critical decision-making skills.
Scalability:
There are two parts in the current version: a Practice Session and the IRD office scenario. To meet specific training needs, new characters, threats, and components can be added to the existing scenario, and new scenarios can be developed in the future.
Easy for instructors to monitor recruits’ performance and helpful in the debrief sessions:
Following the scenario, there is an in-person debrief between the recruit and the instructor; this is typically where the teaching takes place. During the simulation, what recruits see and do can be cast to a TV or computer monitor, so the instructor can watch from the recruit's perspective and observe what they experience in the VR environment. In future development, it may become possible to record the experience and play it back while the instructor and recruit debrief.
HOW DOES THIS DEMONSTRATE INNOVATION?
Conducting Immediate Rapid Deployment training in 3D virtual reality is itself an innovation, as this simulation-based training is traditionally done with live actors in a large multi-room building. While other simulators exist, this VR approach places recruits in an environment with autonomy, requiring them to make all of the decisions about how to move through that environment based on the observations and information they can obtain.
To create a simulation that would meet training needs, the development team had to build several innovative features, such as:
- Locomotion mechanics for both short-distance and long-distance travel
- Spatialized audio, with complementary stereo sound effects
- Firearm operation: aiming, shooting, and reloading mechanics
- Interacting with NPCs, using speech recognition technology supported by Microsoft Azure
- Intelligent shooter NPCs
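The speech-driven NPC interaction above can be illustrated with a minimal sketch. This is plain Python rather than the project's actual engine code, and the intent keywords and reactions are purely hypothetical: the idea is that an utterance transcribed by a speech service (such as Microsoft Azure) is matched against simple intents that select an NPC reaction.

```python
# Hypothetical sketch: mapping recognized speech to NPC reactions.
# Keywords and reactions are illustrative, not the project's actual data.
INTENTS = {
    "surrender": (["hands up", "drop the weapon", "get down"], "complies"),
    "question":  (["where", "did you see", "how many"], "gives information"),
    "reassure":  (["you're safe", "stay here", "it's okay"], "calms down"),
}

def npc_react(utterance: str) -> str:
    """Return an NPC reaction for a transcribed utterance."""
    text = utterance.lower()
    for keywords, reaction in INTENTS.values():
        if any(kw in text for kw in keywords):
            return reaction
    return "no reaction"  # unrecognized commands fall through

print(npc_react("Police! Hands up!"))  # -> complies
```

In the real simulation, the transcription step would come from the Azure speech-recognition service, and the reaction would drive an NPC animation rather than return a string.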
PROBLEMS OR CHALLENGES FACED?
The team encountered several challenges and problems throughout the process; the major ones are described below.
Design challenge 1: Locomotion
Locomotion has been a constant design challenge during this project. It is an industry-wide issue: designers try to create the most immersive experience possible while contending with small play areas, motion sickness, and other VR-specific problems. For this simulation's specific user group, police recruits, we found that recruits tend to run or sprint during IRD training, so we also had to weigh this habit against what a VR simulation can afford.
Based on the findings from user testing and suggestions from subject matter experts, the solution combines room-scale movement with always-forward smooth locomotion. This lets recruits turn, walk, crouch, hug walls, and generally move tactically using their own bodies, as they are trained to do in real life.
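As a rough illustration of this hybrid scheme (plain Python vector math, not the project's engine code, with all names and values invented for the example): the player's virtual position combines a moving virtual origin, advanced by smooth always-forward locomotion, with the headset's tracked offset inside the physical play area, which gives the room-scale movement.

```python
import math

# Sketch of room-scale + always-forward smooth locomotion.
# All names and values are illustrative, not taken from the project.
class HybridLocomotion:
    def __init__(self, speed=2.0):
        self.origin = [0.0, 0.0]  # virtual play-area origin (x, z) in metres
        self.speed = speed        # smooth-locomotion speed in m/s

    def update(self, yaw_rad, stick_forward, dt, head_offset):
        """Advance the virtual origin, then add the tracked headset offset.

        yaw_rad:       headset yaw; smooth motion always follows the gaze
        stick_forward: thumbstick deflection in [0, 1] (forward only)
        dt:            frame time in seconds
        head_offset:   (x, z) headset position within the physical room
        """
        step = self.speed * max(0.0, stick_forward) * dt  # never backward
        self.origin[0] += step * math.sin(yaw_rad)
        self.origin[1] += step * math.cos(yaw_rad)
        # Room-scale: physical movement maps 1:1 onto virtual movement.
        return (self.origin[0] + head_offset[0],
                self.origin[1] + head_offset[1])

loco = HybridLocomotion()
# Facing +z, half a second at full stick: origin advances 1 m,
# plus whatever the recruit has physically walked inside the play area.
pos = loco.update(yaw_rad=0.0, stick_forward=1.0, dt=0.5, head_offset=(0.3, 0.1))
```

Clamping the stick input to forward-only reflects the "always-forward" constraint described above, while the added head offset preserves crouching, wall-hugging, and other physical movement.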
Design challenge 2: Animation
Feedback from user testing showed the importance of a realistic environment and characters for an immersive simulation. The team had to create a high-fidelity environment, but making the animations of non-playable characters (NPCs) look natural was especially challenging. The development team therefore explored Unity, Maya (3D modeling software), and motion-capture techniques, ultimately using Maya HumanIK to rebuild the animations. Additionally, because specific characters have unique interactions, new animations were added to existing NPCs.
Collaboration challenge: Trailer
VR shooting simulations for police can be misconstrued by the media or the general public. This project, built with an outside team, required balancing that team's legitimate interest in showcasing its work to external audiences with JIBC's interest in limiting reputational risk. Discussions among the key stakeholders helped identify what content to present and what to focus on so that everyone's needs were met.
LESSONS LEARNED
Design for the audience and onboard them:
As the vast majority of recruits have no prior experience with VR equipment, it is crucial to onboard them with clear, effective instructions on performing actions and navigating the VR environment. In our experience, the best practice is to provide a short onboarding exercise for each action before recruits must perform it in the scenario. We designed two sessions in the simulation: a Practice Session, where users learn and practice actions such as using the controllers, navigating, and using voice recognition, and a scenario session (active shooter in an office building), where users run the simulation.
Have well-designed user tests at early stages:
User testing directly evaluates whether features and functions work as intended for the target audience and whether they meet the assumed goals and requirements. The development team on this project managed user testing well and listened to feedback: they set clear plans and goals before each test, studied testers' feedback, and discussed improvements and next steps. Because user tests were carefully planned at early development stages, we had more time to resolve issues and try innovative solutions.
Join Wenyi virtually for DEMOFEST Day 1 on Tuesday, October 11, 2022 from 10:00am to 11:30am in Teams.
Or you can visit Wenyi's table for DEMOFEST Day 2 on Wednesday, October 12, 2022 from 10:30am to 12:00pm at JIBC's New Westminster Main Campus.