MetaVR Visuals in UAS / RPA Simulation
For nearly two decades MetaVR visuals have been used in unmanned aerial system (UAS) training simulators, primarily through the Multiple Unified Simulation Environment / Air Force Synthetic Environment for Reconnaissance and Surveillance (MUSE/AFSERS) simulation system, where Virtual Reality Scene Generator™ (VRSG™) provides simulated video feeds for various intelligence gathering platforms.
Developed by the Joint Technology Center/Systems Integration Laboratory (JSIL), MUSE/AFSERS is the primary UAS training and simulation system used in the Department of Defense for command- and staff-level joint services training, and it accounts for the largest number of fielded UAS simulation systems in the U.S. MetaVR visuals have been part of MUSE/AFSERS since 2002.
As a result, MetaVR has become one of the largest suppliers of commercially licensed 3D visualization software for UAS simulation training in the U.S. military, with nearly 2,500 active VRSG licenses in the field.
Most recently, the U.S. Navy at Naval Air Station Patuxent River, MD, (NAS PAX River) built and began operating multiple portable ship-based MQ-8B/C Fire Scout UAV simulators using the MetaVR Virtual Reality Scene Generator (VRSG) with 3D ocean states. The unmanned rotary-wing Fire Scout is designed to provide reconnaissance, situational awareness, and precision targeting support for the Navy's littoral combat ships (LCS).
The simulators, primarily used for ship-based operations, serve both to train and to maintain the proficiency of Fire Scout air vehicle operators (who fly the aircraft) and mission payload officers (who control the sensor payload). VRSG's out-the-window view is first processed by the Navy's sensor simulation and then used to provide both the electro-optical (EO) and infrared (IR) sensor modes on the training systems. The FLIR EO/IR simulation enables operators to classify MetaVR's detailed 3D ship models from a great distance, with high-magnification narrow fields-of-view (FOVs), and under varying environmental conditions. In the Navy's 4-channel training system, VRSG also renders a third-person (stealth) instructor view for situational awareness and a shipboard camera view attached to the automatic approach system.
U.S. Army's UAS simulators
The U.S. Army uses VRSG in its Shadow Crew Trainer, and in Grey Eagle, Aerosonde, and Hunter trainers in portable, classroom, and embedded configurations. VRSG is embedded in the Army's Universal Ground Control Station (UGCS) Embedded Trainer for training operators of the Shadow, Grey Eagle, and Hunter UASs. VRSG is also used in high-fidelity classroom Universal Mission Simulators (UMS) and portable Institutional Mission Simulators. Each simulation setup replicates either a portable classroom or a full GCS shelter, in one-seat, two-seat, and three-seat configurations.
A key feature of VRSG is its ability to stream real-time HD-quality simulated video with KLV metadata using the H.264 standard, producing a feed indiscernible in composition from an actual UAV video feed. This means that in the UGCS, when UAS operators/trainees are not flying an actual UAS, they can fly a simulated one using the same hardware they use to operate the real system, together with the JTC/SIL MUSE air vehicle and data link simulation software and VRSG.
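The KLV (key-length-value) metadata carried in such feeds follows the MISB standards used for real UAV video. As an illustrative sketch only (not MetaVR's implementation), the following Python builds a minimal MISB ST 0601-style local set packet carrying a timestamp and sensor latitude/longitude; the standard's mandatory checksum tag is omitted here for brevity.

```python
import struct

# MISB ST 0601 UAS Datalink Local Set universal key (16 bytes)
UL_KEY = bytes.fromhex("060E2B34020B01010E01030101000000")

def _ber_len(n: int) -> bytes:
    # BER length encoding: short form for lengths < 128, long form otherwise
    if n < 128:
        return bytes([n])
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(body)]) + body

def klv_packet(lat_deg: float, lon_deg: float, timestamp_us: int) -> bytes:
    """Pack a minimal ST 0601-style local set: timestamp plus sensor
    latitude/longitude. (The mandatory checksum, tag 1, is omitted
    for brevity; a compliant encoder must append it.)"""
    items = b""
    # Tag 2: precision timestamp, microseconds since UNIX epoch, uint64
    items += bytes([2, 8]) + struct.pack(">Q", timestamp_us)
    # Tag 13: sensor latitude, int32 scaled across +/-90 degrees
    items += bytes([13, 4]) + struct.pack(">i", round(lat_deg / 90.0 * 0x7FFFFFFF))
    # Tag 14: sensor longitude, int32 scaled across +/-180 degrees
    items += bytes([14, 4]) + struct.pack(">i", round(lon_deg / 180.0 * 0x7FFFFFFF))
    return UL_KEY + _ber_len(len(items)) + items
```

In a real feed this packet would be multiplexed into the H.264 transport stream as a synchronous or asynchronous metadata elementary stream, so a downstream exploitation system can georeference each video frame.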
With MetaVR visuals used for simulated UAV camera payload video in ground control stations and in manned aircraft simulators, UAV operators, pilots, and JTAC trainees can achieve fully correlated HD H.264 simulated sensor video with accurate KLV metadata that replicates the actual sensor payload imagery of ISR assets during manned-unmanned teaming (MUM-T) and other distributed training exercises.
One of the many development efforts at JSIL that involve the use of VRSG is the MALET-JSIL Aircrew Trainer, or MJAT, a plug-and-play training capability that converts a current tactical MQ-9 Reaper ground control station (GCS) into a training simulator.
The MJAT provides RPA operators the ability to conduct simulation training as part of their qualification and follow-on continuation training to maintain proficiency and currency in all required operator tasks. Like other JSIL-developed UAS/RPA trainers, the embedded MJAT uses the Air Force Synthetic Environment for Reconnaissance and Surveillance (AFSERS) software to stimulate the tactical Vehicle Control Software (VCS) to simulate GCS functions: air vehicle control, payload control, weapons control, communications, video transmission and reception, and mission planning.
From 2016 to 2018, 56 MJATs were installed at USAF sites across the U.S. that house the MQ-9 Reaper aircraft. As part of the continued rollout of the MJAT simulator, JSIL purchased an additional 86 VRSG licenses in 2019.
VRSG's ability to simulate the camera payload by streaming real-time HD-quality H.264 video with KLV metadata means that when pilots and sensor operators are not flying an actual MQ-9, they can train by flying the simulated RPA using the same hardware that they use to operate the actual aircraft, stimulate real ISR systems, and interoperate with JTAC training simulators.
UAS/RPA simulators that use MetaVR visuals are interoperable with JTAC simulators that also use MetaVR visuals, such as the Air National Guard Advanced JTAC Training System (AAJTS) and the Joint Terminal Control Training and Rehearsal System (JTC TRS). All MJAT simulators also use Battlespace Simulations' (BSI's) Modern Air Combat Environment (MACE) for scenario creation and computer-generated/semi-automated forces (CGF/SAF). This common baseline of VRSG and MACE provides enhanced interoperability and correlation between the MJAT and the large number of deployed VRSG/MACE-based JTAC simulators. This interoperability is critical in the simulated training environment, in order to emulate the interaction and collaboration between Reaper sensor operators and JTACs during real-world missions.
VRSG is used with BSI's MACE in the US Battlefield Information Collection and Exploitation Systems (US BICES) Coalition Integration Lab (CIL) at Joint Base Langley-Eustis, VA. US BICES is part of the larger BICES organization, a NATO intelligence-gathering system for integrating current and future intelligence networks. BICES gathers and merges data from each NATO member nation's intelligence system to provide U.S. forces, NATO personnel, and other allied military organizations with near-real-time correlated, situational, and order-of-battle information.
In a simulated environment, VRSG provides the simulated UAS view used in preparedness training of traditional ISR activities such as collecting intelligence and sharing it among Allied nations for processing, exploitation, and dissemination.
VRSG is also used in testing network interoperability of functions like streaming full-motion video for periodic, multi-nation exercises and demonstrations.
One aspect of training UAS operators entails interacting with JTACs in joint mission training. Training together in a networked synthetic environment, the UAS operator and the JTAC on the ground work together to identify the same target in a scene.
The Air Force Research Laboratory's (AFRL) Warfighter Readiness Research Division of the 711th Human Performance Wing at Wright-Patterson Air Force Base has long used VRSG in various training research simulators. For example, in 2007 AFRL was an early adopter of VRSG as the image generator (IG) for a JTAC training dome (a precursor to the current AAJTS and JTC TRS dome systems). Close air support (CAS) training missions run in its three JTAC dome systems now include an AFRL-developed Predator training-research simulator, called the Predator Research Integrated Network Combat Environment (PRINCE). PRINCE is a high-fidelity, networkable MQ-1 and MQ-9 Remotely Piloted Aircraft (RPA) simulator, built four years ago, which serves as an R&D tactical simulator for Predator pilots and sensor operators. As part of AFRL's research into human performance methods and technologies that give the warfighter the knowledge and skill to dominate the operational environment, the objective of the PRINCE research program is to address known training gaps in UAV tactical operations, such as JTAC integration.
VRSG is currently integrated as the simulator's IG for cross-training with the JTAC dome systems. PRINCE team members use VRSG Scenario Editor to populate the 3D terrain with culture (such as building up dense urban areas) and for creating pattern-of-life scenarios. As an example, the PRINCE can take part in a CAS training mission that includes a convoy overwatch with enemy targets in the area; VRSG generates the simulated MQ-1 camera feed that appears on both the ground control station monitor and a ROVER device used by the JTAC.
UAS classroom, portable, and embedded visual system trainers
The UMS enables UAS operators to conduct training as part of their qualification training and follow-on continuation training to maintain proficiency and currency in all required operator tasks. The simulator uses MUSE with VRSG to stimulate the tactical Vehicle Control Software (VCS) to simulate the following UGCS functions: air vehicle control, payload control, weapons control, communications, video transmission and reception, and mission planning. The simulator incorporates multifunctional software approaches to provide UAS operators with a high-fidelity training experience for individual, crew, and collective training. In a classroom setting, such as at the Army's UAS Training Center at Fort Huachuca, AZ, the simulators are full-size mockups of the actual ground control stations from which UAVs are operated in the field.
VRSG is embedded directly in the UGCS, enabling a smooth transition to training in a fielded shelter, which soldiers can use to maintain flight-time requirements and currency. The UGCS is a NATO STANAG 4586-compliant command-and-control platform that incorporates the Army's Tactical Common Data Link (TCDL) for robust bandwidth and data security, and is designed to command and control multiple joint services UASs simultaneously. The TCDL, which sends secure data and streaming video from reconnaissance airborne platforms to ground stations, transmits radar, imagery, video, and other sensor information.
Since 2002, the U.S. Army National Guard has purchased VRSG licenses for ongoing fielding in its embedded Shadow Crew Trainer (SCT). These licenses support embedded trainers for the Shadow TUAS, Aerosonde, Hunter, and Grey Eagle UASs, which are used by both Army and Army National Guard units. The SCT is a mission-level training device housed in a fully enclosed mobile classroom that enables users to train on their specific roles, as well as on team-level communication and mission rehearsal. The SCT can train up to five students simultaneously in an integrated mode. Simple graphical user interfaces mimic the actual equipment. Trainees can log SCT hours as flight hours toward their overall requirements for Shadow UAS training. Each TUAS system comprises three air vehicles, two ground control stations, two ground data terminals, a launcher, a tactical automatic landing system, and an aerial vehicle transport. The GCS is a critical component of the TUAS system. In normal operation, the GCS is used to control the flight of the UAS and receive its telemetry. When the system operators are not flying the actual UAS, they can fly a simulated UAS on the same hardware, using the JTC/SIL MUSE air vehicle and data link simulation software and MetaVR VRSG.
VRSG can be configured to simulate a UAS in a variety of ways, ranging from using VRSG’s internal camera payload model in which the telemetry of the simulated UAV is provided by a DIS entity, to fully integrated applications such as the MUSE UAV tactical trainer.
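In the DIS-entity configuration, the simulated UAV's telemetry reaches VRSG as Distributed Interactive Simulation (DIS) Entity State PDUs published on the network. As a hedged sketch (field values are illustrative and not taken from VRSG or MUSE), the following Python packs a minimal IEEE 1278.1 Entity State PDU with a geocentric position, with the remaining fields zeroed.

```python
import struct

def entity_state_pdu(exercise_id, site, app, entity, ecef_xyz, timestamp=0):
    """Build a minimal 144-byte DIS v6 Entity State PDU: header, entity ID,
    entity type, and geocentric (ECEF) position; other fields zeroed."""
    header = struct.pack(">BBBBIHH",
        6,             # protocol version: DIS v6
        exercise_id,   # exercise identifier
        1,             # PDU type: Entity State
        1,             # protocol family: Entity Information/Interaction
        timestamp,     # DIS timestamp
        144,           # PDU length in bytes
        0)             # padding
    entity_id = struct.pack(">HHH", site, app, entity)
    force_and_count = struct.pack(">BB", 1, 0)  # friendly force, 0 articulation params
    # Entity type: kind=1 (platform), domain=2 (air), country=225 (US);
    # category/subcategory left zero here for illustration
    entity_type = struct.pack(">BBHBBBB", 1, 2, 225, 0, 0, 0, 0)
    alt_type = bytes(8)                          # alternative entity type, zeroed
    velocity = struct.pack(">fff", 0.0, 0.0, 0.0)
    location = struct.pack(">ddd", *ecef_xyz)    # geocentric metres, float64
    orientation = struct.pack(">fff", 0.0, 0.0, 0.0)  # Euler angles, radians
    appearance = struct.pack(">I", 0)
    dead_reckoning = bytes(40)                   # DR algorithm + params, zeroed
    marking = bytes(12)                          # character set + 11-char marking
    capabilities = struct.pack(">I", 0)
    return (header + entity_id + force_and_count + entity_type + alt_type +
            velocity + location + orientation + appearance +
            dead_reckoning + marking + capabilities)
```

In practice such PDUs are broadcast over UDP (port 3000 is a common DIS convention), and the image generator attaches its camera payload model to the entity whose state updates it receives.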
Simulation features include: