   More About MetaVR VRSG's Features

MetaVR™ Virtual Reality Scene Generator™ (VRSG™) core features include:

Light points

VRSG supports full-featured light points. Light point processing runs entirely in vertex shader programs downloaded to the graphics chipset, affording exceptional performance: you can expect as many as 20,000 omni-directional light points or 13,000 directional light points per frame at 60 Hz. VRSG light points were developed with input from subject matter experts, such as commercial and military pilots.

Light point features include:

  • Per-vertex color and intensity with no performance degradation for varying color and intensity within a light string
  • Per-vertex phase shift with no performance degradation for varying phase within a light string
  • Period
  • Duty cycle
  • Rotation rate
  • Real-world size rendered perspective-correct
  • Minimum size specified in pixels
  • Independent horizontal and vertical beam angle
  • Automatic luminance compensation for size-clamped lights, which prevents clamped lights from appearing brighter at longer ranges
  • Visibility range function of light type and weather conditions, separate from terrain visibility
  • Direction specified in azimuth/elevation
  • Ability to disable directional attenuation on a per-edge basis to support sharp transitions for realistic VASI and PAPI lights
  • Automatic ground clamping and elevation placement at load time by the image generator
  • ASCII description file that enables you to modify light points independently of the terrain database

MetaVR VRSG real-time night scene of a simulated KC-135R aircraft refueling an F-15C.

Light lobes

VRSG provides realistic light lobes that yield per-pixel radial attenuation and per-vertex axial attenuation. VRSG light lobes are flexible enough to support landing lights, taxi lights, headlights, and searchlights. You can fully characterize a light lobe's radial profile in two dimensions using a texture map. VRSG light lobes do not require multiple database render passes or hardware that can store alpha information in the frame buffer. Instead, light lobes are rendered in a single pass, so enabling a light lobe incurs minimal performance degradation and no drastic penalty in fill rate or geometry processing.
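
MetaVR does not publish the light lobe math; the sketch below is only a generic illustration of combining the two attenuation terms described above, with an analytic cosine falloff standing in for the 2D radial-profile texture lookup (all names and formulas are hypothetical, not VRSG's implementation):

    #include <algorithm>
    #include <cmath>

    // Illustrative only, not VRSG's implementation. In VRSG the radial
    // profile comes from a 2D texture map; here an analytic cosine falloff
    // stands in for that texture lookup.
    float LobeIntensity(float angleFromAxisRad, float halfAngleRad,
                        float distAlongAxis, float maxRange)
    {
        // Per-pixel radial attenuation: full intensity on the beam axis,
        // falling to zero at the lobe's half-angle.
        float radial = std::max(0.0f,
            std::cos(angleFromAxisRad * (3.14159265f * 0.5f) / halfAngleRad));
        // Per-vertex axial attenuation: linear falloff out to maxRange.
        float axial = std::max(0.0f, 1.0f - distAlongAxis / maxRange);
        return radial * axial;
    }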

MetaVR VRSG real-time night scene of virtual Los Alamitos Army Airfield, featuring airfield lighting, and in the background, the cultural lighting of Anaheim.

Dynamic lighting

VRSG supports a highly optimized dynamic lighting pipeline that blends per-vertex color with per-polygon material and combines the result with ambient lighting conditions and directional light sources for efficient, convincing dynamic lighting effects.
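
As a rough sketch of what such a pipeline computes, here is a conventional per-vertex directional lighting model with hypothetical data types; VRSG's actual shading math is internal to the product:

    #include <algorithm>

    // Hypothetical types for illustration only.
    struct Vec3 { float x, y, z; };
    struct Material { Vec3 diffuse; };

    static float Dot(const Vec3& a, const Vec3& b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Conventional per-vertex lighting: the vertex color is modulated by the
    // polygon's material, then scaled by ambient light plus the Lambertian
    // contribution of a directional light source.
    Vec3 ShadeVertex(const Vec3& vertexColor, const Material& mat,
                     const Vec3& normal, const Vec3& lightDir,
                     float ambient, float lightIntensity)
    {
        float lambert = std::max(0.0f, Dot(normal, lightDir));
        float lit = ambient + lightIntensity * lambert;
        return { vertexColor.x * mat.diffuse.x * lit,
                 vertexColor.y * mat.diffuse.y * lit,
                 vertexColor.z * mat.diffuse.z * lit };
    }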

Multiple-CPU computer support

If your computer has multiple CPUs, VRSG exploits the additional CPUs to support its asynchronous paging of both terrain geometry and texture. Additional CPUs relieve the primary CPU of loading terrain geometry and texture images into system memory, constructing the real-time graphics hierarchy from on-disk structures, and performing CPU-expensive operations such as wavelet decompression of textures.
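
VRSG's paging scheme itself is internal; the sketch below only illustrates the general pattern of off-loading tile loading and decompression to a worker thread so the render thread never blocks on disk I/O (all class and function names are hypothetical):

    #include <condition_variable>
    #include <mutex>
    #include <queue>
    #include <string>
    #include <thread>

    // Illustrative background pager: loads and decompresses terrain tiles on
    // a worker thread so the render thread never blocks on disk I/O.
    class TilePager {
    public:
        TilePager() : worker_(&TilePager::Run, this) {}
        ~TilePager()
        {
            { std::lock_guard<std::mutex> lock(m_); done_ = true; }
            cv_.notify_one();
            worker_.join();
        }
        // Called from the render thread when a tile is predicted to be needed.
        void Request(const std::string& tilePath)
        {
            { std::lock_guard<std::mutex> lock(m_); pending_.push(tilePath); }
            cv_.notify_one();
        }
    private:
        void Run()
        {
            for (;;) {
                std::string tile;
                {
                    std::unique_lock<std::mutex> lock(m_);
                    cv_.wait(lock, [this] { return done_ || !pending_.empty(); });
                    if (done_ && pending_.empty()) return;
                    tile = pending_.front();
                    pending_.pop();
                }
                // Hypothetical heavy work: read the tile from disk,
                // wavelet-decompress its texture, and build the run-time
                // hierarchy, all off the render thread.
                LoadAndDecompress(tile);
            }
        }
        void LoadAndDecompress(const std::string&) { /* placeholder */ }

        std::queue<std::string> pending_;
        std::mutex m_;
        std::condition_variable cv_;
        bool done_ = false;
        std::thread worker_;
    };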

Heads-up display (HUD) and 2D overlays

VRSG supports multiple mechanisms for adding 2D overlays to the 3D display. You can describe static overlays in an ASCII file without coding. Static overlays can be made dynamic through a UDP-based interface to the visual system. A simulation host can send commands to the visual system to enable, disable, scale, rotate, and translate overlay primitives.
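
The overlay command syntax is documented in the VRSG User's Guide and is not reproduced here; the sketch below only shows the general mechanism of a simulation host sending a datagram to the visual system over UDP (the payload string and port number are placeholders, not VRSG's actual protocol):

    #include <winsock2.h>
    #include <ws2tcpip.h>
    #include <string>
    #pragma comment(lib, "ws2_32.lib")

    // Send one UDP datagram to the visual system. The payload format and
    // destination port are placeholders for this illustration.
    bool SendOverlayCommand(const std::string& igAddress, unsigned short port,
                            const std::string& payload)
    {
        WSADATA wsa;
        if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return false;

        SOCKET sock = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
        if (sock == INVALID_SOCKET) { WSACleanup(); return false; }

        sockaddr_in dest{};
        dest.sin_family = AF_INET;
        dest.sin_port = htons(port);
        inet_pton(AF_INET, igAddress.c_str(), &dest.sin_addr);

        int sent = sendto(sock, payload.data(), static_cast<int>(payload.size()), 0,
                          reinterpret_cast<sockaddr*>(&dest), sizeof(dest));

        closesocket(sock);
        WSACleanup();
        return sent == static_cast<int>(payload.size());
    }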

One of the 4 cockpit simulators at the new F-16 MTC, located at the ANG base at the Burlington International Airport, Burlington, VT. This VRSG multi-channel synchronized view with a 2D HUD display is rendering the VT virtual terrain built by MetaVR. Photo courtesy of SSgt. Dan DiPietro, 158 FW, Vermont Air National Guard.

MetaVR provides a plug-in mechanism for users who want to generate overlay graphics using a low-level graphics API. The end user develops a dynamically loaded library (DLL), which the visual system loads at run time. The visual system calls functions exported by the DLL, passing the thread of execution to the user-written code. From within the DLL, the user can use Direct3D to render customized overlays.
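
The actual entry points and their signatures are defined by MetaVR's plug-in documentation; the skeleton below only illustrates the general pattern of a user-built DLL exporting callbacks for the visual system to call, with hypothetical function names and a generic pointer standing in for the Direct3D interface handed to the plug-in:

    // Hypothetical plug-in skeleton; real entry-point names and signatures
    // come from MetaVR's plug-in documentation.
    #include <windows.h>

    extern "C" {

    // Called once by the visual system after it loads the DLL.
    __declspec(dllexport) void PluginInit()
    {
        // Allocate overlay resources here.
    }

    // Hypothetical per-frame callback. The void* stands in for whatever
    // Direct3D device interface the visual system passes to the plug-in,
    // so user code can issue draw calls for customized overlay graphics.
    __declspec(dllexport) void PluginRenderOverlay(void* d3dDevice)
    {
        (void)d3dDevice;
    }

    // Called once before the DLL is unloaded.
    __declspec(dllexport) void PluginShutdown()
    {
        // Release overlay resources here.
    }

    } // extern "C"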

Multiple viewports

VRSG supports an unlimited number of viewports per channel. Multiple viewports on a single visual channel may be overlapped or spatially disjoint. Viewports can be horizontally mirrored to support applications that demand this orientation (such as a rear-view mirror) or display systems whose optics impose a horizontal reversal of the image.

Sensor options

VRSG's physics-based IR features real-time computation of the IR sensor image directly from the visual database, without the need to store a sensor-specific database. This real-time model combines automatic material classification of visual RGB imagery with a physics-based IR radiance and sensor model. VRSG also provides several basic sensor options you can use in a scene, such as the amount of noise or blur, the level of intensity of the terrain or vehicles, whether the scene should be rendered in green or in black and white, and whether to display simulated A/C banding.
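
MetaVR's sensor model itself is proprietary; as an illustration of the kind of physics such a radiance calculation rests on, the fragment below computes a graybody in-band radiance from a material's emissivity and temperature using Planck's law (a textbook simplification, not VRSG's implementation):

    #include <cmath>

    // Planck spectral radiance [W sr^-1 m^-3] at wavelength lambda [m]
    // and temperature T [K].
    double PlanckRadiance(double lambda, double T)
    {
        const double h = 6.62607015e-34;   // Planck constant [J s]
        const double c = 2.99792458e8;     // speed of light [m/s]
        const double k = 1.380649e-23;     // Boltzmann constant [J/K]
        return (2.0 * h * c * c) /
               (std::pow(lambda, 5) * (std::exp(h * c / (lambda * k * T)) - 1.0));
    }

    // In-band radiance [W sr^-1 m^-2] of a surface with emissivity eps at
    // temperature T, integrated numerically over a sensor band, for example
    // 3e-6 to 5e-6 m for a midwave IR sensor. Graybody approximation: the
    // reflected-radiance term is omitted.
    double InBandRadiance(double eps, double T,
                          double bandLo, double bandHi, int samples = 100)
    {
        double sum = 0.0;
        double dLambda = (bandHi - bandLo) / samples;
        for (int i = 0; i < samples; ++i) {
            double lambda = bandLo + (i + 0.5) * dLambda;
            sum += PlanckRadiance(lambda, T) * dLambda;
        }
        return eps * sum;
    }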

Environmental and weather options

VRSG provides several sky options you can use in a scene. The sky blends with the selected fog color as it merges with the horizon.

VRSG's high-fidelity, particle-based, multi-layer atmospheric model provides continuously variable visibility as a function of altitude, with atmospheric layers that follow the earth's curvature.

VRSG real-time scene with particle-based volumetric clouds, fog, and haze at the horizon.

VRSG uses an ephemeris model to calculate sun position, moon position, and moon phase from date, time, and geographic location. Lighting conditions can also be automatically calculated from date, time, and geographic location. A 40,000 light-point star field can be used for night scenes. Users can also provide a custom star field pattern with positions and intensities via a lighting table (.csv file).
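
VRSG's ephemeris model is internal to the product, but the essence of deriving a sun position from date, time, and geographic location can be sketched with a standard low-precision approximation (roughly one-degree accuracy, equation of time ignored; not MetaVR's actual algorithm):

    #include <cmath>

    // Low-precision solar elevation from day of year, UTC time of day, and
    // observer latitude/longitude in degrees (east longitude positive).
    double SolarElevationDeg(int dayOfYear, double utcHours,
                             double latDeg, double lonEastDeg)
    {
        const double kDeg2Rad = 3.14159265358979323846 / 180.0;
        // Approximate solar declination (degrees).
        double declDeg =
            -23.44 * std::cos(kDeg2Rad * (360.0 / 365.0) * (dayOfYear + 10));
        // Local solar time (hours) and hour angle (degrees).
        double solarTime = utcHours + lonEastDeg / 15.0;
        double hourAngleDeg = 15.0 * (solarTime - 12.0);
        double sinEl =
            std::sin(kDeg2Rad * latDeg) * std::sin(kDeg2Rad * declDeg) +
            std::cos(kDeg2Rad * latDeg) * std::cos(kDeg2Rad * declDeg) *
            std::cos(kDeg2Rad * hourAngleDeg);
        return std::asin(sinEl) / kDeg2Rad;
    }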

Examples of rain and snow effects in VRSG.

With CIGI or DIS Set Data PDUs, users can instantiate multiple volumetric clouds from our library of more than 13 cloud models. A cloud can be positioned, oriented, scaled, and moved over time. Volumetric clouds are particle masses that model light absorption, creating a realistic reduction in visibility when flown through. VRSG features a real-time dynamic lighting model for clouds that models light absorption as a function of particle depth into the cloud along the line of sight to the sun. Clouds can have an optional precipitation effect modeling either rainfall or snow. The precipitation effect is also a volumetric mass extending from the cloud base through ground level, creating a realistic reduction in visibility during flight. The rain or snow precipitation effect is generated dynamically, so it can be applied to any cloud instance at any cloud altitude.
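
The absorption behavior described above is consistent with Beer-Lambert attenuation; as a simple illustration (not VRSG's cloud shader), the fraction of sunlight transmitted falls off exponentially with the optical depth accumulated along the path to the sun:

    #include <cmath>

    // Beer-Lambert attenuation: fraction of light transmitted through a
    // participating medium with extinction coefficient sigma [1/m] over a
    // path of length depth [m]. Illustration only.
    double Transmittance(double sigma, double depth)
    {
        return std::exp(-sigma * depth);
    }

    // Example: a point 200 m inside a cloud with sigma = 0.02 per meter
    // receives exp(-4), or about 1.8%, of the unoccluded sunlight.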

3D Ocean Sea States

VRSG's 3D ocean simulation features realistic wave motion, 12 Beaufort sea states, 3D wakes, vessel surface motion, accurate environment reflections, and support for bathymetric data for shoreline wave shape and opacity.

VRSG real-time scene with 3D ocean simulation and multiple naval vessels from MetaVR's military model library.

Time options for virtual world display

VRSG provides time options that set the position of celestial bodies in the sky, and the light source angles, based on your settings for the date and time and the database's geographic location. You can specify an explicit time or have VRSG obtain the time from your system's clock. The resulting time setting can optionally override any explicit light source angle settings. You can also have VRSG advance the time in the scene like a clock, shifting the positions of celestial bodies and the light source angles in the virtual world as time progresses.

Particle-based effects

VRSG supports particle-based effects for smoke plumes, dust trails, tactical smoke, blowing sand, blowing dust, rotor wash, and explosions. The /Effects directory contains several kinds of smoke and dust effects as well as other effects.

Example of VRSG simulated brown-out dust effect generated by helicopter downwash. The scene shows a Bell V-280A Valor tiltrotor entity model in flight at low altitude, with dust generated by particle effects per rotor.

You can also use particle-based effects for one-time animations such as explosions or muzzle flash effects. By editing particle description files, you can create new particle-based effects or customize existing ones.


Solid particle effects model projectiles with dust trails cast from a detonation event. This scene also shows culture models with DIS damage appearance states.

The image above shows not only an explosion with solid particle effects, but also buildings in MetaVR's modeled Afghanistan village with DIS damage states. Among the village's 650 models of buildings and other structures are 150 models that have three DIS damage appearance states (slight damage, moderate damage, and destroyed).

VRSG real-time scene on virtual Afghanistan terrain featuring 160 new culture models with multiple DIS damage appearance states.

Track impressions

VRSG can simulate a track or wheel impression that appears behind a tracked or wheeled vehicle entity -- or a footprint impression that appears behind a human entity -- and follows the entity as it moves. For most tracked and wheeled vehicles, VRSG can automatically determine the width and offset of the tracks by inspecting the model's geometry, and will use a generic track or wheel texture for the impression. Several textures that can be used for track, wheel, and footprint impressions are delivered with VRSG.

Real-time VRSG scene of tracks trailing 4-wheel and 2-wheel (single-track) entities.

Real-time VRSG scene of footprint tracks trailing a character's stride.

High-fidelity animated character visualization

VRSG supports high-fidelity 3D animated characters. Animation and rendering are designed to support hundreds of characters within the field of view while still maintaining a high frame rate. VRSG is delivered with a substantial model library of characters and weapons in MetaVR's model format. Several animations for the characters are also included in the library; the animations portray all commonly used appearances required by the DIS protocol. You can immediately configure and use these models in VRSG.

Real-time VRSG rendering of a JTAC team on MetaVR's Yuma Proving Ground terrain.

When you have human characters in a networked VRSG scenario, you can use the VRSG First Person Simulator (FPS) plugin to control your character and view the scenario from the character's point of view.

Common Image Generator Interface (CIGI)

VRSG supports the Common Image Generator Interface (CIGI) version 3.3 for communication between the image generator and a simulation host device.

Model-edge anti-aliasing

You can apply selective, model-edge anti-aliasing to individual models by adding the "+antialias" qualifier to a model's entry in the ModelMap.ini file. This feature is a handy alternative to full-scene anti-aliasing (FSAA) in cases where FSAA is not an option due to its resource usage or performance impact. You can maintain high resolutions and 60 Hz frame rates, yet visually enhance one or more models in a scene.

Adding content to the terrain

You can add a small number of static models directly to the terrain either by dragging 3D models from Windows Explorer and dropping them at the target location in the visualization window (a drag-and-drop action), or by editing the terrain's cultural feature file. By attaching to a static model, you can move it to modify its placement and orientation.

To build up a scene of dense cultural content and/or create pattern-of-life scenarios, use VRSG Scenario Editor, which is installed with VRSG. In this game-level editor, you can use typical VRSG features such as viewpoints, attachment modes (such as UAV or tracking mode), and sensor modes in your scenario. To disseminate the scenario to trainees, you can play the scenario in VRSG on a network, or record the scenario in VRSG and upload the resulting MPEG video to a site for trainees to access.

To complement VRSG, MetaVR provides Terrain Tools for Esri ArcGIS, a terrain-generation product that uses elevation points and imagery as source data. This product works hand-in-hand with VRSG to provide rapid terrain creation and high-speed visualization.

You can order VRSG directly from MetaVR.
