VRSG as a Computer Image Generator

As an image generator, MetaVR™ Virtual Reality Scene Generator™ (VRSG™) supports the features typically required for flight training, driving simulation, locomotive simulation, and many other applications. An image generator is typically driven by the user's simulation host model, such as a flight model; VRSG renders the virtual world as specified by host parameters such as location and field of view.

MetaVR real-time rendering of simulated aerial refueling.

Core image generation features include:

Light points

VRSG supports full-featured light points; processing runs entirely in vertex shader programs downloaded to the graphics chipset, providing exceptional performance. VRSG light points were developed with input from subject matter experts, such as commercial and military pilots.

Phoenix at night with culture light points.

All light points, including directional light points with per-edge field-of-view attenuation behavior, run entirely in the vertex shader. Light point features include the following (a generic sketch of the per-vertex blink computation appears after this list):

  • Per-vertex color and intensity with no performance degradation for varying color and intensity within a light string
  • Per-vertex phase shift with no performance degradation for varying phase within a light string
  • Period
  • Duty cycle
  • Rotation rate
  • Real-world size rendered perspective-correct
  • Minimum size specified in pixels
  • Direction specified in azimuth/elevation
  • Automatic ground clamping and elevation placement at load time by the image generator
  • Automatic luminance compensation for size-clamped lights, which prevents clamped lights from appearing brighter than they should at longer ranges
  • Visibility range function of light type and weather conditions, separate from terrain visibility
  • Independent horizontal and vertical beam angle
  • Ability to disable directional attenuation on a per-edge basis to support sharp transitions for realistic VASI and PAPI lights
  • ASCII description file that enables you to modify lights independently of the terrain database
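
The per-vertex period, duty cycle, and phase-shift parameters listed above amount to a simple per-light blink test each frame. The following C++ sketch is a generic illustration of that computation, not VRSG's actual shader code; the struct and function names are hypothetical.

```cpp
#include <cmath>

// Hypothetical per-vertex light-point attributes (names are illustrative only).
struct LightPointVertex {
    float period;     // seconds for one full blink cycle (<= 0 means steady)
    float dutyCycle;  // fraction of the period the light is on (0..1)
    float phaseShift; // per-vertex offset into the cycle, in seconds
    float intensity;  // per-vertex base intensity
};

// Returns the intensity to render for this light point at simulation time t.
float lightPointIntensity(const LightPointVertex& lp, float t)
{
    if (lp.period <= 0.0f)
        return lp.intensity;  // steady light is always on
    float cyclePos = std::fmod(t + lp.phaseShift, lp.period) / lp.period;
    return (cyclePos < lp.dutyCycle) ? lp.intensity : 0.0f;
}
```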

Light lobes

VRSG provides realistic light lobes that yield per-pixel radial attenuation and per-vertex axial attenuation. VRSG light lobes are flexible enough to support landing lights, taxi lights, headlights, and searchlights. VRSG light lobes do not require multiple database render passes or hardware that can store alpha information in the frame buffer. Instead, light lobes are rendered in a single pass, so enabling a light lobe incurs no significant fill-rate or geometry-processing penalty. You can configure multiple concurrent, independent light lobes; VRSG supports up to 20 independent, concurrent, steerable light lobes on video cards that support Pixel Shader Model 5.0.
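
As a rough illustration of radial and axial attenuation (a generic sketch only; the falloff functions and names below are assumptions, not VRSG's implementation), a lobe's contribution at a lit point can be computed from the angle off the lobe axis and the distance from the lobe origin:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical lobe description; names and falloff shapes are illustrative.
struct LightLobe {
    float halfAngleRad;  // angular half-width of the beam
    float maxRange;      // distance at which the lobe fades to zero
    float intensity;     // peak intensity on the beam axis
};

// cosOffAxis: cosine of the angle between the lobe axis and the vector
//             from the lobe origin to the lit point.
// distance:   distance from the lobe origin to the lit point.
float lobeContribution(const LightLobe& lobe, float cosOffAxis, float distance)
{
    // Radial attenuation: full intensity on axis, zero at the cone edge.
    float cosEdge = std::cos(lobe.halfAngleRad);
    float radial  = std::clamp((cosOffAxis - cosEdge) / (1.0f - cosEdge), 0.0f, 1.0f);

    // Axial attenuation: linear falloff with distance out to maxRange.
    float axial = std::clamp(1.0f - distance / lobe.maxRange, 0.0f, 1.0f);

    return lobe.intensity * radial * axial;
}
```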

Dynamic lighting

VRSG supports a highly optimized dynamic lighting pipeline that blends per-vertex color with per-polygon material and combines the result with ambient lighting conditions and directional light sources for efficient and convincing dynamic lighting effects.
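
A minimal sketch of that blend (generic fixed-function-style Lambertian lighting; the names and structure are assumptions, not VRSG's pipeline) combines an ambient term and a directional light's diffuse term, modulated by the per-vertex and per-polygon material colors:

```cpp
#include <algorithm>

struct Color { float r, g, b; };

// Combine ambient light and one directional light with per-vertex color and
// per-polygon material color (diffuse only, for illustration).
// nDotL is the dot product of the surface normal and the light direction.
Color litColor(const Color& vertexColor, const Color& materialColor,
               const Color& ambient, const Color& lightColor, float nDotL)
{
    float diffuse = std::max(nDotL, 0.0f);  // surfaces facing away get no diffuse
    return {
        vertexColor.r * materialColor.r * (ambient.r + lightColor.r * diffuse),
        vertexColor.g * materialColor.g * (ambient.g + lightColor.g * diffuse),
        vertexColor.b * materialColor.b * (ambient.b + lightColor.b * diffuse)
    };
}
```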

Environmental settings

VRSG supports environment and weather effects such as:

  • Multiple atmospheric layers, ground fog, and haze.
  • Sun angle-dependent haze color and density.
  • Procedurally generated volumetric clouds that users can define and control.

MetaVR VRSG with rain effect.

VRSG uses an ephemeris model to calculate sun position, moon position, and moon phase from date, time, and geographic location. Lighting conditions can also be automatically calculated from date, time, and geographic location. A 40,000 light-point star field can be used for night scenes.
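
To give a sense of what such an ephemeris calculation involves (the formulas below are a textbook approximation of solar position, not MetaVR's model), the sun's elevation for a given date, time, and latitude can be estimated from the solar declination and hour angle:

```cpp
#include <cmath>

constexpr double kPi = 3.14159265358979323846;
constexpr double kDegToRad = kPi / 180.0;

// Approximate solar elevation angle in degrees.
// dayOfYear: 1..365, solarHour: local solar time in hours (12 = solar noon),
// latitudeDeg: observer latitude in degrees.
double solarElevationDeg(int dayOfYear, double solarHour, double latitudeDeg)
{
    // Approximate solar declination (degrees) for the day of year.
    double declDeg = -23.44 * std::cos(kDegToRad * 360.0 / 365.0 * (dayOfYear + 10));

    // Hour angle: 15 degrees per hour away from solar noon.
    double hourAngleDeg = 15.0 * (solarHour - 12.0);

    double lat  = latitudeDeg * kDegToRad;
    double decl = declDeg * kDegToRad;
    double ha   = hourAngleDeg * kDegToRad;

    double sinElev = std::sin(lat) * std::sin(decl) +
                     std::cos(lat) * std::cos(decl) * std::cos(ha);
    return std::asin(sinElev) / kDegToRad;
}
```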

Using CIGI or DIS Set Data PDUs, users can instantiate multiple volumetric clouds from our library of over 13 cloud models. A cloud can be positioned, oriented, scaled, and moved over time. Volumetric clouds are particle masses that model light absorption, creating a realistic reduction in visibility when flown through. VRSG features a real-time dynamic lighting model for clouds that models light absorption as a function of particle depth into the cloud along the line of sight to the sun. Clouds can have an optional precipitation effect modeling either rainfall or snow. The precipitation effect is also a volumetric mass extending from the cloud base to ground level, creating a realistic reduction in visibility during flight. The rain or snow precipitation effect is generated dynamically, so it can be applied to any cloud instance at any cloud altitude.
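
The visibility reduction described above is essentially Beer-Lambert absorption: transmitted light falls off exponentially with the particle depth traversed along the line of sight. A minimal sketch of that relationship (a generic model; the extinction coefficient and names are assumptions, not VRSG internals):

```cpp
#include <cmath>

// Fraction of light transmitted along a view ray through a cloud, given the
// path length inside the cloud and an extinction coefficient (per meter) that
// stands in for particle density (Beer-Lambert law).
double cloudTransmittance(double pathLengthMeters, double extinctionPerMeter)
{
    return std::exp(-extinctionPerMeter * pathLengthMeters);
}

// Example: a sun ray traversing 500 m of cloud with extinction 0.01/m retains
// exp(-5), roughly 0.7%, of its intensity, darkening the cloud interior.
```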

Heads-up display (HUD) and 2D overlays

VRSG supports multiple mechanisms for adding 2D overlays to the 3D display. You can describe static overlays in an ASCII file without coding. Static overlays can be made dynamic through a UDP-based interface to the visual system. A simulation host can send commands to the visual system to enable, disable, scale, rotate, and translate overlay primitives.

MetaVR provides a plug-in mechanism for users who want to generate overlay graphics using a low-level graphics API. The end user develops a dynamic-link library (DLL), which the visual system loads at run time. The visual system calls functions exported by the DLL, passing the thread of execution to the user-written code. From within the DLL, the user can use Direct3D to render customized overlays.
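
A minimal sketch of such a plug-in DLL follows. The export names and signatures here are hypothetical illustrations, not MetaVR's actual plug-in interface; consult the VRSG documentation for the real entry points.

```cpp
// overlay_plugin.cpp -- hypothetical overlay plug-in sketch (illustrative only).
#include <d3d11.h>

extern "C" {

// Called once after the visual system loads the DLL, passing its D3D device.
__declspec(dllexport) void OverlayInit(ID3D11Device* device)
{
    // Create vertex buffers, shaders, and other overlay resources here.
}

// Called once per frame; issue Direct3D draw calls for the 2D overlay here.
__declspec(dllexport) void OverlayRender(ID3D11DeviceContext* context)
{
    // e.g. draw HUD symbology built from the resources created in OverlayInit().
}

// Called before the visual system unloads the DLL.
__declspec(dllexport) void OverlayShutdown()
{
    // Release overlay resources.
}

} // extern "C"
```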

MetaVR also provides a limited OpenGL emulation layer that allows legacy OpenGL-based HUD implementations to be ported to a VRSG plug-in with minimal effort. This mechanism allowed the Lockheed Martin F-22 HUD, originally developed for SGI platforms, to be ported to VRSG.

Viewports

VRSG supports an unlimited number of viewports per channel. Multiple viewports on a single visual channel may be overlapped or spatially disjoint. Viewports can be horizontally mirrored to support applications that require it (such as a rear-view mirror) or display systems whose optics impose a horizontal reversal of the image.

Mission functions

VRSG supports the basic mission function requirements of simulators ranging from ground-based vehicles to fast-moving fixed-wing aircraft. VRSG channels support one laser range per channel, per frame, at 60 Hz with single-frame latency. VRSG also supports an Above Ground Level (AGL) response per channel at 60 Hz. A library that can be integrated into the simulation host provides features such as point-to-point intervisibility, terrain height lookup, and collision detection with terrain or dynamic model geometry.
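
As a generic illustration of how a simulation host might use such queries (the types and method names below are hypothetical stand-ins, not MetaVR's library API), the typical calls are a terrain height lookup, an AGL computation, and a point-to-point intervisibility test:

```cpp
// Hypothetical mission-functions interface; all names are illustrative only.
struct GeoPoint { double lat, lon, altMeters; };

class MissionFunctions {
public:
    // Height of the terrain surface (meters MSL) under the given point.
    // Stubbed to flat terrain here; a real library queries the terrain database.
    double terrainHeight(double /*lat*/, double /*lon*/) const { return 0.0; }

    // True if an unobstructed line of sight exists between the two points,
    // tested against terrain (and optionally dynamic model geometry).
    // Stubbed to always-visible here.
    bool intervisible(const GeoPoint&, const GeoPoint&) const { return true; }
};

// Example host-side usage: height above ground level (AGL) for the ownship.
double computeAgl(const MissionFunctions& mf, const GeoPoint& ownship)
{
    return ownship.altMeters - mf.terrainHeight(ownship.lat, ownship.lon);
}
```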

Use in mission rehearsal

Mission rehearsal applications require the ability to see long distances (that is, a far horizon) and to process large amounts of geospecific imagery draped over terrain elevation data. MetaVR's round-earth Metadesic visual database format meets these and other mission rehearsal requirements.

MetaVR VRSG real-time scene of F-22 entities in flight over the virtual Nellis Air Force Base airfield; threat domes are shown in the background.

VRSG provides the ability to create a wireframe threat dome, in a bubble or cylindrical shape, that represents the detection and lethal ranges of a surface-to-air missile (SAM) or similar threat system. You can specify the radius of the dome in meters and the color of the dome as red, green, and blue components.
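
A hemispherical wireframe of this kind can be built from rings of constant elevation angle (with meridian lines connecting them); the sketch below is a generic construction of the ring vertices for a dome of a given radius, not VRSG's internal representation:

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Generate vertices for the horizontal rings of a hemispherical wireframe dome
// centered at the origin. Rings are spaced evenly in elevation from the ground
// plane toward the zenith; each ring has segmentCount vertices.
std::vector<Vec3> threatDomeRings(float radiusMeters, int ringCount, int segmentCount)
{
    const float kPi = 3.14159265f;
    std::vector<Vec3> verts;
    for (int r = 0; r < ringCount; ++r) {
        float elev       = (kPi / 2.0f) * r / ringCount;  // 0 at ground level
        float ringRadius = radiusMeters * std::cos(elev);
        float height     = radiusMeters * std::sin(elev);
        for (int s = 0; s < segmentCount; ++s) {
            float az = 2.0f * kPi * s / segmentCount;
            verts.push_back({ ringRadius * std::cos(az),
                              ringRadius * std::sin(az),
                              height });
        }
    }
    return verts;
}
```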

MetaVR VRSG threat dome.
