FPX Multi-Video Processing Platform (MVP2)
BACKGROUND:
The following FPX modules were designed to present a pilot in a plane without
a windshield with a single continuous image created from multiple cameras.
Using the WUGS switch and multiple FPXs, the video will be buffered and a new
image will be created for display in a heads-up display. Zoom and optical
corrections, along with smooth scrolling of images, are currently under design.

Optical Correction:
Input images will be corrected and remapped to compensate for optical distortions
introduced by the camera optics. This will facilitate the interweaving of multiple
images to create one smooth, distortion-free image.
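A minimal sketch of how such a per-pixel remap could be expressed in C is shown
below. The simple radial model, the coefficient k1, and the table layout are
assumptions for illustration only, not the correction actually used in the FPX
hardware.

    /* Lookup-table optical correction sketch (assumed radial model). */
    #include <stdint.h>
    #include <math.h>

    #define W 640
    #define H 480

    /* For each corrected pixel, precompute which source pixel to sample. */
    static uint32_t remap_table[H][W];   /* packed source row/col */

    static void build_remap(double k1)
    {
        double cx = W / 2.0, cy = H / 2.0;
        for (int y = 0; y < H; y++) {
            for (int x = 0; x < W; x++) {
                double dx = x - cx, dy = y - cy;
                double r2 = (dx * dx + dy * dy) / (cx * cx + cy * cy);
                double scale = 1.0 + k1 * r2;        /* radial distortion */
                int sx = (int)lround(cx + dx * scale);
                int sy = (int)lround(cy + dy * scale);
                if (sx < 0) sx = 0; if (sx >= W) sx = W - 1;
                if (sy < 0) sy = 0; if (sy >= H) sy = H - 1;
                remap_table[y][x] = ((uint32_t)sy << 16) | (uint32_t)sx;
            }
        }
    }

    /* Apply the precomputed map: corrected[y][x] = source[sy][sx]. */
    static void correct_frame(const uint8_t src[H][W], uint8_t dst[H][W])
    {
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++) {
                uint32_t m = remap_table[y][x];
                dst[y][x] = src[m >> 16][m & 0xFFFF];
            }
    }

Precomputing the map once keeps the per-frame work to a single table lookup per
pixel, which is the property that lets corrected images be interwoven directly.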
Design:
The modular design of the components allows multiple cameras and multiple
displays to be interconnected, with each module functioning independently and
without state information from the other modules. After initializing the VCIs
and setting up the environment, each monitor can display a different view,
as sketched below.
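The following sketch is only meant to emphasize the independence of the modules;
the structure fields and the mvp2_module_init() helper are hypothetical names,
not part of the FPX design.

    /* Illustrative per-module configuration; no module references any other. */
    #include <stdint.h>

    struct mvp2_module {
        uint16_t input_vci;    /* VCI carrying cells from this module's camera */
        uint16_t output_vci;   /* VCI toward this module's display */
        uint16_t view_col;     /* top-left of this monitor's view within the */
        uint16_t view_row;     /*   complete virtual image                   */
        uint16_t view_width;   /* each monitor may show a different view     */
        uint16_t view_height;
    };

    /* Hypothetical setup: record the VCIs and the view window. */
    static int mvp2_module_init(struct mvp2_module *m,
                                uint16_t in_vci, uint16_t out_vci,
                                uint16_t col, uint16_t row,
                                uint16_t w, uint16_t h)
    {
        m->input_vci  = in_vci;
        m->output_vci = out_vci;
        m->view_col = col;  m->view_row = row;
        m->view_width = w;  m->view_height = h;
        return 0;
    }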
The design is divided into three parts: the IRE (Input Rendering Engine), the
Image Buffer, and the ORE (Output Rendering Engine). The IRE transforms
incoming 36Kb AAL5 frames carrying 8 pixels/cell into individual control cells
carrying 4 pixels/cell, which are written to the Image Buffer's (or Control
Cell Processor's) 2 MB SRAM. Currently both 1 MB modules are needed to buffer
complete images. Updates to the FPX will replace the 2 MB SRAM with 4 MB SRAM.
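A rough sketch of the IRE's reframing step follows: incoming data cells carry
8 pixels each, and the IRE emits control cells carrying 4 pixels plus the SRAM
address they should be written to. The structure layouts and the 16-bit pixel
size are assumptions for illustration.

    /* IRE reframing sketch: one 8-pixel data cell -> two 4-pixel control cells. */
    #include <stdint.h>
    #include <stddef.h>

    struct data_cell {          /* payload portion of an incoming cell */
        uint16_t pixel[8];      /* 8 pixels per cell */
    };

    struct control_cell {       /* cell written to the Image Buffer / CCP */
        uint32_t sram_addr;     /* destination address in the 2 MB SRAM */
        uint16_t pixel[4];      /* 4 pixels per cell */
    };

    /* Split one data cell into two control cells with consecutive SRAM
     * addresses starting at base_addr; returns the cell count produced. */
    static size_t ire_reframe(const struct data_cell *in, uint32_t base_addr,
                              struct control_cell out[2])
    {
        for (size_t c = 0; c < 2; c++) {
            out[c].sram_addr = base_addr + (uint32_t)(c * 4);
            for (size_t p = 0; p < 4; p++)
                out[c].pixel[p] = in->pixel[c * 4 + p];
        }
        return 2;
    }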
The ORE works with three different coordinate systems. The first is the virtual
coordinates, which give a row and column pixel address over the complete image
space created by the multiple cameras. The second is the output pixel
coordinates, which map to the image output on the monitor. The third is the
image buffer pixel coordinates, which map to pixels within an individual
camera's buffer. Relating these together for a 640x480 camera: 0x0 (output
coord.) = 660x0 (virtual coord.) = 20x0 on Camera 2 (image buffer coord.).
Optical correction is performed in the IRE; therefore, 1-to-1 matching
(excluding zooming) can be used in the ORE.
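The sketch below restates that relationship in C, assuming the cameras are
tiled side by side in a single row of the virtual image; the structure and
function names are illustrative rather than taken from the FPX sources.

    /* Coordinate-system sketch: output -> virtual -> image buffer. */
    #include <stdint.h>

    #define CAM_W 640
    #define CAM_H 480

    struct virtual_coord { uint32_t col, row; };        /* complete image space */
    struct buffer_coord  { uint32_t cam, col, row; };   /* per-camera buffer    */

    /* With 1-to-1 matching (no zoom), output coordinates are virtual
     * coordinates shifted by the view origin chosen for a monitor. */
    static struct virtual_coord output_to_virtual(uint32_t out_col, uint32_t out_row,
                                                  uint32_t view_col, uint32_t view_row)
    {
        struct virtual_coord v = { out_col + view_col, out_row + view_row };
        return v;
    }

    /* Virtual coordinates map to a camera (numbered from 1) and a pixel
     * within that camera's image buffer. */
    static struct buffer_coord virtual_to_buffer(struct virtual_coord v)
    {
        struct buffer_coord b;
        b.cam = v.col / CAM_W + 1;
        b.col = v.col % CAM_W;
        b.row = v.row;          /* cameras tiled in a single row */
        return b;
    }

With the view origin at virtual column 660, this reproduces the example above:
output 0x0 maps to virtual 660x0, which maps to column 20, row 0 of Camera 2.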