3D Vision Ground Processing Workflow For The Panoramic Camera On ESA's Exomars Mission 2016

G. Paar, A. D. Griffiths, A. Bauer, T. Nunner, N. Schmitz, Dave Barnes, E. Riegler

Research output: Chapter in Book/Report/Conference Proceeding (Non-Journal item)

14 Citations (SciVal)


The European ExoMars rover will deliver the scientific exobiology payload Pasteur to the surface of Mars by 2016. Its Panoramic Camera system PanCam will provide multispectral stereo images with a 34° square field of view ("WAC"), and 5° ("HRC") colour monoscopic images for close-up views. PanCam is the primary source of 3D overviews and context for the ExoMars experiment locations, required to enable the exobiological aims of the mission. This paper describes the current implementation status of the PanCam vision ground processing workflow. It is based on a dedicated 3D vision processing batch solution, PROX, which supports key functionalities such as panorama mosaicking, the generation of textured triangular meshes and Digital Elevation Models (DEMs) in different projection geometries (Cartesian, spherical, cylindrical) from stereo images, and the fusion of WAC (filtered at various wavelengths) and HRC image data. The workflow has been successfully tested on different kinds of (stereo) image data, including images from NASA's Mars Exploration Rover (MER) mission. We report on the photogrammetric methods involved in the processing chain and give examples of successful processing in the mentioned scenarios, including PROX application in the Aberystwyth University Planetary Analogue Terrain Laboratory (PATLab) and the recent AMASE campaign, jointly conducted by NASA-JPL, ESA and the ExoMars Science Team in the Arctic area of Spitzbergen.
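The DEM projection geometries named in the abstract can be illustrated with a minimal sketch. The function and parameter names below are hypothetical, not taken from PROX; they show the standard rectified-stereo depth relation and the Cartesian-to-spherical/cylindrical coordinate mappings that such DEM projections are conventionally built on:

```python
import math

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard rectified-stereo relation: range = f * B / d.
    focal_px: focal length in pixels; baseline_m: stereo baseline in metres;
    disparity_px: pixel disparity between the left/right stereo images."""
    return focal_px * baseline_m / disparity_px

def cartesian_to_spherical(x, y, z):
    """Map a Cartesian 3D point to (azimuth, elevation, range) -- the
    angular grid a spherical DEM projection is sampled on."""
    rng = math.sqrt(x * x + y * y + z * z)
    return math.atan2(y, x), math.asin(z / rng), rng

def cartesian_to_cylindrical(x, y, z):
    """Map a Cartesian 3D point to (azimuth, radius, height) -- the
    grid used for a cylindrical panorama or DEM."""
    return math.atan2(y, x), math.hypot(x, y), z

# Example: a 10 px disparity with a 1000 px focal length and a
# 0.5 m baseline corresponds to a range of 50 m.
print(depth_from_disparity(1000.0, 0.5, 10.0))  # 50.0
```

In a full pipeline, triangulated points like these would then be resampled onto the chosen grid and textured to yield the meshes and DEMs the paper describes; the sketch only covers the per-point geometry.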
Original language: English
Title of host publication: Conference Proceedings of Optical3d 2009
Number of pages: 9
Publication status: Published - 2009


