A Conversion Pipeline:

From Laser-scanned Data to High Fidelity Rendering

Gavin Ellis

Computer Graphics Group, University of Bristol

March 2004


Abstract

Accurate, textured 3D computer models of objects are of increasing interest in many disciplines, e.g. archaeology. Laser scanners provide a very precise way of gathering data for such models; however, a number of steps are necessary before the raw scan data can be imported into a computer model that is ready for rendering.

This work discusses the processes undertaken to develop a simple-to-use, seamless method of transforming raw laser-scanned data (with its associated texture information) either to the three-dimensional modelling system Maya, for further editing, or directly to high fidelity rendering in the Radiance lighting visualisation suite.

Background

It is clear that computer graphics can help us better understand the past by visually recreating archaeological sites under accurate and authentic conditions \cite{arch_sites}. Such graphics are only scientifically valid if the components of the scene are created in an exact, scientific manner. Methods such as 3D scanning and rendering with high fidelity graphics offer a faithful way to recreate scenes from the past.

Laser scanners provide a method of capturing suitably accurate information about objects' surfaces. The Minolta 910 laser scanner, which we used, has an accuracy of under a millimetre; it also captures colour texture information for the model via its CCD.

The figure above illustrates how the two-dimensional image captured by the scanner is mapped onto the mesh data it collects. In the exploded view of the components, the 2D image (A) is mapped onto the 3D mesh (B) to produce the combined textured model (C).

Motivation

There was, however, no existing way to import the data acquired by the laser scanner into Radiance, nor a way to import the captured textures into Maya. In addition, writing texture images directly to Radiance PIC files saves the time and complication of converting via a number of intermediate stages. This paper is concerned with facilitating and simplifying the conversion from the VRML 1.0 format files produced by a Minolta 910 laser scanner to formats that can be understood by both Maya and Radiance.
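As a minimal sketch of the direct-to-PIC step (assuming flat, uncompressed RGBE scanlines rather than Radiance's run-length encoding; the function names here are illustrative, not those of our tool), a texture image can be written as a Radiance picture like this:

```python
import math

def float_to_rgbe(r, g, b):
    """Convert one linear RGB triple to Radiance's shared-exponent RGBE encoding."""
    v = max(r, g, b)
    if v < 1e-32:
        return bytes((0, 0, 0, 0))
    m, e = math.frexp(v)           # v = m * 2**e, with 0.5 <= m < 1
    scale = m * 256.0 / v
    return bytes((int(r * scale), int(g * scale), int(b * scale), e + 128))

def write_pic(path, width, height, pixels):
    """Write a flat (non-RLE) Radiance picture; pixels is a row-major list of (r, g, b)."""
    with open(path, "wb") as f:
        f.write(b"#?RADIANCE\n")
        f.write(b"FORMAT=32-bit_rle_rgbe\n\n")
        f.write(b"-Y %d +X %d\n" % (height, width))   # standard scanline ordering
        for rgb in pixels:
            f.write(float_to_rgbe(*rgb))
```

Flat RGBE scanlines are always accepted by Radiance tools, so run-length encoding can be treated purely as an optimisation.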

Results

Once the texture map has been extracted using our approach, it can be mapped onto a model. The scanner software can export Maya OBJ meshes, but these carry no texture mapping information; consequently, when Maya tries to map the texture onto the mesh it does not `fit', as shown in the figure (below, left). Our software also extracts the texture mapping information from the VRML file and writes it into the Maya .OBJ file. The result is a correctly applied texture (below, right).

A Jug Model Rendered in Maya Using the Scanner
Software [left] and with a Texture Map Using Our Method [right]
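The core of the texture-coordinate transfer can be sketched as follows. This is a simplified illustration, not our actual converter: it assumes the VRML 1.0 file stores its u,v pairs in a single TextureCoordinate2 node (a full converter must also remap the face indices via textureCoordIndex), and the function name is hypothetical.

```python
import re

def vrml_texcoords_to_obj_vt(vrml_text):
    """Pull the u,v pairs out of a VRML 1.0 TextureCoordinate2 node and
    format them as Wavefront OBJ `vt` lines."""
    m = re.search(r"TextureCoordinate2\s*\{\s*point\s*\[(.*?)\]", vrml_text, re.S)
    if m is None:
        return []
    nums = [float(t) for t in re.findall(r"[-+0-9.eE]+", m.group(1))]
    return ["vt %g %g" % (nums[i], nums[i + 1]) for i in range(0, len(nums), 2)]
```

Appending these `vt` lines to the exported OBJ, and rewriting its face records as `f v/vt ...`, is what allows Maya to apply the texture correctly.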

 

When this technique was being developed, Radiance (version 3.5) was not capable of mapping textures onto complex polygon surfaces: mapping onto simple planes worked correctly, but not when applied to more complicated polygons. At the September 2003 Radiance conference, version 3.6beta was released with an improved obj2mesh program. This new release fixes the problem and so, when used in conjunction with our software, allows high fidelity rendering of scanned objects, as shown in the figure below.

The High Fidelity Render of the Jug Model produced in Radiance

The High Fidelity Radiance Render of the Jug Model Included in a Scene.
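For reference, a typical Radiance 3.6 pipeline for rendering such a scan might look like the following (file names are placeholders; obj2mesh's -a option names the material file):

```shell
# Compile the textured OBJ into a Radiance triangle mesh:
obj2mesh -a jug.mat jug.obj jug.rtm

# Reference the mesh from a scene description (jug.rad):
#   void mesh jug
#   1 jug.rtm
#   0
#   0

# Build the octree and render:
oconv jug.rad > jug.oct
rpict -vf view.vf jug.oct > jug_render.pic
```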


Acknowledgements

Thanks to Alan Chalmers for his guidance, Richard Gillibrand for his assistance with Maya and Veronica Sundstedt for the use of the room model.