Visualization software tames large data sets to forecast earthquakes

On January 17, 1994, an earthquake of magnitude 6.7 rocked Northridge, CA, killing 57 people and causing property damage of more than $10 billion. This earthquake, of a relatively moderate magnitude, was the second most significant disaster in United States history at the time, surpassed only by Hurricane Andrew. Even in an area as closely studied and monitored as the Los Angeles Basin, the fault that caused the earthquake was unknown before the event.


Jon Snyder and Mark Goosman


Karl Mueller, a structural geologist and assistant professor at the University of Colorado department of geological sciences, and Adam Bielecki, a graduate student, are using the software package Environment for Visualizing Images (ENVI; Research Systems, Boulder, CO) to study the morphology of Wheeler Ridge, a fold produced by shortening of the crust above an active thrust-fault northeast of Los Angeles, CA. Analyzing the movement of the surface of Wheeler Ridge should contribute to more-accurate long-term forecasts of thrust-fault behavior and lead to the development of better models for forecasting damaging earthquakes in densely populated areas.

In addition to information compiled during on-location field studies, Mueller and Bielecki's research incorporates remote-sensing data gathered by the NASA Raster Scanning Airborne Laser, a scanning laser altimeter mounted on a jet aircraft. The altimeter maps the surface of the earth in 100-m-wide swaths by firing a laser 5000 times per second and recording the time it takes the pulses to bounce off the ground and return to the aircraft. The time-of-flight data are then corrected for sensor pointing attitude and for the trajectory of the jet, derived from Global Positioning System data, to produce accurate, georeferenced values for surface elevation and the spatial position of each footprint.
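The underlying range measurement is simple to sketch: the pulse's round-trip time, multiplied by the speed of light and halved, gives the distance from aircraft to ground. The short Python snippet below illustrates the arithmetic; the function name and the sample return time are illustrative, not taken from the NASA instrument.

```python
# Sketch of a laser altimeter's basic range measurement: distance
# equals the speed of light times the round-trip time, divided by
# two. The function name and sample value are illustrative only.

C = 299_792_458.0  # speed of light, m/s

def time_of_flight_to_range(round_trip_seconds):
    """Return the one-way distance, in meters, for a laser round trip."""
    return C * round_trip_seconds / 2.0

# A pulse returning after 40 microseconds implies roughly 6 km of range.
print(round(time_of_flight_to_range(40e-6), 1))  # 5995.8
```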

FIGURE 1. A colored slope map of eastern Wheeler Ridge, including an aqueduct and a pumping plant to the left of center, has flat regions in black, with red representing steeper slopes.

With flight speed maintained at 100 m/s, the topographic data have 1.67-m along-track and across-track resolution between measurement footprints, with vertical resolution of ±5 cm. Such high-resolution remote sensing produces large data sets, typically several hundred megabytes in size, that do not lend themselves to interpretation with conventional software packages. ENVI, however, can process the data in two to three hours and can produce shaded-relief and slope images in four hours or less. The software's rapid processing has enabled Mueller to test 20 different theories that attempt to explain the unique topography of Wheeler Ridge (see Fig. 1).
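The several-hundred-megabyte figure is easy to check with a back-of-envelope estimate. Assuming each footprint is stored as three 8-byte floating-point values (longitude, latitude, and altitude), an assumption about the file layout rather than a documented fact about the instrument, a one-hour survey at 5000 pulses per second works out as follows:

```python
# Back-of-envelope check on the "several hundred megabytes" figure.
# Assumes each footprint is stored as three 8-byte floats
# (longitude, latitude, altitude); the real file format may differ.

PULSES_PER_SECOND = 5000
BYTES_PER_FOOTPRINT = 3 * 8      # lon, lat, alt as 8-byte doubles
FLIGHT_SECONDS = 60 * 60         # a one-hour survey flight

footprints = PULSES_PER_SECOND * FLIGHT_SECONDS
size_mb = footprints * BYTES_PER_FOOTPRINT / 1e6
print(footprints, round(size_mb))  # 18000000 432
```

Eighteen million footprints at 24 bytes apiece come to roughly 430 MB, consistent with the data-set sizes the researchers describe.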

Formatting data

The first step in processing the data is to break the flight-corrected altimeter data into portions the computer can handle. The unformatted data are then converted into three vectors (longitude, latitude, and altitude), which, in turn, are triangulated and interpolated with the TRIANGULATE and TRIGRID routines. These use a grid spacing calculated by a function Bielecki wrote in Interactive Data Language (IDL), the underlying language of ENVI, which allows a user to develop customized processing routines. The data are then exported into ENVI, where the topographic-modeling features can produce a two-dimensional (structured binary) grid, a digital elevation model. From there, the data can be refined to remove noise and other unwanted artifacts. "The ability to write batch files in IDL saved a great deal of time, because you can automate all of these tasks for each parsed section of the data set," says Bielecki.
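In spirit, the triangulate-and-interpolate step resamples scattered footprints onto a regular grid. The toy Python sketch below substitutes nearest-neighbor gridding for the Delaunay triangulation that IDL's TRIANGULATE and TRIGRID perform, simply to keep the sketch dependency-free; the footprint values are invented.

```python
# Toy stand-in for the gridding step: resample scattered (x, y, z)
# footprints onto a regular grid. IDL uses Delaunay triangulation;
# nearest-neighbor sampling is used here only for simplicity.

def grid_points(points, nx, ny):
    """Resample (x, y, z) points, x and y in [0, 1], onto an ny-by-nx grid."""
    dem = []
    for j in range(ny):
        row = []
        for i in range(nx):
            gx, gy = i / (nx - 1), j / (ny - 1)  # grid-node coordinates
            # Elevation of the nearest footprint stands in for interpolation.
            _, z = min(((gx - x) ** 2 + (gy - y) ** 2, z) for x, y, z in points)
            row.append(z)
        dem.append(row)
    return dem

footprints = [(0.1, 0.1, 100.0), (0.9, 0.2, 120.0), (0.5, 0.9, 150.0)]
dem = grid_points(footprints, 4, 4)
print(dem[0][0], dem[3][3])  # 100.0 150.0
```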

The language was originally developed to help scientists interact more easily with their technical data. It supports most common file formats, including popular image formats such as JPEG, GIF, TIFF, and PICT. It also supports scientific data formats, offering specialized support for HDF, CDF, netCDF, and other formats central to missions at NASA and other organizations. The software provides an array-oriented language, allowing tasks to be performed in a fraction of the number of lines required by more-traditional programming languages.
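The brevity of the array-oriented style can be illustrated with NumPy, a comparable array language (IDL itself is proprietary, so NumPy stands in here): one whole-array expression replaces an explicit element loop.

```python
# Illustration of array-oriented brevity, using NumPy as a stand-in
# for IDL. One whole-array expression replaces an element loop.

import numpy as np

elevations_m = np.array([120.0, 135.5, 150.25, 98.0])

# Scalar-language style: loop over every element.
feet_loop = []
for e in elevations_m:
    feet_loop.append(e * 3.28084)

# Array style: the entire conversion in a single expression.
feet_array = elevations_m * 3.28084

print(np.allclose(feet_loop, feet_array))  # True
```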

The graphics system uses OpenGL to take advantage of hardware acceleration, giving high-performance image processing and integrated two- and three-dimensional graphics. Because the graphics system is object-oriented, programs can use features such as hierarchy, inheritance, polymorphism, and code reusability. The development environment includes a color-coded source-code editor, source-code debugger, application-code profiler, and variable watch window. The Windows version allows users to design their graphical user interface in a drag-and-drop environment.

The language is available on Microsoft Windows 95/98/NT, Power Macintosh, and most popular Unix platforms. Applications written on one platform can run on other supported platforms with little or no change.

FIGURE 2. A three-dimensional shaded relief map shows a range in California, with an oil-pumping facility visible on the left half of the image.

Mueller says that having the algorithms provided, instead of having to write them, allowed the researchers to move straight into analyzing high-resolution data. The data are represented in several different forms: color-coded contour maps, 3-D gray-scale and color renderings, 3-D shaded-relief maps, and long mosaics of images each provide insight (see Fig. 2). Bielecki says the topographic modeling allows the creation of shaded-relief, aspect, and slope images, "which are key elements in testing all the viable hypotheses for terrace development based on morphology and attitude." A contour map can be overlaid on 3-D images, with ranges of values displayed in different colors and line weights and annotated for easy identification.
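A slope image of the kind shown in Figure 1 can be derived from a digital elevation model by taking finite-difference gradients of elevation. The sketch below, in Python with NumPy rather than the package's own topographic-modeling routines, shows the idea; the small DEM and its values are invented.

```python
# How a slope image can be derived from a digital elevation model:
# finite-difference gradients give each cell's surface slope.
# The small DEM here is invented for illustration.

import numpy as np

dem = np.array([[100.0, 101.0, 103.0],
                [100.0, 102.0, 105.0],
                [100.0, 103.0, 107.0]])
cell = 1.67  # grid spacing in meters, matching the altimeter footprint

dz_dy, dz_dx = np.gradient(dem, cell)  # elevation change per meter
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

print(slope_deg.shape)  # (3, 3); steeper cells hold larger angles
```

In a color-coded rendering such as Figure 1, each slope value would then be mapped to a color, with flat cells dark and steep cells red.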

Useful information

Builders, city planners, land-use managers, and federal agencies such as FEMA that plan responses to future earthquakes all can benefit from Mueller's research. "Knowing how fast a fault is moving, which faults are active and which are inactive, and how big the earthquakes are along a fault are critical factors," Mueller says.

More than 400 faults exist in the Los Angeles area alone, many of which are only expressed at the surface as active folds. Wheeler Ridge presents an ideal natural laboratory for developing geomorphic models that can be applied to similar features in dense urban areas such as Los Angeles. And, while the mystery of terraces at Wheeler Ridge remains unsolved, their intriguing formation inspires earth scientists to apply and refine the latest remote-sensing devices and analysis tools, including laser altimeters.

Higher-resolution remote sensing

Interest in altimeter data follows a general trend toward higher-resolution remote-sensing data. Laser altimetry is more precise than most other methods of digital elevation measurement. In the same vein, remote-sensing imaging devices are moving toward higher spatial and spectral resolution. High-spatial-resolution sensors can acquire images from space with resolution of 5 m or finer, allowing even small objects to be identified on the basis of their shape.

Hyperspectral sensors are extremely sensitive to the wavelength of light, imaging a scene in hundreds of narrow spectral bands. This acute spectral sensitivity provides detailed insight into the materials that make up objects in images. Currently, earth scientists are analyzing data from satellites and other remote-sensing devices to get a better grasp on the effects of global warming, the loss of rain forests, and cyclical weather phenomena such as El Niño and La Niña.
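One common way hyperspectral imagery is used to identify materials is to compare a pixel's spectrum with a library reference spectrum via the angle between them, treated as vectors; a small angle indicates a similar material regardless of illumination. The sketch below illustrates the idea with invented toy spectra, not real sensor data.

```python
# Material identification from hyperspectral data via the spectral
# angle between a pixel spectrum and a library reference spectrum.
# Small angle = similar material. Toy values, not real sensor data.

import math

def spectral_angle(a, b):
    """Angle, in radians, between two spectra treated as vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    # Clamp to guard against floating-point overshoot past +/-1.
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))

reference  = [0.2, 0.4, 0.6, 0.5]   # library spectrum for a material
pixel_same = [0.4, 0.8, 1.2, 1.0]   # same shape, brighter illumination
pixel_diff = [0.6, 0.5, 0.2, 0.1]   # spectrally different material

print(spectral_angle(reference, pixel_same) < spectral_angle(reference, pixel_diff))  # True
```

Because the measure depends only on spectral shape, the brighter but identically shaped pixel matches the reference while the dissimilar one does not.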

Farmers are beginning to realize the benefit of multi- and hyperspectral analysis, which can help them produce more-abundant and healthier yields while reducing costs related to fertilizer, pest and disease control, and irrigation. As the data become more readily available, other uses will naturally become common. For example, merging digital elevation data with multi- and hyperspectral imagery will give urban planners and environmental monitors a way to study growth and more accurately assess its impact.

JON SNYDER is a writer and MARK GOOSMAN is IDL product manager at Research Systems, 4990 Pearl East Circle, Boulder, CO 80301; e-mail: [email protected]
