GPS Assists Earthquake Probability Assessment

Global Positioning Systems may help geologists determine which faults they should be most concerned with before an earthquake occurs.

On Oct. 17, 1989, a magnitude 7.1 earthquake struck in the vicinity of Loma Prieta, in the southern Santa Cruz Mountains. The quake caused 62 deaths, 3,757 injuries, and over $6 billion in property damage from Santa Cruz to San Francisco and Oakland, sixty miles to the northeast. Less than two years before, a U.S. Geological Survey (USGS) study had indicated that this segment of the San Andreas Fault was one of six with a 30 percent probability of experiencing a magnitude 6.5 or larger earthquake within 30 years.

In light of the data provided by that event, a revised USGS report concluded that a 33 percent probability now exists for one or more large earthquakes in the San Francisco Bay Area region between 1990 and 2000. The level of reliability for this assessment is one step below "most reliable." The report states that since the analysis is based only on the region's three major faults -- San Andreas, Hayward and Rodgers Creek -- "the calculated probabilities are necessarily a minimum estimate of the hazard." It also points out that "the most densely populated parts of the area lie atop of, or adjacent to, fault segments having the greatest potential for large earthquakes."

PROBABILITY ASSESSMENT
Scientists agree that earthquakes cannot be predicted, and perhaps never will be, since every fault behaves differently. But with sufficient and timely data, together with records of past earthquakes, the probability of a major event occurring on a given fault segment within a given time can be estimated with varying levels of reliability. Because little data existed on the fault segment near Loma Prieta, the probability estimate for that segment carried the minimum level of reliability.

Probability assessments are largely based on records of past events, along with surface deformation, fault-slip rates, and the length of time since the last event. If the previous earthquake predates the historical record, geologists must dig a trench across the fault, locate the strata displacements, and analyze them to determine when the event occurred and the amount of deformation it produced.

Fault-slip rate refers to the continuous subterranean movement, or slipping, of active faults that takes place between earthquakes. Since this is on the order of 1 to 3 centimeters a year, the surface deformation it produces occurs too slowly to excite measurable seismic waves. Fault-slip rates, therefore, must be derived by geodetic measurement of the small but detectable surface deformations caused by that movement.

By contrast, the fault slip that occurs during a major event is sudden and considerably greater. In the Loma Prieta earthquake, the Pacific plate pushed 6 feet to the northwest and thrust 4 feet up, over the North American plate, all in a matter of seconds. USGS Geophysicist Nancy King describes the process: "At depth on the fault, rocks are warm and ductile enough that friction is not an issue, they are slipping all the time. Strain builds up at shallow depths [of 10 to 15 kilometers] because rocks there are cold and brittle; they resist and lock up while the lower part continues moving. When the shallow part finally lets go, it catches up with the lower part, but it moves so quickly that it excites seismic waves. That is what we feel as an earthquake."

DETERMINING FAULT SLIP
King emphasized that fault-slip rate is a key factor in determining earthquake probability. "If we know that a fault segment is slipping at a rate of 1 centimeter per year, and the slip in the last big earthquake was 1 meter (100 centimeters), then a back-of-the-envelope guess at the recurrence time is 100 years. In real life, as we have found in the past 20 years, it is not quite this easy to predict earthquakes. In fact, we can't do it. Instead, we use fault-slip rate, along with other information, to determine the probability that an earthquake of a given size will occur on a given fault within, say, 30 years."
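
King's back-of-the-envelope arithmetic is simple enough to write out directly. The sketch below (in Python, purely for illustration) expresses only that rough recurrence estimate, not the full probability calculation she describes:

    def recurrence_years(coseismic_slip_cm, slip_rate_cm_per_yr):
        """Rough recurrence interval: time for steady slip to re-accumulate
        the slip released in the last large earthquake."""
        return coseismic_slip_cm / slip_rate_cm_per_yr

    # 1 meter (100 cm) of slip in the last event, at 1 cm/year of fault slip:
    print(recurrence_years(100.0, 1.0))  # about 100 years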

The process of deriving fault-slip rates involves complex analyses and, by necessity, certain assumptions about what happens several miles beneath the surface of the earth. Scientists, said King, have computer models that connect what happens when a fault slips to how much the surface is deformed by that slip. "Given that we know the surface deformation within certain errors, we can deduce how much the fault is slipping down below. But good estimates of fault slip depend on timely data."
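
King does not spell out the models, but a standard textbook example, assumed here purely for illustration, is the elastic "screw dislocation" model for a long strike-slip fault: steady slip at rate s below a locking depth D produces a fault-parallel surface velocity of (s/pi) x arctan(x/D) at distance x from the fault trace. A minimal sketch:

    import math

    def surface_velocity_cm_yr(x_km, deep_slip_rate_cm_yr, locking_depth_km):
        """Fault-parallel surface velocity at horizontal distance x from a long,
        vertical strike-slip fault slipping steadily below the locking depth
        (elastic half-space screw-dislocation model)."""
        return (deep_slip_rate_cm_yr / math.pi) * math.atan(x_km / locking_depth_km)

    # Example: 2 cm/yr of deep slip locked above 12 km, observed 20 km from the fault.
    print(round(surface_velocity_cm_yr(20.0, 2.0, 12.0), 2))  # about 0.66 cm/yr

Fitting a model of this kind to GPS velocities measured at many distances from a fault is, in essence, how slip at depth is inferred from surface deformation.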

Only recently, however, has technology been able to provide anything like timely surface deformation measurements. Until 1960, measurements were made by triangulation; later, they were made by EDM (electronic distance measurement). Geologists went out once or twice a year with tripods and set up using traditional survey techniques. Measurements were limited to line-of-sight distances between geodetic monuments, usually on mountain peaks. Vertical deformation was measured separately by leveling. If the tripod was not precisely centered and leveled over the monument mark, errors resulted that could go undetected for months. The entire process was expensive and time consuming, and it accounted only for the total slip between annual or semiannual measurements.

What makes timely measurement of surface deformation now possible is the geodetic-grade GPS receiver. Unlike previous survey methods, fully automated GPS stations provide continuous measurement, and send data to a control center via telephone or radio telemetry. Setup errors are virtually eliminated. At permanent stations, GPS antennas are there to stay. At temporary sites, they are screwed directly into the base of the monument. Also, GPS calculates horizontal and vertical distances simultaneously, and is not limited to line-of-sight measurement.

"Continuous GPS measurement has special advantages," King explained. "If we design our permanent stations properly and have backup power, the receiver will track right through an earthquake. We can get a very fast picture of the surface deformation, and use that to infer what part of the fault slipped and by how much. Along with information from seismometers and other geophysical instruments, continuous data can be used to deduce at which part of the fault aftershocks are likely to occur, and what bridge approaches, tunnels, pipelines or aqueducts may have been damaged by fault slip and should be inspected quickly."

To learn more about fault activity in the San Francisco Bay Area region, scientists from the Universities of California at Berkeley, Davis, Santa Cruz and Stanford -- along with USGS and Lawrence Livermore National Laboratory -- established the Bay Area Regional Deformation (BARD) Network, an array of GPS stations throughout the San Francisco Bay Area.

BARD scientists believe that long-term, continuous GPS measurement of surface deformation will ultimately provide greater understanding of fault behavior and associated stresses caused by recent earthquakes, such as the Loma Prieta event. The network will identify active faults as well as those most likely to fail in a large earthquake.

"Continuous GPS is the way to get much greater accuracy and reliability," said UC Santa Cruz Geologist Eli Silver. "Even with SA [selective availability: an accuracy limitation imposed by the Department of Defense for purposes of national security], horizontal accuracies are on the order of 2 to 3 millimeters. The error is about 2 millimeters, plus one part in 100 million. Vertical accuracy is in the range of 2 to 3 centimeters in campaign (temporary) mode, but may be close to half a centimeter in continuous, long-term mode. The kind of regional network we are talking about will have receivers every 5 to 10 kilometers, [and we are] probably looking at 3 millimeters of error."
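
Silver's error figure, 2 millimeters plus one part in 100 million of the baseline length, is easy to apply; the sketch below simply evaluates that rule of thumb for the 5-to-10-kilometer station spacing he mentions:

    def horizontal_error_mm(baseline_km, fixed_mm=2.0):
        """Horizontal error: a fixed 2 mm term plus one part in 1e8 of the baseline."""
        baseline_mm = baseline_km * 1e6   # kilometers to millimeters
        return fixed_mm + baseline_mm / 1e8

    print(horizontal_error_mm(10.0))  # 2.1 mm for a 10 km baseline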

To achieve these goals, it is estimated that several hundred permanent GPS receivers will be needed throughout the region. By the end of 1996, the network is expected to have 20 stations, not counting those operated by other agencies, such as the Coast Guard, the Jet Propulsion Laboratory (JPL), and the Scripps Institution of Oceanography. BARD receives data from these stations, and will also receive data from FAA GPS installations when these are completed. Because these agencies also use precision GPS receivers, their data are suitable for both navigation and geodetic purposes.

FUNDING
Funding for the network is provided by the National Science Foundation and by the National Earthquake Hazards Reduction Program. At present, the cost of permanent installations is a limiting factor. To compensate, the consortium has established a few permanent stations and several semi-permanent ones. In the campaign measurement mode, instruments may be installed for up to a month at a temporary station, then moved to another. To further reduce costs and maximize resources, several sites have been set up at existing seismic stations.

OPERATION
Each institution maintains and downloads its own sites and receivers. Data from permanent stations are automatically polled via telephone, except for the Farallon Islands station, whose data are retrieved over a cellular link. The data are sent to the USGS center in Menlo Park, Calif., where they are processed using JPL software and precise orbital information from the International GPS Service. From there, the data are sent to the Berkeley Archive, a site operated by the Northern California Earthquake Data Center at UC Berkeley. BARD data are archived in the standard RINEX (receiver-independent exchange) format, a common language understood by post-processing software.

Operators of permanent GPS stations generally make the data available to anyone via anonymous FTP. The Northern California Earthquake Data Center and the USGS center at Menlo Park both maintain Web sites that provide background information on BARD, maps, lists of GPS stations, GPS receiver details, and links to sites with data on other regions of seismic risk in California.
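
The archive addresses are not given here, but the anonymous-FTP retrieval the operators describe follows a standard pattern. The host, directory, and file name below are placeholders only, not actual BARD addresses:

    from ftplib import FTP

    HOST = "ftp.example.edu"          # placeholder host
    REMOTE_DIR = "/pub/gps/rinex"     # placeholder directory
    FILENAME = "site0010.96o"         # illustrative RINEX observation file name

    ftp = FTP(HOST)
    ftp.login()                       # anonymous login
    ftp.cwd(REMOTE_DIR)
    with open(FILENAME, "wb") as f:
        ftp.retrbinary(f"RETR {FILENAME}", f.write)
    ftp.quit()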

GOALS
"In the future," said King, "we hope to approach near-realtime monitoring of deformation. In that case, continuous GPS stations could also monitor crucial structures such as dams. For example, if, after an earthquake, you find that a dam is sinking by a centimeter a day you want to know that. With continuous GPS monitoring, you will."

"Realtime data will definitely improve our ability to do probability assessments," agreed Silver, "especially after a few years. In the long term, we should get a much better idea which faults and what parts of faults to be most concerned with."

Bill McGarigle is a freelance writer specializing in communication and information technology. E-mail: bmcgarigle@aol.com.


HARDWARE:
Ashtech Z-12 GPS receiver

Trimble 4000 Series GPS receiver
