Modern seismic imaging helps manage drilling risk

Don Stowers, Editor-OGFJ

Seismic imagery, a tool used to characterize subsurface geology, has reduced drilling risk to a level once considered unimaginable. As a result of advances in seismic technology and reservoir modeling, today’s investors no longer see oil and gas exploration as the calculated crapshoot it once was. Even the most conservative investment firms see the reward as far outweighing the risk in upstream energy ventures, and with commodity prices expected to remain at high levels for the foreseeable future, there is no shortage of project capital.

How did seismic technology become so reliable? It did not happen overnight, but it did come in waves. And just before the onset of each new wave, the E&P industry had come to believe there were few prospects left to drill and that many existing fields were past their prime. After each new wave, a generation of new prospects opened up, resulting in new exploration opportunities and improved recovery from existing reservoirs.


3D reservoir model with pressure distribution at 1,000 days. Image courtesy of Object Reservoir.

Part one in this two-part series traces the evolution of seismic imaging from its origins and discusses the impact the technology has had on the petroleum industry. Part two, which will appear in the January issue of OGFJ, will focus on the latest advances in seismic imaging technology and will look at what lies ahead.

Evolution of seismic technology

The first recorded use of sound waves and “echoes” to identify an oil field was in 1924 in Brazoria County, Texas, just south of Houston. Oil was discovered in a field beneath the Nash salt dome based on single-fold seismic data.

Before that, oil exploration was based primarily on surface geology and other indicators. It was crude and very much a guessing game. Early wildcatters rolled the dice, and very often they crapped out. Losses due to dry holes could be devastating.

Seismic instruments used to measure and record ground movements during earthquakes had been around since the mid-1800s. However, it was another 60 to 70 years before engineers and earth scientists discovered they could use low-frequency sound waves to map subsurface geologic structures and locate possible hydrocarbon traps.


A geotechnical ship conducting a seismic survey in heavy seas. Photo courtesy of Fugro NV Geotechnical Services.

The first workable seismograph, an instrument that measures the intensity of underground vibrations, was developed by Ludger Mintrop for the German army in World War I. In 1917, John C. Karcher, an employee of the US Bureau of Standards, independently invented a similar instrument. Both the German and American versions were crude contrivances and were intended for use in locating enemy artillery by measuring the seismic vibrations produced by their firing.

After the war, Karcher began applying his knowledge of seismology to geology. He had received electrical engineering and physics degrees from the University of Oklahoma and interrupted his graduate studies at the University of Pennsylvania to resume his experiments, in the hope of finding a commercial use for the device in the petroleum industry. He enlisted two of his former Oklahoma professors to help him.


The didong yi, or "earth movement instrument," was the world's first seismograph, created during the Han dynasty by Zhang Heng (right) in 132 A.D. It was cast in bronze in a shape resembling a wine jar. In the center was a cavity where a pendulum carried eight mobile arms radiating in different directions, each connected to a crank with a lever. Outside the vessel were eight dragons, each holding a ball in its mouth, and eight toads. The shock of an earthquake would set the pendulum moving in one direction, pushing one of the arms to raise the crank and causing the corresponding dragon to release the ball into the mouth of the toad sitting below. The direction in which the earthquake had taken place could be deduced from which dragon had disgorged its ball.

Dr. D. W. Ohern, a geologist, and Dr. W. P. Haseman, a physicist, collaborated with him in refining the seismograph’s design. The three men subsequently formed Geological Engineering Co. and prepared to test their new device. Financial backing was provided by Oklahoma City oil men Frank Buttram and brothers Walter R. and William E. Ramsey.

On June 4, 1921, on a farm three miles north of Oklahoma City, the group tested the seismograph and determined that it could indeed show subsurface structure that was capable of holding oil.

During the 1920s, Karcher’s company attempted to use the new tool in oil fields, but the price of crude oil plummeted and the men were never able to market the device commercially. They were, however, party to the successful drilling of a well near Seminole, Okla., in 1928, in which Karcher’s reflection seismograph was used to locate oil-bearing structures.


John Karcher (above left), a University of Oklahoma graduate, began applying his knowledge of seismology to geology right after World War I. The early seismographs he invented were used by the military to find the position of enemy artillery batteries on European battlefields (right photo). Early mobile seismic equipment was often mounted on the bed of a pickup truck, such as this one in the Midwest.

Karcher, who died in 1978, is considered the father of the reflection seismograph. The company he founded was sold to new owners in the 1940s and was later renamed Texas Instruments Inc.


Around 1925, while Karcher was attempting to apply his technology to the search for oil, two Texas brothers, Dabney Petty, a geologist with the Texas Bureau of Economic Geology, and Scott Petty, began work on a seismograph based loosely on the type Mintrop had developed for the German army. Their innovation was to use a sensitive vacuum-tube amplifier to detect vibrations rather than relying on dynamite to generate them; the instrument was sensitive enough to register the vibrations made by dropping a heavy piece of lead onto the ground.

The brothers’ collaboration resulted in the invention of the first displacement-sensitive seismograph and gave birth to a pioneering geophysical firm, the Petty Cos.

Post-World War II developments

Following the Second World War, the development of the transistor allowed geophysical companies to replace vacuum tubes with transistorized equipment, which lightened the load for field crews. Another advance, in the mid-1950s, was the recording of seismic signals on magnetic tape. Changing from paper to magnetic tape eventually led to machine processing, development of the analog processor, and a complete change in the manner in which seismic data were collected and processed.

Another advance in seismic imagery came in the 1950s. William Harry Mayne, an employee of Petty Geophysical Engineering Co., developed a better means of recording seismic signals: the Common Reflection Point (CRP) method, which eventually came to be known as the Common Depth Point (CDP) method and for which he was granted a United States patent in 1956.

Mayne’s invention remains the principal signal-to-noise enhancement technique in seismic exploration and the foundation from which new processing techniques continue to emerge.
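As a rough illustration of why stacking boosts signal-to-noise, here is a minimal sketch in Python using synthetic, already moveout-corrected traces (the fold, sample interval, and noise level are illustrative assumptions, not field parameters): summing traces that share a common depth point reinforces the coherent reflection while random noise averages toward zero, improving signal-to-noise by roughly the square root of the fold.

```python
import numpy as np

rng = np.random.default_rng(0)

n_traces, n_samples = 24, 500        # assumed 24-fold gather, 500 time samples
t = np.arange(n_samples) * 0.004     # assumed 4 ms sample interval

# Hypothetical "true" reflection: a spike at 1.0 s, already moveout-corrected
reflection = np.zeros(n_samples)
reflection[250] = 1.0

# Each trace records the same reflection plus independent random noise
gather = reflection + rng.normal(scale=1.0, size=(n_traces, n_samples))

# CDP stack: average the moveout-corrected traces
stack = gather.mean(axis=0)

# Noise drops roughly as 1/sqrt(fold), so the stacked trace is much cleaner
print("single-trace noise std :", gather[0, :200].std().round(2))
print("stacked-trace noise std:", stack[:200].std().round(2))   # ~1/sqrt(24) smaller
```

In practice each trace would first receive a normal-moveout correction so the reflection aligns across the gather before the traces are summed.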

Also in the 1950s, Conoco developed a technology known as Vibroseis, which relied on specially designed vibrating or weight-dropping equipment to create recordable waves that would penetrate the ground, strike underground formations, and reflect back to the seismograph. This was simpler, faster, and far less costly than generating waves with controlled dynamite explosions and recording them.
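A hedged sketch of the underlying idea, assuming a simple linear sweep and one hypothetical reflector (the frequencies, sweep length, and reflector time below are illustrative): the vibrator transmits a long, known sweep rather than an impulsive bang, and cross-correlating the recorded trace with that sweep compresses each reflection back into a short, interpretable wavelet.

```python
import numpy as np

fs = 500.0                                   # assumed sample rate, Hz
sweep_len, record_len = 8.0, 12.0            # assumed sweep and record lengths, s
t_sweep = np.arange(0, sweep_len, 1 / fs)

# Linear 10-100 Hz sweep (the vibrator's pilot signal)
f0, f1 = 10.0, 100.0
sweep = np.sin(2 * np.pi * (f0 + (f1 - f0) * t_sweep / (2 * sweep_len)) * t_sweep)

# Hypothetical Earth response: a single reflector at 1.5 s two-way time
reflectivity = np.zeros(int(record_len * fs))
reflectivity[int(1.5 * fs)] = 1.0

# The geophone records the sweep convolved with the reflectivity, plus noise
recorded = np.convolve(reflectivity, sweep)[: reflectivity.size]
recorded += np.random.default_rng(1).normal(scale=0.1, size=recorded.size)

# Correlating with the pilot sweep compresses the reflection to a short wavelet
correlated = np.correlate(recorded, sweep, mode="full")[sweep.size - 1 :]
print("reflector recovered near t =", correlated.argmax() / fs, "s")   # ~1.5 s
```

The correlated output is what processors actually interpret; the long sweep itself never appears in the final trace as a discrete event.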

Finally, the advent of digital data processing, which was conceived in the 1950s and came into its own in the next decade, added immensely to the ability to collect and analyze seismic data. Together, CDP, Vibroseis, and digital technology would shape seismic exploration into its present format.

As computers became increasingly sophisticated, the ability to develop databases evolved. Computers and information technology added a new dimension to seismic imaging and reservoir modeling. Over time, this led to today’s supercomputing power and high-definition imaging.

The origins of 3D seismic

High-resolution, three-dimensional seismic reflection imaging did not appear overnight; the concept of a 3D survey has been around since the first half of the twentieth century. All that was lacking to implement it was a more efficient data flow and greater computer power to process, display, and interpret the data.


Seismic image of the sea floor off the coast of Nigeria, West Africa. Image courtesy of Veritas DGC.

As computers became integrated into commercial applications in the 1970s, their ability to process data advanced seismic capabilities exponentially. However, imaging methods were still two-dimensional.

Exxon Corp. shot the first 3D seismic survey in its Friendswood field near Houston in 1967. Within five years, Chevron, Texaco, Mobil, Phillips, Amoco, and Unocal were all involved in projects aimed at evaluating 3D seismic imaging.

The development of 3D seismic was one of the most important technological breakthroughs for the oil and gas industry in the past 50 years. Depicting the subsurface on a rectangular grid provided the viewer with detailed information about subsurface volume that could not be obtained previously from 2D data.
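As a sketch of what the rectangular grid means in practice (the array dimensions below are illustrative assumptions, not tied to any survey described here), a 3D volume is simply a dense cube of amplitudes indexed by inline, crossline, and time sample, from which map-view time slices and vertical sections can be pulled, views that a set of isolated 2D lines cannot provide directly.

```python
import numpy as np

# A 3D seismic volume is naturally stored as a dense grid of amplitudes:
# (inline, crossline, time sample). Dimensions here are illustrative only.
n_inlines, n_xlines, n_samples = 200, 150, 1000   # 4 ms sampling -> 4 s record
volume = np.zeros((n_inlines, n_xlines, n_samples), dtype=np.float32)

# Views that 2D lines cannot provide directly:
time_slice = volume[:, :, 500]      # map view of amplitude at 2.0 s
inline_section = volume[80, :, :]   # vertical section along one inline
crossline_section = volume[:, 40, :]
print(time_slice.shape, inline_section.shape, crossline_section.shape)
```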

Today, 3D surveys are used not only for primary recovery but also to model reservoirs, to plan and execute enhanced oil recovery strategies, and to monitor fluid movement in reservoirs as they are developed and produced.

Better diagnostic and imaging technologies enable producers to “see” oil, gas, and associated rocks and to visualize the barriers and pathways for underground fluid flow. 3D imaging technology has been a major contributor to the revitalization of operations in the Gulf of Mexico, where oil production increased by 50 percent between 1995 and 2000.

Similarly, a Department of Energy-funded project helped boost oil production off the California coast while reducing the amount of water produced by re-analyzing past seismic data with modern logging instrumentation and newly developed software.

State-of-the-art geophysical technologies cannot, however, image most reservoir features. Surface seismic can generally differentiate only rock layers more than about 30 feet thick; smaller features, such as thin reservoir beds and fractures, are effectively invisible to it.
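That figure is consistent with the standard quarter-wavelength rule of thumb for vertical resolution. The short calculation below uses assumed, illustrative values for interval velocity and dominant frequency (not numbers from the article's sources) to show how a resolvable thickness on the order of 30 feet arises.

```python
# Rule-of-thumb vertical resolution: about a quarter of the dominant wavelength.
# The velocity and frequency below are illustrative assumptions, not field data.
velocity_ft_s = 8_000       # assumed sedimentary interval velocity, ft/s
dominant_freq_hz = 60       # assumed dominant frequency of surface seismic, Hz

wavelength_ft = velocity_ft_s / dominant_freq_hz
resolution_ft = wavelength_ft / 4
print(f"resolvable bed thickness ~ {resolution_ft:.0f} ft")   # ~33 ft
```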

An additional problem is the limited capability of geophysical techniques to distinguish between water and oil. Still, with advanced diagnostics and imaging technology, the costs and risks of exploring and developing these reserves can be significantly reduced.

New techniques and technologies are in the works that will enable E&P companies to recover vast amounts of hard-to-reach natural gas, including gas in geological formations at depths of 15,000 feet or greater. The US Department of the Interior's Minerals Management Service estimates that 55 trillion cubic feet of natural gas exists at 15,000 feet or more below the outer continental shelf in the Gulf of Mexico.

In addition, the US Geological Survey says there may be as much as 32 billion barrels of technically recoverable, onshore undiscovered oil resources in the United States. Tapping into this would certainly help reduce the country’s over-dependence on foreign oil.

Look for more information in the January issue, when OGFJ examines the latest advances in seismic technology and reservoir modeling.
