Oil & Gas
Data Driven Decisions
Better tech, more information are a boon to oil field exploration
By Julie Stricker

Geologists knew more than a century ago there was oil on Alaska’s North Slope. A 1921 report to the US Geological Survey that discussed Alaska petroleum noted areas of the state had oil seeps that looked commercially promising, including “some indications of oil in the extreme northern part of Alaska, a region at present almost inaccessible.”

At that time, geologists relied on obvious signs of oil, such as seeps, to find potential oil fields. In the ’20s, a method of using sound waves to define underground rock layers was developed. Called 2D seismic surveys, these allowed geologists to see thin slices of the layers of rock underground. In the ’80s, 3D surveys widened and sharpened that view. Today, 4D surveys are common. Each advance resulted in more data points for a more accurate look at what was happening underground.
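
For readers who think in code, the jump from 2D to 3D seismic is easy to picture as a data structure: a 2D line is essentially one vertical slice of the cube of samples a 3D survey records. The sketch below is purely illustrative, with random numbers standing in for reflection amplitudes and made-up survey dimensions.

```python
import numpy as np

# Illustrative only: random values stand in for reflection amplitudes.
rng = np.random.default_rng(2)
volume_3d = rng.normal(size=(200, 150, 300))   # (inlines, crosslines, time samples)

# A 2D survey sees something like a single inline pulled out of that volume.
line_2d = volume_3d[100, :, :]
print("3D volume:", volume_3d.shape, "-> one 2D line:", line_2d.shape)
```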

“It’s like you’re looking through your binoculars, but you don’t have them focused,” says Chris Nettels, president of GeoTek Alaska, who has worked in the oil industry for more than forty years. “You’re looking and you can see these blob shapes in general, but you can’t see what they are. And then all of a sudden, you start focusing and oh, that’s a bear!”

Over the years, geologists have been honing their focus. In the oil and gas industry, data is king. Nettels says a coworker once told him, “You’ve got to learn to drink from the firehose, Chris. That data will always be available. Drink from the firehose and take as much data as you can.”

Technology has been and remains a huge driver in gathering and processing data. During his career, Nettels has moved from floppy disks to thumb drives, from a room full of supercomputers to powerful laptops.

“You start to see things better through better data acquisition processing,” Nettels says. “Then you can do a better job interpreting because you’ve got better data sets. So it really comes down to visualization.

“No one today will drill wells without having seismic data; you’d just be a nut to do it,” he adds. “So that means today no one’s going to do a wildcat exploratory well without having shot seismic data first [to] know where to put that location for your best opportunity at finding oil.”

What Are They Looking For?

Imagine the North Slope is a layer cake, with oil-bearing layers sandwiched between layers of impervious rock, he says. Each layer has a different density, which affects the speed of a seismic wave. But the Slope is also riddled with lots of minor faults, like cracks in the cake layers, which can separate the oil reservoirs.

“It can end up making that layer not homogenous or continuous in that cake,” Nettels says. “If you put a well between faults, you’ll only get that area of that layer within those faults because it’s not draining across. Those can cause headaches for people drilling wells up in the Prudhoe Bay area, so they attempt to try to look at really fine detail.”
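
The layer-cake picture Nettels describes can be sketched in a few lines of code: each layer’s thickness and seismic velocity determine how long a reflected sound wave takes to travel down and back, which is what a survey records. The values below are hypothetical, not real North Slope figures.

```python
# A toy layered-earth model: made-up thicknesses (m) and interval velocities (m/s).
layers = [
    {"name": "overburden", "thickness_m": 600.0, "velocity_mps": 2000.0},
    {"name": "seal shale", "thickness_m": 300.0, "velocity_mps": 2600.0},
    {"name": "reservoir sand", "thickness_m": 150.0, "velocity_mps": 3200.0},
]

depth_m = 0.0
twt_s = 0.0  # cumulative two-way travel time to the base of each layer
for layer in layers:
    depth_m += layer["thickness_m"]
    # The wave goes down and back up through this layer at its interval velocity.
    twt_s += 2.0 * layer["thickness_m"] / layer["velocity_mps"]
    print(f"{layer['name']:>14}: base at {depth_m:5.0f} m, two-way time {twt_s:.3f} s")
```

Faults like the ones Nettels describes show up as offsets in those reflection times from one survey location to the next.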

It’s not a static measurement. Nettels says he’s been working with companies that are doing 4D modeling, which, along with the X, Y, and Z in 3D seismic surveys, includes a fourth dimension: time.

That consists of installing a grid of sensitive tiltmeters in an area in which a company is fracking—injecting liquids at high pressure to improve the flow of oil.

“When you start injecting stuff in the ground, you really don’t know where it’s going to go,” he says. “You don’t really have any control. So what they try to do is understand what’s happening to the frack by putting these tiltmeters out to see how much of the subsurface has been influenced away from the wellbore.”
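
A rough way to picture that fourth dimension is as a before-and-after comparison of the same grid of measurements. The sketch below uses synthetic tiltmeter readings and an assumed detection threshold, not data or methods from any actual frack-monitoring job.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = (10, 10)                          # a hypothetical 10 x 10 grid of tiltmeter stations
before = rng.normal(0.0, 0.5, grid)      # pre-injection tilt readings (microradians, made up)
after = before.copy()
after[3:6, 3:6] += 4.0                   # pretend the injection deformed this patch of ground

change = np.abs(after - before)
influenced = change > 2.0                # assumed threshold for "the frack reached here"

rows, cols = np.nonzero(influenced)
well = (4.5, 4.5)                        # wellbore at the center of the grid
reach = np.hypot(rows - well[0], cols - well[1]).max() if rows.size else 0.0
print(f"{influenced.sum()} stations show deformation; farthest is {reach:.1f} cells from the well")
```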

The subsurface isn’t the only area of concern. Quantum Spatial specializes in capturing precise imagery of the surface of oil fields and potential oil fields, according to Adam McCullough, Alaska program manager.

“Our office specifically has been around in Anchorage since the early ’60s,” he says. “We flew the first exploration aerial imagery up on the North Slope in 1968 right after the Prudhoe Bay exploration discovery.”

At the time, Quantum Spatial used analog film cameras to photograph areas of interest. The company moved to digital photography in the ’90s. Today, Quantum Spatial uses LIDAR, which stands for light detection and ranging.

“It’s a remote sensing technology that uses infrared light to take precise measurements of the Earth’s natural and built features,” McCullough says. “A lot of the data that we capture goes into the general engineering planning to support the oil development.”

That includes information on the best places to put an ice or gravel road or drill pads and facilities.

“It’s also used for permitting, so they can understand the environmental impacts,” he says. “How many lakes are we going to impact? What’s the acreage for potential impact and how do you permit for all that?”
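
The “ranging” half of light detection and ranging comes down to a time-of-flight calculation: the sensor times a laser pulse’s round trip and converts that to a distance. A minimal sketch, with an illustrative return time:

```python
# Range from a single LIDAR pulse: distance = speed of light * round-trip time / 2.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_return_time(round_trip_seconds: float) -> float:
    """Distance from the sensor to whatever reflected the pulse."""
    return C * round_trip_seconds / 2.0

# A pulse that returns after roughly 6.67 microseconds hit something about 1 km away.
print(f"{range_from_return_time(6.67e-6):.1f} m")
```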

Quantum Spatial also offers bathymetric LIDAR, which uses a green wavelength laser instead of infrared. The green wavelength is capable of penetrating water and mapping the depths of lakes, streams, and oceans, McCullough says. Once the information is gathered and verified against control data from a ground surveyor, it is taken to the office for post-production, “where we mosaic all of these things into products like aerial imagery, orthomosaics, and topographic maps, contours, digital elevation models,” he says.
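
The digital elevation models McCullough mentions are, at their simplest, a grid of averaged elevations. The sketch below bins a synthetic cloud of surveyed points into 10-meter cells; it illustrates the idea only and is not Quantum Spatial’s actual post-production pipeline.

```python
import numpy as np

# Synthetic survey points over a 100 m x 100 m patch of gently sloping ground.
rng = np.random.default_rng(1)
n = 5_000
x = rng.uniform(0, 100, n)                            # easting, metres
y = rng.uniform(0, 100, n)                            # northing, metres
z = 50 + 0.1 * x - 0.05 * y + rng.normal(0, 0.2, n)   # elevation, metres

cell = 10.0                                           # 10 m grid cells
cols = (x // cell).astype(int)
rows = (y // cell).astype(int)

# Average the elevations that fall in each cell to get a coarse DEM.
dem_sum = np.zeros((10, 10))
dem_cnt = np.zeros((10, 10))
np.add.at(dem_sum, (rows, cols), z)
np.add.at(dem_cnt, (rows, cols), 1)
dem = dem_sum / np.maximum(dem_cnt, 1)

print("mean elevation per 10 m cell (m):")
print(np.round(dem, 1))
```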

Processing all of the data gathered both aboveground and underground takes some beefy computational capabilities. Improvement in the ability to process data is one of the main reasons imaging is so much more accurate today, Nettels says. It has also made gathering data easier. Instead of changing film on a camera inside a black bag during a flight, digital photography is limited only by the data storage device it’s linked to. And data storage has improved significantly since the days of the floppy disk.

But even old photos have data value.

Quantum Spatial has taken aerial photos on the North Slope and along the Trans Alaska Pipeline System nearly every year since the ’60s, McCullough says.

“We still have the film and companies want access to it because they want to know what things have changed in the environment or if there’s an environmental concern like a contaminated site from years ago,” he says. “They want to know what was staged in that area, who was working there, that sort of thing. So it’s a really interesting way to kind of peer back in time and get a picture, a history of what was going on on the ground back in the early days.”

Who Has the Data?
In general, the oil companies maintain their own proprietary databases, but they are required to submit certain data to the state. According to David LePain, a petroleum geologist with the Alaska Division of Geological & Geophysical Surveys (DGGS), data from exploration wells and producing oil and gas fields are submitted by petroleum companies to the Alaska Oil and Gas Conservation Commission (AOGCC), including daily drilling reports, production data, wireline logs, and representative samples such as cuttings and core chips from wells drilled on state land. The AOGCC archives the data and makes some of it available to the public.

Cuttings and core chips are sent to the Alaska Geological Materials Center (GMC), where they are archived and available to the public for study. The GMC is part of the DGGS.

Work carried out by the DGGS Energy Resources section is largely “upstream” from oil and gas production, according to LePain. DGGS provides the geological framework that helps the petroleum industry explore for oil and gas. The division is mandated by state statute (AS 41.08.020) to determine the potential of Alaska land for production of metals, minerals, fuels, and geothermal resources.

The Energy Resources section studies the petroleum potential of sedimentary basins throughout the state but has a strong focus on basins whose known geology suggests the presence of functioning petroleum systems, he says.

“As such we focus most of our efforts in frontier areas of the North Slope and Cook Inlet basins, but we also have ongoing applied petroleum research programs in several Interior basins, most notably the Susitna and Nenana basins,” LePain says. “Our work is heavily field-oriented (outcrop) and carried out through helicopter-supported summer field campaigns. The data that we collect are summarized and released to industry and the public in bedrock geological maps, technical reports, and presentations. Our reports and some of our presentations are available for free download from the DGGS website (dggs.alaska.gov). When and where possible, we tie our field outcrop data sets to the subsurface by studying exploration cores, cuttings, and wireline log data. We work closely with the Alaska Division of Oil and Gas to integrate our outcrop observations with reservoir quality and seismic data sets.”

LePain notes that the Alaska Division of Oil and Gas does not prospect but uses federal assessments to determine “undiscovered” potential.

The Energy Resources section collaborates extensively with the US Geological Survey (USGS). The USGS is tasked with assessing the undiscovered, but technically recoverable, oil and gas resources in sedimentary basins throughout Alaska. DGGS staff routinely contribute data to the geological models that underpin these resource assessments, LePain says. USGS resource assessments are typically released to the public as digital information circulars available free from its website.

The websites for the various state divisions under the Alaska Department of Natural Resources and Department of Commerce are a motherlode of information. The public can even visit the GMC, which houses the actual cores and rock samples from around the state.

Over the past several years, thousands of rock samples and cuttings have been examined by geologists from industry, government, and academia. Sediment samples and palynology (the study of pollen and other spores) slides were used by several companies to study how sediment layers relate to the location of oil and gas reservoirs.

According to an information sheet from the GMC, its rock collections “have played an important role in recent North Slope oil discoveries and development and will be critically important as future work continues.”

Over the years, most data-gathering methods haven’t changed so much as they’ve simply improved, Nettels says.

“There’s a lot of improvements in terms of acquisition,” he says. “When you go out and run a seismic survey for the first time, you go out and get the best data you can, but you find out that, well, if I’d done this or if I’d done that, I would get better data.

“There are still limitations to all this data. There are still unknowns,” Nettels says. “And the only way to check that out is to put a hole in the ground. And sometimes that can be expensive if it doesn’t turn out the way you think it does.”