Computerizing offshore Mozambique

Data gathering and large-scale reservoir modeling will be key to unlocking Anadarko's big gas development off Mozambique. Bruce Nichols spoke with Anadarko and engineers at Halliburton's Landmark to discover more.

When Anadarko made three big natural gas discoveries in quick succession offshore Mozambique in 2010, the company was already using computer modeling to derisk the undertaking.

“We start from day one,” says P.K. Pande, Anadarko’s Director of Reservoir Technology and Characterization. “We look at our early models, and we use them as input into making judgments about the overall project economics and viability.”

Now that nine successful wells have been drilled in the field named Prosperidade, the model built using Nexus software from Halliburton’s Landmark unit is in its third development cycle, and it has grown more detailed. There are 13 million “active cells” in it, and it has been “upscaled” to 2.5 million active cells for faster processing without losing the detail needed to plan development, Pande says.
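To make the idea of upscaling concrete, the sketch below coarsens a toy porosity grid by pore-volume-weighted averaging, the same basic notion behind shrinking a 13-million-cell geologic model into a 2.5-million-cell simulation grid. The function name, grid sizes and averaging scheme are illustrative assumptions, not Anadarko's or Landmark's actual workflow.

```python
import numpy as np

# Hypothetical sketch of grid upscaling: a fine geologic grid is coarsened by
# pore-volume-weighted averaging so that (for example) a 13-million-cell
# geologic model can be run as a ~2.5-million-cell simulation model.
def upscale_porosity(phi_fine, vol_fine, factor=2):
    """Average porosity over coarse blocks of `factor` cells per axis,
    weighting each fine cell by its bulk volume."""
    nx, ny, nz = phi_fine.shape
    fx, fy, fz = nx // factor, ny // factor, nz // factor
    # Reshape so each coarse block's fine cells sit on their own axes.
    shape = (fx, factor, fy, factor, fz, factor)
    pore_vol = (phi_fine * vol_fine).reshape(shape).sum(axis=(1, 3, 5))
    bulk_vol = vol_fine.reshape(shape).sum(axis=(1, 3, 5))
    return pore_vol / bulk_vol  # coarse porosity preserves total pore volume

phi = np.random.default_rng(0).uniform(0.05, 0.30, (8, 8, 4))  # toy fine grid
vol = np.full((8, 8, 4), 1.0)                                  # cell volumes
phi_coarse = upscale_porosity(phi, vol)                        # shape (4, 4, 2)
```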

When Prosperidade begins production in 2018, performance monitoring will help refine the model and maximize output over the life of the field.

Prosperidade is not the only field Anadarko has discovered off Mozambique. In all, it has drilled more than 20 wells and has identified separate natural gas deposits in multi-layered fields named Golfinho/Atum, Orca and Tubarao, which are in different stages of modeling. Prosperidade is the furthest along, Pande says.

“The Prosperidade area has been defined by Anadarko, its partners and the government as a focus of the initial development,” Pande says.

Reservoir modeling and its adjunct, reservoir monitoring, are not new. They’ve been part of the industry for 60 years, and Anadarko has used them on old projects as well as new ones. A notable example is Anadarko’s Hugoton field, a low-pressure gas field straddling the Kansas-Oklahoma border that has been producing for nearly 100 years, enhanced by ever more sophisticated modeling.

Increased computing power and more sophisticated software capability have made modeling tools more useful.

“One of the things that has really changed, especially in the last 10 years, is our ability to visualize in 3-D. Instead of just getting quantitative results, we have easier ways to actually visualize the results and get meaning,” Pande says.

“Visualization used to be a lot more 2-D. You’d have to have layers and layers of maps. Geoscientists, the way their thinking process goes, it’s very much a 3-D type of process, so 3-D visualization actually helps us quite a lot in our interpretive work.”

Still, even after being converted into pictures, all models are essentially mathematical estimates of subsurface reality. The types of data that go into making the eye-popping 3-D visuals in use today – seismic, well logs, core sample analyses, fluid samples, flow tests – are essentially the same as when the visuals were 2-D layers of maps or, before that, mere charts and graphs.

As models grow more sophisticated and the data going into them more precise, oil companies’ reliance on the technology has increased as they move into more costly, more difficult areas in search of oil and gas.

“The big thing about reservoir modeling is it helps operators make a decision,” says Garrett Leahy, North American Sales Manager, Emerson Roxar. “It’s a commercial problem. The question is, ‘Are there enough barrels of oil down there to make it worth drilling?’”

Despite advances in recent years, everyone who touts modeling emphasizes its limitations.

“One thing we need to make clear. A model is a model. The fact that we can visualize it in 3-D does not mean that’s exactly what the subsurface looks like. It’s our best representation at that point in time given the information we have,” says Joe Lynch, director of reservoir management for Halliburton’s Landmark Graphics unit.

It is ironic. Oil companies are seeking as much certainty as possible to underpin investment decisions. But they want modeling software to spell out the range of uncertainty in whatever picture a model is presenting.

“One of the really powerful things about having this technology is that we can look at uncertainty,” Pande says.

“We’re in a business where when we get an exploration project, there’s a range of outcomes, and we need to understand what that range is.

“We never look at a model and say this is what it is. We say there’s a range of possibilities. And if we don’t meet our economic threshold, even on the higher end of the range, it gives us some really unique insight into whether we should chase this or whether we should drop it and move to another opportunity,” Pande says.

Modelers use a phrase borrowed from gambling, along with mathematical and statistical terms, to describe the process.

“When assessing uncertainty, Monte Carlo simulations are performed,” says Tomi Owodunni, a senior reservoir engineer at Schlumberger. Monte Carlo simulations involve inserting different values for key inputs to see how they affect the model.

“For example, if the response of interest is cumulative oil production, then you would display the distribution of oil results from all combinations of your input data. Using the CDF (cumulative distribution function) curve, you can select three cases representing, say, a P10, P50 and P90. These cases are then used to investigate different development plans.”
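A minimal sketch of the workflow Owodunni describes: sample uncertain inputs, compute the response of interest for each trial, and read low, median and high cases off the resulting distribution. The input distributions and the simple volumetric proxy below are invented for illustration; in practice each trial would drive a full simulator run.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo trials

# Assumed input distributions for a simple volumetric proxy; in a real study
# each trial would drive a full simulation run rather than a formula.
area      = rng.triangular(8e7, 1.2e8, 2e8, n)        # reservoir area, m^2
thickness = rng.triangular(30, 60, 120, n)            # net pay, m
porosity  = rng.normal(0.22, 0.03, n).clip(0.05, 0.35)
recovery  = rng.uniform(0.5, 0.8, n)                  # recovery factor

# Response of interest: recoverable volume per trial (saturation and gas
# expansion factors omitted for brevity).
recoverable = area * thickness * porosity * recovery

# Industry convention: P90 is exceeded in 90% of cases (low case), P10 in
# only 10% (high case); read them off the distribution (the CDF).
p90, p50, p10 = np.percentile(recoverable, [10, 50, 90])
print(f"P90 {p90:.2e}  P50 {p50:.2e}  P10 {p10:.2e} (reservoir m^3)")
```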

Pande says more important than powerful computers and sophisticated software is good commercial judgment – the ability to decide wisely whether to spend money on a project, based on what a model is showing about key production drivers.

“Just having more complexity that both hardware and software allow does not necessarily mean you’re going to get a better product,” he says.

Pande notes that there are a lot of good reservoir modeling software products, and that all the major service companies and some niche providers offer good tools.

Aside from Halliburton Landmark’s Nexus and Decision-Making System, there are Schlumberger’s Eclipse and Intersect; Baker Hughes’ JewelSuite; Emerson’s Roxar RMS; and CMG’s IMEX, GEM and STARS, to name a few.

“We could have done an equally good job with any of the other products because it’s more about how you apply this. It’s an art,” Pande says. “It depends on the operator’s experience and judgment.”

That art begins early. Anadarko’s Mozambique exploration program started with a geologic model of the Rovuma Basin based on knowledge of similar basins and how they matched up with Rovuma. Seismic surveys helped identify potential subsurface accumulations of hydrocarbons.

Then, in February 2010, the first wildcat well came in big. Windjammer, about 30mi. east of Palma, found 555ft of net natural gas pay layered over more than 1200ft of Oligocene and Paleocene sands after reaching 16,930ft total depth in water 4800ft deep.

Next, also in 2010, came the Barquentine discovery 2mi. to the southwest of Windjammer (308ft of net pay in Oligocene and Paleocene sands) and the Lagosta find (550ft of net pay in Oligocene and Eocene sands) 14mi. to the southeast.

At that point, as exploration continued, the appraisal phase began in earnest to gather more data to feed the computers and build the models of Prosperidade.

“In 2012, we conducted the single most extensive deepwater testing program in Mozambique, comprising well tests and interference tests,” Pande says.

Anadarko ran at least 10 flow tests of three to seven days’ duration, with each test producing up to 120MMscf/d – events that resulted in spectacular video posted on YouTube by Anadarko. Flow testing a well without producing a significant pressure drop is encouraging because it indicates a sizable reservoir. Flow testing in conjunction with pressure monitoring at other wells can start to define the actual geographic dimensions of the reservoir.

“We needed to really establish how continuous this reservoir was, and we established continuity through interference tests,” Pande says, describing a process in which flowing one well and watching the pressure impact on another well some distance away indicates whether they are connected.

How quickly that pressure response shows up provides additional information about how well the fluids in the reservoir will flow.
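For readers who want the underlying math: interference tests are classically interpreted with the line-source solution to the pressure-diffusion equation, in which the drawdown seen at an observation well depends on distance, time and reservoir transmissibility. The sketch below evaluates that textbook solution; all parameter values are illustrative, not Prosperidade's.

```python
import numpy as np
from scipy.special import exp1  # exponential integral E1

# Classic line-source solution used to interpret interference tests:
# pressure drop at an observation well a distance r from a well flowed
# at constant rate. Parameter values are illustrative only.
def interference_dp(t, r, q=0.1, mu=2e-5, k=1e-13, h=100.0,
                    phi=0.2, ct=1e-9):
    """Pressure drop (Pa) at radius r (m) and time t (s).
    q: downhole rate (m^3/s), mu: viscosity (Pa.s), k: permeability (m^2),
    h: thickness (m), phi: porosity, ct: total compressibility (1/Pa)."""
    x = phi * mu * ct * r**2 / (4.0 * k * t)
    return q * mu / (4.0 * np.pi * k * h) * exp1(x)

t = np.linspace(3600, 7 * 86400, 200)     # from 1 hour to 7 days
dp = interference_dp(t, r=2000.0)         # observation well 2km away
# An earlier, larger response implies better-connected, more
# transmissible rock between the two wells.
print(f"drawdown seen 2km away after 7 days: {dp[-1]/1e3:.1f} kPa")
```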

A 3-D view of an earth model populated with effective porosity, with stratigraphic zones displayed on the base structural horizon. Image source: Anadarko.

These pressure measurements are incredibly sensitive. In the case of Prosperidade, the analysis took into account the effect on reservoir pressure of Mozambique’s tides, which can be 12ft or more.

“You actually have more water on top of the sea floor from tides, and we can actually see this in the pressure data,” Pande says. “The differences are small, but they’re very discernible. And this is such a huge reservoir, such a large asset, we had to figure out how to account for that.”
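The article does not describe Anadarko's tidal correction, but one generic way to account for ocean loading is to fit and subtract a sinusoid at the principal lunar semidiurnal (M2) period of roughly 12.42 hours, as in this hedged sketch on synthetic gauge data.

```python
import numpy as np

# Hedged sketch: remove a tidal signature from downhole pressure data by
# least-squares fitting a sinusoid at the M2 tidal period (~12.42 h).
# Synthetic data stands in for real gauge readings.
t_hr = np.arange(0, 14 * 24, 0.25)           # 14 days, 15-minute samples
omega = 2 * np.pi / 12.42                    # M2 angular frequency (1/hr)
drawdown = -0.002 * t_hr                     # slow interference signal, bar
pressure = (300 + drawdown + 0.05 * np.sin(omega * t_hr + 1.0)
            + np.random.default_rng(1).normal(0, 0.005, t_hr.size))

# Basis: [sin, cos, linear trend, constant]; fit, then subtract only the
# tidal terms so the small interference signal is easier to see.
A = np.column_stack([np.sin(omega * t_hr), np.cos(omega * t_hr),
                     t_hr, np.ones_like(t_hr)])
coef, *_ = np.linalg.lstsq(A, pressure, rcond=None)
detided = pressure - A[:, :2] @ coef[:2]
print(f"recovered tidal amplitude: {np.hypot(*coef[:2]):.3f} bar")
```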

Estimates of recoverable gas at Prosperidade grew rapidly as drilling of exploration and appraisal wells and modeling continued. The estimate was 6Tcf in August 2011. It climbed to 10Tcf in October. In November 2012, after the Camarao discovery 5mi. south of Windjammer found 380ft of net pay and two new sand layers, the estimate soared to 15-30+ Tcf.

Information from drilling and testing was so good, and the model by this time so promising, that Anadarko was able to attract a new investor into the project. India’s ONGC Videsh (OVL) paid US$2.64 billion for a 10% stake in 2013, reducing Anadarko’s share to 26.5%. The original partners in the project were Mitsui, BPRL, Videocon Mozambique, PTT E&P and ENH, Mozambique’s national oil company.

The results also led Anadarko and Italian giant Eni, which operates Mozambique exploration Area 4 adjacent to Anadarko’s Area 1, to agree in late 2012 to partner in building an LNG plant. Prosperidade straddles Area 1 and Area 4.

Delivery of the first LNG cargo is targeted for 2018.

Shell: Future oilfield models will dwarf today’s

Enhanced computer-based evaluation and visualization is at the heart of a big Shell push to improve oil and gas exploration and production. Some of the reservoir modeling technology development is proprietary within Shell. Some of it is bought off the shelf and customized for Shell in cooperation with the provider.

“While we aggressively invest in proprietary processing and visualization technology in the seismic area, we invest more selectively in proprietary reservoir modeling technology because the market anticipates many of our needs, and we communicate with market providers,” says Detlef Hohl, Shell’s manager of computation and modeling.

An example of Shell taking advantage of products available on the market is its software license and joint development agreement, announced last September, to use the Baker Hughes JewelSuite™ platform as the basis for high-quality modeling of complex reservoirs.

In a joint announcement, Shell and Baker Hughes said the new platform would complement Shell’s existing applications, including GeoSigns, Shell’s proprietary software used to visualize and interpret seismic data.

The basis of reservoir modeling is mathematical, but the step change of recent years has been the capacity to create 3-D models that can be visualized and interactively manipulated, allowing geologists, geophysicists and petroleum engineers to exercise their judgment in new ways.

“It’s not a luxury,” Hohl said. “The interpreter has to be able to see these things in three dimensions. He has to be able to dive into it. The visual user interface is extremely important for interpretation.”

Shell is working on other step changes as sensors and data-gathering equipment improve, computer power grows and visualization technology surges. A key focus is uncertainty analysis, says John Karanikas, Shell’s chief scientist for reservoir engineering.

“We’ve made a significant effort to include more and more of the breadth of the subsurface uncertainty in our calculations,” Karanikas says, noting executives need to know quantitatively the magnitude and impact of this reservoir uncertainty in order to make sound economic decisions.

“Another way our models are changing is the inclusion of more advanced chemistry and physics,” Karanikas says.

And “big data” – the term used for increasingly massive data-gathering and storage capability in all areas of life and of business – is coming to the oilfields. In time, the piles of data already gathered in oil exploration and development will seem small.

“We expect in the future to get avalanches of data, factors of 10 or more, and that in turn will drive larger models,” Karanikas says.

Subsurface and well-planning reservoir simulation model for the Prosperidade field offshore Mozambique.

How Halliburton built Nexus

Creating software that can model an oil or gas reservoir involves writing lines of code that can perform billions of calculations per second. The software runs simultaneously on clusters of computers linked by high-speed interconnections.

But it still can take a couple of days to finish a model. And when you’re done, you still can’t be sure what the subsurface looks like, even after you drill and produce it. After all, you can’t actually see it.

Welcome to the world of using computers to model oil fields and monitor their performance as they age.

“What we are doing is solving a complex set of four-dimensional partial differential equations with what is known as a finite difference approximation,” says Steve Crockett, product manager for Nexus®, reservoir simulation software which is developed and marketed by Halliburton’s Landmark unit.

“I’ve worked with models that solve 50 million-plus equations simultaneously.”
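A simulator of that scale is far beyond a snippet, but the core idea scales down. The sketch below discretizes one-dimensional pressure diffusion with an implicit finite-difference scheme, so each timestep solves a small linear system; a field simulator assembles millions of coupled equations of the same character in three dimensions plus time, with multiple fluid phases. All numbers here are illustrative assumptions.

```python
import numpy as np

# One-dimensional, implicit finite-difference sketch of pressure diffusion,
# dp/dt = eta * d2p/dx2. Each timestep solves a linear system -- the same
# structure, vastly scaled up, behind the "50 million-plus equations"
# Crockett mentions.
n, dx, dt, eta = 50, 10.0, 3600.0, 1e-2   # cells, m, s, diffusivity (m^2/s)
p = np.full(n, 300e5)                     # initial pressure, Pa
p[0] = 250e5                              # drawdown held at a boundary well
r = eta * dt / dx**2

A = np.zeros((n, n))
A[0, 0] = 1.0                             # fixed pressure at the well
for i in range(1, n - 1):                 # interior cells: implicit stencil
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r
A[-1, -2], A[-1, -1] = -1.0, 1.0          # no-flow at the far boundary

for _ in range(24):                       # one day of hourly timesteps
    rhs = p.copy()
    rhs[-1] = 0.0                         # enforce p[n-1] == p[n-2]
    p = np.linalg.solve(A, rhs)
print(f"pressure 100m from the well after 1 day: {p[10]/1e5:.2f} bar")
```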

Despite the uncertainty and complexity, adoption of modeling for exploration, development and production management has accelerated as oil companies take on ever more challenging projects and the cost of being wrong about a prospect has skyrocketed.

“We don’t know what a reservoir is going to deliver. So if a reservoir doesn’t deliver what is expected, then an oil company has a problem. One way to address that problem is to get a better understanding of what the reservoir is going to deliver beforehand, and understand the uncertainty because we cannot predict perfectly,” says Joe Lynch, Landmark’s director of reservoir management.

Modeling is especially important offshore, where the potential to over-produce, though rarer, can be almost as embarrassing as under-delivering. It can mean that because the reservoir was misunderstood, the platform, pipelines and other production equipment have been built too small, and output can’t reach its full potential, Lynch says.

“The production guys are pleased about it, but typically, if that happens, then the operator is leaving money on the table because they are surface-facility-constrained. They could actually be producing more oil and getting a better rate of return,” he says.

Uncertainty analysis, in fact, is a growing focus in modeling oilfield reservoirs.

“What we want to try and do is deliver this envelope of possible outcomes, and then it’s up to the decision-makers to decide how they want to go and size the project. Think about capacity for, let’s say, water-handling up front. To build that into the platform, that’s expensive. But it’s nowhere near as expensive as having to retrofit later on when you get a surprise,” Lynch says.
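Lynch's sizing argument can be made concrete with a toy calculation: rather than designing to the single best-guess case, pick a capacity that covers most of the simulated scenarios. The distribution below is invented purely for illustration.

```python
import numpy as np

# Toy illustration of sizing against an envelope of outcomes: choose a
# water-handling capacity covering, say, 90% of simulated peak-water-rate
# scenarios instead of only the median case. Numbers are made up.
rng = np.random.default_rng(3)
peak_water = rng.lognormal(mean=np.log(40), sigma=0.5, size=5000)  # kbbl/d
best_guess = np.median(peak_water)
design = np.percentile(peak_water, 90)   # capacity covering 90% of cases
print(f"median case {best_guess:.0f} kbbl/d; design capacity {design:.0f} kbbl/d")
```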

Uncertainty is enormous at the exploration and appraisal stage. “In those early stages, you’re in an extremely speculative mode in your modeling. You’re at the point of trying to decide do we sink hundreds of millions or more into developing this field. And so you have to have the widest range of possible realities,” Crockett says.

Uncertainty diminishes – but never completely disappears – through the life of the field.

“When we have a model, it is a model. It is not reality. But then against that model we’ll have hypotheses. Then we will drill appraisal wells to test hypotheses. Based on the results, we’ll have a whole bunch more information and that model can be refined,” Lynch says.

Information gathering continues and the model evolves as field development matures.

“What you find is, as soon as you have the first six months or year of production for even one well, you can start to eliminate some of the possibilities from your models,” Crockett says. And done right, tweaking a reservoir model is a never-ending process.

Lynch foresees a trend toward tighter linkage between modeling a field and monitoring it after production begins. Monitoring data can be used to update the field model and improve forecasting of the future life of the field. Feeding monitoring data back into the model is feasible partly because of advances in data-gathering and field-management equipment.

“We’re getting data back on a much more frequent basis, plus we’re getting better ability to control the production system,” Lynch says. “I think we’re starting to get something like an evergreen reservoir model. Instead of going back to scratch every few years and building a new model, I think models are now having to be kept fairly up to date. I can see more use of simulation in that role in the future.”
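One hedged sketch of that evergreen loop: as monitoring data arrives, an uncertain model parameter is periodically re-fit so forecasts stay current. A toy exponential-decline proxy stands in here for the full simulator; nothing below reflects Landmark's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy "evergreen" update: re-fit an uncertain decline rate to the latest
# production data so the forecast stays current. The exponential-decline
# proxy stands in for a full reservoir simulator.
def forecast(t, qi, d):
    return qi * np.exp(-d * t)            # rate (MMscf/d) vs time (years)

d_prior = 0.25                            # pre-production estimate (assumed)
rng = np.random.default_rng(2)
t_obs = np.linspace(0.0, 1.0, 12)         # first year of monthly data
q_obs = forecast(t_obs, 120.0, 0.18) + rng.normal(0, 1.0, t_obs.size)

def mismatch(d):                          # history-match objective
    return np.sum((forecast(t_obs, 120.0, d) - q_obs) ** 2)

d_fit = minimize_scalar(mismatch, bounds=(0.01, 1.0), method="bounded").x
print(f"decline updated from {d_prior}/yr to {d_fit:.3f}/yr; "
      f"5-yr forecast {forecast(5.0, 120.0, d_fit):.1f} MMscf/d")
```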

 
