Efficiency, cost-effectiveness and optimized analyses are crucial in upstream asset development. OE spoke with Paradigm’s Indy Chakrabarti to chart the pathway to success. Jeannie Stell reports.
Indy Chakrabarti. Images from Paradigm.
Given today’s low oil and gas prices, exploration and production companies must strive to be ever more efficient and cost-effective in the creation, execution and analysis of their asset development plans. Companies that manage these activities successfully will outlast the bust. Those that don’t will likely not survive as viable entities — as the industry has already seen.
At the start of any successful asset development plan, managers must consider the five aspects, or domains, of exploration and production: processing and imaging; interpretation and data management; modeling and reservoir engineering; formation evaluation; and drilling and well planning.
This end-to-end workflow identifies where to drill and where to land the well, supports the analysis of production over time, and informs optimized decisions about future development (i.e., where to drill next).
“At Paradigm, we have brought all of that together in a single, unifying software platform suite we call Epos,” says Indy Chakrabarti, senior vice president of product management and strategy for Paradigm. “This technology allows individual applications for each specific domain to be cross-integrated with each other through Epos.”
Typically, managers and geoscientists seeking new software tools for analyzing their asset environments focus on cost savings, infill drilling, and high-grading prospects. In today’s economic environment, getting these management planning decisions right is crucial, as each has significant cost implications.
By contrast, the cost of the software itself has very little influence over purchase decisions. “The cost of a manager’s decision about where to drill, how to get rigs, and when to contract seismic boats is 100-fold greater than the cost of any particular software tool chosen to help make those decisions,” Chakrabarti says. “Even a 1% gain in a major asset management decision will easily compensate for the cost of any software tools. As a result, in times like these, managers tend to go up the hill, technologically speaking.”
Build accurate reservoir models in the presence of complex faults.
Production and enhanced recovery trends
Organizations have a critical need to understand sweep efficiencies and bypassed compartments. “What did I miss?” That question becomes pressing when margins are thinner, Chakrabarti says. “It’s essential for you to have all the information that can tell you what your subsurface actually looks like,” he says. The placement of an injector must be evaluated against how much oil it can actually make contact with, mobilize and push toward the recovery well.
Because the subsurface is heavily fractured, compartmentalized and fault-riddled, geoscientists historically would remove complexities and simplify their image of the subsurface, treating it more like a big tank than a reservoir, Chakrabarti says.
“Then they would run production for whatever they could get. The fact that they might have placed an injector on the wrong side of a fault meant that some of the CO2 never reached the reservoir of hydrocarbons,” he says.
Now, managers are very focused on how well their technical software allows them to “see” the subsurface effects of their development operations. “That clearer image of the subsurface makes a big difference when you are attempting to optimize production,” Chakrabarti says.
Optimized planning and production
Targeted at solving these issues, both the Paradigm 14 and 15 software releases are pointed in the same direction: the industry’s first high-definition (HD) platform. “One of the most acute problems occurs when companies acquire rich seismic data but don’t have software tools that can handle the scale of that data, the user-interface changes it demands, and the new outputs that have to be delivered.” Paradigm has built a four-step solution to meet that challenge.
In step one of the Paradigm software platform, high-end computational tools use all available data to build an accurate image of the subsurface.
In step two, geoscientists can interpret the data and therefore make sense of the image, he says. “Once the image is understood, analysts can zoom into the data down to the narrowest band of the reservoir, and be able to understand the specific lithologies and variations that couldn’t be seen before.”
Step three is about modeling. Ultimately, geologists want to be able to create a truly accurate and detailed representation of the subsurface.
The fourth and final step is to create a simulation that can answer the question, “How much of this reservoir do I think I can produce?” Since the simulator simply runs an algorithm, the quality of its output depends on the input being an accurate representation of the subsurface.
“That’s what the HD platform does,” Chakrabarti says. “On one end, it receives this advanced, rich seismic data. And on the other, it outputs an accurate, granular model that can be used to forecast production.” Paradigm continues to focus on very high-resolution, billion-cell subsurface models.
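As a rough illustration of that four-step flow, the sketch below strings the stages together in Python. Every function name, data structure and return value here is a hypothetical placeholder chosen for this article, not part of Paradigm's Epos or SKUA APIs.

```python
# A minimal sketch of the four-step HD workflow described above.
# All names and values are hypothetical stand-ins, not Paradigm APIs.

def process_image(raw_traces):
    """Step 1 (processing/imaging): build an accurate subsurface image."""
    return {"image": raw_traces, "bin_size_m": 25}

def interpret(image):
    """Step 2 (interpretation): pick horizons, faults and lithologies."""
    return {"horizons": ["top_basement", "top_sediment"], "fault_count": 53}

def build_model(interpretation):
    """Step 3 (modeling): a detailed structural/property representation."""
    return {"cells": 1_000_000_000, "fault_count": interpretation["fault_count"]}

def simulate(model):
    """Step 4 (simulation): how much of this reservoir can be produced?"""
    # The simulator only runs an algorithm; its forecast is only as good
    # as the accuracy of the model it is fed.
    return {"recoverable_fraction": 0.30}  # placeholder output

forecast = simulate(build_model(interpret(process_image(raw_traces=[]))))
```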
Understand subsalt uncertainty through an illumination study.
Case study
To illustrate the effectiveness of the new technology, Chakrabarti details a case study conducted to characterize, model and flow-simulate an unconventional fractured basement reservoir located offshore Vietnam.
The challenge
A new reservoir modeling and simulation workflow was needed to prove that the complex structure of the field and its properties could be represented, and its dynamic behavior reproduced. Predictions of production using standard reservoir simulators had been problematic, because different flow laws apply to fractured rocks, and inclined narrow faults are difficult to represent at field scale.
The assessment
The field contains complex intersections between faults (Y and X contact shapes, etc.), between horizons and faults (reverse faults, significant offset of the basement on the flanks) and between horizons (converging small-angle contacts between the top of the basement and the sediments lying on its flanks). Since results to date using standard tools had not been sufficient, the decision was made to use Paradigm SKUA modeling software to build a structural framework model that properly honors fault intersections.
The solution
For this project, only a sector-scale model (12.5km by 4.5km) was studied. All 53 original fault interpretations were included; none were excluded when modeling the data with the SKUA system. Faults were loaded as fault sticks from an ASCII file.
Most of the faults had not been assigned a throw type, mainly because their throw was too small to make a determination. Some faults were identified as reverse; about 10 others were identified as normal. This extra information was used as a constraint in the SKUA modeling process.
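The article does not describe the ASCII layout, but a typical fault-stick export can be read with a few lines of Python. The whitespace-delimited column order assumed below (fault name, stick id, x, y, z) and the throw-type labels are illustrative assumptions only.

```python
# Hypothetical fault-stick loader. The column layout
# (fault_name stick_id x y z) is an assumption; real exports vary by vendor.
from collections import defaultdict

def load_fault_sticks(path):
    """Group digitized points into sticks, keyed by fault name."""
    faults = defaultdict(lambda: defaultdict(list))
    with open(path) as f:
        for line in f:
            if not line.strip() or line.startswith("#"):
                continue  # skip blanks and comments
            name, stick_id, x, y, z = line.split()
            faults[name][int(stick_id)].append((float(x), float(y), float(z)))
    return faults

# Throw types, where interpreters could assign them, become constraints:
throw_type = {"F03": "reverse", "F17": "normal"}  # example labels only
```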
The top basement interpretation was loaded as a CPS3 regular 2D grid (resolution 25m by 50m). The interpretation of the top of the sediments was loaded as points (resolution 300m by 300m) from an ASCII file. Contact curves between the top basement and sediments, corresponding to non-depositional curves, were used to constrain the modeling of the sedimentary layers.
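The two horizon inputs arrive in different forms, a dense regular grid and sparse scattered points. A sketch of reading both, under assumed file layouts, might look like this:

```python
# Hypothetical readers for the two horizon inputs. File layouts are assumed:
# the grid as one depth value per node, the sediment top as x y z rows.
import numpy as np

def load_regular_grid(path, nx, ny):
    """Top basement: regular 2D grid of depths (25 m by 50 m cells)."""
    return np.loadtxt(path).reshape(ny, nx)

def load_scattered_points(path):
    """Top of sediments: sparse x, y, z points (roughly 300 m spacing)."""
    return np.loadtxt(path)  # shape (n_points, 3)
```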
Watertight models are an important prerequisite for volumetric meshing as part of the simulation workflow. Structural models generated in SKUA are watertight, meaning that they are composed of surface-delimited sub-volumes in which the surfaces are perfectly welded together without any holes. This structural model can be transformed into a set of triangulated surfaces that share nodes on the contact lines.
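The watertight property can be pictured as all surfaces indexing into one shared node pool, so contact lines are welded exactly rather than approximately. The toy structure below shows the idea; it is not SKUA's internal representation.

```python
# Illustrative structure for welded, watertight surfaces: every surface
# references a single shared node pool, so a contact line uses identical
# nodes rather than duplicating nearly-coincident points (which leaves gaps).
nodes = [
    (0.0, 0.0, -3000.0),    # node 0: on the contact line
    (100.0, 0.0, -3010.0),  # node 1: on the contact line
    (50.0, 80.0, -2990.0),  # node 2: interior to surface A (top basement)
    (50.0, -80.0, -3050.0), # node 3: interior to surface B (sediment flank)
]

# Both triangles reference nodes 0 and 1, so their common edge is welded.
surface_a = [(0, 1, 2)]  # top basement triangle
surface_b = [(0, 1, 3)]  # sediment triangle on the flank

def shared_edge(tri_a, tri_b):
    """Return the node indices common to two triangles (the welded contact)."""
    return set(tri_a) & set(tri_b)

print(shared_edge(surface_a[0], surface_b[0]))  # {0, 1}
```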
The modeling of the top basement and top of sediments was performed in a single operation. The contact of the sediments on the top basement was handled automatically through the use of the stratigraphic column (unconformable contact between the basement and the sediments) and the non-depositional curve. The resulting horizons were smooth and clean, while the complexity of the fault network was preserved.
To avoid very small or degenerate elements in the triangulated surfaces, which would produce holes or overlapping elements in the 3D grid, faults with throws smaller than a threshold based on the target refinement (5m for the sector-scale model) were merged. This was done automatically in SKUA during the creation of the triangulated surfaces.
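A simplified way to express that selection rule, with the throw estimate reduced to a stored attribute purely for illustration:

```python
# Simplified stand-in for the throw-based merge: faults whose maximum throw
# falls below the target-refinement threshold are removed here. (SKUA merges
# them automatically; dropping them just illustrates the selection rule.)
THROW_THRESHOLD_M = 5.0  # sector-scale value quoted in the text

def keep_fault(fault, threshold=THROW_THRESHOLD_M):
    return fault["max_throw_m"] >= threshold

faults = [{"name": "F01", "max_throw_m": 2.3},
          {"name": "F02", "max_throw_m": 18.0}]
kept = [f for f in faults if keep_fault(f)]  # F01 merged away, F02 kept
```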
The results
The sector-scale model was generated in less than a week. No structural model had ever been built for this field before this study, as no software could properly handle representation of the faults.
Creating the structural model and performing QC on the seismic data together with the geophysicists enabled many refinements and updates to the existing interpretation. Questions about fault extensions within the sediments could be answered for the first time. Thanks to the preservation of all the faults in the model and the analysis of their vertical and lateral extent from the basement to the surrounding sediments, precise flow pathways were identified in the reservoir. This is critical for field development, and had not been possible using other existing tools.
*Data courtesy of NFR Studies.