Heather Saucier investigates how microseismic monitoring is being used offshore to track field conditions in real time and to help build better reservoir models.
Landmark’s 3D models can use reservoir monitoring to improve performance prediction. Image from Halliburton.
One of the first times microseismic monitoring was used offshore was in the mid-1990s, when the deck of a platform in the Ekofisk Field in the North Sea began sinking – its surface slowly dropping toward sea level for unknown reasons.
Used mostly onshore at the time for cost reasons, microseismic sensing devices were adapted for the marine environment, and over a period of months they detected a swarm of small earthquakes occurring as hydrocarbons were pumped from the subsea reservoir. The extraction was causing the reservoir to subside, and the platform was sinking along with it.
Today, microseismic monitoring is much more pervasive in the offshore industry – not only to diagnose problems after the fact, but to monitor reservoirs in real time for compaction and fluid movement.
“A lot of the reason microseismic monitoring was first developed for onshore use is because people wanted to see the results and appreciate what it means at a lower price point and lower risk before spending big bucks offshore,” says Peter Duncan, co-chairman and founder of Houston-based MicroSeismic, Inc. “However, when used offshore, it does allow you to accelerate your response to whatever is going on in the subsurface.”
So, where there was once a dearth of data, the offshore industry now faces what some are calling an “information explosion.” In response, it is scrambling to build better models so that operators can understand in real time what is going on in their reservoirs. As better models are built, operators can make decisions more quickly, ultimately realizing higher profits during an economic downturn.
A glimpse – by sound – into the earth
Peter Duncan, founder and co-chairman of MicroSeismic, Inc. Image from MicroSeismic.
Realizing that microseismic monitoring can paint a moving picture of how a reservoir is responding during production – and also predict how it may continue to respond – many operators are using the technology to maximize hydrocarbon recovery, Duncan says.
“When faults move during the course of production, they can offset a reservoir and form compartments. If hydrocarbons are not in contact with the production wells, this creates bypassed pay,” he says, explaining that additional wells might be needed. “You want to stick a straw into it and suck the oil and gas out.”
Many offshore reservoirs are highly compressible, adds Peter Flemings, professor of Geological Sciences at the Institute for Geophysics at the University of Texas at Austin. “The reservoirs will compact and the grains will re-orient, resulting in significant displacements in the reservoir,” he says.
In a worst-case scenario, well casings can be sheared off as the overlying cap rock is displaced, Flemings says. “You can lose a lot of money if you lose a well that way. Essentially, you have to plug the old well, sidetrack and go into the reservoir again,” he says.
However, the most common consequence of compaction is a loss in production. “As compaction occurs, permeability declines. Flow is hampered and production becomes more difficult,” Flemings says.
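As a rough illustration of why compaction hits deliverability so hard (a textbook Kozeny-Carman-type scaling, not a figure from Flemings), permeability falls much faster than porosity:

$$ k \;\propto\; \frac{\phi^{3}}{(1-\phi)^{2}} $$

Under that scaling, a drop in porosity from 0.30 to 0.25 – the sort of change compaction can cause – roughly halves the permeability.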
To help prevent such problems, microseismic monitoring has become a solution for many operators, especially those who already have setups for 3D and 4D seismic acquisition – whether geophones or fiber-optic sensors in multiple wellbores, or large arrays spread across the seafloor. In between 3D snapshots, geophysicists can use the same sensing equipment to passively monitor the behavior of the reservoir.
“Forward-looking”
Microseismic monitoring shows how the reservoir responds to hydraulic fracturing in real time. Image from MicroSeismic.
Perhaps the most notable example of microseismic monitoring today can be found in the Jubarte Field, about 70km offshore the Brazilian state of Espírito Santo – a pre-salt field developed and produced by Petrobras and monitored by MicroSeismic in partnership with PGS.
One of the largest and deepest permanent fiber-optic sensing systems in the world – with 33km of seabed cable laid out over 9sq km at 1300m below sea level – Jubarte’s offshore monitoring system has earned Petrobras serious recognition from the industry.
“Petrobras was forward-looking and wanted to install a permanent monitoring system over the Jubarte field prior to development and production,” Duncan says. “They also were forward thinking in the fact that they had an array in place and opted to monitor the reservoir during the downtime between the 3D surveys.”
What the monitoring revealed was quite surprising. “We detected seismicity related to production when they turned on the field, but we did not see any events at the reservoir level,” Duncan says. “However, events were taking place below the reservoir. They appeared to be related to motion along a deep fault that was perhaps reactivated by the production in the field.”
As a result, Petrobras opted to continue the passive monitoring. “It’s essential to understand what is happening. If this means that the reservoir is in communication with these faults, it’s important to understand what that communication might mean,” Duncan says. “Is more oil and gas going into the reservoir? Only time will tell.”
Technology advancements
Geoscientists analyze microseismic data in real time, enabling on-the-fly decisions for offshore reservoir stimulation. Image from MicroSeismic.
Microseismic monitoring has experienced quite an evolution over the last decade, technologically speaking. It began with the ability to detect the “hypocenter” of a seismic event – where the event occurred in time and space, Duncan explains. “We knew that there was an event and that the earth moved, but not more than that,” he says.
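In its simplest form, locating that hypocenter is a search problem. The sketch below is a hypothetical illustration, not any vendor’s algorithm: it grid-searches for the source position and origin time that best explain P-wave arrival times at a few seafloor sensors, assuming a single constant velocity; real processing uses layered velocity models, S-waves and far denser arrays.

```python
# Illustrative hypocenter location by grid search: find the point and origin
# time whose predicted P-wave arrivals best match the picked arrival times.
# Sensor positions, velocity and event location are all made-up values.
import itertools
import numpy as np

VP = 2000.0  # assumed constant P-wave velocity, m/s

def locate(sensors, picks, grid):
    """Return the candidate (x, y, z) with the smallest arrival-time misfit."""
    best, best_misfit = None, np.inf
    for x, y, z in grid:
        travel = np.linalg.norm(sensors - np.array([x, y, z], float), axis=1) / VP
        t0 = np.mean(picks - travel)                   # best-fit origin time for this point
        misfit = np.sum((picks - (t0 + travel)) ** 2)  # sum of squared residuals
        if misfit < best_misfit:
            best, best_misfit = (x, y, z), misfit
    return best

# Four synthetic seafloor sensors (metres) and a synthetic event 2km below them
sensors = np.array([[0, 0, 0], [1000, 0, 0], [0, 1000, 0], [1000, 1000, 0]], float)
true_source = np.array([500.0, 500.0, 2000.0])
picks = np.linalg.norm(sensors - true_source, axis=1) / VP + 1.0  # origin time = 1 s

grid = itertools.product(range(0, 1001, 100), range(0, 1001, 100), range(1500, 2501, 100))
print(locate(sensors, picks, grid))  # recovers roughly (500, 500, 2000)
```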
Yet, as more wellbores were equipped with sensing devices and operators began laying large sensing arrays on the seafloor, more data was generated, adding new dimensions to the picture of each seismic event.
“The nature of the sound – called the ‘moment tensor’ – we were recording told us the type of movement that was taking place in addition to its location,” Duncan says.
If one side of a fault moved up and the other moved down, that is considered a dip slip. If one side moved to the right and the other to the left, this is considered a strike slip.
“The nature of the movement tells you about the local stress in the field at the time the rock broke,” Duncan explains. “The vertical and horizontal stresses in the rock combined with how the fluid interacts ultimately determines how the rocks break.”
“That geotechnical analysis can be fundamental to understanding how a reservoir will respond over time – say, 10-15 years of production,” Duncan adds.
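To make the idea concrete, the sketch below (an illustration with made-up numbers, not MicroSeismic’s processing) decomposes a moment tensor into an isotropic part plus double-couple and CLVD fractions; an event that is almost entirely double couple is consistent with pure shear slip, such as the dip-slip and strike-slip motion described above.

```python
# Hypothetical sketch: split a 3x3 moment tensor into isotropic,
# double-couple (DC) and CLVD fractions. A nearly 100% DC event is
# consistent with pure shear slip on a fault plane. Values are illustrative.
import numpy as np

def decompose(M):
    """Return (isotropic part, DC fraction, CLVD fraction)."""
    iso = np.trace(M) / 3.0                                  # volumetric (opening/closing) component
    eig = np.sort(np.linalg.eigvalsh(M - iso * np.eye(3)))   # deviatoric eigenvalues, ascending
    # Common convention: epsilon = -lambda_mid / max(|lambda_min|, |lambda_max|)
    eps = -eig[1] / max(abs(eig[0]), abs(eig[2]))
    return iso, 1.0 - 2.0 * abs(eps), 2.0 * abs(eps)

# A pure strike-slip double couple (shear on a vertical fault), scalar moment 1e12 N*m
M0 = 1.0e12
strike_slip = np.array([[0.0, M0, 0.0],
                        [M0, 0.0, 0.0],
                        [0.0, 0.0, 0.0]])
print(decompose(strike_slip))   # -> (0.0, 1.0, 0.0): no volume change, 100% double couple
```

For a tensile opening event the double-couple fraction drops sharply, which is why the mechanism, and not just the location, is diagnostic.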
Information explosion
It sounds so simple, yet geophysicists still have much to figure out when it comes to interpreting the “explosion of data” they are experiencing today from advanced monitoring, says Joe Lynch, director of Reservoir Management at Landmark, a business line of Halliburton.
The deepwater offshore has always been “data poor” because it costs so much to acquire data, says Steven Crockett, senior product manager for Nexus Reservoir Simulation at Landmark. However, “now it is starting to experience the same data explosion that we’ve seen onshore,” he says.
“The traditional means of data gathering was to test a tank with a dip stick every month, and that was the amount produced from a well,” Lynch adds. “Now we are getting measurements on a second-by-second basis.”
Crockett adds: “It is becoming more and more critical that models mimic what’s going on in reality so that valid predictions can be made.”
Building better models
To build better models, data processing and interpretation must be streamlined so that an accurate picture of a reservoir can be produced quickly enough to allow operators to optimize well locations and address other completion challenges.
Information about a rock’s elastic properties, brittleness and failure behavior must be married with information about the stresses acting on it to predict how the rock will respond during production, Duncan says.
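A minimal sketch of that kind of prediction – assuming a simple Mohr-Coulomb failure criterion rather than the proprietary models Duncan refers to, with all numbers illustrative – resolves the effective stresses onto a candidate failure plane and checks whether the shear stress exceeds the rock’s strength:

```python
# Illustrative Mohr-Coulomb check: does a plane fail in shear under the
# given principal stresses and pore pressure? All parameters are assumptions.
import math

def fails_in_shear(sigma1, sigma3, pore_pressure, cohesion, friction_deg, theta_deg):
    """theta_deg is the angle between the plane's normal and sigma1; stresses in MPa."""
    s1, s3 = sigma1 - pore_pressure, sigma3 - pore_pressure            # effective stresses (Terzaghi)
    two_theta = math.radians(2 * theta_deg)
    sigma_n = 0.5 * (s1 + s3) + 0.5 * (s1 - s3) * math.cos(two_theta)  # normal stress on the plane
    tau = 0.5 * (s1 - s3) * math.sin(two_theta)                        # shear stress on the plane
    return tau >= cohesion + sigma_n * math.tan(math.radians(friction_deg))

# Illustrative numbers: a weak plane optimally oriented for a 30-degree friction angle
print(fails_in_shear(sigma1=60, sigma3=35, pore_pressure=30,
                     cohesion=2, friction_deg=30, theta_deg=60))   # -> True
```

Full geomechanical models repeat checks of this kind across an entire reservoir grid as stresses evolve during depletion.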
Yet, “the models we use today of how a rock will break are 2D. But, the Earth is not two dimensional, so groups are trying to develop 3D models,” Duncan says.
One way to calibrate a model is through rock-breaking experiments in a lab. However, it is difficult to duplicate in a lab the conditions found 10,000ft below the Earth’s surface, he says, so assumptions must be made.
“The best lab we have is the Earth itself,” Duncan says. “And, the only way to see it is through microseismic monitoring, which is now producing the results that geomechanical engineers are trying to produce in a lab. Once we get better models, we can better understand how reservoirs will respond.”
Furthermore, if the industry can more precisely measure where the hydrocarbons in a well are coming from, it can use that information to improve the quality of the predictions that come from the models, Lynch says.
“The better quality of models, the better quality of decisions that can come from those models at the end of the day,” he says.
Looking into his “crystal ball,” Lynch believes the offshore geophysical industry will see a push toward more automation, data management and predictive modeling to keep up with the heavy flow of information for operators who can’t afford to lose a drop of oil in today’s economic climate.
“What comes out needs to be actionable,” he says. “The turn-around time needs to be fast.”