With its latest software release, Paradigm offers high-definition seismic processing for the general user. Sarah Parker Musarra discussed the powerful new software with Paradigm SVP Indy Chakrabarti.
Accurate delineation of salt bodies with massively scalable pre-stack depth migration algorithms and visual quality control capabilities.
When a new technology is introduced, it is typically adopted in stages. Companies, like consumers, rarely throw large sums of money at a product every time the next big thing comes along.
The seismic industry is no different. With the introduction of high-definition data, companies are confronting a new technology and deciding how to maximize the value of these expensive seismic acquisitions. Exploration and production software provider Paradigm attempts to bridge this gap with its latest software enhancement, Paradigm 14.1, released in October.
While seismic acquisition companies have been shooting and acquiring high-definition data, Indy Chakrabarti, senior vice president of product management and strategy for Paradigm, says the Paradigm 14 series is a software solution dedicated to subsurface evaluation in high definition.
Chakrabarti says the industry is experiencing the same shift the television industry went through a few years ago. When broadcasters switched from a standard-definition to a high-definition signal, many viewers did not run out and purchase new televisions right away, even though a clearer picture was available. They could still watch television; they just could not take advantage of the more precise image. Similarly, many operators are gathering high-definition data but lack the software systems to properly use it. Paradigm 14.1 offers hundreds of new features, but Chakrabarti says there was no need to reinvent the wheel in creating this particular software system.
“Of course, [the software] needs to have all the traditional capabilities,” he explains. “It’s not like it’s a brand new system. Just like with the TV, it didn’t have to be reinvented; it just needed to be tuned in the right way.”
Under the tagline “advanced science for everyone,” Paradigm 14 addresses the three components needed to extract value out of high-definition data: performance, scientific techniques and ease of use.
“When you have higher definition, you have much larger volumes of data because you are now carrying more content. You have to be able to load those volumes, process data quickly, and then you have to be able to work with it,” Chakrabarti says, noting that a high-definition seismic volume can run to hundreds of gigabytes or even terabytes, so performance is critical.
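To give a sense of what working at that scale implies, the sketch below streams a large volume in chunks rather than loading it whole. The file name, survey dimensions and data type are hypothetical placeholders, and the code illustrates the general approach only, not Paradigm's implementation.

# Minimal sketch (not Paradigm's implementation): streaming a large seismic
# volume in chunks so it never has to fit in RAM. File name, shape and dtype
# are hypothetical placeholders.
import numpy as np

N_INLINES, N_XLINES, N_SAMPLES = 4000, 3000, 2000            # assumed survey size
volume = np.memmap("volume.bin", dtype=np.float32, mode="r",
                   shape=(N_INLINES, N_XLINES, N_SAMPLES))   # roughly 96 GB on disk

CHUNK = 100                                                  # inlines per pass
rms = np.empty((N_INLINES, N_XLINES), dtype=np.float32)      # one attribute map
for start in range(0, N_INLINES, CHUNK):
    block = np.asarray(volume[start:start + CHUNK])          # read one slab at a time
    rms[start:start + CHUNK] = np.sqrt((block ** 2).mean(axis=-1))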
Scientifically, Paradigm 14.1 can deploy new techniques because of the data’s high level of precision, including quantitative interpretation (QI), which, in the seismic domain, characterizes rock, fluid and flow inside a reservoir. Large-scale interpretation has always been part of working with seismic data, but the upgraded software allows for a reduction in the number of false-positive identifications of hydrocarbons.
“Drilling down into details, to get into that fine, rich, additional information so that you can discriminate between false positives and real hydrocarbons is what the technique of QI is all about,” Chakrabarti says. “It’s what you can do because your data is in high definition.”
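QI covers a family of techniques. As one illustration, the sketch below performs a textbook two-term AVO (amplitude-versus-angle) fit, a common screening step for separating likely hydrocarbon responses from false positives. The angles and amplitudes are invented inputs, and nothing here should be read as the QI workflow in Paradigm 14.1.

# Minimal sketch of one textbook QI screening step (two-term AVO fit),
# not Paradigm's workflow: fit R(theta) ~ A + B*sin^2(theta) per sample,
# then crossplot intercept A against gradient B to flag anomalies.
import numpy as np

angles_deg = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])   # assumed incidence angles
gather = np.random.randn(500, angles_deg.size)                # stand-in amplitudes: samples x angles

sin2 = np.sin(np.radians(angles_deg)) ** 2
G = np.column_stack([np.ones_like(sin2), sin2])               # design matrix [1, sin^2(theta)]
coeffs, *_ = np.linalg.lstsq(G, gather.T, rcond=None)         # least-squares fit per sample
intercept, gradient = coeffs                                  # A and B traces

# Anomalies are typically judged by where (A, B) falls relative to a
# background trend fitted over the survey; here we simply report the pair.
print(intercept[:3], gradient[:3])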
Another way Paradigm 14.1 attempts to appeal to an ever-changing industry is through its ease of use. Chakrabarti explains that advanced techniques like QI were previously carried out only by specially trained personnel. That is no longer always the case.
“Now with high-definition data, an operator has the opportunity to let lots more folks do this kind of advanced analysis,” he says. “Not only do they have the opportunity, they have the need.
“We see declining oil prices in the news today. What that means for operators is that they have to get more efficient in the way they explore. So there’s an opportunity with high-definition data and there’s a need to get more productive.”
Paradigm 14 can also apply advanced techniques to older data collected in standard definition through broadband processing, which is often referred to by the name of one of its subcomponents: de-ghosting.
“If you’ve got high-definition data, then we have a system that has the performance, the scientific techniques, and the ease of use for you to extract information out of it,” Chakrabarti says. “If you don’t have high-definition data, we have a technique that lets you enhance the definition of your standard legacy data.”
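As a rough illustration of what de-ghosting involves, the sketch below applies a textbook frequency-domain correction for the receiver-side ghost by stabilized spectral division. The receiver depth, water velocity and reflection coefficient are assumed values, and this is not Paradigm's broadband algorithm.

# Minimal sketch of deterministic receiver-side de-ghosting by stabilized
# spectral division, a textbook formulation rather than Paradigm's broadband
# processing. Receiver depth, water velocity and reflection coefficient are
# assumed values; the input trace is a stand-in.
import numpy as np

dt = 0.002                        # sample interval, s
n = 2048                          # samples per trace
depth = 10.0                      # assumed receiver depth, m
v_water = 1500.0                  # assumed water velocity, m/s
r = 0.98                          # assumed sea-surface reflection magnitude
tau = 2.0 * depth / v_water       # ghost delay, s

trace = np.random.randn(n)        # stand-in for a recorded (ghosted) trace

freqs = np.fft.rfftfreq(n, dt)
ghost = 1.0 - r * np.exp(-2j * np.pi * freqs * tau)   # ghost operator G(f)
eps = 0.01 * np.abs(ghost).max()                      # stabilization to avoid notch blow-up
deghosted = np.fft.irfft(np.fft.rfft(trace) * np.conj(ghost)
                         / (np.abs(ghost) ** 2 + eps ** 2), n)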