Just spin it the right way, and digital technology, working with standardized infrastructure and sprinkled with new-age seismic sensors, analytics and algorithms, produces Big Data that transforms workflows, improves collaboration and enhances production and profit.
“Automation in mixing and pumping is out there,” said Jason Dykstra, Ph.D., senior technical advisor and manager of Halliburton’s research group, the Automation Center of Excellence. “But the full automation of the process is still to happen.”
Acceptance of new automation and new systems has taken so long because the work is difficult, requires a large investment and raises fears of lost jobs. More doors are now open to these ideas and changes upstream. Even so, upstream still lags in automation behind downstream, which has pioneered the use of advanced automation techniques since the 1980s, Dykstra said.
A term used loosely in the industry for years, automation is significant now because of its relation to Big Data. Today, automation can capture data, put it in proper context, get the information to the right people and enable real-time decision making. This allows systems to do what humans can’t. By accepting the shift to more supervisory roles, crews can spend more time overseeing operations and making real-time, profit-making decisions based on data that has been automatically pre-analyzed.
The complexity of these systems is why the automation model is still evolving; after all, an offshore platform is not the place to experiment with a new system. But once people accept that automated systems work better, they are more open to adopting them. With the capability gap in the workforce growing at all levels of operations, companies will have to make the investment, Dykstra said.
NOW’S THE TIME
“The easy days are over. It’s a technology race,” said Arjen Dorland, the man charged with reinvigorating Royal Dutch Shell, the world’s second largest company, with a 100 percent increase in computing power. His work uses the latest seismic sensors linked to new interpretative software and visualization applications that make new information available to a wider internal audience, all to reduce the cost of drilling thousands of wells with increased speed and efficiency. Preparing for the onslaught of Big Data, Dorland was named in 2011 to lead Shell’s new effort, called Technical and Competitive Information Technology.
As Shell Chief Executive Peter Voser told The Wall St. Journal, information technology (IT) innovation is critical for the company to become the most competitive and innovative energy company in the world. While Shell declined to comment specifically on the amount of its investment in IT, its capital spending rose to $38 billion in 2013 from $32 billion in 2012, which was itself up 23 percent from $24.6 billion the previous year, according to the company.
“Transforming Big Data to enable better decisions requires significant work,” said Charles Peters, senior executive vice president at Emerson. “Big Data opens a sea of opportunities.” Organizations simply must work through the challenges to ensure their numerous inputs are useful ones. “The potential looms large for those organizations that commit – to better business processes, to destroying organizational silos, to smarter products and to solutions that allow our customers to prosper.”
Exploration and production has always been data-driven; what is new is what drives the data. Technology is now available to process data of unprecedented scale and scope as streams rather than fixed datasets, accumulating in large volumes at high velocity. The potential is enticing: more oil from existing wells, a hedge against price volatility, a buffer against risk, better use of diminishing talent, remote operation and, the holy grail, the fully automated rig and higher profitability.
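What stream-style processing can look like in practice is easier to see with a few lines of code. The sketch below is illustrative only: it assumes a generic feed of timestamped pressure readings, and the window size, alarm limit and function names are made up for the example rather than drawn from any vendor's system.

```python
from collections import deque
from statistics import mean

def rolling_alerts(readings, window=60, high_limit=950.0):
    """Consume an unbounded stream of (timestamp, psi) readings and yield an
    alert whenever the rolling average climbs above the limit."""
    buf = deque(maxlen=window)            # fixed-size window; old samples fall out
    for timestamp, psi in readings:       # `readings` can be an endless generator
        buf.append(psi)
        if len(buf) == window and mean(buf) > high_limit:
            yield timestamp, mean(buf)    # surface only the moments that matter

# Usage: in a real deployment the generator would wrap a live sensor feed;
# here a canned ramp of readings stands in for it.
sample = ((t, 900.0 + 0.5 * t) for t in range(200))
for ts, avg in rolling_alerts(sample):
    print(f"t={ts}: rolling average {avg:.1f} psi exceeds limit")
    break
```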
Advanced technology is the spine of 21st century energy development. Just take a look at Chevron’s internal IT traffic, which exceeds 1.5 terabytes a day. In one case, a large seismic data processing center “will gather the power of 20,000 personal computers to crunch a single seismic data set,” said Jay R. Pryor, vice president, business development, Chevron Corp., in a speech to the World National Oil Companies Congress.
“The oil and gas industry is recognizing that there could be untapped value in data that has been previously unexamined or inaccessible,” said the global research and consulting firm IDC Energy Insights, adding the industry is starting to think about whether there is value in analyzing data across disciplines. For example, could seismic data, typically the province of exploration, be used to enhance oil production?
IDC expects the Big Data technology and services market to grow from $3.2 billion in 2010 to $16.9 billion in 2015, according to Jill Feblowitz, vice president at IDC Energy Insights.
Big Data is generally defined by volume, velocity, variety and value, which Invensys Operations Management translates into the “four rights”: the right information to the right people at the right time with the right context.
Because of the torrent of data, people still spend about 70 percent of their time preparing data, the same as 20 years ago. What they need, Invensys executives explain, is better context. That context is essential.
Simply stated, the context needed for the data is the business objective of the rig: increase the quality or quantity of production, or reduce the cost of operation, drilling or energy use, while improving safety and environmental standards. With that context established, the operator can use the automation tools to quickly arrive at a strategy to achieve the objective. Don’t gather data for the sake of gathering data; use it to turn a business strategy into profit, Invensys executives urge.
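One way to picture the “right context” idea is as a thin wrapper around each measurement that carries the asset, the business objective and the responsible role along with the number, so a simple rule can decide who needs to see it. The sketch below is a loose illustration of that idea under assumed field names and limits; it is not Invensys software or any real system's schema.

```python
from dataclasses import dataclass

@dataclass
class ContextualReading:
    """A raw measurement carried together with its business context."""
    asset: str        # equipment tag, e.g. a mud pump (hypothetical)
    value: float
    unit: str
    objective: str    # the business objective the reading supports
    owner: str        # the role that should act on it

def route(reading: ContextualReading, limit: float) -> str:
    # Escalate only when the number actually threatens the stated objective.
    if reading.value > limit:
        return (f"Notify {reading.owner}: {reading.asset} at "
                f"{reading.value} {reading.unit} threatens '{reading.objective}'")
    return "Log only"

reading = ContextualReading("mud-pump-2", 96.0, "% load",
                            "reduce drilling cost", "drilling supervisor")
print(route(reading, limit=90.0))
```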
The key to understanding the business objective is the overlay of policies and procedures involving contractors and operators. Difficulty in ensuring compliance with those policies and procedures is inherent in a business model that doesn’t allow companies working on the same rig to cooperate.
“We can’t change the culture, but we can change behaviors; we can ensure that policies and procedures are responded to. This is why people are investing in process improvement,” said Victor Lough, Invensys Operations Management product manager.
While Big Data is revered as a silver bullet, it can also look like a shortcut. “Let’s remember policy and procedure, then apply technology. With Big Data you need software, but you must remember that you also need the boundaries from policies and procedures,” Lough said.
As the industry faces an unprecedented drain of expertise from retirements, experience and knowledge can be leveraged effectively with a process safety management system. And because the system makes the costs of decisions more visible, decisions are acted on faster, saving time and money as well as maximizing talent.
NEW FRONTIER OR WILD WEST
Senate Committee chairman Jeff Bingaman stated at the first Deepwater Horizon oil spill hearing in 2010, “At the heart of this disaster are three interrelated systems: A technological system of materials and equipment, a human system of persons who operated the technological system, and a regulatory system. These interrelated systems failed in a way that many have said was virtually impossible. We will likely discover that there was a cascade of failures: Technical, human and regulatory.”
In fact, more than 40 percent of all safety/reliability-related incidents are caused by human factors, reported the Norwegian Continental Shelf Petroleum Safety Authority in 2011. “Many of these failures could have been avoided if the management and field-based operations were situationally aware and able to take preventative action using well-defined operating procedures,” Lough noted in a paper (SPE 146289, Mobile Workforce Integration with Process Safety Management Framework Enables Sustained Improvement) presented at the SPE Offshore Europe Oil and Gas Conference in Aberdeen, UK.
The days of “near misses,” that is, when luck overrides disaster, should be long gone with the deployment of Big Data solutions that allow real-time visibility through to the control center of the operation.
To further reduce risk and its associated costs, technology for condition monitoring needs to be combined with attention to human factors.
With the development of an automated enterprise control system, a company can generate the consistent data needed for an operation with less risk and downtime, while using technology to analyze interoperable layers of data.
“In the past, such a safety system was seen as a cost, not a profit; however, safety is reliability and reliability is profit,” Lough said.
SETTING THE PACE
“Chevron set out to reinvent and automate operations using existing, emerging and yet-to-be-developed technologies and workflow enhancements,” said Mike Hauser, program manager of Chevron’s Upstream Workflow Transform (UWT) effort, in Chevron’s online publication “Next.” Under a broad business priority known as the digital oil field, the new program is the result of a decade of investment in infrastructure and instrumentation, mostly in Chevron’s North America operations. Now the company wants to extend the proven solutions and safety gains from its U.S. oil and gas fields to its operations on six continents.
Industry results show up to 25 percent in operating cost savings, up to 8 percent higher production rates, 2 to 4 percent lower project costs, and as much as 6 percent improved resource recovery within the first year of deployment of a “digital oil field,” according to global information company IHS CERA, which is tracking projects at a dozen companies.
“Chevron has been a leading light, one of the early industry drivers, and they’ve worked methodically and thoughtfully to become one of the top three companies working on the digital oil field,” said Judson Jacobs, IHS CERA research director.
“We used technology to change what we do, rather than optimize what we have always done,” said Jim Williams, one of the key managers of Chevron’s foray into the digital oil field.
Each day at Chevron’s big Sanha Field off the coast of southern Africa, operators inject millions of cubic feet of natural gas, an essential task at a complex facility that produces millions of barrels of ultra-light oil per year. When a compressor showed subtle signs of overloading, the first person to notice was 6000 miles away in Chevron’s Machinery Support Center (MSC) in a Houston office tower.
Now this and other Chevron upstream operations have solid backup to detect any similar situations with the teams and technologies at the global MSC. The MSC actually evolved from an earlier surveillance center designed to monitor compressors in the Gulf of Mexico and California.
“We’ve seen a revolution in sensors to measure what’s happening down in the wells and in production equipment and have seen major advances in process instrumentation. And we’ve connected hardware and data to field performance models, continually analyzing information and making optimum decisions to maximize output,” Hauser said.
INTEGRATED TECHNOLOGY
It’s about automation, but also integration: linking once separate functions, such as maintenance and drilling, and managing them within value chains. This requires streaming all relevant data into asset-decision environments, which fuse humans, data and technologies in a collaborative setting, said Chevron i-field specialist Darrell Carriger. “Centralized surveillance allows management by exception, which enables a more efficient use of the workforce than manually checking every well and facility.” For example, a malfunction that might reduce output by 200 barrels in a week is caught and fixed in a day.
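A rough sense of how management by exception works can be given in a few lines: rather than an operator eyeballing every well, a surveillance job scans the whole fleet and surfaces only the assets drifting outside an expected band. The well names, rates and tolerance below are invented for illustration; they are not Chevron's actual data or tooling.

```python
def exceptions(latest_rates, expected_rates, tolerance=0.05):
    """Return only the wells whose measured rate falls more than
    `tolerance` (as a fraction) below the expected rate."""
    flagged = {}
    for well, measured in latest_rates.items():
        expected = expected_rates.get(well)
        if expected and measured < expected * (1 - tolerance):
            flagged[well] = (measured, expected)
    return flagged

# Illustrative fleet snapshot (bbl/day); a real system would pull these from
# historian and production-model services.
expected = {"GOM-A-12": 1800.0, "GOM-A-13": 950.0, "GOM-B-07": 2200.0}
latest   = {"GOM-A-12": 1795.0, "GOM-A-13": 870.0, "GOM-B-07": 2210.0}

for well, (got, want) in exceptions(latest, expected).items():
    print(f"{well}: {got:.0f} bbl/d vs {want:.0f} expected -- flag for the surveillance center")
```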
Before the support centers were conceived, Chevron’s Gulf of Mexico operations created an Offshore Logistics Decision Support Center to streamline the constant coordination of vessels, supplies, equipment and people moving between shore bases and hundreds of structures. Within a year of opening, the center was logging cost savings from smarter vessel usage and fleet management, and safety gains as well, Hauser said.
It is no surprise the Gulf of Mexico presents major challenges for data managers and IT professionals. The difficult job of tracking and then correcting data issues, such as missing wells and wellbores, was creating a growing resource allocation problem for Chevron. It was also becoming harder to keep information in sync with the steady stream of new well data entering the master database from various vendors and government agencies, Schlumberger explained in a case study.
Chevron ended up having to manage huge amounts of well data stored in approximately 100 project databases, ranging in size from 200 to 17,000 wells. To actively find errors, correct them and keep project data in sync with the most current well data, Schlumberger engineered several advanced software packages that gave Chevron full data quality management automation.
Chevron now has higher quality data in master and project data stores, which allows personnel to focus on exceptions or other situations that need expert attention. Standardization and the high level of accuracy from the upgrades have improved Chevron’s Gulf operation.
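The data quality automation described above can be imagined as a set of rule-based checks run continuously against a master well list. The sketch below shows one such rule, finding wells that are missing from or stale in a project database and pulling the authoritative record across; the dictionaries and field names are stand-ins for illustration, not Schlumberger's or Chevron's actual schema.

```python
def audit_project(master: dict, project: dict):
    """Return (missing, stale) well identifiers for one project database."""
    missing = [uwi for uwi in master if uwi not in project]
    stale = [uwi for uwi, rec in project.items()
             if uwi in master and rec["revision"] < master[uwi]["revision"]]
    return missing, stale

def sync_project(master: dict, project: dict):
    """Copy the authoritative master record over any missing or stale well."""
    missing, stale = audit_project(master, project)
    for uwi in missing + stale:
        project[uwi] = dict(master[uwi])
    return missing, stale

# Toy master and project stores keyed by well identifier.
master_db  = {"W-001": {"revision": 4}, "W-002": {"revision": 2}, "W-003": {"revision": 7}}
project_db = {"W-001": {"revision": 4}, "W-003": {"revision": 5}}

added, refreshed = sync_project(master_db, project_db)
print("added:", added, "refreshed:", refreshed)   # added: ['W-002'] refreshed: ['W-003']
```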
As a result, UWT is building an enterprise version of the Gulf of Mexico’s logistics solution for deployment across all major Chevron upstream operations.
Recognizing the value of designing and building its major new projects as digital oil fields, the company is investing at least $1 billion in each of 40 energy developments. Within 10 years, 50 percent of company production is expected to come from today’s big projects. “We’ve set a course to fully harvest the potential of the digital oil field,” Hauser said.
HYPE, HOPE OR HAPPENING?
So why are the applications of Big Data and analytics in E&P still “in the experimental stage,” as claimed by IDC and other industry observers?
In IDC’s 2012 Vertical IT and Communications Survey of oil and gas companies based in the United States, 70 percent of the 144 respondents were not aware of the terms Big Data and analytics. The most highly ranked (at 22.5 percent) barrier to adoption of Big Data and analytics was the lack of business support and/or business units not understanding the benefits of Big Data.
The market is still uncertain about the costs and requirements of Big Data and analytics. There have not been enough cases for the industry to weigh the business benefits of Big Data and analytics against the level of investment required to achieve greater reliability and speed, IDC concluded.
Deloitte Consulting, in a report entitled “The Insight economy, Big Data matters – except when it doesn’t,” cautions, “the goal is more insight, not more information.”
Deloitte suggested “crunchy questions” to chew on: (1) What are the five most critical business decisions your organization made last year? (2) How many of them should have had better information? (3) Who within your organization is making sure you have better information next year?
Deloitte advised: “Whether you’re looking for quants who understand business or up-and-coming leaders who ‘get’ analytics, there are simply not enough data scientists to go around. Fortunately, you already have some of the talent you need in-house. Identify them. Understand them. Take care of them. And make sure they have opportunities to learn, grow, and be fulfilled.
“Getting Big Data right means aligning information capital, human capital and organizational capital to build a culture of disciplined decision-making. Analyzing data. Converting data into actionable insights. Generating foresight and creating incentives for people to make effective decisions no matter where they work.”
Getting Big Data right will also transform workflows, improve collaboration and enhance production and profit.