Gregory Hale explains why real-time data transfer doesn’t always cut it in the offshore oil and gas environment. Real-time information is no longer enough. At Chevron’s Machinery Support Center (MSC) in Houston, 7,000mi from the company’s Sanha field off the coast of Angola, a team of engineers watching the data noticed subtle but potentially troubling signs that a compressor was heading toward overload.
Operators on platforms need to know what is going to happen before things start to unfold. It sounds like magic, but it is an issue of knowing the system and understanding key data points.
“Real-time isn’t good anymore. Right now, especially in oil and gas, the easy to operate wells as a percentage are diminishing,” said Stan DeVries, senior director - solutions architecture at Schneider Electric. “If I am an expert because of my experience, I am on the beach. If somebody with less experience needs my help, by the time they contact me it is too late for me to help, then it is useless. I can’t just have real-time of what is in front of the hood, I need the GPS view of what is happening around the corner,” DeVries said.
The crew working on-site at Chevron’s facility may or may not have found the issue in time, but having the backup team watching the data saved the company millions in downtime.
“The crew acted on the MSC’s tip and avoided a couple of million dollars in downtime and lost production,” said Fred Schleich, machinery and electrical power system manager at Chevron, in Chevron’s Next technology magazine.
The future is here
“This is not ‘Star Wars,’ there is a lot we have put together, it has to be baked into transforming the work,” DeVries said. “Especially with wells with smaller reservoirs. There is not an underground lake anymore.”
Smaller wells have a shorter lifespan and are harder to operate. If the operator has to interrupt the flow, there is a higher potential for things to go wrong quickly. Information needs to get to the proper personnel as soon as possible.
“I talk about only the right information and only the right context in only the right time – which could be ahead of real-time – to the right person,” he said. “Shell calls that bringing the work to the worker. Once the work comes to me, I want to browse around and discover patterns the software can’t figure out. That is a transformation of information to support the transformation of work.”
That transformation, as DeVries calls it, is the cornerstone of how to manage information so managers and planners have access to comprehensive performance data to assess and capitalize on opportunities.
“We are seeing a much higher demand for information transport from the offshore environment to an onshore location of some kind,” said John Gilmore, director of global application consulting at Schneider Electric. “Early drivers were Macondo, where they had a four-hour gap in information from when the last data set was sent from the drillship until the incident. There are now people saying we have to ship data more frequently and even in real-time.”
While that data would not have saved Deepwater Horizon, getting the information to the experts would have told them much more quickly what had happened.
“The bulk of the data went down with the vessel,” Gilmore said. “There are a lot of people saying they want real-time data now.”
Reducing staff for safety, costs
Reacting to a disaster and developing plans from it is one thing, but another emerging trend is using automation to reduce staff levels offshore. Automation not only offers increased levels of process control, it also becomes a safety and cost factor.
“It is a combination of business process design and then data system enhancement, in particular bandwidth, in getting the data to the beach,” Gilmore said. “What I have seen so far is yes, we can do the data, but we don’t necessarily eliminate things until we look at the business organization or the clerical staffing.”
That is also where the cost factor comes into play.
“The number I hear is US$5,000 to $10,000 a week in costs per person,” Gilmore says. “There is a manual checking and rechecking verification process that we will probably never eliminate, but right now that is a little bit of three or four guys’ jobs. Can we restructure the business process where we have one guy that does a lot of data checking and the other three can be on the beach?”
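A quick back-of-envelope sketch makes Gilmore’s restructuring argument concrete. The figures below are only the quoted US$5,000–$10,000-per-person weekly range; the four-to-one staffing split is the hypothetical he describes, and the result is gross savings that ignore any onshore replacement cost:

```python
# Gilmore's quoted range: weekly cost per person offshore
low, high = 5_000, 10_000

# Hypothetical restructuring: four data checkers reduced to one
# offshore, so three move "to the beach"
moved_onshore = 3

weekly_savings = (moved_onshore * low, moved_onshore * high)
annual_savings = (weekly_savings[0] * 52, weekly_savings[1] * 52)

print(weekly_savings)   # (15000, 30000)
print(annual_savings)   # (780000, 1560000)
```

Even at the low end of the range, the gross figure runs to several hundred thousand dollars a year per vessel, which is why the business-process question comes before the data-system question.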
“You are trying to do the work better, you can throw technology at it or the people, but you change the work. That can be very disruptive for people and that means organizational change management is required,” DeVries says.
There is no doubt that having people, process and technology all working in sync ends up being the Holy Grail in the automation environment offshore. With fewer workers, that means technology needs to improve so workers can make informed decisions.
One example is a heat exchanger. “When a heat exchanger starts out clean as you possibly can get it, you don’t expect to take it out of service for up to five years,” DeVries says. “Today’s technology keeps calculating efficiency; you can do that every hour or every day. You can connect that information with some relatively simple calculations and say that well will only last so long and the heat exchanger will last only so long to a threshold. It is not an alarm, but it is a trigger for decisions. So the decision may be we can live to fight another day. We are making a production target and we will take the hit and we will move on when the well completes. We can work together and make the decision on scheduling on what we can do to operate less severely.”
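The trend-to-threshold idea DeVries describes can be sketched in a few lines: log periodic efficiency calculations, fit a simple trend, and estimate when efficiency will reach a decision threshold. This is a minimal illustration, not any vendor’s implementation; the readings, the 0.80 threshold, and the linear-fouling assumption are all illustrative.

```python
def hours_to_threshold(samples, threshold):
    """samples: list of (hour, efficiency) readings, oldest first.
    Returns the estimated hours from the last reading until efficiency
    falls to `threshold`, or None if the trend is flat or improving."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_e = sum(e for _, e in samples) / n
    # Ordinary least-squares slope: efficiency change per hour
    num = sum((t - mean_t) * (e - mean_e) for t, e in samples)
    den = sum((t - mean_t) ** 2 for t, _ in samples)
    slope = num / den
    if slope >= 0:
        return None  # not degrading, so no trigger
    last_t, last_e = samples[-1]
    return (threshold - last_e) / slope

# Illustrative daily readings drifting downward as the exchanger fouls
readings = [(0, 0.92), (24, 0.91), (48, 0.905), (72, 0.895), (96, 0.885)]
remaining = hours_to_threshold(readings, threshold=0.80)
```

Crossing the threshold is a trigger for a scheduling decision, not an alarm: the estimate can be set against the well’s expected remaining life to decide whether to operate less severely or take the hit and move on.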
Consolidating technology
All that data needs to pass along to command centers onshore, like the Chevron MSC, and to the executive suite. Users now want to bring everything under one roof.
Centralization of data and the ability to relate different sources into a single place is a driving force offshore, Gilmore says.
“Historically, we have had a marine control system, which runs the bottom and reports relatively little information to the topside crew. Now we are seeing the trend that says I want all the data in one place. I want to manage my entire vessel from one window,” Gilmore says. “Yes, the topside may have different control and different operators assigned but I want the data together so inferences can be made and responses made. We are seeing integration of the hull systems, ballast management, mooring management, they want propulsion DP, all of those integrated into that system as well. They also want to integrate the hotel, everything about the people, HVAC, even food management. The driver is not only to reduce the amount of equipment by centralizing stuff, but it is also driver to get the data to do the analysis, and to get the data to the beach.”
Getting information to the C-level and to technology knowledge workers “has been around since 1995,” DeVries says. While some of the technology has evolved, the idea and the method have remained constant. But, DeVries adds, only a few users are communicating enterprise-wide. That means users have the potential to reap benefits once they start employing a complete automation program.
DeVries explains: “42% of errors that lead to unplanned shutdowns are caused by people, and that is from experienced workers. What would the number be with less experienced workers?”
Whether driven by a shortage of workers or the embrace of new technologies, users who want to grow, raise productivity and become more profitable now have to manage information to understand performance data and to jump on opportunities faster and more intelligently – before real-time.
Gregory Hale is the Editor and Founder of Industrial Safety and Security Source (ISSSource.com) and is the Contributing