

Analysis In-Depth

By Elaine Maslin

Big data gets even bigger

Big data is not just for the seismic segment. Elaine Maslin takes a look at other areas, such as condition monitoring, where the industry could make significant gains.

Big data is here and there’s no stopping it, a speaker at a recent oil conference told the audience.

It is going to get faster and faster, in a lot of areas, he warned, saying that such was the pace of growth it could easily swallow up a lot of resources.

Big data has indeed been growing. Total launched its 2.3 petaflop supercomputer Pangea in 2013, and the system went on to help analyze seismic data from Total’s Kaombo project in Angola in nine days, compared to the four and a half months it would have taken before. The same year, Eni launched its latest supercomputer, with 3.3 petaflop capacity. Earlier this year, Petroleum Geo-Services topped them all and ordered a new 5 petaflop system.

But these machines have mostly focused on seismic modeling and data interpretation. What about the broader industry?

Awash with data

Tor Jakob Ramsøy, director, McKinsey & Co., says that while the industry more or less invented big data 30 years ago to handle seismic modeling, it’s “not there” in other areas, such as condition monitoring, where significant gains could be made.

“Information is something this industry is not understanding the value of,” he told the Subsea Valley conference in Oslo earlier this year. “You understand the assets in the ground – the geology. What is being done with the other information? [The industry is] just imposing more and more data tags, but there is no evidence of how it is making money out of it [the information gathered from these tags].”

The Johan Sverdrup development will have 70,000 data tags, he says, compared to 30,000 on the North Sea platform. Yet, in production and operations departments, there are no data scientists. “The industry is drowning in information, but it doesn’t get to those who need to use it,” he says.

A lot of data collected is also wasted. According to Ramsøy, some 40% of data is lost because sensors are binary, i.e. they simply show whether a parameter is above or below where it should be. That is important information, but it doesn’t give data for trending to aid decision making or planning. More data is then lost because there’s no interface to enable real-time use of analytics.
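To illustrate the distinction (a minimal sketch in Python, not drawn from the article; the readings and alarm limit are hypothetical): a binary sensor only records whether a value has crossed its limit, whereas continuous readings retain enough information to estimate a trend and flag a problem before the limit is ever reached.

# Minimal sketch: binary threshold reporting vs. continuous readings for trending.
# The temperature values and alarm limit below are illustrative assumptions.

readings = [61.0, 61.8, 62.5, 63.1, 64.0, 64.9, 65.7, 66.4]  # hourly readings, deg C
ALARM_LIMIT = 70.0  # assumed alarm threshold

# A binary sensor reports only above/below the limit: no alarm yet, nothing to trend.
binary_log = [temp > ALARM_LIMIT for temp in readings]
print(binary_log)

# Continuous readings allow a simple least-squares slope estimate,
# so a rising temperature can be acted on before the limit is breached.
n = len(readings)
hours = list(range(n))
mean_h = sum(hours) / n
mean_r = sum(readings) / n
slope = sum((h - mean_h) * (r - mean_r) for h, r in zip(hours, readings)) / sum(
    (h - mean_h) ** 2 for h in hours
)
hours_to_limit = (ALARM_LIMIT - readings[-1]) / slope if slope > 0 else float("inf")
print(f"Warming at {slope:.2f} C/hour; roughly {hours_to_limit:.0f} hours until the alarm limit.")

In this sketch the binary log is still all False even though the trend already points to an alarm a few hours away, which is the kind of planning information Ramsøy argues is being thrown away.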

Further, data management is ad hoc and infrastructure is limited in terms of high-speed communication links, so little is streamed onshore. Forty percent of data generated isn’t stored for future use and the remainder is only stored on the

A view of the server room at BP’s Center for High-Performance Computing, which opened in 2013. Photo from BP.
