Advancing Geoscience Through Science & Experimentation

In a recent (somewhat rowdy, by his own admission) blog post, Teradata's brilliant geoscience expert Duncan Irving addressed multiple frustrations with the uptake of data analytics in oil and gas. The post is aptly named "4 Reasons Oil & Gas Companies Are Going to Fail in a Big Data World". Although we don't fundamentally disagree with Duncan's views, we feel that a slightly more nuanced view wouldn't hurt.

Duncan's third point is that "Oil companies have lost their (geo) technical capability". He writes "Tools like Petrel are replacing the holistic approach and even deterring people from testing science-driven hypotheses." His concern is that "The result is a lack of candidates to become the upstream data scientists that can discover new insights in the available data. If no one in the oil company can apply the science, then analytical discovery just can’t happen." Let's look into that from a couple of different angles.

We are old enough that we had to code up our first word processor before we could use it. Leaving aside the question of whether this helps produce great prose, it is obvious that writing your own word processor in order to write prose isn't very productive. That is the primary reason oil companies use software suites. Our chairman, Dan Piette, wrote: "The oil and gas industry has seen two prior waves of disruptive software technologies that radically changed how businesses operated. Landmark Graphics (now owned by Halliburton) introduced the first in the 1980s, whereas Technoguide (now owned by Schlumberger) led the second around the turn of the millennium. In both cases, the improvements in software resulted in more efficient use of capital (due to more discoveries and fewer dry holes) and the creation of billion-dollar businesses. [...]" In other words, what the first- and second-generation geoscience products provided was a level playing field and productivity.

What Duncan points out is that in that process, oil companies lost their (geo) technical capability. We should not forget that, simultaneously, the service business grew drastically: what oil companies used to do themselves, including developing their own software, is now increasingly done by software companies and oilfield service companies. So, if Duncan is right that users are "deterred from testing science-driven hypotheses", what are the reasons?

1. "Open" vs "Closed". What might be said about both the first and second generation geoscience software, is that these packages were fairly closed. Although most offered some form of programming interface, the extent to which these programming interfaces are open and what they expose are probably details that are lost on a CIO. Without going into technical nitpicking, the key point is that the APIs that are generally available are fairly limited. Only a subset (20-30%?) of what's in the underlying application is exposed through these APIs, and these APIs weren't provided for "testing science-driven hypotheses" - they are there so that oil companies can add their own modules to the software suite and deploy that within the company. In other words, whereas the 2nd generation geoscience software offered some ability to "plug in", these APIs didn't offer the required level of exposure needed for oil companies to really experiment. Only the provider of the software has full access to do that.

2. Methods & algorithms. For oil companies to go back to first principles (as Duncan advocates) and not be deterred from doing so (see the point above), users as well as researchers need practically unrestricted access to data and the ability to experiment. The first part is not as trivial as one might think: unrestricted access might mean terabytes of prestack data, or 50,000 wells, and the second-generation geoscience products were never written with that idea in mind. Systems like Teradata's do support data access at that scale, but are geoscientists able to experiment within the context of those systems? We would argue that they aren't. Although the usability of data-visualization software like Qlik and Spotfire has evolved a lot, the fact remains that packages like these do not really understand the domain specificity of the data. What is really needed for testing science-driven hypotheses is unfettered data access combined with the ability to write and run your own algorithms and methods, in a way geoscientists can actually do themselves, and with the interactivity and display capabilities they are familiar with (see the sketch below).
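
To make both points concrete, here is a minimal sketch in Python (the language Headwave itself exposes) of the kind of open, write-your-own-algorithm workflow we mean: stream a large prestack dataset gather by gather and apply a home-grown attribute to each one. The load_gathers function is a hypothetical stand-in for whatever actually serves the data, and the attribute is deliberately simple; only the NumPy calls are real.

import numpy as np

def load_gathers(path):
    # Hypothetical loader yielding (offset, time) gathers as NumPy arrays;
    # here it just fabricates a few small ones as a stand-in for terabytes.
    for _ in range(3):
        yield np.random.rand(60, 2000)     # 60 offsets x 2000 time samples

def avo_gradient(gather):
    # A home-grown attribute: slope of amplitude versus offset at each sample.
    offsets = np.arange(gather.shape[0])
    slope, _ = np.polyfit(offsets, gather, deg=1)   # one fit per time sample
    return slope

results = [avo_gradient(g) for g in load_gathers("survey/prestack")]
print(len(results), results[0].shape)      # 3 (2000,)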

When Headwave made the Python API available, Ron Masters, Headwave's brilliant Geoscience Advisor, aptly called the feature "...if only I could try my idea...". And that is the point of science-driven testing, isn't it? Geoscientists have domain knowledge derived from years of working various basins, and the really great ones are able to go back to first principles and ask questions that sometimes change things.
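
As a purely illustrative example of "trying an idea", the snippet below pools porosity and velocity samples from a set of wells and fits a single trend. The read_logs helper and the synthetic curves are our own stand-ins for illustration, not Headwave's actual API.

import numpy as np

def read_logs(n_wells=50):
    # Stand-in loader returning synthetic (porosity, velocity) samples per well.
    rng = np.random.default_rng(0)
    for _ in range(n_wells):
        phi = rng.uniform(0.05, 0.35, 200)                   # porosity fraction
        vel = 5500 - 8000 * phi + rng.normal(0, 100, 200)    # velocity, m/s
        yield phi, vel

# The "idea" under test: does one porosity-velocity trend hold across all wells?
phi_all, vel_all = map(np.concatenate, zip(*read_logs()))
slope, intercept = np.polyfit(phi_all, vel_all, deg=1)
print(f"velocity ~ {slope:.0f} * porosity + {intercept:.0f}")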

More information on Headwave's Foundation for R&D (which is completely open and provides unfettered data access) is available here.

Original Teradata article: http://www.forbes.com/sites/teradata/2015/09/16/4-reasons-oil-gas-companies-are-going-to-fail-in-a-big-data-world/
