Seismic data and exploration geophysics pose some of the oil and gas industry's biggest big data challenges.
In WSJ’s CIO Journal in December 2017, Deloitte analysts wrote that, with many companies doubling their data every two years, short-term, narrowly focused strategies for data storage can quickly become obsolete. New content management architectures and strategies, they argued, will be needed to accommodate the big data explosion.
The Holy Grail for today’s companies, Deloitte added, is an enterprise-wide content management strategy to handle increasing volumes of data in an agile, efficient, and controlled manner. This is especially true in the world of oil and gas, where digital transformation of subsurface E&P data, followed by careful analysis, is crucial to optimizing today’s production and tomorrow’s discoveries.
But production optimization doesn’t come easy. Oil and gas companies must first tackle the big data challenges facing seismic data and exploration geophysics by establishing a company-wide, global approach to issues such as:
- Maintaining data sovereignty worldwide
- Managing petabytes (PB) of seismic data
- Implementing big data analytics
A comprehensive content management approach to big data can increase discipline and availability; solve challenges related to architecture, global regulatory compliance and data ownership; and deliver big benefits such as optimized production.
Let’s dive deeper into these E&P big data challenges, and see how you can solve them.
Keeping Data Sovereign to its Source
The first big data challenge oil and gas companies will need to solve is data sovereignty.
Data sovereignty is the concept that information that has been converted through digital transformation and stored in a binary digital form is subject to the laws of the country in which it is located. Concerns underlying data sovereignty include enforcing privacy regulations and preventing E&P data stored in a foreign country from being subpoenaed by the host country’s government.
But data sovereignty became a whole lot more complicated with GDPR. Data ownership rights have long been enforced across national boundaries, but when the European Union’s General Data Protection Regulation took effect in 2018, data sovereignty became a top-of-mind issue.
Why so? Because there is no United Nations of data, and a seismic data cloud is not a country.
If data sovereignty laws prevent data from leaving the country in which it was created, and you store your E&P data in the cloud, how can you consistently control, manage and access your proprietary seismic, interpretation and well data scattered across every jurisdiction in which you operate?
The answer is the geospatial metadata within your seismic data – the information associated with a position on the surface of the globe. This metadata may be stored in a geographic information system (GIS) or may simply be documents, datasets, images or other objects, services, or related items that exist in some other native environment. The important metadata needs to be extracted from the data sources into something contextual and structured.
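As a minimal sketch of that extraction step, the snippet below pulls source coordinates out of SEG-Y trace headers and folds them into a structured, contextual record. It assumes the open-source segyio library; the record fields, the EPSG code and the country tag are illustrative choices supplied by the operator, not part of any standard.

```python
# A minimal sketch: extract geospatial metadata from SEG-Y trace headers
# into a structured record. Assumes segyio; record shape is illustrative.
from dataclasses import dataclass

import segyio


@dataclass
class SurveyGeoMetadata:
    survey: str
    min_x: float
    max_x: float
    min_y: float
    max_y: float
    crs_epsg: int     # assumed known from acquisition reports, not the file
    country_iso: str  # where the data was acquired; drives sovereignty rules


def extract_geo_metadata(path: str, survey: str,
                         crs_epsg: int, country_iso: str) -> SurveyGeoMetadata:
    # ignore_geometry lets us read headers from pre-stack or irregular files
    with segyio.open(path, ignore_geometry=True) as f:
        xs = f.attributes(segyio.TraceField.SourceX)[:]
        ys = f.attributes(segyio.TraceField.SourceY)[:]
        scalar = int(f.attributes(segyio.TraceField.SourceGroupScalar)[:][0])
    # SEG-Y coordinate scalar: positive means multiply, negative means divide
    factor = scalar if scalar > 0 else (1.0 / -scalar if scalar < 0 else 1.0)
    xs, ys = xs * factor, ys * factor
    return SurveyGeoMetadata(
        survey=survey,
        min_x=float(xs.min()), max_x=float(xs.max()),
        min_y=float(ys.min()), max_y=float(ys.max()),
        crs_epsg=crs_epsg,
        country_iso=country_iso,
    )
```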
Once captured in context, you’ll need secure web content storage management that can examine this geospatial metadata within your data and verify its proprietary ownership before allowing access to it. This not only follows the letter of global laws, but it also reduces the risk for you and your subsurface E&P data assets worldwide.
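A sovereignty check of that kind can be as simple as a default-deny policy table consulted before any stored object is served. The rules below are invented for illustration; real residency rules come from legal review in each jurisdiction.

```python
# Illustrative sovereignty gate: map each country of origin to the regions
# where copies may be stored or served. All entries here are assumptions.
RESIDENCY_RULES = {
    "NO": {"eu-north"},            # e.g. Norwegian data stays in-region
    "BR": {"sa-east"},
    "US": {"us-east", "us-west"},
}


def can_serve(country_of_origin: str, serving_region: str) -> bool:
    allowed = RESIDENCY_RULES.get(country_of_origin)
    # Default-deny: no rule on file means a human decision is required
    return allowed is not None and serving_region in allowed


assert can_serve("NO", "eu-north")
assert not can_serve("NO", "us-east")
```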
Managing Seismic Data – Terabytes (TB) and Petabytes of It
Seismic data is Big Data exemplified.
Seismic data provides a “time picture” of the subsurface structure: 2D reflection cross-sections in both dip and strike directions, 3D reflection cross-sections along any azimuth, and time “slices” on any horizon. It also includes shear-wave data on lithology, fractures and the presence of hydrocarbons, plus refraction seismic data that provides a deep crustal view of gross structure, from basin scale to lithosphere-and-upper-mantle scale.
Seismic data is growing exponentially. As soon as we think it can’t get much bigger, it does!
To handle massive amounts of seismic data, look no further than the Professional Petroleum Data Management Association’s (PPDM) data model, coupled with Esri’s GIS map interface that gives you direct access to your digital subsurface data, securely stored in a private cloud. Then routinely maintain that data with a program of content management maintenance services to maximize its value and accessibility.
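As a simplified sketch of that pattern, the snippet below models a PPDM-flavored relational store queried by a bounding box coming from a GIS map selection. The table and column names are illustrative stand-ins, not the actual PPDM schema.

```python
# Simplified sketch: a relational survey inventory queried by map window.
# Schema is illustrative, not the real PPDM data model.
import sqlite3

conn = sqlite3.connect("subsurface.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS seis_set (
        seis_set_id TEXT PRIMARY KEY,
        set_name    TEXT,
        min_lon REAL, max_lon REAL,
        min_lat REAL, max_lat REAL,
        archive_uri TEXT  -- pointer to the tape/cloud object, not the data
    )
""")


def surveys_in_view(conn, w, s, e, n):
    """Return surveys whose footprint intersects the map window (w, s, e, n)."""
    return conn.execute(
        """SELECT seis_set_id, set_name, archive_uri
           FROM seis_set
           WHERE max_lon >= ? AND min_lon <= ?
             AND max_lat >= ? AND min_lat <= ?""",
        (w, e, s, n),
    ).fetchall()


# e.g. surveys_in_view(conn, w=1.0, s=58.0, e=3.0, n=60.0)
```

The key design choice is that the database stores footprints and pointers, not the seismic volumes themselves: the map interface stays fast while the petabytes stay in archival storage.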
It may sound complicated, but the positive impact to your bottom line makes it well worth it. Global access to seismic data opens the door to big data analytics and production optimization.
We’ll discuss production optimization more in a minute. For now, just know you need data management flexibility. Each of your business units must be able to locate its local subsurface data, with a central administration to ensure all business-unit locations are being managed according to your established framework.
Implementing Big Data Analytics
Clearly, there are several big data challenges facing geophysicists. Fortunately, the rewards of mastering big data analytics vastly outweigh the work that’s involved. Seismic big data is only getting bigger, and with it, more opportunities are unearthed. Once you master the processes of analytics, you’ll be able to fully tap the benefits of oil and gas big data:
- Boosting productivity
- Saving time
- Enhancing safety both onshore and offshore
- Optimizing current production
- Making new discoveries using legacy assets
The name says it all: big data analytics allows you to manipulate and mine massive volumes of dissimilar data streams that were previously impossible to integrate. With big data analytics, you can observe new patterns, modify workflows, make more accurate predictions and uncover new subsurface exploration possibilities. Essentially, you can extract more value from your legacy assets.
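To make that concrete, here is a toy sketch of the integration step everything else depends on: two dissimilar streams, well production and a seismic attribute map, joined on a shared spatial key. The column names and the binning grid are invented for illustration.

```python
# Toy sketch: once dissimilar streams share context (a common grid cell),
# joining and correlating them is straightforward. All values are invented.
import pandas as pd

wells = pd.DataFrame({
    "well_id": ["W1", "W2"],
    "grid_cell": [1017, 2044],        # cell of an assumed common binning grid
    "oil_rate_bopd": [850.0, 410.0],
})
seismic_attr = pd.DataFrame({
    "grid_cell": [1017, 2044, 3090],
    "rms_amplitude": [0.62, 0.35, 0.48],
})

# The shared key is what was "previously impossible": context added up front
joined = wells.merge(seismic_attr, on="grid_cell", how="left")
print(joined)  # next step: correlate rates against attributes, flag outliers
```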
But big data analytics involves data sets so large and complex that they are difficult to capture, process, store, search and analyze with a conventional database system. Doing so would be like trying to juggle a million data sources, each in a different format, and none with any context as to where and when the data was captured and cataloged. To handle a task of this magnitude, you’ll need an end-to-end data management program.
Making sure you use all your relevant data to make better decisions means first placing it in a “data lake” – a centralized repository within your data management system that stores both structured and unstructured E&P data at any scale. From there, a good solution will bring structure and context to unstructured data – making it accessible to you and your data science team for detailed analysis.
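A minimal sketch of that first structuring pass, assuming a file-system raw zone and a JSON-lines catalog (both illustrative choices): walk the landing area, extract what context we can, and write one structured entry per object.

```python
# Minimal sketch: give unstructured lake objects a structured catalog entry.
# Paths, fields and naming conventions are illustrative assumptions.
import hashlib
import json
from pathlib import Path

RAW_ZONE = Path("datalake/raw")        # assumed landing area for raw files
CATALOG = Path("datalake/catalog.jsonl")


def catalog_raw_zone():
    with CATALOG.open("a") as out:
        for obj in RAW_ZONE.rglob("*"):
            if not obj.is_file():
                continue
            entry = {
                "path": str(obj),
                "bytes": obj.stat().st_size,
                # fine for a sketch; stream the hash for multi-GB objects
                "sha256": hashlib.sha256(obj.read_bytes()).hexdigest(),
                # crude context from file type; real pipelines parse headers
                "kind": obj.suffix.lower().lstrip("."),  # "sgy", "las", "pdf"
            }
            out.write(json.dumps(entry) + "\n")
```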
You’ll also need a user-friendly online Esri GIS map-interface portal for your PPDM database to access all that geological and geophysical information you’re storing. The portal displays all your subsurface data on the map, readily available for interpretation and data analysis. Your geophysical data scientists can immediately begin analyzing your data and feeding the data into your reservoir model.
It’s simple, really. Instant access to your E&P data significantly reduces time spent searching for subsurface assets, leading to a much faster understanding of the reservoir, and consequently a much faster resolution to production bottlenecks.
In 2019, Katalyst will add about 15 petabytes of newly digitized E&P data to our Data Management Services, for a total of roughly 55 petabytes under our management. That’s strictly E&P data from our clients. Shouldn’t you be one of them?
To get started, give Katalyst Data Management a call. We provide the only integrated, end-to-end data management and consulting services specifically designed to help companies like yours with the challenges and rewards of big data.