Andrew Long shares insights from EAGE's 84th annual conference.
There were many CCS and other ‘decarbonization’ themes presented at EAGE 2023. I will attempt to summarize several of them in an upcoming “Industry Insights” newsletter. The diverse content made it clear that the challenges facing CCS and its monitoring are vast and, even more so than ‘traditional’ EOR / IOR / reservoir depletion management, demand multi-disciplinary collaboration: from understanding long-term geochemical processes at the pore scale to the ongoing management of diverse local, regional, and global stakeholders.
In the meantime, I decided to briefly focus on the other key theme at major conferences these days: the application of AI to geosciences.
Machine learning (ML) comes in many forms, but the most-cited examples today typically involve trained neural networks (NNs) making classifications and predictions of some form. I recall developing a strong interest in Bayesian statistics for subsurface characterization more than two decades ago, only to be frustrated by my inability to automate rock physics modeling and quantitative interpretation (QI) when playing with toy examples. So it feels particularly satisfying to see the most demonstrable benefits of ML in petrophysical data conditioning, subsurface property prediction, and, increasingly, the automation of the seismic processing workflows that feed input data to QI.
Baturin et al. from TotalEnergies (Read here) used NNs to fine-tune and greatly refine the prediction of sand content from seismic inversion, adapting the reservoir characterization workflow to a complex sedimentary setting with highly heterogeneous reservoir intervals, and their abstract contains my favorite phrase: using “… the Neural Networks as a complement to the inversion rather [than] as an alternative to it.” Nevertheless, several abstracts described efforts to use NNs to directly predict, most commonly, either subsurface velocity models or geological facies. Indeed, Ravasi et al. from KAUST (Read here) addressed the question of whether NNs can replace or augment model-based seismic inversion. Unsurprisingly, they observed that purely data-driven methodologies only outperform state-of-the-art model-based algorithms under favorable conditions with dense well control. In a scenario where legacy seismic data offshore Angola were not ideal for predicting reservoir properties, Osypov et al. from Halliburton (Read here) showed how deep learning seismic inversion was able to identify the reservoir seal and observe lateral changes in lithology, a significant improvement over previous traditional approaches.
One common challenge to any NN-based scheme is the burden of building training data with appropriate labels. Using data from offshore New Zealand, Alfarhan et al. from KAUST (Read here) proposed a self-supervised framework to alleviate the need for large amounts of labeled data. Masked autoencoders were shown to aid the labeling process, with improved facies classification results: 78% accuracy on the untrained portion of the data using 20% of the labels, while a fully supervised approach reached 61% accuracy.
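The self-supervised idea is easy to sketch. The toy below is emphatically not the authors' code: entirely invented synthetic "traces", a tied-weight linear autoencoder standing in for the masked autoencoder, and a nearest-centroid classifier standing in for the fine-tuned facies classifier. It only illustrates the pattern of pretraining by reconstructing randomly masked inputs, then classifying with labels for just 20% of the data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic "traces": two facies with different dominant frequency.
n, d, k = 400, 32, 8                       # traces, samples per trace, latent size
t = np.linspace(0.0, 1.0, d)
facies = rng.integers(0, 2, n)             # ground-truth labels (mostly withheld)
X = np.where(facies[:, None] == 0,
             np.sin(2 * np.pi * 3 * t),    # facies 0: low-frequency character
             np.sin(2 * np.pi * 9 * t))    # facies 1: high-frequency character
X = X + 0.1 * rng.standard_normal((n, d))

# Self-supervised pretraining: reconstruct the full traces from masked copies
# using a tied-weight linear autoencoder trained by gradient descent.
W = 0.1 * rng.standard_normal((d, k))
lr, losses = 0.005, []
for _ in range(300):
    keep = rng.random((n, d)) > 0.3        # randomly hide ~30% of each trace
    Xm = X * keep
    R = Xm @ W @ W.T - X                   # error vs the *unmasked* target
    losses.append(float((R ** 2).mean()))
    grad = (2.0 / (n * d)) * (Xm.T @ R @ W + R.T @ Xm @ W)
    W -= lr * grad

# "Fine-tuning" stand-in: nearest-centroid facies classification in the
# learned latent space, using labels for only 20% of the traces.
Z = X @ W
labeled = rng.random(n) < 0.2
centroids = np.stack([Z[labeled & (facies == c)].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
acc = (pred[~labeled] == facies[~labeled]).mean()
print(f"pretraining loss {losses[0]:.3f} -> {losses[-1]:.3f}; "
      f"held-out facies accuracy {acc:.2f}")
```

On real seismic data the encoder is a deep network and the facies are far less separable, but the economics are the same: the expensive representation is learned without labels, and the scarce labels are spent only on the final classification step.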
Regarding the automation of seismic processing workflows, the DeepWave consortium led by Tariq Alkhalifah at KAUST has been publishing a series of interesting abstracts. Harsuko and Alkhalifah (Read here) used a Transformer-based model with a pretraining and fine-tuning framework, referred to as StorSeismic, to sequentially apply denoising, direct arrival removal, multiple attenuation, and VRMS prediction to synthetic data from the Marmousi model, and also described steps to improve its performance (Read here). Several industry presentations in recent years have demonstrated beneficial applications of NNs to isolated seismic processing and imaging steps, but application to many sequential steps promises to greatly simplify and accelerate seismic project turnaround.
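The pretrain-then-fine-tune pattern can be caricatured in a few lines. The sketch below is emphatically not StorSeismic: a PCA "backbone" stands in for the pretrained Transformer, ridge-regressed linear "heads" stand in for the fine-tuned task heads, and all data and names are invented. It only shows the shape of the workflow: one shared representation, with task-specific heads applied in sequence (denoise, then remove a fake direct arrival):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented toy data: scaled sinusoidal "reflections" plus a decaying
# "direct arrival" and random noise.
n, d = 200, 24
clean = np.sin(2 * np.pi * 2 * np.linspace(0, 1, d))[None, :] \
        * rng.uniform(0.5, 1.5, (n, 1))
direct = np.exp(-np.linspace(0, 6, d))[None, :]
noisy = clean + direct + 0.15 * rng.standard_normal((n, d))

# "Pretraining": a shared linear backbone (top right singular vectors)
# learned from the raw, unlabeled gathers.
_, _, Vt = np.linalg.svd(noisy, full_matrices=False)
B = Vt[:6]                                 # 6-component backbone

def fit_head(inputs, targets):
    """Ridge-regress one task head in backbone coordinates."""
    Z = inputs @ B.T
    A = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(6), Z.T @ targets)
    return lambda x: (x @ B.T) @ A

# "Fine-tuning": one small head per processing task, fitted on paired examples.
denoise = fit_head(noisy, clean + direct)  # task 1: suppress random noise
mute = fit_head(clean + direct, clean)     # task 2: remove the direct arrival

out = mute(denoise(noisy))                 # apply the sequence of tasks
err = np.sqrt(((out - clean) ** 2).mean())
print(f"residual after sequential heads: {err:.3f}")
```

The design point this caricatures is that the costly part (the backbone) is learned once, while each new processing task only requires fitting a comparatively small head, which is what makes chaining many sequential steps through one model attractive.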
Of course, seismic imaging zealots are eager to demonstrate how innovative combinations of FWI and LSM (least-squares migration), such as the simultaneous inversion of velocity and reflectivity by Pankov et al. from PGS, promise to compress entire workflows into one step and, in appropriate implementations, to naturally output key subsurface properties such as impedance and relative density.
Computing requirements, common to all high-end imaging and deep learning pursuits, are increasingly significant. Quantum computing is attracting vast investments from global players eager to gain strategic advantages (Read here). Alsalmi and Dossary from Aramco (Read here) used quantum computing principles to move beyond classical computing limitations and develop the industry’s first quantum-based seismic attribute computation algorithm. It is early days, but these steps are an exciting portent of what may be achievable in the future.
Overall, EAGE 2023 was clearly characterized by energy and enthusiasm returning to an industry faced with somewhat existential challenges. Here’s hoping that EAGE 2024, due to be held in Norway, a nation apparently trying to break free from its oil and gas roots, is even more interesting.