Robust Extraction of Useful Information from Seismic Measurements

September 3, 2019

Geophysicists rely on seismic data to understand the Earth’s subsurface. Data from seismic receivers contains two types of information convolved into a single signal: information about the source of the signal (source effects) and information about the subsurface features it passed through on its way to the receiver (path effects). Current methods for separating the two rely on assumptions that may not be completely accurate—extracting source effects requires assumptions about the path, and extracting path effects requires assumptions about the source.

In a recent paper, ERL researcher Dr. Pawan Bharadwaj, in collaboration with Dr. Aimé Fournier and Prof. Laurent Demanet, introduced a new mathematical method, “Focused Blind Deconvolution,” that can extract source and path information without relying on these assumptions. Instead, the method compares data from the same source recorded by multiple receivers and mathematically identifies what the signals share and where they differ. Features common to all the signals can be attributed to the source, while differences among them indicate path effects. Because it does not require the aforementioned assumptions, this method could provide more accurate information both to earthquake scientists (who are generally interested in the source) and to energy industry scientists (who are generally interested in the path). The researchers demonstrate the method by applying it to the Marmousi model, a standard test model in geophysics.

Dr. Bharadwaj is a postdoctoral associate in ERL and the Department of Mathematics, and Dr. Fournier is a Research Scientist and Principal Investigator in ERL and the Department of Earth, Atmospheric and Planetary Sciences. Prof. Demanet is the director of ERL and a joint faculty member in both departments.
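The convolutional model behind this idea can be illustrated with a minimal sketch (the wavelet and impulse responses below are invented for illustration, not taken from the paper): each receiver records the same source wavelet convolved with a channel-specific Earth impulse response, and because the source is shared across channels, cross-channel redundancy can in principle separate the two.

```python
import numpy as np

# Convolutional model: receiver i records d_i = s * g_i, the common
# source wavelet s convolved with a channel-specific impulse response g_i.
# All signals below are illustrative toy data, not from the paper.
t = np.arange(32)
source = np.exp(-0.05 * (t - 8) ** 2) * np.cos(0.8 * t)  # shared source wavelet

# Two sparse spike trains standing in for reflections along two paths.
path1 = np.zeros(64); path1[[5, 20, 41]] = [1.0, -0.6, 0.3]
path2 = np.zeros(64); path2[[9, 30, 50]] = [0.8, 0.5, -0.4]

# Each recorded trace mixes source and path effects into one signal.
d1 = np.convolve(source, path1)
d2 = np.convolve(source, path2)

# Because the source is common, the noiseless data satisfy the
# cross-convolution identity d1 * g2 == d2 * g1 (convolution is
# commutative and associative) — the kind of multichannel redundancy
# that blind deconvolution methods exploit.
lhs = np.convolve(d1, path2)
rhs = np.convolve(d2, path1)
print(np.allclose(lhs, rhs))  # True
```

A single trace cannot be split this way: one signal is consistent with many source/path pairs, which is why the multichannel comparison is essential.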

Read the full paper


Above: Video about Focused Blind Deconvolution created by the researchers for the 2018 SEG Annual Meeting.

Cover image: Petr Brož (Czech Academy of Science), CC BY-SA 4.0.