DC Element | Value | Language |
---|---|---|
dc.contributor.advisor | Hammer, Conny | - |
dc.contributor.advisor | Walda, Jan | - |
dc.contributor.advisor | Gajewski, Dirk | - |
dc.contributor.author | Knispel, Stefan | - |
dc.date.accessioned | 2024-11-29T13:13:15Z | - |
dc.date.available | 2024-11-29T13:13:15Z | - |
dc.date.issued | 2024 | - |
dc.identifier.uri | https://ediss.sub.uni-hamburg.de/handle/ediss/11314 | - |
dc.description.abstract | Seismic data, which is crucial for understanding the Earth’s subsurface structure, is frequently compromised by incoherent and coherent noise, complicating accurate geological imaging and thus making noise suppression one of the most important processing steps. Traditional denoising methods, although widely used, are typically time-intensive and struggle to differentiate adequately between signal and noise, often leading to primary signal damage. We therefore turn to advanced machine learning methods and introduce a novel self-supervised residual encoder-decoder network equipped with a local attention mechanism to effectively attenuate uncorrelated seismic noise. The self-supervised nature of this architecture allows the network to learn directly from the data itself, eliminating the need for explicit labels or prior knowledge about the noise, thereby simplifying usage and enhancing efficiency. Residual connections within the network help retain critical seismic signal characteristics during the denoising process. A significant innovation in our approach is the integration of a local self-attention mechanism, enabling the model to concentrate on relevant segments of the input data, thus improving noise attenuation while preserving the seismic signals. Additionally, we employ a specialized loss function that combines the Mean Squared Error (MSE) with the Structural Similarity Index (SSIM) to minimize primary damage and ensure better preservation of primary signals. To address the more complex challenge of coherent noise attenuation, we enhance the encoder-decoder network by incorporating attention gates alongside the already implemented local attention in the so-called Dual-Attention Residual Encoder-Decoder (DARED) network. This dual-attention mechanism allows the network to focus on both local and global features, further reducing the loss of primary signal. Given the predictable structure of coherent noise, we resort to supervised learning, with labels generated by rank-reduction-based denoising. Recognizing the time-intensive nature of traditional denoising parameter selection, we propose to train the network on a small, manually denoised portion of the dataset before applying it to the entire dataset. This strategy offers a time-efficient enhancement to conventional methods. The results indicate that our approach can reduce primary damage more effectively than the deterministic denoising label. Furthermore, this thesis tackles the challenge of limited labeled training data by introducing a novel data augmentation scheme based on Denoising Diffusion Probabilistic Models (DDPM). This approach generates new seismic data and corresponding labels that mirror the distribution of the seismic dataset used to train the generative network, effectively addressing the training data limitation. By integrating DDPM-based data augmentation with the Dual-Attention Residual Encoder-Decoder network, we achieve significant performance improvements in denoising. Our results demonstrate that this combined approach improves noise attenuation and better mitigates primary damage compared to applying the DARED network without data augmentation. Applications of these methods to both synthetic and field seismic data showcase their potential for seismic denoising. These methods offer efficient and less destructive noise attenuation techniques, underscoring the impact of using neural networks and integrating attention mechanisms alongside data augmentation based on generative AI in seismic data denoising. This thesis highlights the crucial role of advanced machine learning in modern geophysical research. | en |
dc.language.iso | en | de_DE |
dc.publisher | Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky | de |
dc.rights | http://purl.org/coar/access_right/c_abf2 | de_DE |
dc.subject | Neural Networks | en |
dc.subject | Seismic Denoising | en |
dc.subject | Autoencoder | en |
dc.subject | Attention | en |
dc.subject | Diffusion Model | en |
dc.subject.ddc | 550: Geowissenschaften | de_DE |
dc.title | Neural Networks for Seismic Data Denoising: Attention Mechanisms and Diffusion Models | en |
dc.type | doctoralThesis | en |
dcterms.dateAccepted | 2024-11-22 | - |
dc.rights.cc | https://creativecommons.org/licenses/by/4.0/ | de_DE |
dc.rights.rs | http://rightsstatements.org/vocab/InC/1.0/ | - |
dc.subject.bcl | 38.72: Seismik | de_DE |
dc.subject.gnd | Maschinelles Lernen | de_DE |
dc.subject.gnd | Geophysik | de_DE |
dc.subject.gnd | Neuronales Netz | de_DE |
dc.subject.gnd | Rauschunterdrückung | de_DE |
dc.type.casrai | Dissertation | - |
dc.type.dini | doctoralThesis | - |
dc.type.driver | doctoralThesis | - |
dc.type.status | info:eu-repo/semantics/publishedVersion | de_DE |
dc.type.thesis | doctoralThesis | de_DE |
tuhh.type.opus | Dissertation | - |
thesis.grantor.department | Geowissenschaften | de_DE |
thesis.grantor.place | Hamburg | - |
thesis.grantor.universityOrInstitution | Universität Hamburg | de_DE |
dcterms.DCMIType | Text | - |
dc.identifier.urn | urn:nbn:de:gbv:18-ediss-123447 | - |
item.languageiso639-1 | other | - |
item.fulltext | With Fulltext | - |
item.creatorOrcid | Knispel, Stefan | - |
item.creatorGND | Knispel, Stefan | - |
item.advisorGND | Hammer, Conny | - |
item.advisorGND | Walda, Jan | - |
item.advisorGND | Gajewski, Dirk | - |
item.grantfulltext | open | - |
Appears in collections: | Elektronische Dissertationen und Habilitationen |
Files for this resource:
File | Checksum | Size | Format | |
---|---|---|---|---|
Dissertation_Knispel_2024_publication.pdf | 01369d3f2a6574e2993298314c5f80c8 | 37.98 MB | Adobe PDF | Open/View |
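The abstract mentions a specialized loss combining MSE with the Structural Similarity Index (SSIM). A minimal sketch of such a combined loss is given below; note this is an illustration, not the thesis's actual implementation: the weighting `alpha`, the stability constants, and the simplified global (single-window) SSIM are assumptions.

```python
import numpy as np

def ssim(x, y, c1=1e-4, c2=9e-4):
    """Simplified global SSIM between two equally shaped arrays.

    Uses single global statistics rather than the usual sliding
    window; c1 and c2 are small stabilization constants (assumed).
    """
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

def combined_loss(pred, target, alpha=0.8):
    """Weighted sum of MSE and the SSIM dissimilarity (1 - SSIM).

    alpha balances pixel-wise fidelity (MSE) against structural
    preservation (SSIM); the value 0.8 is a placeholder.
    """
    mse = np.mean((pred - target) ** 2)
    return alpha * mse + (1 - alpha) * (1 - ssim(pred, target))
```

For identical inputs the loss vanishes (MSE is zero and SSIM equals one), while any distortion raises both terms, which is the mechanism the abstract credits with reducing primary-signal damage.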