DC Element | Value | Language
dc.contributor.advisor | Bruns, Patrick | -
dc.contributor.author | Kubetschek, Cora | -
dc.date.accessioned | 2026-04-27T11:31:18Z | -
dc.date.available | 2026-04-27T11:31:18Z | -
dc.date.issued | 2026 | -
dc.identifier.uri | https://ediss.sub.uni-hamburg.de/handle/ediss/12308 | -
dc.description.abstract (en):

To perceive a coherent world, the brain must infer which sensory signals originate from the same event. Bayesian Causal Inference describes this process as an estimation of the probability that cues share a common cause, combining bottom-up sensory evidence with top-down prior knowledge and weighting cues by their reliability. The ventriloquist effect, in which visual cues bias perceived sound location, provides a powerful model of this inference, yet the distinct temporal roles of visual saliency and learned priors remain poorly understood. This dissertation investigates how these influences shape multisensory binding across time.

Study 1 established a methodological basis by dissociating perceptual visual saliency from sensory reliability in a unimodal localization task. Participants localized moving annular gratings of different sizes, a manipulation known to affect saliency. Localization precision remained constant across conditions, confirming that stimulus size alters perceived prominence without changing encoding fidelity. This validated a controlled way to manipulate saliency independently of reliability in later multisensory experiments.

Study 2 examined how bottom-up saliency biases audiovisual integration. In a ventriloquist paradigm with two visual stimuli differing in saliency, participants' sound localizations shifted toward the more salient cue, indicating that saliency promotes a common-cause interpretation. EEG analyses showed that pre-stimulus alpha and beta power reductions and specific theta phase states predicted stronger visual capture, pointing to state-dependent readiness for integration. During stimulus processing, early gamma increases and alpha suppression reflected enhanced feedforward processing of salient input. Mid-latency parietal and frontal ERP components (250-600 ms) indexed the evaluation of causal structure. Together, these results show that saliency influences both preparatory neural states and early sensory gain, shaping how strongly visual cues dominate auditory perception.

Study 3 investigated top-down influences by testing how learned tone-color associations affect integration. After acquiring consistent pairings, participants showed systematic shifts in sound localization toward the previously associated color, demonstrating that learned correspondences increase the inferred probability of a common cause. Neural effects appeared later than those of saliency: between 350-600 ms, ERPs revealed enhanced fronto-parietal activity, accompanied by beta suppression and gamma activity consistent with the retrieval of associative priors and predictive feedback. These findings suggest that learned associations shape integration through later feedback-based mechanisms rather than early sensory processing.

Across studies, a coherent temporal hierarchy emerges. Bottom-up saliency modulates early neural states and feedforward dynamics, whereas associative priors influence later feedback signals that refine perceptual interpretation. Both converge within fronto-parietal networks during the 300-600 ms interval in which the brain evaluates whether auditory and visual signals should be integrated. Saliency-driven effects appear earlier in this window, while top-down associative effects increasingly dominate its later portion. Overall, this dissertation shows that the ventriloquist effect reflects a dynamic, multi-stage inference process: bottom-up saliency and top-down prior knowledge act on distinct timescales yet interact within shared cortical networks to resolve perceptual ambiguity. This highlights that multisensory perception emerges from the coordinated interplay of early sensory-driven biases and later prior-knowledge-driven computations.
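The Bayesian Causal Inference computation summarized in the abstract can be illustrated with a short sketch. This is not code from the dissertation: it follows the standard Gaussian formulation of the model (Gaussian cue noise, a zero-mean Gaussian spatial prior, model averaging), and all function names, variances, and parameter values here are illustrative assumptions.

```python
import numpy as np

def fused_estimate(x_a, x_v, var_a, var_v):
    """Reliability-weighted average of two cues (reliability = inverse variance).

    Simplified fusion that omits the spatial prior for brevity."""
    w_a, w_v = 1.0 / var_a, 1.0 / var_v
    return (w_a * x_a + w_v * x_v) / (w_a + w_v)

def p_common_cause(x_a, x_v, var_a, var_v, var_p=100.0, prior_common=0.5):
    """Posterior probability that the auditory and visual cues share one cause,
    assuming Gaussian noise and a zero-mean Gaussian spatial prior (var_p)."""
    # Likelihood of the cue pair under a single common source
    var_sum = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = np.exp(-((x_a - x_v) ** 2 * var_p
                       + x_a ** 2 * var_v
                       + x_v ** 2 * var_a) / (2 * var_sum)) \
              / (2 * np.pi * np.sqrt(var_sum))
    # Likelihood under two independent sources, each drawn from the prior
    like_c2 = (np.exp(-x_a ** 2 / (2 * (var_a + var_p)))
               / np.sqrt(2 * np.pi * (var_a + var_p))) \
            * (np.exp(-x_v ** 2 / (2 * (var_v + var_p)))
               / np.sqrt(2 * np.pi * (var_v + var_p)))
    num = like_c1 * prior_common
    return num / (num + like_c2 * (1.0 - prior_common))

def localized_sound(x_a, x_v, var_a, var_v, var_p=100.0, prior_common=0.5):
    """Model-averaged auditory estimate: the 'ventriloquist' percept."""
    pc = p_common_cause(x_a, x_v, var_a, var_v, var_p, prior_common)
    # Segregated auditory estimate: audition combined with the spatial prior only
    seg_a = (x_a / var_a) / (1.0 / var_a + 1.0 / var_p)
    return pc * fused_estimate(x_a, x_v, var_a, var_v) + (1.0 - pc) * seg_a
```

With a reliable visual cue (small `var_v`), the common-cause posterior is high for nearby cues and the auditory estimate is pulled toward the visual location, reproducing the visual capture the studies measure; widely separated cues drive the posterior toward zero and leave the sound largely unshifted.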
dc.language.iso | en | de_DE
dc.publisher | Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky | de
dc.rights | http://purl.org/coar/access_right/c_abf2 | de_DE
dc.subject | Ventriloquist effect | en
dc.subject | Multisensory integration | en
dc.subject | Causal inference | en
dc.subject | EEG | en
dc.subject | Bottom-up | en
dc.subject | Top-down | en
dc.subject | Association learning | en
dc.subject | Visual saliency | en
dc.subject.ddc | 500: Naturwissenschaften | de_DE
dc.title | Neural Dynamics of Bottom-Up and Top-Down Processes Guiding Audiovisual Integration under Causal Inference | en
dc.type | doctoralThesis | en
dcterms.dateAccepted | 2026-03-18 | -
dc.rights.cc | https://creativecommons.org/licenses/by/4.0/ | de_DE
dc.rights.rs | http://rightsstatements.org/vocab/InC/1.0/ | -
dc.subject.bcl | 02.13: Wissenschaftspraxis | de_DE
dc.type.casrai | Dissertation | -
dc.type.dini | doctoralThesis | -
dc.type.driver | doctoralThesis | -
dc.type.status | info:eu-repo/semantics/publishedVersion | de_DE
dc.type.thesis | doctoralThesis | de_DE
tuhh.type.opus | Dissertation | -
thesis.grantor.department | Psychologie | de_DE
thesis.grantor.place | Hamburg | -
thesis.grantor.universityOrInstitution | Universität Hamburg | de_DE
dcterms.DCMIType | Text | -
dc.identifier.urn | urn:nbn:de:gbv:18-ediss-136555 | -
item.fulltext | With Fulltext | -
item.advisorGND | Bruns, Patrick | -
item.creatorGND | Kubetschek, Cora | -
item.grantfulltext | open | -
item.creatorOrcid | Kubetschek, Cora | -
item.languageiso639-1 | other | -
Contained in collections: Elektronische Dissertationen und Habilitationen

Files in this item:
File | Description | Checksum | Size | Format
CoraKubetschekDissertation2026.pdf | Dissertation Cora Kubetschek | d83eb43a2bdc0d9ed782c70bfddc40f5 | 16.06 MB | Adobe PDF