|Title:||Searches for new heavy bosons and vector-like quarks with the CMS experiment at √s = 13 TeV and novel pileup mitigation techniques||Language:||English||Author:||Benecke, Anna||Keywords:||vector-like; heavy resonance; pileup; LHC; CMS||Date of publication:||2020||Date of oral examination:||2020-12-15||Abstract:||
In this thesis, two searches for new physics are presented, involving a heavy resonance Z′ and a vector-like quark T. Both new particles are predicted by theories beyond the standard model that address the smallness of the Higgs boson mass compared to the Planck scale. The searches are based on proton-proton collision data at a center-of-mass energy of 13 TeV recorded with the CMS experiment at the LHC.
In the first analysis, a Z′ decaying to Tt is searched for in the lepton+jets final state for the first time. Data corresponding to an integrated luminosity of 35.9 fb−1 are analyzed. The three T decay channels Ht, Zt, and Wb are taken into account. The analysis is performed in the highly Lorentz-boosted regime, where substructure techniques are used to identify the heavy bosons. This search leads to the most stringent upper cross section limits on a Z′ → Tt resonance to date. A heavy gluon can be excluded for masses between 1.5 and 2.3 TeV if MT = 1.2 TeV, and between 2.0 and 2.5 TeV if MT = 1.5 TeV.
The second analysis considers a singly produced vector-like T that decays into Ht in the lepton+jets final state. While vector-like Ts have been excluded up to a mass of 1.3 TeV, additional decay channels can weaken this bound considerably. The analysis is designed to achieve sensitivity for resonances with masses down to 600 GeV. The T is reconstructed from three jets originating from the fragmentation of b quarks, the lepton, and the missing transverse momentum. A resonant structure on a smoothly falling background is searched for. The analysis is carried out using data corresponding to 137.2 fb−1. The signal region is still blinded because the analysis is undergoing CMS internal review, but the expected sensitivity corresponds to a significance of five standard deviations for a possible signal with a mass of 650 GeV, visible in the all-hadronic final state.
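The kinematic reconstruction described above can be illustrated with a minimal toy sketch (not the actual analysis code; all function names, masses, and conventions here are illustrative assumptions): the unknown longitudinal momentum of the neutrino is obtained from a W-boson mass constraint on the lepton and the missing transverse momentum, and the T candidate mass is then the invariant mass of the lepton, the neutrino, and the three b jets.

```python
import math

MW = 80.4  # W boson mass in GeV (illustrative constant)

def fourvec(pt, eta, phi, m=0.0):
    """Build a four-vector [E, px, py, pz] from collider kinematics."""
    px, py, pz = pt * math.cos(phi), pt * math.sin(phi), pt * math.sinh(eta)
    e = math.sqrt(px * px + py * py + pz * pz + m * m)
    return [e, px, py, pz]

def add(*vs):
    """Component-wise sum of four-vectors."""
    return [sum(c) for c in zip(*vs)]

def mass(v):
    """Invariant mass, clipped at zero against rounding."""
    m2 = v[0] ** 2 - v[1] ** 2 - v[2] ** 2 - v[3] ** 2
    return math.sqrt(max(m2, 0.0))

def neutrino_pz(lep, met_px, met_py):
    """Solve (p_lep + p_nu)^2 = MW^2 for the neutrino pz (massless lepton).

    If the discriminant is negative, the real part is taken; otherwise
    the smaller-|pz| solution is chosen, a common convention.
    """
    pt2 = lep[1] ** 2 + lep[2] ** 2
    mu = MW ** 2 / 2 + lep[1] * met_px + lep[2] * met_py
    a = mu * lep[3] / pt2
    met2 = met_px ** 2 + met_py ** 2
    disc = a * a - (lep[0] ** 2 * met2 - mu * mu) / pt2
    if disc < 0:
        return a
    r = math.sqrt(disc)
    return a - r if abs(a - r) < abs(a + r) else a + r

def reconstruct_T(lep, met_px, met_py, jets):
    """Return the neutrino four-vector and the T candidate mass."""
    pz = neutrino_pz(lep, met_px, met_py)
    nu = [math.sqrt(met_px ** 2 + met_py ** 2 + pz ** 2), met_px, met_py, pz]
    return nu, mass(add(lep, nu, *jets))
```

For a lepton with the missing transverse momentum balancing part of its recoil, the lepton-neutrino pair reproduces the W mass by construction, and the three-jet combination then fixes the T candidate mass.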
The analysis of proton-proton collision data is impaired by particles that originate from additional proton-proton interactions during the same bunch crossing (pileup). Pileup mitigation is important for analyses of data with high instantaneous luminosities, which have been reached during Run 2 (2016-2018). Even higher levels of pileup are expected in future data acquisition periods. The performance of pileup mitigation techniques, including the novel pileup per particle identification (PUPPI) algorithm, is studied for up to 70 simultaneous collisions per bunch crossing. In addition, the validation in data and improvements of the PUPPI algorithm are shown. Following these studies, PUPPI has become the default pileup mitigation technique in CMS.
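The idea behind PUPPI can be sketched in a few lines of self-contained Python (a simplified illustration, not the CMS implementation): each particle gets a local shape variable α built from nearby charged particles associated with the primary vertex; the α distribution of charged pileup particles defines a reference, and neutral particles are assigned a weight from a one-degree-of-freedom χ² comparison against that reference. The cone size, the class layout, and all names below are assumptions made for this sketch.

```python
import math
from dataclasses import dataclass

@dataclass
class Particle:
    pt: float
    eta: float
    phi: float
    charged: bool
    from_pv: bool  # vertex association, reliable only for charged particles

def delta_r(a, b):
    """Angular distance in the eta-phi plane, with phi wraparound."""
    dphi = (a.phi - b.phi + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(a.eta - b.eta, dphi)

def alpha(p, particles, cone=0.4):
    """Local shape: log of sum over charged-PV neighbours of (pT/dR)^2."""
    s = sum((q.pt / delta_r(p, q)) ** 2
            for q in particles
            if q is not p and q.charged and q.from_pv
            and 0.0 < delta_r(p, q) < cone)
    return math.log(s) if s > 0.0 else 0.0

def puppi_weights(particles):
    """Per-particle weights in [0, 1]; four-momenta would be rescaled by them."""
    alphas = {id(p): alpha(p, particles) for p in particles}
    # charged pileup particles define the reference alpha distribution
    pu = [alphas[id(p)] for p in particles if p.charged and not p.from_pv]
    mean = sum(pu) / len(pu)
    rms = math.sqrt(sum((a - mean) ** 2 for a in pu) / len(pu)) or 1.0
    weights = []
    for p in particles:
        if p.charged:
            # charged particles are resolved directly by vertex association
            weights.append(1.0 if p.from_pv else 0.0)
        else:
            a = alphas[id(p)]
            if a <= mean:
                weights.append(0.0)  # looks like pileup
            else:
                chi2 = (a - mean) ** 2 / rms ** 2
                # chi-square CDF with one degree of freedom: erf(sqrt(x/2))
                weights.append(math.erf(math.sqrt(chi2 / 2)))
    return weights
```

In this toy, a neutral particle sitting inside a hard jet acquires a weight near one, while an isolated neutral particle is assigned a weight near zero, which captures the qualitative behaviour that makes the algorithm effective at high pileup.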
In order to further improve the data quality and sensitivity of the CMS experiment, the CMS pixel detector was upgraded in 2016/2017 and a new powering system with DC-DC converters was installed. A failure of several DC-DC converters during data taking in 2017 caused losses in data quality. The systematic analysis of this failure presented here led to a change in the operation of the pixel detector in 2018, preventing losses due to broken DC-DC converters during data acquisition.
This thesis presents significant advances in the searches for Z′ and T, pileup mitigation techniques that ensure future proton-proton collision physics at the LHC, and a failure analysis of the pixel powering system that enabled the successful acquisition of 80 fb−1 in 2018.
|Appears in collections:||Electronic dissertations and habilitations|
verified on 29.07.2021