DC Field | Value | Language
dc.contributor.advisor | Steinicke, Frank | -
dc.contributor.author | Freiwald, Jann Philipp | -
dc.date.accessioned | 2022-09-02T08:43:47Z | -
dc.date.available | 2022-09-02T08:43:47Z | -
dc.date.issued | 2022 | -
dc.identifier.uri | https://ediss.sub.uni-hamburg.de/handle/ediss/9792 | -
dc.description.abstract | The metaverse is commonly described as a plethora of multi-user mixed reality (MR) applications, in which users can interact with the virtual environment and each other through shared MR spaces. However, several aspects of the metaverse are still unclear, and fundamental research in human collaboration and interaction in MR hybrid setups is required. We use "hybrid setups" as an umbrella term for the stark variance between individual users with regard to their technical setups, interaction forms, and virtual representations. This includes aspects such as different hardware devices, sitting and standing configurations, appearance of avatars, as well as locomotion and its visualization. This dissertation examines these components from both technical and user experience perspectives, including metrics such as usability, sense of presence, and cybersickness. First, we present techniques that enable effective hybrid collaboration between a variety of devices. More specifically, two types of MR devices require further research in terms of integration into shared MR spaces: (i) mobile devices and (ii) video see-through head-mounted displays. To integrate mobile devices, video streaming and camera-based tracking technologies are combined in a smartphone application called VR Invite, which acts as a handheld viewport into a local or remote shared MR space. The results of the accompanying experiment indicate that the opportunity for direct interaction positively influences the sense of presence of co-located mobile users. For video see-through head-mounted displays, an image reprojection algorithm named Camera Time Warp is introduced, which stabilizes virtual objects and virtual human representations in the real-world reference frame. This technique addresses one of the major drawbacks of video see-through devices: the perceptual registration error caused by the inherent camera-to-photon latency. The user study demonstrates a positive effect of the Camera Time Warp technique on the participants’ spatial orientation and subjective levels of cybersickness. Second, we investigate visualization techniques within shared MR spaces to support spatial awareness as well as the sense of spatial presence and co-presence. In the context of human-to-human interactions, techniques to visualize a user’s gaze direction and their points of interest are compared. The experiment shows that visualizations that are coupled to virtual human representations evoked the highest degree of co-presence, while task performance was highest when additional viewports were provided. In another user study, we contrasted virtual human representations, as well as the visualization of their locomotion, from first- and third-person perspectives. The results emphasize the importance of natural-looking human gait animations in increasing the observer’s spatial awareness and co-presence. Third, we introduce novel techniques for locomotion and its visualization as implementations of embodied interactions. Based on the findings of our prior experiments, the focus for these locomotion user interfaces was the visualization of human gait from first- and third-person perspectives. A locomotion user interface for seated MR experiences called VR Strider is presented, which maps the cycling biomechanics of the user’s legs to virtual walking movements. An experiment revealed a significant positive effect on the participants’ angular and distance estimation, sense of presence, and feeling of comfort compared to other established locomotion techniques. A second, confirmatory study indicates the necessity of synchronized avatar animations for virtual locomotion. Lastly, we present a toolkit for continuous and noncontinuous locomotion with matching human representations based on intelligent virtual agents. This toolkit includes two techniques for avatar visualization, i.e., Smart Avatars and Puppeteer, and two for locomotion, i.e., PushPull and Stuttered Locomotion. Smart Avatars deliver continuous full-body human representations for noncontinuous locomotion in shared VR spaces. They can smoothly visualize Stuttered Locomotion, a technique that transforms continuous user movement into a series of short-distance teleport steps. Stuttered Locomotion is also applicable to our motion-based PushPull technique, which employs a dynamic velocity multiplier based on the user’s hand pose. An experiment indicates that Smart Avatars were preferred over regular virtual human representations, while remote-controlling one’s own virtual body through Puppeteer produced body ownership close to that of first-person interactions. The experimental evaluation shows that both PushPull and Stuttered Locomotion significantly reduce the occurrence of cybersickness. In summary, this work offers novel techniques and insights for the implementation of hybrid MR setups with respect to virtual interactions and their visualization. | en
dc.language.iso | en | de_DE
dc.publisher | Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky | de
dc.rights | http://purl.org/coar/access_right/c_abf2 | de_DE
dc.subject | Mixed Reality | en
dc.subject | Collaboration | en
dc.subject | User study | en
dc.subject | Human-Computer Interaction | en
dc.subject | Metaverse | en
dc.subject.ddc | 004: Informatik | de_DE
dc.title | Shared Mixed Reality Spaces: Visualization and Interaction Techniques for Hybrid User Groups | en
dc.type | doctoralThesis | en
dcterms.dateAccepted | 2022-08-29 | -
dc.rights.cc | https://creativecommons.org/licenses/by/4.0/ | de_DE
dc.rights.rs | http://rightsstatements.org/vocab/InC/1.0/ | -
dc.subject.bcl | 54.73: Computergraphik | de_DE
dc.subject.gnd | Erweiterte Realität <Informatik> | de_DE
dc.subject.gnd | Virtuelle Realität | de_DE
dc.subject.gnd | Fortbewegung | de_DE
dc.subject.gnd | Avatar <Informatik> | de_DE
dc.subject.gnd | CGI <Computergrafik> | de_DE
dc.type.casrai | Dissertation | -
dc.type.dini | doctoralThesis | -
dc.type.driver | doctoralThesis | -
dc.type.status | info:eu-repo/semantics/publishedVersion | de_DE
dc.type.thesis | doctoralThesis | de_DE
tuhh.type.opus | Dissertation | -
thesis.grantor.department | Informatik | de_DE
thesis.grantor.place | Hamburg | -
thesis.grantor.universityOrInstitution | Universität Hamburg | de_DE
dcterms.DCMIType | Text | -
dc.identifier.urn | urn:nbn:de:gbv:18-ediss-102966 | -
item.advisorGND | Steinicke, Frank | -
item.grantfulltext | open | -
item.languageiso639-1 | other | -
item.fulltext | With Fulltext | -
item.creatorOrcid | Freiwald, Jann Philipp | -
item.creatorGND | Freiwald, Jann Philipp | -
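
The abstract above names Camera Time Warp, an image reprojection algorithm that stabilizes virtual content against camera-to-photon latency, but the record gives no implementation details. As a rough illustration of the general idea behind such reprojection (a purely rotational "time warp", not the dissertation's actual algorithm), here is a minimal Python sketch; the intrinsics, poses, function names, and numbers are all assumptions:

```python
import numpy as np

# Hypothetical sketch of rotational image reprojection ("time warp").
# It illustrates warping a stale camera frame to the latest head pose
# so virtual content stays registered despite camera-to-photon latency.

def reprojection_homography(K: np.ndarray,
                            R_capture: np.ndarray,
                            R_display: np.ndarray) -> np.ndarray:
    """Homography mapping pixels of a frame captured under rotation
    R_capture to where they should appear under the newer rotation
    R_display. Valid for pure rotation (no translation)."""
    R_delta = R_display @ R_capture.T          # rotation accrued during latency
    return K @ R_delta @ np.linalg.inv(K)

def warp_pixel(H: np.ndarray, u: float, v: float) -> tuple[float, float]:
    """Apply the homography to one pixel coordinate."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

if __name__ == "__main__":
    w, h, f = 1280.0, 720.0, 1000.0            # assumed image size and focal length
    K = np.array([[f, 0.0, w / 2],
                  [0.0, f, h / 2],
                  [0.0, 0.0, 1.0]])
    yaw = np.radians(1.0)                      # 1 degree of head rotation during latency
    R_capture = np.eye(3)
    R_display = np.array([[ np.cos(yaw), 0.0, np.sin(yaw)],
                          [ 0.0,         1.0, 0.0        ],
                          [-np.sin(yaw), 0.0, np.cos(yaw)]])
    H = reprojection_homography(K, R_capture, R_display)
    # Image center shifts by roughly f * tan(yaw) ~= 17 px, the registration
    # error that would otherwise be visible without reprojection.
    print(warp_pixel(H, w / 2, h / 2))
```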
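Likewise, PushPull ("a dynamic velocity multiplier based on the user's hand pose") and Stuttered Locomotion ("a series of short-distance teleport steps") are described only at the conceptual level. A minimal sketch of how the two ideas might compose, with every name, parameter, and value being an assumption rather than the dissertation's implementation:

```python
# Hypothetical sketch of the two locomotion ideas described in the abstract.
# All names and constants are illustrative, not from the dissertation.

STEP_LENGTH = 0.75  # assumed teleport step size in meters

def pushpull_velocity(base_speed: float, hand_offset: float,
                      gain: float = 2.0) -> float:
    """Scale movement speed by a dynamic multiplier derived from how far
    the hand is pushed forward (+) or pulled back (-) from a neutral pose,
    with hand_offset in meters."""
    multiplier = 1.0 + gain * hand_offset
    return base_speed * max(0.0, multiplier)

class StutteredLocomotion:
    """Accumulate continuous motion and emit discrete short-distance
    teleport steps instead of smooth translation, suppressing the
    continuous optical flow that the abstract links to cybersickness."""

    def __init__(self, step_length: float = STEP_LENGTH):
        self.step_length = step_length
        self.accumulated = 0.0
        self.position = 0.0  # 1-D for brevity; real code would use a 3-D vector

    def update(self, speed: float, dt: float) -> float:
        """Advance by discrete steps once enough motion has accumulated."""
        self.accumulated += speed * dt
        while self.accumulated >= self.step_length:
            self.position += self.step_length   # one "teleport" step
            self.accumulated -= self.step_length
        return self.position

if __name__ == "__main__":
    loco = StutteredLocomotion()
    for frame in range(90):                     # ~1.5 s at 60 FPS
        speed = pushpull_velocity(1.4, hand_offset=0.2)  # hand pushed 20 cm
        loco.update(speed, dt=1 / 60)
    print(f"position after 90 frames: {loco.position:.2f} m")
```

The step quantization is deliberately independent of the speed source, which is why the abstract can note that Stuttered Locomotion "is also applicable to" PushPull: any continuous velocity signal can feed the accumulator.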
Appears in Collections: Elektronische Dissertationen und Habilitationen
Files in This Item:
File | Description | Checksum | Size | Format
Dissertation_JannPhilippFreiwald_2022.pdf | Dissertation | 4a0917dcff0af972b0f2d4be7a3e217d | 6.83 MB | Adobe PDF

Info
Page views: 384 (checked on 25.04.2024)
Downloads: 418 (checked on 25.04.2024)