DC Element | Value | Language |
---|---|---|
dc.contributor.advisor | Steinicke, Frank | - |
dc.contributor.author | Freiwald, Jann Philipp | - |
dc.date.accessioned | 2022-09-02T08:43:47Z | - |
dc.date.available | 2022-09-02T08:43:47Z | - |
dc.date.issued | 2022 | - |
dc.identifier.uri | https://ediss.sub.uni-hamburg.de/handle/ediss/9792 | - |
dc.description.abstract | The metaverse is commonly described as a plethora of multi-user mixed reality (MR) applications, in which users can interact with the virtual environment and each other through shared MR spaces. However, several aspects of the metaverse are still unclear, and fundamental research in human collaboration and interaction in MR hybrid setups is required. We use "hybrid setups" as an umbrella term for the stark variance between individual users with regard to their technical setups, interaction forms, and virtual representations. This includes aspects such as different hardware devices, sitting and standing configurations, the appearance of avatars, as well as locomotion and its visualization. This dissertation examines these components from both technical and user experience perspectives, including metrics such as usability, sense of presence, and cybersickness. First, we present techniques that enable effective hybrid collaboration between a variety of devices. More specifically, two types of MR devices require further research in terms of integration into shared MR spaces: (i) mobile devices and (ii) video see-through head-mounted displays. To integrate mobile devices, video streaming and camera-based tracking technologies are combined in a smartphone application called VR Invite, which acts as a handheld viewport to a local or remote shared MR space. The results of the accompanying experiment indicate that the opportunity for direct interaction positively influences the sense of presence of co-located mobile users. For video see-through head-mounted displays, an image reprojection algorithm named Camera Time Warp is introduced, which stabilizes virtual objects and virtual human representations in the real-world reference frame. This technique addresses one of the major drawbacks of video see-through devices: the perceptual registration error caused by the inherent camera-to-photon latency. The user study demonstrates a positive effect of the Camera Time Warp technique on the participants' spatial orientation and subjective levels of cybersickness. Second, we investigate visualization techniques within shared MR spaces to support spatial awareness as well as the sense of spatial presence and co-presence. In the context of human-to-human interactions, techniques to visualize a user's gaze direction and their points of interest are compared. The experiment shows that visualizations coupled to virtual human representations evoked the highest degree of co-presence, while task performance was highest when additional viewports were provided. In another user study, we contrasted virtual human representations as well as the visualization of their locomotion from first- and third-person perspectives. The results emphasize the importance of natural-looking human gait animations for increasing the observer's spatial awareness and co-presence. Third, we introduce novel techniques for locomotion and its visualization as implementations of embodied interaction. Based on the findings of our prior experiments, the focus of these locomotion user interfaces was the visualization of human gait from first- and third-person perspectives. A locomotion user interface for seated MR experiences called VR Strider is presented, which maps the cycling biomechanics of the user's legs to virtual walking movements. An experiment revealed a significant positive effect on the participants' angular and distance estimation, sense of presence, and feeling of comfort compared to other established locomotion techniques. A second, confirmatory study indicates the necessity of synchronized avatar animations for virtual locomotion. Lastly, we present a toolkit for continuous and noncontinuous locomotion with matching human representations based on intelligent virtual agents. This toolkit includes two techniques for avatar visualization, i.e., Smart Avatars and Puppeteer, as well as two for locomotion, i.e., PushPull and Stuttered Locomotion. Smart Avatars deliver continuous full-body human representations for noncontinuous locomotion in shared VR spaces. They can smoothly visualize Stuttered Locomotion, a technique that transforms continuous user movement into a series of short-distance teleport steps. Stuttered Locomotion is also applicable to our motion-based PushPull technique, which employs a dynamic velocity multiplier based on the user's hand pose. An experiment indicates that Smart Avatars were preferred over regular virtual human representations, while remote-controlling one's own virtual body through Puppeteer produced body ownership close to that of first-person interactions. The evaluation further shows that both PushPull and Stuttered Locomotion significantly reduce the occurrence of cybersickness. In summary, this work offers novel techniques and insights for the implementation of hybrid MR setups with respect to virtual interactions and their visualization. | en |
dc.language.iso | en | de_DE |
dc.publisher | Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky | de |
dc.rights | http://purl.org/coar/access_right/c_abf2 | de_DE |
dc.subject | Mixed Reality | en |
dc.subject | Collaboration | en |
dc.subject | User study | en |
dc.subject | Human-Computer Interaction | en |
dc.subject | Metaverse | en |
dc.subject.ddc | 004: Informatik | de_DE |
dc.title | Shared Mixed Reality Spaces: Visualization and Interaction Techniques for Hybrid User Groups | en |
dc.type | doctoralThesis | en |
dcterms.dateAccepted | 2022-08-29 | - |
dc.rights.cc | https://creativecommons.org/licenses/by/4.0/ | de_DE |
dc.rights.rs | http://rightsstatements.org/vocab/InC/1.0/ | - |
dc.subject.bcl | 54.73: Computergraphik | de_DE |
dc.subject.gnd | Erweiterte Realität <Informatik> | de_DE |
dc.subject.gnd | Virtuelle Realität | de_DE |
dc.subject.gnd | Fortbewegung | de_DE |
dc.subject.gnd | Avatar <Informatik> | de_DE |
dc.subject.gnd | CGI <Computergrafik> | de_DE |
dc.type.casrai | Dissertation | - |
dc.type.dini | doctoralThesis | - |
dc.type.driver | doctoralThesis | - |
dc.type.status | info:eu-repo/semantics/publishedVersion | de_DE |
dc.type.thesis | doctoralThesis | de_DE |
tuhh.type.opus | Dissertation | - |
thesis.grantor.department | Informatik | de_DE |
thesis.grantor.place | Hamburg | - |
thesis.grantor.universityOrInstitution | Universität Hamburg | de_DE |
dcterms.DCMIType | Text | - |
dc.identifier.urn | urn:nbn:de:gbv:18-ediss-102966 | - |
item.fulltext | With Fulltext | - |
item.grantfulltext | open | - |
item.languageiso639-1 | other | - |
item.creatorOrcid | Freiwald, Jann Philipp | - |
item.creatorGND | Freiwald, Jann Philipp | - |
item.advisorGND | Steinicke, Frank | - |
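The abstract above describes the mechanics of two of the toolkit's locomotion techniques concretely: Stuttered Locomotion (continuous movement released as short-distance teleport steps) and PushPull (a dynamic velocity multiplier derived from the user's hand pose). The Python below is a minimal illustrative sketch of those two ideas; all names, thresholds, and formulas are assumptions made for illustration, not the dissertation's actual implementation.

```python
import numpy as np

def push_pull_velocity(hand_offset: np.ndarray, gain: float = 6.0) -> np.ndarray:
    """Hypothetical PushPull-style mapping: hand offset from a rest pose
    drives travel velocity. The multiplier itself grows with the gesture,
    so the mapping is nonlinear: small motions stay precise, large
    motions travel fast."""
    multiplier = gain * np.linalg.norm(hand_offset)
    return multiplier * hand_offset

class StutteredLocomotion:
    """Buffer continuous motion and release it as discrete teleport steps."""

    def __init__(self, step_distance: float = 0.5):
        self.step_distance = step_distance   # meters per teleport step (assumed)
        self._buffered = np.zeros(3)         # displacement not yet applied

    def update(self, position: np.ndarray, velocity: np.ndarray,
               dt: float) -> np.ndarray:
        self._buffered += velocity * dt
        travelled = np.linalg.norm(self._buffered)
        if travelled < self.step_distance:
            return position                  # below threshold: stay put
        # Teleport one full step along the buffered direction and keep
        # the remainder for the next step.
        direction = self._buffered / travelled
        self._buffered -= direction * self.step_distance
        return position + direction * self.step_distance

# Example: a hand held 0.3 m forward drives stuttered travel along +x.
loco = StutteredLocomotion(step_distance=0.5)
pos = np.zeros(3)
for _ in range(60):                          # one simulated second at 60 Hz
    vel = push_pull_velocity(np.array([0.3, 0.0, 0.0]))
    pos = loco.update(pos, vel, dt=1 / 60)
print(pos)  # position advanced in one discrete 0.5 m teleport step
```

Keeping the sub-threshold remainder in the buffer means total distance travelled is conserved across teleport steps, which is what lets an avatar layer such as Smart Avatars animate the motion smoothly despite the discrete jumps.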
Appears in Collections: Elektronische Dissertationen und Habilitationen
Files in this item:
File | Description | Checksum | Size | Format | |
---|---|---|---|---|---|
Dissertation_JannPhilippFreiwald_2022.pdf | Dissertation | 4a0917dcff0af972b0f2d4be7a3e217d | 6.83 MB | Adobe PDF | Open/View |