Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.11851/9253
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gulcu, A.E. | - |
dc.contributor.author | Atalay, F.B. | - |
dc.date.accessioned | 2022-11-30T19:37:39Z | - |
dc.date.available | 2022-11-30T19:37:39Z | - |
dc.date.issued | 2022 | - |
dc.identifier.isbn | 9781665470100 | - |
dc.identifier.uri | https://doi.org/10.1109/UBMK55850.2022.9919479 | - |
dc.identifier.uri | https://hdl.handle.net/20.500.11851/9253 | - |
dc.description | 7th International Conference on Computer Science and Engineering, UBMK 2022 -- 14 September 2022 through 16 September 2022 -- 183844 | en_US |
dc.description.abstract | One of the strongest aspects of virtual reality (VR) hardware and applications is the immersion they provide to the user. While a combination of techniques such as real walking for locomotion, head movement for gaze direction, and hand detection for controls makes for a highly immersive experience, when the real space and the virtual environment do not match, developers must resort to alternative locomotion techniques such as teleportation, controller-assisted movement, or in-place walking via external devices. Despite lifting the restrictions imposed by physical spaces, these techniques may require expensive or cumbersome hardware, cause nausea and dizziness in the user, or simply diminish the immersion. We propose a portal-based environment design method and a modern rendering engine to visualize the created scenes. Our method reuses the same physical space by overlapping multiple regions of the virtual environment on top of each other while keeping the affine properties of the scene. The user can walk between different virtual rooms while staying in a limited physical space. We also conduct a user study comparing the user experience to a state-of-the-art physical-virtual environment mapping method, in which the users strongly favored our method. © 2022 IEEE. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | en_US |
dc.relation.ispartof | Proceedings - 7th International Conference on Computer Science and Engineering, UBMK 2022 | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.subject | cell-portal graphs | en_US |
dc.subject | environment mapping | en_US |
dc.subject | infinite spaces | en_US |
dc.subject | multi-thread | en_US |
dc.subject | OpenGL | en_US |
dc.subject | portals | en_US |
dc.subject | virtual reality | en_US |
dc.subject | Vulkan | en_US |
dc.subject | Application programming interfaces (API) | en_US |
dc.subject | Mapping | en_US |
dc.subject | User interfaces | en_US |
dc.subject | Cell-portal graph | en_US |
dc.subject | Environment mapping | en_US |
dc.subject | Gaze direction detection | en_US |
dc.subject | Hands detections | en_US |
dc.subject | Head movements | en_US |
dc.subject | Immersive | en_US |
dc.subject | Infinite space | en_US |
dc.subject | Multi-thread | en_US |
dc.subject | Opengl | en_US |
dc.subject | Vulkan | en_US |
dc.subject | Virtual reality | en_US |
dc.title | Infinite Spaces Using Recursive Portals | en_US |
dc.type | Conference Object | en_US |
dc.identifier.startpage | 332 | en_US |
dc.identifier.endpage | 337 | en_US |
dc.identifier.scopus | 2-s2.0-85141868335 | en_US |
dc.identifier.doi | 10.1109/UBMK55850.2022.9919479 | - |
dc.authorscopusid | 57964277400 | - |
dc.authorscopusid | 23110410300 | - |
dc.relation.publicationcategory | Konferans Öğesi - Uluslararası - Kurum Öğretim Elemanı (Conference Item - International - Institutional Faculty Member) | en_US |
dc.ozel | 2022v3_Edit | en_US |
item.openairetype | Conference Object | - |
item.languageiso639-1 | en | - |
item.grantfulltext | none | - |
item.fulltext | No Fulltext | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.cerifentitytype | Publications | - |
Appears in Collections: Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection; Öğrenci Yayınları / Students' Publications
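The abstract above describes a cell-portal design in which several virtual rooms overlap the same physical space and are visualized by rendering recursively through portals while keeping the affine properties of the scene. The following is a minimal, illustrative sketch of that general idea, not the authors' implementation: the `Cell`/`Portal` classes, the portal warp convention, and the 180° turn-around matrix are all assumptions, and a real engine (the keywords mention OpenGL, Vulkan, and multi-threading) would additionally clip each pass to the portal polygon.

```python
import numpy as np
from dataclasses import dataclass, field

# 180-degree turn about the vertical axis: stepping "into" the source portal
# should leave you facing "out of" the destination portal (an assumed convention).
FLIP_Y180 = np.diag([-1.0, 1.0, -1.0, 1.0])

@dataclass
class Portal:
    dest_cell: "Cell"       # cell (virtual room) seen through this portal
    src_pose: np.ndarray    # 4x4 affine, source portal frame -> world
    dst_pose: np.ndarray    # 4x4 affine, linked destination portal frame -> world

@dataclass
class Cell:
    name: str
    portals: list = field(default_factory=list)

def portal_view(view: np.ndarray, portal: Portal) -> np.ndarray:
    # `view` maps world -> camera. Composing it with the affine warp that
    # carries the destination portal frame onto the source portal frame gives
    # the virtual camera used to draw the destination cell; being a product of
    # affine maps, it preserves the affine properties of the scene.
    warp = portal.src_pose @ FLIP_Y180 @ np.linalg.inv(portal.dst_pose)
    return view @ warp

def collect_draws(cell: Cell, view: np.ndarray, depth: int, max_depth: int, out: list):
    # Depth-limited recursion over the cell-portal graph. Each entry in `out`
    # is one draw pass (cell name, view matrix). A real renderer would also
    # clip each pass to the portal polygon (stencil buffer or oblique near
    # plane) and could record the passes from worker threads.
    out.append((cell.name, view))
    if depth >= max_depth:
        return
    for p in cell.portals:
        collect_draws(p.dest_cell, portal_view(view, p), depth + 1, max_depth, out)

# Usage: two virtual rooms sharing one physical space, joined by a doorway.
room_a, room_b = Cell("room A"), Cell("room B")
door_a, door_b = np.eye(4), np.eye(4)
door_b[0, 3] = 5.0  # the linked doorway sits 5 m away in virtual coordinates
room_a.portals.append(Portal(room_b, door_a, door_b))
room_b.portals.append(Portal(room_a, door_b, door_a))

draws = []
collect_draws(room_a, np.eye(4), 0, max_depth=2, out=draws)
for name, v in draws:
    print(name, v[:3, 3])  # per-pass camera-space translation
```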