SemanticAdapt: Optimization-based Adaptation of Mixed Reality Layouts Leveraging Virtual-Physical Semantic Connections

Overview

Authors: Yi Fei Cheng, Yukang Yan, Xin Yi, Yuanchun Shi, David Lindlbauer

Publication Date: 1 January 2021

Link: https://augmented-perception.org/publications/2021-semanticadapt.html

Keywords: semantics, context-aware computing, mixed reality (MR), optimization, adaptive interfaces, physical environments, virtual interface elements, automation, UX, task switching, layout adaptation, efficiency

Type: Peer-Reviewed Journals/White Papers

Summary

We present an optimization-based approach that automatically adapts Mixed Reality (MR) interfaces to different physical environments. Current MR layouts, including the position and scale of virtual interface elements, need to be manually adapted by users whenever they move between environments and whenever they switch tasks. This process is tedious and time-consuming, and arguably needs to be automated by MR systems for them to be beneficial for end users.
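To give a rough intuition for what "optimization-based layout adaptation" can mean, the sketch below frames it as an assignment problem: each virtual element is matched to the physical anchor with which it has the strongest semantic association, such that the total association score is maximized. This is only a minimal illustration, not the authors' formulation; the element names, anchor names, and scores are hypothetical.

```python
# Hypothetical sketch (not the paper's implementation): adapt an MR layout by
# assigning each virtual element to a physical anchor so that the total
# semantic-association score is maximized, via brute-force search.
from itertools import permutations

# Hypothetical association scores between virtual elements (rows) and
# physical anchors (columns); higher means a better virtual-physical match.
elements = ["music player", "recipe panel", "message feed"]
anchors = ["speaker", "kitchen counter", "desk monitor"]
score = [
    [0.9, 0.2, 0.3],  # music player
    [0.1, 0.8, 0.2],  # recipe panel
    [0.2, 0.3, 0.7],  # message feed
]

def best_assignment(score):
    """Return the element-to-anchor assignment with the highest total score."""
    n = len(score)
    best, best_total = None, float("-inf")
    for perm in permutations(range(n)):
        total = sum(score[i][perm[i]] for i in range(n))
        if total > best_total:
            best, best_total = perm, total
    return best, best_total

assignment, total = best_assignment(score)
for i, j in enumerate(assignment):
    print(f"{elements[i]} -> {anchors[j]}")
print(f"total association score: {total:.2f}")
```

In practice, a real system would also weigh factors such as visibility, reachability, and element scale rather than semantic association alone, and would use a scalable solver instead of brute-force enumeration.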
