Unreal Realm

Lead Designer
XR Hand Interaction Study
Unreal Realm is a set of interoperable gestures across devices. It serves as a vision for how we could interact across different spaces enabled by spatial computing.

Created by Davide Zhang, Aria Xiying Bao, Nix Liu Xin


Screen-based devices have become the dominant medium through which we interact with digital content. Despite the variety of device types and screen sizes, our experience rests on a handful of touch gestures (swiping, pinching, tapping) that form the backbone of mobile interaction today. Even as mixed-reality experiences grow in popularity, screen space, AR space, and physical space remain largely isolated from one another.

Problem Space

How can the boundaries between these spaces be blurred? We define the three spaces as follows:

Screen space: the 2D pixel coordinate system of a computer display
AR space: a 3D virtual coordinate system overlaid onto a real-world camera feed
Physical space: the real world with tangible materials and matter

Grab to manipulate

By grabbing virtual objects naturally, the user can move them to a new position and orient them as they wish.
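As a minimal sketch of how such a grab could be detected and applied with generic hand-tracking data (the joint names, distances, and threshold here are illustrative assumptions, not tied to any specific SDK):

```python
import math

PINCH_THRESHOLD = 0.02  # fingertips closer than 2 cm count as a grab

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_grabbing(thumb_tip, index_tip):
    """A grab is detected when the thumb and index fingertips nearly touch."""
    return distance(thumb_tip, index_tip) < PINCH_THRESHOLD

def update_object(obj_pos, hand_pos, grab_offset, grabbing):
    """While grabbing, the object follows the hand, keeping its initial offset."""
    if grabbing:
        return tuple(h + o for h, o in zip(hand_pos, grab_offset))
    return obj_pos

# Fingertips 1 cm apart -> grab; the object then tracks the moving hand.
grab = is_grabbing((0.0, 0.0, 0.0), (0.01, 0.0, 0.0))
pos = update_object((1.0, 1.0, 1.0), (0.25, 0.5, 0.75), (0.0, 0.25, 0.0), grab)
```

Storing the hand-to-object offset at the moment of the grab keeps the object from snapping to the palm, which preserves the feeling of holding it where it was picked up.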

Extract real textures and apply them to AR objects

Applying materials is a common step in 3D rendering and the product design process, but obtaining a material that mimics a real object is neither intuitive nor efficient. In our implementation, we propose that users can directly search for and extract materials from real-world objects and turn them into digital content for further AR-based and screen-based creation. The user makes an ‘OK’ sign gesture that mimics a magnifier to indicate the texture on a real object.

After the material is confirmed, a texture ball is generated beside the user’s hand. The user drags the texture ball onto the object to apply the texture.
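The drag-and-apply step could be resolved with a simple proximity test when the ball is released; a sketch under that assumption (object names, positions, and the 5 cm radius are all hypothetical):

```python
import math

APPLY_RADIUS = 0.05  # release the texture ball within 5 cm of an object to apply it

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def drop_texture_ball(ball_pos, objects):
    """On release, apply the ball's texture to the nearest object in range.

    `objects` maps an object id to its position; returns the id that
    receives the texture, or None if the ball was dropped in empty space.
    """
    nearest = min(objects, key=lambda oid: dist(ball_pos, objects[oid]))
    if dist(ball_pos, objects[nearest]) <= APPLY_RADIUS:
        return nearest
    return None

scene = {"chair": (0.0, 0.0, 0.0), "table": (1.0, 0.0, 0.0)}
drop_texture_ball((0.02, 0.0, 0.0), scene)  # applies to "chair"
```

Returning None when nothing is in range lets the interaction fail gracefully: the ball can simply stay in the scene for the user to pick up again.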

Flip to show menu

The menu is designed to be displayed at the right side of the left palm when that palm is facing toward the user. To press a button in the hand menu, the user presses the virtual button with the index finger of the free (right) hand.

Drag the object out of screen space into real space
Envisioning a seamless transition between multiple realities
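One way this transition could work, as a minimal sketch: when the object is dragged past the screen edge, its 2D screen position is unprojected into a 3D point in front of the AR camera using a standard pinhole-camera model (the resolution, field of view, and depth below are illustrative assumptions):

```python
import math

def unproject(px, py, depth, width, height, fov_y_deg):
    """Map a 2D screen pixel to a 3D point in camera space (pinhole model).

    `depth` is the distance in front of the camera at which the dragged
    object should appear; the camera looks down -z and y points up.
    """
    aspect = width / height
    tan_half = math.tan(math.radians(fov_y_deg) / 2)
    # Normalized device coordinates in [-1, 1]
    ndc_x = 2 * px / width - 1
    ndc_y = 1 - 2 * py / height
    x = ndc_x * tan_half * aspect * depth
    y = ndc_y * tan_half * depth
    return (x, y, -depth)

# An object dragged from the center of a 1920x1080 screen lands 1 m ahead.
unproject(960, 540, 1.0, 1920, 1080, 60)  # -> (0.0, 0.0, -1.0)
```

Because the object keeps its on-screen position at the moment it crosses the boundary, the handoff from screen space to AR space reads as one continuous drag rather than a teleport.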

Scan to create a digital twin

We envision a gesture-based 3D scanning process that produces a digital twin seamlessly and intuitively. The user sweeps a hand in one direction to perform the scan.
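The sweep could drive the scan's progress directly, for instance by measuring how much of a target span the hand has covered along the sweep axis; a sketch under that assumption (the sampled positions and 30 cm span are hypothetical):

```python
def scan_progress(hand_xs, span):
    """Fraction of the scan completed as the hand sweeps along one axis.

    `hand_xs` are sampled hand positions along the sweep axis (e.g. x),
    `span` is the distance the hand must cover for a full scan.
    """
    if not hand_xs:
        return 0.0
    covered = max(hand_xs) - min(hand_xs)
    return min(covered / span, 1.0)

# A 30 cm sweep over a 30 cm target completes the scan.
scan_progress([0.0, 0.1, 0.2, 0.3], 0.3)  # -> 1.0
```

Tying progress to distance covered rather than elapsed time lets the user slow down, back up, and resume without restarting the scan.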