STORY #3

Enhancing the VR/MR Experience Through Multi-Sensory Integration

Asako Kimura, Ph.D.

Professor, College of Information Science and Engineering

What user interface would allow for a more enjoyable VR/MR space experience?

Virtual Reality (VR) and Mixed Reality (MR) technologies can reproduce the real world in a virtual space, allowing users to experience it as if they were physically present there. However, whether the experience in VR/MR is truly equivalent to that in the real world remains an open question.

“The sensations experienced in VR/MR space are not necessarily the same as those felt in the real world,” Asako Kimura explains. For instance, if MR technology, which blends the virtual and the real, is used to superimpose an image of a thick, heavy-looking leather handle onto a plastic handle, will the person holding it feel only the tactile sensation of the plastic? Such questions sparked Kimura’s interest in how changes in visual information in VR/MR space affect tactile perception. She constructed a virtual space and conducted perceptual experiments to investigate these effects. One such study examined how visual stimuli alter the perception of hardness.

Two identical real objects were placed in front of the study participants, who were asked to compare their hardness by pressing each one with a finger. Using MR, a computer graphics (CG) image was superimposed on one of the objects so that it appeared to dent more deeply than it actually did when pressed. “The participants found that, despite the hardness of the actual objects being the same, the object on which the concave CG image was superimposed felt softer to the touch. The greater the concavity of the CG image, the softer the object tended to feel.”
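This kind of visual-haptic manipulation is often described as a pseudo-haptic effect. The sketch below is a minimal illustration of the idea, assuming a simple visual gain model; the function and parameter names are hypothetical, not taken from Kimura’s study.

```python
# Minimal sketch of a pseudo-haptic hardness illusion, assuming a simple
# visual gain model; names and values are illustrative, not from the study.

def displayed_indentation(actual_depth_mm: float, visual_gain: float,
                          max_depth_mm: float = 20.0) -> float:
    """Exaggerate (gain > 1) or attenuate (gain < 1) the rendered dent.

    The real object deforms by `actual_depth_mm` under the finger, but the
    CG overlay renders a dent of `visual_gain * actual_depth_mm`. A deeper
    rendered concavity makes the same object feel softer.
    """
    return min(actual_depth_mm * visual_gain, max_depth_mm)

# Two physically identical objects, each pressed 3 mm deep:
print(displayed_indentation(3.0, 1.0))  # 3.0 mm dent -> baseline hardness
print(displayed_indentation(3.0, 2.5))  # 7.5 mm dent -> perceived as softer
```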

In another experiment, CG images of a large object and of a small object were superimposed on the same real object. Participants judged the center of gravity of the object overlaid with the large CG image to be farther away, that is, closer to the center of gravity of the CG image than to that of the real object.

Other perceptual experiments have likewise confirmed the impact of visual stimuli on the perception of weight. In these experiments, participants were asked to hold a box with handles while an MR image of liquid inside the box was superimposed on it. When the amount of liquid in the image was changed, participants perceived the box as heavier when the water surface was higher (more water) and lighter when it was lower (less water), even though the actual weight of the box remained the same. Moreover, when the water surface in the image was shown lapping, participants perceived the box as lighter than when the surface was still. These findings suggest that visual representations of a shifting center of gravity, such as the lapping water in this experiment, can make a real object feel lighter than it actually is.
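As a purely illustrative toy model, the direction of these effects could be parameterized as below; the linear form and the coefficients are assumptions made for the sketch, not values fitted to the experimental data.

```python
# Toy model of the weight illusion: the direction of the effects follows
# the article, but the linear form and coefficients are assumptions.

def perceived_weight(actual_kg: float, fill_level: float, sloshing: bool,
                     fill_coeff: float = 0.3,
                     slosh_coeff: float = 0.1) -> float:
    """Higher rendered fill -> feels heavier; lapping surface -> feels lighter."""
    bias = fill_coeff * (fill_level - 0.5) - (slosh_coeff if sloshing else 0.0)
    return actual_kg * (1.0 + bias)

# The box always weighs 1.5 kg; only the CG liquid overlay changes.
print(perceived_weight(1.5, 0.9, False))  # high, still water -> ~1.68 kg
print(perceived_weight(1.5, 0.1, False))  # low, still water  -> ~1.32 kg
print(perceived_weight(1.5, 0.9, True))   # high, lapping     -> ~1.53 kg
```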

These results demonstrate how humans can be deceived by visual perception and misjudge reality based on assumptions. “Until now, VR and MR have aimed to reproduce the real world as accurately as possible, but human senses are not always faithful to reality. For both games and attractions, combining visual, auditory, and tactile stimuli makes it possible to create a more immersive experience in VR and MR,” Kimura says.

Such illusions also occur in the real world, but recent research has shown that the patterns of illusion in VR/MR space differ from those in the real world. Kimura is also investigating the causes of and factors behind these differences.

In addition, Kimura is researching user interfaces (UIs) for VR/MR space, exploring how senses such as sight, hearing, and touch can be utilized to improve operation. In the real world, equipment is commonly operated by hand, whether inputting information on a computer with a mouse or keyboard or operating a smartphone with the fingers. In VR/MR spaces, however, where the whole body can be used, input methods can potentially draw on any part of the body, not just the hands and fingers. One such method is foot gestures. “If foot gestures can be used as commands, both hands remain free. However, there has been little study of what kinds of foot gestures are best suited to different operations in VR space,” she says. Kimura is therefore examining the characteristics of various foot gestures as a UI.

Foot gestures were classified into various patterns: rotating the toes around the heel in different directions (vertically and horizontally), rotating the heel around the toes in different directions (vertically and horizontally), stepping forward, sliding the foot forward and backward or left and right, and moving the knee up and down. In the VR space, each gesture was assigned a command, such as switching on and off or changing a value, and the operability of the gestures was compared, as sketched in the example below. “For example,” Kimura says, “a foot gesture that moves the toes up and down using the heel as the axis is easy to understand as an on/off command, but repeating the input can be tiring. Accordingly, the operating methods were classified according to their suitability for different tasks.”
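A minimal sketch of how such a gesture-to-command mapping might look in code; the gesture taxonomy follows the article, but the command set, names, and dispatch logic are hypothetical, not Kimura’s actual system.

```python
# Minimal sketch of mapping foot gestures to VR UI commands; the gesture
# taxonomy follows the article, the command set and API are hypothetical.

from enum import Enum, auto

class FootGesture(Enum):
    TOE_TAP = auto()      # toes pivot up/down around the heel
    TOE_SWIVEL = auto()   # toes pivot left/right around the heel
    HEEL_TAP = auto()     # heel pivots up/down around the toes
    HEEL_SWIVEL = auto()  # heel pivots left/right around the toes
    STEP_FORWARD = auto()
    SLIDE = auto()        # slide foot forward/back or left/right
    KNEE_RAISE = auto()

def handle_gesture(gesture: FootGesture, state: dict) -> dict:
    """Dispatch a recognized gesture to a UI command (illustrative only)."""
    if gesture is FootGesture.TOE_TAP:
        state["power"] = not state["power"]          # discrete on/off toggle
    elif gesture is FootGesture.TOE_SWIVEL:
        state["value"] += 1                          # step a value up
    elif gesture is FootGesture.KNEE_RAISE:
        state["menu_open"] = not state["menu_open"]  # open/close a menu
    return state

state = {"power": False, "value": 0, "menu_open": False}
state = handle_gesture(FootGesture.TOE_TAP, state)
print(state)  # {'power': True, 'value': 0, 'menu_open': False}
```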

Furthermore, an experiment was carried out in which a different command was assigned to each gesture, and participants used those gestures to steer a robot through the VR space to a destination. This experiment also examined the operability of various gesture combinations. “The use of VR/MR involves many elements of games and play, so it is not just about efficiency and simplicity. Some operations are physically demanding but can also be exhilarating or fun. We hope to develop guidelines for determining what kinds of gestures are suitable for different purposes,” says Kimura, looking to the future.

Asako Kimura, Ph.D.
Professor, College of Information Science and Engineering
Research Themes: Research on tool-type input interfaces that utilize tool shapes and tactile sensations; Development and application of unconstrained and environment-superimposed human interfaces; Research on user interfaces for VR/MR spatial manipulation; Research on human perception and physical sensation in VR/MR space; among others.
Specialties: Mixed Reality; Artificial Reality; Real-World Oriented Interface; Tangible User Interface; Multimodal and Cross-Modal Interface; Interactive Communication Devices

Researchers database

September 11, 2023