
Near-body Interaction Enhancement with Distance Perception Matching in Immersive Virtual Environment

Ungyeon Yang1, Nam-Gyu Kim2,*
1Electronics and Telecommunications Research Institute (ETRI), Daejeon, Korea, uyyang@etri.re.kr
2Dong-Eui University, Busan, Korea, ngkim@deu.ac.kr
*Corresponding Author: Nam-Gyu Kim, 176 Eomgwangno, Busanjin-gu, Busan, +82-51-890-2713, ngkim@deu.ac.kr

© Copyright 2021 Korea Multimedia Society. This is an Open-Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Received: Jun 10, 2021; Revised: Jun 18, 2021; Accepted: Jun 22, 2021

Published Online: Jun 30, 2021

Abstract

As recent virtual reality technologies provide more natural three-dimensional interactive environments, users readily learn to explore space and interact with synthetic objects. Virtual reality researchers develop techniques that deliver realistic sensory feedback appropriate to the user's input behavior. Although much recent virtual reality research considers human factors extensively, the findings do not transfer easily to every new virtual environment content. Among these human factors, distance perception has been treated as critical to interaction accuracy in virtual environments. We study an experiential virtual environment in which virtual objects are connected to real objects. We divide three-dimensional interaction, in which distance perception strongly influences behavior, into two types (whole-body movement and direct manipulation) and analyze the mismatch between real and virtual visual distance perception. We also propose a statistical correction method that reduces near-body movement and manipulation errors when the interaction location changes, and report experimental results demonstrating its effectiveness.

Keywords: Near-body Interaction; Visual Distance Perception; Virtual Reality

I. INTRODUCTION

Recent virtual reality (VR) research focuses on developing technology that provides users with a high-quality virtual presence that is difficult to distinguish from reality. According to a model explaining users' sense of presence [1], an ideal level of presence requires an interface device that delivers the same feedback to the human sensory organs as the real environment. However, current head-mounted display (HMD) devices representing fully immersive virtual environments still present visual feedback inconsistent with the real world.

The monocular image quality of HMDs has progressed from 2K high definition (HD) resolution to prototypes with 4K ultra high definition (UHD) or higher image quality. However, to match the highest-density region of human vision (i.e., the macula of the retina), a real-time output technology capable of updating 8K UHD or higher image information within a few milliseconds is needed.

Real-time focus change control technology that actively responds to accommodation, an essential human depth-sensing parameter at near-body distances within a few meters [2], has not yet been universally applied to HMDs. Therefore, commercial HMDs based on a binocular stereoscopic virtual imaging structure cannot intrinsically solve the convergence-accommodation conflict problem [3]. Since this conflict causes interaction errors in the virtual environment, it is necessary to reduce the errors by adjusting hardware or compensating in software.

This paper proposes a compensation method that enhances distance perception to address the mismatch between the sense of distance in real space and in virtual space when a user engages in a virtual environment wearing an HMD. We implement virtual environments that generalize two interaction scenarios, whole-body movement and direct manipulation, and analyze the correlation between distance perception and interaction behavior. We find matched distance perception parameters by experimenting with the two interaction scenarios, and then show that adjusting the virtual environment accordingly makes the interaction more accurate.

II. RELATED WORKS

This study explores a real-virtual distance perception matching control technique applicable to movement and direct manipulation scenarios. In general, a combination of various factors determines the sense of depth that humans have learned. As research on factors affecting distance shows [2], humans perceive distance by understanding relative empirical relations through light and shading, color intensity and contrast, perspective, occlusion order, focus, and relative motion. Furthermore, in near-body space within a few meters, distance perception based on the physical structure of binocular vision is an essential factor in interaction.

With the continuous development of HMDs since the 1990s, there are various reports of differences in perceived distance between reality and virtual space when wearing HMDs. A survey of the past 20 years of research on egocentric distance perception found that virtual environment users tend to underestimate the distance to a target to about 74% of the actual distance, and that a careful virtual camera setup could help correct subjects' distance perception; in particular, virtual environment content characteristics such as space representation, accurate disparity, high-quality rendering, and ground textures reduced the misperception [4]. The horizon stimulus has an important effect on distance judgments in the HMD environment, where users generally underestimated distances [5]. Comparative experiments showed that restricted viewing conditions could affect egocentric distance perception in both real and virtual space [6]. In the Oculus Rift HMD immersive environment, users judged distance most accurately at distances of two to five meters, and field of view (FOV) affected distance perception in the HMD environment [7].

As the observation target gets closer to the user in virtual space, the tendency to underestimate the perceived distance becomes more apparent, and a correction process could reduce the difference [8]. The relationship between distance and sensory perception has been considered in both virtual and real space [9]. Distance perception matching between the real and the virtual space depended on the hardware structure of the wearable display and the optimization of the virtual camera parameters for rendering stereoscopic images. Therefore, display devices should present seamless scene transitions with controlled visual parameters [10].

In VR and augmented reality (AR) applications, some works showed that distance perception enhancement can increase behavioral performance. Using a low-cost motion tracking system and a fuzzy postural score algorithm, an analytical interplay model between collision prevention and proprioceptive postural stability was introduced under a dual-task paradigm; the experimental results apply to health and rehabilitation scenarios [11], [12]. Image and video understanding techniques based on convolutional neural networks (CNNs) have been applied to behavioral motion analysis and AR-guided learning, where a CNN algorithm that can predict and respond to human behavior in advance increased system usage satisfaction and accuracy [13], [14].

Recently, in interviews about VR theme parks built with HMDs and motion capture systems, users still reported experiencing real-virtual distance inconsistency during space movement and object manipulation. Although there are plentiful previous studies on distance perception and measuring techniques, the significant factors they identified do not transfer readily across hardware and software. Since consumer HMDs appeared in the mid-2010s, considerable hardware upgrades have affected visual immersion and realism. As a representative example, LeepVR-style HMDs [15], with fields of view of 100 degrees or wider and small high-resolution display panels, are now pervasive, whereas much of the older distance perception data was gathered on HMDs offering roughly VGA-grade image quality; those data therefore need updating through studies of distance perception in current HMD-based virtual environments. Recent studies on spatial cognition in HMD immersive environments have focused on orientation control, which uses a limited physical tracking space more extensively by remapping the human senses. Such spatial remapping techniques could exploit the distance discrepancy effect, and discrete motion control techniques, such as a short jump (leap), could achieve exact real-virtual distance matching instead of continuous motion control [16]. Similarly, this study aims to increase the accuracy of direct interaction and movement in near-body space.

With the introduction of high-resolution, wide-angle HMDs, many studies addressing the convergence-accommodation conflict problem presented hardware and software techniques capable of rendering various focal lengths. A continuous-depth HMD based on light-field holographic rendering was introduced [17], and a survey paper summarized research enabling focus control in HMDs for VR and augmented reality (AR) [18]. However, a general-purpose HMD solution covering all observable distance ranges at a level similar to human visual acuity has not been commercialized yet [19].

III. INTERACTION SCENARIOS

In the general classification of virtual reality interaction methodology [20], the situations in which the distance perceived by the HMD user has a significant influence are moving through space, as in navigation, and directly manipulating objects, as in direct control. In these interactions, users perceive the surrounding space or object, move their body toward it, or make contact with it. Therefore, sensory feedback with correct near-distance perception improves realism in virtual space.

Human beings are aware of body position and posture movement through proprioceptive and vestibular feedback. To provide this feedback realistically, the virtual environment system uses a real-time motion capture system and an immersive HMD, as shown in Figures 1 and 2. These systems use the user's body as a direct interaction medium instead of indirect manipulation interfaces such as a mouse, keyboard, or controller. In this setting, instead of an exoskeleton device with low usability, physically installed props provide a passive haptic effect as a means of haptic feedback that compensates the feeling of realistic interaction [21].

Fig. 1. Whole-body interaction with real-time motion tracking.

Fig. 2. Direct manipulation with passive haptics.

Assume that a physical prop is an object (e.g., a keyhole or stepping stone) placed at some distance from a tool that the user carries (e.g., a hand tool or wand). In that case, the sense of distance to the target governs the natural movement of the user's own body, so distance perception has a significant impact on the performance and naturalness of the interaction. In other words, an error caused by a sense of distance in virtual space that differs from real space can increase the risk of safety accidents, such as collisions and falls, in the actual space, as well as degrade immersion and realism. Therefore, we generalize the real-virtual distance experience into the two interactions shown in Figures 3 and 4 and study ways to improve the degree of distance matching in each situation.

Fig. 3. Distance perception error causes the wrong reach in whole-body movement interaction.

Fig. 4. Distance perception error causes the wrong interaction results in direct manipulation.

The interaction scenario in which the entire body moves is described in detail in Figure 3. The user observes the goal (①), recognizes the distance (②, Dpr) to the target point, and performs the appropriate action (e.g., a short jump, Djr) in real space according to the habitually learned sense of movement. However, due to the tendency to underestimate distance (Dpv) in a virtual environment while wearing an HMD, the distance traveled in virtual space differs from the real situation.

Therefore, when observing and moving toward a target point designed at the same distance (Dr), the resulting movement differs (Djv ≠ Dr) from the content designer's intention. It is thus necessary to find a way to control the erroneous distance perception in the HMD-based virtual space and to correct the interaction so that physical objects (e.g., stepping stones) in real space cause no problems.

Figure 4 describes the interaction situation in which the user directly manipulates a virtual object linked to a real object. Within the eye-hand reference frame, the user continuously observes (Dpr) the position of the manipulated or moving object (e.g., a key) and the position of the target object (e.g., a keyhole). Correct visual distance perception leads to exact proprioceptive interaction results; in the real environment, it is easy to work without error (Emr) because the distance perception (Dpr) is accurate and complemented by the relative hand position.

However, in a virtual environment, distance perception (Dpv) inaccuracy due to the HMD can change the work performance (Emv ≠ Emr). Therefore, while the user recognizes the sense of distance by continuously observing objects in the HMD-based virtual environment, we need a way to continuously control the amount of perceived distance so as to obtain the same feedback as in the real environment. In the end, the virtual environment and HMD should be well calibrated to reduce interaction errors with physical objects (e.g., a key and a keyhole) in real space.

IV. SUBJECT EXPERIMENT

4.1. Experiment Design: Whole Body Movement

In a pilot test conducted before this experiment, we reconfirmed the tendency of users wearing an HMD to underestimate the presented distance. This study investigates the correlation between the amount of behavior (e.g., short jump distance) and the target's distance in a given HMD environment, and finds a method for inducing the intended interaction result from the user.

As shown in Figure 5-(c), the subject observes a target point at a certain distance (e.g., 50 cm, 100 cm) in real space. The subject then practices, with breaks, the action of a standing short jump while looking ahead, learning the sense of distance and the physical movement to the target point within near-body space. For moving the body within the near-body range, the distance value was set at about the 50% level of the national standing short jump statistics of the Korea National Statistical Office (ages 19 to 29, mean 188 cm, standard deviation 29 cm).

Fig. 5. Whole body movement experiment setup: (a) short jump task in virtual environment, (b) provisional adjustment of target position, (c) distance perception learning step in real environment, subject (left), and scene of HMD (right).

As in Figure 5-(a) and Figure 5-(c), the subject is asked to verbally report the perceived distance (Dpv) to the target placed at the designed distance (Dv0 = 85 cm, not disclosed to the subject) and then to move (standing short jump) that perceived distance. As noted in the introduction, in an ideal HMD environment where sensory feedback is presented exactly as in the real environment (RE), the user would feel the same distance (Dpv) as the real distance (Dpr) and move the same distance as learned in reality. In a virtual environment, however, these become inconsistent. If perceived distance is statistically related to jump distance and the virtual content can be adjusted accordingly, then it becomes possible to influence the user's amount of perceived distance (Dpv′) so that the jump (Djv′ = Djr) matches the distance (Dr) of the equipment (e.g., a stepping stone) installed in the real environment, as shown in Figure 5-(b).

To measure the user's movement (Djv) in the experiment, we attached an Oculus Rift touch controller to the subject's shoe, and the subject was allowed to move forward without continuously looking at the shoes. The subjects were experienced HMD users: 10 male and 15 female college students aged between 20 and 30, with an average height of 164.8 cm and an average IPD of 58.9 mm. The virtual environment, built with the Oculus Rift CV1 and Unreal Engine 4, motivated participants to focus on task performance (e.g., landing precisely with one short jump) by staging the task as crossing a stepping bridge placed on a coastal cliff, as shown in Figure 5-(c). Five subjects with relatively small heights were excluded from the data analysis.
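For illustration, the following is a minimal sketch of how the jump distance (Djv) could be derived from the shoe-mounted tracker samples; the function, coordinate convention, and sample values are our assumptions, not the authors' implementation.

```python
import math

def jump_distance_cm(takeoff_pos, landing_pos):
    """Horizontal displacement of a shoe-mounted tracker between
    takeoff and landing, ignoring the vertical (y) axis.

    Positions are (x, y, z) tuples in centimeters in the tracking-system
    frame; the names and sampling scheme are hypothetical.
    """
    dx = landing_pos[0] - takeoff_pos[0]
    dz = landing_pos[2] - takeoff_pos[2]
    return math.hypot(dx, dz)

# Example: tracker samples at takeoff and after landing (cm).
print(jump_distance_cm((0.0, 10.0, 0.0), (2.5, 10.0, 79.8)))  # ~79.8 cm
```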

4.2. Results and Discussions: Whole Body Movement

To investigate the relationship between the perceived distance (PD, Dpv) and the short jump distance (SD, Djv) when wearing the HMD, we analyzed the experimental results by ANOVA and found a significant relationship between the two factors (F = 7.92, P = 0.001 < 0.05). The regression analysis shows a proportional relationship (R² = 0.6511), as shown in Figure 6.

Fig. 6. Significant relationship between moving distance and perceived distance in virtual environment.
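As a sketch of this analysis step, a linear fit of this kind can be reproduced from per-trial (Dpv, Djv) pairs with ordinary least squares. The sample arrays below are invented placeholders; only the form of the analysis (a linear regression with an R² check) follows the paper.

```python
import numpy as np

# Per-trial (perceived distance Dpv, jump distance Djv) in cm.
# These sample values are invented for illustration.
dpv = np.array([55.0, 60.0, 70.0, 75.0, 85.0, 90.0])
djv = np.array([78.0, 81.0, 85.0, 87.5, 92.0, 94.5])

# Ordinary least squares: Djv = b0 + b1 * Dpv.
b1, b0 = np.polyfit(dpv, djv, deg=1)

# Coefficient of determination (R^2).
pred = b0 + b1 * dpv
ss_res = np.sum((djv - pred) ** 2)
ss_tot = np.sum((djv - djv.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"Djv = {b0:.3f} + {b1:.4f} * Dpv,  R^2 = {r2:.4f}")
```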

The regression equation, Djv = 52.464 + 0.4683·Dpv, obtained from the first subject group, can be used to correct the user's behavior according to the installation conditions of the real environment, as shown in Figure 5-(b). For example, suppose a stepping bridge is placed at a distance of 85 cm (Dv0) in both the real and virtual environments. If the user's initial perceived distance, Dpv0, is 60 cm, the expected travel distance (Djv) is about 80 cm according to the regression equation. Therefore, for the user to step onto the real stepping stone as intended, the position (Dv′) of the virtual stepping stone must be changed so that the user's perceived distance (Dpv′) increases by about 10 cm.
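The worked example above can be expressed directly in code. This sketch uses the paper's fitted coefficients (52.464 and 0.4683); the function names and the inversion step are illustrative.

```python
def predicted_jump_cm(dpv_cm):
    """Paper's regression: expected jump distance from perceived distance."""
    return 52.464 + 0.4683 * dpv_cm

def required_perceived_distance_cm(target_jump_cm):
    """Invert the regression: the perceived distance needed so that the
    predicted jump matches the real stepping-stone distance."""
    return (target_jump_cm - 52.464) / 0.4683

dr = 85.0    # real stepping-stone distance (cm)
dpv0 = 60.0  # subject's initial perceived distance (cm)

print(predicted_jump_cm(dpv0))               # ~80.6 cm, short of 85 cm
needed = required_perceived_distance_cm(dr)  # ~69.5 cm
print(needed - dpv0)                         # ~9.5 cm: shift the virtual
# stepping stone so perceived distance grows by roughly 10 cm
```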

To verify the effectiveness of the proposed correction method, we conducted experiments comparing the moving distance error before and after correction under the same conditions with 15 subjects. As shown in Figure 7, a significant number of subjects (seven of 12; three subjects who reported far smaller perceived distances than the 85 cm reference were excluded) showed reduced error in the distance movement task. Comparing the ratio (Djv/Dr) of the movement distance (Djv) in the virtual environment to the movement distance (Dr) designed in the contents, the 12 subjects averaged 0.969 before correction and 1.009 after correction. This shows that the proposed calibration scheme can induce stable interaction with the physical devices of the real environment from within the virtual environment.

Fig. 7. Error comparisons of adjustment method verification test.

4.3. Experiment Design: Direct Manipulation

The direct manipulation scenario assumes that performance error changes because the relative spatial sense within the eye-hand reference frame in the real environment differs from that given by the erroneous visual feedback in the virtual environment. As in Figure 4, to collect user behavior data during direct interaction within near-body space, this experiment had subjects insert a key-bar into a keyhole. This interaction situation assumes that the user continuously observes the workspace, consecutively recognizes the distance, and corrects the behavior. Therefore, we select the world scale factor, among the rendering factors that affect global visual distance perception in the HMD environment, and observe performance under the experimental conditions.

To match the virtual environment to the actual conditions as closely as possible, the physical control elements of the interface device were adjusted to each subject. Considering the essential role of the stereoscopic cue in near-body space, the distance between the two eyes, the inter-pupillary distance (IPD), was measured with a digital pupillary distance measurement device and applied to the adjustment part of the HMD. All variables of interest (the subject's viewpoint, the keyhole, and the key-bar position) were collected in tracking-system space, so all objects involved in the direct interaction maintain the same relative frame in real and virtual space.

As shown in Figure 8-(a), the direct interaction environment lets the subject perform the peg-in-hole task within reach under natural stereoscopic viewing (without prominent viewpoint movement). A three-way pin-art array panel is located at a distance of 120 cm within a range of about 110 degrees around the user's position, and the user works by moving the upper body while observing the target. We used the pin-array panel to simulate a natural passive haptic effect, as shown in Figure 2, so that subjects could perform the peg-in-hole task with a realistic feel in various directions and positions; 2x3 virtual keyholes correspond to each panel. The height of the keyhole is 116.4 cm, based on the average elbow-to-underarm height of users in their early 20s from the Korea National Statistical Office's online database. The hardware environment uses the Oculus Rift CV1 and HTC VIVE as HMDs, and the tracking system is an OptiTrack Trio composed of three IR cameras. The experimental content runs on Unreal Engine 4.15 with an Nvidia GTX 980 graphics processing unit.

Fig. 8. Direct manipulation experiment setup: (a) design of target in near-body space, (b) peg-in-hole task in virtual environment with passive haptic feedback.

This experiment considers how changes in visual distance perception affect work performance in a direct interaction environment in near-body space requiring the eye-hand reference frame. The visual variation presented to the subjects changes the world scale to 98%, 100%, and 102% using the World-to-meters parameter, which can be changed at runtime in Unreal Engine. The within-subjects group consisted of seven males and three females, all of whom had prior experience with HMDs.

4.4. Results and Discussions: Direct Manipulation

In this experiment, eight subjects (five males and three females; average height 171.15 cm, average IPD 62.25 mm) performed the manipulation task three times for each distance with the HTC VIVE. With the Oculus Rift, eight subjects (four males and four females; average height 164.84 cm, average IPD 60.81 mm) performed the same experimental process. An ANOVA test was used to analyze whether changes in the World-to-meters (W2M) element affected the subjects' work performance. Tables 1 and 2 show the analysis results between the World-to-meters parameter and the interaction distance error; there is a statistically significant correlation, with P-values of 0.014 and 0.022, respectively.

Table 1. ANOVA result (HTC VIVE HMD).

Source   DF   Adj SS   Adj MS   F-Value   P-Value
W2M       2    40.08   20.038      4.57     0.014
Error    69   302.64    4.386
Total    71   342.71

Table 2. ANOVA result (Oculus Rift CV1 HMD).

Source   DF   Adj SS   Adj MS   F-Value   P-Value
W2M       2    71.24   35.622      4.06     0.022
Error    69   605.96    8.782
Total    71   677.20
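For reference, a one-way ANOVA of this form can be computed with scipy; the per-condition error arrays below are placeholders, not the measured data.

```python
from scipy import stats

# Interaction distance error (cm) per world-scale condition; the
# values below are placeholders, not the measured data.
err_98 = [3.1, 2.8, 4.0, 3.5, 2.9, 3.7]
err_100 = [2.4, 2.0, 2.9, 2.6, 2.2, 2.5]
err_102 = [1.8, 2.1, 1.5, 2.0, 1.7, 1.9]

# One-way ANOVA across the three World-to-meters conditions.
f_value, p_value = stats.f_oneway(err_98, err_100, err_102)
print(f"F = {f_value:.2f}, p = {p_value:.3f}")
```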

To observe the relationship between the world scale factor and the measured work performance, we performed a regression analysis. Although the R² value is small, the two components tend to be inversely proportional within the experimental range of the HMD environment, as shown in Figures 9 and 10.

Fig. 9. Regression analysis between distance error and world-to-meters parameter (HTC VIVE HMD).

Fig. 10. Regression analysis between distance error and world-to-meters parameter (Oculus Rift CV1 HMD).

Fig. 11. Comparison of interaction distance error before and after calibration (distances are 40 cm, 50 cm, and 60 cm).

Extending the preceding analysis, we assume that the matching task error can be reduced by controlling the world scale element when performing direct interaction within near-body space. Based on the experimental results, the system controls the World-to-meters value in Unreal Engine, and we derive world scale factors of about 106.8% for the HTC VIVE and 107.5% for the Oculus Rift CV1. The increased world scale factor of about 7% is similar for both HMDs currently used in the consumer market. In brief, increasing the world scale by about 7% minimizes the user's errors.
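A sketch of how such a zero-error world scale can be derived from the fitted regression line: solve predicted error = 0 for the world-scale percentage. The paper does not list the fitted coefficients, so the intercept and slope below are hypothetical values shaped to reproduce the reported ~106.8%.

```python
def zero_error_world_scale(intercept, slope):
    """Solve error(w2m) = intercept + slope * w2m = 0 for w2m.
    Valid only for a negative slope (error falls as scale grows)."""
    if slope >= 0:
        raise ValueError("expected an inversely proportional fit")
    return -intercept / slope

# Hypothetical coefficients shaped to reproduce the paper's ~106.8%:
print(zero_error_world_scale(intercept=32.04, slope=-0.30))  # ~106.8
```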

We briefly examined the effectiveness of the proposed calibration method by re-measuring the performance of six subjects before and after the world-scale correction. As shown in Figure 11, the work performance of five of the six subjects (83.3%) improved, so we conclude that world-scale-based distance perception correction is a viable candidate for positively influencing real-virtual distance perception matching.

Even if distance-perception-aware VR content controls an arbitrary virtual tracking space by attaching 6DOF tracking sensors to all objects, this alone cannot completely resolve the incongruity with proprioception adapted to the real environment. In other words, a real-virtual space matching method based only on conventional hardware and software calibration, performed without considering human factors, depends solely on visual feedback that accurately expresses the relative relations between virtual objects. Therefore, it is appropriate to add a correction method in terms of human factors, as in this study.

V. CONCLUSION

We report a study of a control technique that addresses the real-virtual space inconsistency arising because a user wearing an HMD in a virtual environment does not retain the same sense of distance as in the real environment. This study proposes a method of correcting contents using the distance perception within near-body space and the short-range body movement (navigation) characteristics measured in a controlled environment. Furthermore, we propose correcting the world scale to about 107% to express accurate distance when directly manipulating objects in near-body space. However, these results come from initial research with a limited group of subjects and limited hardware and software; future work should extend to various control elements across different HMDs, tracking systems, and 3D visualization engines.

Acknowledgement

This work was supported by the Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (R0118-16-1007, Immersive content experience enabling multi-user interaction based on high-speed and precise tracking in an indoor VR space).

REFERENCES

[1] K. E. Bystrom, W. Barfield, and C. M. Hendrix, "A conceptual model of the sense of presence in virtual environments," Presence: Teleoperators and Virtual Environments, vol. 8, no. 2, pp. 241-244, Apr. 1999.

[2] W. Epstein and S. Rogers, Handbook of Perception and Cognition, Vol. 5: Perception of Space and Motion, San Diego, CA: Academic Press, 1995.

[3] G. Kramida, "Resolving the vergence-accommodation conflict in head-mounted displays," IEEE Transactions on Visualization and Computer Graphics, vol. 22, no. 7, pp. 1912-1931, 2016.

[4] R. S. Renner, B. M. Velichkovsky, and J. R. Helmert, "The perception of egocentric distances in virtual environments - A review," ACM Computing Surveys, vol. 46, no. 2, pp. 1-40, Nov. 2013.

[5] R. Messing and F. H. Durgin, "Distance perception and the visual horizon in head-mounted displays," ACM Transactions on Applied Perception, vol. 2, no. 3, pp. 234-250, Jul. 2005.

[6] S. H. Creem-Regehr, P. Willemsen, A. A. Gooch, and W. B. Thompson, "The influence of restricted viewing conditions on egocentric distance perception: Implications for real and virtual indoor environments," Perception, vol. 34, no. 2, pp. 191-204, Feb. 2005.

[7] B. Li, "Distance perception in virtual environment through head-mounted displays," Ph.D. dissertation, Michigan Technological University, 2017.

[8] B. Altenhoff, "Effects of interaction with an immersive virtual environment on near-field distance estimates," M.S. thesis, Clemson University, 2012.

[9] C. J. Ziemer, J. M. Plumert, J. F. Cremer, and J. K. Kearney, "Estimating distance in real and virtual environments: Does order make a difference?" Attention, Perception & Psychophysics, vol. 71, no. 5, pp. 1095-1106, Jul. 2009.

[10] U. Yang, N. G. Kim, and K. H. Kim, "Augmented system for immersive 3D expansion and interaction," ETRI Journal, vol. 38, no. 1, pp. 149-158, Feb. 2016.

[11] S. Roy, O. Mazumder, D. Chatterjee, and K. Chakravarty, "Quantification of postural balance using augmented reality based environment: A pilot study," presented at the 2017 IEEE Sensors, Oct. 2017.

[12] S. Roy, O. Mazumder, K. Chakravarty, and D. Chatterjee, "Model to understand neural interplay involving proprioceptive adaptation in lower limb during dual task paradigm," presented at the 2018 IEEE Sensors, Oct. 2018.

[13] S. Mukherjee, R. Saini, P. Kumar, P. P. Roy, D. P. Dogra, and B. G. Kim, "Fight detection in hockey videos using deep network," Journal of Multimedia Information System, vol. 4, no. 4, pp. 225-232, 2017.

[14] A. K. Dash, S. K. Behera, D. P. Dogra, and P. P. Roy, "Designing of marker-based augmented reality learning environment for kids using convolutional neural network architecture," Displays, vol. 55, pp. 46-54, Oct. 2018.

[15] E. M. Howlett, "Wide-angle orthostereo," in Proceedings of Stereoscopic Displays and Applications, vol. 1256, Sep. 1990.

[16] M. Azmandian, T. Grechkin, M. Bolas, and E. Suma, "The redirected walking toolkit: A unified development platform for exploring large virtual environments," in Proceedings of the 2nd IEEE Workshop on Everyday Virtual Reality, pp. 9-14, Mar. 2016.

[17] F. C. Huang, K. Chen, and G. Wetzstein, "The light field stereoscope: Immersive computer graphics via factored near-eye light field displays with focus cues," ACM Transactions on Graphics, vol. 34, no. 4, pp. 1-12, Aug. 2015.

[18] H. Hua, "Optical methods for enabling focus cues in head-mounted displays for virtual and augmented reality," in Proceedings of SPIE Three-Dimensional Imaging, Visualization, and Display, May 2017.

[19] S. Masnadi, K. P. Pfeil, J. T. Sera-Josef, and J. J. LaViola, "Field of view effect on distance perception in virtual reality," in Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces, pp. 542-543, Apr. 2021.

[20] D. A. Bowman, "Interaction techniques for common tasks in immersive virtual environments: Design, evaluation, and application," Ph.D. dissertation, Georgia Institute of Technology, Atlanta, USA, 1999.

[21] L. Kohli, "Redirected touching: Warping space to remap passive haptics," in Proceedings of the IEEE Symposium on 3D User Interfaces, pp. 129-130, 2010.

Authors

Ungyeon Yang


received his BS degree in computer science and engineering from Chungnam National University, Daejeon, Rep. of Korea, in 1997. He received his MS and PhD degrees from Pohang University of Science and Technology (POSTECH), Rep. of Korea, in 2000 and 2003, respectively. Since 2003, he has been a principal researcher with Electronics and Telecommunications Research Institute (ETRI). Since 2021, he has been a professor with Information and Communication Technologies Major, the ETRI School, University of Science and Technology (UST). His research interests include wearable display, information visualization, 3D user interfaces, human factors, haptics and multimodal user interaction in the field of virtual/mixed reality and ergonomics.

Nam-Gyu Kim


received his BS degree in computer science from KAIST, Daejeon, Rep. of Korea, in 1995. He received his MS and PhD degrees in computer science and engineering from POSTECH, Pohang, Rep. of Korea, in 1996 and 2005, respectively. He joined the Advanced Telecommunications Research Institute International (ATR), Kyoto, Japan, in 2001 and then went on to work for the Korea Telecom (KT) research center, Rep. of Korea, in 2006. Since 2009, he has been an associate professor with the Applied Software Engineering Major, Division of Creative Software Engineering, Dong-Eui University, Busan, Rep. of Korea. His research interests include 3D human-computer interaction in multimedia systems, computer vision, and visual information processing in the field of virtual and augmented reality.