In the years to come, users of virtual reality (VR) will be able to interact between the real and virtual realms even more easily and naturally, and in real time. A scene from the real world can be transferred into virtual reality, and feedback from there can in turn be reflected back into the real situation. Fraunhofer Heinrich Hertz Institute HHI will be exhibiting an innovative X-reality technology that makes this possible at CeBIT, Hall 27, Booth E78.
Mixed-reality applications create new possibilities wherever collaboration is required, even over vast distances, for example in the field of remote assistance. Going forward, if an installer needs to fix something on site, a colleague in the office can use VR glasses to obtain a virtual 3D picture of the situation and can even intervene virtually, without touching the scene, by showing the installer the correct handles or components.
The solution we have developed for this purpose connects a simulated world with the real world in real time and high quality, opening up new perspectives for collaboration. Physical interaction without disruptive controllers is more natural and comfortable. The solution adapts more flexibly to relevant characteristics, such as the size of the user, and increases immersion. At the same time, it can reduce the motion-sickness symptoms that VR scenarios often cause.
Paul Chojecki, Project Manager at the Fraunhofer HHI
High-Resolution 3D Object and Body Acquisition for Mixed-Reality Interactions
The process rests on two technologies. In the real world, eight cameras (four stereo pairs) record the scene from all sides and generate depth maps of the scene at up to 30 Hz. Dynamic movements and gestures are also detected. Algorithms then fuse this data, which is encoded and transmitted in real time, together with the associated 3D textures, to the VR station.
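The capture-side pipeline described above can be illustrated with a minimal sketch: each stereo pair contributes a depth map, the maps are fused per pixel, and the result is bundled with its texture into a frame for streaming. All names, the min-over-cameras fusion heuristic, and the frame layout are assumptions for illustration, not the Fraunhofer HHI implementation.

```python
# Hypothetical sketch of the capture pipeline: fuse per-pair depth
# maps, then pack a frame for real-time transmission to the VR station.

def fuse_depth_maps(maps, invalid=0.0):
    """Fuse depth maps (flattened, one per stereo pair) by keeping the
    nearest valid depth observed for each pixel."""
    fused = []
    for pixel_depths in zip(*maps):
        valid = [d for d in pixel_depths if d != invalid]
        fused.append(min(valid) if valid else invalid)
    return fused

def pack_frame(fused_depth, texture, timestamp_ms):
    """Bundle fused geometry with its texture for streaming."""
    return {"t": timestamp_ms, "depth": fused_depth, "texture": texture}

# Four stereo pairs, each contributing a tiny (flattened) depth map;
# 0.0 marks pixels where stereo matching failed for that pair.
pairs = [
    [1.8, 0.0, 2.4],
    [1.9, 3.1, 0.0],
    [0.0, 3.0, 2.5],
    [2.0, 3.2, 2.6],
]
frame = pack_frame(fuse_depth_maps(pairs), texture=b"...", timestamp_ms=33)
print(frame["depth"])  # [1.8, 3.0, 2.4]
```

At 30 Hz, a real system would of course fuse full-resolution maps on the GPU and compress the geometry before transmission; the per-pixel fusion idea is the same.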
Meanwhile, on the VR side, another 3D camera records the VR user. Fraunhofer HHI algorithms enable gesture interpretation and 3D body detection, allowing the user to interact naturally in the VR scene without disruptive controllers or markers. The user is represented there as a movable full-body avatar and sees his or her own body and gestures in virtual space. “Only the combination of the two technologies enables a unique solution for new mixed-reality interaction and collaboration scenarios,” stated Chojecki.
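Controller-free interaction of the kind described above ultimately reduces to interpreting 3D joint positions. The toy classifier below shows the shape of such a step, telling an open hand from a fist by fingertip-to-palm distances; the joint representation and threshold are assumptions, not the HHI algorithm.

```python
# Illustrative gesture-interpretation sketch (assumed representation):
# classify a hand pose from 3D palm and fingertip positions in metres.
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_hand(palm, fingertips, open_threshold=0.08):
    """Call the hand 'open' if at least three fingertips are far from
    the palm, 'fist' otherwise."""
    extended = sum(1 for tip in fingertips if dist(palm, tip) > open_threshold)
    return "open" if extended >= 3 else "fist"

palm = (0.0, 0.0, 0.0)
open_hand = [(0.09, 0.0, 0.0), (0.10, 0.02, 0.0), (0.09, -0.02, 0.0),
             (0.08, 0.04, 0.0), (0.07, 0.05, 0.01)]
print(classify_hand(palm, open_hand))  # open
```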
Feedback from the virtual world is brought into the real scene by a projection. Special image-processing algorithms developed by the Fraunhofer HHI researchers realize this projected augmentation. The algorithms ensure that controls and cues are displayed with visual accuracy even when the surfaces onto which they are projected tilt or move.
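One standard way to keep a projected overlay registered on a tilting surface is to re-map its coordinates through a planar homography estimated for the surface's current pose. The sketch below applies a fixed example matrix to an overlay point; the matrix, names, and per-frame update strategy are assumptions, not the specific HHI algorithms.

```python
# Minimal projection-correction sketch: warp overlay coordinates
# through a 3x3 planar homography so cues stay on a moving surface.

def apply_homography(H, point):
    """Map a 2D point (x, y) through the 3x3 homography H."""
    x, y = point
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Identity plus a small shear and shift: the surface has tilted slightly.
H = [[1.0, 0.1, 5.0],
     [0.0, 1.0, -2.0],
     [0.0, 0.0, 1.0]]
print(apply_homography(H, (10.0, 20.0)))  # (17.0, 18.0)
```

In a live setup, H would be re-estimated every frame from the tracked surface geometry before the overlay is rendered.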
The X-reality solution lends itself to a wide range of applications: besides remote assistance, the process can be used for telecommunications and telepresence, human-robot interaction, gaming, and rapid prototyping. It would, for example, allow two spatially separated individuals to play a board game with each other.
At the Fraunhofer booth at CeBIT, visitors can try to solve a 3D puzzle with assistance from the virtual world. In the demo, the real object is projected live into the VR environment and explained to the VR user, who can then respond virtually and have those reactions reflected back into the real scenario.