Virtual touch and feel

Virtual Reality (VR) is growing rapidly and is used across countless industries, including:
- automotive,
- healthcare,
- retail,
- tourism,
- research and development,
- entertainment,
- and more.
While VR creates a completely simulated environment, Augmented Reality (AR) superimposes virtual objects on the real world. Although AR/VR technologies have become increasingly realistic, they still do not provide haptic (tactile) or force feedback to the user.
In the world of virtual and augmented reality, images and sound remain paramount: the methods for providing tactile feedback (the sense of touch) in commercial media are far less sophisticated than graphic and audio feedback. Yet without the sense of touch, experiences ultimately feel empty, virtual realities seem fake, and human-computer interfaces become unintuitive.
A true revolution awaits us in the years ahead. Soon, tactile feedback will allow technology enthusiasts to interact with distant objects or people as if they were right next door. This tangible or “tactile” virtual experience is the long-term vision behind the EU-funded H-Reality project.
The ambition of the H-Reality project, as stated on the project website, is to “achieve a sensory experience where digital 3D shapes and textures are manifested in real space using modulated, focused ultrasound, ready to be sensed by the unconstrained hand, where next-generation wearable touch rings provide directional vibrotactile stimulation to inform users of object dynamics, and where computational renderings of specific materials can be distinguished based on their surface properties.”
First, the team investigated the mechanics of touch. The sense of touch involves four major submodalities: vibration/pressure, temperature, itch, and pain, which transmit neuroelectrical signals to the brain via the supplying nerves, a network collectively referred to as the somatosensory system. Even so, as one of the researchers notes, “the translation of mechanical signals into touch perception is still unclear.”
The team also discovered a universal law that explains the favorable distribution of vibration-sensitive mechanoreceptors in many mammals. Building on this work, they identified ways to digitally render shapes and textures using non-contact tactile elements that rely on focused ultrasound and contact tactile elements based on wearable vibrotactile devices.
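To make the ultrasound side of this more concrete, here is a minimal, illustrative sketch of how a mid-air haptic focal point is typically produced: each transducer in a flat array is driven with a phase offset that compensates for its distance to the desired focal point, and the inaudible carrier is amplitude-modulated at a low frequency that skin mechanoreceptors can feel. The array geometry, the 40 kHz carrier, and the 200 Hz modulation are typical published values for this class of device, not H-Reality project specifics.

```python
import numpy as np

# Illustrative sketch only: NOT the H-Reality rendering pipeline.
# Idea: phase-align an ultrasound array at a focal point, then amplitude-
# modulate the carrier at a frequency the skin can perceive.

SPEED_OF_SOUND = 343.0      # m/s in air at ~20 degrees C
CARRIER_FREQ = 40_000.0     # Hz, common ultrasonic transducer frequency (assumed)
MODULATION_FREQ = 200.0     # Hz, within the skin's vibration-sensitive range (assumed)

def transducer_positions(n=16, pitch=0.0105):
    """Positions of an n x n flat array in the z = 0 plane, shape (n*n, 3)."""
    xs = (np.arange(n) - (n - 1) / 2) * pitch
    return np.array([(x, y, 0.0) for x in xs for y in xs])

def focus_phases(positions, focal_point):
    """Per-transducer phase lead that compensates the travel time to the focus,
    so every element's wave arrives at the focal point in phase."""
    dists = np.linalg.norm(positions - focal_point, axis=1)
    wavelength = SPEED_OF_SOUND / CARRIER_FREQ
    return (2 * np.pi * dists / wavelength) % (2 * np.pi)

def drive_signals(positions, focal_point, t):
    """Amplitude-modulated carrier for each transducer at time samples t."""
    phases = focus_phases(positions, focal_point)              # (elements,)
    carrier = np.sin(2 * np.pi * CARRIER_FREQ * t[None, :] + phases[:, None])
    envelope = 0.5 * (1 + np.sin(2 * np.pi * MODULATION_FREQ * t))
    return envelope[None, :] * carrier                         # (elements, samples)

if __name__ == "__main__":
    pos = transducer_positions()
    focal = np.array([0.0, 0.0, 0.20])        # a point 20 cm above the array
    t = np.arange(0, 0.01, 1 / 1_000_000)     # 10 ms at 1 MHz sample rate
    print(drive_signals(pos, focal, t).shape) # (256, 10000)
```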
They then began combining these contact and non-contact tactile elements into prototypes of immersive VR applications to demonstrate the potential of this technology.
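As an illustration of what such a combination might look like in software, the following hypothetical sketch switches between the two channels based on how close a tracked fingertip is to a virtual surface: mid-air ultrasound cues while the hand approaches, and wearable-ring vibration at the moment of contact. The class names, thresholds, and interfaces are invented for illustration and do not represent the project's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

# Hypothetical sketch of blending contact and non-contact haptic cues.
CONTACT_THRESHOLD = 0.005   # 5 mm: treat the fingertip as touching the surface (assumed)
MIDAIR_RANGE = 0.10         # within 10 cm, project an ultrasound focal point (assumed)

@dataclass
class HapticCues:
    ultrasound_focus: Optional[np.ndarray]  # where to focus the mid-air array, or None
    ring_amplitude: float                   # 0..1 drive level for the wearable ring

def select_cues(fingertip: np.ndarray, surface_point: np.ndarray) -> HapticCues:
    """Pick which haptic channel to drive from the fingertip-to-surface distance."""
    distance = float(np.linalg.norm(fingertip - surface_point))
    if distance < CONTACT_THRESHOLD:
        # At contact, the wearable ring carries the cue; drive it harder the
        # closer the fingertip gets to (or past) the virtual surface.
        level = min(1.0, 0.3 + (CONTACT_THRESHOLD - distance) * 200)
        return HapticCues(ultrasound_focus=None, ring_amplitude=level)
    if distance < MIDAIR_RANGE:
        # While approaching, project an ultrasound focal point onto the
        # fingertip so the surface can be felt before it is touched.
        return HapticCues(ultrasound_focus=fingertip.copy(), ring_amplitude=0.0)
    return HapticCues(ultrasound_focus=None, ring_amplitude=0.0)

if __name__ == "__main__":
    surface = np.array([0.0, 0.0, 0.20])
    for z in (0.32, 0.25, 0.203, 0.199):
        print(z, select_cues(np.array([0.0, 0.0, z]), surface))
```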
What might virtual touch be used for in the future? The authors believe that their mixed tactile interaction paradigm can revolutionize the way users interact with data in a wide range of applications. Ultimately, the team hopes to completely transform online interactions. Example applications include operating dangerous machinery from a safe distance or performing surgical procedures remotely.