1 Introduction
Immersive Virtual Reality (VR) must actively engage the user's senses to make them feel truly part of the virtual world. One important aspect of achieving this objective is the synchronization of motion and sensory feedback between human users and their virtual avatars. Whenever the user moves a limb, the same motion should be replicated by the avatar; similarly, whenever the avatar touches a virtual object, the user should feel the corresponding haptic sensation. Di Luca et al. [1] recently studied the range of tolerable visuohaptic asynchronies when touching an object. Participants could not reliably detect the asynchrony if haptic feedback was presented less than 50 ms after the view of the contact; when the haptic feedback preceded the visual one, however, the tolerated asynchrony was only 15 ms. These results suggest rather stringent requirements for haptic-enabled VR systems. Achieving this visuohaptic synchronization is also important for the perception of object properties. For example, Di Luca et al. [2] showed that a delay between the visual and force information introduces a bias in the perception of compliance, and Knörlein et al. [3] showed that a similar effect also holds when interacting with virtual objects.
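
As a rough illustration of how stringent these constraints are, the minimal sketch below checks whether a measured visuohaptic offset falls inside the asymmetric detection-threshold window reported by Di Luca et al. [1] (haptic feedback no more than 15 ms before and no more than 50 ms after the visual contact). The function name and constants are illustrative assumptions, not part of any cited system.

```python
# Asymmetric visuohaptic asynchrony window from Di Luca et al. [1]:
# haptic feedback should arrive no more than 15 ms before and no more
# than 50 ms after the visual contact to stay below detection threshold.
# Names and structure are illustrative only.

HAPTIC_LEAD_TOLERANCE_MS = 15.0   # haptics presented before the visual contact
HAPTIC_LAG_TOLERANCE_MS = 50.0    # haptics presented after the visual contact


def asynchrony_is_tolerable(t_haptic_ms: float, t_visual_ms: float) -> bool:
    """Return True if the haptic-visual offset stays within the
    asymmetric window that participants could not reliably detect."""
    offset = t_haptic_ms - t_visual_ms  # negative: haptics lead the visuals
    return -HAPTIC_LEAD_TOLERANCE_MS <= offset <= HAPTIC_LAG_TOLERANCE_MS


if __name__ == "__main__":
    # A 40 ms haptic lag is tolerable; a 20 ms haptic lead is not.
    print(asynchrony_is_tolerable(t_haptic_ms=140.0, t_visual_ms=100.0))  # True
    print(asynchrony_is_tolerable(t_haptic_ms=80.0, t_visual_ms=100.0))   # False
```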