I. Introduction
Hand exoskeletons allow intuitive interaction with virtual or remote environments. To determine contact with, or penetration of, a remote or virtual environment, to position the slave manipulator or virtual human hand, and to compute force feedback, the position and orientation of the human hand and the joint angles of the fingers must be acquired. Several dexterous haptic interfaces use encoders on mechanical joints fixed to the joints of the fingers [1] or sensors in a data glove [2], [3], which often results in bulky interface designs when force feedback is also required. We present an approach that accurately estimates the pose and configuration of the human hand from only the positions of well-chosen attachment points on the exoskeleton.
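The paper's full estimation method is developed in later sections; as a minimal illustration of the underlying idea, the rigid-body part of such an estimate (the hand's position and orientation, before any finger joint angles) can be obtained by least-squares registration of the measured attachment-point positions against their known locations in a hand model, e.g. with the Kabsch algorithm. The function and point values below are hypothetical, not taken from the paper.

```python
import numpy as np

def estimate_rigid_pose(model_pts, measured_pts):
    """Estimate rotation R and translation t that best map model_pts onto
    measured_pts in a least-squares sense (Kabsch algorithm).
    Both inputs are (N, 3) arrays of corresponding point positions."""
    mc = model_pts.mean(axis=0)
    sc = measured_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - mc).T @ (measured_pts - sc)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against returning a reflection.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t

# Example: recover a known pose from three non-collinear attachment points
# (illustrative coordinates in meters).
model = np.array([[0.0, 0.0, 0.0], [0.08, 0.0, 0.0], [0.0, 0.05, 0.02]])
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.05, 0.2])
measured = model @ R_true.T + t_true
R_est, t_est = estimate_rigid_pose(model, measured)
```

A similar least-squares fit, extended with the finger kinematics, underlies configuration estimation from attachment-point positions.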