
Dual-Filtering for On-Line Simultaneously Estimate Weights and Phase Parameter of Probabilistic Movement Primitives for Human-Robot Collaboration



Abstract:

Probabilistic Movement Primitives (ProMPs) are an essential framework for robot Learning from Demonstration (LfD). They have been successfully applied in the robotics field to tasks such as skill acquisition and Human-Robot Collaboration (HRC). In this paper, we focus on the adaptability of ProMPs in the HRC scenario, where adaptability allows the robot to predict the future movement of its human partner from the observed human movement and to plan its own movement accordingly. Most existing applications of ProMPs in HRC either estimate only the weights on-line, without estimating the phase parameter, or rely solely on the prior distribution of the phase parameter. These methods can therefore misinterpret the basis matrix when the divergence between the prior and posterior distributions of the phase parameter becomes large, causing the estimate of the weights to diverge. In this paper, we propose a Dual-Filtering method for ProMPs that simultaneously estimates the weights and the phase parameter on-line. Preliminary experimental results demonstrate that the proposed method provides better prediction performance and a more accurate estimate of the phase parameter than previous works.
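To make the dual-filtering idea concrete, the following is a minimal sketch, not the authors' implementation: a ProMP with Gaussian radial basis functions over a normalized phase, where a Kalman filter updates the belief over the weights while a discrete Bayes filter over candidate phase rates updates the belief over the phase. All names (`basis`, `DualFilter`), the basis width, the candidate-rate grid, and the 1-D observation model are illustrative assumptions.

```python
import numpy as np

def basis(z, n_basis=8, width=0.02):
    """Normalized Gaussian RBF features over phase z in [0, 1] (illustrative)."""
    centers = np.linspace(0.0, 1.0, n_basis)
    phi = np.exp(-(z - centers) ** 2 / (2.0 * width))
    return phi / phi.sum()

class DualFilter:
    """Sketch of dual filtering: a Kalman filter over ProMP weights plus a
    discrete Bayes filter over candidate phase rates (1-D observations)."""

    def __init__(self, mu_w, Sigma_w, rates, obs_var=1e-2):
        self.mu_w = mu_w.copy()            # mean of weight belief
        self.Sigma_w = Sigma_w.copy()      # covariance of weight belief
        self.rates = rates                 # candidate phase velocities
        self.log_b = np.zeros(len(rates))  # log belief over the rates
        self.obs_var = obs_var
        self.t = 0

    def update(self, y):
        self.t += 1
        # 1) Phase filter: score each candidate rate by the likelihood of y
        #    under the current (marginalized) weight belief.
        for i, r in enumerate(self.rates):
            z = min(r * self.t, 1.0)
            phi = basis(z)
            mean = phi @ self.mu_w
            var = phi @ self.Sigma_w @ phi + self.obs_var
            self.log_b[i] += -0.5 * ((y - mean) ** 2 / var
                                     + np.log(2.0 * np.pi * var))
        self.log_b -= self.log_b.max()     # keep log belief numerically stable
        # 2) Weight filter: Kalman update at the MAP phase estimate.
        z_hat = min(self.rates[np.argmax(self.log_b)] * self.t, 1.0)
        phi = basis(z_hat)
        S = phi @ self.Sigma_w @ phi + self.obs_var
        K = self.Sigma_w @ phi / S
        self.mu_w = self.mu_w + K * (y - phi @ self.mu_w)
        self.Sigma_w = self.Sigma_w - np.outer(K, phi @ self.Sigma_w)
        return z_hat
```

A grid over phase rates is only one way to realize the phase filter; the key point the abstract makes is that the phase belief is updated from the observations themselves rather than fixed to its prior, so the basis matrix is evaluated at a phase consistent with the data.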
Date of Conference: 27 September 2021 - 01 October 2021
Date Added to IEEE Xplore: 16 December 2021
Conference Location: Prague, Czech Republic

I. Introduction

The application of robots has revolutionized manufacturing over the past 60 years, but two limitations still hinder the deployment of conventional robots: 1) programming a robot to be a functional component of the manufacturing process requires expert skills; 2) conventional robots must be caged to ensure safety. To address these limitations, which have prevented the wider application of robots, a framework named Learning from Demonstration (LfD) has been proposed. The LfD framework allows the robot to extract skills from human demonstrations and to reproduce or generalize the movement during task execution. This property dramatically reduces the dependence on expert skills in robot programming by allowing non-experts to program the robot via demonstration. Furthermore, LfD also plays an important role in Human-Robot Collaboration (HRC), in which the robot is expected to be freed from the cage and to interact with the human safely and efficiently. In the HRC setting, the LfD framework extracts human movement patterns and the interaction patterns between the collaborators from the demonstrations. During task execution, the extracted patterns give the robot the ability to react accordingly to the movement of its human partner.

