
Dynamic visibility checking for vision-based motion planning


Abstract:


An important problem in position-based visual servoing (PBVS) is to guarantee that a target will remain within the field of view for the duration of the task. In this paper, we propose a dynamic visibility checking algorithm that, given a parametrized trajectory of the camera, determines if an arbitrary 3D target will remain within the field of view. We reformulate this problem as the problem of determining if the 3D coordinates of the target collide with the frustum formed by the camera field of view during the camera trajectory. To solve this problem, our algorithm computes and compares the shortest distance between the target and the frustum with the length of the trajectory described by the target in the camera's coordinate frame. Furthermore, we demonstrate that our algorithm can be combined with path planning algorithms and, in particular, probabilistic roadmaps (PRM). Results suggest that our algorithm is computationally efficient even when the target moves in the vicinity of image borders. In simulations, we use our dynamic visibility checking algorithm in conjunction with a PRM to plan collision free paths while providing the guarantee that a specific target will not leave the field of view.
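The abstract's core test, comparing the shortest distance from the target to the frustum boundary against the length of the trajectory traced by the target in the camera's coordinate frame, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the representation of the frustum as inward-oriented planes, the function names, and the bisection fallback when the distance test is inconclusive are all assumptions.

```python
import numpy as np

def signed_plane_distances(p, planes):
    """Signed distances from a 3D point p to each frustum plane.
    planes: (k, 4) array of rows [nx, ny, nz, d] with inward-pointing
    unit normals, so p lies inside the frustum iff all distances > 0."""
    return planes @ np.append(p, 1.0)

def visible_over_interval(target_cam, path_len_bound, planes, t0, t1, eps=1e-6):
    """Conservative dynamic visibility check over [t0, t1] (sketch).
    target_cam(t): target coordinates in the camera frame at time t.
    path_len_bound(t0, t1): upper bound on the arc length traced by the
    target in the camera frame over [t0, t1]."""
    d0 = signed_plane_distances(target_cam(t0), planes).min()
    d1 = signed_plane_distances(target_cam(t1), planes).min()
    if d0 <= 0 or d1 <= 0:
        return False            # target outside the field of view at an endpoint
    # If the target cannot travel far enough to reach the frustum boundary
    # from either endpoint, it remains visible on the whole interval.
    if path_len_bound(t0, t1) < d0 + d1:
        return True
    if t1 - t0 < eps:
        return True             # interval below resolution; accept
    tm = 0.5 * (t0 + t1)        # inconclusive: bisect and recurse
    return (visible_over_interval(target_cam, path_len_bound, planes, t0, tm, eps)
            and visible_over_interval(target_cam, path_len_bound, planes, tm, t1, eps))
```

Used as a validity check on candidate PRM edges, a routine of this shape would let a planner discard any local path along which the target could leave the field of view, which matches the abstract's description of combining the checker with probabilistic roadmaps.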
Date of Conference: 19-23 May 2008
Date Added to IEEE Xplore: 13 June 2008
Print ISSN: 1050-4729
Conference Location: Pasadena, CA, USA

I. INTRODUCTION

Whether it is in the structured environments of assembly lines or in the unstructured environments of households, the repertoire of robotic tasks has consistently expanded over the last few decades. As the level of autonomy of robots increases, so does the reliance on sensors that provide feedback to robot controllers. Among sensing devices, cameras are among the most popular in the robotics community. In particular, motion control based on visual feedback, also known as visual servoing [1], [2], has consistently been at the forefront of robotics research.

The bulk of the research in visual servoing has focused on a specific architecture known as image-based visual servoing (IBVS). Despite the advantages of IBVS, its velocity control aspect is not suitable for the majority of industrial robots. Typically, industrial robots operate through proprietary interfaces that only allow position commands in joint space or Cartesian space. A more suitable visual servoing architecture for such robots is position-based visual servoing (PBVS). In PBVS, a command is defined by the Cartesian parameters of a desired position, and visual feedback is used to assess the error between the current parameters and the desired ones.

In general, if visual feedback is used to control the motion of a robot, it is necessary to keep specific targets, markers or features within the field of view. Whereas this issue is implicitly addressed by IBVS, the same cannot be said of PBVS. In fact, one of the most cited drawbacks of PBVS is the inability to guarantee that a target or scene will remain within the field of view [3]. This deficiency is often sufficient to cause the failure of a task, especially if vision tracking is required. For example, in [4], a Kalman filter is used to track the pose of a target with respect to the coordinate frame of the camera.

References

[1] K. Hashimoto, "A review on vision-based control of robot manipulators," Advanced Robotics, vol. 17, no. 10, pp. 969-991, 2003.
[2] S. Hutchinson, G. D. Hager, and P. I. Corke, "A tutorial on visual servo control," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651-670, October 1996.
[3] F. Chaumette, "Potential problems of stability and convergence in image-based and position-based visual servoing," in The Confluence of Vision and Control, ser. Lecture Notes in Control and Information Sciences, D. Kriegman, G. Hager, and A. Morse, Eds. Springer-Verlag, 1998, vol. 237, pp. 66-78.
[4] W. J. Wilson, C. C. Williams Hulls, and G. S. Bell, "Relative end-effector control using cartesian position based visual servoing," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 684-696, October 1996.
[5] F. Schwarzer, M. Saha, and J.-C. Latombe, "Adaptive dynamic collision checking for single and multiple articulated robots in complex environments," IEEE Transactions on Robotics, vol. 21, no. 3, pp. 338-353, June 2005.
[6] B. Espiau, F. Chaumette, and P. Rives, "A new approach to visual servoing in robotics," IEEE Transactions on Robotics and Automation, vol. 8, no. 3, pp. 313-326, June 1992.
[7] G. Chesi, K. Hashimoto, D. Prattichizzo, and A. Vicino, "Keeping features in the field of view in eye-in-hand visual servoing: A switching approach," IEEE Transactions on Robotics, vol. 20, no. 5, pp. 908-913, October 2004.
[8] K. Hashimoto and T. Noritsugu, "Potential switching control in visual servo," in Proceedings of the 2000 IEEE International Conference on Robotics and Automation, April 2000, pp. 2765-2770.
[9] N. Mansard and F. Chaumette, "A new redundancy formalism for avoidance in visual servoing," in 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, Canada, August 2005, pp. 468-474.
[10] P. I. Corke and S. A. Hutchinson, "A new partitioned approach to image-based visual servo control," IEEE Transactions on Robotics and Automation, vol. 17, no. 4, pp. 507-515, August 2001.
[11] E. Malis, F. Chaumette, and S. Boudet, "2-1/2-D visual servoing," IEEE Transactions on Robotics and Automation, vol. 15, no. 2, pp. 238-250, April 1999.
[12] Y. Mezouar and F. Chaumette, "Path planning for robust image-based control," IEEE Transactions on Robotics and Automation, vol. 18, no. 4, pp. 534-549, August 2002.
[13] F. Schramm, F. Geffard, G. Morel, and A. Micaelli, "Calibration free image point path planning simultaneously ensuring visibility and controlling camera path," in Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, April 2007, pp. 2074-2079.
[14] B. Thuilot, P. Martinet, L. Cordesses, and J. Gallice, "Position based visual servoing: keeping the object in the field of vision," in Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, May 2002, pp. 1624-1629.
[15] P. I. Corke and M. C. Good, "Dynamic effects in visual closed-loop systems," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 671-683, October 1996.
[16] E. Trucco and A. Verri, Introductory Techniques for 3-D Computer Vision. Prentice Hall, 1998.
[17] L. E. Kavraki, P. Svestka, J.-C. Latombe, and M. H. Overmars, "Probabilistic roadmaps for path planning in high-dimensional configuration spaces," IEEE Transactions on Robotics and Automation, vol. 12, no. 4, pp. 566-580, August 1996.
[18] D. Hsu, J.-C. Latombe, and H. Kurniawati, "On the probabilistic foundation of probabilistic roadmap planning," International Journal of Robotics Research, vol. 25, no. 7, pp. 627-643, July 2006.
[19] D. Nieuwenhuisen and M. H. Overmars, "Useful cycles in probabilistic roadmap graphs," in Proceedings of the 2004 IEEE International Conference on Robotics and Automation, New Orleans, LA, April 2004, pp. 446-452.
[20] J. Y. Yen, "Finding the k shortest loopless paths in a network," Management Science, vol. 17, pp. 712-716, 1971.
