
The complexities of grasping in the wild


Abstract:

The recent ubiquity of high-framerate (120 fps and higher) handheld cameras creates the opportunity to study human grasping at a greater level of detail than normal-speed cameras allow. We first collected 91 slow-motion interactions with objects in a convenience store setting. We then annotated the actions through the lenses of various existing manipulation taxonomies. We found that manipulation, particularly the process of forming a grasp, is complicated and proceeds quickly. Our dataset shows that there are many ways people deal with clutter in order to form a strong grasp of an object. It also reveals several errors and how people recover from them. Though annotating motions in detail is time-consuming, the annotation systems we used nevertheless leave out important aspects of understanding manipulation actions, such as how the environment can function as a "finger" of sorts, how different parts of the hand can be involved in different grasping tasks, and high-level intent.
Date of Conference: 15-17 November 2017
Date Added to IEEE Xplore: 08 January 2018
Electronic ISSN: 2164-0580
Conference Location: Birmingham, UK

I. Introduction

For roboticists working on dexterous robots, observation of human manipulation continues to be an important way to understand the problem of grasping (e.g., [1]–[3]). One way of observing human grasping in detail is to use high-framerate video. Due to the growing ubiquity of high-framerate video cameras in phones, it is now feasible to capture a large number of grasping actions "in the wild", i.e., in everyday settings such as cluttered workspaces. The large number of actions and the everyday setting allow behaviors such as mistakes to be captured, and the high framerate reveals detailed finger movement and the making and breaking of contact.
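
As a rough illustration of what frame-level annotation of such footage involves, the sketch below shows one possible way to represent a labeled interval in a 120 fps clip and to convert its length to seconds. This is a hypothetical Python sketch, not the paper's actual annotation format; the names (GraspEvent, clip_id, the label strings) are assumptions made for illustration.

    # Hypothetical sketch: one way to represent a frame-level annotation of a
    # high-framerate grasp video. The data format here is an assumption; the
    # paper does not specify one.
    from dataclasses import dataclass

    FPS = 120  # capture rate discussed above (120 fps and higher)

    @dataclass
    class GraspEvent:
        """A labeled interval in a slow-motion clip, e.g. grasp formation."""
        clip_id: str      # which recorded interaction the event belongs to
        label: str        # annotation label drawn from some manipulation taxonomy
        start_frame: int  # first frame of the event
        end_frame: int    # last frame of the event (inclusive)

        def duration_s(self) -> float:
            """Event duration in seconds at the capture frame rate."""
            return (self.end_frame - self.start_frame + 1) / FPS

    # A 30-frame grasp-formation event spans only 0.25 s at 120 fps, which is
    # why normal-speed (e.g., 30 fps) video can miss the details of forming a grasp.
    event = GraspEvent("store_clip_007", "form-grasp", start_frame=240, end_frame=269)
    print(f"{event.label}: {event.duration_s():.2f} s")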

References

[1] M. Khansari, E. Klingbeil, and O. Khatib, "Adaptive human-inspired compliant contact primitives to perform surface-surface contact under uncertainty," The International Journal of Robotics Research, vol. 35, no. 13, pp. 1651-1675, 2016.
[2] T. Feix, I. M. Bullock, and A. M. Dollar, "Analysis of human grasping behavior: Object characteristics and grasp type," IEEE Transactions on Haptics, vol. 7, no. 3, pp. 311-323, 2014.
[3] L. Y. Chang, S. S. Srinivasa, and N. S. Pollard, "Planning pre-grasp manipulation for transport tasks," in Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 2697-2704, 2010.
[4] I. G. Schlesinger, "Der mechanische Aufbau der künstlichen Glieder" [The mechanical construction of artificial limbs], in Ersatzglieder und Arbeitshilfen, Springer, pp. 321-661, 1919.
[5] J. R. Napier, "The prehensile movements of the human hand," Journal of Bone and Joint Surgery, vol. 38, no. 4, pp. 902-913, 1956.
[6] N. Kamakura, M. Matsuo, H. Ishii, F. Mitsuboshi, and Y. Miura, "Patterns of static prehension in normal hands," The American Journal of Occupational Therapy, vol. 34, no. 7, pp. 437-445, 1980.
[7] T. Iberall, "The nature of human prehension: Three dextrous hands in one," in Proc. IEEE International Conference on Robotics and Automation, vol. 4, pp. 396-401, 1987.
[8] M. Cutkosky, "On grasp choice, grasp models, and the design of hands for manufacturing tasks," IEEE Transactions on Robotics and Automation, vol. 5, no. 3, pp. 269-279, 1989.
[9] B. Abbasi, E. Noohi, S. Parastegari, and M. Žefran, "Grasp taxonomy based on force distribution," in Proc. 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 1098-1103, 2016.
[10] H. Marino, M. Gabiccini, A. Leonardis, and A. Bicchi, "Data-driven human grasp movement analysis," in Proc. ISR 2016: 47th International Symposium on Robotics, VDE, pp. 1-8, 2016.
[11] J. M. Elliott and K. Connolly, "A classification of manipulative hand movements," Developmental Medicine & Child Neurology, vol. 26, no. 3, pp. 283-296, 1984.
[12] I. M. Bullock, R. R. Ma, and A. M. Dollar, "A hand-centric classification of human and robot dexterous manipulation," IEEE Transactions on Haptics, vol. 6, no. 2, pp. 129-144, 2013.
[13] L. Y. Chang and N. S. Pollard, "Video survey of pre-grasp interactions in natural hand activities," June 2009.
[14] F. Wörgötter, E. E. Aksoy, N. Krüger, J. Piater, A. Ude, and M. Tamosiunaite, "A simple ontology of manipulation actions based on hand-object relations," IEEE Transactions on Autonomous Mental Development, vol. 5, no. 2, pp. 117-134, 2013.
[15] D. Leidner, C. Borst, A. Dietrich, M. Beetz, and A. Albu-Schäffer, "Classifying compliant manipulation tasks for automated planning in robotics," in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1769-1776, 2015.
[16] J. Borras and T. Asfour, "A whole-body pose taxonomy for loco-manipulation tasks," in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1578-1585, 2015.
[17] N. Abe and J.-P. Laumond, "Dance notations and robot motion," in Proceedings of the 1st Workshop of the Anthropomorphic Motion Factory at LAAS-CNRS '14, 2014.
[18] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, CA: Consulting Psychologists Press, 1978.
[19] J. Cohn and P. Ekman, "Measuring facial action by manual coding, facial EMG, and automatic facial image analysis," in Handbook of Nonverbal Behavior Research Methods in the Affective Sciences, 2005.
[20] T. Torigoe, "Comparison of object manipulation among 74 species of non-human primates," Primates, vol. 26, no. 2, pp. 182-194, 1985.
[21] R. W. Byrne, J. M. Byrne, et al., "Manual dexterity in the gorilla: Bimanual and digit role differentiation in a natural task," Animal Cognition, vol. 4, no. 3-4, pp. 347-361, 2001.
[22] T. Feix, J. Romero, H.-B. Schmiedmayer, A. M. Dollar, and D. Kragic, "The GRASP taxonomy of human grasp types," IEEE Transactions on Human-Machine Systems, vol. 46, no. 1, pp. 66-77, 2016.
[23] I. M. Bullock, T. Feix, and A. M. Dollar, "The Yale human grasping dataset: Grasp, object, and task data in household and machine shop environments," The International Journal of Robotics Research, 2014.
[24] M. T. Mason, "Mechanics and planning of manipulator pushing operations," International Journal of Robotics Research, vol. 5, no. 3, pp. 53-71, 1986.
[25] K. M. Lynch, "Toppling manipulation," in Proc. IEEE International Conference on Robotics and Automation, 1999.
[26] E. Yoshida, M. Poirier, J.-P. Laumond, O. Kanoun, F. Lamiraux, R. Alami, et al., "Pivoting based manipulation by a humanoid robot," Autonomous Robots, vol. 28, no. 1, pp. 77-88, 2010.
[27] E. Klingbeil, A. Saxena, and A. Y. Ng, "Learning to open new doors," in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010.
[28] M. Stilman and J. Kuffner, "Navigation among movable obstacles: Real-time reasoning in complex environments," International Journal of Humanoid Robotics, vol. 2, no. 4, pp. 479-503, 2005.
[29] M. Tenorth, U. Klank, D. Pangercic, and M. Beetz, "Web-enabled robots - robots that use the web as an information resource," IEEE Robotics and Automation Magazine, vol. 18, 2011.
[30] M. Bollini and D. Rus, "Cookies anyone?" [Online]. Available: http://web.mit.edu/newsoffice/2011/cookies-anyone.html