I. Introduction
Arguably the most essential aspect of any manipulation task is its effect on the environment. Artificial intelligence (AI) research has shown how this can be taken into account systematically in automated planning systems, where actions are described in terms of their preconditions and effects [1]. Unfortunately, this perspective has been largely neglected in robotics research. Traditionally, robots are unaware of the purpose of their motions and of the resulting changes to the world. A similar trend can be observed in research on the classification of manipulation tasks. Traditional taxonomies in the literature usually take a hand-centric view and classify tasks by finger positions [2], relative motions [3], or geometric dimensions [4]; the intended applications and effects are mostly neglected. We argue that a novel point of view is required: actions should be classified at a high level of abstraction according to their effects on the physical world. On this basis, less abstract sub-categories can be defined to derive generic process models and to combine symbolic and geometric parameters.
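To make the precondition/effect view concrete, the following is a minimal illustrative sketch of a STRIPS-style action description in Python. The predicates and the pick-up action are hypothetical examples chosen for illustration, not taken from [1].

```python
from dataclasses import dataclass

# A world state is modeled as a set of ground predicates,
# e.g. ("on-table", "cube") -- an illustrative convention.
State = frozenset


@dataclass
class Action:
    """STRIPS-style action: preconditions plus add/delete effects."""
    name: str
    preconditions: frozenset
    add_effects: frozenset
    del_effects: frozenset

    def applicable(self, state: State) -> bool:
        # The action may fire only if all preconditions hold in the state.
        return self.preconditions <= state

    def apply(self, state: State) -> State:
        # The effect on the world: delete negated facts, add new ones.
        return (state - self.del_effects) | self.add_effects


# Hypothetical pick-up action for a cube resting on a table.
pick_up = Action(
    name="pick-up(cube)",
    preconditions=frozenset({("on-table", "cube"), ("hand-empty",)}),
    add_effects=frozenset({("holding", "cube")}),
    del_effects=frozenset({("on-table", "cube"), ("hand-empty",)}),
)

state = frozenset({("on-table", "cube"), ("hand-empty",)})
if pick_up.applicable(state):
    state = pick_up.apply(state)
    # state is now {("holding", "cube")}: the action's effect on the world.
```

In this representation the purpose of a motion is explicit: executing an action is meaningful precisely because of the state change it produces, which is the viewpoint this work applies to the classification of manipulation tasks.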