1 Introduction
The downsizing of computers has led to wearable computing devices and to context-aware systems that use various sensors. Context-aware systems have many applications, such as healthcare [6]. Most of the contexts handled by these applications are postures (e.g. sitting) and behaviors (e.g. walking), which are states of human activity that last for a certain length of time. They are generally recognized with a classifier such as an SVM (support vector machine) [7] operating on extracted feature values, such as the mean, variance, and FFT (fast Fourier transform) power spectrum, that express body orientation and exercise intensity.

Other important activities in daily life include gestures (e.g. a punch). Gestures are not states but one-off actions, and they can be recognized with a template matching algorithm such as DTW (dynamic time warping) [5] after the waveform of the gesture has been trimmed. Feature values carry no information on how the user has moved, so they are poor at discriminating similar gestures such as rotating one's arm clockwise versus anticlockwise. Conventional systems force users to explicitly indicate the start and end points of a gesture, say, by standing still before and after the gesture or by pushing a button while performing it.
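The distinction drawn above can be illustrated with a minimal sketch. The code below is not from the paper: it implements textbook DTW on synthetic, hypothetical data, modelling a "clockwise" and an "anticlockwise" arm rotation as a sine wave and its negation. The two signals have identical mean and variance, so feature-based classification cannot separate them, while template matching with DTW can.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Cheapest warping path: match, insertion, or deletion.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical gesture waveforms: a rotation and its mirror image.
t = np.linspace(0, 2 * np.pi, 50)
clockwise = np.sin(t)
anticlockwise = -np.sin(t)

# Mean and variance are identical for the two gestures, so these
# features carry no information about the direction of motion.
assert np.isclose(np.var(clockwise), np.var(anticlockwise))

# DTW against a clockwise template separates them: zero distance to
# the matching gesture, a large distance to the mirrored one.
template = np.sin(t)
print(dtw_distance(clockwise, template))      # 0.0
print(dtw_distance(anticlockwise, template))  # large
```

Note that this works only on a pre-trimmed waveform; as stated above, locating the start and end points of the gesture in a continuous sensor stream is the part that conventional systems push onto the user.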