

The analysis of movements using inertial sensors represents an interesting alternative to video cameras and other instrumentation used in posture analysis (treadmills, force plates, pressure plates, EMG). Inertial-sensor-based analysis has been shown to be useful for classifying Activities of Daily Living in situation assessment and healthcare applications, and for understanding human emotions from body posture. We classify movements using a "lexical-like" approach: movements are represented as vectors built with a technique that extracts a large number of generic features, and are classified with a method inspired by text mining and machine learning, modified to transform the vector space from a feature-value space into a feature-frequency space. We used this method to classify a set of 21 movements performed by 13 people, with good recognition results. We then tested our method on the public WARD 1.0 database, outperforming the results previously reported in the literature on that database. The method is also technology-independent and semantically scalable, relies on fast algorithms, and appears suitable for any practical application where runtime movement analysis with large dictionaries is a key factor.
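
As a rough illustration of the feature-value to feature-frequency idea, the sketch below turns an inertial-sensor recording into a histogram of quantized feature "words", analogous to term frequencies in text mining. The windowing, the choice of per-window features, and the equal-width binning are assumptions made for the example; they are not the paper's exact procedure.

```python
import numpy as np

def movement_to_frequency_vector(signal, window=64, step=32, n_bins=8):
    """Illustrative sketch only (windowing and binning scheme are assumed):
    map a movement, given as a (time x channels) inertial-sensor recording,
    into a feature-frequency vector, i.e. a bag-of-words-like histogram."""
    T, C = signal.shape
    # Generic per-window features: mean and standard deviation of each channel.
    feats = []
    for start in range(0, T - window + 1, step):
        w = signal[start:start + window]
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0)]))
    feats = np.asarray(feats)                     # (n_windows, n_features)

    n_windows, n_features = feats.shape
    hist = np.zeros(n_features * n_bins)
    for j in range(n_features):
        # Quantize feature j into discrete "words" (equal-width bins here).
        edges = np.linspace(feats[:, j].min(), feats[:, j].max(), n_bins + 1)
        words = np.clip(np.digitize(feats[:, j], edges[1:-1]), 0, n_bins - 1)
        # Count how often each (feature, bin) word occurs in this movement.
        np.add.at(hist, j * n_bins + words, 1)
    return hist / max(n_windows, 1)               # normalized term frequencies
```

Frequency vectors of this kind can then be compared or passed to a standard classifier, mirroring how documents are compared by their term frequencies in text mining.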