To date, most sign language animation systems rely on 3D content dictionaries. Two main techniques are used to build such dictionaries: pre-synthesized animation and generated animation. The first relies on expensive material: motion-capture recordings replayed through avatar technology. The second relies on proprietary solutions that generate animations automatically and in real time from specific transcription systems. Unfortunately, most proprietary transcription systems require specialized user skills, so creating signs remains a difficult task. In this paper, we propose a new approach inspired by declarative languages to make sign language dictionary creation easier. Our aim is to provide a high-level layer based on a transcription language that resembles spoken language: the user expresses the logic of a sign's creation without describing how the underlying animation functions are executed or evaluated. Our solution can therefore be used by any user, with no specific skills required. The proposed approach can also be used to create signs independently of the target sign language. We also describe how our tool generates natural sign animation based on human motion trajectory analysis.
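The declarative idea described in the abstract can be illustrated with a minimal sketch. The class and field names below (`Sign`, `handshape`, `location`, `movement`) are hypothetical and not taken from the paper's actual transcription language; the point is only that the user declares *what* a sign looks like, while rendering it as animation is left to the system.

```python
from dataclasses import dataclass, field

@dataclass
class Sign:
    """A declarative sign description: states what the sign looks like,
    not how the animation is computed or evaluated."""
    gloss: str                      # dictionary label for the sign
    handshape: str                  # e.g. "flat", "fist"
    location: str                   # where the sign is articulated
    movement: list = field(default_factory=list)  # ordered movement phrases

def describe(sign: Sign) -> str:
    """Render the declaration as a readable transcription string;
    an animation backend would instead map it to motion functions."""
    moves = " then ".join(sign.movement) if sign.movement else "hold"
    return f"{sign.gloss}: {sign.handshape} hand at {sign.location}, {moves}"

# Hypothetical dictionary entry, written without any animation expertise
hello = Sign(gloss="HELLO",
             handshape="flat",
             location="forehead",
             movement=["move outward", "rotate palm forward"])

print(describe(hello))
# → HELLO: flat hand at forehead, move outward then rotate palm forward
```

Because entries like this carry no animation-engine specifics, the same declaration could in principle be reused across sign languages, matching the language-independence claim above.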