Kelly, Daniel and Delannoy, Jane Reilly and McDonald, John and Markham, Charles (2009) A Framework for Continuous Multimodal Sign Language Recognition. In: ICMI-MLMI '09: Proceedings of the 2009 International Conference on Multimodal Interfaces. ACM, pp. 351-358. ISBN 9781605587721
Abstract
We present a multimodal system for the recognition of manual signs and non-manual signals within continuous sign language sentences. In sign language, information is mainly conveyed through hand gestures (manual signs). Non-manual signals, such as facial expressions, head movements, body postures and torso movements, express a large part of the grammar and some aspects of the syntax of sign language. In this paper we propose a multichannel HMM-based system to recognize manual signs and non-manual signals. We choose a single non-manual signal, head movement, to evaluate our framework when recognizing non-manual signals. Manual signs and non-manual signals are processed independently using continuous multidimensional HMMs and an HMM threshold model. Experiments demonstrate that our system achieves a detection ratio of 0.95 and a reliability measure of 0.93.
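The detection step outlined in the abstract (scoring a candidate segment against per-sign continuous HMMs and rejecting it when a threshold model scores higher) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the hmmlearn library, generic multidimensional feature vectors, and hypothetical helper names (train_sign_models, build_threshold_model, spot_sign); the threshold model here follows a common construction, an ergodic HMM composed of the states of all trained sign models.

```python
# Sketch of HMM-based sign spotting with a threshold model (illustrative only).
import numpy as np
from hmmlearn.hmm import GaussianHMM


def train_sign_models(training_data, n_states=5):
    """Fit one continuous (Gaussian) HMM per sign class.

    training_data: dict mapping sign label -> list of (T_i, n_features) arrays.
    """
    models = {}
    for sign, sequences in training_data.items():
        X = np.vstack(sequences)
        lengths = [len(s) for s in sequences]
        m = GaussianHMM(n_components=n_states, covariance_type="full", n_iter=20)
        m.fit(X, lengths)
        models[sign] = m
    return models


def build_threshold_model(models):
    """Ergodic HMM whose states are copies of every state of every sign model."""
    means = np.vstack([m.means_ for m in models.values()])
    covars = np.concatenate([m.covars_ for m in models.values()])
    n = means.shape[0]
    thr = GaussianHMM(n_components=n, covariance_type="full", init_params="")
    thr.startprob_ = np.full(n, 1.0 / n)
    thr.transmat_ = np.full((n, n), 1.0 / n)  # fully connected, uniform transitions
    thr.means_ = means
    thr.covars_ = covars
    return thr


def spot_sign(segment, models, threshold_model):
    """Return the best sign label for a candidate segment, or None if rejected.

    segment: (T, n_features) array cut from the continuous feature stream.
    The segment is accepted only if some sign model out-scores the threshold model.
    """
    best_sign, best_ll = None, -np.inf
    for sign, m in models.items():
        ll = m.score(segment)
        if ll > best_ll:
            best_sign, best_ll = sign, ll
    return best_sign if best_ll > threshold_model.score(segment) else None
```

In this style of spotting, the threshold model acts as an adaptive likelihood floor: because its states mirror those of the sign models but are connected ergodically, it matches arbitrary movement epenthesis reasonably well, so only segments that a specific sign model explains better than this generic model are reported as detections.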