Kelly, Daniel and Delannoy, Jane Reilly and McDonald, John and Markham, Charles (2009) A Framework for Continuous Multimodal Sign Language Recognition. In: ICMI-MLMI '09: Proceedings of the 2009 International Conference on Multimodal Interfaces. ACM, pp. 351-358. ISBN 9781605587721.
Abstract
We present a multimodal system for the recognition of manual signs and non-manual signals within continuous sign language sentences. In sign language, information is mainly conveyed through hand gestures (manual signs). Non-manual signals, such as facial expressions, head movements, body postures and torso movements, are used to express a large part of the grammar and some aspects of the syntax of sign language. In this paper we propose a multichannel HMM-based system to recognize manual signs and non-manual signals. We choose a single non-manual signal, head movement, to evaluate our framework when recognizing non-manual signals. Manual signs and non-manual signals are processed independently using continuous multidimensional HMMs and an HMM threshold model. Experiments conducted demonstrate that our system achieved a detection ratio of 0.95 and a reliability measure of 0.93.
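To illustrate the two ideas the abstract names, the following is a minimal sketch (not the authors' implementation): one continuous-density HMM is trained per manual sign, and a candidate segment is accepted only if its best sign-model likelihood exceeds that of a threshold model. Feature extraction, the non-manual (head movement) channel, and the paper's exact threshold-model construction are omitted; the rejection rule below is a simplified stand-in, and the use of the hmmlearn library is an assumption for illustration only.

import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_sign_models(training_data, n_states=5):
    # training_data: dict mapping sign label -> list of (T_i, D) feature arrays
    models = {}
    for label, sequences in training_data.items():
        X = np.vstack(sequences)                    # stack all observation frames
        lengths = [len(seq) for seq in sequences]   # per-sequence lengths for hmmlearn
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        hmm.fit(X, lengths)
        models[label] = hmm
    return models

def classify_segment(segment, models, threshold_model):
    # Return the best-matching sign label, or None if the segment scores
    # below the threshold model (treated as a non-sign movement).
    best_label, best_ll = None, -np.inf
    for label, hmm in models.items():
        ll = hmm.score(segment)                     # log-likelihood of the segment
        if ll > best_ll:
            best_label, best_ll = label, ll
    if best_ll <= threshold_model.score(segment):
        return None                                 # rejected: below threshold likelihood
    return best_label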
Item Type: Book Section
Keywords: Sign Language; Non-Manual Signals; HMM
Academic Unit: Faculty of Science and Engineering > Computer Science
Item ID: 8338
Identification Number: https://doi.org/10.1145/1647314.1647387
Depositing User: John McDonald
Date Deposited: 14 Jun 2017 14:58
Publisher: ACM
Refereed: Yes
Funders: Irish Research Council for Science Engineering and Technology (IRCSET)
URI:
Use Licence: This item is available under a Creative Commons Attribution Non Commercial Share Alike Licence (CC BY-NC-SA).