Literature DB >> 28940043

Extricating Manual and Non-Manual Features for Subunit Level Medical Sign Modelling in Automatic Sign Language Classification and Recognition.

Elakkiya R1, Selvamani K2.   

Abstract

Subunit segmentation and modelling of medical sign language is an important problem in both linguistic-oriented and vision-based Sign Language Recognition (SLR). Many previous efforts derived functional subunits from linguistic syllables, but syllable-based subunit extraction is not feasible with real-world computer vision techniques. Moreover, existing recognition systems are designed to detect only signer-dependent actions under restricted laboratory conditions. This paper addresses these two issues: (1) subunit extraction and (2) signer-independent action recognition in visual sign language. Subunit extraction performs the sequential and parallel decomposition of sign gestures without prior knowledge of the syllables or the number of subunits. A novel Bayesian Parallel Hidden Markov Model (BPaHMM) is introduced for subunit extraction; it combines the features of manual and non-manual parameters to improve the classification and recognition of signs. Signer-independent operation uses a single web camera to capture different signers' behaviour patterns and to perform cross-signer validation. Experimental results show that the proposed signer-independent subunit-level modelling improves sign language classification and recognition compared with existing work.
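The abstract's core idea — modelling the manual and non-manual feature channels with parallel HMMs and fusing their evidence — can be sketched as follows. This is a minimal illustrative sketch with discrete observations and a simple weighted log-likelihood fusion; the function names, the fixed fusion weight, and the toy parameters are assumptions for illustration only, not the paper's actual BPaHMM (which additionally learns subunit boundaries in a Bayesian framework).

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under one HMM,
    computed with the scaled forward algorithm.
    obs: sequence of symbol indices; pi: initial state probabilities;
    A: state transition matrix; B: per-state emission probabilities."""
    alpha = pi * B[:, obs[0]]          # initial forward variable
    scale = alpha.sum()
    loglik = np.log(scale)
    alpha /= scale                     # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate and emit
        scale = alpha.sum()
        loglik += np.log(scale)
        alpha /= scale
    return loglik

def parallel_hmm_score(manual_obs, nonmanual_obs,
                       manual_hmm, nonmanual_hmm, w=0.5):
    """Score a sign hypothesis by fusing two independent channel HMMs
    (manual and non-manual) via a weighted sum of log-likelihoods."""
    return (w * forward_loglik(manual_obs, *manual_hmm)
            + (1 - w) * forward_loglik(nonmanual_obs, *nonmanual_hmm))

# Toy parameters: a deterministic manual-channel HMM and a uniform
# non-manual-channel HMM (purely illustrative).
manual_hmm = (np.array([1.0, 0.0]), np.eye(2), np.eye(2))
nonmanual_hmm = (np.full(2, 0.5), np.full((2, 2), 0.5), np.full((2, 2), 0.5))

score = parallel_hmm_score([0, 0, 0], [0, 1, 0], manual_hmm, nonmanual_hmm)
```

In a recognizer, one such parallel model would be trained per sign (or per subunit), and the model with the highest fused score would be the predicted label.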

Keywords:  Feature extraction; Sign language recognition; Signer independent action; Subunit Gesture Base; Subunit modelling

Mesh:

Year:  2017        PMID: 28940043     DOI: 10.1007/s10916-017-0819-z

Source DB:  PubMed          Journal:  J Med Syst        ISSN: 0148-5598            Impact factor:   4.460


  5 in total

1.  Sign language structure: an outline of the visual communication systems of the American deaf. 1960.

Authors:  William C Stokoe
Journal:  J Deaf Stud Deaf Educ       Date:  2005

2.  Automatic sign language analysis: a survey and the future beyond lexical meaning. (Review)

Authors:  Sylvie C W Ong; Surendra Ranganath
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2005-06       Impact factor: 6.226

3.  Medical image classification based on multi-scale non-negative sparse coding.

Authors:  Ruijie Zhang; Jian Shen; Fushan Wei; Xiong Li; Arun Kumar Sangaiah
Journal:  Artif Intell Med       Date:  2017-05-27       Impact factor: 5.326

4.  Random Forest-Based Recognition of Isolated Sign Language Subwords Using Data from Accelerometers and Surface Electromyographic Sensors.

Authors:  Ruiliang Su; Xiang Chen; Shuai Cao; Xu Zhang
Journal:  Sensors (Basel)       Date:  2016-01-14       Impact factor: 3.576

5.  Discriminant features and temporal structure of nonmanuals in American Sign Language.

Authors:  C Fabian Benitez-Quiroz; Kadir Gökgöz; Ronnie B Wilbur; Aleix M Martinez
Journal:  PLoS One       Date:  2014-02-06       Impact factor: 3.240

  1 in total

1.  Recognition of Urdu sign language: a systematic review of the machine learning classification.

Authors:  Hira Zahid; Munaf Rashid; Samreen Hussain; Fahad Azim; Sidra Abid Syed; Afshan Saad
Journal:  PeerJ Comput Sci       Date:  2022-02-18
