Peruvian Sign Language Recognition Using Recurrent Neural Networks
Article Description
Deaf people generally face difficulties in their daily lives when they try to communicate with hearing people, largely because of the lack of sign language knowledge in the country. They must go about their everyday lives accompanied by an interpreter in order to communicate; even buying bread every morning or being treated in a health center becomes a challenge, one that should not exist since they have the fundamental right to health. For that reason, this paper presents a system for dynamic sign recognition for Peruvian Sign Language, and our main goal is to determine which model and processing technique is the most appropriate for this problem, so that the system can be used in deaf people's everyday lives and help them communicate. Many projects around the world have tried to address this situation. However, each sign language is unique, so a single global solution is not possible. There have also been similar projects in Peru, but all of them share the same limitation of recognizing only static signs. Since sign language is not limited to static signs such as the alphabet, a solution that also covers words that can be used in sentences is needed. This requires dynamic recognition, which is the system presented in this paper.
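The abstract names recurrent neural networks for dynamic (temporal) sign recognition but the record contains no implementation details. The following is a minimal, illustrative sketch, not the authors' code: it assumes a hypothetical dataset of per-frame hand keypoint sequences (e.g., extracted with a pose estimator such as MediaPipe), and all shapes, class counts, and layer sizes below are assumptions chosen only to show how an LSTM-based sequence classifier for dynamic signs could be wired up.

```python
# Illustrative sketch only (not the paper's implementation): classify
# sequences of hand keypoints into dynamic sign classes with an LSTM.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 10    # hypothetical number of dynamic signs
SEQ_LEN = 30        # frames per clip (assumption)
NUM_FEATURES = 126  # e.g. 2 hands x 21 landmarks x 3 coords (assumption)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN, NUM_FEATURES)),
    tf.keras.layers.Masking(mask_value=0.0),        # skip zero-padded frames
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy arrays standing in for real keypoint sequences and integer labels.
x = np.random.rand(8, SEQ_LEN, NUM_FEATURES).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=(8,))
model.fit(x, y, epochs=1, batch_size=4, verbose=0)
```

In a real pipeline the random arrays would be replaced by keypoints extracted from labeled sign-language video clips, padded or sampled to a fixed length before training.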
| Authors: | Barrientos-Villalta, Geraldine Fiorella; Quiroz, Piero; Ugarte, Willy |
|---|---|
| Format: | article |
| Publication date: | 2022 |
| Institution: | Universidad Peruana de Ciencias Aplicadas |
| Repository: | UPC-Institucional |
| Language: | English |
| OAI Identifier: | oai:repositorioacademico.upc.edu.pe:10757/669597 |
| Resource link: | http://hdl.handle.net/10757/669597 |
| Access level: | embargoed access |
| Subjects: | Deep learning; Recurrent neural networks; Sign language |
| Field | Value |
|---|---|
| id | UUPC_fbfa73024895a7ee06625f51b6a19468 |
| oai_identifier_str | oai:repositorioacademico.upc.edu.pe:10757/669597 |
| network_acronym_str | UUPC |
| network_name_str | UPC-Institucional |
| repository_id_str | 2670 |
| dc.title.es_PE.fl_str_mv | Peruvian Sign Language Recognition Using Recurrent Neural Networks |
| title | Peruvian Sign Language Recognition Using Recurrent Neural Networks |
| spellingShingle | Peruvian Sign Language Recognition Using Recurrent Neural Networks; Barrientos-Villalta, Geraldine Fiorella; Deep learning; Recurrent neural networks; Sign language |
| title_short | Peruvian Sign Language Recognition Using Recurrent Neural Networks |
| title_full | Peruvian Sign Language Recognition Using Recurrent Neural Networks |
| title_fullStr | Peruvian Sign Language Recognition Using Recurrent Neural Networks |
| title_full_unstemmed | Peruvian Sign Language Recognition Using Recurrent Neural Networks |
| title_sort | Peruvian Sign Language Recognition Using Recurrent Neural Networks |
| author | Barrientos-Villalta, Geraldine Fiorella |
| author_facet | Barrientos-Villalta, Geraldine Fiorella; Quiroz, Piero; Ugarte, Willy |
| author_role | author |
| author2 | Quiroz, Piero; Ugarte, Willy |
| author2_role | author; author |
| dc.contributor.author.fl_str_mv | Barrientos-Villalta, Geraldine Fiorella; Quiroz, Piero; Ugarte, Willy |
| dc.subject.es_PE.fl_str_mv | Deep learning; Recurrent neural networks; Sign language |
| topic | Deep learning; Recurrent neural networks; Sign language |
| description | Deaf people generally face difficulties in their daily lives when they try to communicate with hearing people, largely because of the lack of sign language knowledge in the country. They must go about their everyday lives accompanied by an interpreter in order to communicate; even buying bread every morning or being treated in a health center becomes a challenge, one that should not exist since they have the fundamental right to health. For that reason, this paper presents a system for dynamic sign recognition for Peruvian Sign Language, and our main goal is to determine which model and processing technique is the most appropriate for this problem, so that the system can be used in deaf people's everyday lives and help them communicate. Many projects around the world have tried to address this situation. However, each sign language is unique, so a single global solution is not possible. There have also been similar projects in Peru, but all of them share the same limitation of recognizing only static signs. Since sign language is not limited to static signs such as the alphabet, a solution that also covers words that can be used in sentences is needed. This requires dynamic recognition, which is the system presented in this paper. |
| publishDate | 2022 |
| dc.date.accessioned.none.fl_str_mv | 2023-12-08T01:36:56Z |
| dc.date.available.none.fl_str_mv | 2023-12-08T01:36:56Z |
| dc.date.issued.fl_str_mv | 2022-01-01 |
| dc.type.es_PE.fl_str_mv | info:eu-repo/semantics/article |
| format | article |
| dc.identifier.issn.none.fl_str_mv | 18650929 |
| dc.identifier.doi.none.fl_str_mv | 10.1007/978-3-031-20319-0_34 |
| dc.identifier.uri.none.fl_str_mv | http://hdl.handle.net/10757/669597 |
| dc.identifier.eissn.none.fl_str_mv | 18650937 |
| dc.identifier.journal.es_PE.fl_str_mv | Communications in Computer and Information Science |
| dc.identifier.eid.none.fl_str_mv | 2-s2.0-85144224840 |
| dc.identifier.scopusid.none.fl_str_mv | SCOPUS_ID:85144224840 |
| dc.identifier.isni.none.fl_str_mv | 0000 0001 2196 144X |
| identifier_str_mv | 18650929; 10.1007/978-3-031-20319-0_34; 18650937; Communications in Computer and Information Science; 2-s2.0-85144224840; SCOPUS_ID:85144224840; 0000 0001 2196 144X |
| url | http://hdl.handle.net/10757/669597 |
| dc.language.iso.es_PE.fl_str_mv | eng |
| language | eng |
| dc.relation.url.es_PE.fl_str_mv | https://www.springerprofessional.de/en/peruvian-sign-language-recognition-using-recurrent-neural-networ/23752648 |
| dc.rights.es_PE.fl_str_mv | info:eu-repo/semantics/embargoedAccess |
| dc.rights.*.fl_str_mv | Attribution-NonCommercial-ShareAlike 4.0 International |
| dc.rights.uri.*.fl_str_mv | http://creativecommons.org/licenses/by-nc-sa/4.0/ |
| eu_rights_str_mv | embargoedAccess |
| rights_invalid_str_mv | Attribution-NonCommercial-ShareAlike 4.0 International; http://creativecommons.org/licenses/by-nc-sa/4.0/ |
| dc.publisher.es_PE.fl_str_mv | Springer Science and Business Media Deutschland GmbH |
| dc.source.none.fl_str_mv | reponame:UPC-Institucional; instname:Universidad Peruana de Ciencias Aplicadas; instacron:UPC |
| instname_str | Universidad Peruana de Ciencias Aplicadas |
| instacron_str | UPC |
| institution | UPC |
| reponame_str | UPC-Institucional |
| collection | UPC-Institucional |
| dc.source.journaltitle.none.fl_str_mv | Communications in Computer and Information Science |
| dc.source.volume.none.fl_str_mv | 1675 CCIS |
| dc.source.beginpage.none.fl_str_mv | 459 |
| dc.source.endpage.none.fl_str_mv | 473 |
| bitstream.url.fl_str_mv | https://repositorioacademico.upc.edu.pe/bitstream/10757/669597/2/license.txt; https://repositorioacademico.upc.edu.pe/bitstream/10757/669597/1/license_rdf |
| bitstream.checksum.fl_str_mv | 8a4605be74aa9ea9d79846c1fba20a33; 934f4ca17e109e0a05eaeaba504d7ce4 |
| bitstream.checksumAlgorithm.fl_str_mv | MD5; MD5 |
| repository.name.fl_str_mv | Repositorio académico upc |
| repository.mail.fl_str_mv | upc@openrepository.com |
| _version_ | 1846065954721628160 |
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository hosting this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the Repositorio Nacional Digital de Ciencia, Tecnología e Innovación de Acceso Abierto (ALICIA).