Real-Time CNN Based Facial Emotion Recognition Model for a Mobile Serious Game
Article Description
Every year, the increase in human-computer interaction becomes more noticeable, driving the evolution of computer vision to make this interaction more efficient and effective. This paper presents a CNN-based facial emotion recognition model capable of being executed on mobile devices in real time and with high accuracy. Models implemented in other research are usually large; although they obtain high accuracy, they fail to make predictions in optimal time, which prevents fluid interaction with the computer. To address these limitations, we have implemented a lightweight CNN model trained on the FER2013 dataset to predict seven basic emotions. Experimentation shows that our model achieves an accuracy of 66.52% in validation, can be stored in a 13.23 MB file, and achieves average processing times of 14.39 ms and 16.06 ms on a tablet and a phone, respectively.
Authors: | Anto-Chavez, Carolain; Maguiña-Bernuy, Richard; Ugarte, Willy |
---|---|
Format: | article |
Publication Date: | 2024 |
Institution: | Universidad Peruana de Ciencias Aplicadas |
Repository: | UPC-Institucional |
Language: | English |
OAI Identifier: | oai:repositorioacademico.upc.edu.pe:10757/676066 |
Resource Link: | http://hdl.handle.net/10757/676066 |
Access Level: | open access |
Subjects: | Emotion; Expression; Facial; FER; Machine Learning; Mobile; Real-Time; Recognition |
dc.contributor.author |
Anto-Chavez, Carolain; Maguiña-Bernuy, Richard; Ugarte, Willy |
description |
Every year, the increase in human-computer interaction becomes more noticeable, driving the evolution of computer vision to make this interaction more efficient and effective. This paper presents a CNN-based facial emotion recognition model capable of being executed on mobile devices in real time and with high accuracy. Models implemented in other research are usually large; although they obtain high accuracy, they fail to make predictions in optimal time, which prevents fluid interaction with the computer. To address these limitations, we have implemented a lightweight CNN model trained on the FER2013 dataset to predict seven basic emotions. Experimentation shows that our model achieves an accuracy of 66.52% in validation, can be stored in a 13.23 MB file, and achieves average processing times of 14.39 ms and 16.06 ms on a tablet and a phone, respectively. |
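The abstract describes a lightweight CNN that maps a face image to a probability distribution over seven basic emotions. The paper's actual architecture is not given in this record; as a hypothetical illustration of the pipeline shape only (48x48 grayscale FER2013 input, seven-way softmax output), here is a minimal numpy forward-pass sketch with random, untrained weights and illustrative layer sizes.

```python
import numpy as np

# The seven FER2013 emotion classes (standard ordering of that dataset).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def conv2d(x, kernels):
    """Valid convolution of a (H, W) image with (n, k, k) kernels, then ReLU.
    Returns an (n, H-k+1, W-k+1) feature map."""
    n, k, _ = kernels.shape
    H, W = x.shape
    out = np.empty((n, H - k + 1, W - k + 1))
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            patch = x[i:i + k, j:j + k]
            out[:, i, j] = (kernels * patch).sum(axis=(1, 2))
    return np.maximum(out, 0.0)  # ReLU activation

def predict(face, rng):
    """One conv -> pool -> dense -> softmax pass; weights are random placeholders."""
    feat = conv2d(face, rng.standard_normal((8, 3, 3)) * 0.1)  # 8 hypothetical 3x3 filters
    pooled = feat[:, ::2, ::2]                                 # crude stride-2 downsampling
    flat = pooled.reshape(-1)
    logits = rng.standard_normal((7, flat.size)) @ flat * 0.01  # dense layer to 7 classes
    e = np.exp(logits - logits.max())
    return e / e.sum()                                          # softmax probabilities

rng = np.random.default_rng(0)
probs = predict(rng.random((48, 48)), rng)  # a dummy 48x48 grayscale "face"
print(EMOTIONS[int(probs.argmax())], probs.shape)
```

A deployable version of such a model would be trained on FER2013 and exported to a mobile runtime; the figures quoted above (13.23 MB, ~14-16 ms per frame) are the record's reported size and latency, not properties of this sketch.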
publishDate |
2024 |
dc.date.issued |
2024-01-01 |
dc.date.accessioned / dc.date.available |
2024-10-08T15:19:54Z |
dc.identifier.doi |
10.5220/0012683800003699 |
dc.identifier.uri |
http://hdl.handle.net/10757/676066 |
dc.identifier.eissn |
2184-4984 |
dc.identifier.journal |
International Conference on Information and Communication Technologies for Ageing Well and e-Health, ICT4AWE - Proceedings |
dc.identifier.scopusid |
SCOPUS_ID:85193959213 (EID 2-s2.0-85193959213) |
dc.rights |
info:eu-repo/semantics/openAccess (open access) |
dc.rights.uri |
http://creativecommons.org/licenses/by-nc-nd/4.0/ (Attribution-NonCommercial-NoDerivatives 4.0 International) |
dc.format |
application/pdf |
dc.publisher |
Science and Technology Publications, Lda. |
dc.source.beginpage / endpage |
84-92 |
bitstream.url (full text) |
https://repositorioacademico.upc.edu.pe/bitstream/10757/676066/1/126838.pdf |
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository hosting this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Science, Technology and Innovation with Open Access (ALICIA).