A mapping approach for real time imitation of human movements by a 22 DOF humanoid

Article Description

This work was supported by grant 234-2015-FONDECYT (Master Program) from Cienciactiva of the National Council for Science, Technology and Technological Innovation (CONCYTEC-PERU).
Bibliographic Details
Authors: Cornejo-Arismendi V.A., Barrios-Aranibar D.
Format: conference object
Publication date: 2018
Institution: Consejo Nacional de Ciencia Tecnología e Innovación
Repository: CONCYTEC-Institucional
Language: English
OAI Identifier: oai:repositorio.concytec.gob.pe:20.500.12390/489
Resource link: https://hdl.handle.net/20.500.12390/489
https://doi.org/10.1109/LARS/SBR/WRE.2018.00081
Access level: open access
Subject: Spatial points
Anthropomorphic robots
Architecture
Mapping
Simulators
Capture system
Dynamic solutions
Human movements
Humanoid robot
Motion capture
Motion capture system
Robotics
https://purl.org/pe-repo/ocde/ford#2.02.02
id CONC_e3ffc7446439ff6a610389e99d7c5c4f
oai_identifier_str oai:repositorio.concytec.gob.pe:20.500.12390/489
network_acronym_str CONC
network_name_str CONCYTEC-Institucional
repository_id_str 4689
dc.title.none.fl_str_mv A mapping approach for real time imitation of human movements by a 22 DOF humanoid
title A mapping approach for real time imitation of human movements by a 22 DOF humanoid
spellingShingle A mapping approach for real time imitation of human movements by a 22 DOF humanoid
Cornejo-Arismendi V.A.
Spatial points
Anthropomorphic robots
Architecture
Mapping
Simulators
Capture system
Dynamic solutions
Human movements
Humanoid robot
Motion capture
Motion capture system
Robotics
https://purl.org/pe-repo/ocde/ford#2.02.02
title_short A mapping approach for real time imitation of human movements by a 22 DOF humanoid
title_full A mapping approach for real time imitation of human movements by a 22 DOF humanoid
title_fullStr A mapping approach for real time imitation of human movements by a 22 DOF humanoid
title_full_unstemmed A mapping approach for real time imitation of human movements by a 22 DOF humanoid
title_sort A mapping approach for real time imitation of human movements by a 22 DOF humanoid
author Cornejo-Arismendi V.A.
author_facet Cornejo-Arismendi V.A.
Barrios-Aranibar D.
author_role author
author2 Barrios-Aranibar D.
author2_role author
dc.contributor.author.fl_str_mv Cornejo-Arismendi V.A.
Barrios-Aranibar D.
dc.subject.none.fl_str_mv Spatial points
topic Spatial points
Anthropomorphic robots
Architecture
Mapping
Simulators
Capture system
Dynamic solutions
Human movements
Humanoid robot
Motion capture
Motion capture system
Robotics
https://purl.org/pe-repo/ocde/ford#2.02.02
dc.subject.es_PE.fl_str_mv Anthropomorphic robots
Architecture
Mapping
Simulators
Capture system
Dynamic solutions
Human movements
Humanoid robot
Motion capture
Motion capture system
Robotics
dc.subject.ocde.none.fl_str_mv https://purl.org/pe-repo/ocde/ford#2.02.02
description This work was supported by grant 234-2015-FONDECYT (Master Program) from Cienciactiva of the National Council for Science, Technology and Technological Innovation (CONCYTEC-PERU).
publishDate 2018
dc.date.accessioned.none.fl_str_mv 2024-05-30T23:13:38Z
dc.date.available.none.fl_str_mv 2024-05-30T23:13:38Z
dc.date.issued.fl_str_mv 2018
dc.type.none.fl_str_mv info:eu-repo/semantics/conferenceObject
format conferenceObject
dc.identifier.isbn.none.fl_str_mv 978-1-5386-7761-2
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/20.500.12390/489
dc.identifier.doi.none.fl_str_mv https://doi.org/10.1109/LARS/SBR/WRE.2018.00081
dc.identifier.isi.none.fl_str_mv 469159000006
identifier_str_mv 978-1-5386-7761-2
469159000006
url https://hdl.handle.net/20.500.12390/489
https://doi.org/10.1109/LARS/SBR/WRE.2018.00081
dc.language.iso.none.fl_str_mv eng
language eng
dc.rights.none.fl_str_mv info:eu-repo/semantics/openAccess
eu_rights_str_mv openAccess
dc.publisher.none.fl_str_mv IEEE
publisher.none.fl_str_mv IEEE
dc.source.none.fl_str_mv reponame:CONCYTEC-Institucional
instname:Consejo Nacional de Ciencia Tecnología e Innovación
instacron:CONCYTEC
instname_str Consejo Nacional de Ciencia Tecnología e Innovación
instacron_str CONCYTEC
institution CONCYTEC
reponame_str CONCYTEC-Institucional
collection CONCYTEC-Institucional
repository.name.fl_str_mv Repositorio Institucional CONCYTEC
repository.mail.fl_str_mv repositorio@concytec.gob.pe
_version_ 1839175779910418432
spelling Publicationrp00571600rp00572600Cornejo-Arismendi V.A.Barrios-Aranibar D.2024-05-30T23:13:38Z2024-05-30T23:13:38Z2018978-1-5386-7761-2https://hdl.handle.net/20.500.12390/489https://doi.org/10.1109/LARS/SBR/WRE.2018.00081469159000006This work was supported by grant 234-2015-FONDECYT (Master Program) from Cienciactiva of the National Council for Science, Technology and Technological Innovation (CONCYTEC-PERU).The main mode of displacement for a humanoid robot is walking, and humanoid robots share a basic architecture of 22 DOF, the minimum necessary to replicate human movements. A motion capture system records a person's movement as static points on the human body; the data used here are human gait cycles. The proposed technique transforms the data of a capture system into joint angles for a 22 DOF humanoid robot architecture. To do so, it takes key points from the capture system and maps the torso first, then the upper and lower limbs. Tests were performed on the authors' own simulator and on the V-REP simulator using the architecture of the Poppy robot. The results show a mathematical error that is visually imperceptible in the simulator yet numerically measurable, caused by the elimination of an axial axis located at the waist. Tests were performed with data from a woman, a man, and a child; the woman showed the greatest error because of a more pronounced hip movement during gait. 
This research opens the door to future work that requires mapping a capture system onto a 22 DOF humanoid robot, a use that is versatile and extensible to dynamic solutions for balance and tightness.Consejo Nacional de Ciencia, Tecnología e Innovación Tecnológica - ConcytecengIEEEinfo:eu-repo/semantics/openAccessSpatial pointsAnthropomorphic robots-1Architecture-1Mapping-1Simulators-1Capture system-1Dynamic solutions-1Human movements-1Humanoid robot-1Motion capture-1Motion capture system-1Robotics-1https://purl.org/pe-repo/ocde/ford#2.02.02-1A mapping approach for real time imitation of human movements by a 22 DOF humanoidinfo:eu-repo/semantics/conferenceObjectreponame:CONCYTEC-Institucionalinstname:Consejo Nacional de Ciencia Tecnología e Innovacióninstacron:CONCYTEC#PLACEHOLDER_PARENT_METADATA_VALUE##PLACEHOLDER_PARENT_METADATA_VALUE##PLACEHOLDER_PARENT_METADATA_VALUE#20.500.12390/489oai:repositorio.concytec.gob.pe:20.500.12390/4892024-05-30 15:35:35.329http://purl.org/coar/access_right/c_14cbinfo:eu-repo/semantics/closedAccessmetadata only accesshttps://repositorio.concytec.gob.peRepositorio Institucional CONCYTECrepositorio@concytec.gob.pe#PLACEHOLDER_PARENT_METADATA_VALUE##PLACEHOLDER_PARENT_METADATA_VALUE#<Publication xmlns="https://www.openaire.eu/cerif-profile/1.1/" id="de61b548-ba99-48ec-8c2f-627043108455"> <Type xmlns="https://www.openaire.eu/cerif-profile/vocab/COAR_Publication_Types">http://purl.org/coar/resource_type/c_1843</Type> <Language>eng</Language> <Title>A mapping approach for real time imitation of human movements by a 22 DOF humanoid</Title> <PublishedIn> <Publication> </Publication> </PublishedIn> <PublicationDate>2018</PublicationDate> <DOI>https://doi.org/10.1109/LARS/SBR/WRE.2018.00081</DOI> <ISI-Number>469159000006</ISI-Number> <ISBN>978-1-5386-7761-2</ISBN> <Authors> <Author> <DisplayName>Cornejo-Arismendi V.A.</DisplayName> <Person id="rp00571" /> 
<Affiliation> <OrgUnit> </OrgUnit> </Affiliation> </Author> <Author> <DisplayName>Barrios-Aranibar D.</DisplayName> <Person id="rp00572" /> <Affiliation> <OrgUnit> </OrgUnit> </Affiliation> </Author> </Authors> <Editors> </Editors> <Publishers> <Publisher> <DisplayName>IEEE</DisplayName> <OrgUnit /> </Publisher> </Publishers> <Keyword>Spatial points</Keyword> <Keyword>Anthropomorphic robots</Keyword> <Keyword>Architecture</Keyword> <Keyword>Mapping</Keyword> <Keyword>Simulators</Keyword> <Keyword>Capture system</Keyword> <Keyword>Dynamic solutions</Keyword> <Keyword>Human movements</Keyword> <Keyword>Humanoid robot</Keyword> <Keyword>Motion capture</Keyword> <Keyword>Motion capture system</Keyword> <Keyword>Robotics</Keyword> <Abstract>The main mode of displacement for a humanoid robot is walking, and humanoid robots share a basic architecture of 22 DOF, the minimum necessary to replicate human movements. A motion capture system records a person&apos;s movement as static points on the human body; the data used here are human gait cycles. The proposed technique transforms the data of a capture system into joint angles for a 22 DOF humanoid robot architecture. To do so, it takes key points from the capture system and maps the torso first, then the upper and lower limbs. Tests were performed on the authors&apos; own simulator and on the V-REP simulator using the architecture of the Poppy robot. The results show a mathematical error that is visually imperceptible in the simulator yet numerically measurable, caused by the elimination of an axial axis located at the waist. Tests were performed with data from a woman, a man, and a child; the woman showed the greatest error because of a more pronounced hip movement during gait. 
This research opens the door to future work that requires mapping a capture system onto a 22 DOF humanoid robot, a use that is versatile and extensible to dynamic solutions for balance and tightness.</Abstract> <Access xmlns="http://purl.org/coar/access_right" > </Access> </Publication> -1
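The abstract describes mapping motion-capture key points onto the joint angles of a 22 DOF humanoid. The paper's own torso-first mapping procedure is not reproduced in this record; as a minimal sketch of the kind of computation involved, the hypothetical helper below (all names are illustrative, not from the paper) recovers one joint angle from three captured 3D points:

```python
import numpy as np

def joint_angle(p_a, p_b, p_c):
    """Angle (radians) at key point p_b between segments p_b->p_a and
    p_b->p_c. A generic building block for turning capture-system key
    points into a humanoid's joint angles."""
    u = np.asarray(p_a, dtype=float) - np.asarray(p_b, dtype=float)
    v = np.asarray(p_c, dtype=float) - np.asarray(p_b, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against tiny floating-point overshoot outside [-1, 1].
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example: shoulder, elbow and wrist of a fully extended arm give an
# elbow angle of pi (180 degrees).
shoulder, elbow, wrist = [0, 0, 0], [0.3, 0, 0], [0.6, 0, 0]
print(joint_angle(shoulder, elbow, wrist))  # ~3.1416
```

Repeating this per joint, torso first and then each limb chain, yields the angle vector driven into the simulated 22 DOF robot; constraints such as the waist axis dropped in the paper would be applied on top of this.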
score 13.210282
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository holding this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Open Access Science, Technology and Innovation (ALICIA).