A Novel EEG-Based Four-Class Linguistic BCI

Article Description

In this work, we present a novel EEG-based linguistic BCI that uses the four phonemic structures "BA", "FO", "LE", and "RY" as covert speech task classes. Six neurologically healthy volunteers aged 19-37 participated in the experiment. Participants were asked to covertly speak a phonemic structure when they heard an auditory cue. EEG was recorded with 64 electrodes at 2048 samples/s, and each trial lasted 312 ms, starting at the cue. The BCI was trained using a mixed, randomized recording run containing 15 trials per class, and tested by playing a simple game of "Whack-a-Mole" containing 5 trials per class presented in random order. The average classification accuracy across the 6 users was 82.5%. The most valuable features emerged after auditory cue recognition (~100 ms post onset) and within the 70-128 Hz frequency range. The most significant identified brain regions were the prefrontal cortex (linked to stimulus-driven executive control), Wernicke's area (linked to phonological code retrieval), the right IFG, and Broca's area (linked to syllabification). This work only scratches the surface of using linguistic tasks for BCIs; the approach has the potential to yield much more capable systems in the future.
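
To make the recording and classification setup described above concrete, the following is a minimal sketch, not the authors' pipeline. Only the sampling rate (2048 samples/s), channel count (64), trial length (312 ms), the 70-128 Hz band, and the class/trial counts come from the abstract; the synthetic data, the log band-power features, and the LDA classifier are assumptions introduced here purely for illustration.

# Illustrative sketch (not the authors' method): classify covert-speech EEG
# epochs from band-power features in the 70-128 Hz range.
# Data here are synthetic; with real recordings you would load 64-channel EEG
# sampled at 2048 Hz and epoch 312 ms starting at each auditory cue.
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 2048                      # sampling rate (samples/s), as reported
N_CHANNELS = 64                # electrodes, as reported
TRIAL_LEN = int(0.312 * FS)    # 312 ms trial starting at the cue
CLASSES = ["BA", "FO", "LE", "RY"]
TRIALS_PER_CLASS = 15          # training-run size, as reported

rng = np.random.default_rng(0)

# Synthetic stand-in for epoched EEG: shape (n_trials, n_channels, n_samples)
n_trials = TRIALS_PER_CLASS * len(CLASSES)
X_raw = rng.standard_normal((n_trials, N_CHANNELS, TRIAL_LEN))
y = np.repeat(np.arange(len(CLASSES)), TRIALS_PER_CLASS)

# Band-pass 70-128 Hz, the range the abstract identifies as most informative
sos = butter(4, [70, 128], btype="bandpass", fs=FS, output="sos")
X_band = sosfiltfilt(sos, X_raw, axis=-1)

# One simple feature per channel: log band power over the trial window
features = np.log(np.mean(X_band ** 2, axis=-1))   # shape (n_trials, n_channels)

# Generic linear classifier; the abstract does not specify the classifier used
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, y, cv=5)
print(f"Cross-validated accuracy (synthetic data): {scores.mean():.2f}")

On real data, the single band-power feature per channel would typically be replaced by a richer time-frequency decomposition over the 70-128 Hz band, in line with the time-frequency analysis the paper's keywords indicate.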

Bibliographic Details
Authors: Jahangiri, Amir; Achanccaray, David; Sepulveda, Francisco
Format: article
Publication date: 2019
Published in: 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Publisher: IEEE
Funding: Fondo Nacional de Desarrollo Científico y Tecnológico - Fondecyt
Institution: Consejo Nacional de Ciencia Tecnología e Innovación
Repository: CONCYTEC-Institucional
Language: English
OAI identifier: oai:repositorio.concytec.gob.pe:20.500.12390/2831
Resource links: https://hdl.handle.net/20.500.12390/2831
https://doi.org/10.1109/EMBC.2019.8856644
Access level: open access
Subjects: Training
Task analysis
Electroencephalography
Time-frequency analysis
Phonetics
Protocols
https://purl.org/pe-repo/ocde/ford#2.02.01
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository holding this document or data set. CONCYTEC is not responsible for the content (publications and/or data) accessible through the National Digital Repository of Open Access Science, Technology and Innovation (ALICIA).