Multilayer complex network descriptors for color-texture characterization
Article description

L. F. S. Scabini acknowledges support from CNPq (Grants #134558/2016-2 and #142438/2018-9). O. M. Bruno acknowledges support from CNPq (Grants #307797/2014-7 and #484312/2013-8) and FAPESP (Grants #14/08026-1 and #16/18809-9). R. H. M. Condori acknowledges support from Cienciactiva, an initiative of the National Council of Science, Technology and Technological Innovation-CONCYTEC (Peru). W. N. Gonçalves acknowledges support from CNPq (Grant #304173/2016-9) and Fundect (Grant #071/2015). The authors are grateful to Abdelmounaime Safia for the feedback concerning the MBT dataset construction, and to the NVIDIA GPU Grant Program for the donation of the Quadro P6000 and Titan Xp GPUs used in this research.
| Authors: | Scabini, LFS; Condori, RHM; Goncalves, WN; Bruno, OM |
|---|---|
| Format: | article |
| Publication date: | 2019 |
| Institution: | Consejo Nacional de Ciencia Tecnología e Innovación |
| Repository: | CONCYTEC-Institucional |
| Language: | English |
| OAI identifier: | oai:repositorio.concytec.gob.pe:20.500.12390/981 |
| Resource links: | https://hdl.handle.net/20.500.12390/981 ; https://doi.org/10.1016/j.ins.2019.02.060 |
| Access level: | open access |
| Subjects: | network dynamic; complex networks; multilayer complex network; https://purl.org/pe-repo/ocde/ford#1.02.02 |
id: CONC_5ae62da8225d7aa9aa0338680fc0a3c4
oai_identifier_str: oai:repositorio.concytec.gob.pe:20.500.12390/981
network_acronym_str: CONC
network_name_str: CONCYTEC-Institucional
repository_id_str: 4689
dc.title.none.fl_str_mv: Multilayer complex network descriptors for color-texture characterization
title / title_short / title_full / title_fullStr / title_full_unstemmed / title_sort: Multilayer complex network descriptors for color-texture characterization
spellingShingle: Multilayer complex network descriptors for color-texture characterization; Scabini, LFS; network dynamic; complex networks; multilayer complex network; https://purl.org/pe-repo/ocde/ford#1.02.02
author: Scabini, LFS
author_facet: Scabini, LFS; Condori, RHM; Goncalves, WN; Bruno, OM
author_role: author
author2: Condori, RHM; Goncalves, WN; Bruno, OM
author2_role: author; author; author
dc.contributor.author.fl_str_mv: Scabini, LFS; Condori, RHM; Goncalves, WN; Bruno, OM
dc.subject.none.fl_str_mv: network dynamic
topic: network dynamic; complex networks; multilayer complex network; https://purl.org/pe-repo/ocde/ford#1.02.02
dc.subject.es_PE.fl_str_mv: complex networks; multilayer complex network
dc.subject.ocde.none.fl_str_mv: https://purl.org/pe-repo/ocde/ford#1.02.02
description: L. F. S. Scabini acknowledges support from CNPq (Grants #134558/2016-2 and #142438/2018-9). O. M. Bruno acknowledges support from CNPq (Grants #307797/2014-7 and #484312/2013-8) and FAPESP (Grants #14/08026-1 and #16/18809-9). R. H. M. Condori acknowledges support from Cienciactiva, an initiative of the National Council of Science, Technology and Technological Innovation-CONCYTEC (Peru). W. N. Gonçalves acknowledges support from CNPq (Grant #304173/2016-9) and Fundect (Grant #071/2015). The authors are grateful to Abdelmounaime Safia for the feedback concerning the MBT dataset construction, and to the NVIDIA GPU Grant Program for the donation of the Quadro P6000 and Titan Xp GPUs used in this research.
publishDate: 2019
dc.date.accessioned.none.fl_str_mv: 2024-05-30T23:13:38Z
dc.date.available.none.fl_str_mv: 2024-05-30T23:13:38Z
dc.date.issued.fl_str_mv: 2019
dc.type.none.fl_str_mv: info:eu-repo/semantics/article
format: article
dc.identifier.uri.none.fl_str_mv: https://hdl.handle.net/20.500.12390/981
dc.identifier.doi.none.fl_str_mv: https://doi.org/10.1016/j.ins.2019.02.060
dc.identifier.isi.none.fl_str_mv: 436430300008
url: https://hdl.handle.net/20.500.12390/981 ; https://doi.org/10.1016/j.ins.2019.02.060
identifier_str_mv: 436430300008
dc.language.iso.none.fl_str_mv: eng
language: eng
dc.relation.ispartof.none.fl_str_mv: Information Sciences
dc.rights.none.fl_str_mv: info:eu-repo/semantics/openAccess
eu_rights_str_mv: openAccess
dc.publisher.none.fl_str_mv: Elsevier Ltd
publisher.none.fl_str_mv: Elsevier Ltd
dc.source.none.fl_str_mv: reponame:CONCYTEC-Institucional; instname:Consejo Nacional de Ciencia Tecnología e Innovación; instacron:CONCYTEC
instname_str: Consejo Nacional de Ciencia Tecnología e Innovación
instacron_str: CONCYTEC
institution: CONCYTEC
reponame_str: CONCYTEC-Institucional
collection: CONCYTEC-Institucional
repository.name.fl_str_mv: Repositorio Institucional CONCYTEC
repository.mail.fl_str_mv: repositorio@concytec.gob.pe
_version_: 1844883126861430784
spelling:
Scabini, LFS; Condori, RHM; Goncalves, WN; Bruno, OM (2019). Multilayer complex network descriptors for color-texture characterization. Information Sciences, Elsevier Ltd. https://hdl.handle.net/20.500.12390/981 ; https://doi.org/10.1016/j.ins.2019.02.060 (ISI: 436430300008).

Abstract: A new method based on complex networks is proposed for color–texture analysis. The image is modeled as a multilayer complex network in which each color channel is a layer and each pixel (in each color channel) is represented as a network vertex. The network's dynamic evolution is accessed through a set of modeling parameters (radii and thresholds), and new characterization techniques are introduced to capture information on spatial interactions within and between color channels. An automatic and adaptive approach for threshold selection is also proposed. Classification experiments are conducted on five well-known datasets: Vistex, Usptex, Outex13, CUReT, and MBT, and the results are compared with various literature methods, including deep convolutional neural networks. The proposed method achieved the highest overall performance across the five datasets, with 97.7% mean accuracy against the 97.0% achieved by the 50-layer ResNet convolutional neural network.

Funding: Consejo Nacional de Ciencia, Tecnología e Innovación Tecnológica - Concytec.

Keywords: network dynamic; complex networks; multilayer complex network.

Harvest record: 20.500.12390/981 ; oai:repositorio.concytec.gob.pe:20.500.12390/981, updated 2024-05-30 15:23:26.082; access right http://purl.org/coar/access_right/c_14cb (info:eu-repo/semantics/closedAccess, metadata only access); https://repositorio.concytec.gob.pe ; Repositorio Institucional CONCYTEC; repositorio@concytec.gob.pe.
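The modeling step the abstract describes (each color channel a layer, each pixel a vertex, edges constrained by a radius and an intensity threshold) can be sketched roughly as below. This is an illustrative reading only: the paper's exact edge-weight definition, its adaptive threshold selection, and its dynamic-evolution descriptors are not reproduced here, and the mean-degree summary is a stand-in toy descriptor.

```python
import numpy as np

def multilayer_degree_descriptor(img, radius=1, threshold=0.3):
    """Sketch of the multilayer modeling idea (illustrative assumptions,
    not the paper's exact formulation).

    Each pixel of each color channel is a vertex; its channel is its
    layer.  Two vertices are connected when their pixels lie within
    `radius` on the image grid (including the same pixel across
    different layers) and their normalized intensity difference does
    not exceed `threshold`.  As a toy descriptor we return the mean
    vertex degree per layer; the original method extracts richer
    topological measures over several radii and thresholds.
    """
    img = img.astype(float) / 255.0
    h, w, c = img.shape
    # all grid offsets within Euclidean distance `radius`, (0, 0) included
    offsets = [(dy, dx)
               for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if dy * dy + dx * dx <= radius * radius]
    degree = np.zeros((h, w, c))
    for dy, dx in offsets:
        # index ranges where both the pixel and its shifted neighbor exist
        ys0 = slice(max(0, -dy), h - max(0, dy))
        xs0 = slice(max(0, -dx), w - max(0, dx))
        ys1 = slice(max(0, dy), h + min(0, dy))
        xs1 = slice(max(0, dx), w + min(0, dx))
        for ci in range(c):      # layer of the source vertex
            for cj in range(c):  # layer of the neighbor vertex
                if (dy, dx) == (0, 0) and ci == cj:
                    continue     # no self-loops
                diff = np.abs(img[ys0, xs0, ci] - img[ys1, xs1, cj])
                degree[ys0, xs0, ci] += diff <= threshold
    return degree.mean(axis=(0, 1))  # one mean degree per layer
```

On a uniform image every candidate edge survives the threshold, so all three layers end up with the same mean degree; on a real texture, lowering `threshold` prunes edges differently within and between channels, which is the kind of spatial-interaction information the descriptors aim to capture.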
score: 13.243185
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository holding this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Science, Technology and Innovation in Open Access (ALICIA).