Multilayer complex network descriptors for color–texture characterization
Article description
L. F. S. Scabini acknowledges support from CNPq (Grants #134558/2016-2 and #142438/2018-9). O. M. Bruno acknowledges support from CNPq (Grants #307797/2014-7 and #484312/2013-8) and FAPESP (Grants #14/08026-1 and #16/18809-9). R. H. M. Condori acknowledges support from Cienciactiva, an initiative of the National Council of Science, Technology and Technological Innovation-CONCYTEC (Peru). W. N. Gonçalves acknowledges support from CNPq (Grant #304173/2016-9) and Fundect (Grant #071/2015). The authors are grateful to Abdelmounaime Safia for the feedback concerning the MBT dataset construction, and to the NVIDIA GPU Grant Program for the donation of the Quadro P6000 and Titan Xp GPUs used in this research.
Authors: | Scabini L.F.S.; Condori R.H.M.; Gonçalves W.N.; Bruno O.M. |
---|---|
Format: | article |
Publication date: | 2019 |
Institution: | Consejo Nacional de Ciencia Tecnología e Innovación |
Repository: | CONCYTEC-Institucional |
Language: | English |
OAI Identifier: | oai:repositorio.concytec.gob.pe:20.500.12390/522 |
Resource links: | https://hdl.handle.net/20.500.12390/522 https://doi.org/10.1016/j.ins.2019.02.060 |
Access level: | open access |
Subjects: | Threshold selection; Classification (of information); Color; Convolution; Deep neural networks; Feature extraction; Multilayers; Network layers; Neural networks; Textures; Adaptive approach; Characterization techniques; Convolutional neural network; Multi-layer network; Spatial interaction; Texture analysis; Texture characterizations; Complex networks; https://purl.org/pe-repo/ocde/ford#1.02.02 |
id |
CONC_81f352c07749ac9d8b7e14ab2df90e8c |
oai_identifier_str |
oai:repositorio.concytec.gob.pe:20.500.12390/522 |
network_acronym_str |
CONC |
network_name_str |
CONCYTEC-Institucional |
repository_id_str |
4689 |
dc.title.none.fl_str_mv |
Multilayer complex network descriptors for color–texture characterization |
title |
Multilayer complex network descriptors for color–texture characterization |
spellingShingle |
Multilayer complex network descriptors for color–texture characterization Scabini L.F.S. Threshold selection Classification (of information) Color Convolution Deep neural networks Feature extraction Multilayers Network layers Neural networks Textures Adaptive approach Characterization techniques Convolutional neural network Multi-layer network Spatial interaction Texture analysis Texture characterizations Complex networks https://purl.org/pe-repo/ocde/ford#1.02.02 |
title_short |
Multilayer complex network descriptors for color–texture characterization |
title_full |
Multilayer complex network descriptors for color–texture characterization |
title_fullStr |
Multilayer complex network descriptors for color–texture characterization |
title_full_unstemmed |
Multilayer complex network descriptors for color–texture characterization |
title_sort |
Multilayer complex network descriptors for color–texture characterization |
author |
Scabini L.F.S. |
author_facet |
Scabini L.F.S. Condori R.H.M. Gonçalves W.N. Bruno O.M. |
author_role |
author |
author2 |
Condori R.H.M. Gonçalves W.N. Bruno O.M. |
author2_role |
author author author |
dc.contributor.author.fl_str_mv |
Scabini L.F.S. Condori R.H.M. Gonçalves W.N. Bruno O.M. |
dc.subject.none.fl_str_mv |
Threshold selection |
topic |
Threshold selection Classification (of information) Color Convolution Deep neural networks Feature extraction Multilayers Network layers Neural networks Textures Adaptive approach Characterization techniques Convolutional neural network Multi-layer network Spatial interaction Texture analysis Texture characterizations Complex networks https://purl.org/pe-repo/ocde/ford#1.02.02 |
dc.subject.es_PE.fl_str_mv |
Classification (of information) Color Convolution Deep neural networks Feature extraction Multilayers Network layers Neural networks Textures Adaptive approach Characterization techniques Convolutional neural network Multi-layer network Spatial interaction Texture analysis Texture characterizations Complex networks |
dc.subject.ocde.none.fl_str_mv |
https://purl.org/pe-repo/ocde/ford#1.02.02 |
description |
L. F. S. Scabini acknowledges support from CNPq (Grants #134558/2016-2 and #142438/2018-9). O. M. Bruno acknowledges support from CNPq (Grants #307797/2014-7 and #484312/2013-8) and FAPESP (Grants #14/08026-1 and #16/18809-9). R. H. M. Condori acknowledges support from Cienciactiva, an initiative of the National Council of Science, Technology and Technological Innovation-CONCYTEC (Peru). W. N. Gonçalves acknowledges support from CNPq (Grant #304173/2016-9) and Fundect (Grant #071/2015). The authors are grateful to Abdelmounaime Safia for the feedback concerning the MBT dataset construction, and to the NVIDIA GPU Grant Program for the donation of the Quadro P6000 and Titan Xp GPUs used in this research.
publishDate |
2019 |
dc.date.accessioned.none.fl_str_mv |
2024-05-30T23:13:38Z |
dc.date.available.none.fl_str_mv |
2024-05-30T23:13:38Z |
dc.date.issued.fl_str_mv |
2019 |
dc.type.none.fl_str_mv |
info:eu-repo/semantics/article |
format |
article |
dc.identifier.uri.none.fl_str_mv |
https://hdl.handle.net/20.500.12390/522 |
dc.identifier.doi.none.fl_str_mv |
https://doi.org/10.1016/j.ins.2019.02.060 |
dc.identifier.scopus.none.fl_str_mv |
2-s2.0-85063901933 |
url |
https://hdl.handle.net/20.500.12390/522 https://doi.org/10.1016/j.ins.2019.02.060 |
identifier_str_mv |
2-s2.0-85063901933 |
dc.language.iso.none.fl_str_mv |
eng |
language |
eng |
dc.relation.ispartof.none.fl_str_mv |
Information Sciences |
dc.rights.none.fl_str_mv |
info:eu-repo/semantics/openAccess |
eu_rights_str_mv |
openAccess |
dc.publisher.none.fl_str_mv |
Elsevier Inc. |
publisher.none.fl_str_mv |
Elsevier Inc. |
dc.source.none.fl_str_mv |
reponame:CONCYTEC-Institucional instname:Consejo Nacional de Ciencia Tecnología e Innovación instacron:CONCYTEC |
instname_str |
Consejo Nacional de Ciencia Tecnología e Innovación |
instacron_str |
CONCYTEC |
institution |
CONCYTEC |
reponame_str |
CONCYTEC-Institucional |
collection |
CONCYTEC-Institucional |
repository.name.fl_str_mv |
Repositorio Institucional CONCYTEC |
repository.mail.fl_str_mv |
repositorio@concytec.gob.pe |
_version_ |
1839175720444624896 |
spelling |
Scabini L.F.S.; Condori R.H.M.; Gonçalves W.N.; Bruno O.M. (2019). Multilayer complex network descriptors for color–texture characterization. Information Sciences, Elsevier Inc. https://hdl.handle.net/20.500.12390/522 https://doi.org/10.1016/j.ins.2019.02.060 (Scopus: 2-s2.0-85063901933)
Abstract: A new method based on complex networks is proposed for color–texture analysis. The proposal consists of modeling the image as a multilayer complex network where each color channel is a layer, and each pixel (in each color channel) is represented as a network vertex. The network dynamic evolution is accessed using a set of modeling parameters (radii and thresholds), and new characterization techniques are introduced to capture information regarding within- and between-color-channel spatial interaction. An automatic and adaptive approach for threshold selection is also proposed. We conduct classification experiments on 5 well-known datasets: Vistex, Usptex, Outex13, CURet, and MBT. Results are compared against various methods from the literature, including deep convolutional neural networks. The proposed method presented the highest overall performance over the 5 datasets, with 97.7% mean accuracy against the 97.0% achieved by the ResNet convolutional neural network with 50 layers.
Sponsorship: Consejo Nacional de Ciencia, Tecnología e Innovación Tecnológica - Concytec.
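The abstract above describes the multilayer modeling step only in prose. The snippet below is a minimal, illustrative sketch of that general idea, not the authors' actual descriptors: each color channel is taken as a network layer, each pixel as a vertex, and vertices no farther apart than a radius are linked, within and across layers, when their intensity difference stays below a threshold; a simple degree histogram then serves as the texture signature. The function and parameter names (`multilayer_cn_descriptor`, `radius`, `threshold`, `bins`) are assumptions introduced for this example only.

```python
import numpy as np

def multilayer_cn_descriptor(image, radius=2, threshold=0.1, bins=16):
    """Degree-histogram descriptor of a multilayer pixel network (illustrative sketch).

    Each color channel is treated as a network layer and each pixel as a vertex.
    Two vertices are connected (within a layer or across layers) when they lie at
    most `radius` pixels apart and their normalized intensity difference does not
    exceed `threshold`.
    """
    img = image.astype(np.float64) / 255.0            # normalize intensities to [0, 1]
    h, w, n_layers = img.shape
    degree = np.zeros((h, w, n_layers))               # one vertex per pixel per layer

    # Neighbor offsets inside the radius, excluding the center pixel.
    offsets = [(dy, dx)
               for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if (dy, dx) != (0, 0) and dy * dy + dx * dx <= radius * radius]

    for dy, dx in offsets:
        # Overlapping windows so that a[y, x] is compared with img[y + dy, x + dx].
        ys, ye = max(-dy, 0), h + min(-dy, 0)
        xs, xe = max(-dx, 0), w + min(-dx, 0)
        a = img[ys:ye, xs:xe, :]
        b = img[ys + dy:ye + dy, xs + dx:xe + dx, :]
        for la in range(n_layers):                     # layer of the source vertex
            for lb in range(n_layers):                 # layer of the target vertex
                connected = np.abs(a[:, :, la] - b[:, :, lb]) <= threshold
                degree[ys:ye, xs:xe, la] += connected  # count within- and between-layer edges

    # Normalized degree distribution as the texture signature.
    max_degree = len(offsets) * n_layers
    hist, _ = np.histogram(degree, bins=bins, range=(0, max_degree))
    return hist / hist.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)   # random "texture"
    print(multilayer_cn_descriptor(toy, radius=2, threshold=0.1))
```

In the published method the network is characterized over several radii and an adaptive, automatically selected set of thresholds, and the resulting measures are combined into the final descriptor; the single degree histogram above stands in for those richer measurements only to keep the example short.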
score |
13.441895 |
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository holding this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Open Access Science, Technology and Innovation (ALICIA).