Land-Cover Semantic Segmentation for Very-High-Resolution Remote Sensing Imagery Using Deep Transfer Learning and Active Contour Loss

Article Description

Accurate land-cover segmentation of very-high-resolution aerial images is essential for a wide range of applications, including urban planning and natural resource management. However, automating this process remains challenging owing to the complexity of the images, variability in land-surface features, and noise. In this study, we propose a method for training convolutional neural networks and transformers to perform land-cover segmentation on very-high-resolution aerial images in a regional context. We assessed the U-Net-scSE, FT-U-NetFormer, and DC-Swin architectures, incorporating transfer learning and active contour loss functions to improve performance on semantic segmentation tasks. Our experiments on the OpenEarthMap dataset, which includes images from 44 countries, demonstrate the superior performance of U-Net-scSE models with EfficientNet-V2-XL and MiT-B4 encoders, which achieve an mIoU above 0.80 on a test set of urban and rural images from Peru.
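The abstract names the main ingredients of the training recipe: an encoder-decoder network with scSE decoder attention, a pretrained transformer encoder (transfer learning), and an active contour loss. Below is a minimal PyTorch sketch of how these pieces could fit together; it is an illustration, not the authors' released code. The use of segmentation_models_pytorch for model construction, the class count (9, assuming OpenEarthMap's 8 land-cover classes plus background), the Chan-Vese-style length-plus-region formulation of the active contour term, and all loss weights are assumptions made for this example.

```python
import torch
import torch.nn.functional as F
import segmentation_models_pytorch as smp

NUM_CLASSES = 9  # assumed: 8 OpenEarthMap land-cover classes + background

# U-Net with scSE decoder attention and a Mix Transformer (MiT-B4) encoder,
# initialized from ImageNet weights (the transfer-learning step).
model = smp.Unet(
    encoder_name="mit_b4",          # SegFormer-family encoder
    encoder_weights="imagenet",     # pretrained weights for transfer learning
    decoder_attention_type="scse",  # spatial + channel squeeze-and-excitation
    in_channels=3,
    classes=NUM_CLASSES,
)

def active_contour_loss(probs, onehot, w_region=1.0, eps=1e-8):
    """Active-contour-style loss: a contour-length term plus a region term.

    probs:  (N, C, H, W) softmax probabilities
    onehot: (N, C, H, W) one-hot ground truth
    """
    # Length term: total variation of the predicted masks, a discrete
    # approximation of the total boundary length of the segmentation.
    dy = probs[:, :, 1:, :] - probs[:, :, :-1, :]
    dx = probs[:, :, :, 1:] - probs[:, :, :, :-1]
    length = torch.sqrt(dy[:, :, :, :-1] ** 2 + dx[:, :, :-1, :] ** 2 + eps).mean()

    # Region term (Chan-Vese style): penalize probability mass that disagrees
    # with the mask interior (c = 1) and exterior (c = 0).
    region_in = (probs * (onehot - 1.0) ** 2).mean()
    region_out = ((1.0 - probs) * onehot ** 2).mean()
    return length + w_region * (region_in + region_out)

def training_loss(logits, target, w_ac=0.5):
    """Cross-entropy plus active-contour regularization (weight is illustrative).

    target: (N, H, W) integer labels in [0, NUM_CLASSES)
    """
    ce = F.cross_entropy(logits, target)
    probs = logits.softmax(dim=1)
    onehot = F.one_hot(target, NUM_CLASSES).permute(0, 3, 1, 2).float()
    return ce + w_ac * active_contour_loss(probs, onehot)
```

The design intuition behind the combination: the pretrained encoder supplies generic visual features that transfer to aerial imagery with limited regional training data, while the length term of the active contour loss discourages the ragged, noisy class boundaries that per-pixel losses alone tend to produce.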
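The metric reported above, mean intersection over union (mIoU), averages the per-class IoU = TP / (TP + FP + FN) over all classes. As a reference for how such a figure is commonly computed (not necessarily the authors' exact evaluation code), here is a small NumPy sketch based on an accumulated confusion matrix; it assumes all labels lie in [0, num_classes).

```python
import numpy as np

def update_confusion(conf, pred, gt, num_classes):
    """Accumulate one image's predicted and ground-truth label maps into a
    (C, C) confusion matrix with rows = ground truth, columns = prediction."""
    idx = gt.reshape(-1) * num_classes + pred.reshape(-1)
    conf += np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)
    return conf

def miou(conf):
    """Mean IoU from a (C, C) confusion matrix; classes absent from both
    prediction and ground truth are excluded from the mean."""
    tp = np.diag(conf).astype(float)
    fp = conf.sum(axis=0) - tp
    fn = conf.sum(axis=1) - tp
    denom = tp + fp + fn
    iou = np.where(denom > 0, tp / np.maximum(denom, 1), np.nan)
    return np.nanmean(iou)
```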


Bibliographic Details
Authors: Chicchon, Miguel; León Trujillo, Francisco James; Sipiran, Iván; Madrid Argomedo, Manuel Ricardo
Format: article
Publication date: 2025
Journal: IEEE Access (ISSN 2169-3536)
Publisher: Institute of Electrical and Electronics Engineers Inc. (US)
Institution: Universidad de Lima
Repository: ULIMA-Institucional
Language: English
OAI identifier: oai:repositorio.ulima.edu.pe:20.500.12724/23211
Resource links: https://hdl.handle.net/20.500.12724/23211
https://doi.org/10.1109/ACCESS.2025.3556632
Scopus ID: 2-s2.0-105002585792
Access level: open access
License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/)
Subject: Pending
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository hosting this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Open Access Science, Technology and Innovation (ALICIA).