Exploring double cross cyclic interpolation in unpaired image-to-image translation
Article Description
Unpaired image-to-image translation consists of transferring a sample a from domain A to an analogous sample b in domain B without intensive pixel-to-pixel supervision. Current approaches focus on learning a generative function that maps between the two domains while ignoring the latent information, even though exploring it does not require explicit supervision. This paper proposes a cross-domain GAN-based model that achieves bi-directional translation guided by latent-space supervision. The proposed architecture uses a double-loop cyclic reconstruction loss within an exchangeable training scheme adopted to reduce mode collapse and enhance local details. The proposal achieves strong results in visual quality, stability, and pixel-level segmentation metrics on several public datasets.
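The record does not include the paper's code, so below is a minimal sketch, in PyTorch, of the general mechanism the abstract describes: two generators trained with a double (forward and backward) cycle reconstruction loss plus a cross-domain latent interpolation term. The `Generator` architecture, the mixing coefficient `alpha`, and the loss weight `lam` are illustrative assumptions, not the authors' published design.

```python
# Hedged sketch of a double cycle-consistency loss with a latent interpolation
# term, in the spirit of the cross-domain GAN model described in the abstract.
# Architecture, alpha, and loss weights are assumptions, not the paper's design.
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Tiny encoder-decoder, only to make the loss sketch runnable."""

    def __init__(self, channels: int = 3, latent: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, latent, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, channels, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


def double_cycle_loss(g_ab, g_ba, real_a, real_b, alpha=0.5, lam=0.1):
    """Pixel-level double cycle reconstruction plus a latent interpolation term.

    The latent term blends the encoding of a sample with the encoding of its
    translation and asks the decoder to reproduce the original; this is one
    plausible way to supervise the latent space, not necessarily the paper's
    exact formulation.
    """
    l1 = nn.L1Loss()

    # Forward and backward cycles: A -> B -> A and B -> A -> B.
    fake_b = g_ab(real_a)
    fake_a = g_ba(real_b)
    cycle_a = l1(g_ba(fake_b), real_a)
    cycle_b = l1(g_ab(fake_a), real_b)

    # Cross-domain latent interpolation for the A -> B -> A loop.
    z_a = g_ab.encoder(real_a)
    z_fake_b = g_ba.encoder(fake_b)
    z_mix = alpha * z_a + (1.0 - alpha) * z_fake_b
    latent_term = l1(g_ba.decoder(z_mix), real_a)

    return cycle_a + cycle_b + lam * latent_term


if __name__ == "__main__":
    g_ab, g_ba = Generator(), Generator()
    a, b = torch.randn(2, 3, 64, 64), torch.randn(2, 3, 64, 64)
    print(double_cycle_loss(g_ab, g_ba, a, b).item())
```

In a full training loop this reconstruction objective would be combined with adversarial discriminator losses for each domain, which are omitted here for brevity.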
| Field | Value |
|---|---|
| Authors: | Lopez J.; Mauricio A.; Camara G. |
| Format: | article |
| Publication date: | 2019 |
| Institution: | Consejo Nacional de Ciencia Tecnología e Innovación |
| Repository: | CONCYTEC-Institucional |
| Language: | English |
| OAI identifier: | oai:repositorio.concytec.gob.pe:20.500.12390/2695 |
| Resource links: | https://hdl.handle.net/20.500.12390/2695 ; https://doi.org/10.1109/SIBGRAPI.2019.00025 |
| Access level: | open access |
| Subjects: | Unpaired Image to Image Translation; Cross domain interpolation; Latent space exploration; http://purl.org/pe-repo/ocde/ford#2.02.03 |
Full record:

| Field | Value |
|---|---|
| Record id | CONC_881c77ce224d1460e5e48da6ee83982a |
| OAI identifier | oai:repositorio.concytec.gob.pe:20.500.12390/2695 |
| Title | Exploring double cross cyclic interpolation in unpaired image-to-image translation |
| Authors | Lopez J.; Mauricio A.; Camara G. |
| Type | info:eu-repo/semantics/article |
| Date issued | 2019 |
| Date accessioned/available | 2024-05-30T23:13:38Z |
| Handle | https://hdl.handle.net/20.500.12390/2695 |
| DOI | https://doi.org/10.1109/SIBGRAPI.2019.00025 |
| Scopus id | 2-s2.0-85077031640 |
| Language | eng |
| Published in | Proceedings - 32nd Conference on Graphics, Patterns and Images, SIBGRAPI 2019 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Rights | info:eu-repo/semantics/openAccess |
| Funder | Consejo Nacional de Ciencia, Tecnología e Innovación Tecnológica - Concytec |
| Source | reponame:CONCYTEC-Institucional; instname:Consejo Nacional de Ciencia Tecnología e Innovación; instacron:CONCYTEC |
| Repository contact | Repositorio Institucional CONCYTEC (repositorio@concytec.gob.pe) |
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository where this document or data set is hosted. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Science, Technology and Innovation in Open Access (ALICIA).