Ideal step size estimation for the multinomial logistic regression


Bibliographic Details
Author: Ramirez Orihuela, Gabriel
Format: master's thesis
Publication Date: 2024
Institution: Pontificia Universidad Católica del Perú
Repository: PUCP-Tesis
Language: English
OAI Identifier: oai:tesis.pucp.edu.pe:20.500.12404/29791
Resource Link: http://hdl.handle.net/20.500.12404/29791
Access Level: open access
Subjects: Machine learning (Artificial intelligence)
Deep learning (Machine learning)
Mathematical optimization
Regression analysis
https://purl.org/pe-repo/ocde/ford#2.00.00
Description
Summary: At the core of deep learning optimization problems reside algorithms such as Stochastic Gradient Descent (SGD), which employs a subset of the data per iteration to estimate the gradient in order to minimize a cost function. Adaptive algorithms based on SGD are well known for making effective use of gradient information from past iterations, generating momentum or memory that enables a more accurate prediction of the true gradient slope in future iterations, thus accelerating convergence. Nevertheless, these algorithms still need an initial (scalar) learning rate (LR) as well as an LR scheduler. In this work we propose a new SGD algorithm that estimates the initial (scalar) LR via an adaptation of the ideal Cauchy step size for the multinomial logistic regression; furthermore, the LR is recursively updated up to a given number of epochs, after which a decaying LR scheduler is used. The proposed method is assessed on several well-known multiclass classification architectures and compares favorably against other well-tuned (scalar and spatially) adaptive alternatives, including the Adam algorithm.
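The classical Cauchy (exact line-search) step size the abstract refers to is alpha = (g.g) / (g.H.g) for gradient g and Hessian H. The following is a minimal sketch of that idea applied to plain multinomial logistic regression — not the thesis's algorithm — with the Hessian-vector product H g approximated by a finite difference of gradients; all function names are hypothetical.

```python
import numpy as np

def softmax(Z):
    # numerically stable row-wise softmax
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def grad(W, X, Y):
    # gradient of the mean cross-entropy loss of softmax regression
    # X: (n, d) features, Y: (n, k) one-hot labels, W: (d, k) weights
    P = softmax(X @ W)
    return X.T @ (P - Y) / X.shape[0]

def cauchy_step(W, X, Y, eps=1e-6):
    # Cauchy step along -g: alpha = (g.g) / (g.H.g), where the
    # Hessian-vector product H g is approximated by the finite
    # difference (grad(W + eps*g) - grad(W)) / eps
    g = grad(W, X, Y)
    Hg = (grad(W + eps * g, X, Y) - g) / eps
    return float((g * g).sum() / ((g * Hg).sum() + 1e-12))
```

In this sketch the step would be recomputed every iteration; the thesis instead adapts it to estimate an initial scalar LR for SGD and hands off to a decaying scheduler after a fixed number of epochs.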
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository hosting this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Open Access Science, Technology and Innovation (ALICIA).