Suggested topics within your search.
https://purl.org/pe-repo/ocde/ford#5.02.04
1,286
https://purl.org/pe-repo/ocde/ford#2.11.04
1,130
https://purl.org/pe-repo/ocde/ford#2.02.04
452
https://purl.org/pe-repo/ocde/ford#3.03.03
447
https://purl.org/pe-repo/ocde/ford#5.03.01
380
SDG 3: Good health and well-being. Ensure healthy lives and promote well-being for all at all ages
322
Pre-feasibility studies
235
1
article
Published 2025
Link
This research focuses on implementing a reengineering of the process for measuring the Stock Keeping Units (SKUs) imported by the distribution center of a company specializing in the sale of home-improvement products and construction materials. To identify the factors influencing the process and to optimize it, a time-and-motion study was carried out using the tool known as the spaghetti diagram. As a result, productivity increased and idle times decreased, making it possible to measure the full universe of SKUs in less time. The study also led to configuring the system with correct data, which facilitated operations within the warehouse.
2
article
Published 2019
Link
Is there a correspondence or affinity between the juridical-principiological and factual-economic conceptions for the effective protection of the consent of the holder of personal data when contracting in a network? Under the mantle of this question, the article analyzes the contemporary contractual scenario from the perspective of privacy policies and the Brazilian General Data Protection Law (LGPD). In this context, a skeptical reflection is proposed on the principles and economic guidelines defended by law and doctrine, in order to verify whether consent is an instrument of real effectiveness for the protection of subjects in a network. The first topic concerns the conceptual analysis of consent in the LGPD and in the specialized doctrine. The second topic deals with the limited rationality of users of network services in understanding the provisions in the pol...
3
technical report
This paper systematizes the concepts inherent to the Data Warehouse model, addressing each of them in an orderly manner within a clear conceptual framework in which their characteristics and qualities are laid out, always bearing in mind their relationship or interrelation with the other components of the environment. First, the general concepts related to the Data Warehouse are defined. Next, the definition of requirements and the business processes for modeling a Data Warehouse are introduced, and their most relevant and significant aspects are presented. Then, all the components involved in Data Integration are specified and detailed in an organized and intuitive way, attending to their interrelation. After that, the Dimensional Design for the business processes is described. Finally, some concepts are described that...
4
master's thesis
Published 2021
Link
Based on high-precision GNSS observation data and the coordinate changes of the CORS monitoring stations before and after the 2019 magnitude-8.0 Peru earthquake, the author developed surface-deformation analysis software built on scientific processing software, which has scientific and practical value for investigating the earthquake's epicenter, magnitude, and geodynamics. The results were obtained using the scientific GNSS processing software PANDA, a precision package for GNSS data analysis developed by Wuhan University, China. The results are of high precision, on the order of millimeters, and show a displacement of about 2 cm, toward the northwest, at the GNSS stations near the earthquake.
5
undergraduate thesis
Published 2023
Link
Music source separation is the task of isolating the musical phrases played by different instruments that were recorded individually and arranged together to form a song. Several methods have been developed for music source separation, which can be classified into supervised and unsupervised learning; however, no research had analyzed the effectiveness of using different methods together. For that reason, the present work measures the results of using two methods, REPET+ (unsupervised) and UNet (supervised), jointly and in isolation, to separate the music waves produced by a singer from the waves of the instruments. The results show an overall score (SDR) for vocal separation of 5.38 dB for the UNet network, -4.3 dB for REPET+, -2.55 dB for REPET+ & UNet, -0.38 dB for UNet & REPET+, -6.16 dB for REPET+ & REPET+ and 5.17...
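The SDR figures quoted in this abstract can be understood through a simplified version of the metric. The sketch below is a toy definition, not the full BSS Eval procedure the field typically uses, and the reference signal and noise level are invented for illustration:

```python
# Simplified signal-to-distortion ratio (SDR): compare a reference
# source against an estimate directly. Full BSS Eval additionally
# decomposes the error into interference and artifact terms.
import numpy as np

def sdr(reference: np.ndarray, estimate: np.ndarray) -> float:
    noise = reference - estimate
    return 10.0 * np.log10(np.sum(reference**2) / np.sum(noise**2))

t = np.linspace(0, 1, 8000)
ref = np.sin(2 * np.pi * 440 * t)  # stand-in "vocal" reference
est = ref + 0.1 * np.random.default_rng(0).standard_normal(t.size)

print(f"SDR = {sdr(ref, est):.1f} dB")  # roughly 17 dB at this noise level
```

Higher SDR means less residual distortion, which is why the 5.38 dB UNet result above outperforms the negative REPET+ scores.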
6
article
Published 2018
Link
Disruptive technologies and their impact on journalism and communication force us to take on the challenge of learning new techniques for data and information processing. Interdisciplinary knowledge is evident in the teaching of new professional profiles. Data journalism is an example of this, so immersion into a data culture must be preceded by awareness in learning news applications, algorithms, or the treatment of Big Data, elements that configure new paradigms among journalists of online media. Through a review of texts, direct observation of selected applications, and a case study, conclusions are established that show a growing demand for knowledge of new techniques. The results show the use of technological resources and a proposal for changes in the curricula of communication faculties.
8
article
Published 2023
Link
Process automation is being implemented in different disciplines of the earth sciences, as seen in libraries such as Pyrolite, PyGeochemCalc, dh2loop 1.0, NeuralHydrology, and GeoPyToo, among others. The present work describes a methodology to automate univariate geochemical analysis using Python and open-source packages such as pandas, seaborn, matplotlib, and statsmodels, integrated into a script in a local work environment such as a Jupyter notebook or an online environment such as Google Colaboratory. The script is designed to process any type of geochemical data, making it possible to remove outliers and to perform calculations and plots of the elements and their respective geological domain. The results include graphics such as boxplots and quantile-quantile plots, along with normality tests and geochemical parameters, making it possible to determine the background and threshol...
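As a rough illustration of the kind of univariate workflow this abstract describes (not the paper's actual script), the sketch below uses pandas, matplotlib, and scipy on invented data. The 1.5·IQR outlier rule, the Shapiro-Wilk normality test (here via scipy rather than statsmodels), and the mean + 2·std threshold are common conventions assumed for the example, not values taken from the abstract:

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt
from scipy import stats

# Synthetic, right-skewed "geochemical" data (lognormal, like many trace elements)
rng = np.random.default_rng(0)
df = pd.DataFrame({"Cu_ppm": rng.lognormal(mean=3.0, sigma=0.5, size=500)})

# Remove outliers with the 1.5*IQR rule
q1, q3 = df["Cu_ppm"].quantile([0.25, 0.75])
iqr = q3 - q1
clean = df[(df["Cu_ppm"] >= q1 - 1.5 * iqr) & (df["Cu_ppm"] <= q3 + 1.5 * iqr)]

# Shapiro-Wilk normality test on log-transformed values
stat, p = stats.shapiro(np.log(clean["Cu_ppm"]))

# Background and anomaly threshold: mean and mean + 2*std of cleaned values
background = clean["Cu_ppm"].mean()
threshold = background + 2 * clean["Cu_ppm"].std()

# Boxplot and quantile-quantile plot, saved to disk
fig, axes = plt.subplots(1, 2, figsize=(10, 4))
axes[0].boxplot(clean["Cu_ppm"])
axes[0].set_title("Boxplot")
stats.probplot(clean["Cu_ppm"], plot=axes[1])  # Q-Q plot vs. normal
fig.savefig("cu_univariate.png")

print(f"n={len(clean)}, background={background:.1f}, threshold={threshold:.1f}")
```

In a notebook the two figures would be displayed inline instead of saved; the background/threshold step is where a real script would branch per geological domain.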
9
article
Published 2014
Link
Globalization has intensified competition in many markets. To remain competitive, companies seek to satisfy the needs of customers by meeting market requirements. In this context, Process Capability Indices (PCI) play a crucial role in assessing the quality of processes. For non-normal data there are two general approaches, based on transformations (Box-Cox and Johnson transformations) and on percentiles (Pearson's and Burr's distribution systems). However, previous studies comparing these methods reach different conclusions, and thus the need arises to clarify the differences between them in order to implement a proper estimation of these indices. In this paper, a simulation study is conducted to compare the above methods and to propose an appropriate methodology for estimating the PCI for non-normal data. Furthermore, it is concluded that the best method ...
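One of the transformation approaches named here (Box-Cox) can be sketched as follows. The data, specification limits, and the classic Cpk formula are illustrative assumptions for the example, not the paper's simulation design:

```python
# Box-Cox approach to a capability index for non-normal data:
# transform the data to approximate normality, apply the SAME fitted
# lambda to the specification limits, then compute Cpk as usual.
import numpy as np
from scipy import stats
from scipy.special import boxcox as bc

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.0, sigma=0.3, size=1000)  # skewed process data
lsl, usl = 0.4, 2.5  # hypothetical specification limits (must be positive)

transformed, lam = stats.boxcox(data)      # fit lambda by maximum likelihood
t_lsl, t_usl = bc(lsl, lam), bc(usl, lam)  # transform the limits with it

# Classic Cpk on the (approximately normal) transformed data
mu, sigma = transformed.mean(), transformed.std(ddof=1)
cpk = min(t_usl - mu, mu - t_lsl) / (3 * sigma)
print(f"lambda={lam:.2f}, Cpk={cpk:.2f}")
```

The percentile-based alternatives the abstract mentions instead replace the ±3σ span with fitted distribution percentiles (e.g., the 0.135% and 99.865% points), avoiding the transformation step.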
11
doctoral thesis
Published 2024
Link
This dissertation investigates the potential improvement of volcanic eruption understanding and forecasting methods by using advanced data processing techniques to analyze large datasets at three target volcanoes (Piton de la Fournaise (PdlF) (France), Sabancaya, and Ubinas (Peru)). The central objective of this study is to search for possible empirical relationships between the pre-eruptive behavior of the accelerated increase in seismic activity using the Failure Forecast Method (FFM) and velocity variations measured by Coda Wave Interferometry (CWI), since both observations are reported to be independently associated with medium damage. The FFM is a deterministic method used to forecast volcanic eruptions using an empirical relationship of increased and accelerated evolution of an observable (e.g., volcano-seismic event rates). The event rates used with FFM in this study were generate...
13
article
Published 2019
Link
This article presents a methodology that applies natural language processing and classification algorithms using data mining techniques, incorporating procedures for validation and verification of significance. This is conducted according to the analysis and selection of data and results based on statistical quality analysis, which guarantees the effectiveness percentage in knowledge construction. The analysis of computer incidents within an educational institution, together with a standardized database of historical computer incidents collected by the Service Desk area, is used as a case study. That area is linked to all information technology processes and focuses on the support requirements for the performance of employee activities. When users' requirements are not fulfilled in a timely manner, the impact of incidents may give rise to work problems at different levels, making it d...
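A heavily simplified, hypothetical stand-in for the pipeline this abstract describes (free-text incident reports turned into features and classified) could look like the following. The categories and ticket texts are invented, and a real system would use proper NLP and classification libraries rather than this tiny bag-of-words nearest-centroid scheme:

```python
from collections import Counter

# Invented labeled incident tickets
train = [
    ("printer not responding on floor 2", "hardware"),
    ("screen flickers after boot", "hardware"),
    ("cannot log in to email account", "software"),
    ("application crashes when saving file", "software"),
]

def bow(text: str) -> Counter:
    """Bag-of-words feature vector: word -> count."""
    return Counter(text.lower().split())

def similarity(a: Counter, b: Counter) -> int:
    """Crude overlap score: number of shared word occurrences."""
    return sum((a & b).values())

# One aggregate bag of words per class (a crude centroid)
centroids: dict[str, Counter] = {}
for text, label in train:
    centroids.setdefault(label, Counter()).update(bow(text))

def classify(text: str) -> str:
    words = bow(text)
    return max(centroids, key=lambda lbl: similarity(words, centroids[lbl]))

print(classify("email application cannot save file"))  # prints "software"
```

The validation and significance-verification steps the article adds would sit on top of this: held-out evaluation of the classifier plus statistical checks on the resulting label distribution.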
15
article
Published 2005
Link
A Schlumberger resistivity survey was carried out over an area of 50 hectares. A new processing approach based on the analytical signal response of the resistivity data was tested in the presence of disturbed phosphate deposits. Geological models were successively obtained from a peak model of the 2D resistivity data. The optimization of the imaging process was based on the optimization of surface tools. The downward analytical continuation of the modeled surface to a depth of 30 meters was used to optimize the modeling. The analytical processes proved consistently useful. The estimation of the phosphate reserve was improved and better constructed.
16
undergraduate thesis
Published 2022
Link
This work details the activities performed over my years of working experience. During this time, I have been able to develop the technical and management abilities I learned as a university student. I have also been able to learn about new activities and processes that rely heavily on different technologies to improve businesses and prevent risks. Since my first work experiences, I have been involved in database administration and support, using processing tools ranging from simple macros and spreadsheets to specialized data science and analytics software. I learned different data science techniques such as regression and classification models, anomaly detection, and clustering, with the opportunity to apply these models to different datasets. Through the years, I was also able to learn new topics, including financial audit, e-discovery, and forensic data anal...
17
undergraduate thesis
Published 2025
Link
This work aims to describe and analyze the benefits of implementing the sales and distribution module of the Systems, Applications, and Products in Data Processing (SAP) system at the company Cumbra Ingeniería S.A. It focuses on how this sales module has optimized the management of commercial processes, from quotation to invoicing. Adopting the system has improved operational efficiency and enabled real-time integration for the accounting, projects, and finance areas. Thanks to the automation of tasks and the implementation of the system, the company has obtained accurate data on its sales, which has favored informed decision-making for the future. This work also addresses the challenges faced during the implementation, the theoretical frameworks used, and the r...
18
undergraduate thesis
Published 2020
Link
It is known that 33% of traffic accidents worldwide are caused by drunk driving or drowsiness [1] [2], so a drowsiness-level detection system integrating image processing was developed using a Raspberry Pi 3 with the OpenCV library, together with sensors such as the MQ-3, which measures alcohol percentage, and the S9 sensor, which measures heart rate. In addition, it has an alert system and, as an interface for visualizing the data measured by the sensors, a touch screen. With the image processing technique, facial expressions are analyzed, while physiological behaviors such as heart rate and alcohol percentage are measured with the sensors. In image training tests, an accuracy of x is obtained in a response time of x seconds. On the other hand, the evaluation of the operation of the sensors is 90% effective. Thus, the method developed is effective and feasible.
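The decision logic implied by such a system, fusing the camera-based drowsiness cue with the two sensor readings, might be sketched as follows. All thresholds and field names are illustrative assumptions, not values from the thesis:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    eyes_closed_frames: int  # consecutive frames with eyes detected closed (from OpenCV)
    alcohol_pct: float       # alcohol estimate from the MQ-3 sensor
    heart_rate_bpm: int      # pulse from the S9 sensor

def should_alert(r: Reading,
                 closed_frames_limit: int = 15,
                 alcohol_limit: float = 0.05,
                 low_hr: int = 50) -> bool:
    """Raise the alert if any single cue crosses its (assumed) threshold."""
    drowsy = r.eyes_closed_frames >= closed_frames_limit
    drunk = r.alcohol_pct >= alcohol_limit
    bradycardic = r.heart_rate_bpm < low_hr  # very low pulse can accompany sleep onset
    return drowsy or drunk or bradycardic

print(should_alert(Reading(20, 0.0, 70)))  # eyes closed too long -> True
print(should_alert(Reading(3, 0.0, 72)))   # normal reading -> False
```

On the actual hardware, `eyes_closed_frames` would come from the OpenCV face-analysis loop and the other two fields from GPIO sensor reads, with the result driving the touch-screen alert.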
19
article
Published 2015
Link
This article is based on research conducted by the authors between 2011 and 2012, in the framework of research projects sponsored by the Scientific Research Institute of the University of Lima (IDIC). In that research, the technological evolution of the datacenter was analyzed, and the technology and management aspects of a Virtualized Datacenter (DCV) were proposed in order to provide services for the development of laboratories in the academic university field. The findings of this research were applied in the planning process of the “Laboratory Datacenter” of the Systems Engineering Career at the University of Lima, which was used in implementing the new syllabus.
20
undergraduate thesis
Optimization and standardization of the sales process in a service sector company through lean tools
Published 2024
Link
Introduction: Lead times in the design and implementation of spaces in Lima represent 42% of the causes of customer dissatisfaction in the sales process. The research therefore focuses on optimizing lead times, as they are usually very high in small companies (SMEs) with non-standardized, and therefore inefficient, processes. Methods: A model is proposed that delineates a standardization plan for a process. Initially, it is necessary to map the existing state of the sales process, recommend a new mapping of the quotation process using Value Stream Mapping (VSM), locate the critical path using Pert-CPM, and identify the activity causing the greatest process delay. To simplify the identification of standards, data is then gathered and categorized. After standardizing the sales procedure and validating the proposal with the Arena simulator, the outcomes are as...
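The Pert-CPM step mentioned above can be illustrated with a toy critical-path computation. The activities, dependencies, and durations below are invented for illustration, not taken from the thesis:

```python
# Toy CPM: earliest-finish times via recursion over predecessor links.
# activity -> (duration in days, list of predecessor activities)
activities = {
    "map_process":  (2, []),
    "draft_quote":  (3, ["map_process"]),
    "review_quote": (1, ["draft_quote"]),
    "client_visit": (5, ["map_process"]),
    "close_sale":   (2, ["review_quote", "client_visit"]),
}

earliest_finish: dict[str, int] = {}

def finish(act: str) -> int:
    # earliest finish = own duration + latest earliest-finish among predecessors
    if act not in earliest_finish:
        dur, preds = activities[act]
        earliest_finish[act] = dur + max((finish(p) for p in preds), default=0)
    return earliest_finish[act]

project_length = max(finish(a) for a in activities)
print(project_length)  # 2 + 5 + 2 = 9 days along the critical path
```

The critical path is whichever predecessor chain attains `project_length` (here map_process → client_visit → close_sale); that chain's slowest activity is the one the thesis's method singles out for standardization first.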