Suggested topics within your search:

https://purl.org/pe-repo/ocde/ford#5.02.04 (1,172)
https://purl.org/pe-repo/ocde/ford#2.11.04 (683)
https://purl.org/pe-repo/ocde/ford#2.02.04 (490)
https://purl.org/pe-repo/ocde/ford#3.03.03 (454)
https://purl.org/pe-repo/ocde/ford#5.03.01 (388)
SDG 3: Good health and well-being. Ensure healthy lives and promote well-being for all at all ages (325)
https://purl.org/pe-repo/ocde/ford#2.01.01 (194)
Showing 1 - 20 of 10,265 results for the search 'para ((willd processing) OR (((pre processing) OR (data processing))))' (query time: 3.54s).
1
article
This article demonstrates the feasibility of installing a small nectar-processing plant in Huaral, verifying that a potential market exists, recommending the most appropriate technology, and demonstrating the project's economic and financial viability.
2
doctoral thesis
This dissertation investigates the potential improvement of volcanic eruption understanding and forecasting methods by using advanced data processing techniques to analyze large datasets at three target volcanoes: Piton de la Fournaise (PdlF) (France), Sabancaya, and Ubinas (Peru). The central objective of this study is to search for possible empirical relationships between the pre-eruptive behavior of the accelerated increase in seismic activity, using the Failure Forecast Method (FFM), and velocity variations measured by Coda Wave Interferometry (CWI), since both observations are reported to be independently associated with damage in the medium. The FFM is a deterministic method used to forecast volcanic eruptions through an empirical relationship describing the increasing and accelerating evolution of an observable (e.g., volcano-seismic event rates). The event rates used with the FFM in this study were generated...
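The FFM's classic inverse-rate form lends itself to a compact illustration. Below is a minimal sketch, assuming a synthetic series of daily volcano-seismic event counts rather than the dissertation's data: since the reciprocal of an accelerating rate tends to decrease roughly linearly, its extrapolated zero-crossing gives a failure-time estimate.

# Failure Forecast Method, inverse-rate form: fit a line to 1/rate and
# extrapolate to zero. The daily event counts are synthetic placeholders.
import numpy as np

days = np.arange(10)
rate = np.array([4, 5, 6, 8, 10, 13, 18, 25, 40, 70], dtype=float)  # events/day

inv_rate = 1.0 / rate
slope, intercept = np.polyfit(days, inv_rate, 1)  # linear fit to the inverse rate
t_failure = -intercept / slope                    # day where 1/rate reaches zero

print(f"forecast failure time: day {t_failure:.1f}")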
3
article
In this study, an analysis of pre-trial proceedings is made, given the new scenario of scheduling virtual hearings and the subsequent procedural burden that will arise once the state of emergency is lifted. Moreover, operational aspects are raised for the correct application of this procedural institution, and statistical data on its use in specialized courts in different parts of the country are evaluated, in order to reassess the advantages of pre-trial proceedings as a tool for procedural simplification that leads to expeditious, effective and timely resolution.
4
article
This research focuses on reengineering the process for measuring the Stock Keeping Units (SKUs) imported by the distribution center of a company specializing in the sale of home-improvement products and construction materials. To identify the factors that influence the process and to optimize it, a time-and-motion study was carried out using the tool known as the spaghetti diagram. As a result, productivity increased and dead time decreased, making it possible to measure the full universe of SKUs in less time. The study also led to configuring the system with correct data, which facilitated operation within the warehouse.
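The productivity arithmetic behind such a time study is straightforward. A minimal sketch follows, assuming hypothetical before/after cycle-time samples rather than the study's actual measurements:

# Time-study comparison with invented data: productivity gain from a
# reduced average cycle time over an 8-hour shift.
before_s = [412, 398, 405, 420, 415]   # seconds per SKU batch, before reengineering
after_s  = [301, 295, 310, 298, 305]   # seconds per SKU batch, after

avg_before = sum(before_s) / len(before_s)
avg_after = sum(after_s) / len(after_s)

shift_s = 8 * 3600                     # seconds available per shift
print(f"throughput before: {shift_s / avg_before:.1f} batches/shift")
print(f"throughput after:  {shift_s / avg_after:.1f} batches/shift")
print(f"cycle-time reduction: {100 * (1 - avg_after / avg_before):.1f}%")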
5
article
Today X has become one of the most important social networks for expressing opinions and interests on the web. The large amount of data generated allows automated systems to profile users based on gender, nationality and thematic interests. There are difficulties in this process, not only because of the short content but also because of the ambiguity and the use of several languages. The goal of this proposal is to build a deep learning model using BERT that is able to identify demographic and thematic attributes from tweets. Pre-trained models of the BERT and Multilingual BERT type will be used, applied to the PAN Author Profiling Task (CLEF 2019) corpora in English and Spanish. The proposed work will deepen the analysis using supervised classification for gender and nationality, and topic extraction through unsupervised techniques such as LDA and BERTopic. These options include...
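As a rough illustration of the pipeline this implies, here is a minimal sketch using Multilingual BERT through the Hugging Face transformers library. The two-class label set and the sample tweets are placeholders, not the PAN corpus, and the classification head below is untrained; fine-tuning on labeled profiles would come first.

# Tokenize short multilingual texts and score them with a BERT
# classification head; predictions are meaningless until fine-tuned.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=2  # e.g., a binary gender label
)

tweets = ["me encanta este partido de futbol",
          "just finished grading the exams"]
batch = tokenizer(tweets, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():                 # inference only in this sketch
    logits = model(**batch).logits
print(logits.argmax(dim=-1))          # predicted class index per tweet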
6
article
Is there a correspondence or affinity between the juridical-principiological and factual-economic conceptions of the effective protection of the consent of the holder of personal data when contracting on a network? Under the mantle of this question, the article aims to analyze the contemporary contractual scenario from the perspective of privacy policies and the Brazilian General Data Protection Law (LGPD). In this context, a skeptical reflection is proposed on the principles and economic guidelines defended by law and doctrine, in order to verify whether consent is an instrument of real effectiveness for protecting subjects on the network. The first topic concerns the conceptual analysis of consent in the LGPD and in the specialized doctrine. The second topic deals with the limited rationality of users of network services in understanding the provisions in the pol...
7
technical report
This work systematizes the concepts inherent to the Data Warehouse model, addressing each of them in an orderly fashion within a clear conceptual framework in which their characteristics and qualities are laid out, always taking into account their relationship or interrelation with the other components of the environment. First, the general concepts related to the Data Warehouse are defined. Next, the definition of requirements and the business processes for modeling a Data Warehouse are introduced, and their most relevant and significant aspects are presented. Then, all the components involved in Data Integration are specified and detailed in an organized and intuitive manner, attending to their interrelation. Afterwards, the Dimensional Design for business processes is described. Finally, some concepts are described that...
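To make the dimensional-design idea concrete, here is a minimal star-schema sketch using sqlite3: one fact table keyed to two dimension tables, queried by a dimension attribute. The table and column names are illustrative, not taken from the report.

# Star schema: dimensions describe the business context, the fact table
# holds the measures, and analysis aggregates facts by dimension attributes.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    full_date TEXT, year INTEGER, month INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku TEXT, category TEXT
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold INTEGER, revenue REAL
);
""")
con.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
con.execute("INSERT INTO dim_product VALUES (1, 'SKU-001', 'beverages')")
con.execute("INSERT INTO fact_sales VALUES (20240115, 1, 120, 540.0)")

# A typical analytical query: aggregate the fact by a dimension attribute.
for row in con.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
"""):
    print(row)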
8
master's thesis
Based on high-precision GNSS observation data and the change in coordinates of the CORS monitoring stations before and after the magnitude-8.0 Peru earthquake of 2019, the author developed surface-deformation analysis software built on top of scientific processing software, which has scientific and practical value for research into the earthquake's epicenter, magnitude and geodynamics. The results shown were obtained with the scientific GNSS processing software PANDA, a precision package for GNSS data analysis developed by Wuhan University, China. The results reach millimeter-level precision and show a displacement of about 2 cm, toward the northwest, at the GNSS stations closest to the earthquake.
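The final differencing step reduces to simple vector arithmetic once the processed station coordinates are in hand. A minimal sketch, with hypothetical local east/north coordinates standing in for the PANDA solutions:

# Coseismic displacement of one CORS station from before/after coordinates
# (hypothetical east/north values in meters, not the thesis's data).
import math

before = (402310.112, 8554120.774)   # east, north before the event (m)
after  = (402310.098, 8554120.789)   # east, north after the event (m)

de = after[0] - before[0]            # east displacement (m)
dn = after[1] - before[1]            # north displacement (m)
horiz = math.hypot(de, dn)           # horizontal displacement magnitude
azimuth = math.degrees(math.atan2(de, dn)) % 360  # bearing from north

print(f"displacement: {horiz * 100:.1f} cm at azimuth {azimuth:.0f} deg")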
9
article
Disruptive technologies and their impact on journalism and communication force us to take on the challenge of learning new techniques for data and information processing. Interdisciplinary knowledge is evident in the teaching of new professional profiles. Data journalism is an example of this, so immersion in a data culture must be preceded by awareness in the learning of news applications, algorithms and the treatment of Big Data, elements that configure new paradigms among journalists of online media. Through a review of texts, direct observation of selected applications, and a case study, some conclusions are drawn that point to a growing demand for knowledge of new techniques. The results show the use of technological resources and a proposal for changes in the curricula of communication faculties.
10
article
Process automation is being implemented in different disciplines of the earth sciences, as seen in libraries such as Pyrolite, PyGeochemCalc, dh2loop 1.0, NeuralHydrology and GeoPyTool, among others. The present work describes a methodology to automate univariate geochemical analysis using Python and open-source packages such as pandas, seaborn, matplotlib and statsmodels, integrated into a script in a local working environment such as Jupyter Notebook or in an online environment such as Google Colaboratory. The script is designed to process any type of geochemical data, allowing the user to remove outliers and to perform calculations and plots of the elements within their respective geological domains. The results include graphics such as box plots and quantile-quantile plots, as well as calculations of normality tests and geochemical parameters, allowing determination of the background and threshold...
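A minimal sketch of this kind of univariate workflow, using the packages the article names (plus scipy for the normality test); the column name Cu_ppm and the file geochem.csv are placeholders:

# Outlier screening, a normality test, a Q-Q plot, and classic
# background/threshold estimates for one element.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("geochem.csv")              # hypothetical input file
cu = df["Cu_ppm"].dropna()

# Screen outliers beyond 1.5 * IQR, a common univariate rule.
q1, q3 = cu.quantile([0.25, 0.75])
iqr = q3 - q1
clean = cu[(cu >= q1 - 1.5 * iqr) & (cu <= q3 + 1.5 * iqr)]

# Shapiro-Wilk normality test on the screened values.
stat, p = stats.shapiro(clean)
print(f"Shapiro-Wilk p-value: {p:.4f}")

# Classic background / threshold estimates (mean + 2 standard deviations).
background = clean.mean()
threshold = background + 2 * clean.std()
print(f"background: {background:.1f} ppm, threshold: {threshold:.1f} ppm")

sns.boxplot(x=clean)                         # box plot of screened data
sm.qqplot(clean, line="s")                   # quantile-quantile plot
plt.show()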
11
article
Globalization has intensified competition in many markets. To remain competitive, companies seek to satisfy the needs of customers by meeting market requirements. In this context, Process Capability Indices (PCI) play a crucial role in assessing the quality of processes. For non-normal data there are two general approaches, based on transformations (Box-Cox and Johnson transformations) and on percentiles (Pearson's and Burr's distribution systems). However, previous studies comparing these methods reach different conclusions, so the need arises to clarify the differences between them in order to implement a proper estimation of these indices. In this paper, a simulation study is conducted to compare the above methods and to propose an appropriate methodology for estimating the PCI for non-normal data. Furthermore, it is concluded that the best method ...
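The two families of estimators can be contrasted in a few lines. A minimal sketch with invented spec limits and a simulated skewed sample; this illustrates the two approaches, not the paper's simulation design:

# (1) Box-Cox: transform data and spec limits, then apply normal-theory Cpk.
# (2) Percentile: replace 6*sigma with the 0.135th-99.865th percentile spread.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.lognormal(mean=0.0, sigma=0.4, size=2000)   # skewed process data
LSL, USL = 0.3, 3.5                                  # hypothetical spec limits

xt, lam = stats.boxcox(x)                            # transform the data
lsl_t, usl_t = stats.boxcox([LSL, USL], lmbda=lam)   # transform the limits
mu, sd = xt.mean(), xt.std(ddof=1)
cpk = min(usl_t - mu, mu - lsl_t) / (3 * sd)
print(f"Box-Cox Cpk: {cpk:.3f}")

p_lo, p_hi = np.percentile(x, [0.135, 99.865])       # raw-data percentiles
cp_pct = (USL - LSL) / (p_hi - p_lo)
print(f"Percentile Cp: {cp_pct:.3f}")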
12
article
A customer who is dissatisfied with a product and/or service is motivated to express a complaint. Classifying complaints manually is a process that carries high costs in human and material resources. Artificial Intelligence (AI) allows the use of various algorithms to perform tasks that simulate human intelligence; one branch of it is Natural Language Processing (NLP), whose objective is for machines to be able to understand human language, allowing them, for example, to classify and categorize data automatically. This article provides a systematic review of the literature addressing challenges in the classification of complaint texts, such as the lack of class balance, the presence of unlabeled data, and the interpretation of model results. Preprocessing techniques that influence model performance are explored, such as tokenization, stopword removal, and lemmatization. Additionally, ...
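As a concrete anchor for those preprocessing steps, here is a minimal sketch of a complaint classifier. The tiny labeled set is invented, tokenization and stopword removal are delegated to TfidfVectorizer, and lemmatization (also discussed in the review) is omitted for brevity.

# TF-IDF features (lowercasing, tokenization, English stopword removal)
# feeding a simple supervised classifier over complaint categories.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

complaints = ["The delivery arrived two weeks late",
              "I was charged twice for the same order",
              "The courier left the damaged package outside",
              "My invoice shows a fee I never agreed to"]
labels = ["shipping", "billing", "shipping", "billing"]

model = make_pipeline(
    TfidfVectorizer(stop_words="english"),
    LogisticRegression()
)
model.fit(complaints, labels)
print(model.predict(["I was charged a fee twice"]))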
13
article
The judicial process brought against César Vallejo is one of the most iconic cases in our legal system, as it not only resulted in the imprisonment of our universal poet but also took place while the foundations of our penal system were being reformed. In other words, the trial was an opportunity to demonstrate that our penal system had evolved from an inquisitorial to an accusatorial one. Unfortunately, the opposite proved true: despite the change of code, inquisitorial assumptions were unfairly applied against our universal poet in order to ensure his imprisonment during the investigative stage of the process, that is, before the oral trial that would determine his innocence. In this sense, this article seeks to demonstrate how the purpose of this first stage was undermined.
14
article
This article presents a methodology that applies natural language processing and classification algorithms using data mining techniques, incorporating procedures for validation and verification of significance. This is conducted through the analysis and selection of data and results based on statistical quality analysis, which guarantees the effectiveness percentage in knowledge construction. The case study is the analysis of computer incidents within an educational institution, using a standardized database of historical computer incidents collected by the Service Desk area. This area is linked to all information technology processes and focuses on the support requirements for the performance of employee activities. As long as users' requirements are not fulfilled in a timely manner, the impact of incidents may give rise to work problems at different levels, making it d...
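A minimal sketch of that validation idea, comparing two classifiers on text features with cross-validation and a paired significance test; the incident snippets and labels are invented placeholders:

# Compare two classifiers by 5-fold cross-validation, then run a paired
# t-test over the fold scores as a rough significance check.
from scipy import stats
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB

incidents = ["printer not responding", "cannot log into email",
             "projector cable missing", "password reset required",
             "no network in lab 3", "screen flickers on boot"] * 10
labels = ["hardware", "account", "hardware",
          "account", "network", "hardware"] * 10

X = TfidfVectorizer().fit_transform(incidents)

scores_nb = cross_val_score(MultinomialNB(), X, labels, cv=5)
scores_lr = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)

# On this toy corpus the fold scores may tie, leaving p undefined;
# on real incident data the comparison becomes meaningful.
t, p = stats.ttest_rel(scores_nb, scores_lr)
print(f"NB {scores_nb.mean():.2f} vs LR {scores_lr.mean():.2f}, p = {p:.3f}")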
15
undergraduate thesis
This pre-feasibility study develops an investigation into the installation of a processing plant for fresh milk enriched with chia. It should be noted that the work covers the demand study, plant location, plant size, project engineering, and the economic-financial evaluation.
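The economic-financial evaluation in such studies typically reduces to indicators like NPV and IRR. A minimal sketch with a hypothetical cash-flow series; the figures are not from the thesis:

# Net present value and internal rate of return in pure Python.
def npv(rate: float, flows: list[float]) -> float:
    """Net present value; flows[0] is the initial (negative) investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows: list[float], lo: float = -0.99, hi: float = 10.0) -> float:
    """Internal rate of return by bisection on the NPV sign change."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, flows) * npv(mid, flows) <= 0:
            hi = mid
        else:
            lo = mid
    return mid

flows = [-250_000, 60_000, 75_000, 90_000, 95_000, 100_000]  # years 0..5
print(f"NPV at 12%: {npv(0.12, flows):,.0f}")
print(f"IRR: {irr(flows):.1%}")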