Suggested topics within your search.
https://purl.org/pe-repo/ocde/ford#5.02.04
1,234
https://purl.org/pe-repo/ocde/ford#2.11.04
515
https://purl.org/pe-repo/ocde/ford#3.03.03
489
https://purl.org/pe-repo/ocde/ford#2.02.04
377
SDG 3: Good health and well-being. Ensure healthy lives and promote well-being for all at all ages
349
https://purl.org/pe-repo/ocde/ford#5.03.01
348
http://purl.org/pe-repo/ocde/ford#5.02.04
209
Search alternatives:
dairy processing » stir processing, unfair processing, olive processing
data processing » image processing
1
article
Published 2025
Link
This research focuses on implementing a reengineering of the process for measuring the Stock Keeping Units (SKU) imported by a distribution center of a company specializing in the sale of home-improvement products and construction materials. To identify the factors that influence the process and to optimize it, a time-and-motion study of the process was carried out using the tool known as the spaghetti diagram. As a result, productivity increased and dead time decreased, making it possible to measure the full universe of SKUs in less time. The study also led to configuring the system with correct data, which facilitated operations within the warehouse.
2
article
Published 2023
Link
The process of producing dairy products generates a great deal of environmental contamination due to poor practices during processing, which is why it is essential to know the degree of sustainability of this industry. The present study sought to evaluate the sustainability of dairy production in the San Salvador de Quishcambal-Assaqui Agricultural Association in the town of San Salvador, Province of Luya, in the Amazon Region. The ELANEM methodology was applied, together with an expert-validated survey, sustainability indicators and a double-entry matrix. For data processing, the normalization equation was used, in which a value of 1 represents the optimal situation. The results recorded values for the economic, social and environmental dimensions of 0.64, 0.58 and 0.77, respectively, and an overall sustainability index of 0.66. This reveals that the ...
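The normalization step described above can be sketched as a min-max mapping onto [0, 1]; the indicator names, observed values and worst/best ranges below are hypothetical examples, not data from the study.

```python
# Min-max normalization of sustainability indicators, where 1.0 is the
# optimal situation. Indicator values and worst/best ranges below are
# hypothetical examples, not data from the study.

def normalize(value, worst, best):
    """Map an indicator onto [0, 1], with 1.0 the optimal situation."""
    return (value - worst) / (best - worst)

# Hypothetical indicators for one dimension: (observed, worst, best)
economic_indicators = [(32, 0, 50), (6.4, 0, 10)]

scores = [normalize(v, lo, hi) for v, lo, hi in economic_indicators]
dimension_index = sum(scores) / len(scores)  # dimension index = mean score
print(round(dimension_index, 2))  # 0.64
```

Each dimension index is then an aggregate of its normalized indicators, and the overall sustainability index aggregates the three dimensions.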
3
article
Technological advances have made it possible to collect and store large volumes of data over the years. Moreover, it matters that today's applications perform well and can analyze these large datasets effectively. Today, it remains a challenge for data mining to keep its algorithms and applications efficient as data size and dimensionality increase [1]. To achieve this goal, many applications rely on parallelism, since it reduces the execution-time cost of the algorithms by taking advantage of the characteristics of current computer architectures to run several processes concurrently [2]. This paper proposes a parallel version of the FuzzyPred algorithm based on the amount of data that can be processed within each processing thread, synchronously and independently.
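The data-parallel scheme described can be sketched with Python threads; `evaluate_chunk` here is a hypothetical stand-in for FuzzyPred's per-record evaluation, which the abstract does not detail.

```python
# Data-parallel pattern described above: split the dataset into chunks and
# process each independently in its own thread. `evaluate_chunk` is a
# hypothetical stand-in for FuzzyPred's per-record evaluation.
from concurrent.futures import ThreadPoolExecutor

def evaluate_chunk(chunk):
    # Placeholder computation; the real algorithm evaluates fuzzy predicates.
    return [x * x for x in chunk]

def parallel_process(data, n_threads=4):
    size = max(1, len(data) // n_threads)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        results = pool.map(evaluate_chunk, chunks)  # preserves chunk order
    return [y for part in results for y in part]

print(parallel_process(list(range(8)), n_threads=2))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Because each chunk is processed independently, results can be merged in input order without synchronization between workers.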
4
article
Published 2016
Link
This article seeks to highlight the difficulties in the chemical processing of dairy products, drawing on the experience of the project “Capacity Building for Improving the Competitiveness of Market Chains Dairy in Huaytará Province”, Department of Huancavelica. The project worked with highland communities that relied on artisanal supplies and lacked proper techniques and adequate hygiene oversight in the handling of milk; these capacities were strengthened, and awareness was raised, through training and technical assistance provided by a multidisciplinary team in several annexes of the Cordova District. It is suggested that this kind of knowledge transfer be carried out more often by qualified professionals, a fundamental task for improving the quality of life of these communities, since from milk as a raw material countless products can be obtained, as described i...
5
article
Published 2018
Link
Geo-referenced textual data has been the subject of multiple investigations, since it provides opportunities to better understand certain phenomena through the content that is shared, whether online, such as on social networks, blogs and news sites, or through repositories such as scientific research articles and geo-referenced virtual books, among others. However, the characteristics of this information are usually studied, analyzed and processed separately, either through its textual components or through its geo-spatial components, which yields a fragmented understanding of the results. In this paper, we propose an integration of the textual and geo-spatial components from the pre-processing phase to the visualization stage, as part of the Document Mapping process based on the phases of Knowledge Discovery in Databases (KDD). Two main results are achieved: (1) minimizing the problems that arise in the visual phase, su...
6
conference object
Published 2018
Link
The present work was achieved thanks to joint work with my advisor, for her persistence and tenacity in sharing her teachings with me; to my distinguished teachers, who have imparted knowledge since the first day of classes and who, with nobility and enthusiasm, set an example for me and my colleagues in the master's degree in computer science; and also to CONCYTEC, FONDECYT and Cienciactiva for the support and opportunities that made this work possible.
7
technical report
In this paper, the concepts inherent to the Data Warehouse model are systematized, addressing each of them in an orderly manner within a clear conceptual framework in which their characteristics and qualities are laid out, always keeping in mind their relationship or interrelation with the other components of the environment. First, the general concepts related to the Data Warehouse are defined. Next, the definition of requirements and the business processes for modeling a Data Warehouse are introduced, and their most relevant and significant aspects are presented. Then, all the components involved in Data Integration are specified and detailed in an organized and intuitive way, attending to their interrelation. Afterwards, the Dimensional Design for business processes is described. Finally, some concepts are described ...
8
master's thesis
Published 2021
Link
Based on high-precision GNSS observation data and the change in coordinates of the CORS monitoring stations before and after the magnitude-8.0 Peru earthquake of 2019, the author developed surface-deformation analysis software built on scientific processing software, which has scientific and practical value for research on the earthquake's epicenter, magnitude and geodynamics. Results are shown using the scientific GNSS processing software PANDA, a precision package for GNSS data analysis developed by Wuhan University, China. The results are highly precise, at the millimeter level, and show a displacement of about 2 cm, toward the northwest, at the GNSS stations close to the earthquake.
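The displacement estimate described amounts to differencing station coordinates before and after the event; a minimal sketch, using hypothetical local east/north/up offsets rather than actual PANDA output:

```python
import math

# Coseismic displacement as the difference of station coordinates estimated
# before and after the event. The east/north/up offsets (meters) below are
# hypothetical, not actual PANDA output.
before = (0.000, 0.000, 0.000)
after = (-0.014, 0.014, -0.003)  # ~2 cm of horizontal motion

de, dn, du = (a - b for a, b in zip(after, before))
horizontal = math.hypot(de, dn)                   # horizontal displacement, m
azimuth = math.degrees(math.atan2(de, dn)) % 360  # clockwise from north

print(f"horizontal displacement: {horizontal * 100:.1f} cm, azimuth {azimuth:.0f} deg")
```

An azimuth near 315 degrees corresponds to motion toward the northwest, as reported in the abstract.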
9
article
Published 2024
Link
This study presents Datalyzer, a system designed for data extraction, visualization, and prediction in the mining sector using advanced NLP and machine learning, specifically GPT-3.5 Turbo. The system enhances operational efficiency through rigorous data preprocessing and specialized fine-tuning, validated on a simulated mining dataset. Results show significant improvements: data extraction time was reduced by 94% and visualization time by 97.6%. These improvements indicate a transformation in efficiency, usability, and user satisfaction. Despite limitations in data variability and complexity, this pioneering approach highlights the potential of NLP and machine learning in modernizing the mining industry and supporting data-driven decision-making.
10
article
Published 2015
Link
In order to set realistic goals and carry out their functions effectively, auditors of electronic data processing (EDP) should know what their companies expect of them and possess a clear understanding of management's objectives. This booklet discusses these issues and describes some of the responsibilities that must be shared between the EDP auditor and management in enterprises.
11
article
Published 2019
Link
Is there a correspondence or affinity between the juridical-principiological and factual-economic conceptions for the effective protection of the consent of the holder of personal data when contracting on a network? Under the mantle of this question, this paper aims to analyze the contemporary contractual scenario from the perspective of privacy policies and the Brazilian General Data Protection Law (LGPD). In this context, a skeptical reflection is proposed on the principles and economic guidelines defended by law and doctrine, to verify whether consent is an instrument of real effectiveness for the protection of subjects on the network. The first topic concerns the conceptual analysis of consent in the LGPD and in the specialized doctrine. The second topic deals with the limited rationality of users of network services in understanding the provisions in the pol...
12
book chapter
Published 2019
Link
This work proposes a semi-automated analysis and modeling package for Machine Learning related problems. The library's goal is to reduce the steps involved in a traditional data science roadmap. To do so, Sparkmach takes advantage of Machine Learning techniques to build base models for both classification and regression problems. These models include exploratory data analysis, data preprocessing, feature engineering and modeling. The project has its basis in Pymach, a similar library that covers those steps for small and medium-sized datasets (about ten million rows and a few columns). Sparkmach's central task is to scale Pymach to big datasets by using Apache Spark, a distributed engine for large-scale data processing that tackles several data-science problems in a cluster environment. Despite the software's nature, Sparkmach can be of use for local ...
13
article
Published 1998
Link
Five strains of Kluyveromyces marxianus and one of Candida pseudotropicalis were cultured in medium M-1 for the production of β-galactosidase (EC 3.2.1.23). The strain Kluyveromyces marxianus NRRL Y-1109 was selected as the best producer of the enzyme, and the maximum enzyme yield was obtained on 5% lactose supplemented with 0.5% yeast extract, 0.75% (NH4)2SO4 and 0.45% K2HPO4 (w/v), giving 5.43 U/mg of protein and 1.31 A.UoNPcl mg of dry weight. The enzyme was extracted from viable cells with 2% (v/v) toluene during 15 hours of treatment at 30 °C in 0.1 M potassium phosphate buffer, pH 7.0 ± 0.1, supplemented with 1 mM magnesium and 0.1 mM manganese sulfate. The yield of immobilization of the enzyme on chitin treated with glutaraldehyde was 41%, with an optimum pH of 6.6 ± 0.1. The values of Km and Vm for the immobilized enzyme using lactose as substrate...
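The kinetic constants Km and Vmax mentioned at the end are classically estimated from initial-rate data; a sketch using a Lineweaver-Burk double-reciprocal fit on synthetic rates, not the study's measurements:

```python
# Km and Vmax estimation via a Lineweaver-Burk double-reciprocal fit, a
# classical way to obtain the constants reported above. The rate data are
# synthetic, generated from Km = 2.0 mM and Vmax = 5.0 U/mg.
substrate = [0.5, 1.0, 2.0, 4.0, 8.0]              # [S] in mM
rates = [5.0 * s / (2.0 + s) for s in substrate]   # v = Vmax*[S]/(Km+[S])

x = [1.0 / s for s in substrate]  # 1/[S]
y = [1.0 / v for v in rates]      # 1/v
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
intercept = my - slope * mx       # 1/v = (Km/Vmax)*(1/[S]) + 1/Vmax

vmax = 1.0 / intercept
km = slope * vmax
print(f"Km = {km:.2f} mM, Vmax = {vmax:.2f} U/mg")  # Km = 2.00 mM, Vmax = 5.00 U/mg
```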
15
undergraduate thesis
Published 2025
Link
This study starts from the problems faced by a dairy-sector company in its sales distribution in northern Peru. By implementing the Impact-Feasibility Matrix engineering tool to choose, through factor ranking, the best of the options produced in a brainstorming session, the company achieved a 14% increase in gross sales and a 64% reduction in sales discounts, considering the company's greater bargaining power and the implementation of its commercial strategy compared with the distributor, consequently obtaining a 16% increase in net sales. Considering the improvement of sales indicators and the reduction of costs, an increase of 29% in gross profit and of 10% in operating profit was obtained. Previously, distribution was managed...
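An Impact-Feasibility ranking of the kind described can be sketched as follows; the candidate options and their scores are hypothetical, not taken from the thesis.

```python
# Impact-Feasibility ranking: each candidate action from the brainstorming
# is scored on impact and feasibility (1-5) and ranked by their product.
# Options and scores below are hypothetical.
options = {
    "direct distribution": (5, 4),
    "new discount policy": (4, 3),
    "route redesign": (3, 5),
}

ranking = sorted(options.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (impact, feasibility) in ranking:
    print(f"{name}: score {impact * feasibility}")
```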
16
article
Published 2023
Link
Process automation is being implemented in different disciplines of the earth sciences, as seen in libraries such as Pyrolite, PyGeochemCalc, dh2loop 1.0, NeuralHydrology and GeoPyTool, among others. The present work addresses a methodology to automate univariate geochemical analysis using Python and open-source packages such as pandas, seaborn, matplotlib and statsmodels, integrated into a script in a local work environment such as a Jupyter notebook or in an online environment such as Google Colaboratory. The script is designed to process any type of geochemical data, allowing outliers to be removed and calculations and plots to be produced for the elements and their respective geological domains. The results include graphics such as boxplots and quantile-quantile plots, along with normality tests and geochemical parameters, making it possible to determine the background and threshol...
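One step of such a univariate workflow, outlier removal, can be sketched with Tukey's interquartile-range fences; this is one common choice, not necessarily the paper's exact criterion, and the assay values are hypothetical.

```python
import statistics

# Outlier removal with Tukey's interquartile-range fences, one common choice
# in a univariate geochemical workflow (the paper's exact criterion is not
# given here). The assay values are hypothetical ppm concentrations.
values = [12, 14, 15, 15, 16, 17, 18, 19, 20, 95]  # 95 ppm is an outlier

q1, _, q3 = statistics.quantiles(values, n=4)
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
cleaned = [v for v in values if low <= v <= high]

print(cleaned)  # [12, 14, 15, 15, 16, 17, 18, 19, 20]
```

The cleaned series can then be passed to normality tests and background/threshold calculations.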
17
article
Published 2014
Link
Globalization has intensified competition in many markets. To remain competitive, companies seek to satisfy customer needs by meeting market requirements. In this context, Process Capability Indices (PCI) play a crucial role in assessing the quality of processes. For non-normal data there are two general approaches, based on transformations (Box-Cox and Johnson) and on percentiles (Pearson's and Burr's distribution systems). However, previous studies comparing these methods reach different conclusions, so there is a need to clarify the differences between them in order to estimate these indices properly. In this paper, a simulation study is carried out to compare the above methods and to propose an appropriate methodology for estimating the PCI for non-normal data. Furthermore, it is concluded that the best method ...
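The transformation approach can be sketched as follows: Box-Cox-transform the data, then apply the usual Cpk formula on the transformed scale. The sample, the lambda value and the specification limits here are all hypothetical.

```python
import math
import statistics

# Transformation approach: Box-Cox-transform non-normal data, then apply
# the usual Cpk formula to the transformed data and transformed limits.
# The sample, lambda value and specification limits are hypothetical.

def boxcox(x, lam):
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

data = [1.2, 1.5, 1.9, 2.4, 3.1, 4.0, 5.2, 6.8]  # right-skewed sample
lam = 0.0                                        # assume a log transform fits
lsl, usl = 0.8, 9.0                              # specification limits

t = [boxcox(x, lam) for x in data]
mean, sd = statistics.fmean(t), statistics.stdev(t)
cpk = min(boxcox(usl, lam) - mean, mean - boxcox(lsl, lam)) / (3 * sd)
print(f"Cpk on the transformed scale: {cpk:.2f}")
```

In practice the lambda value would be estimated from the data rather than fixed, and the percentile-based alternatives avoid the transformation entirely.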
19
article
Published 2018
Link
Disruptive technologies and their impact on journalism and communication force us to take on the challenge of learning new techniques for data and information processing. Interdisciplinary knowledge is evident in the teaching of new professional profiles. Data journalism is an example of this, so immersion in a data culture must be preceded by awareness in the learning of news applications, algorithms and the treatment of Big Data, elements that configure new paradigms among journalists of online media. Through a review of texts, direct observation of selected applications and a case study, some conclusions are drawn that show a growing demand for knowledge of new techniques. The results show the use of technological resources and propose changes in the curricula of communication faculties.