Suggested topics within your search:
https://purl.org/pe-repo/ocde/ford#5.02.04
232
https://purl.org/pe-repo/ocde/ford#3.03.03
181
SDG 3: Good health and well-being. Ensure healthy lives and promote well-being for all at all ages
143
https://purl.org/pe-repo/ocde/ford#2.02.04
77
https://purl.org/pe-repo/ocde/ford#2.01.01
56
https://purl.org/pe-repo/ocde/ford#5.03.01
56
http://purl.org/pe-repo/ocde/ford#5.02.04
55
Search alternatives:
data processing » image processing (expand search)
para data » para dama (expand search), para damas (expand search), para datos (expand search)
1
article
Published 2018
Link
Disruptive technologies and their impact on journalism and communication force us to take on the challenge of learning new techniques for data and information processing. Interdisciplinary knowledge is evident in the training of new professional profiles. Data journalism is an example: immersion in a data culture must be preceded by awareness in learning news applications, algorithms, and the treatment of Big Data, elements that configure new paradigms among journalists of Internet media. Through a literature review, direct observation of selected applications, and a case study, conclusions are drawn that reflect a growing demand for knowledge of new techniques. The results show the use of technological resources and a proposal for changes in the curricula of communication faculties.
3
undergraduate thesis
Published 2025
Link
This work aims to describe and analyze the benefits of implementing the Sales and Distribution module of the Systems, Applications, and Products in Data Processing (SAP) system at the company Cumbra Ingeniería S.A. It focuses on how this sales module has optimized the management of commercial processes, from quotation through invoicing. Adopting the system has improved operational efficiency and enabled real-time integration across the accounting, projects, and finance areas. Thanks to the automation of tasks and the implementation of the system, the company has obtained accurate data on its sales, favoring informed decision-making for the future. This work also addresses the challenges faced during implementation, the theoretical frameworks used, and the r...
4
article
Published 2020
Link
The objective of this work is to analyze data and information as a right protected by the cybernetic society. Every society is built on usages, customs, and values that are important assets for its citizens, so it must protect them. These rights and assets are not absolute and change over time. Some of them, because of their importance and transcendence, become main paradigms of the corresponding society. Since the middle of the last century, a new way of processing information (computers) has emerged, with a special language (binary) that allows data processing (systems) in a complex form different from the traditional one. To these high technologies, communicational triggers (the Internet) were added, which changed the value of information worldwide and made it society's main paradigm.
6
article
Published 2017
Link
Since the early 20th century, the psychopedagogical department of educational institutions has overseen the application of tests, scales, and inventories to kindergarten, elementary, and secondary students; in other words, psychologists have been collecting psychometric information from students for more than 10 years. In that time, the filed data was printed in paper and ink, complicating the filing process; therefore, large physical spaces were needed to preserve this psychometric data library, and this was indeed one of the factors that slowed down locating a particular student's file. In the 70s, the arrival of computers allowed the files to be stored in digital, electronic format, but even with this technology the filing process remained complicated and locating a student's file remained slow. The question that arises is: why are digital systems still slow and co...
7
article
Published 2015
Link
This article is based on research conducted by the authors between 2011 and 2012, in the framework of the implementation of research projects, sponsored by the Scientific Research Institute of the University of Lima (IDIC). In such research, the technological evolution of the Datacenter was analyzed, and the creation of the technology and management aspects of a Virtualized Datacenter (DCV) were proposed, in order to provide services for the development of laboratories in the academic university field. The findings of this research were applied in the planning process of the “Laboratory Datacenter” of the Systems Engineering Career at the University of Lima, which was used in implementing the new syllabus.
8
article
Published 2018
Link
The present investigation focuses on the analysis of the uncertainty of geospatial data in the initial stage of oil exploration. Through the evaluation of reliability indices for different data processing methods, a method has been developed that considers Volunteered Geographic Information (VGI) as a new data source and a Geodatabase (GDB) as the repository where this information is validated. With the creation of an “Uncertainty Sphere” artifact, based on the algorithms of the Guide to the Expression of Uncertainty in Measurement (GUM) and the recommendations of ISO 19157:2013 (Geographic Data Quality), the uncertainty space has been delimited from the collected data. A Geographic Information System (GIS) has managed the input data (VGI), geospatial processing (artifacts), storage (GDB), and information products (maps). As a case study, geopositioning uncert...
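The GUM combination this abstract relies on can be sketched in a few lines. This is a minimal illustration, not the authors' "Uncertainty Sphere" artifact: the per-source uncertainty values, the unit sensitivity coefficients, and the coverage factor k = 2 are all assumptions made for the example.

```python
import math

def combined_standard_uncertainty(components):
    """GUM law of propagation for independent inputs with unit
    sensitivity coefficients: root-sum-of-squares of the
    individual standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

def expanded_uncertainty(u_c, k=2.0):
    """Expanded uncertainty U = k * u_c; k = 2 gives roughly 95 %
    coverage for an approximately normal distribution."""
    return k * u_c

# Hypothetical per-source standard uncertainties for one VGI
# point (metres): receiver noise and datum transformation.
u_c = combined_standard_uncertainty([3.0, 4.0])  # -> 5.0 m
U = expanded_uncertainty(u_c)                    # -> 10.0 m
```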
9
article
Published 2019
Link
Is there a correspondence or affinity between the juridical-principiological and factual-economic conceptions for the effective protection of the consent of the holder of personal data when contracting online? Under the mantle of this question, the paper analyzes the contemporary contractual scenario from the perspective of privacy policies and the Brazilian General Data Protection Law (LGPD). In this context, a skeptical reflection is proposed on the principles and economic guidelines defended by law and doctrine, to verify whether consent is an instrument of real effectiveness for the protection of subjects on the network. The first topic concerns the conceptual analysis of consent in the LGPD and in the specialized doctrine. The second topic deals with the limited rationality of users of network services in understanding the provisions in the pol...
10
technical report
In this work, the concepts inherent to the Data Warehouse model are systematized, referring to each of them in an orderly fashion, within a clear conceptual framework in which their characteristics and qualities are laid out, always taking into account their relationship or interrelationship with the other components of the environment. First, the general concepts related to the Data Warehouse are defined. Next, the definition of requirements and the business processes for modeling a Data Warehouse are introduced, and their most relevant and significant aspects are presented. Then, all the components involved in Data Integration are specified and detailed in an organized and intuitive way, attending to their interrelationship. Afterwards, the Dimensional Design for business processes is described. Finally, some concepts will be described th...
11
master's thesis
Published 2021
Link
Based on high-precision GNSS observation data and the coordinate changes of CORS monitoring stations before and after the 2019 magnitude 8.0 Peru earthquake, the author developed surface-deformation analysis software built on scientific processing software, with scientific and practical value for investigating the earthquake's epicenter, magnitude, and geodynamics. Results are shown from the scientific GNSS processing software PANDA, a precision package for GNSS data analysis developed by Wuhan University, China. The results are accurate to the millimeter level and show a displacement of about 2 cm, toward the northwest, at the GNSS stations nearest the earthquake.
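The reported station motion can be illustrated with a small helper that turns pre- and post-event coordinates into a displacement vector. A minimal sketch, not the PANDA software: the local east/north frame and the coordinate values are hypothetical, chosen to reproduce a roughly 2 cm northwest motion like the one the thesis reports.

```python
import math

def coseismic_displacement(e0, n0, e1, n1):
    """Horizontal displacement (m) and azimuth (degrees, clockwise
    from north) between pre- and post-event station coordinates
    given in a local east/north frame."""
    de, dn = e1 - e0, n1 - n0
    length = math.hypot(de, dn)
    azimuth = math.degrees(math.atan2(de, dn)) % 360.0
    return length, azimuth

# Hypothetical station: about 2 cm toward the northwest.
d, az = coseismic_displacement(0.0, 0.0, -0.014142, 0.014142)
```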
12
article
Geometallurgy is defined as the study of the genesis of minerals with respect to the performance of their metallurgical processing. The construction of geometallurgical models is of the utmost importance for the technical-economic evaluation of a deposit. The robustness of the model depends on the resources invested in generating potential information to support decision making. The geometallurgical model has high value in mining management, serving as an instrument for planning, exploitation, and the design of metallurgical processes according to the type of deposit. Using this information to maximize economic performance in concentration processes poses enormous potential and challenges for plant operators. This article discusses the use of data analysis and its applications in several successful cases for porphyry-type deposits, taking...
13
article
This work analyzes a form of specialization based on Data Journalism; news organizations (old media companies) have been incorporating this specialty as journalism innovation. Data Journalism involves using statistical and visualization tools to create and tell stories better, in ways new and attractive to the Internet audience. It is the evolution of what some years ago was known as Precision Journalism or Desktop Journalism. Starting from an analytical contextualization, the article aims to demonstrate how Data Journalism, whose use arises from existing technologies linked to data processing, supplements various forms and contents, giving way to a new specialty of Journalism and, specifically, of Investigative Journalism.
14
undergraduate thesis
Published 2023
Link
Data-driven companies are organizations whose business operations are guided and driven by big data. The processing, transformation, and analysis of this valuable input takes place through various analytical tools, which add significant value to the industry and have a high impact on the effectiveness of managerial decision-making and, therefore, on achieving a high level of business capitalization. This research presents the different positions of authors on the various analytical solutions that address the needs of the modern enterprise, in order to face each of the challenges it confronts. Data is one of a company's most important assets, contributing to reaching a high level of competitiveness and to obtaining a...
15
article
Published 2023
Link
Process automation is being implemented in different disciplines of the earth sciences, as seen in libraries such as Pyrolite, PyGeochemCalc, dh2loop 1.0, NeuralHydrology, GeoPyToo among others. The present work addresses a methodology to automate univariate geochemical analysis using Python and open-source packages such as pandas, seaborn, matplotlib, and statsmodels, integrated into a script in a local working environment such as Jupyter Notebook or an online environment such as Google Colaboratory. The script is designed to process any type of geochemical data, allowing outliers to be removed and calculations and plots to be produced for the elements and their respective geological domains. The results include graphics such as boxplots and quantile-quantile plots, together with normality tests and geochemical parameters, allowing determination of the background and threshol...
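A univariate workflow of the kind described (outlier removal, then background and threshold estimation per domain) might look like the following sketch. It is not the article's script: the Tukey/IQR fences, the mean + 2·std threshold convention, and the Cu values are assumptions for illustration, and the plotting side (seaborn boxplots, Q-Q plots) is omitted to keep the example self-contained.

```python
import pandas as pd

def iqr_filter(s):
    """Drop outliers outside the Tukey fences
    (Q1 - 1.5*IQR, Q3 + 1.5*IQR)."""
    q1, q3 = s.quantile(0.25), s.quantile(0.75)
    iqr = q3 - q1
    return s[(s >= q1 - 1.5 * iqr) & (s <= q3 + 1.5 * iqr)]

def background_threshold(s):
    """Common exploration-geochemistry convention (assumed here):
    background = mean, threshold = mean + 2*std of the
    outlier-free sample."""
    clean = iqr_filter(s)
    return clean.mean(), clean.mean() + 2 * clean.std()

# Hypothetical Cu assays (ppm) for one geological domain;
# 250 is an obvious outlier that the fences remove.
cu = pd.Series([12, 15, 14, 13, 16, 15, 14, 13, 250])
bg, thr = background_threshold(cu)
```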
16
undergraduate thesis
Published 2023
Link
In recent years, the arrival of low-cost depth cameras and LiDAR sensors has encouraged industry to invest in these technologies, which also brings greater interest in research on digital signal processing. The three-dimensional reconstruction of mining tunnels using LiDARs and a self-navigating robot has been proposed as a research project, and the present work contributes the real-time alignment of three-dimensional point clouds, a process better known as Point Cloud Registration. Many algorithms can solve this problem, but for this project the algorithm only needs to compute a fine, rigid alignment. Comparing state-of-the-art registration algorithms, the popular ICP algorithm was found to be the most suitable for this case...
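The fine rigid alignment ICP computes can be sketched with NumPy: pair each source point with its nearest target point, solve the optimal rotation and translation with the SVD (Kabsch) solution, and iterate. This is a textbook point-to-point ICP, not the thesis implementation, and the brute-force nearest-neighbor search is only suitable for small clouds.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch/SVD solution for the rotation R and translation t
    minimising ||R @ p + t - q|| over paired points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Point-to-point ICP: match each source point to its nearest
    target point, solve the rigid transform, repeat."""
    cur = src.copy()
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
        matches = dst[d2.argmin(axis=1)]   # nearest neighbours
        R, t = best_rigid_transform(cur, matches)
        cur = cur @ R.T + t
    R, t = best_rigid_transform(src, cur)  # overall transform
    return R, t, cur

# Synthetic check: a small cloud rotated 0.1 rad about z, shifted.
pts = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0],
                [0, 0, 1], [1, 1, 0], [1, 0, 1]])
a = 0.1
Rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0, 0, 1]])
target = pts @ Rz.T + np.array([0.05, -0.02, 0.03])
R, t, aligned = icp(pts, target, iters=10)
```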
18
doctoral thesis
Published 2024
Link
This dissertation investigates the potential improvement of volcanic eruption understanding and forecasting methods by using advanced data processing techniques to analyze large datasets at three target volcanoes (Piton de la Fournaise (PdlF) (France), Sabancaya, and Ubinas (Peru)). The central objective of this study is to search for possible empirical relationships between the pre-eruptive behavior of the accelerated increase in seismic activity using the Failure Forecast Method (FFM) and velocity variations measured by Coda Wave Interferometry (CWI), since both observations are reported to be independently associated with medium damage. The FFM is a deterministic method used to forecast volcanic eruptions using an empirical relationship of increased and accelerated evolution of an observable (e.g., volcano-seismic event rates). The event rates used with FFM in this study were generate...
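The FFM step can be illustrated with its common alpha = 2 special case, in which the inverse event rate decays linearly and the forecast failure time is the x-intercept of a line fitted to it. This is a sketch under that assumption, with synthetic rates, not the dissertation's processing chain.

```python
import numpy as np

def ffm_forecast(times, rates):
    """Failure Forecast Method with exponent alpha = 2 (assumed):
    1/rate decays linearly in time, so the forecast failure time
    is the x-intercept of a least-squares line fitted to 1/rate."""
    inv = 1.0 / np.asarray(rates, dtype=float)
    m, b = np.polyfit(np.asarray(times, dtype=float), inv, 1)
    return -b / m                      # where the line crosses zero

# Synthetic accelerating sequence: rate = 1 / (10 - t) events/day,
# so the inverse-rate line reaches zero at t = 10.
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
rates = 1.0 / (10.0 - t)
tf = ffm_forecast(t, rates)            # -> 10.0
```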
20
article
Published 2019
Link
This article presents a methodology that applies natural language processing and classification algorithms using data mining techniques, incorporating procedures for validation and verification of significance. This is conducted according to the analysis and selection of data and results based on statistical quality analysis, which guarantees the percentage of effectiveness in knowledge construction. The analysis of computer incidents within an educational institution, using a standardized database of historical computer incidents collected by the Service Desk area, serves as the case study. This area is linked to all information technology processes and focuses on the support requirements for the performance of employee activities. As long as users' requirements are not fulfilled in a timely manner, the impact of incidents may give rise to work problems at different levels, making it d...
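A minimal stand-in for the classification stage can show the shape of such a pipeline. The article does not specify its algorithms, so this sketch uses a hand-rolled multinomial naive Bayes over whitespace tokens, with hypothetical incident tickets and categories; the validation and significance-verification procedures the article describes are not reproduced here.

```python
import math
from collections import Counter, defaultdict

class TinyNB:
    """Multinomial naive Bayes with Laplace smoothing -- a minimal
    illustration of classifying help-desk incident descriptions."""

    def fit(self, texts, labels):
        self.class_counts = Counter(labels)
        self.word_counts = defaultdict(Counter)
        self.vocab = set()
        for text, label in zip(texts, labels):
            for tok in text.lower().split():
                self.word_counts[label][tok] += 1
                self.vocab.add(tok)
        return self

    def predict(self, text):
        total = sum(self.class_counts.values())
        best, best_lp = None, -math.inf
        for label, n in self.class_counts.items():
            lp = math.log(n / total)                 # class prior
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for tok in text.lower().split():
                # Laplace-smoothed token likelihood
                lp += math.log((self.word_counts[label][tok] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

# Hypothetical incident tickets and categories.
clf = TinyNB().fit(
    ["printer out of toner", "printer paper jam",
     "cannot connect to vpn", "network connection drops"],
    ["hardware", "hardware", "network", "network"])
```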