Suggested topics within your search:
https://purl.org/pe-repo/ocde/ford#5.02.04 (1,231)
https://purl.org/pe-repo/ocde/ford#2.11.04 (568)
https://purl.org/pe-repo/ocde/ford#3.03.03 (451)
https://purl.org/pe-repo/ocde/ford#2.02.04 (378)
https://purl.org/pe-repo/ocde/ford#5.03.01 (354)
Academic performance (353)
SDG 3: Good health and well-being. Ensure healthy lives and promote well-being for all at all ages (325)
Showing 1 - 20 of 10,219 results for search 'para ((dar processing) OR (data processing))', query time: 0.89s
1
undergraduate thesis
In recent years, the arrival of low-cost depth cameras and LiDAR sensors has encouraged industry to invest in these technologies, which has also brought greater interest in research on digital signal processing. Here, the three-dimensional reconstruction of mining tunnels using LiDARs and a self-navigating robot has been proposed as a research project, and the present work forms part of it, taking charge of the real-time alignment of three-dimensional point clouds, a process better known as Point Cloud Registration. Many algorithms can solve this problem, but for this project the algorithm only needs to compute the fine, rigid alignment. A comparison of state-of-the-art registration algorithms found the popular ICP algorithm to be the best suited for this case...
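As a hedged illustration of the fine, rigid registration step described above, the following minimal sketch runs point-to-point ICP with the Open3D library; Open3D is an assumption for illustration (the thesis does not name its implementation), and the file names, correspondence threshold, and initial transform are hypothetical.

    # Minimal point-to-point ICP sketch (assumes Open3D; input files are hypothetical).
    import numpy as np
    import open3d as o3d

    source = o3d.io.read_point_cloud("scan_t0.pcd")   # hypothetical LiDAR scans
    target = o3d.io.read_point_cloud("scan_t1.pcd")

    threshold = 0.05   # max correspondence distance in meters (assumed)
    init = np.eye(4)   # assumes a coarse alignment has already been applied

    result = o3d.pipelines.registration.registration_icp(
        source, target, threshold, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())

    print(result.transformation)   # rigid 4x4 transform aligning source to target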
2
undergraduate thesis
In recent years, the arrival of low-cost depth cameras and LiDAR sensors has encouraged industry to invest in these technologies, which has also brought greater interest in research on digital signal processing. Here, the three-dimensional reconstruction of mining tunnels using LiDARs and a self-navigating robot has been proposed as a research project, and the present work forms part of it, taking charge of the real-time alignment of three-dimensional point clouds, a process better known as Point Cloud Registration. Many algorithms can solve this problem, but for this project the algorithm only needs to compute the fine, rigid alignment. A comparison of state-of-the-art registration algorithms found the popular ICP algorithm to be the best suited for this case...
3
article
This research focuses on implementing the reengineering of the measurement process for the Stock Keeping Units (SKUs) imported by a distribution center of a company specializing in the sale of home-improvement products and construction materials. To identify the factors influencing the process and optimize it, a time and motion study of the process was carried out using the tool known as the spaghetti diagram, illustrated below. As a result, productivity increased and idle time decreased, making it possible to measure the full universe of SKUs in less time. The study also led to configuring the system with correct data, which facilitated operations within the warehouse.
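A spaghetti diagram is simply a trace of a worker's movement over the floor plan; the sketch below plots one with matplotlib. The coordinates, station names, and path are entirely hypothetical, not data from the study.

    # Spaghetti-diagram sketch: plot a walking path over workstation positions.
    import matplotlib.pyplot as plt

    # Hypothetical (x, y) positions visited during one measurement cycle (meters).
    path_x = [0, 4, 4, 9, 2, 0]
    path_y = [0, 1, 6, 6, 8, 0]
    stations = {"Receiving": (0, 0), "Scale": (4, 1), "Rack A": (4, 6),
                "Rack B": (9, 6), "Packing": (2, 8)}

    plt.plot(path_x, path_y, "-o", alpha=0.7)   # the "spaghetti" trace
    for name, (x, y) in stations.items():
        plt.annotate(name, (x, y))
    plt.title("Spaghetti diagram of one SKU measurement cycle (illustrative)")
    plt.xlabel("x (m)")
    plt.ylabel("y (m)")
    plt.show()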
4
article
Is there a correspondence or affinity between the juridical-principiological and factual-economic conceptions for the effective protection of the consent of the holder of personal data when contracting on a network? Under the mantle of this question, this study aims to analyze the contemporary contractual scenario from the perspective of privacy policies and the Brazilian General Data Protection Law (LGPD). In this context, it proposes a skeptical reflection on the principles and economic guidelines defended by law and doctrine, to verify whether consent is an instrument of real effectiveness for the protection of subjects on the network. The first topic concerns the conceptual analysis of consent in the LGPD and in the specialized doctrine. The second topic deals with the limited rationality of users of network services in understanding the provisions in the pol...
5
technical report
This work systematizes the concepts inherent to the Data Warehouse model, referring to each of them in an orderly fashion, within a clear conceptual framework in which their characteristics and qualities are laid out, always keeping in mind their relationship or interrelationship with the other components of the environment. First, the general concepts related to the Data Warehouse are defined. Next, the definition of requirements and the business processes for modeling a Data Warehouse are introduced, and their most relevant and significant aspects are presented. Then, all the components involved in Data Integration are specified and detailed in an organized, intuitive manner, attending to their interrelationships. Afterwards, the Dimensional Design for the business processes is described. Finally, some concepts are described...
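To make the dimensional-design idea concrete, here is a minimal sketch of a star schema queried with pandas: a fact table joined to two dimension tables and aggregated by a business attribute. The table and column names are hypothetical, not taken from the report.

    # Star-schema sketch: fact table + dimension tables, joined and aggregated.
    import pandas as pd

    dim_product = pd.DataFrame({"product_id": [1, 2],
                                "category": ["Tools", "Paint"]})
    dim_date = pd.DataFrame({"date_id": [10, 11],
                             "month": ["2023-01", "2023-02"]})
    fact_sales = pd.DataFrame({"product_id": [1, 1, 2],
                               "date_id": [10, 11, 10],
                               "amount": [100.0, 150.0, 80.0]})

    # A typical dimensional query: revenue by category and month.
    report = (fact_sales
              .merge(dim_product, on="product_id")
              .merge(dim_date, on="date_id")
              .groupby(["category", "month"])["amount"].sum())
    print(report)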
6
master's thesis
Based on high-precision GNSS observation data and the coordinate changes of the CORS monitoring stations before and after the 2019 magnitude 8.0 Peru earthquake, the author developed surface deformation analysis software built on scientific processing software, which has scientific and practical value in the investigation of the earthquake epicenter, magnitude, and geodynamics. Results are shown using the scientific GNSS processing software PANDA, a precision package for GNSS data analysis developed by Wuhan University, China. The results achieve millimeter-level precision. They show a displacement of about 2 cm, toward the northwest, at the GNSS stations near the earthquake.
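A hedged sketch of the coseismic-displacement computation behind such results: given a station's Earth-centered (ECEF) coordinates before and after the event, the difference vector is rotated into a local east-north-up frame. The coordinates and station location below are hypothetical, not PANDA output.

    # ECEF coordinate difference rotated into a local ENU (east-north-up) frame.
    import numpy as np

    # Hypothetical pre- and post-earthquake ECEF coordinates (meters).
    pre  = np.array([1_300_000.000, -6_100_000.000, -1_700_000.000])
    post = np.array([1_300_000.012, -6_100_000.009, -1_700_000.005])
    lat, lon = np.radians(-15.5), np.radians(-74.0)   # assumed station lat/lon

    dx, dy, dz = post - pre
    east  = -np.sin(lon)*dx + np.cos(lon)*dy
    north = -np.sin(lat)*np.cos(lon)*dx - np.sin(lat)*np.sin(lon)*dy + np.cos(lat)*dz
    up    =  np.cos(lat)*np.cos(lon)*dx + np.cos(lat)*np.sin(lon)*dy + np.sin(lat)*dz

    print(f"E={east*1000:.1f} mm, N={north*1000:.1f} mm, U={up*1000:.1f} mm")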
7
article
Disruptive technologies and their impact on journalism and communication force us to take on the challenge of learning new techniques for data and information processing. Interdisciplinary knowledge is evident in the teaching of new professional profiles. Data journalism is an example of this, so immersion in a data culture must be preceded by awareness in the learning of news applications, algorithms, and the treatment of Big Data, elements that configure new paradigms among journalists of online media. Through a review of texts, direct observation of selected applications, and a case study, conclusions are established that point to a growing demand for knowledge of new techniques. The results show the use of technological resources and a proposal for changes in the curricula of communication faculties.
8
article
Disruptive technologies and their impact on journalism and communication force us to take on the challenge of learning new techniques for data and information processing. Interdisciplinary knowledge is evident in the teaching of new professional profiles. Data journalism is an example of this, so immersion in a data culture must be preceded by awareness in the learning of news applications, algorithms, and the treatment of Big Data, elements that configure new paradigms among journalists of online media. Through a review of texts, direct observation of selected applications, and a case study, conclusions are established that point to a growing demand for knowledge of new techniques. The results show the use of technological resources and a proposal for changes in the curricula of communication faculties.
9
article
Software development involves a high volume of data transfer between web-based applications and the end user; this requires implementing them using new web technologies. This work contributes to developing a web-based application to improve process management at the security company UNICEPRI, integrating suitable elements such as the Laravel framework for the backend, VueJs for the frontend, and the MariaDB database manager. The Model-View-Controller approach reduces memory usage, browsing time, and data-recovery time by reusing components and partially loading the website, gaining flexibility and communicating with other applications and hardware. The SCRUM agile methodology was used to follow up on the implementation, allow adequate communication between client and developer, and comply with the different activities in the established times. For the evaluation of the softw...
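As a language-neutral sketch of the Model-View-Controller separation the abstract credits (rendered here in Python rather than the Laravel/VueJs stack actually used), each layer is a small, replaceable unit. The entity, fields, and names are hypothetical.

    # Minimal MVC sketch: the controller mediates between model and view.
    class GuardModel:                      # Model: data access (hypothetical entity)
        def __init__(self):
            self._guards = [{"id": 1, "name": "A. Quispe", "shift": "night"}]

        def all(self):
            return list(self._guards)

    def guard_list_view(guards):           # View: presentation only
        return "\n".join(f"{g['id']}: {g['name']} ({g['shift']})" for g in guards)

    class GuardController:                 # Controller: routes requests
        def __init__(self, model):
            self.model = model

        def index(self):
            return guard_list_view(self.model.all())

    print(GuardController(GuardModel()).index())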
10
article
Process automation is being implemented in different disciplines of the earth sciences, as seen in libraries such as Pyrolite, PyGeochemCalc, dh2loop 1.0, NeuralHydrology, and GeoPyTool, among others. The present work addresses a methodology to automate univariate geochemical analysis using Python and open-source packages such as pandas, seaborn, matplotlib, and statsmodels, integrated into a script in a local working environment such as a Jupyter notebook or an online environment such as Google Colaboratory. The script is designed to process any type of geochemical data, allowing outliers to be removed and calculations and plots of the elements to be produced for their respective geological domains. The results include plots such as box plots and quantile-quantile plots, together with normality tests and calculations of geochemical parameters, allowing determination of the background and threshol...
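A hedged sketch of the univariate workflow described: IQR-based outlier removal, a normality test, and background/threshold estimates, using pandas and scipy. The paper's exact script is not reproduced here; the assay values are synthetic, and the background + 2*sigma threshold is one common convention, not necessarily the paper's.

    # Univariate geochemical sketch: outlier removal, normality test, threshold.
    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(0)
    cu_ppm = pd.Series(np.exp(rng.normal(3.0, 0.5, 500)))   # synthetic Cu assays

    q1, q3 = cu_ppm.quantile([0.25, 0.75])                  # interquartile range
    iqr = q3 - q1
    clean = cu_ppm[(cu_ppm >= q1 - 1.5*iqr) & (cu_ppm <= q3 + 1.5*iqr)]

    stat, p = stats.shapiro(np.log(clean))    # test log-normality of the cleaned data
    background = clean.mean()
    threshold = clean.mean() + 2*clean.std()  # a common background + 2*sigma rule
    print(f"Shapiro p={p:.3f}, background={background:.1f}, threshold={threshold:.1f}")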
11
article
Globalization has intensified competition in many markets. To remain competitive, companies seek to satisfy the needs of customers by meeting market requirements. In this context, Process Capability Indices (PCI) play a crucial role in assessing the quality of processes. For non-normal data there are two general approaches, based on transformations (Box-Cox and Johnson transformations) and on percentiles (Pearson's and Burr's distribution systems). However, previous studies comparing these methods reach different conclusions, and thus the need arises to clarify the differences between them in order to implement a proper estimation of these indices. In this paper, a simulation study is conducted to compare the above methods and to propose an appropriate methodology for estimating the PCI for non-normal data. Furthermore, it is concluded that the best method ...
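To illustrate the transformation-based approach named above, this sketch estimates Cpk on non-normal data by applying a Box-Cox transformation to both the data and the specification limits. The data are synthetic and the specification limits hypothetical; the paper's simulation design is not reproduced.

    # Box-Cox approach to a Process Capability Index for non-normal data.
    import numpy as np
    from scipy import stats
    from scipy.special import boxcox

    rng = np.random.default_rng(1)
    x = rng.lognormal(mean=0.0, sigma=0.4, size=1000)   # skewed process data
    LSL, USL = 0.3, 3.5                                 # hypothetical spec limits

    xt, lam = stats.boxcox(x)                 # fit lambda on the data
    lsl_t = boxcox(LSL, lam)                  # transform the limits with the same lambda
    usl_t = boxcox(USL, lam)

    mu, sigma = xt.mean(), xt.std(ddof=1)
    cpk = min(usl_t - mu, mu - lsl_t) / (3*sigma)
    print(f"lambda={lam:.3f}, Cpk={cpk:.3f}")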
12
article
Globalization has intensified competition in many markets. To remain competitive, companies seek to satisfy the needs of customers by meeting market requirements. In this context, Process Capability Indices (PCI) play a crucial role in assessing the quality of processes. For non-normal data there are two general approaches, based on transformations (Box-Cox and Johnson transformations) and on percentiles (Pearson's and Burr's distribution systems). However, previous studies comparing these methods reach different conclusions, and thus the need arises to clarify the differences between them in order to implement a proper estimation of these indices. In this paper, a simulation study is conducted to compare the above methods and to propose an appropriate methodology for estimating the PCI for non-normal data. Furthermore, it is concluded that the best method ...
13
article
Objective: To evaluate the characteristics of the proposal for the improvement of administrative management processes at a university institution with world-class enterprise resource planning (ERP) systems, 2016. Method: The research is descriptive; a quasi-experimental, cross-sectional design was used. Two questionnaires were administered, on the current processes of its administrative management and on Enterprise Resource Planning (ERP) systems, along with a survey to determine the security of its information based on ISO 17799, providing data that formed a basis in SPSS v.20 and were processed with factor analysis. Results: The instruments were evaluated on a pilot sample, whose Cronbach's alpha values were 0.964 and 0.907 respectively, coefficients that qualify the data collection instruments as highly reliable. Next, the factor analysis defined four factors for management and f...
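For reference, Cronbach's alpha, the reliability coefficient reported above, can be computed directly from a respondents-by-items score matrix; the sketch below uses synthetic Likert-scale responses rather than the study's data (the study used SPSS v.20).

    # Cronbach's alpha from a respondents x items score matrix.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    items = pd.DataFrame(rng.integers(1, 6, size=(30, 10)))  # 30 respondents, 10 items

    def cronbach_alpha(df: pd.DataFrame) -> float:
        k = df.shape[1]
        item_variances = df.var(axis=0, ddof=1).sum()   # sum of per-item variances
        total_variance = df.sum(axis=1).var(ddof=1)     # variance of total scores
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    print(f"alpha = {cronbach_alpha(items):.3f}")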
14
undergraduate thesis
Mining companies are constantly searching for new technologies to increase their productivity. One of the technologies that allows them to reconstruct the surface without putting their workers' lives at risk is the use of LiDAR sensors together with mobile platforms that rotate the sensor to perform a complete scan of the structure. However, the data processing is carried out on computers located outside the mine, owing to its high computational cost, which translates into a high cost in time. The main objective of this thesis is the design of a parallel algorithm for the fusion of point clouds captured by a LiDAR and the reconstruction of the surface in real time, in order to reduce the processing time, taking into account a priori information about the scanning pattern of the points. In the l...
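A hedged sketch of the parallel idea only: split an incoming point cloud into chunks and transform them concurrently with Python's multiprocessing. The thesis targets a lower-level parallel design exploiting the scan pattern; this sketch, on synthetic data, merely illustrates the chunk decomposition.

    # Parallel point-cloud transform: process chunks of points concurrently.
    import numpy as np
    from multiprocessing import Pool

    R = np.eye(3)                          # rigid rotation (identity for the demo)
    t = np.array([0.1, 0.0, 0.0])          # translation

    def transform_chunk(chunk):
        return chunk @ R.T + t             # apply the rigid transform to one chunk

    if __name__ == "__main__":
        cloud = np.random.default_rng(3).normal(size=(1_000_000, 3))  # synthetic points
        chunks = np.array_split(cloud, 8)  # one chunk per worker
        with Pool(processes=8) as pool:
            fused = np.vstack(pool.map(transform_chunk, chunks))
        print(fused.shape)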
15
undergraduate thesis
Mining companies are constantly searching for new technologies to increase their productivity. One of the technologies that allows them to reconstruct the surface without putting their workers' lives at risk is the use of LiDAR sensors together with mobile platforms that rotate the sensor to perform a complete scan of the structure. However, the data processing is carried out on computers located outside the mine, owing to its high computational cost, which translates into a high cost in time. The main objective of this thesis is the design of a parallel algorithm for the fusion of point clouds captured by a LiDAR and the reconstruction of the surface in real time, in order to reduce the processing time, taking into account a priori information about the scanning pattern of the points. In the l...
16
doctoral thesis
This dissertation investigates the potential improvement of volcanic-eruption understanding and forecasting methods by using advanced data processing techniques to analyze large datasets at three target volcanoes: Piton de la Fournaise (PdlF) (France), Sabancaya, and Ubinas (Peru). The central objective of this study is to search for possible empirical relationships between the pre-eruptive behavior of the accelerated increase in seismic activity, using the Failure Forecast Method (FFM), and velocity variations measured by Coda Wave Interferometry (CWI), since both observations are reported to be independently associated with damage to the medium. The FFM is a deterministic method used to forecast volcanic eruptions using an empirical relationship of the increasing and accelerating evolution of an observable (e.g., volcano-seismic event rates). The event rates used with the FFM in this study were generate...
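To make the FFM concrete: in its common inverse-rate form, 1/rate is fit with a straight line and extrapolated to zero; the intercept with the time axis gives the forecast failure (eruption) time. The sketch below uses a synthetic accelerating event rate, not the study's catalogs, and the inverse-rate linearization assumes the standard exponent of 2.

    # Failure Forecast Method, inverse-rate form: extrapolate 1/rate to zero.
    import numpy as np

    t = np.arange(0.0, 9.0)                   # days (synthetic observation window)
    rate = 100.0 / (10.0 - t)                 # accelerating event rate toward t = 10

    inv = 1.0 / rate                          # inverse rate is linear in time here
    slope, intercept = np.polyfit(t, inv, 1)  # least-squares straight line
    t_f = -intercept / slope                  # time where 1/rate reaches zero

    print(f"forecast failure time: t = {t_f:.2f} days")  # ~10 for this synthetic case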
17
doctoral thesis
This dissertation investigates the potential improvement of volcanic-eruption understanding and forecasting methods by using advanced data processing techniques to analyze large datasets at three target volcanoes: Piton de la Fournaise (PdlF) (France), Sabancaya, and Ubinas (Peru). The central objective of this study is to search for possible empirical relationships between the pre-eruptive behavior of the accelerated increase in seismic activity, using the Failure Forecast Method (FFM), and velocity variations measured by Coda Wave Interferometry (CWI), since both observations are reported to be independently associated with damage to the medium. The FFM is a deterministic method used to forecast volcanic eruptions using an empirical relationship of the increasing and accelerating evolution of an observable (e.g., volcano-seismic event rates). The event rates used with the FFM in this study were generate...
18
article
This article presents a methodology that applies natural language processing and classification algorithms using data mining techniques, incorporating procedures for validation and verification of significance. This is conducted according to the analysis and selection of data and results based on quality statistical analysis, which guarantees the percentage of effectiveness in knowledge construction. The analysis of computer incidents within an educational institution, using a standardized database of historical computer incidents collected by the Service Desk area, serves as the case study. This area is linked to all information technology processes and focuses on the support requirements for the performance of employee activities. As long as users' requirements are not fulfilled in a timely manner, the impact of incidents may give rise to work problems at different levels, making it d...
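A hedged sketch of the text-classification core such a methodology implies: TF-IDF features feeding a linear classifier over incident descriptions. scikit-learn is an assumption for illustration (the article does not name its tooling), and the tickets and categories below are invented.

    # Incident-ticket classification sketch: TF-IDF features + linear classifier.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    tickets = ["printer not responding in lab 3",
               "cannot log in to email account",
               "projector shows no signal",
               "password reset requested for portal"]
    labels = ["hardware", "accounts", "hardware", "accounts"]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(tickets, labels)                      # train on labeled incidents
    print(model.predict(["screen stays black after boot"]))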
19
article
This article presents a methodology that applies natural language processing and classification algorithms using data mining techniques, incorporating procedures for validation and verification of significance. This is conducted according to the analysis and selection of data and results based on quality statistical analysis, which guarantees the percentage of effectiveness in knowledge construction. The analysis of computer incidents within an educational institution, using a standardized database of historical computer incidents collected by the Service Desk area, serves as the case study. This area is linked to all information technology processes and focuses on the support requirements for the performance of employee activities. As long as users' requirements are not fulfilled in a timely manner, the impact of incidents may give rise to work problems at different levels, making it d...
20
article
A Schlumberger resistivity survey was carried out over an area of 50 hectares. A new processing approach based on the analytic signal response of the resistivity data was tested in the presence of disturbed phosphate deposits. Geological models were successively obtained from a peak model of the 2D resistivity data. The optimization of the imaging process was based on the optimization of surface tools. The downward analytic continuation of the modeled surface to a depth of 30 meters was used to optimize the modeling. The analytic-signal processes proved consistently useful. The estimate of the phosphate reserve was improved and better constrained.
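As a hedged sketch of an analytic-signal computation on a 2D profile: the amplitude combines the horizontal derivative with its Hilbert transform, a standard stand-in for the vertical derivative on profile data. This illustrates the general technique, not the article's specific workflow; the resistivity profile below is synthetic.

    # Analytic-signal amplitude of a 1D resistivity profile.
    import numpy as np
    from scipy.signal import hilbert

    x = np.linspace(0.0, 500.0, 501)                   # station positions (m)
    rho = 100.0 + 40.0*np.exp(-((x - 250.0)/30.0)**2)  # synthetic anomaly (ohm-m)

    dfdx = np.gradient(rho, x)          # horizontal derivative of the profile
    dfdz = np.imag(hilbert(dfdx))       # Hilbert transform ~ vertical derivative
    amplitude = np.hypot(dfdx, dfdz)    # analytic-signal amplitude

    print(f"peak amplitude at x = {x[np.argmax(amplitude)]:.0f} m")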