Book · PDF available

Geoinformática aplicada con Aprendizaje Basado en Problemas


Abstract and Figures

This book is a product of the research project "Design and implementation of problem-based virtual learning objects for specific training in open-source Geographic Information Systems (OpenGIS) and Free and Open Source Software (FOSS)". The purpose of the text is to present, methodically, the analysis and solution of problems with a geoinformatics component (acquisition, storage, processing and display of geodata), in order to support the development of capabilities in training processes in Geographic Information Systems (GIS), Geoinformatics and related areas, and to propose problems whose analysis and solution can generate research ideas and new topics for classroom work. Structuring interactive content on geoinformatics topics that can later be packaged as easy-to-use learning objects requires forming work teams of specialists in different areas of geoinformatics to design contextualized problems that are of interest to students and to industry; that demand cooperative work, justification and explanation of assumptions, and decision-making by the students; and that motivate students to understand new concepts, relate the topic to their environment through motivating opening questions, generate controversy, create links with prior knowledge, and admit diverse answers and solutions. To meet these requirements, the book adopts the Geospatial Management Competency Model (GTCM), developed by The Urban and Regional Information Systems Association (URISA), which seeks to develop personal, academic, workplace, industry, occupation-specific and management competencies. Moreover, although the book contains problems that lay out the analysis process and present one solution, they can also be adapted in the classroom to obtain other solutions in a collaborative, contextualized way and for different disciplines, as suggested by the concept of Problem-Based Learning (PBL). The chapter authors designed the problems taking into account the areas of the GTCM model and covering broad competencies, such as data handling, generation, processing and analysis, software use, project management, product generation and programming, together with the corresponding skills to be acquired. The text serves as support for undergraduate and graduate courses that require solving problems with geoinformatics components, for example in Environmental Engineering (cartography and surveying, hydraulics and hydrology, land-use planning, environmental assessment, geomatics, environmental management, modeling of natural resources in air, water and soil), Industrial Engineering (logistics, operations strategy, marketing), Systems Engineering (new GIS technologies, Python GIS programming, software design and development, spatial databases), Multimedia Engineering (development of interactive learning content), Electronic Engineering (remote monitoring and telemetry) and Sound Engineering (environmental noise modeling and mapping), as well as for courses in GIS specialization programs and in a Master's program in Geoinformatics.
Although the book is the product of a research project, its creation was also a research effort in itself, not only because it brought together 27 specialists in geoinformatics areas and topics, but also because it was necessary to design a model for adapting PBL, a didactic strategy for classroom interaction, to a written document with its corresponding evaluation template. In addition, a new project will be needed to design and develop interactive content in the area of geoinformatics, adopting and adapting agile practices from Software Engineering, with the authorization of the authors and designers of the problems presented in this book.
Technical summary of DEM_10.img, obtained from the dataset using the Layer Properties tool of the ArcGIS 10.2 suite. This DEM has a radiometric depth of 32 bits and a spatial resolution of 10 m × 10 m per pixel, a quality that allows cartographic products at scales of 1:20,000 to 1:25,000, taking into account the resolution characteristics of the image and the detectable-resolution principle. In addition, it is correctly georeferenced in the MAGNA reference system, Gauss projection, Central origin. The choice of scale is supported by citing IGAC [4], which defines the procedures for generating the country's official cartography and sets out the criteria for evaluating the accuracies that must be guaranteed in products according to their scale. For the Institute, the processes and instruments used for photogrammetric restitution must be such that the final maps comply with the following minimum standards:
▪ Planimetric accuracy: 90% of the points extracted from the map, except those necessarily displaced by symbol exaggeration, must lie within 0.5 mm, at map scale, of their true positions. The corresponding root mean square error is 0.30 mm at map scale.
▪ Altimetric accuracy: 90% of the contour lines, and of the elevations interpolated from those contours, must lie within half of the basic contour interval. If c is this interval, the root mean square error is 0.3c.
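As a quick illustration of the accuracy rules quoted above, the following minimal Python sketch converts the planimetric tolerances (0.5 mm and 0.30 mm at map scale) and the altimetric tolerances (½ interval and 0.3c) into ground units; the scale denominators and the 5 m contour interval are hypothetical example values, not taken from the book:

```python
# Minimal sketch: converting the map-scale accuracy rules described above
# into ground units. Scale denominators and the contour interval are
# hypothetical example values.

def planimetric_tolerances(scale_denominator):
    """Return (90% circular error, RMSE) in metres on the ground.

    Rule described above: 90% of well-defined points within 0.5 mm at map
    scale; the corresponding RMSE is 0.30 mm at map scale.
    """
    ce90_m = 0.5e-3 * scale_denominator   # 0.5 mm * scale, in metres
    rmse_m = 0.3e-3 * scale_denominator   # 0.30 mm * scale, in metres
    return ce90_m, rmse_m

def altimetric_tolerances(contour_interval_m):
    """Return (90% vertical tolerance, RMSE) in metres.

    Rule described above: 90% of contours and interpolated elevations
    within half the basic contour interval c; RMSE = 0.3 * c.
    """
    return 0.5 * contour_interval_m, 0.3 * contour_interval_m

if __name__ == "__main__":
    for denom in (20_000, 25_000):        # scales discussed for the 10 m DEM
        ce90, rmse = planimetric_tolerances(denom)
        print(f"1:{denom}: CE90 = {ce90:.1f} m, planimetric RMSE = {rmse:.1f} m")
    tol90, rmse_z = altimetric_tolerances(5.0)  # hypothetical 5 m contour interval
    print(f"5 m contours: 90% tolerance = {tol90:.1f} m, vertical RMSE = {rmse_z:.1f} m")
```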
… 
References
Article
Full-text available
PM10 prediction has attracted special legislative and scientific attention due to its harmful effects on human health. Statistical techniques have the potential for high-accuracy PM10 prediction and accordingly, previous studies on statistical methods for temporal, spatial and spatio-temporal prediction of PM10 are reviewed and discussed in this paper. A review of previous studies demonstrates that Support Vector Machines, Artificial Neural Networks and hybrid techniques show promise for suitable temporal PM10 prediction. A review of the spatial predictions of PM10 shows that the LUR (Land Use Regression) approach has been successfully utilized for spatial prediction of PM10 in urban areas. Of the six introduced approaches for spatio-temporal prediction of PM10, only one approach is suitable for high-resolved prediction (Spatial resolution < 100 m; Temporal resolution ≤ 24 h). In this approach, based upon the LUR modeling method, short-term dynamic input variables are employed as explanatory variables alongside typical non-dynamic input variables in a non-linear modeling procedure.
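As an illustration of the kind of temporal statistical prediction this review discusses, the following minimal Python sketch fits a Support Vector Machine regressor to lagged values of a synthetic daily PM10 series; the data, lag count and hyperparameters are hypothetical and only show the general workflow, not any specific study's model:

```python
# Minimal sketch of temporal PM10 prediction with a Support Vector Machine,
# one of the model families highlighted in the review. The daily series is
# synthetic and the lag/hyperparameter choices are hypothetical.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
pm10 = 40 + 10 * np.sin(np.arange(400) / 7.0) + rng.normal(0, 5, 400)  # synthetic daily PM10

# Build lagged features: predict day t from days t-3 .. t-1
lags = 3
X = np.column_stack([pm10[i:len(pm10) - lags + i] for i in range(lags)])
y = pm10[lags:]

split = 300  # simple temporal hold-out
model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=1.0))
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"Hold-out RMSE: {rmse:.2f} µg/m³")
```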
Article
Full-text available
This paper presents a procedure to address the lack of spatial air quality data in urban areas, based on the use of Geographic Information Systems (GIS) and spatial interpolation techniques as an alternative to conventional methods of statistical imputation. Two spatial interpolation algorithms are compared: IDW and spline. The procedure considers the spatial interpolation process, cross-validation with the index of agreement (IOA), and analysis of the effect of sampling density and the coefficient of variation (CVOi), using different error statistics. The interpolation maps are complemented with gradient and directional gradient maps that may serve as complementary aids in the definition of critical sampling points. The procedure is applied to data imputation of three pollutants, NO2, PM10 (particulate matter of diameter 10 microns) and TSP (total suspended particles), from observed data samples in the city of Medellín (Colombia).
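The following minimal Python sketch illustrates, with hypothetical station coordinates and PM10 values (not the Medellín data), the two ingredients the procedure combines: IDW interpolation and leave-one-out cross-validation scored with the index of agreement (IOA):

```python
# Minimal sketch of IDW interpolation plus the index of agreement (IOA)
# used for cross-validation. Station locations and values are hypothetical.
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at each query point."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w @ z_known) / w.sum(axis=1)

def index_of_agreement(obs, pred):
    """Willmott's index of agreement, d in [0, 1]."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    denom = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - np.sum((pred - obs) ** 2) / denom

# Leave-one-out cross-validation over hypothetical monitoring stations
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
pm10 = np.array([55.0, 48.0, 62.0, 50.0, 58.0])

preds = []
for i in range(len(stations)):
    keep = np.arange(len(stations)) != i
    preds.append(idw(stations[keep], pm10[keep], stations[i:i + 1])[0])

print("IOA (leave-one-out):", round(index_of_agreement(pm10, preds), 3))
```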
Article
Full-text available
TOBEL is a Geographic Information System developed entirely by one of the leading Bulgarian geo-information companies, "Mapex" JSC. The system is based on modern information technology and is designed specifically for Bulgarian authorities. GIS TOBEL provides municipalities with extraordinary quantitative and qualitative benefits. The system offers quick access, evaluation, and format conversion. It also allows producing interactive maps from different sources, leveraging database information, and automating work processes. The paper contains a description of the main functions of the system, the data used, and the whole process of development and system integration in Bulgarian municipalities. Examples of successfully operating GIS systems integrated by our company are demonstrated.
Book
Putting Crime in its Place: Units of Analysis in Geographic Criminology focuses on the units of analysis used in geographic criminology. While crime and place studies have been a part of criminology since the early 19th century, growing interest in crime places over the last two decades demands critical reflection on the units of analysis that should form the focus of geographic analysis of crime. Should the focus be on very small units such as street addresses or street segments, or on larger aggregates such as census tracts or communities? Academic researchers, as well as practical crime analysts, are confronted routinely with the dilemma of deciding what the unit of analysis should be when reporting on trends in crime, when identifying crime hot spots or when mapping crime in cities. In place-based crime prevention, the choice of the level of aggregation plays a particularly critical role. This peer-reviewed collection of essays aims to contribute to crime and place studies by making explicit the problems involved in choosing units of analysis in geographic criminology. Written by renowned experts in the field, the chapters in this book address basic academic questions, and also provide real-life examples and applications of how they are resolved in cutting-edge research. Crime analysts in police and law enforcement agencies as well as academic researchers studying the spatial distributions of crime and victimization will learn from the discussions and tools presented.
Chapter
Criminal activities are a major risk factor for the well-being of society in many countries. Police patrol service is a critical instrument for combating criminal activities with violent aspects. The allocation of police resources for street patrolling is one of the most important tactical management activities, and it is important to continuously improve patrolling strategies. Previous research mainly focuses on determining importance scores of locations based on hotspot analysis, or on identifying important routes based on the topology of road networks. There are several limitations in current patrol planning studies. First, the patterns in the distribution of hotspots were rarely considered in patrol planning. Second, some existing patrol optimization algorithms can lead to a predictable patroller. Third, the overall performance of multiple patrol activities is often not optimized. Finally, more efficient algorithms are needed for a real-time solution that serves large jurisdictions. To deal with these limitations, this chapter aims to integrate a spatial pattern identification approach with an efficient route optimization algorithm to produce randomized optimal patrol routes. A case study illustrates the approach, showing that it can improve the overall effectiveness of multiple unpredictable patrol activities and that it is efficient enough to be used as a real-time solution.
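The following minimal, self-contained Python sketch illustrates only the general idea of randomized patrol routing over weighted hotspots, not the chapter's algorithm; the hotspot coordinates, weights and temperature parameter are hypothetical:

```python
# Minimal sketch of the general idea of unpredictable patrol routes:
# instead of always visiting the nearest/highest-weight hotspot next
# (which a deterministic optimizer would do, and an offender could learn),
# the next stop is sampled with probability related to a score that trades
# off hotspot weight against travel distance. Coordinates, weights and the
# temperature parameter are hypothetical; this is not the chapter's method.
import math
import random

rng = random.Random(42)

hotspots = {          # hypothetical hotspot id -> (x, y, crime weight)
    "A": (0.0, 0.0, 5.0),
    "B": (2.0, 1.0, 8.0),
    "C": (1.0, 3.0, 3.0),
    "D": (4.0, 2.0, 6.0),
}

def randomized_route(start, temperature=1.0):
    """Visit every hotspot once, sampling each next stop stochastically."""
    remaining = set(hotspots) - {start}
    route, current = [start], start
    while remaining:
        cx, cy, _ = hotspots[current]
        # Score candidates: prefer heavily weighted hotspots that are close by.
        scores = []
        for h in remaining:
            hx, hy, w = hotspots[h]
            scores.append((h, w / (1.0 + math.hypot(hx - cx, hy - cy))))
        # Softmax-style sampling makes the resulting route hard to predict.
        total = sum(math.exp(s / temperature) for _, s in scores)
        r, acc = rng.random() * total, 0.0
        for h, s in scores:
            acc += math.exp(s / temperature)
            if acc >= r:
                break
        route.append(h)
        remaining.discard(h)
        current = h
    return route

for _ in range(3):   # repeated calls typically yield different near-greedy orderings
    print(randomized_route("A"))
```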
Article
Estimation of daily average exposure to PM10 (particulate matter with an aerodynamic diameter < 10 μm) using the available fixed-site monitoring stations (FSMs) in a city poses a great challenge. This is because FSMs are typically limited in number relative to the spatial representativeness of their measurements, and because statistical models of citywide exposure have yet to be explored in this context. This paper deals with the latter aspect of this challenge and extends the widely used land use regression (LUR) approach to deal with temporal changes in air pollution and the influence of transboundary air pollution on short-term variations in PM10. Using multiple linear regression (MLR) modeling, the average daily concentrations of PM10 in two European cities, Vienna and Dublin, were modeled. Models were initially developed with the standard MLR approach in Vienna using the most recently available data. Efforts were subsequently made to (i) assess the stability of model predictions over time and (ii) explore the applicability of nonparametric regression (NPR) and artificial neural networks (ANNs) to deal with the nonlinearity of the input variables. The predictive performance of the MLR models was demonstrated to be stable over time and to produce similar results in both cities. However, NPR and ANNs were found to improve predictive performance further in both cities. ANNs produced the best results, with daily PM10 exposure predicted at R² = 66% for Vienna and 51% for Dublin. In addition, two new predictor variables were assessed for the Dublin model: the variables representing transboundary air pollution and peak traffic count were found to account for 6.5% and 12.7%, respectively, of the variation in average daily PM10 concentration. The variable representing transboundary air pollution, derived from air-mass history (back-trajectory analysis) and population density, demonstrated a positive impact on model performance. Implications: This research suggests that it is possible to produce a model of ambient air quality on a citywide scale using readily available data. Most European cities typically have a limited FSM network reporting average daily concentrations of air pollutants, as well as available meteorological, traffic, and land-use data. This research highlights that using these data in combination with advanced statistical techniques such as NPR or ANNs will produce reasonably accurate predictions of ambient air quality across a city, including temporal variations. This approach therefore reduces the need for additional measurement data to supplement existing historical records and enables a lower-cost method of air pollution model development for practitioners and policy makers.
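As an illustration of the LUR/MLR modeling style described above, the following minimal Python sketch fits a multiple linear regression of daily PM10 on a few land-use, traffic and meteorological predictors; the predictor names and the synthetic data are hypothetical, not the Vienna or Dublin datasets:

```python
# Minimal sketch of an LUR-style multiple linear regression for daily PM10,
# in the spirit of the MLR baseline described above. Predictor names and
# the synthetic data are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 500
traffic = rng.uniform(0, 1, n)         # e.g. scaled peak traffic count
road_length = rng.uniform(0, 1, n)     # road length within a buffer
wind_speed = rng.uniform(0.5, 8.0, n)  # daily mean wind speed (m/s)
transboundary = rng.uniform(0, 1, n)   # e.g. back-trajectory-derived index

# Synthetic daily PM10 (µg/m³) with noise, just to make the example run
pm10 = (20 + 25 * traffic + 10 * road_length - 1.5 * wind_speed
        + 8 * transboundary + rng.normal(0, 4, n))

X = np.column_stack([traffic, road_length, wind_speed, transboundary])
X_tr, X_te, y_tr, y_te = train_test_split(X, pm10, test_size=0.3, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
print("Coefficients:", dict(zip(
    ["traffic", "road_length", "wind_speed", "transboundary"], mlr.coef_.round(2))))
print("Hold-out R^2:", round(r2_score(y_te, mlr.predict(X_te)), 2))
```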