Conference Paper
Scalable algorithms for large high-resolution terrain data.
DOI: 10.1145/1823854.1823878 Conference: Proceedings of the 1st International Conference and Exhibition on Computing for Geospatial Research & Application, COM.Geo 2010, Washington, DC, USA, June 21-23, 2010
Source: DBLP

Conference Paper: TerraNNI: natural neighbor interpolation on a 3D grid using a GPU.
ABSTRACT: With the modern focus on LiDAR technology, the amount of topographic data, in the form of massive point clouds, has increased dramatically. Furthermore, due to the popularity of LiDAR, repeated surveys of the same areas are becoming more common. This trend will only increase as topographic changes prompt surveys over already scanned terrain, in which case we obtain large spatiotemporal data sets. In dynamic terrains, such as coastal regions, such spatiotemporal data can offer interesting insight into how the terrain changes over time. An initial step in the analysis of such data is to create a digital elevation model representing the terrain over time. For spatiotemporal data sets, those models often represent elevation on a 3D volumetric grid, which requires interpolating the elevation of LiDAR points at the grid points. In this paper we show how to efficiently perform natural neighbor interpolation over a 3D volumetric grid. Using a graphics processing unit (GPU), we describe different algorithms that attain various speed and GPU-memory trade-offs. Our algorithm extends to higher dimensions. Our experimental results demonstrate that the algorithm is efficient and scalable. 19th ACM SIGSPATIAL International Symposium on Advances in Geographic Information Systems, ACM-GIS 2011, November 1-4, 2011, Chicago, IL, USA, Proceedings; 01/2011
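The paper's GPU pipeline is not reproduced here, but the underlying idea of discrete natural neighbor interpolation can be sketched on a CPU: rasterize the Voronoi diagram of the sample points, then weight each sample by the area its Voronoi cell would lose if the query point were inserted. The following NumPy sketch (function and parameter names are illustrative, not taken from the paper) does this on a small 2D raster:

```python
import numpy as np

def discrete_nni(sites, values, query, res=200, extent=2.0):
    """Discrete natural neighbor interpolation on a 2D raster.

    sites:  (n, 2) array of sample coordinates
    values: (n,) array of sample values
    query:  (2,) query coordinate inside [-extent, extent]^2
    """
    xs = np.linspace(-extent, extent, res)
    X, Y = np.meshgrid(xs, xs)
    pix = np.stack([X.ravel(), Y.ravel()], axis=1)   # (res*res, 2) pixel centers

    # Squared distance from every pixel to every site gives the Voronoi raster.
    d2 = ((pix[:, None, :] - sites[None, :, :]) ** 2).sum(-1)
    owner = d2.argmin(axis=1)                        # nearest site per pixel

    # Pixels the inserted query would "steal": strictly closer to the query
    # than to their current owner.
    dq2 = ((pix - query) ** 2).sum(-1)
    stolen = dq2 < d2[np.arange(len(pix)), owner]

    # Natural-neighbor weight of each site = fraction of the stolen area
    # that previously belonged to that site's Voronoi cell.
    counts = np.bincount(owner[stolen], minlength=len(sites))
    if counts.sum() == 0:                            # query coincides with a site
        return float("nan")
    w = counts / counts.sum()
    return float(w @ values)
```

For four samples at the corners of a square, a query at the center steals equal area from each cell and returns the mean of the four values. Replacing the per-pixel distance loop with rasterized distance computations is, roughly, what makes this formulation amenable to graphics hardware; extending the raster to a 3D volumetric grid raises the memory pressure that the paper's speed/GPU-memory trade-offs address.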
ABSTRACT: Hand in hand with the increasing availability of high-resolution digital elevation models (DEMs), efficient computation of land-surface parameters (LSPs) for large-scale digital elevation models becomes more and more important, in particular for web-based applications. Parallel processing using multiple threads on multi-core processors is a standard approach to decreasing computing time for the calculation of local LSPs based on moving-window operations (e.g. slope, curvature). LSPs whose calculation involves non-local dependencies (e.g. hydrological connectivity of grid cells) make parallelization quite challenging. Using the calculation of the LSP 'flow accumulation' as an example, we test two parallelization strategies, 'spatial decomposition' and a 'two-phase approach', for their suitability to manage such non-localities. Three digital elevation models with high spatial resolution are used in our evaluation. These models represent landscape types of Central Europe with highly diverse geomorphic characteristics: a high-mountain area, a low mountain range, and a floodplain area in the lowlands. Both parallelization strategies are evaluated with regard to their usability on these diversely structured areas. Besides a correctness analysis of the calculated relief parameters (i.e. catchment areas), priority is given to the analysis of the speedups achieved by each strategy. As presumed, local surface parameters allow an almost ideal speedup. The situation is different for the calculation of non-local parameters, which require specific strategies depending on the type of landscape. Nevertheless, a significant decrease in computation time has still been achieved.
While the speedups for the high-mountain data set are higher with the 'spatial decomposition' approach (3.2 using four processors and 4.2 using eight processors), the 'two-phase approach' proved more efficient for the low mountain and floodplain data sets (2.6 using four processors and 2.9 using eight processors), where more non-localities occur in flat areas (e.g. filled sinks and floodplains). Computers & Geosciences 08/2012; 44:19.
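To see why flow accumulation resists naive parallelization, consider a minimal serial D8 sketch (illustrative code, not the parallel implementations evaluated in the paper): each cell drains to its steepest downslope neighbor, and a cell's accumulation value is final only after every upstream contributor has been processed. That ordering constraint is exactly the non-local data dependency both parallelization strategies must manage.

```python
import numpy as np

def flow_accumulation_d8(dem):
    """Serial D8 flow accumulation on a DEM grid.

    Each cell drains to its steepest downslope neighbor (diagonal drops are
    scaled by sqrt(2)); the accumulation of a cell counts all upstream cells
    plus itself. Processing cells from highest to lowest elevation guarantees
    every contributor is finalized before its receiver, which is the data
    dependency that makes parallelization nontrivial.
    """
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=np.int64)          # every cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]         # flat indices, high -> low
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    for idx in order:
        r, c = divmod(int(idx), cols)
        best, drop = None, 0.0
        for dr, dc in nbrs:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                dist = 2 ** 0.5 if dr and dc else 1.0
                d = (dem[r, c] - dem[rr, cc]) / dist
                if d > drop:
                    best, drop = (rr, cc), d
        if best is not None:                         # pits keep their own count
            acc[best] += acc[r, c]
    return acc
```

On a tilted plane such as `np.arange(25.).reshape(5, 5)`, all 25 cells ultimately drain to the lowest corner, whose accumulation becomes 25. A 'spatial decomposition' strategy would split the grid into tiles and exchange accumulation along tile borders, while a 'two-phase approach' would first resolve local drainage and then propagate the cross-region flow; both must serialize exactly the dependencies this loop makes explicit.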