Article

Accuracy assessment of digital elevation models by means of robust statistical methods


Abstract

Measures for the accuracy assessment of Digital Elevation Models (DEMs) are discussed and characteristics of DEMs derived from laser scanning and automated photogrammetry are presented. Such DEMs are very dense and relatively accurate in open terrain. Built-up and wooded areas, however, need automated filtering and classification in order to generate terrain (bare-earth) data when Digital Terrain Models (DTMs) have to be produced. Automated processing of the raw data is not always successful. Systematic errors and many outliers may therefore be present in data sets from both methods (laser scanning and digital photogrammetry). We discuss requirements for the reference data with respect to accuracy and propose robust statistical methods as accuracy measures. Their use is illustrated by application to four practical examples. It is concluded that measures such as the median, the normalized median absolute deviation, and sample quantiles should be used in the accuracy assessment of such DEMs. Furthermore, we discuss how large a sample size is needed to obtain sufficiently precise estimates of the new accuracy measures, and present the relevant formulae.
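The proposed measures are compact to compute. The following Python sketch illustrates them (the function name and the 68.3%/95% probability levels are illustrative choices, not the paper's code; errors are taken as checkpoint height minus DEM height):

```python
import numpy as np

def robust_dem_accuracy(dh, probs=(0.683, 0.95)):
    """Robust accuracy measures for DEM vertical errors dh.

    A minimal sketch of the measures advocated in the paper: the median
    as a robust estimate of systematic error, the normalized median
    absolute deviation (NMAD) as a robust counterpart of the standard
    deviation, and sample quantiles of the absolute errors.
    """
    dh = np.asarray(dh, dtype=float)
    median = np.median(dh)                          # robust systematic error
    nmad = 1.4826 * np.median(np.abs(dh - median))  # comparable to std. dev. for normal errors
    quantiles = np.quantile(np.abs(dh), probs)      # e.g. 68.3% and 95% absolute-error quantiles
    return median, nmad, dict(zip(probs, quantiles))
```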


... Once signatures had been selected, samples were tested for normality using Shapiro-Wilk tests [35]. After that, visual inspection of density plots and quantile-quantile plots was carried out [36]. If samples were found to fit a Gaussian distribution, then subsequent analyses adopted a parametric approach, while non-Gaussian distributions were studied using robust statistical methods (sensu [25]). ...
... For descriptive statistics, the sample mean or median were used to calculate central tendency, using the former for homogeneous and the latter for inhomogeneous distributions respectively [36][37][38][39][40][41]. Likewise, sample variance was assessed using either the standard deviation or the Square Root of the Biweight Midvariance (√BWMV), for homogeneous and inhomogeneous distributions respectively [38,41]. Next, non-symmetric 95% confidence intervals were constructed using the [0.05, 0.95] interquantile range [36]. ...
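This branching workflow is straightforward to reproduce. A minimal sketch, assuming the standard definition of the biweight midvariance (the cited studies do not publish their code, so the function name and significance level are illustrative):

```python
import numpy as np
from scipy import stats

def describe(sample, alpha=0.05):
    """Descriptive statistics chosen by a Shapiro-Wilk normality test,
    mirroring the workflow in the snippets above: mean/standard deviation
    for Gaussian samples, median/sqrt(BWMV) otherwise, plus a
    non-symmetric [0.05, 0.95] interquantile interval.
    """
    x = np.asarray(sample, dtype=float)
    _, p = stats.shapiro(x)
    if p > alpha:                                       # no evidence against normality
        centre, spread = x.mean(), x.std(ddof=1)
    else:                                               # robust branch
        m = np.median(x)
        mad = np.median(np.abs(x - m))                  # median absolute deviation
        u = (x - m) / (9.0 * mad)                       # standardised distances
        w = np.abs(u) < 1.0                             # points inside the biweight window
        num = len(x) * np.sum((x - m) ** 2 * (1 - u**2) ** 4 * w)
        den = np.sum((1 - u**2) * (1 - 5 * u**2) * w) ** 2
        centre, spread = m, np.sqrt(num / den)          # sqrt(BWMV)
    return centre, spread, np.quantile(x, [0.05, 0.95])
```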
Article
Full-text available
One of the most common forms of cancer in fair-skinned populations is Non-Melanoma Skin Cancer (NMSC), which primarily consists of Basal Cell Carcinoma (BCC) and cutaneous Squamous Cell Carcinoma (SCC). Detecting NMSC early can significantly improve treatment outcomes and reduce medical costs. Similarly, Actinic Keratosis (AK) is a common skin condition that, if left untreated, can develop into more serious conditions, such as SCC. Hyperspectral imagery is at the forefront of research to develop non-invasive techniques for the study and characterisation of skin lesions. This study aims to investigate the potential of near-infrared hyperspectral imagery in the study and identification of BCC, SCC and AK samples in comparison with healthy skin. Here we use a pushbroom hyperspectral camera with a spectral range of ≈ 900 to 1600 nm for the study of these lesions. For this purpose, an ad hoc platform was developed to facilitate image acquisition. This study employed robust statistical methods for the identification of an optimal spectral window where the different samples could be differentiated. To examine these datasets, we first tested for the homogeneity of sample distributions. Depending on these results, either traditional or robust descriptive metrics were used. This was then followed by tests concerning homoscedasticity, and finally multivariate comparisons of sample variance. The analysis revealed that the spectral regions between 900.66–1085.38 nm, 1109.06–1208.53 nm, 1236.95–1322.21 nm, and 1383.79–1454.83 nm showed the highest differences in this regard, with <1% probability of these observations being a Type I statistical error. Our findings demonstrate that hyperspectral imagery in the near-infrared spectrum is a valuable tool for analyzing, diagnosing, and evaluating non-melanoma skin lesions, contributing significantly to skin cancer research.
... However, the RMSE also has limitations as an accuracy measure. For example, it does not distinguish between random errors, systematic errors or blunders; and the errors in a DEM do not always follow a normal distribution (Carrera-Hernández 2021; Höhle and Höhle 2009; Wise 2000). The MAD and NMAD overcome these limitations due to their robustness and distribution-free approach to handling data outliers (Carrera-Hernández 2021; Höhle and Höhle 2009; Willmott and Matsuura 2005). ...
... The MAD and NMAD were calculated as follows (Höhle and Höhle 2009; Wessel et al. 2018): ...
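The snippet is truncated before the formulae; as defined in Höhle and Höhle (2009), they are

$$\mathrm{MAD} = \operatorname{median}_{j}\left(\left|\Delta h_{j} - m_{\Delta h}\right|\right), \qquad \mathrm{NMAD} = 1.4826 \cdot \mathrm{MAD},$$

where $m_{\Delta h}$ is the median of the elevation errors $\Delta h_{j}$. The factor 1.4826 makes the NMAD comparable to the standard deviation when the errors are normally distributed.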
Article
Full-text available
Validation studies of global Digital Elevation Models (DEMs) in the existing literature are limited by the diversity and spread of landscapes, terrain types considered and sparseness of ground-truth. Moreover, there are knowledge gaps on the accuracy variations in rugged and complex landscapes, and previous studies have often not relied on robust internal and external validation measures. Thus, there is still only partial understanding and limited perspective of the reliability and adequacy of global DEMs for several applications. In this study, we utilize a dense spread of LiDAR ground-truth to assess the vertical accuracies of four medium-resolution, readily available, free-access and global coverage 1 arc-second (30 m) DEMs: NASADEM, ASTER GDEM, Copernicus GLO-30, and ALOS World 3D (AW3D). The assessment is carried out at landscapes spread across Cape Town, Southern Africa (urban/industrial, agricultural, mountain, peninsula and grassland/shrubland) and forested national parks in Gabon, Central Africa (low-relief tropical rainforest and high-relief tropical rainforest). The statistical analysis is based on robust accuracy metrics that cater for normal and non-normal elevation error distribution, and error ranking. In Cape Town, Copernicus DEM generally had the least vertical error with an overall Mean Error (ME) of 0.82 m and Root Mean Square Error (RMSE) of 2.34 m while ASTER DEM had the poorest performance. However, ASTER GDEM and NASADEM performed better in the low-relief and high-relief tropical forests of Gabon. Generally, the DEM errors have a moderate to high positive correlation in forests, and a low to moderate positive correlation in mountains and urban areas. Copernicus DEM showed superior vertical accuracy in forests with less than 40% tree cover, while ASTER and NASADEM performed better in denser forests with tree cover greater than 70%. This study is a robust regional assessment of these global DEMs.
... Extensive exploration has been conducted for many years to create DSMs from photographic data and assess their precision (Pulighe and Fava, 2013). There have been numerous comparative studies of digital modelling techniques and LiDAR methods (Haala et al., 2010;Höhle and Höhle, 2009). The technique of automatic DSM extraction from aerial photos using stereoscopic matching emerged in the early 1990s, relying on feature-based matching (FBM) algorithms (Nebiker et al., 2014). ...
... Various software applications facilitate the reconstruction of dense DSMs using images, mainly applying the Semi-Global Matching (SGM) method, pioneered by Hirschmüller (2011). Despite their advantages, elevation models derived from imagery face challenges, particularly in urban and forested areas, necessitating automatic filtering and classification to distinguish ground data accurately (Höhle and Höhle, 2009). ...
Article
Full-text available
Poland as well as other countries keep extensive collections of 20th and 21st-century aerial photos, which are underexploited compared to other archival materials such as satellite imagery. Meanwhile, they offer significant research potential in various areas, including urban development, land use changes, and long-term environmental monitoring. Archival photographs are detailed, often obtained every five to ten years, and feature high resolution, from 20 cm to 1 m. Their overlap can facilitate creating precise digital models that illustrate topography and land cover, which are essential variables in many scientific contexts. However, rapidly transforming these photographs into geographically accurate measurements of the Earth’s surface poses challenges. This article explores the obstacles in automating the processing of historical photographs and presents the main scientific research directions associated with these images. Recent advancements in enhancing workflows, including the development of modern digital photogrammetry tools, algorithms, and machine learning techniques, are also discussed. These developments are crucial for unlocking the full potential of aerial photographs, making them more easily accessible and valuable for a broader range of scientific fields. These underutilized photographs are increasingly recognized as vital in various research domains due to technological advancements. Integrating new methods with these historical images offers unprecedented opportunities for scientific discovery and historical understanding, bridging the past with the future through innovative research techniques.
... An alternative would be to apply the bootstrap methodology; however, in its standard form this technique also assumes that the variable under study is independent and identically distributed. Applying it to autocorrelated data therefore yields confidence intervals that are much too narrow and thus inconsistent (Efron & Tibshirani, 1993; Höhle & Höhle, 2009). According to ... (2015), a limit distance should be chosen that best represents the spatial dependence of the variable under analysis. Generally, the maximum usable value corresponds to half of the maximum distance between the points (Ribeiro Júnior & Diggle, 2001; Diggle & Ribeiro Júnior, 2007). ...
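One standard remedy for the problem described above is a moving-block bootstrap, which resamples contiguous blocks so that short-range autocorrelation is preserved. A minimal sketch under that assumption (the block length would in practice be chosen from the spatial dependence, e.g. the variogram range; this is not the cited study's geostatistical procedure):

```python
import numpy as np

def block_bootstrap_ci(x, block, n_boot=2000, probs=(0.025, 0.975)):
    """Moving-block bootstrap confidence interval for the mean of an
    autocorrelated sample. Resampling contiguous blocks preserves
    short-range dependence, avoiding the overly narrow intervals that
    the i.i.d. bootstrap produces on autocorrelated data.
    """
    rng = np.random.default_rng()
    x = np.asarray(x, dtype=float)
    n = len(x)
    starts = np.arange(n - block + 1)              # admissible block start indices
    means = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.choice(starts, size=int(np.ceil(n / block)))
        resample = np.concatenate([x[s:s + block] for s in idx])[:n]
        means[b] = resample.mean()
    return np.quantile(means, probs)
```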
... This plot allows one to check how well the frequency distribution of the data fits any given probability distribution: in short, the quantiles of the empirical distribution function are plotted against the theoretical quantiles of the probability distribution, in this case the normal distribution. If the empirical distribution is normal, the plot will appear as a straight line (Höhle & Höhle, 2009). ...
Article
Full-text available
Inland water bodies within the national territory are of utmost importance for supplying water to the population, and their continuous monitoring is useful for adequate water management at the municipal, state, or national level. Bathymetric surveying provides systematic knowledge of reservoir depth and, consequently, of the corresponding useful water volume and the computation of silting. However, there are no methodologies in the literature that address the significance level of the detected silting. In this work, statistical and geostatistical methods were used to evaluate the degree of silting of water supply reservoirs. Two Digital Surface Models (DSMs) were created for the study area, for 2012 and 2022. The normality analysis of the residuals confirmed a normal distribution, supported by the Kolmogorov-Smirnov test and a Q-Q plot. To evaluate the degree of silting, a one-sided Student's t-test was applied, indicating a significant reduction in the reservoir between 2012 and 2022, corroborating the decrease in volume.
... The smaller the multiple, the greater the possibility of removing reliable points. The experimental recommendation is to set it between 2 and 4. In addition, the normalized median absolute deviation (NMAD) can also be tried as a threshold, which is not detailed here [39]. Formulae (1)-(4) derive the statistical outlier detection principle of three times the standard deviation, with dem_raw being the raw DEM, dem_external being the external DEM, d the difference between the two DEMs, σ the standard deviation of the difference, and µ the mean difference. ...
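A minimal sketch of the three-sigma principle described in the snippet above (variable names follow the snippet; the referenced formulae (1)-(4) are not reproduced here):

```python
import numpy as np

def three_sigma_outliers(dem_raw, dem_external, k=3.0):
    """Flag DEM cells whose difference from an external DEM departs from
    the mean difference by more than k standard deviations (k = 3 gives
    the classic three-sigma rule; smaller multiples remove more points).
    """
    d = dem_raw - dem_external         # difference between the two DEMs
    mu = np.nanmean(d)                 # mean difference
    sigma = np.nanstd(d)               # standard deviation of the difference
    return np.abs(d - mu) > k * sigma  # True where a cell is an outlier
```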
Article
Full-text available
Accurate and complete digital elevation models (DEMs) play an important fundamental role in geospatial analysis, supporting various engineering applications, human activities, and scientific research. Interferometric synthetic aperture radar (InSAR) plays an increasingly important role in DEM generation. Nonetheless, owing to its inherent characteristics, gaps often appear in regions marked by significant topographical fluctuations, necessitating an extra void-filling process. Traditional void-filling methods have operated directly on preexisting data, succeeding in relatively flat terrain. When facing mountainous regions, there will always be gross errors in elevation values. Regrettably, conventional methods have often disregarded this vital consideration. To this end, this research proposes a DEM void-filling method based on incorporating elevation outlier detection. It accounts for the detection and removal of elevation outliers, thereby mitigating the shortcomings of existing methods and ensuring robust DEM restoration in mountainous terrains. Experiments were conducted to validate the method's applicability using TanDEM-X data from Sichuan, China, Hebei, China, and Oregon, USA. The results underscore the superiority of the proposed method. Three traditional methods are selected for comparison. The proposed method has different degrees of improvement in filling accuracy, depending on the void status of the local terrain. Compared with the delta surface fill (DSF) method, the root mean squared error (RMSE) of the filling results has improved by 7.87% to 51.87%. The qualitative and quantitative experiments demonstrate that the proposed method is promising for large-scale DEM void-filling tasks.
... It is important to note that the y-axis scales differ significantly between the datasets. In addition, the Normalized Median Absolute Deviation (NMAD) of the dataset relative to LiDAR data was calculated, which is one of the robust measures for the assessment of DEM accuracy (Höhle and Höhle, 2009). These DEMs did not have a standard normal error distribution, which is also seen in global and other datasets (Höhle and Höhle, 2009; Uuemaa et al., 2020). The NMAD values for SRIIM, SRTM and MERIT Hydro were 0.467 m, 0.54 m and 0.481 m respectively, indicating good terrain accuracy below 1 m (Wessel et al., 2018). ...
... not used for the georeferencing procedure) GPS points, with XY mean error of 2.4 cm and Z mean error of 4.6 cm, were acquired to assess the 2023 DSM accuracy. By comparing with these data we calculated the mean error, the median error and the root mean square error (RMSE) of the 2023 DSM. Figure 6 shows the vertical discrepancies of the 2023 DSM with respect to the 321 GPS points as a histogram of the discrepancies [Höhle and Höhle, 2009]. The histogram shows that the error has a normal distribution. ...
... we also plotted a curve of a normal distribution (total Gaussian distribution) and a curve of a normal distribution truncated at the 3σ threshold. In addition, we calculated the normalized median absolute deviation (NMAD), which is less sensitive to the presence of outliers [Höhle and Höhle, 2009]. All the metrics parameters reported in Figure 6 are in meters. ...
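A sketch of such a diagnostic figure (not the authors' plotting code; the 3σ truncation mirrors the description above):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

def discrepancy_histogram(dh, bins=50):
    """Histogram of vertical discrepancies with a fitted normal curve and
    the same normal curve truncated at the 3-sigma threshold.
    """
    dh = np.asarray(dh, dtype=float)
    mu, sigma = dh.mean(), dh.std()
    x = np.linspace(dh.min(), dh.max(), 300)
    plt.hist(dh, bins=bins, density=True, alpha=0.5, label="discrepancies")
    plt.plot(x, stats.norm.pdf(x, mu, sigma), label="normal fit")
    plt.plot(x, stats.truncnorm.pdf(x, -3, 3, loc=mu, scale=sigma),
             label=r"normal fit truncated at 3$\sigma$")
    plt.xlabel("vertical discrepancy (m)")
    plt.ylabel("density")
    plt.legend()
    plt.show()
```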
Article
Full-text available
Stromboli is a volcanic island in a persistent state of activity, located in the Tyrrhenian Sea off the northern coast of Sicily. During the night of 25 and 26 May 2022, a massive human-caused wildfire destroyed most of the vegetation cover on the NE flank of the island, just above the main village. On 12 August 2022, a particularly heavy rainfall event remobilized the loose volcaniclastic deposits that covered the burned volcanic flank, no longer protected by the vegetation. This event triggered several debris flows that were channeled by the roads and flooded several streets and buildings, causing severe damage to the village. In late March 2023, just before the large spring vegetation growth, we conducted an Unmanned Aerial System (UAS) photogrammetric campaign over a sector of the NE flank of Stromboli Island, to acquire data on an area massively affected by the wildfire first and by the debris flows later. Here we present and share with the scientific community and civil authorities the results of this UAS campaign, which consist of a 1.4 km² wide, 10 cm-resolution Digital Surface Model (DSM) and a 1.6 cm-resolution orthomosaic. These data clearly show the dramatic consequences of the 2022 tragic events at Stromboli. We also produced an elevation difference map by comparing the 2023 DSM here generated and the 2012 LiDAR DEM to provide a first overview of the thickness of the deposits that were removed from the Stromboli NE flank.
... Cross-sections are then imported into tpsDig2 v. 2.31 (Rohlf, 2017) for digitisation. To assess intra-analyst error in the process of profile extraction, a repeatability test was performed (Blackwell et al., 2006; Muñoz-Muñoz & Perpiñán, 2010), using robust statistical techniques to evaluate the magnitude of error (Höhle & Höhle, 2009; Rodríguez-González et al., 2014; Rodríguez-Martín et al., 2019). This consisted of measuring and finding the location of areas where cross-sections were to be extracted five times. ...
... Depending on the result of this, parametric or nonparametric hypothesis tests were used for Gaussian and non-Gaussian distributions respectively. For the univariate study, robust and traditional descriptive statistics were applied for each variable (Höhle & Höhle, 2009; Rodríguez-González et al., 2014; Rodríguez-Martín et al., 2019). As with repeatability tests, either the median, mean, standard deviation, or BWMV were used to describe the general properties of these measurements. ...
Article
Full-text available
The use of high-resolution silicone moulds for documenting bone surface modifications, such as cut marks, is common. However, it has not been evaluated whether moulding can affect the originals. In this work, the modification level derived from several moulding-demoulding processes on an experimental sample of cut marks has been characterised using Geometric Morphometrics. It has been shown that moulds influence the morphology of cut marks, reducing their variability and making the sample more homogeneous. These modifications do not affect the identification of cut marks, but if not considered, may have an effect on more specialised studies.
... DEM vertical error statistics were computed after Höhle and Höhle (2009). Please note that the vertical accuracy of data was obtained by comparing the DEM elevation data against 566 RTK-based GNSS point measurements acquired on well-distributed stable areas (e.g., big boulders, roadways, hydraulic defense works, not debris-flows affected areas, etc.) by the research group in December 2015 in a nearby study area. ...
Article
Full-text available
Debris flows are solid‐liquid mixtures originating in the upper part of mountain basins and routing downstream along incised channels. When the channel incises an open fan, the debris flow leaves the active channel and propagates downstream along a new pathway. This phenomenon is called an avulsion. We retrieve the most probable avulsion pathways leveraging a Monte Carlo approach based on using Digital Elevation Models (DEMs). Starting from LiDAR‐based DEMs, we build an ensemble of synthetic DEMs using a local Gaussian probability density function of local elevation values and obtain an ensemble of drainage networks using a gravity‐driven routing algorithm. The ensemble of drainage networks was used to obtain the most probable pathways of avulsions. We applied our methodology to a real monitored fan in the Dolomites (Northeastern Italian Alps) subjected to debris‐flow activity with avulsions. Our approach allows us to verify the consistency between the occurrence probability of a synthetic pathway and those that historically occurred. Furthermore, our approach can be used to predict future debris‐flow avulsions, assuming relevance in debris‐flow risk assessment and planning of debris‐flow control works.
... In this assessment, the lidar DTM was used as the reference data and the vertical differences between the lidar DTM and the remaining models were calculated. We visualized the distributions of the height differences using density plots and summarized them by calculating three error metrics typically used in the literature (Höhle & Höhle, 2009): mean error (ME), mean absolute error (MAE), and root mean square error (RMSE), defined as follows: ...
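The snippet is truncated before the definitions; in their standard form, with $\Delta h_{i}$ the height difference at checkpoint $i$ of $n$:

$$\mathrm{ME} = \frac{1}{n}\sum_{i=1}^{n}\Delta h_{i}, \qquad \mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|\Delta h_{i}\right|, \qquad \mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\Delta h_{i}^{2}}.$$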
Article
Full-text available
Satellite‐derived global digital elevation models (DEMs) are essential for providing the topographic information needed in a wide range of hydrological applications. However, their use is limited by spatial resolution and vertical bias due to sensor limitations in observing bare terrain. Significant efforts have been made to improve the resolution of global DEMs (e.g., TanDEM‐X) and create bare‐earth DEMs (e.g., FABDEM, MERIT, CEDTM). We evaluated the vertical accuracy of bare‐earth and global DEMs in Central European mountains and submontane regions, and assessed how DEM resolution, vegetation offset removal, land cover, and terrain slope affect stream network delineation. Using lidar‐derived DTM and national stream networks as references, we found that: (a) bare‐earth DEMs outperform global DEMs across all land cover types. RMSEs increased with increasing slope for all DEMs in non‐forest areas. In forests, however, the negative effect of the slope was outweighed by the vegetation offset even for bare‐earth DTMs; (b) the accuracy of derived stream networks was affected by terrain slope and land cover more than by the vertical accuracy of DEMs. Stream network delineation performed poorly in non‐forest areas and relatively well in forests. Increasing slope improved the stream delineation performance; (c) using DEMs with higher resolution (e.g., 12 m TanDEM‐X) improved stream network delineation, but increasing resolution also increased the need for effective vegetation bias removal. Our results indicate that vertical accuracy alone does not reflect how well DEMs perform in stream network delineation. This underscores the need to include stream network performance in DEM quality rankings.
... However, this issue can be mitigated by expanding the data area. Moreover, the accuracy of the DEM data is also closely related to the results, which may be due to a variety of factors, such as image quality, terrain slope, and curvature [47][48][49]. Steep slopes and shadows occurring during image acquisition can result in stereo matching errors, thereby increasing DEM inaccuracies, which is a common challenge in photogrammetry [44]. In addition, the processing of vegetation, noise, and other factors during DEM processing contributes to the expansion of the stabilized area within the region and improves the accuracy of the method. ...
Article
Full-text available
Landslides are geological disasters that are harmful to both humans and society. Digital elevation model (DEM) time series data are usually used to monitor dynamic changes or surface damage. To solve the problem of landslide deformation monitoring without ground control points (GCPs), a multidimensional feature-based coregistration method (MFBR) was studied to achieve accurate registration of multitemporal DEMs without GCPs and obtain landslide deformation information. The method first converts the elevation information of the DEM into image pixel information, and feature points are extracted on the basis of the image. The initial plane position registration of the DEM is implemented. Then, the expectation-maximization algorithm is applied to identify the stable regions that have not changed between multitemporal DEMs and to perform accurate registrations. Finally, the shape variables are calculated by constructing a DEM differential model. The method was evaluated using simulated data and data from two real landslide cases, and the experimental results revealed that the registration accuracies of the three datasets were 0.963 m, 0.368 m, and 2.459 m, which are 92%, 50%, and 24% better than the 12.189 m, 0.745 m, and 3.258 m accuracies of the iterative closest-point algorithm, respectively. Compared with the GCP-based method, the MFBR method can achieve 70% of the deformation acquisition capability, which indicates that the MFBR method has better applicability in the field of landslide monitoring. This study provides an idea for landslide deformation monitoring without GCPs and is helpful for further understanding the state and behavior of landslides.
... For this purpose, either Analysis of Variance (ANOVA) or Kruskal-Wallis tests were performed, for Gaussian and non-Gaussian distributions respectively. Distributions were also analyzed by applying descriptive statistics, using either traditional or robust metrics according to sample homogeneity (Höhle and Höhle 2009; Rodríguez-Martín et al. 2019; Courtenay et al. 2020b, 2021). Descriptive statistics included the calculation of extreme values (maximum and minimum), together with measures of central tendency and deviation. ...
Article
Full-text available
Cut marks are striae accidentally produced by the contact made between the edge of a cutting tool and bone surfaces by anthropogenic activity, presenting evidence of hominin carcass processing and behaviour, butchery activities or diet. Post-depositional processes can cause the alteration (chemical or mechanical) of bone surfaces, changing their composition and causing the modification of bone surfaces. Previous research has addressed the problem of chemical alteration from a qualitative perspective, resulting in the loss of all diagnostic characteristics of the cut marks affected by these processes. Geometric Morphometrics has led to great progress in the study of cut marks from a quantitative perspective and can be useful for the study of altered cut marks. In this study, an experiment was carried out in which 36 cut marks were reproduced and chemically altered. These marks were scanned and digitized before and after each phase of alteration. They were analyzed metrically as well as using Geometric Morphometrics, in order to study the evolution of modifications to cut mark morphology during the experiment. Results show clear morphological differences between the different phases of alteration with altered cut marks presenting a general tendency towards a decrease in both the width and depth over time. Research of this type opens up a new path for the study of the chemical alteration of cut marks, as well as other striae, through the application of Geometric Morphometrics.
... We used the normalized median absolute deviation (NMAD) (Höhle and Höhle, 2009) of elevation change, representing the spread of noise across the stable surfaces, and the mean elevation change, representing the systematic (local) elevation change bias, as the Δh components: ...
Preprint
Full-text available
Many glaciers dam lakes at their margins that can drain suddenly. Due to downwasting of these glacier dams, the magnitude of glacier lake outburst floods may change. Judging from repeat satellite observations, most ice-dammed lakes with repeated outbursts have decreased in area, volume, and flood size. Yet, we find that some lakes oppose this trend by releasing progressively larger volumes over time, and elevating downstream hazards. One of these exceptions is Desolation Lake, southeastern Alaska, having drained at least 48 times since 1972 with progressively larger volumes despite the surface lowering of the local ice dam. Here we focus on explaining its unusual record of lake outbursts using estimates of flood volumes, lake levels, and glacier elevation based on a time series of elevation models and satellite images spanning five decades. We find that the lake grew by ~10 km² during our study period, more than any other ice-dammed lake with reported outbursts in Alaska. The associated flood volumes tripled from 200–300 × 10⁶ m³ in the 1980s up to ~700 × 10⁶ m³ in the 2010s, which is more than five times the regional median of reported flood volumes from ice-dammed lakes. Yet, Lituya Glacier, which dams the lake, had a median surface lowering of ~50 m between 1977 and 2019 and the annual maximum lake levels dropped by 110 m since 1985, to a level of 202 m a.s.l. in 2022. We explain the contrasting trend of growing lake volume and glacier surface lowering in terms of the topographic and glacial setting of Desolation Lake. The lake lies in a narrow valley in contact with another valley glacier, Fairweather Glacier, at its far end. During our study period, the ice front of the Fairweather Glacier receded rapidly, creating new space that allowed the lake to expand laterally and accumulate a growing volume of water. We argue that the growth of ice-dammed lakes with outburst activity is controlled more by 1) the potential for lateral expansion and 2) meltwater input due to ablation at the glacier front, than by overall mass loss across the entire glacier surface. Lateral lake expansion and frontal glacier ablation can lead to larger lake outbursts even if ablation of the overall glacier surface accelerates and the maximum lake level drops. Identifying valleys with hazardous ice-topographic conditions can help prevent some of the catastrophic damage that ice dam failures have caused in past decades.
... In other words, input data where the residuals of the InSAR and optical DEMs relative to the corresponding reference data are greater than three times the NMAD will be considered outliers and removed. It is recommended to use NMAD as a robust accuracy measure rather than the classic RMSE to mitigate the influence of outliers in elevation data studies [30]. In addition, we use RMSE (Root-Mean-Squared Error) and NMAD (Normalized Median Absolute Deviation) as evaluation metrics for the accuracy of DEM fusion. ...
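A minimal sketch of such a 3×NMAD screening step (illustrative names; the paper's fusion pipeline is not reproduced):

```python
import numpy as np

def nmad_screen(residuals, k=3.0):
    """Keep only samples whose residual against the reference elevation
    lies within k times the NMAD of the median residual (k = 3 in the
    snippet above); returns the inliers and a boolean mask.
    """
    r = np.asarray(residuals, dtype=float)
    med = np.median(r)
    nmad = 1.4826 * np.median(np.abs(r - med))  # robust spread estimate
    keep = np.abs(r - med) <= k * nmad          # inliers retained for fusion
    return r[keep], keep
```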
Article
Full-text available
InSAR and optical techniques represent two principal approaches for the generation of large-scale Digital Elevation Models (DEMs). Due to the inherent limitations of each technology, a single data source is insufficient to produce high-quality DEM products. The increasing deployment of satellites has generated vast amounts of InSAR and optical DEM data, thereby providing opportunities to enhance the quality of final DEM products through the more effective utilization of the existing data. Previous research has established that complete DEMs generated by InSAR technology can be combined with optical DEMs to produce a fused DEM with enhanced accuracy and reduced noise. Traditional DEM fusion methods typically employ weighted averaging to compute the fusion results. Theoretically, if the weights are appropriately selected, the fusion outcome can be optimized. However, in practical scenarios, DEMs frequently lack prior information on weights, particularly precise weight data. To address this issue, this study adopts a fully connected artificial neural network for elevation fusion prediction. This approach represents an advancement over existing neural network models by integrating local elevation and terrain as input features and incorporating curvature as an additional terrain characteristic to enhance the representation of terrain features. We also investigate the impact of terrain factors and local terrain features as training features on the fused elevation outputs. Finally, three representative study areas located in Oregon, USA, and Macao, China, were selected for empirical validation. The terrain data comprise InSAR DEM, AW3D30 DEM, and Lidar DEM. The results indicate that, compared to traditional neural network methods, the proposed approach improves the Root-Mean-Squared Error (RMSE) by 5.0% to 12.3% and the Normalized Median Absolute Deviation (NMAD) by 10.3% to 26.6% in the test areas, thereby validating the effectiveness of the proposed method.
... Due to the absence of independent high-resolution ground control points (GCPs), only the relative accuracy of the winter to the autumn DEM was assessed with statistical metrics based on stable terrain (Berthier et al., 2014; Bessette-Kirton et al., 2018; Larsen et al., 2007; Marti et al., 2016; Rolstad et al., 2009; Shaw et al., 2020a). Besides the xyz-translation values, the standard deviation (SD) and the normalized median absolute deviation (NMAD) (Höhle and Höhle, 2009), which can be used to estimate the magnitude of random errors, were retrieved before and after co-registration. Statistics indicate improvements with only slight differences between the study sites (Table 3). ...
... Second, we compute and show the estimated standard error from stable ground offsets. For that purpose, we compute the normalized median absolute deviation of stable ground offsets for each time interval and rock glacier, and from this absolute deviation, we derive a robust estimate of the standard deviation of stable ground offsets [54]. To then estimate from this standard deviation (i.e. ...
Article
Full-text available
Despite their extensive global presence and the importance of variations in their speed as an essential climate variable, only about a dozen global time series document long-term changes in the velocity of rock glaciers – large tongue-shaped flows of frozen mountain debris. By analysing historical aerial photographs, we reconstruct here 16 new time series, a type of data that has not previously existed for the North American continent. We observe substantial accelerations, as much as 2–3 fold, in the surface displacement rates of rock glaciers across the mountains of the western contiguous United States over the past six to seven decades, most consistent with strongly increasing air temperatures in that region. Variations between individual time series suggest that different local and internal conditions of the frozen debris bodies modulate this overall climate response. Our observations indicate fundamental long-term environmental changes associated with frozen ground in the study region.
... Simultaneously, we also employed the Normalized Median Absolute Deviation (NMAD) because this measure is intimately tied to the median of the absolute variations between the errors and the median error. NMAD serves as a robust estimate for the standard deviation, showing heightened resilience to potential outliers in the dataset [Höhle and Höhle 2009]: ...
Article
Full-text available
The 2011–2012 eruption at Cordón Caulle, Chile offers an exceptional opportunity to investigate topographic evolution of a laccolith, lava flows, and tephra during and after rhyolitic eruptions using satellite TanDEM-X and Pléiades data. We find distinct phases: rapid surface uplift from the laccolith and tephra (June–August 2011) and lava (June 2011–March 2012), followed by a reduction in the elevation of the laccolith and tephra (up to 19 m yr⁻¹) until February 2013, and slower subsidence of all deposits until 2019 (the most recent data). The spatial distribution of subsidence-to-uplift ratios shows different volcanic and geomorphological processes occurring (degassing, cooling, crystallization, lateral movement, compaction, erosion). Pre-eruptive river channels showed elevation increases of up to 10–50 m due to tephra deposition, but this tephra was largely removed within three to four years. This research shows the potential of repeating high-resolution remote sensing elevation data to elucidate volcanic landscape evolution and yields insights into the co- and post-eruptive evolution of deposits.
... This is because standard metrics are overly influenced by outliers in the error distribution which is common in LiDAR. Robust measures like the normalized median absolute deviation (NMAD) are less prone to outliers and better measure the error (Höhle and Höhle 2009). Therefore, evaluating DEM accuracy under varying LiDAR densities warrants the calculation of both traditional and robust error statistics to comprehensively quantify modeling performance and reliability. ...
Article
Full-text available
Accurate digital elevation models (DEMs) derived from airborne light detection and ranging (LiDAR) data are crucial for terrain analysis applications. As established in the literature, higher point density improves terrain representation but requires greater data storage and processing capacities. Therefore, point cloud sampling is necessary to reduce densities while preserving DEM accuracy as much as possible. However, there has been a limited examination directly comparing the effects of various sampling algorithms on DEM accuracy. This study aimed to help fill this gap by evaluating and comparing the performance of three common point cloud sampling methods (octree, spatial, and random sampling) in high terrain. DEMs were then generated from the sampled point clouds using three different interpolation algorithms: inverse distance weighting (IDW), natural neighbor (NN), and ordinary kriging (OK). The results showed that octree sampling consistently produced the most accurate DEMs across all metrics and terrain slopes compared to other methods. Spatial sampling also produced more accurate DEMs than random sampling but was less accurate than octree sampling. The results can be attributed to differences in how the sampling methods represent terrain geometry and retain microtopographic detail. Octree sampling recursively subdivides the point cloud based on density distributions, closely conforming to complex microtopography. In contrast, random sampling disregards underlying densities, reducing accuracy in rough terrain. The findings guide optimal sampling and interpolation methods of airborne lidar point clouds for generating DEMs for similar complex mountainous terrains.
... It is considered the primary data source for GIS-based spatial analysis to manage various geomorphological and natural resource problems [10]. In reality, various attributes characterize the DEM surface, including resolution, accuracy, and others, which are decisive components for extracting vital terrain variables [11][12][13] required in many scientific disciplines [14,15]. The quality of the DEM relies on grid spacing, data collection methods, interpolation algorithms, and morphological terrain features [16][17][18][19]. ...
Article
Full-text available
Soil resource management is fundamentally integral to environmental sustainability and agricultural productivity. The digital elevation model (DEM) is the fundamental data for analyzing landform surfaces, which introduces an opportunity to obtain a broad spectrum of terrain factors to simplify interpreting the patterns and processes in the geoscience field. The accuracy and resolution of DEM are crucial for their effective use, and many algorithms have been developed to interpolate digital elevation data from a set of known points. Although primary topographic variables derived from grid datasets are important, secondary variables, such as the relief index (RFI), play a more critical role in understanding the complicated relationship between soil properties and landform attributes. The RFI is attained from a DEM by calculating the elevation range within a given neighborhood surrounding a central cell. It is an essential predictor in soil natural resource management that measures the degree of differentiation of surface relief. In addition, it is beneficial for perceiving the landscape and its management. This study presents a comprehensive zonal analysis comparing the RFI values derived from multiple interpolation-based DEMs. It investigates deterministic and geostatistical interpolators, such as inverse distance weighted and natural neighbor, across distinct zones with diverse topographical characteristics. The findings indicated a high correlation between the RFI and the reliability of the DEM, and the natural neighbor technique provided superior performance against others. The results revealed that the choice of spatial interpolation technique significantly affects the accuracy and reliability of RFI models.
... However, a qualified 3D model generation with optical remote sensing in underwater conditions is not as simple as in surface conditions, and in many cases large outliers are likely to arise. In that case, NMAD is used as a robust scale estimator to estimate the scale of the DZ distribution and can be considered an estimate of the standard deviation that is more resilient to outliers in the dataset (Höhle and Höhle, 2009). To estimate the relative accuracy level between neighboring pixels on produced DBMs, contour maps of the evaluated DBMs were created and the relative standard deviation (RSZ) was calculated for each pixel using 10 pixel diameter (point spacing × 10) distance groups (D) by Equation 21. ...
Article
Full-text available
Recently, the use of unmanned aerial vehicles (UAVs) in bathymetric applications has become very popular due to the rapid and periodic acquisition of high spatial resolution data that provide detailed modeling of shallow water body depths and geospatial information. In UAV-based bathymetry, the sensor characteristics, imaging geometries, and the quality of radiometric and geometric calibrations of the imagery are the basic factors to achieve the most reliable results. Digital bathymetric models (DBMs) that enable three-dimensional bottom topography definition of water bodies can be generated using many different techniques. In this paper, the effect of different UAV imaging bands and DBM generation techniques on the quality of bathymetric 3D modeling was deeply analyzed by visual and statistical model-based comparison approaches utilizing reference data acquired by a single-beam echosounder. In total, four different DBMs were generated and evaluated, two from dense point clouds derived from red–green–blue (RGB) single-band and multispectral (MS) five-band aerial photos, and the other two from Stumpf and Lyzenga empirical satellite-based bathymetry (SDB) adapted to UAV data. The applications were performed in Tavşan Island, located in Istanbul, Turkey. The results of the statistical model-based analyses demonstrated that the accuracies of the DBMs are arranged as RGB, MS, Lyzenga, and Stumpf from higher to lower, and the standard deviation of height differences is between ±0.26 m and ±0.54 m. Visual results indicate that the five-band MS DBM performs best in identifying the deepest areas.
... Techniques like photogrammetry using optical stereo-images, radar interferometry (stereoscopic measurements of synthetic aperture radar images), and laser scanning that yields point clouds are employed for DEM generation (Yu et al. 2015). In general, DEMs generated from space-borne sensors capture the elevation information of land cover and land use features that are present above the bare-earth surface, resulting in the Digital Surface Model (DSM); DEMs representing only the bare-earth features are termed as Digital Terrain Model (DTM) (Höhle and Höhle 2009;DeWitt et al. 2017;Dandabathula et al. 2023a). ...
Article
High Mountain Asia (HMA), recognized as a third pole, needs regular and intense studies as it is susceptible to climate change. An accurate and high-resolution Digital Elevation Model (DEM) for this region enables us to analyze it in a 3D environment and understand its intricate role as the Water Tower of Asia. The science teams of NASA realized an 8-m DEM using satellite stereo imagery for HMA, termed HMA 8-m DEM. In this research, we assessed the vertical accuracy of HMA 8-m DEM using reference elevations from ICESat-2 geolocated photons at three test sites of varied topography and land covers. Inferences were made from statistical quantifiers and elevation profiles. For the world’s highest mountain, Mount Everest, and its surroundings, Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE) resulted in 1.94 m and 1.66 m, respectively; however, a uniform positive bias observed in the elevation profiles indicates that seasonal snow cover change will hinder the accurate estimation of the elevation in this sort of test site. The second test site, containing gentle slopes with forest patches, has exhibited the Digital Surface Model (DSM) features with RMSE and MAE of 0.58 m and 0.52 m, respectively. The third test site, situated in the Zanda County of the Qinghai-Tibet, is a relatively flat terrain bed, mostly bare earth with sudden river cuts, and has minimal errors with RMSE and MAE of 0.32 m and 0.29 m, respectively, and with a negligible bias. Additionally, in one more test site, the feasibility of detecting glacial lakes was tested, which exhibited a flat surface over the lakes, indicating the potential of HMA 8-m DEM for deriving hydrological parameters. The results accrued in this investigation confirm that the HMA 8-m DEM has the best vertical accuracy and should be of high use for analyzing natural hazards and monitoring glacier surfaces.
... where C is a constant determined as 1.4826 [35], and m is the median of all WSE values at a VS. Photons with a WSE beyond m ± NMAD were regarded as outliers and removed. ...
Article
Full-text available
The application of ICESAT-2 altimetry data in river hydrology critically depends on the accuracy of the mean water surface elevation (WSE) at a virtual station (VS) where satellite observations intersect solely with water. It is acknowledged that the ATL13 product has noise elevations of the adjacent land, resulting in biased high mean WSEs at VSs. Earlier studies have relied on human intervention or water masks to resolve this. Both approaches are unsatisfactory solutions for large river basins where the issue becomes pronounced due to many tributaries and meanders. There is no automated procedure to partition the truly representative water height from the totality of the along-track ICESAT-2 photon segments (portions of photon points along a beam) for increasing precision of the mean WSE at VSs. We have developed an automated approach called “auto-segmentation”. The accuracy of our method was assessed by comparing the ATL13-derived WSEs with direct water level observations at 10 different gauging stations on 37 different dates along the Lower Murray River, Australia. The concordance between the two datasets is significantly high and without detectable bias. In addition, we evaluated the effects of four methods for calculating the mean WSEs at VSs after auto-segmentation processing. Our results reveal that all methods perform almost equally well, with the same R² value (0.998) and only subtle variations in RMSE (0.181–0.189 m) and MAE (0.130–0.142 m). We also found that the R², RMSE and MAE are better under the high flow condition (0.999, 0.124 and 0.111 m) than those under the normal-low flow condition (0.997, 0.208 and 0.160 m). Overall, our auto-segmentation method is an effective and efficient approach for deriving accurate mean WSEs at river VSs. It will contribute to the improvement of ICESAT-2 ATL13 altimetry data utility on rivers.
... (SD) and the normalized median absolute deviation (NMAD) to assess their vertical precision. The NMAD is a metric for the dispersion of the data (also at the 1σ confidence level), which is less sensitive to outliers than the SD, and is recommended for use in DEM precision assessments ( Höhle and Höhle, 2009). ...
Article
Full-text available
Digital elevation models (DEMs) from the spaceborne interferometric radar mission TanDEM-X hold a large potential for glacier change assessments. However, a bias is potentially introduced through the penetration of the X-band signal into snow and firn. To improve our understanding of radar penetration on glaciers, we compare DEMs derived from the almost synchronous acquisition of TanDEM-X and Pléiades optical stereo-images of Grosser Aletschgletscher in March 2021. We found that the elevation bias – averaged per elevation bin – can reach up to 4–8 m in the accumulation area, depending on post co-registration corrections. Concurrent in situ measurements (ground-penetrating radar, snow cores, snow pits) reveal that the signal is not obstructed by the last summer horizon but reaches into perennial firn. Because of volume scattering, the TanDEM-X surface is determined by the scattering phase centre and does not coincide with a specific firn layer. We show that the bias corresponds to more than half of the decadal ice loss rate. To minimize the radar penetration bias, we recommend to select DEMs from the same time of the year and over long observation periods. A correction of the radar penetration bias is recommended, especially when combining optical and TanDEM-X DEMs.
... DoD analyses were conducted at the following intervals: 2012-2016 (data from ALS), 2016-2022 (data from ALS and geodetic measurements). To determine the minimum level of detection (minLoD), we calculated the root mean square error (RMSE), mean error (ME) and standard deviation of error (SDE), according to the formulas presented by Höhle and Höhle (2009): ...
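The snippet is truncated before the formulas; ME and RMSE take the standard forms given after an earlier snippet, and the standard deviation of error in its usual sample form is

$$\mathrm{SDE} = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(\Delta h_{i} - \mathrm{ME}\right)^{2}}.$$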
Article
Full-text available
Ski tourism's popularity is driving a rise in the number of ski resorts. This study aims to present the impact of ski infrastructure on soil erosion processes using the example of a small catchment in the Gubałowskie Foothills in southern Poland, where landscape changes before (since 1879) and after the construction of the ski station (2007) are presented. The analyses of changes in flow accumulation, slope morphometry, and drainage ditches were conducted in the test area. Quantitative analyses were performed using repeated DEMs derived from aerial LiDAR survey and detailed geodetic measurements, complemented by geomorphological mapping done in the field. The study has revealed that the ski infrastructure has not only directly transformed the hillslope by flattening and constructing escarpments (up to 3 m high) but has also created alternating patterns of erosion and accumulation. In the test area, the drainage ditch was poorly designed. It was filled with materials (0.1-0.5 m), and two new outlets formed. The escarpment of the analyzed ski run has been diminished by 0.5 m. An alluvial fan (0.1-0.22 m thick) has developed on the flattened surface below the escarpment with drainage ditch outlets. This fan is eroded by subsurface flow that creates a piping system. The gully below the alluvial fan has retreated upslope, accelerated by subsurface erosion. This study enables the presentation of hillslope adjustments and processes in response to the new conditions caused by ski infrastructure. Such results may support more effective land management in regions changed by ski infrastructure. Keywords: DEM of difference, drainage lines, human impact, land degradation, ski run, soil piping
... S1). The main statistics of accuracy and precision such as mean, standard deviation, and RMSE, as well as robust statistics of the DEM of difference such as median and sigma MAD (σMAD, where MAD is the median absolute deviation) (Höhle and Höhle, 2009), are calculated to assess the accuracy and precision of the generated DEMs. For normally distributed data, the σMAD is defined as 1.4826 · MAD. ...
Article
Full-text available
Alpine rivers have experienced considerable changes in channel morphology over the last century. Natural factors and human disturbance are the main drivers of changes in channel morphology that modify natural sediment and flow regimes at local, catchment, and regional scales. In glaciated catchments, river sediment loads are likely to increase due to increasing snowmelt and glacier melt runoff, facilitated by climate change. Additionally, channel erosion and depositional dynamics and patterns are influenced by sediment delivery from hillslopes and sediment in the forefields of retreating glaciers. In order to reliably assess the magnitudes of the channel-changing processes and their frequencies due to recent climate change, the investigation period needs to be extended to the last century, ideally back to the end of the Little Ice Age. Moreover, a high temporal resolution is required to account for the history of changes in channel morphology and for better detection and interpretation of related processes. The increasing availability of digitised historical aerial images and advancements in digital photogrammetry provide the basis for reconstructing and assessing the long-term evolution of the surface, in terms of both planimetric mapping and the generation of historical digital elevation models (DEMs). The main issue of current studies is the lack of information over a longer period. Therefore, this study contributes to research on fluvial sediment changes by estimating the sediment balance of a main Alpine river (Fagge) in a glaciated catchment (Kaunertal, Austria) over 19 survey periods from 1953 to 2019. Exploiting the potential of historical multi-temporal DEMs combined with recent topographic data, we quantify 66 years of geomorphic change within the active floodplain, including erosion, deposition, and the amounts of mobilised sediment. Our study focuses on a proglacial river that is undergoing a transition phase, resulting from an extensive glacier retreat of approximately 1.8 km. This has led to the formation of new channel networks and an overall negative cumulative sediment balance for the entire study area. We found that high-magnitude meteorological and hydrological events associated with local glacier retreats have a significant impact on the sediment balance. The gauge record indicates an increase in such events, as well as in runoff and probably in sediment transport capacity. Despite this, the sediment supply has declined in the last decade, which can be attributed to a lower contribution of the lateral moraines coupled to the channel network and less sediment sourced from the melting Gepatsch Glacier as evidenced by roches moutonnées exposed in the current/most recent forefield. Nonetheless, we observed significant erosion in the tributary, leading to the transport of sediment downstream. Overall, this study enhances our understanding of the complexity of sediment dynamics in proglacial rivers across various spatial and temporal scales and their relationship to climate change factors.
... The elevation changes were corrected for outliers by the widely used Normalized Median Absolute Deviation (NMAD) approach (Höhle and Höhle 2009). The final uncertainty in the thickness change rate was calculated using the widely accepted approach (Rolstad et al. 2009; Seehaus et al. 2019) as: ...
Article
Himalayan glaciers are shrinking rapidly, especially after 2000. Glacier shrinkage, however, shows a differential pattern in space and time, emphasizing the need to monitor and assess glacier changes at a larger scale. In this study, changes of 48 glaciers situated around the twin peaks of the Nun and Kun mountains in the northwestern Himalaya, hereafter referred to as Nun-Kun Group of Glaciers (NKGG), were investigated using Landsat satellite data during 2000–2020. Changes in glacier area, snout position, Equilibrium Line Altitude (ELA), surface thickness and glacier velocity were assessed using remote sensing data supplemented by field observations. The study revealed that the NKGG glaciers have experienced a recession of 4.5%±3.4% and their snouts have retreated at the rate of 6.4±1.6 m·a⁻¹. Additionally, there was a 41% increase observed in the debris cover area during the observation period. Using the geodetic approach, an average glacier elevation change of −1.4±0.4 m·a⁻¹ was observed between 2000 and 2012. The observed mass loss of the NKGG has resulted in the deceleration of glacier velocity from 27.0±3.7 m·a⁻¹ in 2000 to 21.2±2.2 m·a⁻¹ in 2020. The ELA has shifted upwards by 83.0±22 m during the period. Glacier morphological and topographic factors showed a strong influence on glacier recession. Furthermore, a higher recession of 12.9%±3.2% was observed in small glaciers, compared to 2.7%±3.1% in larger glaciers. The debris-covered glaciers showed lower shrinkage (2.8%±1.1%) compared to the clean glaciers (9.3%±5%). The glacier depletion recorded in the NKGG during the last two decades, if continued, would severely diminish glacial volume and capacity to store water, thus jeopardizing the sustainability of water resources in the basin.
... The second method is the Normalized Median Absolute Deviation (NMAD) error (Eq. 2), a measure of accuracy based on robust estimators suggested by Höhle and Höhle (2009). The NMAD is proportional to the median of the absolute differences between the errors and the median error. ...
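Computing the MAD and NMAD referred to in these excerpts is a one-liner in practice. The following Python sketch (illustrative only; the simulated error values are made up, not data from any of the cited studies) shows both estimators and why a handful of blunders inflates the standard deviation while barely moving the NMAD:

```python
import numpy as np

def mad(dh):
    """Median Absolute Deviation: median(|dh_i - median(dh)|)."""
    dh = np.asarray(dh, dtype=float)
    return np.median(np.abs(dh - np.median(dh)))

def nmad(dh):
    """Normalized MAD; the factor 1.4826 makes the NMAD comparable
    to the standard deviation when the errors are normally distributed."""
    return 1.4826 * mad(dh)

# Toy comparison: a few gross outliers inflate the standard deviation
# but leave the NMAD almost untouched.
rng = np.random.default_rng(42)
errors = np.concatenate([rng.normal(0.0, 0.10, 990),   # well-behaved checkpoints
                         rng.normal(0.0, 5.00, 10)])   # blunders, e.g. vegetation hits
print(f"std  = {errors.std():.3f} m")
print(f"NMAD = {nmad(errors):.3f} m")
```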
Article
Full-text available
Landslides are among the natural calamities that severely affect society and the economy, specifically in mountainous regions. Monitoring of possible landslide areas is essential to minimize their effects. Remote sensing and geographical information systems are useful tools for landslide monitoring in order to prevent the losses associated with landslides. In recent years, the development of unmanned aerial vehicle (UAV) photogrammetry has opened a new window in the field of geosciences for reconnaissance surveys, mapping and hazard monitoring. In the present study, high-resolution aerial images were collected by a DJI Phantom 4 drone with a 24-megapixel camera at an elevation of 70 m above the ground. From the aerial images, 3D point clouds, an orthomosaic and a DEM were generated using Agisoft Photoscan software. The collected ground control points were used for georeferencing the model and for error estimation. The slope, slope direction and drainage pattern maps were prepared using ArcGIS. The landslide material spread over an area of 0.365 km², with more than 1 million m³ of debris transported. However, the debris material was not completely washed off the slope and is stuck at the upper part of the village, posing a risk of further sliding. In such a devastating situation, the southern part of the Kangpokpi area could be affected in the future. With the help of this mapping and photogrammetric technique, the landslide can be monitored constantly so that future disasters can be prevented.
... (4) Robust and traditional accuracy metrics were adopted to evaluate subsidence rate accuracy (Höhle and Höhle, 2009; Li et al., 2021; Wang et al., 2020). ...
Article
Coastal subsidence is a geological disaster that has devastating consequences. However, an accurate understanding of its risks involves more than simply assessing the amount or rate of land subsidence. The existing methods used to evaluate geological disaster risks depend on extensive data collection, entail substantial workloads, suffer from error estimation challenges, and lack regional adaptability. These limitations prevent us from fully understanding coastal subsidence risks in estuarine deltas. Therefore, in this study, we propose a new subsidence risk assessment method that addresses the challenges of traditional geological risk assessments in terms of spatial coverage, spatiotemporal resolution, and data collection difficulty. First, Sentinel-1 multitemporal interferometric synthetic aperture radar (MT-InSAR) and cluster analysis were used to estimate the subsidence hazards. Subsequently, Landsat-8 imagery and a random forest (RF) classifier were used to obtain land use and land cover (LULC), and the analytic hierarchy process (AHP) was used to obtain settlement vulnerability. Thereafter, subsidence susceptibility was derived from the sediment layer thickness. By combining subsidence hazard, vulnerability, and susceptibility, the first subsidence risk map with a 30 m resolution was generated. The results showed that 4.54 % of the Yellow River Delta (YRD) area was high-risk, 8.75 % was medium-risk, and 10.14 % was low-risk. Notably, the risk map shows a clear overlap between high-risk and saltwater mining areas in the YRD. The proposed method is expected to improve our understanding of the coastal subsidence risk in estuarine deltas. Considering that the risk in high-value economic areas in the YRD is increasing, whereas the risk in low-value economic areas may change owing to human activity, early preventive measures are required.
... All three DTMs generated with different image combinations, and also one combined DTM, were compared with the reference DTM using two metrics, i.e. SZ (the standard deviation of the heights) and the NMAD (Normalized Median Absolute Deviation) (Höhle and Höhle, 2009). Figure 14 shows the frequency of the height differences between the DTM based on the combined triplet images and the reference DTM. ...
Conference Paper
Full-text available
The information content of high-resolution space images usable for mapping depends not only on the image resolution, which in the case of digital data means the pixel size in the object space. The contrast, spectral range, radiometric resolution and colour are also important, besides the atmospheric conditions and the object contrast. For the area of Zonguldak, Turkey, different space images are available, taken by IKONOS, KVR-1000, SPOT-5, IRS-1C, TK-350, ASTER, Landsat TM, JERS and SRTM X-band. The information content depends mainly on the pixel size on the ground, but it is still quite different for the RADAR images taken by JERS and SRTM: object identification in these images, disturbed by speckle, cannot be compared with optical images having the same pixel size. There is a rule of thumb for the relation of the pixel size to the possible map scale, but it cannot be used for ground pixels with a size exceeding 5 m, because this leads to a loss of important information which must be available also in small-scale maps. The limited radiometric resolution of IRS-1C images is still a disadvantage, especially in dark and shadow areas. The KVR-1000, available with 1.6 m pixel size, does not reach the information content that this resolution should provide. The colour information of IKONOS supports object identification, so a 4 m ground pixel carries a higher information content than a panchromatic image with the same resolution, and object identification is considerably easier. With IKONOS pan-sharpened images, maps up to a scale of 1:7000 can be created.
... Regarding our reported DEM errors, the vertical precision (standard deviation on stable terrain) of our Pléiades time series was 2.1 m in Muztagh Ata and 1.2 m in western Nyainqêntanglha (Table 2). DEM precision can also be described using the normalized median absolute deviation (NMAD), which is less sensitive to outliers than the standard deviation (Höhle and Höhle, 2009; Dehecq et al., 2016). Independently of seasonal snow conditions, the NMAD over off-glacier terrain was consistently around ±1.3 m in Muztagh Ata and varied between ±0.6 and ±1.2 m in western Nyainqêntanglha (Table 2). ...
Article
Full-text available
Glaciers are crucial sources of freshwater in particular for the arid lowlands surrounding High Mountain Asia. To better constrain glacio-hydrological models, annual, or even better, seasonal information about glacier mass changes is highly beneficial. In this study, we evaluate the suitability of very-high-resolution Pléiades digital elevation models (DEMs) to measure glacier mass balance at annual and seasonal scales in two regions of High Mountain Asia (Muztagh Ata in Eastern Pamirs and parts of western Nyainqêntanglha, south-central Tibetan Plateau), where recent estimates have shown contrasting glacier behaviour. The average annual mass balance in Muztagh Ata between 2019 and 2022 was -0.07 ± 0.20 m w.e. a-1, suggesting the continuation of a recent phase of slight mass loss following a prolonged period of balanced mass budgets previously observed. The mean annual mass balance in western Nyainqêntanglha was highly negative for the same period (-0.60 ± 0.15 m w.e. a-1), suggesting increased mass loss rates compared to the approximately previous 5 decades. The 2022 winter (+0.13 ± 0.24 m w.e.) and summer (-0.35 ± 0.15 m w.e.) mass budgets in Muztagh Ata and western Nyainqêntanglha (-0.03 ± 0.27 m w.e. in winter; -0.63 ± 0.07 m w.e. in summer) suggest winter- and summer-accumulation-type regimes, respectively. We support our findings by implementing the Sentinel-1-based Glacier Index to identify the firn and wet-snow areas on glaciers and characterize the accumulation type. The good match between the geodetic and Glacier Index results supports the potential of very-high-resolution Pléiades data to monitor mass balance at short timescales and improves our understanding of glacier accumulation regimes across High Mountain Asia.
... To assess the results in terms of accuracy, completeness and the effect of the baseline, we tested the algorithms. The first metric used to analyze the results is the Median Absolute Deviation (MAD), as this is a robust metric for skewed distributions (Höhle and Höhle, 2009). This is computed as: ...
Article
Full-text available
The reconstruction of 3D scenes from images has usually been addressed with two different strategies, namely stereo and multi-view. The former requires rectified images and generates a disparity map, while the latter relies on the camera parameters and directly retrieves a depth map. In both cases, deep learning architectures have shown outstanding performance. However, due to the differences between input and output data, the two strategies are difficult to compare on a common scene. Moreover, for remote sensing applications multi-view data are hard to acquire and the ground truth is either sparse or affected by outliers. Hence, in this article we evaluate the performance of stereo and multi-view architectures trained on synthetic data resembling remote sensing images. The data have been processed and organized to be compatible with both kinds of neural networks. For a fair comparison, training and testing are done with only two views. We focus on the accuracy of the reconstruction, as well as the impact of the depth range and the baseline of the stereo array. Results are presented for deep learning architectures and non-learning algorithms.
Article
This study evaluated six open-access digital elevation models (DEMs) for the Del Azul Creek Basin in the Argentine Chaco-Pampean Plain: the Shuttle Radar Topography Mission (SRTM), the Advanced Land Observing Satellite Phased Array L-Band Synthetic Aperture Radar (ALOS PALSAR), the TerraSAR-X Add-On for Digital Elevation Measurements (TanDEM-X), the NASADEM Global DEM, FABDEM (Copernicus GLO-30 DEM V1-0 with forest and building height biases removed), and the TanDEM-X 30 m Edited DEM (EDEM). Statistical metrics were calculated for (i) residuals between the DEMs and the Ice, Cloud and Land Elevation Satellite-2 (ICESat-2); (ii) the minimum distance between DEM-derived drainage networks and those from the Buenos Aires Provincial Water Authority; and (iii) DEM-derived slopes in shallow water bodies compared with the Joint Research Centre's Global Surface Water Mapping product. Analyses were performed for four elevation and seven slope bands. TanDEM-X had the smallest errors compared to ICESat-2 (median 0.19 m, NMAD 0.38 m), followed by FABDEM (median 0.31 m, NMAD 0.23 m). EDEM performed best for the drainage networks (median 99.45 m, NMAD 117.16 m), followed by FABDEM. In general, the vertical error increased with elevation while the accuracy of the drainage network estimates improved. The vertical accuracy decreased with steeper slopes, with FABDEM performing best across all slope ranges. FABDEM exhibited the best performance in determining seasonally dispersed shallow water bodies, demonstrating its overall usefulness for hydrological applications in large-scale plains characterized by aeolian geoforms of lowland accumulation and erosion. Assessing freely available products provides valuable resources for researchers and professionals and can guide decision making for managing hydrological resources, including flood risk and infrastructure development.
Article
The objective of this study is to characterize the benthic substrate of the Bonetambung reef area using WorldView-2 high-resolution satellite imagery. About 14,273 depth spots were deployed to build a bathymetric model of the study area. The correlation of pixel radiance values from the satellite imagery with depth values from field measurement became the basis for classifying the geomorphologic zones of the shallow-water area up to 7 m depth. Of the three interpolators tested, the natural neighbour interpolator produced the best bathymetric model, with a root mean square error of 0.3 m, and a benthic substrate coverage model with a thematic accuracy of 62%. This approach is able to recognize four substrate types in general: sand, seagrass, rubble and coral. Geomorphic segmentation based on the bathymetric profile and the radiance values of the WorldView-2 imagery can also map the reef flat, reef slope and lagoon area, and hence may support benthic substrate modelling. This study shows a potential technique for developing a model of juvenile fish transport in the Bonetambung coral waters. Key words: benthic substrate, bathymetry, WorldView imagery, thematic accuracy
Article
Warming in the Third Pole region accelerates glacier and snow melt, leading to a rise in glacial lake numbers and sizes. However, accurately measuring their water level changes poses challenges, hindering precise volume assessments and evaluation of glacier mass balance contributions. Here, we took the Ak-Shyirak glaciers and the largest Petrov proglacial lake in the Central Tien Shan as a case study to investigate these phenomena. Specifically, firstly, we conducted mass balance assessments for the Ak-Shyirak massif for six sub-periods from 1973 to 2023 using KH-9 DEMs, SRTM DEM, and ASTER DEMs. The results indicate that glaciers were in a state of rapid melting during 1980–2000 and 2005–2012, with rates of −0.46 m w.e./a and −0.37 m w.e./a; moderate melting during 1973–1980 and 2012–2018, with rates of −0.26 m w.e./a and −0.28 m w.e./a; and slower melting during 2000–2005 and 2018–2023, with rates of −0.08 m w.e./a and −0.18 m w.e./a. Subsequently, we conducted assessments of the area change of Petrov Lake for 1973–2023 using KH-9 and Landsat images. The results reveal a significant increase in the glacial lake area by 2.81 km² (150.25%), corresponding to a rate of 0.054 km²/a over the entire study period. Furthermore, we conducted monitoring of Petrov Lake's water level from 2019 to 2023 by utilizing ICESat-2 laser altimetry and Sentinel-3 radar altimetry data. Our findings indicate that the glacial lake level shows intra-annual fluctuations and inter-annual change, with an amplitude of 0.67 ± 0.09 m and an increase rate of 0.30 ± 0.05 m/a, respectively, as determined by a periodic fluctuation model. Finally, after a comprehensive analysis of ERA5-Land meteorological data, topography, glacier mass balance, lake area, and water level, we can draw the following conclusions: (1) glacier mass balance is predominantly influenced by air temperature and snowfall; (2) changes in glacial lake area are driven by factors such as the lake basin, glacier surface elevation, and drainage events; (3) intra-annual fluctuations and inter-annual change in glacial lake levels are both primarily influenced by precipitation and glacier mass balance; (4) glacier mass balance accounts for (36.19 ± 8.47)% of the water supply contributing to glacial lake volume change, while precipitation represents (63.81 ± 5.08)%. Glacier mass balance measurements reveal changing patterns in the Ak-Shyirak massif, Central Tien Shan, due to climate change. Inaugural proglacial lake level measurements provide unique insights into both intra-annual and inter-annual changes, serving as a reference for Third Pole region-wide glacial lake monitoring. Additionally, quantifying glacier meltwater contributions to lake volumes will aid future glacial lake evaluation and potential outburst flood impacts.
Conference Paper
Full-text available
DEM generation since the turn of the 21st century has predominantly relied upon space-borne imagery captured with active or passive sensors. Türkiye, however, recently produced a national DEM from stereo aerial photography. This considerably high-resolution national Digital Elevation Model, with 5 m spacing, is a resampled version of the originally built 0.30 m baseline DEM, custom-tailored to the national 1:25000 topographic map grid. The mosaicked Kastamonu study area, spanning 119 such topographic map frames, was checked against previously recorded GNSS readings. Educational/cadastral and dam-reservoir CORS records collected since 2014 were used to assess how this new national DEM behaves vertically. Overall results showed that both the Root Mean Square Error and the Mean Absolute Error were less than 2.5 m.
Article
Full-text available
Topographic data is a fundamental input to flood hazard models and controls the quality of the outputs. However, open-access global digital elevation models (DEMs) are dated and limited to 30 m resolution, which hinders modelling efforts in urban or topographically complex environments. We used the flood prone and expanding city of Kathmandu, Nepal, to evaluate the impact of topographic data source and resolution on flood model outputs. All DEMs evaluated featured spatially correlated topographic sinks with depths exceeding 20 m that required hydrological conditioning before being used in flood modelling. Incomplete hydrological conditioning appeared related to the overestimation of flood extent and therefore limited agreement when comparing a global 90 m resolution flood hazard model with a bespoke city-scale model at 10 m resolution (F1 score = 0.40). Instead, we found that the height above nearest drainage (HAND) metric was better able to replicate the higher resolution flood map as an indicator of flood susceptibility requiring only topographic information as an input. We also found that the computationally efficient FastFlood model was able to match the inundation extent (F1 score = 0.79) and flood depths (mean absolute error and root mean square error of 0.46 and 0.76 m respectively) of a published 10 m physics-based flood hazard model whilst requiring 212 times less computation time. Our analysis demonstrated that mapping city-scale flood inundation required hydrologically conditioned high-resolution topographic data but not physically complex flood models, highlighting the need for greater availability of high quality open access topographic data.
Article
Satellite-derived digital elevation models (DEMs) and geographic information systems (GIS) offer effective means for both qualitative and quantitative analysis of drainage networks within watersheds. An extensive review of various stream order systems reveals their outdated nature, given advancements in digital technology and the availability of fine- to coarse-resolution data. Analysis of open-source DEMs with resolutions ranging from 12.5 to 225 m, along with topographical maps at scales of 1:50 000 and 1:250 000, was conducted across four physiographically distinct micro-watersheds of approximately equal size (~300 km²). The steepest descent algorithm (D8) was utilized to derive raster stream networks, applying a channel initiation threshold (CIT) of 900 m² as a criterion. It was observed that stream order numbers are influenced by map scale, leading to variability in mainstream order extraction across different DEM datasets and topographic maps. To address this issue, an innovative stream order system was proposed to ensure consistent mainstream order regardless of the spatial resolution of DEM data or map scale. Correlation analysis highlighted the importance of considering both spatial resolution and topographic variability in stream order analysis, underscoring the significance of accurate DEM data and landscape characteristics in understanding stream network dynamics. This method of classifying stream orders is recognized by scientists in geography, geology, hydrology and geomorphology for providing crucial information about the size and strength of waterways within stream networks, contributing to effective water management strategies.
Article
The vertical accuracy of elevation data in coastal environments is critical because small variations in elevation can affect an area's exposure to waves, tides, and storm-related flooding. Elevation data contractors typically quantify the vertical accuracy of lidar-derived digital elevation models (DEMs) on a per-project basis to gauge whether the datasets meet quality and accuracy standards. Here, we collated over 5200 contractor elevation checkpoints along the Atlantic and Gulf of Mexico coasts of the United States that were collected for project-level analyses produced for assessing DEMs acquired for the U.S. Geological Survey's Three-Dimensional Elevation Program. We used land cover data to quantify non-vegetated vertical accuracy and vegetated vertical accuracy statistics (overall and by point spacing bins) and assessed elevation error by land cover class. We found the non-vegetated vertical accuracy had an overall root mean square error of 6.9 cm and vegetated areas had a 95th percentile vertical error of 22.3 cm. Point spacing was generally positively correlated to elevation accuracy, but sample size limited the ability to interpret results from accuracy by land cover, particularly in wetlands. Based on the specific questions a researcher may be asking, use of literature or fieldwork could assist with enhancing error statistics in underrepresented classes.
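The two statistics reported in this study follow the common lidar reporting convention of an RMSE over non-vegetated checkpoints and a 95th-percentile absolute error over vegetated ones. A minimal Python sketch of that split-sample summary (the helper function and its names are hypothetical, for illustration only):

```python
import numpy as np

def vertical_accuracy(err, vegetated):
    """Split-sample vertical accuracy summary (illustrative sketch):
    RMSE over non-vegetated checkpoints and the 95th percentile of
    absolute error over vegetated checkpoints."""
    err = np.asarray(err, dtype=float)
    vegetated = np.asarray(vegetated, dtype=bool)
    nva = np.sqrt(np.mean(err[~vegetated] ** 2))       # non-vegetated: RMSE
    vva = np.percentile(np.abs(err[vegetated]), 95.0)  # vegetated: 95th percentile
    return nva, vva
```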
Article
Debris cover either enhances or reduces glacier melting, thereby modulating glacier response to increasing temperatures. Debris cover variation and glacier recession were investigated on five glaciers, Pensilungpa (PG), Drung Drung (DD), Haskira (HK), Kange (KG) and Hagshu (HG), situated in a topographically and climatically similar zone in the Zanskar Himalaya, using satellite data between 2000 and 2020. Analyses reveal that HK, KG, and HG had a debris-covered area of ~24% in 2020, while PG and DD had a debris cover of <10%. Compared to the other four glaciers, PG had the highest shrinkage (5.7 ± 0.3%) and maximum thinning (1.6 ± 0.6 m a−1). Accordingly, detailed measurements of PG's debris cover thickness, temperature and ablation were conducted for eleven days in August 2020. The results indicated a significant variation of temperature, and the highest melting was observed near the dirty and thinly debris-covered ice surface. Thermal conductivity of 0.9 ± 0.1 W m−1 K−1 and 1.1 ± 0.1 W m−1 K−1 was observed at 15 cm and 20 cm debris depth, respectively. The ablation measurements indicated an average cumulative melting of 21.5 cm over just eleven days. The degree-day factor showed a decreasing trend with debris cover depth, with the highest value (4.8 mm w.e.°C−1 d−1) found for the dirty ice near the glacier surface and the lowest value (0.4 mm w.e.°C−1 d−1) found at 30 cm depth. The study highlights the importance of in-situ debris cover, temperature and ablation measurements for better understanding the impact of debris cover on glacier melting.
Article
This study focuses on evaluating the accuracy of Digital Elevation Models (DEMs) in Bosnia and Herzegovina, a region characterized by diverse topography. The accuracy of the gravimetric geoid model is directly correlated with the resolution and accuracy of the digital elevation model (DEM). To date, no research has been conducted on the accuracy of global and regional models for the area of Bosnia and Herzegovina using ground control points. The analysis encompasses various aspects of DEMs, including interpolation methods (bicubic, B-spline, bilinear, and nearest neighbor), data sources, control points, and datum alignment. Both global and regional DEM models are integrated, introducing a native terrain model. Two sets of control points are utilized for accuracy assessment, along with a datum alignment approach. The performance of the BIHDEM model is scrutinized, and a comparison with LIDAR data is conducted. Accuracy assessment involves calculating the Root Mean Square Error (RMSE) for all DEMs and interpolation methods. The study reveals that the bicubic interpolation method yields the best results. The BIHDEM model proves to be the most effective for SET1 (high precision leveling network), while COPERNICUS emerges as the optimal choice for SET2 (trigonometric network). Furthermore, an analysis of the elevation profiles obtained from LIDAR and DEM models highlights significant differences, with ASTER, MERIT, NASADEM and SRTM exhibiting the largest variations. In contrast, the FABDEM and BIHDEM models demonstrate minimal differences. Mean values and RMSE further support the superiority of the FABDEM and BIHDEM models, providing valuable insights for geospatial applications in the region.
Article
Full-text available
Flood models rely on accurate topographic data representing the bare earth ground surface. In many parts of the world, the only topographic data available are the free, satellite-derived global Digital Elevation Models (DEMs). However, these have well-known inaccuracies due to limitations of the sensors used to generate them (such as a failure to fully penetrate vegetation canopies and buildings). We assess five contemporary, 1 arc-second (≈30 m) DEMs -- FABDEM, Copernicus DEM, NASADEM, AW3D30 and SRTM -- using a diverse reference dataset comprised of 65 airborne-LiDAR surveys, selected to represent biophysical variations in flood-prone areas globally. While vertical accuracy is nuanced, contingent on the specific metrics used and the biophysical character of the site being assessed, we found that the recently-released FABDEM consistently ranked first, improving on the second-place Copernicus DEM by reducing large positive errors associated with forests and buildings. Our results suggest that land cover is the main factor explaining vertical errors (especially forests), steep slopes are associated with wider error spreads (although DEMs resampled from higher-resolution products are less sensitive), and variable error dependency on terrain aspect is likely a function of horizontal geolocation errors (especially problematic for AW3D30 and Copernicus DEM).
Conference Paper
Full-text available
Positional accuracy control methodologies, for example the EMAS, ASPRS and NSSDA standards, are based on classical statistical estimation techniques, all of them using a preliminary step of extreme-value filtering. A study based on robust estimators is presented in order to modify the statistical basis of positional control estimation techniques, in such a way that extreme, atypical, or outlier values can be incorporated into the analysis. Diverse estimators are analyzed (the Danish method, the Geman and McClure method, and so on). First results have been obtained on synthetic contaminated populations, and they suggest the use of the Danish method.
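The Danish method referred to above is an iteratively reweighted adjustment in which observations with large standardized residuals are down-weighted exponentially; the exact weight function varies between implementations. A minimal sketch of one simple variant (an assumption, not the authors' exact formulation), used here only to estimate a robust location from contaminated checkpoint errors:

```python
import numpy as np

def danish_location(x, c=2.0, n_iter=20):
    """Robust location estimate via Danish-style reweighting (one simple
    variant): residuals within c scale units keep full weight, larger
    ones are down-weighted exponentially, and the fit is repeated."""
    x = np.asarray(x, dtype=float)
    w = np.ones_like(x)
    mu = x.mean()
    for _ in range(n_iter):
        mu = np.sum(w * x) / np.sum(w)               # weighted mean
        v = x - mu                                   # residuals
        s = np.sqrt(np.sum(w * v ** 2) / np.sum(w))  # weighted scale
        u = np.abs(v) / s                            # standardized residuals
        w = np.where(u <= c, 1.0, np.exp(-((u - c) ** 2)))
    return mu
```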
Article
Full-text available
In this paper, a theoretical analysis is presented of the degree of correctness to which the accuracy figures of a grid Digital Elevation Model (DEM) have been estimated, measured as Root Mean Square Error (RMSE) depending on the number of checkpoints used in the accuracy assessment process. The latter concept is sometimes referred to as the Reliability of the DEM accuracy tests. Two theoretical models have been developed for estimating the reliability of the DEM accuracy figures using the number of checkpoints and parameters related to the statistical distribution of residuals (mean, variance, skewness, and standardized kurtosis). A general case was considered in which residuals might be weakly correlated (local spatial autocorrelation) with non-zero mean and non-normal distribution. Thus, we avoided the “strong assumption” of distribution normality accepted in some of the previous works and in the majority of the current standards of positional accuracy control methods. Sampled data were collected using digital photogrammetric methods applied to large-scale stereo imagery (1:5 000). In this way, seven morphologies were sampled with a 2 m by 2 m sampling interval, ranging from flat (3 percent average slope) to the highly rugged terrain of marble quarries (82 percent average slope). Two local schemes of interpolation have been employed, using Multiquadric Radial Basis Functions (MRBF) and Inverse Distance Weighted (IDW) interpolators, to generate interpolated surfaces from high-resolution grid DEMs. The theoretical results obtained were experimentally validated using the Monte Carlo simulation method. The proposed models provided a good fit for the raw simulated data for the seven morphologies and the two schemes of interpolation tested (mean r² > 0.96). The proposed theoretical models performed very well for modeling the non-Gaussian distribution of the errors at the checkpoints, a property which is very common in geographically distributed data.
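The dependence of accuracy-figure reliability on the number of checkpoints can be explored directly by Monte Carlo simulation. The sketch below is not the authors' model; it draws checkpoint samples from a hypothetical skewed residual population and measures how the RMSE estimate scatters across replicates:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical skewed residual population standing in for DEM errors
# over rugged terrain (illustrative only, not the study's data).
population = rng.gamma(shape=2.0, scale=0.15, size=200_000) - 0.30

for n in (20, 50, 100, 500, 1000):
    rmse = [np.sqrt(np.mean(rng.choice(population, size=n) ** 2))
            for _ in range(2000)]
    print(f"n = {n:4d}  mean RMSE = {np.mean(rmse):.3f} m, "
          f"spread (std) = {np.std(rmse):.4f} m")
```

As expected, the spread of the RMSE estimate shrinks roughly with the square root of the number of checkpoints.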
Article
Full-text available
This paper explores three theoretical approaches for estimating the degree of correctness to which the accuracy figures of a gridded Digital Elevation Model (DEM) have been estimated depending on the number of checkpoints involved in the assessment process. The widely used average‐error statistic Mean Square Error (MSE) was selected for measuring the DEM accuracy. The work was focused on DEM uncertainty assessment using approximate confidence intervals. Those confidence intervals were constructed both from classical methods which assume a normal distribution of the error and from a new method based on a non‐parametric approach. The first two approaches studied, called Chi‐squared and Asymptotic Student t, consider a normal distribution of the residuals. That is especially true in the first case. The second case, due to the asymptotic properties of the t distribution, can perform reasonably well with even slightly non‐normal residuals if the sample size is large enough. The third approach developed in this article is a new method based on the theory of estimating functions which could be considered much more general than the previous two cases. It is based on a non‐parametric approach where no particular distribution is assumed. Thus, we can avoid the strong assumption of distribution normality accepted in previous work and in the majority of current standards of positional accuracy. The three approaches were tested using Monte Carlo simulation for several populations of residuals generated from originally sampled data. Those original grid DEMs, considered as ground data, were collected by means of digital photogrammetric methods from seven areas displaying differing morphology employing a 2 by 2 m sampling interval. The original grid DEMs were subsampled to generate new lower‐resolution DEMs. Each of these new DEMs was then interpolated to retrieve its original resolution using two different procedures. Height differences between original and interpolated grid DEMs were calculated to obtain residual populations. One interpolation procedure resulted in slightly non‐normal residual populations, whereas the other produced very non‐normal residuals with frequent outliers. Monte Carlo simulations allow us to report that the estimating function approach was the most robust and general of those tested. In fact, the other two approaches, especially the Chi‐squared method, were clearly affected by the degree of normality of the residual population distribution, producing less reliable results than the estimating functions approach. This last method shows good results when applied to the different datasets, even in the case of more leptokurtic populations. In the worst cases, no more than 64–128 checkpoints were required to construct an estimate of the global error of the DEM with 95% confidence. The approach therefore is an important step towards saving time and money in the evaluation of DEM accuracy using a single average‐error statistic. Nevertheless, we must take into account that MSE is essentially a single global measure of deviations, and thus incapable of characterizing the spatial variations of errors over the interpolated surface.
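Of the approaches compared above, the Chi-squared method is the simplest to reproduce: for independent, zero-mean, normal residuals, n·MSE_hat/MSE follows a χ² distribution with n degrees of freedom, which yields a closed-form confidence interval. A sketch under exactly those assumptions (using SciPy):

```python
import numpy as np
from scipy import stats

def mse_ci_chi2(residuals, alpha=0.05):
    """Chi-squared confidence interval for the true MSE.

    Assumes independent, zero-mean, normally distributed residuals,
    so that n * mse_hat / MSE ~ chi2(n)."""
    r = np.asarray(residuals, dtype=float)
    n = r.size
    mse_hat = np.mean(r ** 2)
    lower = n * mse_hat / stats.chi2.ppf(1.0 - alpha / 2.0, df=n)
    upper = n * mse_hat / stats.chi2.ppf(alpha / 2.0, df=n)
    return mse_hat, (lower, upper)

# Example on simulated checkpoint residuals.
rng = np.random.default_rng(3)
mse_hat, (lower, upper) = mse_ci_chi2(rng.normal(0.0, 0.2, 100))
print(f"MSE = {mse_hat:.4f}, 95% CI = [{lower:.4f}, {upper:.4f}]")
```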
Article
Full-text available
There are a large number of different definitions used for sample quantiles in statistical computer packages. Often within the same package one definition will be used to compute a quantile explicitly, while other definitions may be used when producing a boxplot, a probability plot, or a QQ plot. We compare the most commonly implemented sample quantile definitions by writing them in a common notation and investigating their motivation and some of their properties. We argue that there is a need to adopt a standard definition for sample quantiles so that the same answers are produced by different packages and within each package. We conclude by recommending that the median-unbiased estimator be used because it has most of the desirable properties of a quantile estimator and can be defined independently of the underlying distribution.
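NumPy (version 1.22 and later) exposes several of these competing sample-quantile definitions through the `method` argument of `np.quantile`, including the median-unbiased estimator recommended here. The toy sample below shows how much the definitions can disagree near the tail of a small sample:

```python
import numpy as np

# Toy DEM errors with one outlier; small samples make the
# differences between quantile definitions visible.
errors = np.array([-1.2, -0.4, -0.1, 0.0, 0.2, 0.3, 0.9, 4.5])

for method in ("linear", "hazen", "weibull",
               "median_unbiased", "normal_unbiased"):
    q95 = np.quantile(errors, 0.95, method=method)
    print(f"{method:16s} 0.95 quantile = {q95:.3f}")
```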
Article
Full-text available
All digital data contain error and many are uncertain. Digital models of elevation surfaces consist of files containing large numbers of measurements representing the height of the surface of the earth, and therefore a proportion of those measurements are very likely to be subject to some level of error and uncertainty. The collection and handling of such data and their associated uncertainties has been a subject of considerable research, which has focused largely upon the description of the effects of interpolation and resolution uncertainties, as well as modelling the occurrence of errors. However, digital models of elevation derived from new technologies employing active methods of laser and radar ranging are becoming more widespread, and past research will need to be re-evaluated in the near future to accommodate such new data products. In this paper we review the source and nature of errors in digital models of elevation, and in the derivatives of such models. We examine the correction of errors and assessment of fitness for use, and finally we identify some priorities for future research.
Article
For testing that an underlying population is normally distributed, the skewness and kurtosis statistics, √b1 and b2, and the D'Agostino-Pearson K² statistic that combines these two statistics have been shown to be powerful and informative tests. Their use, however, has not been as prevalent as their usefulness. We review these tests and show how readily available and popular statistical software can be used to implement them. Their relationship to deviations from linearity in normal probability plotting is also presented.
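SciPy implements the D'Agostino-Pearson K² statistic as `scipy.stats.normaltest`, combining the skewness and kurtosis tests; a brief demonstration on simulated DEM-like errors:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dh = rng.normal(0.0, 0.15, 500)                              # well-behaved errors
dh_skewed = np.concatenate([dh, rng.exponential(0.5, 50)])   # add a skewed tail

# normaltest returns the K^2 statistic and its p-value; a small
# p-value rejects the normality hypothesis.
for name, sample in (("normal", dh), ("skewed", dh_skewed)):
    k2, p = stats.normaltest(sample)
    print(f"{name:7s} K2 = {k2:7.2f}, p = {p:.4f}")
```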
Article
The bootstrap is a computer-intensive method that provides answers to a large class of statistical inference problems without stringent structural assumptions on the underlying random process generating the data. Since its introduction by Efron (1979), the bootstrap has found its application to a number of statistical problems, including many standard ones, where it has outperformed the existing methodology as well as to many complex problems where conventional approaches failed to provide satisfactory answers. However, it is not a panacea for every problem of statistical inference, nor does it apply equally effectively to every type of random process in its simplest form. In this monograph, we shall consider certain classes of dependent processes and point out situations where different types of bootstrap methods can be applied effectively, and also look at situations where these methods run into problems and point out possible remedies, if there is one known.
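For weakly dependent data, such as spatially autocorrelated DEM errors sampled along profiles, the moving-block bootstrap is one of the simplest schemes of this kind: overlapping blocks are resampled so that short-range dependence survives inside each replicate. A minimal sketch for the sample mean (block length left to the user):

```python
import numpy as np

def moving_block_bootstrap(x, block_len, n_boot=1000, rng=None):
    """Moving-block bootstrap of the sample mean for weakly dependent
    data: overlapping blocks of length block_len are resampled so that
    short-range dependence is preserved within each replicate."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x, dtype=float)
    n = x.size
    n_blocks = int(np.ceil(n / block_len))
    max_start = n - block_len + 1
    means = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, max_start, size=n_blocks)
        sample = np.concatenate([x[s:s + block_len] for s in starts])[:n]
        means[b] = sample.mean()
    return means
```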
Article
Recently, Moors (1986) showed that kurtosis is easily interpreted as a measure of dispersion around the two values μ ± σ. For this dispersion an alternative measure, based on quantiles, is proposed here. It is shown to have several desirable properties: (i) the measure exists even for distributions for which no moments exist, (ii) it is not influenced by the (extreme) tails of the distribution, and (iii) the calculation is simple (and is even possible by graphical means).
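Moors' quantile-based alternative is built from the octiles: with E_i denoting the i/8 quantile, the measure is T = [(E7 − E5) + (E3 − E1)] / (E6 − E2), roughly 1.23 for a normal distribution. A direct transcription in Python:

```python
import numpy as np

def moors_kurtosis(x):
    """Moors' quantile-based kurtosis.

    T = [(E7 - E5) + (E3 - E1)] / (E6 - E2), with E_i the i-th octile.
    Exists for any distribution, ignores the extreme tails, and is
    about 1.23 for a normal distribution."""
    e = np.quantile(np.asarray(x, dtype=float), np.arange(1, 8) / 8)
    return ((e[6] - e[4]) + (e[2] - e[0])) / (e[5] - e[1])

# Sanity check against the normal reference value of about 1.23.
rng = np.random.default_rng(7)
print(f"T(normal sample) = {moors_kurtosis(rng.normal(size=100_000)):.3f}")
```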
Book
Written by experts, Digital Terrain Modeling: Principles and Methodology provides comprehensive coverage of recent developments in the field. The topics include terrain analysis, sampling strategy, acquisition methodology, surface modeling principles, triangulation algorithms, interpolation techniques, on-line and off-line quality control in data acquisition, DTM accuracy assessment and mathematical models for DTM accuracy prediction, multi-scale representation, data management, contouring, visual analysis (or visualization), the derivation of various types of terrain parameters, and future development and applications.
Article
Spatial data quality is a paramount concern in all GIS applications. Existing spatial data accuracy standards, including the National Standard for Spatial Data Accuracy (NSSDA) used in the United States, commonly assume the positional error of spatial data is normally distributed. This research has characterized the distribution of the positional error in four types of spatial data: GPS locations, street geocoding, TIGER roads, and LIDAR elevation data. The positional error in GPS locations can be approximated with a Rayleigh distribution, the positional error in street geocoding and TIGER roads can be approximated with a log-normal distribution, and the positional error in LIDAR elevation data can be approximated with a normal distribution of the original vertical error values after removal of a small number of outliers. For all four data types considered, however, these solutions are only approximations, and some evidence of non-stationary behavior resulting in lack of normality was observed in all four datasets. Monte-Carlo simulation of the robustness of accuracy statistics revealed that the conventional 100% Root Mean Square Error (RMSE) statistic is not reliable for non-normal distributions. Some degree of data trimming is recommended through the use of 90% and 95% RMSE statistics. Percentiles, however, are not very robust as single positional accuracy statistics. The non-normal distribution of positional errors in spatial data has implications for spatial data accuracy standards and error propagation modeling. Specific recommendations are formulated for revisions of the NSSDA.
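A "95% RMSE" in the sense recommended above can be read as the RMSE computed after discarding the 5% largest absolute errors; a minimal sketch under that reading (an assumption, since trimming conventions vary):

```python
import numpy as np

def trimmed_rmse(err, keep=0.95):
    """RMSE after discarding the largest absolute errors.

    keep=0.95 corresponds to one reading of the '95% RMSE': the 5%
    most extreme deviations are excluded before computing the RMSE."""
    e = np.sort(np.abs(np.asarray(err, dtype=float)))
    k = int(np.floor(keep * e.size))
    return np.sqrt(np.mean(e[:k] ** 2))
```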
Article
Assessment of a DEM's quality is usually undertaken by deriving a measure of DEM accuracy – how close the DEM's elevation values are to the true elevation. Measures such as Root Mean Squared Error and standard deviation of the error are frequently used. These measures summarise elevation errors in a DEM as a single value. A more detailed description of DEM accuracy would allow better understanding of DEM quality and the consequent uncertainty associated with using DEMs in analytical applications. The research presented addresses the limitations of using a single root mean squared error (RMSE) value to represent the uncertainty associated with a DEM by developing a new technique for creating a spatially distributed model of DEM quality – an accuracy surface. The technique is based on the hypothesis that the distribution and scale of elevation error within a DEM are at least partly related to morphometric characteristics of the terrain. The technique involves generating a set of terrain parameters to characterise terrain morphometry and developing regression models to define the relationship between DEM error and morphometric character. The regression models form the basis for creating standard deviation surfaces to represent DEM accuracy. The hypothesis is shown to be true and reliable accuracy surfaces are successfully created. These accuracy surfaces provide more detailed information about DEM accuracy than a single global estimate of RMSE.
Article
Hansen, Kooperberg and Sardy introduced a family of continuous, piecewise linear functions defined over adaptively selected triangulations of the plane as a general approach to statistical modelling of bivariate densities and regression and hazard functions. These "triograms" enjoy a natural affine equivariance that offers distinct advantages over competing tensor product methods that are more commonly used in statistical applications. Triograms employ basis functions consisting of linear 'tent functions' defined with respect to a triangulation of a given planar domain. As in knot selection for univariate splines, Hansen and colleagues adopted the regression spline approach of Stone. Vertices of the triangulation are introduced or removed sequentially in an effort to balance fidelity to the data and parsimony. We explore a smoothing spline variant of the triogram model based on a roughness penalty adapted to the piecewise linear structure of the triogram model. We show that the roughness penalty proposed may be interpreted as a total variation penalty on the gradient of the fitted function. The methods are illustrated with real and artificial examples, including an application to estimated quantile surfaces of land value in the Chicago metropolitan area. Copyright 2004 Royal Statistical Society.
References
ASPRS Lidar Committee, 2004. ASPRS Guidelines: Vertical Accuracy Reporting for Lidar Data. http://www.asprs.org/society/committees/lidar/Downloads/Vertical_Accuracy_Reporting_for_Lidar_Data.pdf (accessed 28.01.2009), 20 p.
Aguilar, F., Agüera, F., Aguilar, A., 2007a. A theoretical approach to modeling the accuracy assessment of digital elevation models. Photogrammetric Engineering & Remote Sensing 73 (12), 1367–1379.
Fisher, P.F., Tate, N.J., 2007. Causes and consequences of error in digital elevation models. Progress in Physical Geography 30 (4), 467–489.
Hoaglin, D.C., Mosteller, F., Tukey, J.W., 1983. Understanding Robust and Exploratory Data Analysis. John Wiley & Sons, Inc.
Höhle, J., Potuckova, M., 2006. The EuroSDR test "Checking and improving of digital terrain models". In: EuroSDR Official Publication No. 51, pp. 10–55. ISBN 9789051794915.
Koenker, R., 2005. Quantile Regression. Econometric Society Monographs. Cambridge University Press.
Lahiri, S.N., 2003. Resampling Methods for Dependent Data. Springer.
Li, Z., Zhu, Q., Gold, C., 2005. Digital Terrain Modeling – Principles and Methodology. CRC Press. ISBN 0-415-32462-9.
R Development Core Team, 2008. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0. http://www.R-project.org (accessed 28.01.2009).