GIS Analysis - Science topic

Questions related to GIS Analysis
  • asked a question related to GIS Analysis
Question
3 answers
We are looking to identify characteristics of a river from drone images in ArcGIS pro.
I have been able to successfully delineate the wetted channel, riparian zone, and wood material based on hue, saturation, and value in ImageJ, but we are hoping to find a similar tool in ArcGIS. I have investigated the classification tools, and they seem to either be designed for specific jobs (e.g. NDVI), rely on a training dataset (e.g. supervised classification), or segment into groups (but this produces too many of the wrong groups, and we can't specify the colour characteristics of each group!).
We are looking for a tool which allows you to threshold the images based on known values of RGB or HSV but maybe this doesn't exist?
Thanks for your help!
Rich
Relevant answer
Answer
ArcGIS Pro provides several tools for image processing and analysis that can be used to extract information about rivers from drone images. Here are a few suggestions:
  1. Image Classification: ArcGIS Pro provides several image classification tools, including supervised and unsupervised classification. You can use these tools to classify the river features based on their spectral characteristics, which may include hue, saturation, and value. You can train the classification algorithm using a set of ground-truth samples that represent the different river features of interest.
  2. Raster Functions: ArcGIS Pro provides a set of raster functions that can be used to process and analyze drone images. You can create a custom function that uses thresholding to extract the river features based on their RGB or HSV values. The thresholding can be applied to the entire image or to specific bands, such as the red, green, and blue bands.
  3. Image Segmentation: ArcGIS Pro provides a set of image segmentation tools that can be used to segment the drone images into regions based on their spectral characteristics. You can use the segmentation tools to identify the river features based on their color, texture, and shape. You can also adjust the segmentation parameters to control the number and size of the resulting segments.
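If the data can leave ArcGIS for a moment, simple band thresholding is only a few lines of NumPy. A sketch (the thresholds and the tiny test image are hypothetical); the resulting mask can be written back to a georeferenced raster with rasterio or arcpy, and for HSV thresholds you would convert first (e.g. with scikit-image's rgb2hsv):

```python
import numpy as np

def rgb_threshold(image, lower, upper):
    """Boolean mask of pixels whose values fall inside [lower, upper]
    on every band (image shaped rows x cols x 3)."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    return np.all((image >= lower) & (image <= upper), axis=-1)

# Hypothetical 2x2 RGB image; flag brownish "wood" pixels.
img = np.array([[[120, 80, 40], [30, 90, 200]],
                [[110, 70, 50], [250, 250, 250]]], dtype=np.uint8)
mask = rgb_threshold(img, lower=(100, 60, 30), upper=(140, 100, 60))
```

In ArcGIS Pro itself, the same logic can be expressed in the Raster Calculator (or a custom raster function) with Con statements and the mask exported as a classified raster.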
  • asked a question related to GIS Analysis
Question
5 answers
How to perform Least Cost Path Analysis taking only LULC into consideration?
Relevant answer
Answer
To perform Least Cost Path Analysis based on Land Use/Land Cover (LULC), you will need a raster dataset representing the LULC classes and their associated cost values. Here are the general steps you can follow in ArcGIS to perform Least Cost Path Analysis based on LULC:
  1. Convert the LULC dataset to a cost surface: Use the Reclassify tool to assign each LULC class a cost value. For example, high-density urban areas may be assigned a high cost, while open, easily traversed land cover may be assigned a lower cost. The reclassified raster is your cost surface; running the Cost Distance tool on it from the source points produces the accumulated cost distance and backlink rasters.
  2. Define the starting and ending points: Choose the starting and ending points for the Least Cost Path analysis. These points can be represented as point features in a separate layer or as raster cells in the LULC dataset.
  3. Run the Least Cost Path analysis: Use the Cost Path tool to generate the least-cost path between the starting and ending points, specifying the accumulated cost distance and backlink rasters created in step 1 together with the destination data.
  4. Visualize and analyze the output: Visualize the output Least Cost Path on top of the LULC dataset to identify the areas of high and low cost. You can also use GIS tools to analyze and quantify the cost along the path, such as calculating the total cost, the average cost per unit distance, or the percentage of each LULC class along the path.
It's important to note that Least Cost Path Analysis based on LULC is subject to several assumptions and limitations, such as the accuracy of the LULC dataset, the assignment of cost values to each class, and the spatial resolution of the data. Therefore, it's important to carefully validate the results and understand the limitations of the analysis.
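For readers outside ArcGIS, the core of steps 1-3 is just a shortest-path search over a cost grid. A minimal, self-contained sketch with Dijkstra's algorithm (the cost values are hypothetical; Esri's tools additionally handle diagonal moves, cell size, and surface distance):

```python
import heapq
import numpy as np

def least_cost_path(cost, start, end):
    """Dijkstra shortest path on a 2-D cost grid (4-connected).
    Entering a cell adds that cell's cost; the start cell's own
    cost is included in the total."""
    rows, cols = cost.shape
    dist = np.full((rows, cols), np.inf)
    prev = {}
    dist[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk back from end to start to recover the path.
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[end]

# Hypothetical cost surface: 1 = cheap (e.g. grassland), 9 = expensive (urban).
cost = np.array([[1, 9, 1],
                 [1, 9, 1],
                 [1, 1, 1]])
path, total = least_cost_path(cost, (0, 0), (0, 2))
```

The path detours around the high-cost column, exactly as the reclassified LULC costs steer the ArcGIS Cost Path result.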
  • asked a question related to GIS Analysis
Question
5 answers
We need to calculate the influence of 3 different sources of water intake (river, eutrophicated bay and sea) on the bay, located in the middle of these sources. You can see a map with explanations as the 1st attachment (Slide1).
We have a boat with a flow-through system of sensors, so we measure different parameters every 5 sec (60m approx.). Is it possible to "imagine" the narrow streams of water intake as 3 point-source of "pollution" and calculate the influence of each point on the bay? Like, for example, to understand what source is more responsible for the increased level of pollution: river, bay or sea? Could it be done with kriging in ArcGIS Pro, and then probably visualized the influence with arrows, where the length of the arrow will represent the strength of the influence (2nd attachment, Slide2)?
Or maybe there are other suitable ways to do so? By the way, we don't have information about the speed of the flow in and out of the bay, however, it's probably possible to get it later. But we do have lots of other parameters like salinity, turbidity, fDOM, temperature etc.
I would really appreciate your help and suggestions on what articles I can read about it.
Thank you!
Relevant answer
Answer
Thank you for your answer!
It's quite hard to distinguish in this area what's tributary and what's not, as it's a coastal area, an inner archipelago. However, the northern stream is generally considered to be a main stem. The western eutrophicated bay is a flad (as they usually call such coastal lagoons in Finland). The characteristics of a flad are usually that they are under some kind of protection because of their unique ecosystem and the "entrance" to the flad is usually very narrow and very shallow (however inside a flad it's usually 2-8m deep).
As you can see on the maps, this flad is connected to the other bigger bay (fjärden) through a small ditch that goes through wetlands. The coordinates of the central bay of interest are 59°58'00.9"N 23°40'09.7"E.
There is basically no tide in that area of the Baltic sea. The central bay is something like 9-14km from the open sea (areas of the outer archipelago) depending on what route to choose through all these islands.
Basically, we have done a GAM analysis for that area based on our other parameters, so we have a predicted "normal level" for turbidity and a map showing the SD from that normal level for every occasion (approx. 6 occasions during the ice-free period). We were wondering whether it is possible to understand which source has a greater influence on this high positive turbidity SD in the bay, and to what spatial extent, like the spatial influence of a point source of pollution. We have lots of high-resolution data, so we were thinking it might be possible to indirectly calculate the strength and direction of influence: to draw the area where a particular pollution point has influence and where its influence stops being significant, being too diluted. But thinking more about it, it seems such things cannot really be predicted without knowing the velocities.
All the best,
Maria
  • asked a question related to GIS Analysis
Question
7 answers
Based on your expertise and experience,
What are the Python packages that are commonly utilized for tasks related to GIS, remote sensing, and spatial data science in 2022?
and/or
What are the Python packages that you recommend for use in GIS, remote sensing, and spatial data science applications in 2023?
Please consider the following domains as reference:
## GIS ##
  • Data management and processing
  • Geospatial analysis
  • Map production
  • Web mapping
  • etc
## Remote Sensing ##
  • Image processing
  • Feature extraction
  • Change detection
  • Image analysis
  • etc
## Spatial Data Science ##
  • Spatial statistics and modeling
  • Machine learning
  • Data visualization
  • etc
Relevant answer
Answer
There are several popular Python packages that are commonly used for GIS, remote sensing, and spatial data science:
  1. GeoPandas: a library for working with geospatial data in Python. It combines the capabilities of pandas and shapely to perform operations on geospatial data.
  2. Fiona: a library for reading and writing vector data formats.
  3. Rasterio: a library for reading, writing and analyzing geospatial raster data.
  4. GDAL: a library for handling raster and vector data formats, and a common tool for converting between different geospatial data formats.
  5. EarthPy: a library for working with remote sensing data in Python, designed for use by Earth scientists.
  6. PySAL: a library for spatial analysis and spatial econometrics in Python.
  7. OGR: a library for handling vector data formats, and a common tool for converting between different vector data formats.
  8. PyProj: a library for handling projections and transforming geospatial data between different coordinate reference systems.
  9. Shapely: a library for performing operations on geometric objects, such as points, lines, and polygons.
  10. Scikit-image: a library for image processing in Python, with tools for working with remote sensing data.
These packages are widely used by geospatial professionals and researchers, and offer a range of capabilities for data manipulation, analysis, and visualization.
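As a small taste of this stack, a minimal Shapely sketch (the coordinates and names are hypothetical; GeoPandas wraps the same geometry operations in a tabular, pandas-style API):

```python
from shapely.geometry import Point, Polygon

# Hypothetical example: a buffer around a well and a nearby parcel.
well = Point(2.0, 3.0)
buffer_zone = well.buffer(0.5)          # radius in coordinate-system units
parcel = Polygon([(2.2, 3.0), (4.0, 3.0), (4.0, 5.0), (2.2, 5.0)])

inside = parcel.intersects(buffer_zone)           # does the parcel touch it?
overlap = parcel.intersection(buffer_zone).area   # shared area, if any
```

In GeoPandas the same operations apply vectorized across a whole GeoDataFrame column of geometries.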
  • asked a question related to GIS Analysis
Question
1 answer
Both Ripley's k-function and Moran's index measure the statistically significant clustering within data. However, how to know, which method is performing better for our data?
What are the advantages and disadvantages of each method which can help to choose a better method?
Relevant answer
Answer
Maybe of minor concern, but let me note that it is kind of misleading to call Ripley's K and Moran's I methods of 'cluster analysis'. I know that ArcGIS does so, but still. Traditionally, 'cluster analysis' means making groups of initially ungrouped observations within a sample. In contrast, Ripley's K and Moran's I analyse the spatial structure of the sample. They do not create groups, but describe/test aggregation of a point pattern or spatial autocorrelation of a variable.
Ripley's K and Moran's I serve different purposes. Ripley's K describes the expected number of neighbours of any point in a point pattern across different radii. You can use it to analyse point patterns in the space, whether they are aggregated, random, or systematic. The only information you use is the coordinates (or distances between them). Moran's I analyses how a measured variable is structured in space. It uses a spatial neighbourhood matrix between the observations and a measured variable for each observation. It has a global and a local variant; with the latter, you can step across different distance classes to reveal scale-dependency in the spatial correlation. If reasonable null-hypotheses are provided, both can be used for hypothesis testing.
Anyway, it would be interesting to outline a real clustering method based on the Ripley's K or Moran's I. Maybe such a method exists already.
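Global Moran's I is simple enough to compute directly, which also makes its behaviour easy to inspect. A NumPy sketch of the textbook formula (the 2x2 checkerboard example is illustrative; PySAL's esda package offers a tested implementation with permutation-based significance tests):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: I = (n / W) * (z' w z) / (z' z),
    with z the deviations from the mean and W the sum of all weights."""
    z = np.asarray(values, dtype=float)
    z = z - z.mean()
    n = len(z)
    W = weights.sum()
    return (n / W) * (z @ weights @ z) / (z @ z)

# 2x2 checkerboard with rook (edge-sharing) contiguity. Cells are numbered
#   0 1
#   2 3
# so the neighbour pairs are 0-1, 0-2, 1-3, 2-3.
w = np.zeros((4, 4))
for i, j in [(0, 1), (0, 2), (1, 3), (2, 3)]:
    w[i, j] = w[j, i] = 1.0
x = np.array([1.0, -1.0, -1.0, 1.0])   # perfectly alternating values
i_stat = morans_i(x, w)
```

A perfectly alternating pattern with rook weights gives I = -1, the extreme of negative spatial autocorrelation.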
  • asked a question related to GIS Analysis
Question
5 answers
I would like to obtain a risk map for the oak habitat using several ecological variables related to climate change.
Thanks for your help.
Relevant answer
Answer
Antonio Luca Conte, there are various software options for creating a climate-change risk map for an oak ecosystem. Some common choices are:
1. ArcGIS is a geographic information system (GIS) program that can produce detailed maps and analyze data. It can analyze climatic data and create risk maps for specific ecosystems.
2. QGIS is a free and open-source GIS program for building and analyzing maps. It shares many of ArcGIS's features and can likewise be used to generate risk maps for specific environments.
3. R is a programming language and environment for statistical computing and graphics. It is widely used for data analysis and visualization as well as risk mapping; packages such as 'raster' and 'dismo' can be used to construct risk maps.
4. Google Earth Engine is a cloud-based platform for accessing and processing huge volumes of geospatial data, including climate data. It can also be used to generate risk maps for individual environments.
It should be noted that developing a risk map for a given ecosystem requires an understanding of GIS, data analysis, and modeling. If you lack this expertise, it may be preferable to seek advice from professionals in the field, such as ecologists, climatologists, or geologists.
  • asked a question related to GIS Analysis
Question
5 answers
Dear community,
At present I am in a time-sensitive situation: my doctoral dissertation is scheduled to be submitted in the next three months, but I still need a publication to meet the requirements.
My PhD research includes the application of remote sensing and GIS to groundwater exploration.
I have prepared a manuscript on the integration of RS, GIS, and AHP techniques to map groundwater potential zones.
Knowing that journals take a lot of time to respond, which quality free journals respond or review relatively quickly?
Thank you,
Relevant answer
Answer
Journal of Climate and Water resources. Springer.
Also, visit Nature Journals. All are free.
  • asked a question related to GIS Analysis
Question
7 answers
I am looking for an algorithm or GIS software that, for a solar panel or a larger area equipped with solar panels, determines which buildings in the vicinity are blinded by the panels due to the reflection of sunlight.
Inputs are 3D planar polygons that describe the panel geometries. Furthermore, a surface model of the earth's surface. The algorithm's output would be the locations that are affected by the glare effects.
Who knows algorithmic approaches or software solutions?
Relevant answer
Answer
Michael John Patrick, there is no single 3D GIS package that does this out of the box. However, some software packages such as Autodesk 3ds Max and Unreal Engine by Epic Games can render lighting and reflections accurately. You can also try the Computer Generated Architecture (CGA) rule method for 3D GIS.
For projects within the scope of FAA review, a glare analysis is typically required. A glare analysis involves a visual simulation of the array’s impact on the visibility of nearby aircraft pilots. The analysis includes a 3D model of the array and its location, as well as a representation of the surrounding terrain. The analysis should also account for the angle of the sun, which changes throughout the day and year, and other factors such as the reflectivity of the array’s materials.
Some states and local governments are also beginning to require a glare analysis to be conducted as part of the permitting process. While the exact requirements may vary, they generally involve a 3D model of the array and its location, a measure of the maximum uplight generated by the array, and an analysis of the maximum brightness of the array at certain angles, both on an annual basis and at specific times of day.
A glare analysis is often conducted with a computer simulation program, such as RhinoGlare or Radiance, to create a visual representation of the array’s effects on the surrounding environment. However, it is important to note that the results of the analysis are only as good as the input data. Therefore, it is important to ensure that the model is as accurate as possible, including all relevant features such as topography and nearby buildings.
The results of the analysis are typically presented in the form of a report and visual representation of the array’s impact on the surrounding environment. This report can then be used to inform the decision-making process when it comes to the proposed solar project.
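Underneath all of these tools, the geometric core is specular reflection: an incoming sun ray d reflected off a panel with unit normal n leaves along r = d - 2(d.n)n, and that reflected ray is then cast against the surface model to find affected locations. A minimal sketch of that one step:

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of incoming ray direction d off a surface
    with normal n: r = d - 2 (d . n) n (n is normalized first)."""
    d = np.asarray(d, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# Sun directly overhead (ray pointing straight down) hitting a
# horizontal panel: the glare ray points straight back up.
r = reflect([0.0, 0.0, -1.0], [0.0, 0.0, 1.0])
```

Repeating this for sun positions across the day and year, and intersecting each reflected ray with the terrain and building model, yields the set of glare-affected locations.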
  • asked a question related to GIS Analysis
Question
2 answers
I have pre- and post-incident imagery for which NDVI will be calculated and used to detect change by differencing. Before calculating NDVI, I want to radiometrically normalize the post-incident images with respect to the pre-incident images using a regression based on pseudo-invariant features (PIFs). I am looking to do this whole process in Google Earth Engine.
My questions are:
  • Can anybody please kindly share the script?
  • During selecting the PIFs, should I select them from the reference/base images or the target image?
Relevant answer
Answer
Relative radiometric normalization using pseudo-invariant features (PIFs) in Google Earth Engine can be done by following these steps:
1. Load the reference (pre-incident) and target (post-incident) imagery into Earth Engine.
2. Identify candidate PIFs: features whose reflectance should not change over the period of interest (e.g. deep water bodies, bare rock, large flat rooftops).
3. Digitize the PIFs as a feature collection, with a unique identifier for each one.
4. Sample both images at the PIF locations.
5. For each band, fit a linear regression between the target values (x) and the reference values (y) at the PIFs.
6. Apply the resulting gain and offset to the corresponding band of the target image, so that it is radiometrically matched to the reference image.
As for your second question: PIFs are normally selected with respect to the reference/base image, but you should verify that they are indeed unchanged in the target image as well.
Script (a sketch: it assumes a reference image, a target image, and a pifs feature collection of pseudo-invariant sites are already defined; the band names are Landsat examples):
// Per-band ordinary least-squares fit at the PIF locations: y = gain * x + offset
var bands = ['B2', 'B3', 'B4', 'B5', 'B6', 'B7'];
var normalizeBand = function(band) {
  // Pair target (x) and reference (y) values sampled at the PIFs.
  var samples = target.select([band], ['x'])
      .addBands(reference.select([band], ['y']))
      .sampleRegions({collection: pifs, scale: 30});
  var fit = samples.reduceColumns(ee.Reducer.linearFit(), ['x', 'y']);
  // Apply the gain ('scale') and offset to the target band.
  return target.select(band)
      .multiply(ee.Number(fit.get('scale')))
      .add(ee.Number(fit.get('offset')))
      .rename(band);
};
// Normalized target image, rebuilt band by band.
var normalized = ee.ImageCollection(bands.map(normalizeBand)).toBands();
Hope this will work.
  • asked a question related to GIS Analysis
Question
3 answers
Hello, all:
I am looking for open-source downscaling algorithms or methods applicable to high-resolution remote sensing data (such as land cover / vegetation type and so on).
Could somebody help me out? Appreciate that!
Relevant answer
Answer
Dear Chenyuan,
Here is a dissertation about it,
and on this webpage you can find most of the algorithms you might need.
Cheers,
Ivan
  • asked a question related to GIS Analysis
Question
4 answers
I want to get the coordinates of the vertices of a set of single-part polygons (all in the same vector layer, a .shp file) using R. I would like a list with the x and y coordinates per polygon. Do you know how this can be obtained?
Thank you in advance!
Relevant answer
Answer
Nikolaos Tziokas at the end I ended up reading the file as a simple feature (SF), and using this package (sf) for the purpose of my research. This has been a better solution for what I aimed to do.
Thank you for reminding me
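For completeness, with sf in R the usual idiom is st_coordinates(), which returns the X/Y coordinates together with indices identifying which polygon each vertex belongs to. The same idea expressed in Python with Shapely (the polygons here are hypothetical):

```python
from shapely.geometry import Polygon

# Two hypothetical single-part polygons.
polys = [
    Polygon([(0, 0), (1, 0), (1, 1), (0, 1)]),
    Polygon([(2, 2), (3, 2), (3, 3)]),
]

# One list of (x, y) tuples per polygon, taken from the exterior ring.
# Note the ring is closed: the last vertex repeats the first.
coords = [list(p.exterior.coords) for p in polys]
```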
  • asked a question related to GIS Analysis
Question
5 answers
What do you consider are the implications of Big Data on urban planning practice?
Relevant answer
Glory be to Allah... As time progresses, new developments appear that help people to complete their needs with flexibility and ease.
  • asked a question related to GIS Analysis
Question
10 answers
How can Big Data optimize urban planning and design?
Relevant answer
Answer
For this question, factors such as population and consumption in urban areas can be considered. A suitable data model can then optimize against these factors, for example the population's demand for and consumption of natural resources.
  • asked a question related to GIS Analysis
Question
6 answers
What can be regarded as Big Data in the context of urban planning and design?
Relevant answer
Answer
URBAN BIG DATA: various sources of data, including:
  • sensor data from all types of urban infrastructure
  • [real-time] transport tracking data
  • social network data [information about events or opinions]
  • public app data
  • user-volunteered data [including geographic data]
  • phone data
  • open data provided by government [e.g., air pollution data, crime data, meteorological data, land use data]
Regards,
Shafagat
  • asked a question related to GIS Analysis
Question
6 answers
What are the application opportunities of Big Data in urban planning?
Relevant answer
Answer
Malakia David Naholo, Big Data refers to technology for maintaining data volumes that are orders of magnitude larger than what conventional DBMS software can handle.
  • asked a question related to GIS Analysis
Question
4 answers
Dear Scholars,
Assume a mobile air pollution monitoring strategy using a network of sensors that move around the city, specifically a network of sensors that quantify PM2.5 at a height of 1.5 meters that lasts about 20 minutes. Clearly, using this strategy we would lose temporal resolution to gain spatial resolution.
If we would like to perform spatial interpolation to "fill" the empty spaces, what would you recommend? What do you think about it? What would be your approximations?
Regards
Relevant answer
Hi,
If you expect some local variation, and therefore non-stationary behaviour in your data, Empirical Bayesian Kriging will probably be the one. This assumes you have plenty of data that is not left-skewed, so that you can assume a Gaussian distribution.
In any case, I recommend you carry out a cross-validation with a study of RMSE and AMSE, as you can see in Pellicone (2019; 10.1080/17445647.2019.1673840) or in Ferreiro-Lera (2022; 10.1080/17445647.2022.2101949).
I hope I have been helpful.
All the best,
Giovanni.
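Whatever interpolator you settle on, it helps to keep a simple deterministic baseline in the cross-validation comparison. Inverse distance weighting is a common one; a self-contained sketch (the PM2.5 values are hypothetical):

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation: each prediction is a
    weighted mean of the observations with weights 1 / d**power.
    Exact coordinate matches (d == 0) return the observed value."""
    xy_known = np.asarray(xy_known, dtype=float)
    xy_query = np.asarray(xy_query, dtype=float)
    values = np.asarray(values, dtype=float)
    out = np.empty(len(xy_query))
    for i, q in enumerate(xy_query):
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d == 0):
            out[i] = values[d == 0][0]
        else:
            w = 1.0 / d ** power
            out[i] = np.sum(w * values) / np.sum(w)
    return out

# Hypothetical PM2.5 observations along a transect.
pts = [(0.0, 0.0), (2.0, 0.0)]
pm = [10.0, 30.0]
est = idw(pts, pm, [(1.0, 0.0), (0.0, 0.0)])
```

Comparing kriging's cross-validation RMSE against a baseline like this makes it clear how much the geostatistical model actually buys you.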
  • asked a question related to GIS Analysis
Question
1 answer
I am using a NetCDF file comprising 12 sub-NetCDF images of different durations, as shown in the attached figure. I have tried to average/separate them using every tool, software, and source I could find, such as Origin, as mentioned in this link: https://www.youtube.com/watch?v=14-sLHOaaOg&ab_channel=OriginLabCorp. I have used the Make NetCDF Raster Layer tool, which won't import the sub-NetCDF images in batch; it loads them one at a time. I have also tried ArcGIS supporting tools (https://r-video-tutorial.blogspot.com/2016/07/time-averages-of-netcdf-files-from.html) as well as Python and R code, but I failed to separate the sub-NetCDF files. I am working with large datasets, and loading images one by one for multiple parameters and durations makes the work tedious and time-consuming. My work requires the final image as a raster file. Please recommend a solution: either extracting all the images at once, or averaging them into one.
Relevant answer
Answer
1. Add the .nc data, with Band Dimension (optional) set to 'time'.
2. Use "Export Data" to export to TIFF format (the output location must be a folder, not a geodatabase).
3. Use Define Projection to set the projection, e.g. WGS 1984.
4. Add the projected TIFF and all of its bands (Band 1 to Band 12).
5. Use the Raster Calculator to compute the sum or average of all bands and output a GRID or TIFF.
Just for reference.
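Outside ArcGIS, the same time-averaging is essentially a one-liner with xarray, which reads the time dimension of a NetCDF file natively. A sketch, with an in-memory array standing in for the real file (the variable name and file paths in the comments are hypothetical; the GeoTIFF export additionally needs rioxarray):

```python
import numpy as np
import xarray as xr

# Stand-in for the real file: 12 "sub-images" (time steps) on a 4x5 grid.
data = np.arange(12 * 4 * 5, dtype=float).reshape(12, 4, 5)
da = xr.DataArray(data, dims=("time", "y", "x"),
                  coords={"time": np.arange(12)})

# Average over the time dimension -> one 2-D raster.
mean_da = da.mean(dim="time")

# With a real NetCDF file the pattern is the same:
#   ds = xr.open_dataset("file.nc")
#   mean_da = ds["variable_name"].mean(dim="time")
# and rioxarray can export the result as a raster:
#   mean_da.rio.to_raster("mean.tif")
```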
  • asked a question related to GIS Analysis
Question
4 answers
I'm doing research for my degree thesis in architecture on the urban heat islands of the city of Naples - Italy.
I am reclassifying the land surface temperature map in GIS, and I am looking for a method to classify the ground temperatures in a precise way, into classes that allow me to locate the heat islands.
Relevant answer
Answer
Cara Rosa,
There are no universal thresholds to classify urban heat islands because each city will experience its own specific climate and environmental constraints. Since you are dealing with a single city, I suggest you start by determining the lowest and the highest temperatures in your data set, and then divide that range (Tmax - Tmin) in 3 equal intervals. For instance, if Tmin = 18 and Tmax = 24, your intervals would be 18 to 20, 20 to 22 and 22 to 24 C. In that case, all areas falling in the lower interval could be labeled 'cool', the areas falling in the next class could be called 'warm', and those belonging to the higher interval could be indicative of 'hot' conditions.
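That equal-interval labelling is a one-liner with NumPy's digitize (the temperature values here are hypothetical):

```python
import numpy as np

# Equal-interval classification as described above: Tmin = 18, Tmax = 24,
# three classes with breakpoints at 20 and 22 degrees C.
lst = np.array([18.5, 19.9, 21.0, 23.7])
labels = np.array(["cool", "warm", "hot"])[np.digitize(lst, [20.0, 22.0])]
```

Applied cell by cell to the LST raster, this yields the three-class heat-island map.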
You should make sure that your data set covers a region around the city large enough to include agricultural fields or forests: those areas would provide you with a baseline environmental temperature away from the urban center. An obvious extension of this approach would be to map the city areas in single degree intervals, from blues through to greens, yellows, oranges and red. That will clearly indicate which areas are hotter.
A better approach would be to start from a preliminary question: why do you need to make a map of the urban heat island effect of Naples in the first place? If you were concerned by the health effect of temperature on morbidity and mortality of the inhabitants, for instance, then your temperature thresholds should be driven by medical rather than purely statistical considerations. Similarly, if your underlying concern were energy expenditures (cooling during the summer or heating during the winter), then your thresholds should relate to the corresponding critical rates of energy expenditure. In other words, your approach should depend on your ultimate goal.
Remember also that
- the urban heat island is quite time-dependent: it varies with cloudiness and synoptic conditions such as sea-breeze on a daily time scale, it is much more noticeable during the winter than the summer, and it may evolve on longer time scales, depending on the rates of urbanization and industrialization;
- land surface temperature is quite dependent on altitude, so you might want to acquire a topographic map of your area and look at the correlation between these two parameters;
- a land use map of the region will be useful to properly interpret your results, as the hotter area may not be the city center but some industrial area, depending on their relative rates of energy consumption.
I hope these comments may help you in your work. Michel.
  • asked a question related to GIS Analysis
Question
3 answers
I am working on Forest Canopy Density. There is a parameter called the Scaled Shadow Index (SSI) used when computing forest canopy density. In most papers I found, SSI is calculated by "linearly transforming" the Shadow Index. I have computed the Shadow Index, but I am not sure how to compute the Scaled Shadow Index. Kindly help me out. Moreover, if I am using Landsat 5 and 8 Surface Reflectance images for FCD mapping, and the reflectance values range from 0 to 1, is it still mandatory to normalize these surface reflectance data before calculating vegetation indices?
Relevant answer
Density-based clustering combined with wavelet-based clustering algorithms may help your work.
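On the SSI question itself: in the FCD literature the "linear transformation" is usually read as a min-max stretch that maps the Shadow Index onto a 0-100 scale. A sketch (the SI values are hypothetical):

```python
import numpy as np

def scale_linear(si, new_min=0.0, new_max=100.0):
    """Linear (min-max) stretch of the Shadow Index SI onto
    [new_min, new_max] -- the usual reading of 'linear transformation'
    in the FCD literature."""
    si = np.asarray(si, dtype=float)
    si_min, si_max = si.min(), si.max()
    return (si - si_min) / (si_max - si_min) * (new_max - new_min) + new_min

ssi = scale_linear(np.array([0.2, 0.4, 0.6]))
```

With surface reflectance already in 0-1, the indices themselves can be computed directly; the scaling above only standardizes the index range so the FCD components are comparable.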
  • asked a question related to GIS Analysis
Question
6 answers
Dear colleagues I'm trying to calculate the land surface temperature for the Bakun Catchment, Sarawak, Malaysia region using different landsat satellite images of different years with the help of Google Earth Engine Code Editor. I'm not sure why I'm getting incomplete results for just this region and even the pixel values are too high in some of the years (please check the attached screenshots). Yes, for the NDVI and NDWI the results are covering the whole region.
I tried the same code for other regions, in Pakistan, and the results are very good.
So, my question is: does this region, the Bakun Catchment in Sarawak, Malaysia, need special consideration, or are there satellite data issues, or maybe something wrong with the code? Also, can this region's LST really go down to -3 to -10?
Relevant answer
Answer
Just curious, are you using the ready-to-use LST product from USGS, or a separate algorithm to derive Landsat-based LST? In the former case, the negative values in the Landsat Collection 2 ST products from USGS are most probably due to the presence of clouds in the satellite images. Cross-check with the cloud mask / Quality Assessment band to make sure there are no clouds over your study area in Sarawak. If you are using the USGS Surface Temperature product, I assume the incomplete results could be because these LST products are based on the ASTER Global Emissivity Database (GED): wherever GED has missing data, the Landsat ST product will have missing data as well.
I might be wrong, I will have to look at the ST products for further scrutiny.
  • asked a question related to GIS Analysis
Question
19 answers
  • I am trying to use the DSAS tool (version 5) with both ArcMap 10.4.1 and ArcMap 10.8 on Windows 10. I am able to output the transects, but when I try to calculate the statistics I get an error message telling me to check the DSAS_log (photo attached), which does not clearly identify the error. How can I solve this problem? I have re-created the baseline multiple times and get the same error in both versions of ArcMap (10.4 and 10.8), even though I followed the guide and am using English (US) and the mm/dd/yyyy date format.
Relevant answer
Answer
I had the same problem and solved it by reworking the preprocessing. My steps were: digitize the shorelines (at a fixed zoom scale) and create a baseline on the mainland; you may have to create baselines both on the mainland and on the water (depending on your region). All shapefiles must be in the same coordinate system, and it must be metre-based (UTM). If your shorelines were digitized into separate shapefiles, use the Merge tool to join all shorelines into one shapefile. In ArcCatalog, right-click the folder where you want your geodatabase, then go to New > Personal Geodatabase. Double-click the newly created geodatabase, right-click inside it, and use Import to bring in the two shapefiles: one with your shorelines and one with the baseline(s).
Some classes and their types are required, so you should check if your shapefiles have:
- Baseline:
--> User created: id (Long Integer)
--> Created by DSAS through Attribute Automator: SHAPE_Length (Double), DSAS_ID (Long Integer)
- Coast:
--> User created: id (Long Integer), Uncy (Double), Date_ (Text).
--> Created by DSAS via Attribute Automator: DSAS_date (text), DSAS_uncy (Double), DSAS_type (text)
I leave the uncertainty field (Uncy) empty and DSAS defaults to 10 meters in this case. After that, I used the DSAS video (https://www.usgs.gov/media/videos/introduction-dsas-v50-sample-data-workflow) as a guide.
  • asked a question related to GIS Analysis
Question
19 answers
Dear Experts,
What is the best method and resource to create a detailed landcover map of an urban area?
I need these classes: green space, water body, farmland, bare land, building, and road.
I need the land cover maps of 2000, 2005, 2010, 2015, and 2020.
Thank you
Relevant answer
Answer
Dear Majid,
You may use Landsat satellites to cover these years. You can also increase the spatial resolution to 15 meters using the pan-sharpening technique. As for the classes, many methods can be applied, whether supervised or unsupervised; it depends on the analysis and its purposes. ENVI and ERDAS are good choices for this.
Best wishes,
Jasem
  • asked a question related to GIS Analysis
Question
6 answers
Hello, I am a senior undergraduate student, and I am currently conducting a study on changes in urban structure (in terms of commercial activities) after a certain transportation project.
After the project, several pedestrian roads will be installed (corresponding to axial lines in space syntax analysis), so the number of people in the area is expected to grow considerably.
If I can quantify the difference in pedestrian numbers before and after the project's commencement, I will eventually include this as a variable in a model (e.g. a regression) analyzing commercial activity.
I have been studying space syntax theory and notions but could not find materials with the technical details. As I understand it, the procedure is AutoCAD or QGIS > depthmap > R, but what then? I have no idea how to do this in detail, and above all, I don't know how to use AutoCAD. Can anyone tell me how to analyze space syntax with QGIS? I have already downloaded depthmapX 0.35 and the QGIS Space Syntax Toolkit but don't know how to proceed.
So please lend a hand to this poor novice student :(
Relevant answer
Answer
You may take a look at the following publication regarding the procedure for combining space syntax and GIS with respect to land-use compatibility with pedestrian demand.
  • asked a question related to GIS Analysis
Question
6 answers
I am writing a paper that focuses on the importance of centralizing file-based spatial data that currently exist in data silos fragmented across file servers, email servers, web servers, document management systems, etc. Most solutions that consolidate spatial data target database-based spatial data. However, much spatial data still exists in files and remains unconsolidated.
One description missing from the paper is how enterprise spatial organisations currently handle the security of these spatial files, especially those in data silos (fragmented across file servers, email servers, web servers, document management systems, etc.).
Unfortunately, I do not know anyone in enterprise GIS organisations, and therefore it's hard to know how these organisations store and handle the security of these data. Could anyone with an industry background please guide me?
  • Are these files normally stored in data silos?
  • Are there any movements towards consolidating these file data?
  • Are there any software solutions that enable consolidating these spatial files?
Any online document/paper that describes this issue which I can cite would be well appreciated.
Relevant answer
Answer
The first link contains among other things:
WHITE PAPER
An adaptive Markov chain algorithm applied over map-matching of vehicle trip GPS data
DEMO
Dynamic Telco Signal Analysis with Active Analytics
DATASHEET
Case Study: Data-Driven Network Prioritization
The second link contains also:
IDL : The Interactive Data Language Visualization Solution
  • asked a question related to GIS Analysis
Question
22 answers
I am currently working on my research topic, "Safe Urban Mobility". By "safe" I mean mobility that ensures no transmission of infections during transport journeys in the time of the pandemic (Covid-19), especially given poor mobility choices.
After searching the literature, I found only a few studies addressing the topic.
Please share and discuss your perceptions of this topic.
I welcome all opinions.
Ahmed.
  • asked a question related to GIS Analysis
Question
10 answers
I have two datasets. One with 9 past cyclones with their damage on the forest, wind speed, distance from the study site, recovery area. Another dataset with future sea-level rise (SLR) projections and potential loss area due to SLR.
  1. Using data from both disturbance-event datasets (loss area, recovery area, wind speed, predicted loss area from SLR), can I create any kind of disturbance risk/vulnerability index or hazard indicator map of the study area?
  2. What kinds of statistical analysis can I include in my study with these limited datasets to show some sort of relationship between "loss area" and the other variables?
  • asked a question related to GIS Analysis
Question
2 answers
I wanted to get TOA reflectance from MODIS data (Terra surface reflectance, MOD09A1), so I used the preprocessing tool. I got the following reflectance values, which seem much lower than I expected; there are many zeros after the decimal point. Could you please help me solve this?
(I was having a problem downloading the data, so I preferred to use this preprocessing tool on the data downloaded directly from the official website.)
Thank you in advance.
Relevant answer
Answer
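One likely explanation for the many zeros: MOD09A1 is surface (not TOA) reflectance, stored as 16-bit integers with a 0.0001 scale factor, so valid reflectance lands in roughly the 0 to 1 range and values such as 0.05 are perfectly normal; values of 1e-5 or smaller may mean the scale factor was applied twice. A minimal numpy sketch; the scale, fill value, and valid range are as I recall them from the MOD09 product guide, so verify them against your HDF metadata:

```python
import numpy as np

SCALE = 0.0001           # MOD09A1 reflectance scale factor (per product guide)
FILL = -28672            # MOD09A1 fill value (verify in your file's metadata)
VALID = (-100, 16000)    # valid stored-integer range

def mod09a1_to_reflectance(dn):
    """Convert MOD09A1 stored integers to surface reflectance, masking bad pixels."""
    dn = np.asarray(dn, dtype=np.int32)
    refl = dn.astype(np.float64) * SCALE
    bad = (dn == FILL) | (dn < VALID[0]) | (dn > VALID[1])
    refl[bad] = np.nan
    return refl

refl = mod09a1_to_reflectance([523, 4100, -28672])
# 523 -> 0.0523, a plausible reflectance; the fill value becomes NaN
```

If the preprocessing tool already divided by 10000 and the stored values were reflectances, not raw integers, the scaling would have been applied twice, which matches "too many zeros after the decimal".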
  • asked a question related to GIS Analysis
Question
6 answers
Respected All
The data available at https://swat.tamu.edu/data/india-dataset/ (the file named "SOIL_WATERBASE.7z") has only the raster along with the "VALUE" and "NAME" of the soil series, subsetted from HWSD_FAO.
But the database file requires a large number of inputs for different soil properties of these soil series, such as compaction, number of layers, texture, bulk density, available water capacity, and proportions of sand, silt and clay, in the dbf file, which has these columns:
OBJECTID
MUID
SEQN
SNAM
S5ID
CMPPCT
NLAYERS
HYDGRP
SOL_ZMX
ANION_EXCL
SOL_CRK
TEXTURE
SOL_Z1
SOL_BD1
SOL_AWC1
SOL_K1
SOL_CBN1
CLAY1
SILT1
SAND1
ROCK1
SOL_ALB1
USLE_K1
SOL_EC1
SOL_Z2
SOL_BD2
SOL_AWC2
SOL_K2
SOL_CBN2
CLAY2
SILT2
SAND2
ROCK2
SOL_ALB2
USLE_K2
SOL_EC2
and so on, up to 10 layers (if available)
Another file, "soil_HWSD_FAO.7z", has the global raster without any information about the soil properties used in the input database.
I have subsetted/clipped the Indian raster from the global one,
although the global raster was missing spatial referencing.
So if you are subsetting it for your own area of interest, you will have to georeference the subsetted image manually before using it to create HRUs.
Kindly share if you have any idea about the source of the soil properties for these soil series.
Relevant answer
Answer
Hey Abhilash, please share the protected password with me.
  • asked a question related to GIS Analysis
Question
5 answers
The Geology of the Himalaya is a record of the most dramatic and visible creations of modern plate tectonic forces. I wonder how vegetation flourished in such a mountain system.
Relevant answer
Answer
Tectonic processes work over long time spans. The pre-Himalaya land surface would have been well vegetated and would have supplied the diverse variety of flora. As the mountains rose, much of the plant life would have needed to adapt to colder conditions. Once steeper slopes were created, rockfall and landslides were inevitable.
Plants need soil, and parent material provides the base from which soils develop; therefore the local rock type is critical to the early development of soil. Calcareous rocks, e.g. limestone, provide a rich, sweet element base to support grasses. Quartzites, on the other hand, break down to infertile silicates, and soil develops very slowly on them. Another factor related to rock type is its propensity to weather and/or slide. Marls, clays, shales and highly micaceous schists are very prone to rapid weathering, both physical and chemical, and also to slippage along planes of weakness. Granites, quartzites and psammites are much more durable, forming blocky scree in which little soil can form or plants take root. Mosses and pioneer species take hold first.
I have noted in areas affected by recent debris flows that vegetation can recolonise very rapidly, even in one year, so that any recent activity can become concealed.
George Strachan
  • asked a question related to GIS Analysis
Question
4 answers
Are there any iPad pro apps for GIS mapping and analysis? Thanks
Relevant answer
Answer
Dear Jens Kleb, Antônio Carlos Pereira dos Santos Junior, Omid Vakili: for precision monitoring I use an Emlid Reach RS2+ GNSS antenna, and its app is well supported by Apple, so this doesn't worry me. My basic idea is to use the iPad as if it were a topographic map on which to digitize the various shapes with the pen, in the field or in the laboratory, obviously georeferenced and using a topographic map in ECW format as the basemap; then export in shapefile format and load everything into my beloved QGIS, while taking full advantage of the potential of the Apple Pencil. Is this possible? I must also admit that I am a big fan of Global Mapper. Regards
  • asked a question related to GIS Analysis
Question
8 answers
Can we draw any statistical and spatial relationships between fine parameters of road dust (Pd, Zn, Cr, and Ca) and some key air pollutants (like PM2.5, CO, NO, CH4, O3, HCHO, BC, NOx, SO2)? Has anyone done a spatial analysis using both of these datasets?
I have both datasets and would like to do some statistical and spatial analyses. Kindly suggest whether there is any possibility of drawing a relationship.
  • asked a question related to GIS Analysis
Question
28 answers
Is there any good software that is able to show good results for spatiotemporal analysis of GIS outputs?
Relevant answer
Answer
-QGIS, ArcGIS (GIS software)
-R, Python (Programming software)
-Google Earth Engine (Online platform)
  • asked a question related to GIS Analysis
Question
6 answers
If we apply atmospheric correction to the data to enhance coherence, will it provide accurate deformation results in a densely vegetated mountainous region like the Himalayas?
Relevant answer
Answer
Thank you very much for this interesting question. This indeed is a research question that can only be answered after thorough research. While I was searching for some answers, I came across the attached publication. I hope it provides some insights.
Thanks and regards
Gowhar Meraj
  • asked a question related to GIS Analysis
Question
2 answers
I am running Geographically Weighted Regression (GWR) on a categorical dependent variable so my model is basically a Geographically Weighted Logistic Regression.
I have multiple independent variables, some numerical and some categorical.
While interpreting the results for numerical variables is straightforward, I want to know how to identify the reference level of the categorical independent variables and how to interpret them.
Let's say I code males as 0 and females as 1; should the coefficients then be interpreted for females, as they are coded 1? And what if I coded males as 1 and females as 2?
Relevant answer
Answer
Dear Nasser,
There should be no difference in the slope estimate whether you code (0,1) or (1,2); a constant shift in the coding changes only the intercept.
In both cases, exp(B) gives the odds ratio between the two groups, relative to the reference group (the one coded with the smaller value).
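Jasem's point can be checked numerically: recoding a binary predictor from (0,1) to (1,2) is a constant shift, so a logistic fit returns the same slope (hence the same odds ratio exp(B)); only the intercept moves, by exactly one slope. A self-contained Newton-Raphson sketch on made-up data:

```python
import numpy as np

def fit_logistic(x, y, iters=25):
    """Fit y ~ 1 + x by Newton-Raphson; returns (intercept, slope)."""
    X = np.column_stack([np.ones_like(x, dtype=float), x.astype(float)])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        W = p * (1 - p)                        # IRLS weights
        H = X.T @ (X * W[:, None])             # Hessian of the log-likelihood
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

rng = np.random.default_rng(0)
x01 = rng.integers(0, 2, 400)                  # sex coded 0/1 (made-up data)
logit = -0.5 + 1.2 * x01
y = (rng.random(400) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

b_01 = fit_logistic(x01, y)                    # coding (0, 1)
b_12 = fit_logistic(x01 + 1, y)                # same data, coding (1, 2)
# slopes match; intercepts differ by exactly one slope
```

The same invariance holds in a geographically weighted logistic regression at each local fit, since the weighting does not interact with the coding.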
  • asked a question related to GIS Analysis
Question
9 answers
I have worked on the RUSLE model and collected all the required data from field surveys and lab analysis.
But now I am stuck at the data validation step.
Please suggest a procedure for doing it.
Thank you
Relevant answer
  • asked a question related to GIS Analysis
Question
6 answers
Hi,
Can I prove or model a reclamation in the past using GIS/RS? Can you provide me some discussion, readings, or techniques?
Relevant answer
Yes, you can, via satellite images of the same area using 4D (time series) analysis; the time periods between these images have to be equal. This can also help you predict future effects. I hope the link below helps a little: https://doi.org/10.3390/app10072266
  • asked a question related to GIS Analysis
Question
7 answers
I am trying to calculate the Stream Power Index (SPI) in ArcGIS. I have checked many videos and documents, but there is no certainty about the formula for the raster calculator, so I wrote the formulas below to learn which one is right. Each one creates different results.
DEM Cell Size=10m
SPI_1 --> "flow_accumulation" * 10 * Tan("slope_degree" * 0.017453)
SPI_2 --> Ln("flow_accumulation" * 10 * Tan("slope_degree" * 0.017453))
SPI_3 --> Ln("flow_accumulation" + 0.001) * (("slope_percent" / 100) + 0.001)
SPI_4 --> Ln(((flow_accumulation+1)*10) * Tan("slope_degree"))
SPI_5 --> "flow_accumulation" * Tan("slope_degree")
Also, when creating the slope raster, which option should I choose: DEGREE or PERCENT_RISE?
And the last question: when I calculate SPI with the formulas above, I get an SPI map that includes negative values. Is that correct? Are negative values a problem or not?
Relevant answer
Answer
I think the correct formula is SPI_2, where:
- flow accumulation must be converted to square meters (and then not multiplied by 10);
- the slope in degrees must not be equal to 0, so add +0.1 degrees.
In this way, SPI should be a dimensionless positive value (more or less).
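Riccardo's recipe can be sketched in numpy to see what the numbers do. The ArcGIS raster names are replaced by plain arrays, the cell-count-to-area conversion uses the 10 m cell size from the question, and the +1 on flow accumulation (borrowed from SPI_4) and the 0.1-degree slope floor are pragmatic guards, not standards:

```python
import numpy as np

def stream_power_index(flow_acc_cells, cell_size, slope_deg, slope_floor=0.1):
    """SPI = ln(As * tan(beta)) with contributing area in square meters.

    flow_acc_cells : flow accumulation as a cell count
    cell_size      : cell width in meters (10 m in the question)
    slope_deg      : slope in DEGREE (not PERCENT_RISE)
    """
    area = (flow_acc_cells + 1) * cell_size ** 2   # cells -> m^2; +1 keeps ln finite
    slope = np.maximum(slope_deg, slope_floor)     # avoid tan(0) -> ln(0)
    return np.log(area * np.tan(np.radians(slope)))

acc = np.array([[0.0, 50.0], [400.0, 2000.0]])     # toy flow accumulation (cells)
slp = np.array([[0.0, 5.0], [12.0, 2.0]])          # toy slope in degrees
spi = stream_power_index(acc, 10.0, slp)
```

On the negative-values question: they are not an error; the logarithm is simply negative wherever the area-slope product is below one, which happens in flat, low-accumulation cells.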
  • asked a question related to GIS Analysis
Question
2 answers
After the GIS 3D model is established, what data types need to be imported into ABAQUS? After importing, can the model be modified in the GUI?
Relevant answer
Answer
Could you give more details regarding ABAQUS so I can help you?
regards
  • asked a question related to GIS Analysis
Question
2 answers
The littoral zone is wider than the coastal zone; while the second is a physical concept, the littoral is also understood socially and economically. Which criteria could we use to draw the limits of the littoral zone?
Relevant answer
Answer
This is a good question.
  • asked a question related to GIS Analysis
Question
4 answers
I have a point shapefile of 12 islands (created using centroids in ArcMap) with attributes of macrobenthic species and their abundance. I would like to analyze the distribution of these species using ArcMap. Would kernel density be suitable for analyzing the species distribution? If not, which method should be used instead?
I am concerned about the small number of points and the long distances between islands.
Thank you
Relevant answer
Answer
MaxEnt software
  • asked a question related to GIS Analysis
Question
5 answers
Dear all,
Are there publications/guidelines about methodologies for identifying critical areas based on stormwater quality? I am especially interested in methods based on GIS analysis and water quality sampling. By sampling I mean quite practical approaches that municipalities can use when trying to map critical areas for stormwater quality management, with a city-scale focus. I assume that a city-scale project has to start with a GIS analysis of critical areas and then, as a second phase, continue with water quality sampling campaigns at selected sites.
Relevant answer
Answer
One of my advisors once wrote:
I hope this may help in identifying key sampling sites.
  • asked a question related to GIS Analysis
Question
1 answer
At present, most studies construct a topic model of the text first and then analyze the results in time and space. Is there a better way to combine the two?
Relevant answer
Answer
That is an interesting question. KeyATM and topic models allow for covariates, but I am not sure whether they allow for spatial variables.
  • asked a question related to GIS Analysis
Question
12 answers
  • asked a question related to GIS Analysis
Question
7 answers
One critical component of the study of land-use change is testing the level of agreement between pixels of a satellite-derived land-use map and a reference map (produced with authenticated ground information). This comparison remains the standard approach for checking the degree of correctness of the image analysis task. One of the indices developed for this task is the Kappa index. However, scholars such as Prof. Gil Pontius of Clark University have repeatedly affirmed that the Kappa index gives grossly inaccurate estimates, with misses and other erroneous outputs. My question is: what are the proffered alternatives to Kappa if we accept that the index is inaccurate and misleading? And how available are these alternatives at the open-science level (open-source software packages), so that future land change science studies have inclusive and easily accessible tools? These answers are needed because some young and early-career remote sensing experts are still glued to Kappa.
Thank you for your contributions to this inquiry.
Relevant answer
Every index is useful as long as the technical problems can be solved with the available technology.
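Pontius' proposed replacement for Kappa, splitting overall disagreement into quantity disagreement and allocation disagreement (Pontius and Millones), is straightforward to compute from a confusion matrix and is available open source (e.g. the diffeR package in R). A minimal numpy sketch:

```python
import numpy as np

def quantity_allocation_disagreement(cm):
    """Pontius-Millones disagreement components from a confusion matrix.

    cm[i, j] = pixels mapped as class i whose reference class is j.
    """
    p = cm / cm.sum()                       # cell proportions
    mapped = p.sum(axis=1)                  # row totals (mapped proportions)
    reference = p.sum(axis=0)               # column totals (reference proportions)
    quantity = np.abs(mapped - reference).sum() / 2.0
    total = 1.0 - np.trace(p)               # overall disagreement
    allocation = total - quantity
    return quantity, allocation

# toy 2-class confusion matrix
cm = np.array([[50.0, 10.0],
               [ 5.0, 35.0]])
q, a = quantity_allocation_disagreement(cm)
```

Quantity disagreement captures mismatch in class proportions; allocation disagreement captures mismatch in where the classes sit, and the two sum to the total disagreement, which makes the error structure far easier to interpret than a single Kappa value.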
  • asked a question related to GIS Analysis
Question
6 answers
Hi everyone!
I have origin points in different areas, and within these areas also specific values in raster cells (at 50 m resolution). My goal is for each traversed cell to receive the sum of the values of all cells passed through before it.
This movement can be calculated, for example, with Least Cost Path; with this tool I created a backlink raster, which shows me the movement.
But when I use the Least Cost Path tool to accumulate values, the respective values are weighted by the grid cell size.
Does anyone have an idea how to accumulate only the actual values, without the grid-size weighting?
I tried this with flow accumulation, but some cells get no value because no other cell ends in them; yet they still need the value carried from the prior cell (or the cell's own value) along the path they are "moving"/"flowing" through.
I hope someone could help me out with this issue.
Cheers!
Relevant answer
Answer
Iman Javadzarin
thanks again for your reply.
A movement is not uniform: the direction is constantly changing, so summing rows or columns does not make sense. Also, the movement starts somewhere in the middle of an area.
I will attach an example.
So if you follow the arrows backwards, the last cell should contain the sum of all values passed through.
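The "follow the arrows backwards" idea in the example can be written directly: given a backlink (direction) raster and a value raster, walk from a cell back to the origin and sum only the cell values, with no distance weighting. A numpy sketch; the direction codes (1=E, 2=SE, ..., 8=NE, 0=origin) are an assumption, so check them against your own backlink legend:

```python
import numpy as np

# (row, col) offsets for assumed backlink codes 1..8 (E, SE, S, SW, W, NW, N, NE);
# 0 marks the origin cell -- verify against your backlink raster's legend
STEPS = {1: (0, 1), 2: (1, 1), 3: (1, 0), 4: (1, -1),
         5: (0, -1), 6: (-1, -1), 7: (-1, 0), 8: (-1, 1)}

def accumulate_along_path(values, backlink, cell):
    """Sum raster values from `cell` back to the path origin (no cell-size weighting)."""
    r, c = cell
    total = 0.0
    while True:
        total += values[r, c]
        code = backlink[r, c]
        if code == 0:                  # reached the origin cell
            return total
        dr, dc = STEPS[code]
        r, c = r + dr, c + dc

# toy value raster and backlink raster
values = np.array([[1.0, 2.0, 4.0],
                   [3.0, 5.0, 7.0]])
backlink = np.array([[0, 5, 5],        # top row points west toward the origin
                     [7, 6, 6]])       # bottom row steps back up and left
total = accumulate_along_path(values, backlink, (1, 2))   # 7 + 2 + 1 = 10.0
```

Running this for every cell (or caching partial sums along shared paths) gives the pure value accumulation the question asks for, independent of the 50 m cell size.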
  • asked a question related to GIS Analysis
Question
32 answers
I have attached the OSM map of Pannipitiya, Sri Lanka. Looking at the map, what kinds of geographical questions can you ask?
The following came to my mind:
1. Which places can house dwellers walk to and reach within 1 minute (600 m?)?
2. What is the calmest and quietest place for meditation?
Relevant answer
Answer
Describe the geographical features and locations of the places and roads on the map, in relation to one another
  • asked a question related to GIS Analysis
Question
9 answers
As a part of my PhD, I conducted a study to assess health inequities in Amaravati capital region of Andhra Pradesh using two composite indices made from health determinants indicators and health outcome indicators.
Health outcome indicator data were available at the sub-district level; these data were interpolated to create a heatmap of the health outcome index. Health determinant data were available at the village level, so I created a choropleth map using the health determinants index.
The interpolated health outcome index map was then overlaid on the choropleth map of health determinants. This highlighted some interesting findings, i.e. areas of concern (villages): the colour combinations created by overlaying the two layers revealed areas with poor health outcomes and poor health determinants, and areas with poor health outcomes but better determinants.
Kindly check these files and give your valuable opinions. Can this type of analysis be used to highlight areas with health inequities or not? Please comment on the method used and the results obtained in the overlaid map.
Relevant answer
Answer
The OPGD model and the "GD" R package are recommended for identifying spatial determinants from the perspective of spatial heterogeneity. You can refer to the guide at https://cran.r-project.org/web/packages/GD/vignettes/GD.html. As a result, you can visualise the contributions of determinants and the interactive impacts of spatial variables.
  • asked a question related to GIS Analysis
Question
2 answers
The CI-110 Plant Canopy Imager gives two readings related to leaf area index (LAI): GF LAI and PAR LAI. Are these values the final ones if I am trying to obtain effective leaf area index, or do I need to do further calculation? If so, which value should I use as the effective leaf area index (GF LAI or PAR LAI)? I am working on LAI prediction using remote sensing.
  • asked a question related to GIS Analysis
Question
6 answers
I am a novice researcher working on a project analyzing water quality data from different water sources such as dams, rivers, and springs, and also from secondary sources such as water treatment plants and households. I have collected the GPS coordinates of the water samples, which appear in the image; however, I am having difficulty finding the right methodology to analyse these results spatially. The microbiological parameters to be analysed include organisms such as E. coli, Salmonella spp., Shigella spp., Giardia spp., and Entamoeba histolytica.
Please help and be kind :)
I have attached an image showing the sample locations; additionally, the study area spans six Quaternary drainage basins.
  • asked a question related to GIS Analysis
Question
9 answers
Hi. I'm searching for different methods to delimit a study area. For example, in hydrological studies it is usual to create a polygon for the watershed (using contour lines or a DEM). In environmental impact assessment, the direct and indirect impact areas determine the polygon to use. Other cases use political divisions (neighborhoods, streets, estates...).
All of these base the delimitation on physical characteristics.
But how do you delimit the study area when the phenomenon intersects several watersheds, or when investigating a community that moves across several estates or political divisions? How do you know where the limits are for creating the polygon? And how do you know the delimitation is right, and that you are not losing information or data through bad delimitation?
Thanks for your answers.
Relevant answer
Answer
Check the InVEST model; it has a serviceshed sub-model.
I hope it will be helpful.
  • asked a question related to GIS Analysis
Question
14 answers
Parallel drainages are commonly observed over the estuarine environments, flood plains and reservoirs of the area.
Relevant answer
Answer
We also faced a similar problem in the Jhelum basin. In larger basins, you are quite right that natural drainage is not always available for stream burning. Further, in our case, the pseudo drainage occurred on flat surfaces and over the lakes; for the lakes, we had the lake polygon layer to burn. I think increasing the threshold is one viable option that does reduce such spurious drainage lines.
The solution lies in between the opinions provided by W. J. van Verseveld Richard Gloaguen Neelakantan Sajikumar
Thank you all for this wonderful knowledgeable discussion.
Regards
Gowhar Meraj
  • asked a question related to GIS Analysis
Question
5 answers
I have an image of a map (no lat/long present on the map) which I have to convert into a shapefile to import into GPlates. How can I do so? Any help or ideas are much appreciated.
Relevant answer
Answer
Hi Ananya!
I agree with the answers mentioning that, for the software you plan to use the vector layer with, you will need the information in a common geographic reference system. To do that, you can use the free software QGIS for the georeferencing process (look for the Raster menu - Georeferencer), and then in the same software you can convert from raster to shapefile (Raster - Conversion - Polygonize).
The algorithm usually works well, but the result will depend a lot on the complexity of the raster data.
Hope it helps! regards, Soledad
  • asked a question related to GIS Analysis
Question
16 answers
I am trying to delineate agricultural fields using Sentinel-2 imagery. I have been applying different image segmentation algorithms to a time series of this dataset. My best output so far has false positives in some non-agricultural zones (like forests). Hence, I'm looking for the best way to distinguish forests from agricultural fields as a post-processing step.
Relevant answer
Answer
Hi!,
Given that the crop seasons are known and Sentinel data are available, instead of post-processing you could perform pre-processing: select a date or dates when the crops are not yet developed and classify the forests, then create a forest mask and apply it to the image(s) used for crop classification. Depending on the complexity and types of crops you may have to analyse more than one date, but this way you remove the effect of forests.
Hope it helps; I solved a similar problem this way with urban and soil-with-stubble mixed signatures.
Regards,
Soledad
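Soledad's mask idea can be sketched with numpy: classify forest on an off-season date (here reduced to a hypothetical NDVI threshold, since forest stays green while fields are bare) and blank those pixels before segmenting the in-season image. The 0.6 threshold and the toy band values are purely illustrative:

```python
import numpy as np

def forest_mask(nir, red, ndvi_threshold=0.6):
    """Boolean mask of 'forest' pixels on an off-season scene.

    On a date when crops are undeveloped, persistently high NDVI is
    read here as forest; the threshold is illustrative only and should
    be replaced by a proper off-season classification.
    """
    ndvi = (nir - red) / (nir + red + 1e-9)   # eps avoids divide-by-zero
    return ndvi > ndvi_threshold

# off-season bands (toy reflectances)
nir = np.array([[0.50, 0.45], [0.20, 0.50]])
red = np.array([[0.05, 0.30], [0.15, 0.04]])
mask = forest_mask(nir, red)

# blank forest pixels in the in-season image before segmentation
season = np.array([[0.9, 0.8], [0.7, 0.6]])
masked = np.where(mask, np.nan, season)
```

In practice the mask would come from a supervised off-season classification rather than a single NDVI cut, but the masking step itself is exactly this `np.where`.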
  • asked a question related to GIS Analysis
Question
6 answers
SAR data show decorrelation over highly vegetated terrain. Has any of the latest software overcome this shortcoming?
Relevant answer
Answer
Hi Swati,
Decorrelation in highly vegetated terrain is to be expected, and IMHO the choice of software will not have a significant impact on the results; the processing method and the properties of the input data matter more.
SNAP by ESA is freely available software capable of interferometric processing; GAMMA is (very) expensive but has everything one can ask for; there are also others, e.g. ERDAS IMAGINE.
  • asked a question related to GIS Analysis
Question
2 answers
Hi,
I am creating stream orders for morphometric analysis.
What threshold should we write in the "Con" tool when creating the stream raster (for example, "Value > 100")?
As is known, this affects all subsequent calculations, since it affects the number of Strahler stream orders.
How can I determine a threshold for the "Con" tool that is suitable for my basin?
Best wishes, Ahmet.
Relevant answer
Answer
You first need to decide the stream order at which to delineate the network; then, based on that order, determine the threshold value.
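The Con expression itself is just a threshold on flow accumulation, so a practical way to choose it is to extract the network at several candidate thresholds and compare against mapped streams; a commonly cited starting heuristic is around 1% of the maximum flow accumulation. A numpy sketch, with a toy raster and placeholder thresholds:

```python
import numpy as np

def extract_streams(flow_acc, threshold):
    """Equivalent of ArcGIS Con("flow_acc" > threshold, 1) as a boolean raster."""
    return flow_acc > threshold

# toy flow accumulation raster (cell counts)
flow_acc = np.array([[2,  80,  300],
                     [5, 150,  900],
                     [1,  40, 2500]])

# denser network at low thresholds, sparser at high ones
counts = {t: int(extract_streams(flow_acc, t).sum()) for t in (50, 100, 500)}
```

Raising the threshold prunes the lowest Strahler orders first, which is why every downstream morphometric statistic shifts with it; iterating until the extracted network matches a mapped reference is the usual calibration.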
  • asked a question related to GIS Analysis
Question
5 answers
1.Mean Bifurcation Ratio (Rb)
2.Drainage Density
3.Stream Frequency
4. Form Factor
5.Stream length Ratio (RL)
I am able to find values for the above using certain formulae. But how do I know whether a value is low or high, and what does it signify?
Relevant answer
Answer
I see another scientific shortcut looming, purporting to be the well-known "philosopher's stone" of geomorphology and basin analysis. It is similar to the approach taken in geochemistry, where a plethora of indices and ratios pretend to substitute for experience and knowledge in geology and mineralogy prior to the use of endless numerical datasets.
In the current issue there will be no bypassing geoscientific anamnesis and visual examination (mapping and sampling leading to 2- and 3-D visualization, as well as follow-up laboratory analysis).
Numerical analysis encompassing hydrography, such as sinuosity of channel systems and slope angle, must be supported by hydrodynamic investigations (GMS tool = granulometry, morphology, situmetry), all of which is grouped under the header "geoscientific measurements, analysis, and computations". It needs to be corroborated by measurements in structural geology and by indoor analysis of the mineralogy, petrography and chemistry, as well as geochronology using methods appropriate for the Quaternary and Neogene in particular. Any stand-alone synthetic geomorphometric analysis will yield synthetic results.
To substantiate my arguments I refer to the following papers dealing with this holistic approach in basin analysis and geomorphology.
DILL, H.G., BUZATU A., BALABAN S.-I. , UFER K., GÓMEZ TAPIAS J., BÎRGĂOANU D. and CRAMER T. (2020) The “badland trilogy” of the Desierto de la Tatacoa, Upper Magdalena Valley, Colombia, a result of geodynamics and climate: With a review of badland landscapes.- Catena 194: 1-20.
DILL, H.G., BUZATU A., BALABAN S.-I. , UFER K., TECHMER A. SCHEDLINSKY, W., and FÜSSL M., (2020) The transition of very coarse-grained meandering to straight fluvial drainage systems in a tectonized foreland-basement landscape during the Holocene (SE Germany) – A joint geomorphological-geological study.- Geomorphology (in print)
Available and ready for download on the RG server.
H.G.Dill
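For reference alongside the caveats above, the standard definitions behind the indices in the question (after Horton and Schumm) are easy to write out; the numbers below are purely illustrative and must be replaced with your own basin's measurements:

```python
# illustrative basin measurements (km, km^2) -- replace with your own
area = 120.0               # basin area A
basin_length = 18.0        # basin length Lb
total_stream_len = 260.0   # sum of all stream segment lengths
n_streams = 95             # total number of stream segments

drainage_density = total_stream_len / area      # Dd = sum(L) / A
stream_frequency = n_streams / area             # Fs = N / A
form_factor = area / basin_length ** 2          # Ff = A / Lb^2

# bifurcation ratio per order pair, Rb = N_u / N_(u+1)
streams_per_order = {1: 70, 2: 18, 3: 5, 4: 2}
rbs = [streams_per_order[u] / streams_per_order[u + 1]
       for u in range(1, max(streams_per_order))]
mean_rb = sum(rbs) / len(rbs)
```

As commonly cited in the drainage-morphometry literature: a mean Rb of roughly 3 to 5 suggests geological structure is not strongly distorting the network; higher drainage density points to less permeable, more dissected terrain; and a low form factor indicates an elongated basin, which tends toward flatter, longer flood peaks. The stream length ratio (RL) follows the same per-order ratio pattern, using mean stream lengths per order instead of counts.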
  • asked a question related to GIS Analysis
Question
22 answers
Dear all,
I know it might also depend on the distribution/behavior of the variable we are studying; the sample spacing must be able to capture the spatial dependence.
But since kriging depends heavily on the variance computed within each lag distance, with few observations we might fail to capture the spatial dependence, because we would have few pairs of points within a specific lag distance and few lags overall. Especially when the points are very irregularly distributed across the study area, with many observations in one region and sparse observations in another, this will also affect the accuracy of the variance estimated per lag.
Therefore, I think that in such circumstances computing a semivariogram seems useless. What are the best practices if we still want to use kriging instead of other interpolation methods?
Thank you in advance
PS
Relevant answer
Answer
You need to separate two questions, first there is the number and spatial pattern of the data locations used in estimating and modeling the variogram. Secondly there is the number and spatial pattern of the data locations used in applying the kriging estimator/interpolator. These are two entirely different problems. The system of equations used to determine the coefficients in the kriging estimator only requires ONE data location but the results will not be very useful or reliable. Now you must decide whether to use a "unique" search neighborhood to determine the data locations used in the kriging equations or a "moving" neighborhood. Most geostatistical software will use a "moving" neighborhood, if you use a moving neighborhood then about 25 data locations is adequate, using more may result in negative weights and larger kriging variances. Depending on the total number of data locations and the spatial pattern there may be interpolation locations where there are less than 25 data locations. Using a "unique" search neighborhood will likely result in a very large coefficient matrix to invert.
With respect to estimating and modeling the variogram you must first consider how you are going to do this. Usually this will include computing empirical/experimental variograms but for a given data set the empirical variogram is NOT unique. It will depend on various choices made by the user such as the maximum lag distance, the width of the lag classes and whether it is directional or omnidirectional. An empirical variogram does not directly determine the variogram model type, e.g. spherical, gaussian, exponential, etc. It also does not directly determine the model parameters such as sill, range.
Silva's question may seem like a reasonable one to ask but it does NOT have a simple answer. Asking it implies a lack of understanding about geostatistics and kriging.
1991, Myers,D.E., On Variogram Estimation in Proceedings of the First Inter. Conf. Stat. Comp., Cesme, Turkey, 30 Mar.-2 April 1987, Vol II, American Sciences Press, 261-281
  • 1987, A. Warrick and D.E. Myers, Optimization of Sampling Locations for Variogram Calculations Water Resources Research 23, 496-500
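Myers' point that the empirical variogram is not unique for a given dataset is easy to demonstrate: the same points binned with different lag widths yield different curves, and sparsely populated bins are exactly the unreliable ones. A minimal Matheron-estimator sketch in numpy with made-up irregular sampling:

```python
import numpy as np

def empirical_variogram(coords, values, lag_width, n_lags):
    """Matheron estimator: gamma(h) = mean(squared difference) / 2 per lag bin.

    Returns (bin centers, gamma, pair counts); bins with few pairs are
    precisely the unreliable estimates discussed above.
    """
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)        # unique point pairs
    dists, sqdiff = d[iu], sq[iu]
    centers, gamma, counts = [], [], []
    for k in range(n_lags):
        lo, hi = k * lag_width, (k + 1) * lag_width
        sel = (dists > lo) & (dists <= hi)
        centers.append((lo + hi) / 2.0)
        counts.append(int(sel.sum()))
        gamma.append(sqdiff[sel].mean() / 2.0 if sel.any() else np.nan)
    return np.array(centers), np.array(gamma), np.array(counts)

rng = np.random.default_rng(1)
coords = rng.random((60, 2)) * 100            # irregular sample locations
values = np.sin(coords[:, 0] / 20) + 0.1 * rng.standard_normal(60)

c1, g1, n1 = empirical_variogram(coords, values, lag_width=10, n_lags=5)
c2, g2, n2 = empirical_variogram(coords, values, lag_width=25, n_lags=2)
# same data, different lag widths -> different empirical variograms
```

Inspecting the pair counts per bin before fitting a model (and widening bins, or using omnidirectional lags, where counts are low) is the practical answer to the irregular-sampling problem in the question.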
  • asked a question related to GIS Analysis
Question
6 answers
I need to write MATLAB code that can process a GIS image in order to extract the coordinates of the grid points within the red region (R) that are at least a distance "d" from its boundary. Each point in R is given a weight w1 (attached figure). The same procedure is to be applied to the green region (G), but with w2 as the weight of any point in G. The gathered data are saved in a matrix formed of three rows: row 1 contains the abscissae, row 2 the ordinates, and row 3 the weights.
I am looking forward to getting your suggestions...thanks in advance.
Relevant answer
Answer
im = imread('GIS.jpg');                    % load the GIS map image
% imshow(im);                              % uncomment to inspect the input
BW = im2single(im);                        % imsegkmeans expects floating point
numclust = 5;                              % number of colour clusters (tune to the map)
[L, Centers] = imsegkmeans(BW, numclust);  % MATLAB built-in k-means segmentation
temp_mask = (L == 1);                      % pick the cluster index matching the red region
figure, imshow(temp_mask);                 % inspect which cluster is which
might help:)
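If a Python route is also an option, the same idea can be sketched with numpy and scipy: threshold the image by colour, keep only pixels at least d pixels from each region's boundary via a Euclidean distance transform, and stack coordinates with their weights into a 3-row matrix. The synthetic image, the colour thresholds, and the weights w1 = 1.0 and w2 = 2.0 below are hypothetical placeholders to be replaced with values from the real image:

```python
import numpy as np
from scipy import ndimage

def region_points(mask, d, weight):
    # Keep only pixels at least d pixels from the region boundary,
    # using the Euclidean distance transform of the binary mask.
    dist = ndimage.distance_transform_edt(mask)
    rows, cols = np.nonzero(dist >= d)
    # 3-row matrix: row 1 = abscissa (column), row 2 = ordinate (row), row 3 = weight
    return np.vstack([cols, rows, np.full(rows.size, weight)])

# Synthetic stand-in for the GIS image (replace with imageio.imread(...)):
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[10:40, 10:40] = (255, 0, 0)    # "red" region R
img[60:90, 60:90] = (0, 255, 0)    # "green" region G

# Hypothetical RGB thresholds; tune these to the actual image colours
red_mask = (img[..., 0] > 200) & (img[..., 1] < 50) & (img[..., 2] < 50)
green_mask = (img[..., 1] > 200) & (img[..., 0] < 50) & (img[..., 2] < 50)

M = np.hstack([region_points(red_mask, d=3, weight=1.0),
               region_points(green_mask, d=3, weight=2.0)])
```

The same distance-transform trick works in MATLAB via `bwdist` on the complement of the mask, so the two implementations are interchangeable.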
  • asked a question related to GIS Analysis
Question
3 answers
Hi
I am working with the eCognition Developer software. Recently, I used it for UAV imagery segmentation. The image is from the fall season and has 4 bands (Green, Red, Red Edge and NIR).
My purpose is to classify tree species in dense forest, but the classification does not work well, because each tree is classified into several classes. My opinion is that the segmentation is not done well. I have tried a lot to improve the segmentation, but I have not achieved acceptable results.
Do you have an opinion on this?
Thank you all dear ones
Relevant answer
Answer
Can you post a sample image to see resolution, contrast, etc. and look at image quality?
  • asked a question related to GIS Analysis
Question
6 answers
I have been trying to download bands 10 and 11 of Landsat 8 for a GIS analysis; however, the images downloaded from both GloVis and EarthExplorer have been resampled to a 30 m spatial resolution. Does anyone know how I can get the 100 m spatial resolution TIR bands for my analysis?
Relevant answer
Answer
Hi Ibrahim,
to get Landsat 8 TIRS bands 10 and 11 you can follow the steps below:
1- Go to the USGS EarthExplorer website
2- Create your account
3- Enter your search criteria (place and date range)
4- Click on the Datasets tab and expand Landsat
5- Once you expand Landsat, click on the + sign to choose Landsat Collection 1
6- Expand Landsat Collection 1 and click on Landsat Collection 1 Level-1
7- Once Landsat Collection 1 Level-1 is expanded, check the box for Landsat 8 OLI/TIRS C1 Level-1
8- Optionally, click on Additional Criteria to narrow down your search (e.g. cloud coverage percentage)
9- Click on the Results tab to see your search results
10- Download the data to your working folder and unzip it to extract bands 10 and 11.
I hope that could help. Best of luck with your research.
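Once band 10 is extracted, a common next step is converting its digital numbers to at-sensor brightness temperature using the radiance rescaling and thermal constants from the scene's MTL metadata file. A minimal numpy sketch; the constants below are the typical band-10 values, used here only as placeholders, so always read them from your own scene's MTL file:

```python
import numpy as np

# Typical band-10 constants from an MTL metadata file (placeholders):
ML, AL = 3.342e-4, 0.1            # RADIANCE_MULT_BAND_10, RADIANCE_ADD_BAND_10
K1, K2 = 774.8853, 1321.0789      # K1_CONSTANT_BAND_10, K2_CONSTANT_BAND_10

def brightness_temperature(dn):
    # DN -> TOA spectral radiance -> at-sensor brightness temperature (Kelvin)
    radiance = ML * np.asarray(dn, dtype=float) + AL
    return K2 / np.log(K1 / radiance + 1.0)

# Example on a few raw digital numbers; for a real scene, pass the whole
# band-10 array read from the GeoTIFF.
bt = brightness_temperature(np.array([20000, 30000, 40000]))
```

The same formula applies to band 11 with its own ML/AL/K1/K2 values.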
  • asked a question related to GIS Analysis
Question
1 answer
Dear all!
I would like to invite you to take the survey "NSDI FOR YOU AND YOUR COUNTRY":
The results of this survey will help me research issues in the implementation of NSDI and get an overview of this policy.
I will be very grateful for your help, cooperation and sharing!
Relevant answer
Answer
Indeed, good start
  • asked a question related to GIS Analysis
Question
12 answers
I am seeking the best current methods and datasets (highest possible resolution) for defining and assessing global land degradation - ideally with a time series. I know there are different ways of exploring this e.g. biomass, productivity, land use/cover etc., but I would appreciate any thoughts on current modelling, datasets/resources and novel approaches.
I am also interested in the best methods for quantitatively mapping/modelling land restoration (biophysical) on a global scale, and if possible, historic land reconstruction.
Thanks!
Relevant answer
Answer
Hi Jake,
It is a daunting task and not very easy to answer. We have been struggling with it for a while now. From our experience it comes back to two basic challenges: the first is how you define land degradation (see our recent publication below), and the second is how to obtain the proper datasets. In our research we are focusing on creating a time series of a range of soil properties. We are currently aiming at a time series of soil conditions for the UN Statistics Division in the context of their national accounting.
Cheers Jetse
Sterk, G., Stoorvogel, J.J., 2020. Desertification–Scientific Versus Political Realities. Land 2020, 9, 156; doi:10.3390/land9060156.
Stoorvogel, J.J., Bakkenes, M., ten Brink, B.J.E., and Temme, A.J.A.M., 2017. To what extent did we change our soils? A global comparison of natural and current conditions. Land Degrad. Develop., doi: 10.1002/ldr.2721.
Stoorvogel, J. J., Bakkenes, M., Temme, A.J.A.M., Batjes, N.H., and ten Brink, B.J.E., 2017. S-World: A Global Soil Map for Environmental Modelling. Land Degrad. Develop., 28: 22–33. doi: 10.1002/ldr.2656.
  • asked a question related to GIS Analysis
Question
3 answers
For my Masters I am looking into elephant space-use and its effect on vegetation change. I have elephant GPS collar data and was going to use Google Earth Engine to obtain NDVI values. I would like to establish elephant space-use across a fenced area in the form of either areas most frequently occupied (at the pixel level) or pixels the most time was spent in. I would then like to use that data to determine whether any vegetation changes (recent NDVI as a percentage of long term average NDVI) are linked to the elephant space-use (i.e. to determine whether vegetation change is elephant-mediated). I have looked at methods such as UD and BBMM but was wondering if there were more suggestions out there? Ultimately the method I use for space-use needs to have an output I can use to compare it to the NDVI values. Thanks in advance!
Relevant answer
Answer
The "tlocoh" package for R might be of interest to you. With this package you can analyse the space use of animals, including the revisit rate of specific areas by individuals, based on GPS data. Furthermore, you can calculate the amount of time spent during each visit. This information could then be linked with your NDVI data.
You will find all information about this package here:
I hope this helps!
Best regards
Paul
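As a simple alternative (or complement) to a dedicated package, time spent per pixel can be approximated directly from the GPS fixes by assigning each inter-fix interval to the pixel the fix falls in. A minimal numpy sketch; the track coordinates (metres), times (hours), and grid parameters below are hypothetical:

```python
import numpy as np

def dwell_time_raster(x, y, t, xmin, ymin, cell, nx, ny):
    # Accumulate time spent per pixel: each fix contributes the interval
    # to the next fix to the pixel it falls in (a crude approximation;
    # finer tracks give better estimates).
    dt = np.diff(t)
    cols = ((x[:-1] - xmin) // cell).astype(int)
    rows = ((y[:-1] - ymin) // cell).astype(int)
    grid = np.zeros((ny, nx))
    inside = (cols >= 0) & (cols < nx) & (rows >= 0) & (rows < ny)
    np.add.at(grid, (rows[inside], cols[inside]), dt[inside])
    return grid

# Hypothetical GPS track: x/y in metres, t in hours
x = np.array([10.0, 12.0, 55.0, 57.0, 58.0])
y = np.array([10.0, 11.0, 40.0, 41.0, 42.0])
t = np.array([0.0, 1.0, 2.0, 5.0, 6.0])
grid = dwell_time_raster(x, y, t, xmin=0, ymin=0, cell=30, nx=3, ny=3)
```

Aligning the grid with the NDVI pixel grid (same origin and cell size) makes the dwell-time raster directly comparable to the NDVI anomaly raster, pixel by pixel.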
  • asked a question related to GIS Analysis
Question
5 answers
I have multiple lines outlining annual glacier extent (over 20 lines per glacier). I would like to calculate the average linear retreat of the glaciers. Is there a tool to calculate such distances in ArcGIS? Or is there a Python script I could use?
Relevant answer
Answer
This resource might be helpful:
It is designed for shoreline change analysis, but applications to other boundary change studies are possible ("The software is also suitable for any generic application that calculates positional change over time, such as assessing change of glacier limits in sequential aerial photos...").
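If a scripted route is preferred, the average retreat between two digitized front lines can be approximated by sampling points along one line and averaging their nearest distances to the other line's segments. A minimal numpy sketch; the two parallel front lines 50 m apart are hypothetical test data:

```python
import numpy as np

def seg_point_dist(p, a, b):
    # Distance from points p (n, 2) to the segment a-b
    ab = b - a
    tt = np.clip(((p - a) @ ab) / (ab @ ab), 0.0, 1.0)
    proj = a + tt[:, None] * ab
    return np.linalg.norm(p - proj, axis=1)

def mean_line_distance(line1, line2, n_samples=100):
    # Sample points evenly along line1 (by arc length) and take, for each,
    # the nearest distance to any segment of line2; average the result.
    seglen = np.linalg.norm(np.diff(line1, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seglen)])
    s = np.linspace(0.0, cum[-1], n_samples)
    pts = np.empty((n_samples, 2))
    for k, sv in enumerate(s):
        i = min(np.searchsorted(cum, sv, side="right") - 1, len(seglen) - 1)
        f = (sv - cum[i]) / seglen[i]
        pts[k] = line1[i] + f * (line1[i + 1] - line1[i])
    d = np.full(n_samples, np.inf)
    for a, b in zip(line2[:-1], line2[1:]):
        d = np.minimum(d, seg_point_dist(pts, a, b))
    return d.mean()

# Two hypothetical glacier front lines 50 m apart:
front_1990 = np.array([[0.0, 0.0], [100.0, 0.0]])
front_2000 = np.array([[0.0, 50.0], [100.0, 50.0]])
retreat = mean_line_distance(front_1990, front_2000)
```

Applied to successive pairs of the 20+ annual outlines (read from shapefiles with e.g. geopandas), this yields a year-by-year average retreat series per glacier.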
  • asked a question related to GIS Analysis
Question
14 answers
Dear Researchers,
I am applying interpolation by the kriging method using GIS, and the interpolation did not cover the whole area under consideration.
How to do this? Please guide me.
Regards
Naveed
Relevant answer
Answer
Most of the comments are wrong or incomplete. First of all, you need to distinguish between a particular software implementation and the underlying theory. The theory of kriging, i.e. geostatistics, is not based on GIS, nor is a software package implementing kriging limited to or necessarily related to GIS, so the original question implies a serious misunderstanding about kriging and GIS.
One of the basic assumptions underlying kriging is either second-order stationarity or intrinsic stationarity. Unfortunately, with only a finite data set it is not possible to test whether either of these assumptions is really valid. There are ways to ascertain whether it is "reasonable" to make one of these assumptions, or rather whether there is evidence that both assumptions are unreasonable; e.g. if the empirical variogram increases at a quadratic or higher order, then intrinsic stationarity is not reasonable.
If you have modeled a variogram with a sill, and hence at least a practical range, then you should be cautious about using kriging at a distance greater than the range from all the data locations. Two things will happen: the kriging variance will be much larger, and the "kriged" value will be essentially just the arithmetic mean of the data values. A particular software implementation of kriging might therefore prevent generating kriged values too far away from all data locations.
Software manuals, particularly those for commercial software, are likely not sufficient to address questions such as these. Instead, one should consult a book on geostatistics, e.g. the one by J.-P. Chiles and P. Delfiner (John Wiley & Sons).
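The behaviour described above (the estimate collapsing toward the arithmetic mean, and the kriging variance growing, once the target lies beyond the range of all data locations) can be demonstrated numerically. A minimal numpy sketch of ordinary kriging with a spherical covariance model on synthetic data; it makes no claim about any particular software's implementation:

```python
import numpy as np

def spherical_cov(h, sill=1.0, rng=100.0):
    # Covariance form of the spherical model: C(h) = sill - gamma(h)
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng, sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3), sill)
    return sill - g

def ordinary_kriging(xy, z, target, sill=1.0, rng=100.0):
    # Solve the ordinary kriging system (covariance form, with a
    # Lagrange multiplier enforcing that the weights sum to 1).
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = spherical_cov(d, sill, rng)
    A[:n, n] = A[n, :n] = 1.0
    b = np.zeros(n + 1)
    b[:n] = spherical_cov(np.linalg.norm(xy - target, axis=1), sill, rng)
    b[n] = 1.0
    sol = np.linalg.solve(A, b)
    w = sol[:n]
    est = w @ z
    kvar = sill - b[:n] @ w - sol[n]   # ordinary kriging variance
    return est, kvar

rng_ = np.random.default_rng(0)
xy = rng_.uniform(0, 50, size=(10, 2))       # data clustered in a 50 x 50 box
z = rng_.normal(10, 2, size=10)
near_est, near_var = ordinary_kriging(xy, z, np.array([25.0, 25.0]))
far_est, far_var = ordinary_kriging(xy, z, np.array([5000.0, 5000.0]))
# Far beyond the range, far_est sits near the arithmetic mean of z and
# far_var exceeds the sill, exactly as described in the answer above.
```

This is why a software package may legitimately refuse to krige locations too far from all data: the output there carries almost no information beyond the sample mean.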
  • asked a question related to GIS Analysis
Question
14 answers
I'm trying to do a pixel-wise correlation comparison of two datasets. Is there any standard procedure or code to perform this?
For example, a correlation coefficient map and an RMSE map of a particular region.
Relevant answer
Answer
Certainly; I wrote up the code this morning. Best of luck! https://www.timothyfraser.com/tutorials-in-r/mapping-raster-data-in-the-tidyverse
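For a library-agnostic illustration, per-pixel Pearson correlation and RMSE maps between two co-registered raster stacks shaped (time, rows, cols) can be computed directly with numpy; the stacks below are synthetic stand-ins for real time series:

```python
import numpy as np

def pixelwise_stats(a, b):
    # a, b: co-registered stacks shaped (time, rows, cols).
    # Returns a per-pixel Pearson correlation map and an RMSE map.
    am = a - a.mean(axis=0)
    bm = b - b.mean(axis=0)
    cov = (am * bm).mean(axis=0)
    denom = am.std(axis=0) * bm.std(axis=0)
    corr = np.where(denom > 0, cov / denom, np.nan)  # NaN where a series is constant
    rmse = np.sqrt(((a - b) ** 2).mean(axis=0))
    return corr, rmse

rng = np.random.default_rng(2)
a = rng.normal(size=(12, 4, 5))                 # e.g. 12 monthly rasters
b = a + rng.normal(scale=0.1, size=a.shape)     # second dataset, slightly perturbed
corr, rmse = pixelwise_stats(a, b)
```

The `corr` and `rmse` arrays keep the spatial shape of the input rasters, so they can be written back out as GeoTIFFs (e.g. with rasterio) using the original georeferencing to produce the two maps.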
  • asked a question related to GIS Analysis
Question
5 answers
I want to extract shorelines using a water mask index, MNDWI, and I decided to use Landsat data for it. There are two kinds of data sets: Collection-1 Level-1 and Collection-1 Level-2 (surface reflectance). Which is better for the analysis, Level-1 or Level-2?
Relevant answer
Answer
Two useful methods:
- NIR as a single band
- NDWI as an index: (G - NIR) / (G + NIR)
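A minimal numpy sketch of the NDWI approach above (MNDWI, as asked in the question, is the same computation with SWIR1 in place of NIR); the reflectance values and the zero threshold are illustrative only:

```python
import numpy as np

def ndwi(green, nir):
    # NDWI (McFeeters 1996): (Green - NIR) / (Green + NIR);
    # swap nir for swir1 to get MNDWI (Xu 2006).
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + 1e-12)   # epsilon avoids 0/0

# Synthetic reflectance values: water is brighter in green than in NIR
green = np.array([[0.10, 0.08], [0.12, 0.05]])
nir = np.array([[0.02, 0.20], [0.03, 0.25]])
water_mask = ndwi(green, nir) > 0.0    # a common default threshold of 0
```

The shoreline is then the boundary of `water_mask`, which can be vectorized (e.g. raster-to-polygon conversion) for comparison across dates.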