Article · PDF available

Cartographic data harmonisation for a cross-border project development

Abstract and Figures

An essential support for environmental monitoring activities is a rigorous definition of a homogeneous cartographic system, which is required to correctly georeference and analyse the acquired data. Furthermore, since 2007 the Infrastructure for Spatial Information in the European Community (INSPIRE) Directive has affirmed the need to harmonise European maps in order to permit cross-border analyses. To satisfy these requirements, the authors have developed a procedure for cartographic harmonisation in the cross-border area studied during the European project Alpes Latines-Coopération Transfrontalière (ALCOTRA)–Alpes Latines-Individuation Resources Hydriques Souterraines (ALIRHyS). The project concerns the hydrogeological study of various springs and other water resources in an area spanning Italy and France, and their constitution into a cross-border system. The basic cartographic information is obtained from existing national (Italian and French) maps, which use different coordinate systems and projection methods and are produced from different data acquisitions and processes. In this paper, the authors describe the methods used to obtain well-harmonised middle-scale maps (aerial orthophotos, digital terrain models and digital maps). The processing has been performed using geographic information system (GIS) solutions and image-analysis software in order to obtain correct and useful cartographic support for the monitoring data, although the resulting maps could be further analysed or refined in future work.
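Bringing coordinates from different national datums into one system typically passes through a datum shift such as a seven-parameter Helmert transformation on geocentric coordinates. The sketch below is a minimal, illustrative implementation of the small-angle, position-vector convention; the parameter values in real workflows come from official national transformations, and production pipelines use tested libraries such as PROJ rather than hand-rolled code.

```python
from math import radians

def helmert7(x, y, z, tx, ty, tz, rx_as, ry_as, rz_as, ds_ppm):
    """Small-angle 7-parameter Helmert transformation (position-vector
    convention) on geocentric coordinates in metres.

    tx, ty, tz       : translations in metres
    rx_as..rz_as     : rotations in arc-seconds
    ds_ppm           : scale change in parts per million
    """
    # Convert arc-seconds to radians and ppm to a scale factor.
    rx = radians(rx_as / 3600.0)
    ry = radians(ry_as / 3600.0)
    rz = radians(rz_as / 3600.0)
    s = 1.0 + ds_ppm * 1e-6
    # Small-angle rotation matrix applied with scale, then translation.
    x2 = tx + s * (x - rz * y + ry * z)
    y2 = ty + s * (rz * x + y - rx * z)
    z2 = tz + s * (-ry * x + rx * y + z)
    return x2, y2, z2

# With all parameters zero the transformation is the identity.
p = helmert7(4000000.0, 1000000.0, 4500000.0, 0, 0, 0, 0, 0, 0, 0)
```

In practice the geographic coordinates would first be converted to geocentric (X, Y, Z) using the source ellipsoid, shifted with official parameters, and converted back on the target ellipsoid.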
... Drawing on the results of recent studies on the integration of geoinformation and BIM (GeoBIM) and on past data integration theories and experiences (Noardo, 2020a,b; Ellul, 2020; Biljecki, Tauscher, 2019; Arroyo Ohori et al., 2018; Noardo et al., 2016; Ulubay, Altan, 2002; Laurini, 1998; Laurini, Thompson, 1992; Kavouras, Kokla, 2007), the complex issue of spatial data integration is analysed in its components and a reference workflow is proposed. ...
... Many steps were therefore performed manually or by means of existing tools. Other similar cases were proposed, e.g., by Noardo et al. (2016) and van Heerden (2021). ...
... The semantic paradigm must be compatible with the data requirements. For example, Noardo et al. (2016) report differences in how road classifications are filled in Italian and French digital maps: roads are classified as 'paved'/'unpaved' in the Italian maps but according to a hierarchy of functions in the French maps. In such a case, it is hard to infer or calculate the values from the available data, and a third source of information is likely necessary. ...
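The kind of attribute mapping described above can be sketched as simple lookup tables from each national classification onto a common target value. The attribute names and code lists below are purely illustrative assumptions; they are not the actual Italian, French or INSPIRE code lists.

```python
# Hypothetical mappings from national road attributes onto a common
# surface classification (values invented for illustration).
ITALIAN_SURFACE_TO_COMMON = {
    "paved": "surfaced",
    "unpaved": "unsurfaced",
}
FRENCH_FUNCTION_TO_COMMON = {
    "autoroute": "surfaced",         # motorways assumed surfaced
    "route principale": "surfaced",  # main roads assumed surfaced
    "chemin": "unsurfaced",          # rural tracks assumed unsurfaced
}

def harmonise_surface(record):
    """Map a national road record onto the common classification,
    returning 'unknown' when no value can be inferred — the case where,
    as noted above, a third source of information would be needed."""
    if record.get("country") == "IT":
        return ITALIAN_SURFACE_TO_COMMON.get(record.get("surface"), "unknown")
    if record.get("country") == "FR":
        return FRENCH_FUNCTION_TO_COMMON.get(record.get("fonction"), "unknown")
    return "unknown"
```

The explicit "unknown" fallback keeps the gaps visible in the harmonised dataset instead of silently guessing a value.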
Preprint
Full-text available
Big opportunities are offered by the reuse and integration of data, which are nowadays more and more available thanks to advances in acquisition and modelling technologies and to the open data paradigm. Seamlessly integrating data from heterogeneous sources has long been an interest of the geospatial community. However, higher semantic and geometrical complexity poses new challenges that have never been tackled in a comprehensive methodology. Building on previous theories and studies, this paper proposes an overarching methodology for multisource spatial data integration. Starting from the definition of use-case-based requirements for the integrated data, it proposes a framework to analyse the involved datasets with respect to integrability and suggests actions to harmonise them towards the destination model. The overall workflow is explained, including the data merging phase and validation. The methodology is tested and exemplified on a case study. Considering the specific datasets' features and parameters, this approach will allow the development of consistent, well-documented and inclusive data integration workflows, for the sake of use-case process automation and the production of interoperable and reusable data.
... As usual, a conceptual schema for modelling the data is used, but particular care was taken in choosing entities useful for the specific needs of the study and appropriate for the two reference ontologies: the SWEET ontology and the INSPIRE UML model. The resulting conceptual model is shown in Figure 2. It shows the entities present in the INSPIRE conceptual model (in blue), which have been used mainly for the harmonisation of digital maps (Noardo et al., 2015), and the entities extracted from the SWEET ontology (bordered in red), which are used to manage the remaining concepts present in the system. These, together with some additional entities added for the needs of the project, are in turn divided into spatial entities (in yellow), dynamic data tables (in violet) and geoprocessing products (in pink). ...
... As a first step in building a geographic database for the project map, GIS tools were used to harmonize the available national cartographic products by exploiting both their geoprocessing capabilities and their database characteristics (as explained in more detail in Noardo et al., 2015). In this paper we focus on the part of the harmonisation process concerning the digital maps. ...
... The national databases were analysed in order to extract a simplified version of the conceptual model structuring the maps, considering only entities useful for our representation needs. The next step was the mapping of each entity into the selected part of the INSPIRE model, using a transformation to make the data homogeneous (Noardo et al., 2015). ...
Conference Paper
Full-text available
The great potential of GIS to manage and analyse georeferenced information is well known. The development of ICT (Information and Communication Technologies) over the last several years has made interoperability a necessity, from which Semantic Web standards and domain ontologies are derived. Ontologies for a specific application field are often insufficient for representing the information of multidisciplinary projects. Moreover, they are often aimed at the representation of homogeneous data formats (alphanumeric data, vector spatial data, raster spatial data, etc.). In this scenario, traditional GIS often have a limit: they implement custom data models, which are very difficult to exchange across different systems. In this study we structured a GIS for the monitoring project ALCOTRA ALIRHYS according to parts of two different self-integrated ontologies, with a view to greater interoperability of the system and the sharing of data through a web-GIS platform. The two standard models chosen (the SWEET ontology and the INSPIRE UML model) have been integrated into a unique conceptual model useful both for geometric and cartographic data and for thematic information. The implemented schemas are published on the project website and are available to other users who want to produce similar studies. Since user-friendly results were desirable, some widespread integrated commercial software programs have been used, even though their abilities to manage such a GIS are suboptimal.
... Standardisation and harmonisation will increase confidence in the comparability of measures by minimising potential errors from heterogeneity in data collection procedures, software or methodological approaches; and are particularly important in projects between two countries where definitions, symbology and projection in source cartographic data will vary. 24 While both teams have access to a rich history of local data and built environment measure development, where feasible, we will use open data (eg, OpenStreetMap) and open-source software, such as QGIS, SQL, Python or R, to reduce barriers to reproducing this research in other contexts, thus increasing the potential impact. Methods and code will be made available in future publications, and variations in terminology will be mapped out in a typology with definitions. ...
Article
Full-text available
Introduction: Childhood obesity and physical inactivity are two of the most significant modifiable risk factors for the prevention of non-communicable diseases (NCDs). Yet, a third of children in Wales and Australia are overweight or obese, and only 20% of UK and Australian children are sufficiently active. The purpose of the Built Environments And Child Health in WalEs and AuStralia (BEACHES) study is to identify and understand how complex and interacting factors in the built environment influence modifiable risk factors for NCDs across childhood. Methods and analysis: This is an observational study using data from five established cohorts from Wales and Australia: (1) Wales Electronic Cohort for Children; (2) Millennium Cohort Study; (3) PLAY Spaces and Environments for Children's Physical Activity study; (4) The ORIGINS Project; and (5) Growing Up in Australia: the Longitudinal Study of Australian Children. The study will incorporate a comprehensive suite of longitudinal quantitative data (surveys, anthropometry, accelerometry, and Geographic Information Systems data) to understand how the built environment influences children's modifiable risk factors for NCDs (body mass index, physical activity, sedentary behaviour and diet). Ethics and dissemination: This study has received the following approvals: University of Western Australia Human Research Ethics Committee (2020/ET000353), Ramsay Human Research Ethics Committee (under review) and Swansea University Information Governance Review Panel (Project ID: 1001). Findings will be reported to the following: (1) funding bodies, research institutes and hospitals supporting the BEACHES project; (2) parents and children; (3) school management teams; (4) existing and new industry partner networks; (5) federal, state and local governments to inform policy; as well as (6) presented at local, national and international conferences; and (7) disseminated by peer-reviewed publications.
... The general objective of the project is to strengthen cross-border institutional cooperation between public authorities and spatial planning operators, and to implement shared solutions for coordination and effective land management. It builds on previous and complementary work in the European context, from projects such as Plan4All (PLAN4ALL; Camarata et al., 2011), Habitats (HABITATS, 2018), Humboldt (Čerba et al., 2008; Fichtinger et al., 2011), HLANDATA (Goñi, 2011) and ALCOTRA (Noardo et al., 2016), which already treated data harmonization needs for spatial management in the context of the INSPIRE Directive (European Parliament, 2007). ...
Article
Full-text available
HARMO-DATA is an ongoing project, funded by the EU in the framework of the INTERREG V-A Italy-Slovenia 2014-2020 Programme. It involves different stakeholders, target groups and end-users in three regions: Friuli-Venezia-Giulia (Italy), Veneto (Italy) and Slovenia. The main purpose of the project is to develop common solutions for more efficient cross-border spatial data management by harmonizing the existing spatial data, implementing a cross-border spatial data platform, and developing a common protocol for the harmonization of territorial data. It will provide an instrument to define the specific obligations and rights of the involved parties in terms of data harmonization, exchange, use and maintenance. Five pilot case studies were identified by the project partners, in cooperation with public and private end-users and additional stakeholders. The core use cases of the project relate to spatial data search, view and download, and the harmonization model for spatial datasets applies the INSPIRE data specifications. A joint common spatial data platform was established as an extension of the existing search-view-download platforms (metadata systems), upgraded and improved to better enable open data access by users from both Italy and Slovenia. The common HARMO-DATA spatial data platform, as well as a joint protocol for cross-border spatial data harmonization, have been formalized in an official bilateral agreement.
... One of the major and challenging issues in SDI development is cartographic data harmonization [295], at either the national or the multinational level. Environmental monitoring data are usually produced by several organizations (possibly with non-uniform standardization protocols) using different devices, which may produce various file formats and/or data characterized by different ranges of spatiotemporal accuracy, so the process of integrating them is quite important. ...
Article
Full-text available
Human activities and climate change constitute the contemporary catalyst for natural processes and their impacts, i.e., geo-environmental hazards. Globally, natural catastrophic phenomena and hazards, such as drought, soil erosion, quantitative and qualitative degradation of groundwater, frost, flooding, sea level rise, etc., are intensified by anthropogenic factors. Thus, they show a rapid increase in intensity, frequency of occurrence and spatial density, and a significant spread of the areas of occurrence. The impact of these phenomena is devastating to human life and to global economies, private holdings, infrastructure, etc., while in a wider context it has a very negative effect on the social, environmental and economic status of the affected region. Geospatial technologies, including Geographic Information Systems and Remote Sensing-Earth Observation, as well as related spatial data analysis tools, models and databases, nowadays contribute significantly to predicting, preventing, researching, addressing, rehabilitating and managing these phenomena and their effects. This review attempts to mark the most devastating geo-hazards from the view of environmental monitoring, covering the state of the art in the use of geospatial technologies in that respect. It also defines the main challenge of this new era, which is the effective exploitation of the information produced by environmental monitoring so that the necessary policies are adopted in the direction of a sustainable future. The review highlights the potential and increasing added value of geographic information as a means to support environmental monitoring in the face of climate change. The growth in geographic information seems to be rapidly accelerating due to technological and scientific developments that will continue with exponential progress in the years to come. Nonetheless, as this review also highlights, continuous monitoring of the environment requires an interdisciplinary approach and comprises a range of actions that cover both the development of natural phenomena and their catastrophic effects, mostly due to climate change.
... This process was later enhanced by developing specific algorithms to detect duplicates, using the centroid coordinates, area size, etc. This is a very common practice when using cartography obtained from different data acquisitions and processes [15]. ...
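The duplicate-detection idea quoted above can be sketched as a pairwise test on centroid distance and area similarity. The thresholds, attribute names and the naive O(n²) scan below are illustrative assumptions, not the cited authors' actual algorithm; a real dataset would use a spatial index to prune the candidate pairs.

```python
from math import hypot

def likely_duplicate(a, b, max_dist_m=5.0, max_area_rel_diff=0.10):
    """Flag two footprints as probable duplicates when their centroids lie
    within max_dist_m metres and their areas differ by at most
    max_area_rel_diff (relative). a and b are dicts with keys 'cx', 'cy'
    (centroid coordinates, metres) and 'area' (square metres)."""
    if hypot(a["cx"] - b["cx"], a["cy"] - b["cy"]) > max_dist_m:
        return False
    rel_diff = abs(a["area"] - b["area"]) / max(a["area"], b["area"])
    return rel_diff <= max_area_rel_diff

def find_duplicates(features):
    """Return index pairs of probable duplicates (naive pairwise scan)."""
    return [
        (i, j)
        for i in range(len(features))
        for j in range(i + 1, len(features))
        if likely_duplicate(features[i], features[j])
    ]
```

Flagged pairs would then be reviewed or merged, keeping the record from the more authoritative source.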
Article
Currently, it is fairly widespread to use smartphones or tablets in field surveys to collect and geolocate damage data. However, geolocation is not a straightforward process and may give inaccurate results, for example when the object to be surveyed is relatively small or the coverage of the satellite constellation (e.g. GPS) is inadequate due to obstacles and shadows in urban areas. Moreover, the pressure that surveyors and technicians are under during and after the impact of a natural hazard may make the whole geolocation process even more difficult. In this paper, we describe a methodology to overcome the issue of inaccurate records in five damage data surveys collected after the 7.8 magnitude earthquake that struck the coast of Ecuador in April 2016, together with the three administrative sources used to interpret the damage. We started by homogenizing the various states of damage as charted in field and aerial surveys, including satellite imagery. We then resolved geolocation inaccuracies using a set of algorithms that take into account the spatial context and the size of the building. These algorithms also flag the quality of the sources, so as ultimately to compute a picture of the spatial distribution of the damage suffered by residential buildings, together with the harm done to productive and social infrastructure. Without these preliminary steps, the geolocation inaccuracies of the damage data surveys would not have allowed an adequate and detailed risk assessment.
Conference Paper
Currently, useful data for earthquake risk assessment are available even in socioeconomically deprived areas. The problem is that those data need extensive preprocessing before they can be used for risk analyses. Although this precludes their use for quick response in emergency situations, detailed studies can be conducted when ample time is allowed. An important issue in making these data useful is the support and participation of all institutions managing relevant data, because their improvement depends on our capacity to combine and interrelate a variety of data sources; here, imagination and data processing capacity are important skills. After the April 16th, 2016 earthquake, a detailed study was carried out in Portoviejo (Ecuador). Among other aims, this study sought to understand the earthquake's impact on Portoviejo's building inventory. This work focuses on the methodologies developed to prepare the data for risk analysis. These methodologies, in addition to ad-hoc corrections, include algorithms to a) correct the geographical coordinates related to the damage assessment, b) obtain the exposure and reposition costs for economic losses, and c) characterize the buildings (their plan shape and regularity, level of isolation from other buildings, and orientation). The procedures were applied mostly in the urban area, where roughly 80,000 structures are present; they were conducted at the individual level, namely building by building, and took one and a half years. Data processing applications were developed in ANSI C, combined with scripts in Python and R, which allows the results to be recalculated very efficiently. The software also computes basic statistics, used to describe the pre-earthquake city and to correlate the damage with different cadastral, morphometric and on-site variables. The study also covers the damage assessment and damage scenario calculations.
Article
The reuse and integration of data offer big opportunities, supported by the FAIR data principles. Seamless data integration from heterogeneous sources has long been an interest of the geospatial community. However, 3D city models, building information models and information supporting smart cities present higher semantic and geometrical complexity, posing new challenges never tackled in a comprehensive methodology. Building on previous theories and studies, this article proposes an overarching workflow and framework for multisource (geo)spatial data integration. Starting from the definition of use-case-based requirements for the integrated data, it guides the analysis of the integrability of the involved datasets and suggests actions to harmonize them, through to data merging and validation. It is finally tested and exemplified in a case study. This approach allows the development of consistent, well-documented and inclusive data integration workflows, for the sake of use case automation in various geospatial domains and the production of interoperable and reusable data.
Book
Full-text available
This publication is the result of a collaboration between the following organisations: Politecnico di Torino, Polytech Nice-Sophia, Regione Piemonte and Métropole Nice Côte-d'Azur, for the development of the ALIRHyS project within the ALCOTRA 2007-2013 cross-border cooperation programme. The synergistic contribution of all the partners to the realisation of the project made it possible to deepen and develop the themes proposed by strategic objective 2, whose priority axis is the protection and management of the territory. In particular, the ALIRHyS project aims at the knowledge and management of the groundwater resources that feed the numerous springs at the origin of the hydrographic network developing across the Italian and French territories. The discharges of these watercourses are closely influenced by snowmelt, precipitation and the many springs that guarantee a substantial water flow even after long periods of drought; some of these springs are tapped for drinking water. In recent decades these springs have been particularly exposed to natural risks: extreme climatic conditions (periods of drought and floods) have increased significantly under the influence of ongoing climate change. The potential damage arising from these risks to water resources can affect the development of the areas involved in the research programme. European projects are an important opportunity to develop and attract financial resources to the territory; we therefore invite readers to draw inspiration from them to give rise to new synergies and proposals, in order to ensure continuity and further develop this field of activity.
Conference Paper
Full-text available
The traditional workflows in geography and cartography have been redefined by a change in the production paradigm. The focus has shifted from single-purpose data collection to information management, and to the establishment of spatial data infrastructures (SDI) at local, national, regional and global levels. The concept for establishing the European SDI emerged from the needs of the environmental domain, where GIS provide an efficient framework for data processing together with an effective communication tool for representing information. The INSPIRE Directive of the European Commission, which has been agreed upon by the European Parliament and the Council, will enforce better and wider use of the data and interoperability between the systems operated by the Member States. In order to bring the initiative to success, the provisions for the implementation will be based on the consensus of the participants. Five working groups, called Drafting Teams, are working on the aspects of metadata, data harmonisation, network services, data sharing and implementation monitoring. How do tradition and emerging technology interact in the case of SDI and cartography? By its nature, SDI is a much wider notion. Nevertheless, the experience of cartography directly contributes, amongst others, to the following aspects of SDI:
1. Spatial cognition, data harmonisation. Although SDIs are usually defined within a service-oriented architecture, certain data harmonisation work is usually needed, which takes place at the semantic, schema, data and information product levels. Cartography has accumulated experience in describing and classifying the Universe with consistent models and in coherently channelling information to the users.
2. Level of detail, scale and data quality. Experience in coupling different levels of aggregation with reasonable spatial resolution (scale) and meaningful data quality requirements is another asset of cartography.
3. Multiple representation, data consistency. A natural requirement in SDI is that objects represented at different levels of detail are consistent. Multiple representation is widely researched and practised in digital cartography.
4. Portrayal. Portrayal plays an important role in discovery and viewing services, but also in the communication of spatially enabled information. A clear legend, and adaptive zooming with the appropriate multiple-representation and generalisation capabilities in the background, greatly facilitate this task.
The presentation and the full paper will explain how the above fields may contribute to SDI building at the European level, based on the requirements of the INSPIRE Directive.
Conference Paper
Full-text available
Introduction: In Poland, spatial planning is currently in a transformation phase, driven both by local legal changes and by participation in various international projects, e.g., INSPIRE and Plan4All. The popularisation of GIS-ICT solutions for spatial plans is also very important. One of the key issues in Polish planning is the need to transition from analog to digital data. Moreover, it is necessary to standardise a feature catalogue for spatial plans and determine how to directly move graphic symbols from the study to the local plan, e.g. boundaries of parks, forests or protection zones. The imprecision of these elements very often prevented the enactment of a plan, since the curve from a study was not precisely reflected. Another problem is the availability of the reference data that are essential to complete the local plans. In practice, it happens that a commune which has undertaken to draw up a plan does not have reference data with a relevant and objective scope, which leads to a lower design standard. A further issue is the lack of availability of local plans on the Internet, on the official websites of communes, and the lack of widespread use of ICT solutions in the field of social participation in the planning process.
Article
Full-text available
The increase in the number of satellites and the use of digital cameras in aerial photography have spread the use of satellite images and oriented aerial photographs as near-real-time, accessible, cost-effective spatial data. Co-registered images and aerial photos corrected for height variations and orthogonality (scale) have become an essential input for geographical information systems and spatial decision making, due to their integration with other spatial data. Beyond that, images and photographs form the infrastructure for other information in the use of spatial data, with the help of the access and query facilities the web provides. Although the problem of aerial photo ortho-rectification was solved long ago, problems related to the storage of huge numbers of photos and images, their management, processing and user access have arisen. These subjects concern numerous private and governmental institutions. Some governmental organisations and private companies have recently gained the technical ability to perform this work, which has led to a significant increase in the amount of aerial photography taken and processed in one year for the whole country. The General Command of Mapping has been using a digital aerial camera for photography since 2008. The total area covered by the satellite images purchased for different purposes, and by the aerial photographs taken for revision purposes or on demand from governmental and private institutions, has reached 200,000 km². It is estimated that coloured, high-resolution orthophotos of the whole country can be achieved within four years, provided that annual production continues at a similar level. From the numbers given above, it is clear that the orthophoto production procedure must be improved in order to produce orthophotos in the same year as the photography. Studies on the storage, management and presentation of the huge amounts of orthophoto imagery to users must be started immediately. In this study, metadata components of the produced orthophotos, compatible with international standards, have been defined; a relational database has been created to keep complete and accurate metadata; and a user interface has been developed to insert the metadata into the database. Through the developed software, extra time is saved when creating and querying the metadata.
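A relational metadata store of the kind described can be sketched with the Python standard library. The column set below is an illustrative subset loosely modelled on common ISO 19115-style fields (acquisition date, CRS, ground sample distance, footprint), not the schema actually used by the cited study.

```python
import sqlite3

# In-memory database for illustration; a file path would be used in practice.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE orthophoto_metadata (
           id               INTEGER PRIMARY KEY,
           title            TEXT NOT NULL,
           acquisition_date TEXT,      -- ISO 8601 date string
           epsg_code        INTEGER,   -- coordinate reference system
           gsd_m            REAL,      -- ground sample distance, metres
           footprint_wkt    TEXT       -- bounding geometry as WKT
       )"""
)
# Insert one record with parameter binding (avoids SQL injection).
conn.execute(
    "INSERT INTO orthophoto_metadata"
    " (title, acquisition_date, epsg_code, gsd_m) VALUES (?, ?, ?, ?)",
    ("tile_0420", "2008-06-15", 32636, 0.30),
)
# Query back by CRS to locate matching tiles.
title, gsd = conn.execute(
    "SELECT title, gsd_m FROM orthophoto_metadata WHERE epsg_code = 32636"
).fetchone()
```

Indexing columns such as `epsg_code` and `acquisition_date` would keep metadata queries fast as the archive grows.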
Article
Full-text available
Laser scanners have increased their efficiency exponentially compared with the state of the art of ten years ago. More data can be acquired, and higher accuracy achieved, over longer ranges thanks to advancements in sensor technology. The goal of this review is to present the state of the art of terrestrial and aerial laser scanner surveys, with a critical discussion of quality, which is a very important aspect of high-resolution topography.
Article
Full-text available
The project GiMoDig (Geospatial Info-Mobility Service by Real-Time Data Integration and Generalisation) started in November 2001 and is funded by the European Union. Over a duration of three years, this EU project aims to develop methods for the harmonisation, generalisation and visualisation of national topographic data sets for mobile users in real time. The project partners are the Finnish Geodetic Institute as project coordinator, the National Mapping Agencies (NMAs) of Denmark, Finland, Sweden and Germany, and the Institute of Cartography and Geoinformatics of the University of Hanover. One of the tasks in the GiMoDig project is to define a Global Schema for the core national topographic data sets. For this purpose, an inventory of the national databases was prepared to list the differences in data availability and data modelling. Based on that inventory, a selection of feature types suitable for Location Based Services (LBS) was made. The idea was to use the least common denominator as the selection criterion, but this subset already lacks some important feature types. Therefore, all features that are supported by a majority of the national data sets are integrated into the Global Schema. The Global Schema is defined with a detailed description of feature type, attributes, collection criteria and geometry type. All the information necessary to transform the topographic data from the national schemas into the Global Schema is given as harmonisation operations. Further scrutiny of test data led to improvement and adaptation of the Global Schema.
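The two selection rules described (least common denominator versus majority support) reduce to plain set operations over the national feature inventories. The feature lists below are invented for illustration and do not reproduce the actual GiMoDig inventory.

```python
from collections import Counter

# Invented inventory: which feature types each national data set supports.
national_features = {
    "FI": {"road", "building", "lake", "railway", "contour"},
    "DE": {"road", "building", "lake", "railway"},
    "DK": {"road", "building", "lake"},
    "SE": {"road", "building", "lake", "railway", "contour"},
}

# Least common denominator: feature types present in every data set.
common = set.intersection(*national_features.values())

# Majority rule: feature types supported by more than half of the data sets.
counts = Counter(f for feats in national_features.values() for f in feats)
majority = {f for f, c in counts.items() if c > len(national_features) / 2}
```

With this invented inventory, the intersection drops "railway" even though three of the four data sets support it, which illustrates why the project moved from the least common denominator to the majority rule.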
Article
Full-text available
Natural fires are an integral part of Mediterranean ecosystems. However, in Europe, the extensive use of natural and forest regions as recreational areas has increased the number of human-caused fires. Additionally, the decrease in rural population and the abandonment of agricultural regions have led to the build-up of fire fuels in these areas and a consequent increase in the number of forest fires and the damage they cause. On average, fires in Europe burn 0.5 million hectares of forest areas every year. Although the number of fires has been steadily increasing over the last decade, the area burnt by fires has not increased: the increase in the number of fires has been accompanied by a decrease in the mean burnt area, due to improvements in infrastructure and the means to extinguish them. Traditionally, fire fighting was carried out by local administrations, which extinguished fires in the surrounding areas. However, as limited and expensive resources became available, these were often administered by national forest fire services. This scaling factor has evolved with time, and national forest fire risk maps are currently available in many European countries. However, the regional (supra-national) evaluation of forest fire risk was not tackled until recently by the European Union, probably owing to the lack of regional datasets for the estimation of forest fire risk and the lack of regional information on forest fires, which would necessarily be used for the calibration and validation of fire risk indices. The European Commission (EC) Civil Protection Unit, at the EC Directorate for Environment, aware of the strong impact of forest fires in Europe, established a research group at the EC Joint Research Centre (JRC) to work specifically on developing and implementing methods for the evaluation of forest fire risk at the European scale. This group has been working closely with the national forest fire administrations towards the development of a European Forest Fire Information System (EFFIS) that provides up-to-date and harmonised information to all the services in charge of forest fire prevention and management. EFFIS, which is available at http://natural-hazards.jrc.it/fires, includes a system for the delivery of forest fire risk forecasts during the peak of the fire campaign (6 months), and the yearly evaluation of fire damage through the analysis of satellite imagery.
Article
IDE-OTALEX is the first cross-border spatial data infrastructure between the contiguous Portuguese (Alentejo and Centro) and Spanish (Extremadura) regions. It was implemented to share official geographic information from Alentejo and Extremadura, and now also from the Centro region, with everyone. This is the most effective way to obtain a distributed and flexible system that can serve as a territorial observatory for sustainable development and environmental protection in these rural and sparsely populated regions. It also contributes to territorial cohesion, one of the three main pillars of European Cohesion Policy. The infrastructure is a distributed, decentralised, modular and collaborative system, based on OGC (Open Geospatial Consortium), W3C (World Wide Web Consortium) and ISO (International Organization for Standardization) standards and on open-source technology, developed to guarantee interoperability between the different GIS (Geographic Information System) platforms provided by each project partner. The geoportal is multilingual (Portuguese, Spanish and English) and integrates a map viewer, a metadata catalogue and a gazetteer. It consists of central and local nodes which communicate through WMS (Web Map Service), CSW (Catalogue Service for the Web) and WFS (Web Feature Service); SOS (Sensor Observation Service) and WPS (Web Processing Service) are now being implemented. The geographic information available results from an extensive data harmonisation effort adapted to the INSPIRE Directive (D 2007/2/EC, the European Parliament and Council, March 14, 2007). It integrates basic cartography and socio-economic, territorial and environmental indicators.
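The node-to-node communication relies on standard, parameterised OGC requests. A minimal sketch of composing a WMS 1.3.0 GetMap request is shown below; the endpoint URL, layer name and bounding box are invented placeholders, not the actual IDE-OTALEX service:

```python
# Sketch: building a standard WMS 1.3.0 GetMap request URL, the kind of
# interoperable call the central and local nodes exchange.
from urllib.parse import urlencode

BASE = "https://example.org/otalex/ows"  # placeholder endpoint, not real


def wms_getmap(layers, bbox, size=(800, 600), crs="EPSG:4326"):
    """layers: list of layer names; bbox: (miny, minx, maxy, maxx) for EPSG:4326."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{BASE}?{urlencode(params)}"


# Hypothetical layer and extent roughly covering the OTALEX area:
url = wms_getmap(["landuse"], (37.9, -7.9, 40.5, -4.6))
print(url)
```

Because every node speaks the same request grammar, a client can query any partner's GIS without knowing its internal data model, which is precisely what makes the distributed architecture workable.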
Conference Paper
One of the open problems from the geodetic point of view today is the determination of a geometric (rather than physical) reference system that allows the harmonisation of altimetric reference systems at continental scale. To address this problem at the European level, one of the goals of the IAG Inter-Commission Project 1.2 is to provide the fundamentals for the installation of a unified European vertical reference frame. The EVRS (European Vertical Reference System) is a zero-tidal, gravity-related height reference system based on normal heights. In practice, however, orthometric heights are the ones commonly used and understood, whereas normal heights are not; the goal is therefore to move from the former to the latter. In order to obtain correct normal heights, the first step is to have correct orthometric heights. This is not always the case: if the orthometric corrections are not applied to the heights determined by high-precision levelling, a correct unification of the reference systems is not possible. Within the HELIDEM (HELvetia-Italy Digital Elevation Model) project, one of whose aims is to unify the reference systems between Italy and Switzerland for the regions involved in the project, it was decided to analyse whether the orthometric corrections obtained for this area from a global model of geoid undulation (EGM2008) were sufficient to correct the heights determined by static GNSS surveys carried out by the Politecnico di Torino and the Istituto Geografico Militare (IGM). To achieve these objectives, it was necessary to perform static GNSS campaigns on benchmarks of the Italian geodetic reference network that were subsequently measured by high-precision levelling. The considered benchmarks are located in the Piedmont area and were chosen so that the height difference between two adjacent vertices was limited, mainly to reduce the height errors resulting from the static GNSS measurements.
Both dynamic and orthometric corrections have been computed in order to obtain geopotential numbers, normal heights and orthometric heights from the raw levelling increments. To complete the analysis and make it more significant, some measured gravity values were also considered for the estimation of the orthometric corrections, in order to compare these results with those previously described. This analysis showed that the measured gravity values agree well with those obtained from EGM2008 up to 800-1000 metres of altitude, since the orthometric corrections obtained in the two cases agree quite well (the differences are of the order of centimetres).
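The relation between a geopotential number and the two height types discussed above can be sketched numerically. This is an illustrative toy computation, not the paper's actual processing: the geopotential number and both mean gravity values below are invented constants, whereas in the study they derive from levelling increments and from EGM2008 or measured gravity:

```python
# Illustrative sketch: converting a geopotential number C into an
# orthometric height (mean actual gravity along the plumb line) and a
# normal height (mean normal gravity). All numeric values are invented.


def orthometric_height(C, g_mean):
    """H = C / g_mean, with g_mean the mean actual gravity [m s^-2]."""
    return C / g_mean


def normal_height(C, gamma_mean):
    """H* = C / gamma_mean, with gamma_mean the mean normal gravity [m s^-2]."""
    return C / gamma_mean


C = 9810.0            # geopotential number [m^2 s^-2], invented value
g_mean = 9.8100       # assumed mean actual gravity [m s^-2]
gamma_mean = 9.8050   # assumed mean normal gravity [m s^-2]

H = orthometric_height(C, g_mean)        # approx. 1000.0 m
H_star = normal_height(C, gamma_mean)    # approx. 1000.51 m
print(f"orthometric: {H:.3f} m, normal: {H_star:.3f} m")
```

Even this toy example shows why the choice of gravity matters: a difference of 0.005 m s^-2 in the mean gravity shifts a 1000 m height by about half a metre, so centimetre-level agreement of the orthometric corrections, as found in the analysis above, is a meaningful result.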