OGC - Open Geospatial Consortium
  • Wayland, United States
Recent publications
Recent advances in modelling capabilities and data processing, combined with vastly improved observation tools and networks, have expanded the available weather and climate information: from historical observations to seasonal climate forecasts, decadal climate predictions and multi-decadal climate change projections. However, it remains a key challenge to ensure this information reaches the intended climate-sensitive sectors (e.g. water, energy, agriculture, health) and is fit for purpose, guaranteeing its usability for these downstream users. Climate information can be produced on demand via climate resilience information systems, which exist in various forms. To optimise efficiency and establish better information exchange between these systems, standardisation is necessary. Here, standards and deployment options are described for how scientific methods can be deployed in climate resilience information systems, respecting the principles of being findable, accessible, interoperable and reusable. Besides a general description of OGC API Standards and OGC API Processes based on existing building blocks, ongoing developments in AI-enhanced services for climate services are described.
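A climate indicator exposed through OGC API Processes is invoked by POSTing a JSON execute request to the process's execution endpoint. The sketch below builds such a request body; the service URL, process id and input names are illustrative assumptions, not a real deployment.

```python
import json

# Hypothetical OGC API - Processes deployment of a climate-index process;
# BASE and PROCESS_ID are placeholders, not an actual service.
BASE = "https://example.org/ogcapi"
PROCESS_ID = "heat-wave-frequency"

def execute_request(dataset: str, start: str, end: str) -> dict:
    """Build an OGC API - Processes execute request body (JSON)."""
    return {
        "inputs": {
            "dataset": dataset,
            "time": f"{start}/{end}",
        },
        # Ask the server to return the result inline rather than by reference.
        "outputs": {"result": {"transmissionMode": "value"}},
    }

body = execute_request("era5-reanalysis", "1991-01-01", "2020-12-31")
url = f"{BASE}/processes/{PROCESS_ID}/execution"  # POST target per the standard
print(url)
print(json.dumps(body, indent=2))
```

Because the request is plain JSON against a standardised path layout, the same client code can drive any conformant climate resilience information system by swapping the base URL and process id.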
This scientific review paper aims to challenge the common view of metadata as a necessary evil, something merely mandatory in the data creation and dataset publishing process. Metadata are instead presented as a crucial element for ensuring the findability of data services and repositories. The paper describes a path through four levels of metadata management and publication: from default unstructured data, through schema-based metadata with literal values and/or URIs, towards linked open (meta)data providing explicit linkage between reliable data resources. This research was conducted within the European Union's PoliVisu project. Special attention is given to the following: (1) guidance on publication aimed at the broad audience of search engine users and (2) the publication of geo (meta)data not only via standard technologies, such as the OGC Catalogue Service for the Web and open data portals, but also through leading search engines (which are Schema.org-based).
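The higher metadata levels described above can be illustrated with a Schema.org "Dataset" record in JSON-LD, the kind of markup leading search engines index. All names, URLs and identifiers below are illustrative placeholders, not records from the PoliVisu project.

```python
import json

# Minimal Schema.org Dataset description as JSON-LD.
dataset_jsonld = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Traffic intensity, Pilsen, 2019",
    "description": "Hourly traffic counts aggregated per road segment.",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    # A URI instead of a literal place name links the record to an external
    # authority -- the "linked open (meta)data" level of the four-level path.
    "spatialCoverage": {
        "@type": "Place",
        "sameAs": "https://www.wikidata.org/entity/Q43453",
    },
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "text/csv",
        "contentUrl": "https://example.org/data/traffic-2019.csv",
    },
}

print(json.dumps(dataset_jsonld, indent=2))
```

Embedded in the dataset's landing page as a `<script type="application/ld+json">` block, such a record moves the publication from literal, schema-based metadata towards linked open metadata without abandoning the catalogue-based workflow.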
One of the key challenges towards the realization of smart farming solutions is the lack of interoperability between different systems and platforms in the agri-food sector, especially those offered by different technology providers. In this respect, seamless exchange and integration of the data produced or collected by those systems is of major importance, yet is rarely supported. This is mainly due to the wide heterogeneity of data models and semantics used to represent data in the agri-food domain, the lack of related standards dominating this space, and the lack of sufficient interoperability mechanisms to connect existing agri-food data models. This chapter presents the Agriculture Information Model (AIM), developed by the H2020 DEMETER project to address these issues. AIM has been designed following a layered and modular approach and is realized as a suite of ontologies implemented in line with best practices, reusing existing standards and well-scoped models as much as possible and establishing alignments between them to enable their interoperability and the integration of existing data. AIM is scalable and can easily be extended to address additional needs and incorporate new concepts while maintaining its consistency and compliance. The AIM specification includes a set of guidelines and examples for application developers and other users. It is currently being validated in demanding large-scale pilots across various operational environments within the H2020 DEMETER project, with the aim of providing efficient interoperable solutions to farmers and other stakeholders in the agri-food value chain.
With the increasing amount of publicly available geospatial data, the demand for spatial data exploration and analysis keeps growing. The SIGSPATIAL community is both a provider of new systems with cutting-edge technology for accessing and processing geospatial data, and a user of these systems. The SpatialAPI workshop is designed to help the SIGSPATIAL community by growing its knowledge of the existing, well-established systems available for accessing and processing geospatial data. This includes, but is not limited to, web APIs, programming libraries, database systems, and geospatial extensions to existing systems.
GeoNode is an open source framework designed to build geospatial content management systems (GeoCMS) and spatial data infrastructure (SDI) nodes. Its development was initiated by the Global Facility for Disaster Reduction and Recovery (GFDRR) in 2009 and adopted by a large number of organizations in the following years. Using an open source stack based on mature and robust frameworks and software like Django, OpenLayers, PostGIS, GeoServer and pycsw, an organization can build on top of GeoNode its SDI or geospatial open data portal. GeoNode provides a large number of user friendly capabilities, broad interoperability using Open Geospatial Consortium (OGC) standards, and a powerful authentication/authorization mechanism. Supported by a vast, diverse and global open source community, GeoNode is an official project of the Open Source Geospatial Foundation (OSGeo).
Earth Observation data archives are currently growing at unprecedented speeds. New satellites add petabytes of data every year. At the same time, the amount of data provided by Earth-based in-situ networks is growing at enormous rates. These developments have led to a change in data processing paradigms. Data is no longer downloaded and processed locally; instead, applications are sent to the data. This paper demonstrates a Big Data architecture that allows for interoperable solutions across data providers, integrators, and users. The availability of a mature domain architecture as provided by the Open Geospatial Consortium provides a solid base and allows for uniform microservice handling. The macro- and micro-architecture described herein uses self-contained Docker images to allow for transparent microservices, horizontal scale-out, and high reliability and maintainability thanks to decoupled and self-sustained execution elements.
A spatial data infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide an efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service, which is needed to discover, query and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard, which defines common interfaces for accessing the metadata information. A search engine is a software system capable of supporting fast and reliable search, which may use ‘any means necessary’ to get users to the resources they need quickly and efficiently. These techniques may include full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting, recommendations and many others. In this paper we present an example of a search engine being added to an SDI to improve search across large collections of geospatial datasets. The Center for Geographic Analysis (CGA) at Harvard University re-engineered the search component of its public domain SDI (Harvard WorldMap), which is based on the GeoNode platform. A search engine was added to the SDI stack to enhance the CSW catalogue discovery abilities. It is now possible to discover spatial datasets from metadata by using the standard search operations of the catalogue and to take advantage of the search engine's capabilities to return relevant and reliable content to SDI users.
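The standard catalogue side of such a stack can be queried with a CSW GetRecords request expressed as key-value pairs. The sketch below builds one with a CQL full-text constraint; the catalogue endpoint is a placeholder, not the WorldMap service, and the search-engine layer would add ranking, faceting and fuzzy matching on top of results like these.

```python
from urllib.parse import urlencode

# Placeholder CSW 2.0.2 endpoint (assumption, not a real catalogue).
CSW_ENDPOINT = "https://example.org/catalogue/csw"

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "elementSetName": "brief",       # return brief metadata records
    "resultType": "results",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    # Full-text-style constraint over all metadata fields.
    "constraint": "csw:AnyText LIKE '%land use%'",
    "maxRecords": "10",
}

url = CSW_ENDPOINT + "?" + urlencode(params)
print(url)
```

Issuing an HTTP GET on this URL against a conformant catalogue (e.g. one backed by pycsw) returns an XML GetRecordsResponse, which a client or search-engine indexer can then parse.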
Sharing data has become easier than ever with the advancement of cloud computing and software tools. However, big challenges remain, such as efficient handling of big geospatial data, support for sharing crowdsourced/citizen-science data, integration across semantic heterogeneity, and the inclusion of agile processes for continuous improvement of geospatial technology. This paper discusses the new frontiers regarding these challenges and the related work performed by the Open Geospatial Consortium, the world's leading organization focused on developing open geospatial standards that “geo-enable” the Web, wireless and location-based services, and mainstream IT.
Open standards like OGC standards can be used to improve interoperability and support machine-to-machine interaction over the Web. In the Big Data era, standard-based data and processing services from various vendors can be combined to automate the extraction of information and knowledge from heterogeneous and large volumes of geospatial data. This paper introduces an ongoing OGC China forum initiative, which will demonstrate how OGC standards can benefit the interaction among multiple organizations in China. The ability to share data and processing functions across organizations using standard services could change traditionally manual interactions in their business processes, and provide on-demand decision support results through online service integration. In the initiative, six organizations are involved in two "mashup" scenarios on disaster management: one derives flood maps for Poyang Lake, Jiangxi; the other generates turbidity maps on demand for East Lake, Wuhan. The two scenarios engage different organizations from the Chinese community by integrating their sensor observations, data, and processing services, and improve the automation of the data analysis process using open standards.
A Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide an efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service, needed to discover, query and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard, which defines common interfaces for accessing the metadata information. A search engine is a software system able to perform very fast and reliable searches, with features such as full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting and many others. The Center for Geographic Analysis (CGA) at Harvard University is integrating the benefits of both worlds (OGC catalogues and search engines) within its public domain SDI, named WorldMap. Harvard Hypermap (HHypermap) is a component that will be part of WorldMap, built entirely on an open source stack, implementing an OGC catalogue based on pycsw to provide access to metadata in a standard way, and a search engine based on Solr/Lucene to provide the advanced search features typically found in search engines.
The necessity of open standards for effective sharing and use of remote sensing data continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for collaboration between geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of some of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research and industrial organizations, continues to advance the state of geospatial standards for full use of photogrammetry and remote sensing.
Standardization is a major part of the current Industrie 4.0 activities. Dealing with spatial data, i.e., the location of assets and products, is required in Industrie 4.0 to support use cases such as indoor navigation for service technicians. In this article we analyze how standards for spatial data and services defined by the Open Geospatial Consortium (OGC) can be used within Industrie 4.0.
The OGC Interoperability Program is a source of innovation in the development of open standards. Its approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identifying interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed launched the program in 1999. OGC standards that originated in the Interoperability Program form the basis of two thirds of the certified compliant products.
Warning: This document is not an OGC Standard. This document is an OGC White Paper and is therefore not an official position of the OGC membership. It is distributed for review and comment. It is subject to change without notice and may not be referred to as an OGC Standard. Further, an OGC White Paper should not be referenced as required or mandatory technology in procurements.