Article

Abstract

The developing cyberinfrastructure affects the knowledge system by which geological surveys collect, represent and communicate their knowledge, and thereby influences their view of the geology. Consequences for four interacting aspects of the overall ...


... Knowledge capture and utilization provide for the smart discovery, integration, indexing, collection, and utilization of vast amounts of data, processing components, and other tools available in a GCI. Research on the Semantic Web (Brodaric, Fox, & McGuinness, 2009) includes research on semantic understanding (such as Embley, 2004), knowledge-based functions (Reynolds & Zhu, 2001), ontology-based dynamic query rewriting for the semantic translation of data, and other learning processes (Shimbo & Ishida, 2003). Semantic interoperability is an important prerequisite to information integration across domains, where vocabulary differences may be common. ...
... Semantic Web and knowledge sharing is an essential ingredient to cross-domain collaborations, interdisciplinary discoveries (Berners-Lee, Hendler, & Lassila, 2001;Brodaric et al., 2009), and the life cycle from data to knowledge. For example, meaning-based data integration forms a basis for Web 3.0 heterogeneous data sharing (Lassila & Hendler, 2007), and the semantic, knowledge, or cross-cultural sharing of resources forms a basis for cross-domain studies (Lightfoot, Bachrach, Abrams, Kielman, & Weiss, 2009). ...
... Utilizing the Semantic Web to support building knowledge and semantics into the next generation of scientific tools will support smart processing of geospatial metadata, data, information, knowledge, and services for virtual communities and multiple scientific domains (Hendler, 2003). How to capture, represent, and integrate knowledge within and across geospatial domains remains an ongoing challenge (Brodaric et al., 2009). Venkatasubramanian (2009a, 2009b) found that, in a data-rich world, we must find a way to automatically utilize knowledge acquired in the past to facilitate the automatic identification, utilization, and integration of datasets into operational systems. ...
Article
A Cyberinfrastructure (CI) is a combination of data resources, network protocols, computing platforms, and computational services that brings people, information, and computational tools together to perform science or other data-rich applications in this information-driven world. Most science domains adopt intrinsic geospatial principles (such as spatial constraints in phenomena evolution) for large amounts of geospatial data processing (such as geospatial analysis, feature relationship calculations, geospatial modeling, geovisualization, and geospatial decision support). Geospatial CI (GCI) refers to CI that utilizes geospatial principles and geospatial information to transform how research, development, and education are conducted within and across science domains (such as the environmental and Earth sciences). GCI is based on recent advancements in geographic information science, information technology, computer networks, sensor networks, Web computing, CI, and e-research/e-science. This paper reviews the research, development, education, and other efforts that have contributed to building GCI in terms of its history, objectives, architecture, supporting technologies, functions, application communities, and future research directions. Similar to how GIS transformed the procedures for geospatial sciences, GCI provides significant improvements to how the sciences that need geospatial information will advance. The evolution of GCI will produce platforms for geospatial science domains and communities to better conduct research and development and to better collect data, access data, analyze data, model and simulate phenomena, visualize data and information, and produce knowledge. 
To achieve these transformative objectives, collaborative research and federated developments are needed for the following reasons: (1) to address social heterogeneity to identify geospatial problems encountered by relevant sciences and applications, (2) to analyze data for information flows and processing needed to solve the identified problems, (3) to utilize Semantic Web to support building knowledge and semantics into future GCI tools, (4) to develop geospatial middleware to provide functional and intermediate services and support service evolution for stakeholders, (5) to advance citizen-based sciences to reflect the fact that cyberspace is open to the public and citizen participation will be essential, (6) to advance GCI to geospatial cloud computing to implement the transparent and opaque platforms required for addressing fundamental science questions and application problems, and (7) to develop a research and development agenda that addresses these needs with good federation and collaboration across GCI communities, such as government agencies, non-government organizations, industries, academia, and the public.
... WSTs are used widely in the geospatial domain, expanding the traditional focus on discovery and access to geospatial data. Geospatial WSTs are designed to integrate, edit and store a large amount of geospatial information and their corresponding metadata, promoting collaborative working and fulfilling users' requests [41]. However, our working framework does not include a volume of information large enough to implement a web service whose architecture is defined to support complex data infrastructures. ...
Article
Full-text available
The continuous development of machine learning procedures and the development of new ways of mapping based on the integration of spatial data from heterogeneous sources have resulted in the automation of many processes associated with cartographic production, such as positional accuracy assessment (PAA). The automation of the PAA of spatial data is based on automated matching procedures between corresponding spatial objects (usually building polygons) from two geospatial databases (GDB), which in turn are related to the quantification of the similarity between these objects. Therefore, assessing the capabilities of these automated matching procedures is key to making automation a fully operational solution in PAA processes. The present study has been developed in response to the need to explore the scope of these capabilities by means of a comparison with human capabilities. Thus, using a genetic algorithm (GA) and a group of human experts, two experiments have been carried out: (i) comparing the similarity values between building polygons assigned by both, and (ii) comparing the matching procedure developed in both cases. The results obtained showed that the GA-experts agreement was very high, with a mean agreement percentage of 93.3% (for experiment 1) and 98.8% (for experiment 2). These results confirm the capability of machine-based procedures, and specifically of GAs, to carry out matching tasks.
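The abstract does not spell out the similarity measure used; one common family of measures for matched building footprints is area-overlap ratios. A minimal sketch under that assumption, restricted to axis-aligned rectangles for brevity (real building polygons require general polygon clipping, and all function names here are illustrative, not the paper's implementation):

```python
def rect_area(r):
    # r = (xmin, ymin, xmax, ymax)
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def overlap_similarity(a, b):
    """Intersection-over-union of two axis-aligned rectangles.

    A stand-in for the polygon similarity quantified by a matching
    procedure: 1.0 for identical footprints, 0.0 for disjoint ones.
    """
    ix = (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))
    inter = rect_area(ix) if ix[0] < ix[2] and ix[1] < ix[3] else 0.0
    union = rect_area(a) + rect_area(b) - inter
    return inter / union if union else 0.0
```

A matcher would then pair each polygon in one GDB with the candidate in the other GDB that maximizes this score, accepting pairs above a threshold.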
... Some of the augmented data produced by the library are not (yet) explicitly required by any of the packages but are useful datasets for contextual regional analysis and can provide some guidance for studies unrelated to 3D modelling. A partner project led by the Geological Survey of Canada is developing a Knowledge Manager to support higher-level information as a geoscience ontology to provide conceptual frameworks for modelling, aggregated petrophysical data and other basic knowledge of relevance to 3D modelling workflows (Brodaric et al., 2009;Ma and Fox, 2013). The outputs of map2loop and map2model described above provide all of the information required to build 3D geological models in GemPy (de la Varga et al., 2019) and LoopStructural (Grose et al., 2021). ...
Article
Full-text available
At a regional scale, the best predictor for the 3D geology of the near-subsurface is often the information contained in a geological map. One challenge we face is the difficulty in reproducibly preparing input data for 3D geological models. We present two libraries (map2loop and map2model) that automatically combine the information available in digital geological maps with conceptual information, including assumptions regarding the subsurface extent of faults and plutons to provide sufficient constraints to build a prototype 3D geological model. The information stored in a map falls into three categories of geometric data: positional data, such as the position of faults, intrusive, and stratigraphic contacts; gradient data, such as the dips of contacts or faults; and topological data, such as the age relationships of faults and stratigraphic units or their spatial adjacency relationships. This automation provides significant advantages: it reduces the time to first prototype models; it clearly separates the data, concepts, and interpretations; and provides a homogenous pathway to sensitivity analysis, uncertainty quantification, and value of information studies that require stochastic simulations, and thus the automation of the 3D modelling workflow from data extraction through to model construction. We use the example of the folded and faulted Hamersley Basin in Western Australia to demonstrate a complete workflow from data extraction to 3D modelling using two different open-source 3D modelling engines: GemPy and LoopStructural.
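The three categories of geometric data named above (positional, gradient, topological) can be mirrored in a simple container; a hedged sketch in which the class, field names, and toy age-ordering routine are invented for illustration and are not map2loop's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class MapData:
    """Deconstructed geological-map content, split into the three
    categories of geometric data: positional, gradient, topological."""
    positional: list = field(default_factory=list)   # (feature, x, y)
    gradient: list = field(default_factory=list)     # (feature, dip, dip_dir)
    topological: list = field(default_factory=list)  # (older_unit, younger_unit)

    def age_order(self):
        """Units sorted oldest-first from pairwise age relations,
        via a simple topological sort over (older, younger) pairs."""
        units = {u for pair in self.topological for u in pair}
        order, remaining = [], set(self.topological)
        while units:
            # units never appearing as 'younger' among remaining relations are oldest
            younger = {y for (_, y) in remaining}
            oldest = units - younger
            if not oldest:  # cyclic age relations: give up
                break
            order.extend(sorted(oldest))
            units -= oldest
            remaining = {(a, b) for (a, b) in remaining if a in units}
        return order
```

This is the kind of derived ordering (a stratigraphic column prototype) that topological data alone can constrain before any 3D geometry is built.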
... The client component is connected with a GIS client and the databases used by applications, which are currently under development. The necessity and role of ontologies in the geophysical sciences have been demonstrated in the papers [5][6][7][8][9]. In this paper we describe new steps in the development of the 'Climate+' platform, namely applications of different types and an ontology that refer to the three layers (Data, Metadata (Information), and Ontology (Knowledge)) of the data processing services of the VRE under development. ...
Conference Paper
Full-text available
Two types of applied tasks used in the thematic virtual research environment (VRE) based on the "Climate+" platform are considered. Tasks of both types use significant amounts of climatic or meteorological data. The first type comprises applied tasks whose solutions quantitatively describe the climate of a chosen territory; these are computed on-line and mapped using GIS technologies. The second type comprises tasks used for decision making. These, along with the computational component, include tools for expert selection of the initial conditions for the tasks, tools for determining the semantic homogeneity of the physical quantities used in the calculations, and software for forming the A-box of the knowledge base of a decision support system (DSS). Several first-type tasks are presented, together with a second-type task concerning the change in the depth of the active soil layer in the northern regions of Western and Eastern Siberia (the interfluve of the Ob and Yenisei rivers) over a period of 60 years. The solution of this task and the structure of a typical ontology individual used for the decision making are presented. The role of the ontology description of solutions of applied tasks in the VRE based on the "Climate+" platform is discussed.
... The important prerequisite for such integration is an ontology description, since each domain type has its own knowledge representation, and the integration of these can be controlled on the basis of their ontology descriptions. The role and necessity of ontology usage in the geophysical sciences have been demonstrated in the papers of Athanasis et al. (2009), Bogdanović, Stanimirović & Stoimenov (2015), Brodaric, Fox & McGuinness (2009), Husain et al. (2011) and Lutz et al. (2009). ...
Article
Full-text available
This paper describes a Virtual Research Environment (VRE) based on the web GIS platform ‘Climate+’, which provides access to analytic instruments for processing 19 collections of meteorological and climate data from several international organizations. This environment provides systematization of spatial data and related climate information and allows a user to obtain analysis results using geoinformation technologies. The ontology approach to this systematization is described, making it possible to match the semantics of meteorological and climate parameters presented in different collections and used in solving various applied problems.
... Nevertheless, owing to the rapid growth in the number of spatial datasets describing the environment, there is a shortage of software tools that offer a user-friendly interface and solve the problem of searching for, integrating, and jointly analysing diverse geophysical and socioeconomic data [4]. Integrating web services and applications with software tools for handling observational and modelling data [4,5] is not a new approach. It has already been implemented, to varying degrees, for the subject domains of climatology [6], hydrology [7], etc., that is, for cases where GIS technologies are part of scientific research. ...
... We can address the existing limitations in measuring, monitoring, and characterizing health care diffusion-with breast imaging as a timely example, given new technologies, such as digital breast tomosynthesis (DBT) [14][15][16], and legislation related to breast density notification [17]. With a geospatial semantic web [18][19][20][21][22], which combines web mining techniques with geographic information systems and census data, one can ascertain geographic uptake of DBT nationally, estimate potential access overall and by population subgroups, and identify correlates of dissemination patterns. ...
... (Hey et al. 2009) Science has now fully entered this new mode of operation, which combines science, informatics, computer science, cyberinfrastructure and information technology. It has been six years since the special issue Geoscience Knowledge Representation in Cyberinfrastructure (Brodaric et al. 2009) appeared in the journal Computers & Geosciences. In the ensuing years e-Science has changed how science disciplines conduct both individual and collaborative work. ...
Article
Full-text available
Jim Gray described e-Science as where “IT meets scientists.” (Hey et al., 2009) Science has now fully entered this new mode of operation, which combines science, informatics, computer science, cyberinfrastructure and information technology. It has been six years since the special issue Geoscience Knowledge Representation in Cyberinfrastructure (Brodaric et al., 2009) appeared in the journal Computers & Geosciences. In the ensuing years e-Science has changed how science disciplines conduct both individual and collaborative work. It is time to once again review the state of e-Science research. A special session was held at the American Geophysical Union (AGU) 2013 Fall Meeting. This special session, titled Semantically Enabling Annotation, Discovery, Access, and Integration of Scientific Data, hosted 25 presentations on current e-Science projects. We initiated this special issue by sending invitations to authors in the 2009 Computers & Geosciences special issue as well as the 2013 AGU presenters. Submission to this special issue was also open to everyone, and we were happy to have received manuscripts from authors across the world. This finalized special issue consists of 11 papers, which cover various subjects in Earth and environmental sciences, and demonstrate state-of-the-art technologies in knowledge representation, data interoperability, vocabulary and data services, and data processing.
... The geospatial Semantic Web is the third-generation Web in which the browser, crawler, and other tools understand spatial content and can exploit this knowledge on-the-fly [97]. Enabling technologies that provide technological support functions, such as collecting data through observations and utilising knowledge through a semantic web [46], are essential ingredients to cross-domain collaborations and interdisciplinary discoveries [98,99]. Technology may provide a vehicle for reaching out to communities and institutions that hold crucial information capable of informing decisions and drawing new stakeholders into the MPA planning process [75]. ...
Article
Full-text available
Stakeholder participation has received increased attention as a key process for enhancing mitigation of conflicts between different interests for the same resources and transparent decision-making in marine protected areas (MPAs). A wide range of advanced web tools is available nowadays that integrate stakeholder participation by generating new information and allow interaction between actors in MPA management. However, such technologies are frequently used without much consideration of the complexity of the decision to be made or of the heterogeneity of stakeholder preferences and understanding in relation to these technologies. In order to understand how technology corresponds to the changing needs of MPA management, we have reviewed a range of different participation strategies adopted by web technology, based on a set of criteria that define a successful participation approach. We start from simple and move towards more sophisticated tools that have been developed worldwide in order to better inform decisions and contribute to more effective and efficient MPA management. Finally, we draw a theoretical framework for the development of a community-based web tool with the capacity to incorporate the philosophy of stakeholder participation by generating new and high-quality information flow for effective MPA management.
... The way the concepts of SDI and Cyberinfrastructure complement each other is perceived in different ways by various authors. Thurston (2007) sees Cyberinfrastructures as hardware pools that support SDIs, while Brodaric, Fox, & McGuinness (2009) consider SDIs a specific case of Cyberinfrastructures dealing with spatial information. Georgiadou (2008) sees Cyberinfrastructures as a key driver for SDI innovation (p. ...
Article
User-generated content, interoperability and the social dimension are the cornerstones of an emerging paradigm for the creation and sharing of information: Web 2.0. This article studies how geoportals can benefit from the Web 2.0 features. Geoportals are World Wide Web gateways that organize content and services related to geographic information. They are the most visible part of Spatial Data Infrastructures (i.e. distributed systems that aid acquisition, processing, distribution, use, maintenance, and preservation of spatial data). Today’s geoportals are focusing on interoperability through the implementation of standards for discovery and use of geographic data and services. Will tomorrow’s Geoportals focus more on organising communities of users sharing common interests? Recent papers are arguing for deeper integration of the Web 2.0 paradigm within the geospatial web. This article aims to provide an overview supporting the next generation geoportal development by defining related concepts, by emphasising advantages and caveats of such an approach, and proposing appropriate implementation strategies.
... By selecting reliable services based on service performance and then integrating them seamlessly, it is possible to solve complicated problems (Nikola and Miroslaw 2004). Many initiatives, such as geospatial cyberinfrastructure (GCI; Brodaric et al. 2009, Wang and Liu 2009) and Digital Earth (Yang et al. 2008), aim to utilize interoperable OWSs and other services to foster the integration of heterogeneous geospatial information, Web-based mapping, and geo-analytical services across the Internet (Zhang et al. 2006) to support the geospatial sciences. Besides OWS, another popular Web service is the Tiled Map Service (TMS), which provides a simple HTTP interface to serve tiled maps of geo-referenced data. ...
Article
Full-text available
OGC Web Services (OWS) are essential building blocks for the national and global spatial data infrastructure (NSDI and GSDI) and the geospatial cyberinfrastructure (GCI). Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), and Catalogue Service for the Web (CSW) have been increasingly adopted to serve scientific data. Interoperable services can facilitate the integration of different scientific applications by searching, finding, and utilizing the large number of scientific data and Web services. However, these services are widely dispersed and hard to find and utilize with acceptable performance. This is especially true when developing a science application that seamlessly integrates multiple geographically dispersed services. Focusing on the integration of distributed OWS resources, we propose a layer-based service-oriented integration framework and relevant optimization technologies to search and utilize relevant resources. Specifically, (1) an AJAX (Asynchronous JavaScript and XML)-based synchronous multi-catalogue search is proposed and utilized to enhance multi-catalogue searching performance; (2) a layer-based search engine with spatial, temporal, and performance criteria is proposed and used for identifying better services; (3) a service capabilities clearinghouse (SCCH) is proposed and developed to address the service issues identified by a statistical experiment. A science application of data correlation analysis is used as an example to demonstrate the performance enhancement of the proposed framework.
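The multi-catalogue search idea, fanning a query out to several catalogues at once and merging the hits, can be sketched with a thread pool standing in for the browser's asynchronous requests. The catalogue contents and function names below are invented; a real client would issue CSW GetRecords requests over HTTP rather than call local functions:

```python
from concurrent.futures import ThreadPoolExecutor

def search_catalogue(catalogue, keyword):
    """Stub for one remote catalogue query: (name, records) -> matching hits."""
    name, records = catalogue
    return [(name, r) for r in records if keyword.lower() in r.lower()]

def multi_catalogue_search(catalogues, keyword):
    """Query all catalogues concurrently and merge their results,
    the thread-based analogue of an AJAX multi-catalogue search."""
    with ThreadPoolExecutor() as pool:
        partials = pool.map(lambda c: search_catalogue(c, keyword), catalogues)
        return [hit for partial in partials for hit in partial]
```

The merged hit list would then be re-ranked by the spatial, temporal, and performance criteria the framework describes.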
... A new information infrastructure, the so-called Cyberinfrastructure (in the United States) or e-Infrastructure (in Europe) [1], is being developed to support the next generation of geoscientific research. The traditional focus on discovery of and access to geospatial data is being expanded primarily to enable scientific research using the Cyberinfrastructure, with its heavy analysis and synthesis demands [2]. Typical activities involve distributed geoprocessing workflows that support information processing and knowledge discovery from vast, heterogeneous data sets. ...
Article
A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
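One concrete form of semantics-enhanced discovery is query expansion over a registered vocabulary: a search for a broad concept also matches records tagged with its narrower terms. A minimal sketch with a toy vocabulary (the terms and structure are invented; a real deployment would draw on ontologies registered through ebRIM extensions):

```python
# Toy broader-to-narrower concept map standing in for a registered ontology.
NARROWER = {
    "water body": ["river", "lake"],
    "river": ["stream"],
}

def expand(term):
    """Expand a query term with its narrower concepts, transitively,
    so a catalogue search for 'water body' also finds 'stream' records."""
    terms, stack = [], [term]
    while stack:
        t = stack.pop()
        if t not in terms:
            terms.append(t)
            stack.extend(NARROWER.get(t, []))
    return terms
```

Matching a metadata record then reduces to checking whether any of its keywords fall in the expanded term set.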
Preprint
Full-text available
We present two Python libraries (map2loop and map2model) which combine the observations available in digital geological maps with conceptual information, including assumptions regarding the subsurface extent of faults and plutons to provide sufficient constraints to build a reasonable 3D geological model. At a regional scale, the best predictor for the 3D geology of the near-subsurface is often the information contained in a geological map. This remains true even after recognising that a map is also a model, with all the potential for hidden biases that this model status implies. One challenge we face is the difficulty in reproducibly preparing input data for 3D geological models. The information stored in a map falls into three categories of geometric data: positional data such as the position of faults, intrusive and stratigraphic contacts; gradient data, such as the dips of contacts or faults and topological data, such as the age relationships of faults and stratigraphic units, or their adjacency relationships. This work is being conducted within the Loop Consortium, in which algorithms are being developed that allow automatic deconstruction of a geological map to recover the necessary positional, gradient and topological data as inputs to different 3D geological modelling codes. This automation provides significant advantages: it reduces the time to first prototype models; it clearly separates the primary data from subsets produced from filtering via data reduction and conceptual constraints; and provides a homogenous pathway to sensitivity analysis, uncertainty quantification and Value of Information studies. We use the example of the re-folded and faulted Hamersley Basin in Western Australia to demonstrate a complete workflow from data extraction to 3D modelling using two different Open Source 3D modelling engines: GemPy and LoopStructural.
Article
Full-text available
The Rapid Integrated Mapping and analysis System (RIMS) has been developed at the University of New Hampshire as an online instrument for multidisciplinary data visualization, analysis and manipulation with a focus on hydrological applications. Recently it was enriched with data and tools to allow more sophisticated analysis of interdisciplinary data. Three different examples of specific scientific applications with RIMS are demonstrated and discussed. Analysis of historical changes in major components of the Eurasian pan-Arctic water budget is based on historical discharge data, gridded observational meteorological fields, and remote sensing data for sea ice area. Express analysis of the extremely hot and dry summer of 2010 across European Russia is performed using a combination of near-real time and historical data to evaluate the intensity and spatial distribution of this event and its socioeconomic impacts. Integrative analysis of hydrological, water management, and population data for Central Asia over the last 30 years provides an assessment of regional water security due to changes in climate, water use and demography. The presented case studies demonstrate the capabilities of RIMS as a powerful instrument for hydrological and coupled human-natural systems research.
Article
The underlying vision of the Digital Earth (DE) calls for applications that can embed vast quantities of geo-referenced data and allow users to study and analyse our planet. Since the declaration of this vision in the late 90s, a significant number of DE data-sets have been created by industry, governments, non-governmental organisations and individuals. An overwhelming majority of the successful applications that use DE data-sets have their end-user applications running on the desktop. While these applications are great tools, they remain inaccessible to the community as a whole. In this paper, we present a framework for the development of cyber-applications. We define an abstract architecture for cyber-applications based on the model-view-controller paradigm, which allows the dynamic inclusion of functional and data components into its execution engine at run-time. We define the operational characteristics of cyber-applications. We also specify the interface of pluggable components to the architecture. Finally, we demonstrate the appropriateness of the abstract architecture by means of a case study.
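The run-time pluggability described above amounts to a component registry inside the execution engine: components are registered by name and invoked through a uniform interface. A minimal sketch (class and method names are illustrative, not the paper's interface):

```python
class CyberApp:
    """Toy execution engine that accepts functional or data components
    at run-time, in the spirit of the pluggable MVC architecture."""

    def __init__(self):
        self._components = {}

    def register(self, name, component):
        # component: any callable conforming to the plug-in interface
        self._components[name] = component

    def run(self, name, *args):
        return self._components[name](*args)

app = CyberApp()
# Register a hypothetical geoprocessing component after the engine is built.
app.register("buffer", lambda geom, d: f"buffer({geom}, {d})")
```

The view and controller layers would dispatch user actions through `run`, so new functionality never requires rebuilding the engine.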
Article
This paper presents the framework of cyberinfrastructure and the state of the art in this area of research, as well as a preliminary analysis of USV capabilities for remote visualization and control of research and higher-education equipment. References have been made to cyber-tools and methods, such as high-performance computing, communication technologies, and simulation models, enabling progress in cyberinfrastructure development. The goal of cyberinfrastructure is to unite various distributed-knowledge communities in an integrated collaboration environment that will provide broad access to multiple scientific resources. Here, we present the main requirements for cyberinfrastructure development and the key components of a working cyberinfrastructure. A case study of the minimal requirements for a fluent video stream in a local communication network is discussed and the main data traffic characteristics are presented.
Article
Full-text available
The so-called “Informatics revolution” and related Information Technology (IT) are an opportunity to increase the spread of geoscientific knowledge and the transfer of information represented in geological maps but, in order to actually take advantage of systems-aided communication, geological information has to be specifically harmonized and standardized. New methodological and technological approaches in the representation and communication of geological maps are enabled by the widespread use of Geographic Information Systems (GIS) and web mapping services (e.g. the WMS standard protocol) that i) allow sharing and spatially discovering maps, and ii) support the interchange of data amongst different systems, experts and communities. As the sharing and retrieval of geological maps through geoportals and spatial data infrastructures spreads, problems of semantic heterogeneity and differing structures of geological databases (DB) emerge. A strategy to meet the needs of interoperability consists of integrating geological map DB by standard languages and meta-information, and is carried out by the INSPIRE (Infrastructure for Spatial Information in Europe) European directive, which defines two relevant guidelines (the “Data Specification on Geology” and the “Metadata Implementing Rules”) usable to encode and share standardised geological information. An example of the application of these standards is the Geoportal of the Torino Unit of the Institute for Geosciences and Earth Resources, where field data, map features and peculiar geological interpretations, extracted from the GIS DB of geological maps, are given as single datasets and homogeneously described through specific metadata classes such as the Abstract sub-element and the Lineage sub-element.
The two given examples of metadata compilations, referring to the Lis-Trana Fault and to the block stream deposits occurring in the “Torino Ovest” geological map (sheet n.155 of the Geological Map of Italy at 1:50,000 scale), highlight that standardisation is an opportunity i) to better organize geological information, ii) to give information about the quality of data, and iii) to specify the intended meaning of interpreted features. Since metadata make geological concepts and interpretations explicit, the reading of maps is improved, and geoportals are actually a new method for encoding and sharing geological maps.
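A record carrying the two sub-elements discussed (Abstract and Lineage) can be sketched in a few lines of XML construction. Real INSPIRE/ISO 19115 records use namespaced schemas and a far richer structure, so the element names below are simplified placeholders:

```python
import xml.etree.ElementTree as ET

def make_metadata(abstract, lineage):
    """Build a toy metadata record with the two sub-elements used to
    describe the intended meaning and provenance of a map feature."""
    rec = ET.Element("metadata")
    ET.SubElement(rec, "abstract").text = abstract
    ET.SubElement(rec, "lineage").text = lineage
    return ET.tostring(rec, encoding="unicode")
```

In an actual geoportal, such a record would be validated against the INSPIRE metadata schema before publication.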
Article
Full-text available
Information Technologies (IT) have the capability to improve the clarity and usefulness of scientific information, and related applications in the earth sciences could make geological data more sharable among different users. This paper illustrates an approach to represent the knowledge paths followed by field geologists involved in the assessment and description of complex structural-geological settings and processes, through the use of specific IT applications. The proposed approach is based on three working steps: i) building a conceptual map (cMap) that defines the project approach to the study matter, drives the acquisition of field data and gives rules for the GIS representation of interpreted geological features (pre-fieldwork stage); ii) capturing data directly in the field by means of digital devices, in a way that makes it possible to retrace the data acquisition steps and to separate observed features from interpreted ones (fieldwork stage); iii) managing information in relational GIS databases by means of «geological» metadata that can define the «weight» of data and explain the adopted interpretations (post-fieldwork stage). An application of these working steps is given in a case study of the stability evaluation of a quarry rock mass. In the example, different conceptualizations and investigation methods are combined so that they are sharable among field geologists and engineering geologists in order to allow crucial decisions in the characterization and modelling of the quarry slope. Moreover, IT-based approaches should make decisional processes retraceable.
Article
Semantic similarity is a fundamental notion in GIScience for achieving semantic interoperability among geospatial data. Several semantic similarity models have been proposed to date; however, few of them address the issues related to assessing semantic similarity in ad hoc networks. Moreover, several models rest on a definition of concepts in which features are independent, an assumption that reduces the richness of the geospatial concept representation. This article presents the conceptual basis for Sim-Net, a novel semantic similarity model for ad hoc networks based on Description Logics (DL). Sim-Net follows the multi-view paradigm, which is used to include inferential knowledge in semantic similarity, that is, knowledge about implicit dependencies between the features of concepts. In Sim-Net, assessing semantic similarity relies on the notions of Semantic Reference Systems and Formal Concept Analysis (FCA), which are combined to establish a common semantic reference frame for the ontologies of the ad hoc network, called the view lattice. The Sim-Net semantic similarity measure distinguishes concepts that belong to different or similar domains and takes into account the neighbours of a concept in the network. An application example shows the positive impact of the properties of Sim-Net.
Article
Recent advances in Semantic Web and Web Service technologies have shown promise for automatically deriving geospatial information and knowledge from Earth science data distributed over the Web. In a service-oriented environment, data, information, and knowledge are often consumed or produced by complex, distributed geoscientific workflows or service chains. For the chaining results to be consumable, sufficient metadata must be provided for the data products delivered by service chains. This paper proposes the automatic generation of geospatial metadata for Earth science virtual data products. A virtual data product is represented using process models and can be materialized on demand by dynamically binding and chaining archived data and services, rather than requiring that Earth science data products be physically archived. Semantics-enabled geospatial metadata is generated, validated, and propagated during the materialization of a virtual data product. The generated metadata not only provides a context in which end-users can interpret data products before the computationally intensive execution of service chains, but also assures the semantic consistency of the service chains.
Article
Full-text available
Here we describe the requirements of an e-Infrastructure to enable faster, better, and different scientific research capabilities. We use two application exemplars taken from the United Kingdom's e-Science Programme to illustrate these requirements and make the case for a service-oriented infrastructure. We provide a brief overview of the UK ''plug-and-play composable services'' vision and the role of semantics in such an e-Infrastructure.
Article
Full-text available
As modern science grows in complexity and scope, the need for more collaboration between scientists at different institutions, in different areas, and across scientific disciplines becomes increasingly important. An emerging generation of World Wide Web technology, known as the Semantic Web, offers tremendous potential for collaborative and interdisciplinary science. However, to realize this potential, scientists and information technologists must forge new models of cooperation, and new thinking must go into the funding and dissemination of this next generation of scientific tools on the Web.
Boyan Brodaric (corresponding author), Geological Survey of Canada, 234B-615 Booth Street, Ottawa, ON, Canada K1A 0E9. E-mail: brodaric@nrcan.gc.ca. Peter Fox and Deborah L. McGuinness, Tetherless World Constellation, Rensselaer Polytechnic Institute, 110 8th Street, Winslow Building, Troy, NY 12180, United States. E-mail: pfox@cs.rpi.edu (P. Fox), dlm@cs.rpi.edu (D.L. McGuinness). 19 January 2009.
Hendler, J. (2003). Science and the semantic web. Science, 299(5606).