Chapter

Putting Interoperability on the Map: Towards a Framework of Interoperability Approaches and Tools


Abstract

Information sharing and interoperability are key ingredients for any system that participates in Service Oriented Architectures (SOA) and wants to communicate and exchange information with other partners. Although many technologies support interoperability, there is apparently no research on how to extract and aggregate the interoperability requirements of a system and the necessary architecture components and tools. This chapter reviews the principles and constraints that affect architecture design and the research efforts on interoperability infrastructures, and proposes a set of architecture components and tools that can enable, support and maintain interoperability in heterogeneous, dynamic, and constantly changing environments.


References
Article
Among the defining purposes of e-Government, highly agile, citizen-centric, accountable, transparent, effective, and efficient government operations and services rank high. For reaching these goals, the integration of government information resources and processes, and ultimately, the interoperation of independent e-Government information systems appears essential. Yet, most integration and interoperation efforts meet serious challenges and constraints. This article contributes to the development of a research framework on integration and system interoperation in e-Government. We propose future research projects to study the foci and purposes, limitations and constraints, as well as processes and outcomes of integration and interoperation in electronic Government. To that end, we suggest needs-and-wants theory as one promising theoretical lens (among others) for guiding that effort.
Conference Paper
Improving web service discovery constitutes a vital step for making a reality the Service Oriented Computing (SOC) vision of dynamic service selection, composition and deployment. Matching allows for comparing user requests with descriptions of available service implementations, and sits at the heart of the service discovery process. This paper firstly evaluates the efficacy of several key similarity metrics for matching syntactic, semantic and structural information from service interface descriptions, using a uniform corpus of web services. Secondly, it experiments with a hybrid style of matching that allows for blending various matching approaches and makes them configurable to cater service discovery given domain-specific constraints and requirements.
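The hybrid matching style this abstract describes can be sketched minimally. The following is an illustrative assumption of how syntactic and structural scores might be blended; the metrics, weights, and record layout are hypothetical, not the paper's actual method.

```python
# Hypothetical sketch of hybrid service matching: blend a syntactic
# token-overlap score with a structural score over parameter types.
# Weights, field names, and scoring are illustrative assumptions.

def token_jaccard(a: str, b: str) -> float:
    """Syntactic similarity: Jaccard overlap of lowercase name tokens."""
    ta = set(a.lower().replace("_", " ").split())
    tb = set(b.lower().replace("_", " ").split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def type_overlap(req_types, cand_types) -> float:
    """Structural similarity: fraction of requested parameter types matched."""
    if not req_types:
        return 1.0
    return sum(1 for t in req_types if t in cand_types) / len(req_types)

def hybrid_score(request, candidate, w_syn=0.6, w_struct=0.4) -> float:
    """Weighted blend; the weights are configurable per domain."""
    return (w_syn * token_jaccard(request["name"], candidate["name"])
            + w_struct * type_overlap(request["types"], candidate["types"]))

def rank(request, registry):
    """Return candidate services sorted by descending hybrid score."""
    return sorted(registry, key=lambda c: hybrid_score(request, c), reverse=True)
```

Making the weights parameters is what allows the matcher to be configured for domain-specific constraints, as the abstract suggests.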
Article
Despite the numerous difficulties, many agencies and institutions in the United States and worldwide practice sharing of geographic information systems (GIS) and data, amounting to local, regional, national and international spatial data infrastructures. While the evidence on coordinated GIS development and spatial data sharing accumulates, there have been few systematic evaluations of the mechanisms and factors that facilitate or obstruct interorganizational GIS efforts. Moreover, the issue of "interoperability" has been addressed primarily in technical terms, with its "soft" or organizational side mostly neglected. This paper presents results from several recent studies in the U.S. context. It explores the nature and effectiveness of interorganizational interaction, coordination and implementation processes, and assesses the benefits from achieving organizational interoperability.
Conference Paper
In the past 40 years, software engineering has emerged as an important sub-field of computer science. The quality and productivity of software have been improved, and the cost and risk of software development have been decreased, due to the contributions made ...
Conference Paper
In recent years, Web Services technologies have been successfully used for simplifying interoperability while providing scalability and flexibility in multiple applications, including distributed simulation software. The RESTful-CD++ simulation server provides Web Services according to the REST principles by exposing services as URIs consumed via HTTP messages. Therefore, the server becomes a service that is part of the Web and can be easily mashed up with other applications and simulation software. In contrast, RPC-style SOAP-based Web Services use the Web as a transmission medium by exposing few URIs and many RPCs. RESTful-CD++ is (to the best of our knowledge) the only existing RESTful system in this area. Further, this distributed simulation package provides pioneering distributed simulation services using the Web architectural style. We present an overview of the principles, design and implementation of the RESTful-CD++ HTTP server and DCD++ simulation. We show that REST fulfills WS objectives in a much better and easier style than the SOAP-based systems.
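The resource-oriented style the abstract describes (services as URIs, manipulated through uniform HTTP verbs) can be sketched without a network stack as a small dispatch table. The URI layout, resource names, and payloads below are assumptions for illustration, not the actual RESTful-CD++ interface.

```python
# Minimal sketch of the REST style: every simulation is a URI, and the
# only operations are the uniform HTTP verbs. The resource scheme here
# (/simulations/<name>) is an illustrative assumption.

simulations = {}  # in-memory resource store: name -> model configuration

def handle(method: str, uri: str, body=None):
    """Dispatch a (verb, URI) pair to the matching resource operation."""
    parts = uri.strip("/").split("/")
    if parts[0] != "simulations":
        return 404, None
    if method == "GET" and len(parts) == 1:
        return 200, sorted(simulations)           # list all resources
    name = parts[1]
    if method == "PUT" and len(parts) == 2:
        simulations[name] = body                  # create or replace
        return 201, name
    if method == "GET" and len(parts) == 2:
        if name in simulations:
            return 200, simulations[name]
        return 404, None
    if method == "DELETE" and len(parts) == 2:
        simulations.pop(name, None)
        return 204, None
    return 405, None
```

Because the interface is just URIs plus uniform verbs, such a service can be "mashed up" by any HTTP client, which is the interoperability argument the abstract makes against exposing many RPCs.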
Conference Paper
Recent technology trends in the Web Services (WS) domain indicate that a solution eliminating the presumed complexity of the WS-* standards may be in sight: advocates of REpresentational State Transfer (REST) have come to believe that their ideas explaining why the World Wide Web works are just as applicable to solve enterprise application integration problems and to simplify the plumbing required to build service-oriented architectures. In this paper we objectify the WS-* vs. REST debate by giving a quantitative technical comparison based on architectural principles and decisions. We show that the two approaches differ in the number of architectural decisions that must be made and in the number of available alternatives. This discrepancy between freedom-from-choice and freedom-of-choice explains the complexity difference perceived. However, we also show that there are significant differences in the consequences of certain decisions in terms of resulting development and maintenance costs. Our comparison helps technical decision makers to assess the two integration styles and technologies more objectively and select the one that best fits their needs: REST is well suited for basic, ad hoc integration scenarios, WS-* is more flexible and addresses advanced quality of service requirements commonly occurring in enterprise computing.
Conference Paper
In this article we define a system, called XplainS that automatically generates an infrastructure for providing explanations of semantic web services described in OWL-S. XplainS has a strategy for generating production rules from a flow where several levels of distribution of the services exist.
Article
With the rising popularity of Web services, both academia and industry have invested considerably in Web service description standards, discovery, and composition techniques. The standards-based approach utilized by Web services has supported interoperability at the syntax level. However, issues of structural and semantic heterogeneity between messages exchanged by Web services are far more complex and crucial to interoperability. It is for these reasons that we recognize the value that schema/data mappings bring to Web service descriptions. In this paper, we examine challenges to interoperability; classify the types of heterogeneities that can occur between interacting services and present a possible solution for data interoperability using the mapping support provided by WSDL-S, a key driver behind SAWSDL. We present a data mediation architecture using the extensibility features of WSDL and the popular SOAP engine, Axis 2.
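The structural/semantic mediation problem the abstract raises can be sketched as a declarative mapping applied to messages in flight. The field names, paths, and converters below are illustrative assumptions, not the WSDL-S/SAWSDL annotation syntax.

```python
# Hedged sketch of message-level data mediation between two services whose
# schemas differ structurally (nesting) and semantically (units/formats).
# The mapping table is a hypothetical stand-in for schema annotations.

def mediate(message: dict, mapping: dict) -> dict:
    """Rewrite a source message into the target schema.

    `mapping` sends each target field to a (source_path, converter) pair,
    where source_path is a dotted path into the source message."""
    def lookup(path):
        node = message
        for key in path.split("."):
            node = node[key]
        return node
    return {target: conv(lookup(path))
            for target, (path, conv) in mapping.items()}
```

A mediator placed in the SOAP engine's handler chain could apply such a mapping to every outgoing message, which is roughly the architectural role the abstract assigns to Axis 2 extensibility.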
Article
The introduction of software development via Web Services has been the most significant web engineering paradigm of recent years. The widely acknowledged importance of the Web Services concept lies in the fact that they provide a platform-independent answer to the software component development question. Equally important are the mechanisms that allow for Web Service discovery, especially as the latter has turned into an arduous task. This paper critically presents the latest methods, architectures, models and concerns that have arisen in the Web Service Discovery area.
Article
Interoperable enterprise systems (be they supply chains, extended enterprises, or any form of virtual organizations) must be designed, controlled, and appraised from a holistic and systemic point of view. Systems interoperability is a key to enterprise integration, which recommends that the IT architecture and infrastructure be aligned with business process organization and control, themselves designed according to a strategic view expressed in an enterprise architecture. The paper discusses architectures and methods to build interoperable enterprise systems, advocating a mixed service and process orientation, to support synchronous and/or asynchronous operations, both at the business level (business events, business services, business processes) and at the application level (workflow, IT and Web services, application programs).
Article
Web services are expected to be the key technology in enabling the next installment of the Web in the form of the Service Web. In this paradigm shift, Web services would be treated as first-class objects that can be manipulated much like data is now manipulated using a database management system. Hitherto, Web services have largely been driven by standards. However, there is a strong impetus for defining a solid and integrated foundation that would facilitate the kind of innovations witnessed in other fields, such as databases. This survey focuses on investigating the different research problems, solutions, and directions to deploying Web services that are managed by an integrated Web Service Management System (WSMS). The survey identifies the key features of a WSMS and conducts a comparative study on how current research approaches and projects fit in.
Article
Offers guidelines for evaluating whether centralization or decentralization of application software is most appropriate for a particular organization. Advantages of centralization and decentralization; Hardware and software solutions; Elements of centralization and decentralization; Application modularity; Feasibility of a uniform software configuration.
Article
Comprehensive semantic descriptions of Web services are essential to exploit them to their full potential, that is, discovering them dynamically, and enabling automated service negotiation, composition and monitoring. The semantic mechanisms currently available in service registries, which are based on taxonomies, fail to provide the means to achieve this. Although the terms "taxonomy" and "ontology" are sometimes used interchangeably, there is a critical difference. A taxonomy indicates only a class/subclass relationship, whereas an ontology describes a domain completely. The essential mechanisms that ontology languages provide include their formal specification (which allows them to be queried) and their ability to define properties of classes. Through properties, very accurate descriptions of services can be defined, and services can be related to other services or resources. In this paper, we discuss the advantages of describing service semantics through ontology languages and describe how to relate the semantics defined with the services advertised in service registries like UDDI and ebXML.
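The taxonomy-versus-ontology distinction the abstract draws can be made concrete with a few lines of code. This is an illustrative sketch (not OWL or any registry API): the class names, properties, and query helpers are assumptions chosen to show that property assertions enable queries a bare subclass hierarchy cannot answer.

```python
# A taxonomy records only class/subclass edges ...
subclass_of = {
    "HotelBooking": "TravelService",
    "FlightBooking": "TravelService",
}

# ... while an ontology also attaches properties to classes, here as
# simple (subject, property, object) triples.
triples = [
    ("HotelBooking", "acceptsPayment", "CreditCard"),
    ("FlightBooking", "acceptsPayment", "Voucher"),
    ("HotelBooking", "locatedIn", "Europe"),
]

def is_a(cls, ancestor) -> bool:
    """Taxonomy query: walk subclass edges upward."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = subclass_of.get(cls)
    return False

def services_with(prop, value):
    """Property-based query a taxonomy alone cannot express."""
    return sorted(s for s, p, o in triples if p == prop and o == value)
```

Finding "all travel services" needs only the taxonomy; finding "services that accept credit cards" needs the property assertions, which is exactly the extra expressiveness the abstract attributes to ontology languages.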
Article
The "Internet of Things," once reality, will have to rely on a global IT infrastructure that provides information about all those "things" in a secure and reliable manner. The EPCglobal Network is a proposal for a widely distributed information system to offer such services. But it may introduce more challenges concerning security, privacy, and political control than was initially anticipated. If the vision of many RFID proponents comes true, more and more common objects will soon acquire a cyber presence. Objects will be equipped with RFID tags containing identification data and possibly some additional information about the object in question (data on tag). To keep tag costs low, one may often just store an identifier and use it as a key to access databases containing the actual object information (data on network). This second approach is typical for "EPC tags": RFID tags that aim to replace the conventional barcode system. They use an Electronic Product Code (EPC, see Figure 1), which is globally unique, as a key to retrieve information from the EPCglobal Network, envisioned as a large distributed system of databases. The EPC standard represents a numbering framework that is independent of specific hardware features, such as tag generation or specific radio frequency. The databases comprising the EPCglobal Network are to be run by manufacturers, logistic providers, retailers, or third parties, and can be accessed via special web services called EPC Information Services (EPCIS). The network architecture is designed and administered by the standardization consortium EPCglobal, which is a joint venture of GS1 U.S. (formerly Uniform Code Council) and GS1 (formerly EAN International). By improving the information flow as objects pass from suppliers to manufacturers, distributors, retail stores, and customers, the EPCglobal Network aims to facilitate cooperation within supply chains and thus to make them more efficient.
Once established, it could also be used to support a wide range of applications in the area of ubiquitous computing. An often-cited example is the "smart home," in which "intelligent" cupboards and fridges could be realized using RFID technology. By scanning the RFID tags on objects and using the EPCglobal Network for information retrieval, such devices can identify their current content and offer new services like food counseling or automated replenishing of goods. As a result of this broadened use of the EPCglobal Network, its security context would change from closed supply chains to the rather open environments of ubiquitous computing, just like the security context of the Internet was changed by moving from relatively closed groups of fellow researchers to the global environment it represents today. In this article, we first describe the EPCglobal Network architecture, as currently specified. We then discuss its security and privacy risks, as well as possible countermeasures. We conclude with suggestions on how to improve existing design proposals, once appropriate security and privacy requirements have been established.
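The "data on network" pattern described above (tag stores only an identifier, used as a key into distributed databases) can be sketched as follows. The dotted EPC layout, the company prefix, and the registry contents are simplified assumptions for illustration; they do not reproduce the actual EPC binary encodings or EPCIS interfaces.

```python
# Hedged sketch of the data-on-network pattern: parse an identifier,
# route to the database of the responsible EPC manager, look up the
# object information. Field layout and data are illustrative assumptions.

def parse_epc(epc: str):
    """Split a simplified dotted EPC into its logical fields."""
    header, manager, obj_class, serial = epc.split(".")
    return {"header": header, "manager": manager,
            "class": obj_class, "serial": serial}

# One database per company prefix (EPC manager), standing in for the
# EPCIS services run by manufacturers or logistics providers.
epcis_registries = {
    "0614141": {("112345", "400"): {"product": "coffee",
                                    "packed": "2009-01-12"}},
}

def lookup(epc: str):
    """Resolve an EPC to object information via the manager's registry."""
    f = parse_epc(epc)
    registry = epcis_registries.get(f["manager"], {})
    return registry.get((f["class"], f["serial"]))
```

The routing step (manager prefix selects the database) is what makes the network "widely distributed": no single party holds all object data, which is also where the article's security and privacy concerns originate.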
Conference Paper
In a business process execution language (BPEL) process definition, the sequence of exchanged messages typically originates from the sequence of business process activities and from the need to coordinate those activities across the participants of the process. As such, business concerns (e.g. the sequence of business process steps) are often mixed with technical aspects (e.g. the sequence of coordination messages). In this article we present an architecture to separate business and technical concerns, which results in a clearer overview of the high-level business process and improves the flexibility and maintainability of the orchestration architecture. The described architecture depends on existing Web service standards. Different eventing and coordination specifications are discussed. The ultimate architecture is mainly based on the WS-Brokered Notification and WS-Coordination framework specifications.
Article
A key element to realizing the Semantic Web is developing a suitably rich language for encoding and describing Web content. Such a language must have a well defined semantics, be sufficiently expressive to describe the complex interrelationships and constraints between Web objects, and be amenable to automated manipulation and reasoning with acceptable limits on time and resource requirements. A key component of the Semantic Web services vision is the creation of a language for describing Web services. DAML-S is such a language: it is a DAML+OIL ontology for describing Web services that a coalition of researchers created with support from DARPA.
Article
Information exchange across various networks is made possible with the help of ontologies. This will be possible with the development of a joint standard for specifying and exchanging ontologies. A standard named OIL is proposed to achieve this purpose. Different levels of complexity can be achieved, as OIL is properly grounded in Web languages, and its inner layers provide for efficient reasoning support based on FaCT. The application of ontologies is possible in the areas of knowledge management, Web commerce and e-business.
Chapter
This chapter focuses on security concerns that arise or are amplified for the agile enterprise, particularly concerns raised by service-oriented architecture (SOA). It assumes that attention has already been given to such concerns as physical security, threats of intrusion from the World Wide Web, and virus protection. Its purpose is to provide an understanding of the requirements and approaches for optimizing security for services, from an enterprise perspective, so that management can plan for and commit to the necessary enterprise transformation. With the advent of the public Internet and the World Wide Web, security risks have increased dramatically. Systems are exposed to public access and email messages can carry or link to corrupting software. Automation and electronic communications have added new dimensions to security concerns. Electronic integration of services, extending beyond the walls of the enterprise, has created new security exposures. Fortunately, SOA technology and related industry standards have created new opportunities for accountability and control.
Article
The term electronic government (e-government) mainly refers to the use of information and communications technology (ICT) to modify the structures and procedures of government agencies. Acknowledging the necessity of utilizing the new electronic, information, and communication technologies, the movement toward implementation of e-government in Iran has recently received the attention of authorities and policy makers. Public administrations have been very much concerned about the architecture of e-government, especially because of the boost e-government has received in recent years. The paper seeks to provide a set of heuristic principles affecting overall e-government architecture with respect to Iranian government-to-government (G2G) context requirements, which might be applicable to other developing countries with some customization. It is worth mentioning that the grounded action research method, which combines both inductive and deductive thinking, was applied to develop a systematic theory from the data.
Article
It is no longer sufficient for a winning organisation to operate in isolation, however effective it may be in performing its core business. To survive, let alone win, it must be part of one or more supply chains producing world class performance. Each company in the chain must be internally "lean" but additionally must operate in a "seamless" environment in which all information relevant to the efficient operation of the total system is available on time and in an undistorted form. The term "predator" has been coined in the literature to describe the supply chain leader with the vision, drive, and determination to re-engineer the entire supply chain so as to satisfy end-customer needs. The paper reviews the techniques available to "predators" seeking to gain competitive advantage for their supply chains, including industrial engineering, operations engineering, production engineering, and information technology. Not all conceivable improvements can be implemented overnight, however desirable they might appear, hence the advocacy of simulation models within a decision support system so that top management can prioritise proposed Improvement Programmes against the relevant performance metric. In the example used to indicate the approach, the technological, organisational, and attitudinal problems to be solved by top management in achieving the seamless supply chain are all highlighted.
Article
Although information exchange among trading partners is consistently mentioned as a key requirement of successful supply chain management implementation, research on information exchange is scarce. This lack of research provides little guidance and support for those managers interested in improving their logistics operations through increased information exchange. The main goal of this paper is to identify potential antecedents of information exchange. Questionnaires were sent to logistics managers at manufacturing firms in several industries. The results of this exploratory study are detailed and the implications for logistics managers discussed.
Article
The global market drives companies to improve their competitiveness through collaborative work and partnerships, motivating them to look for enhanced interoperability between computer systems and applications. However, the high degree of systems heterogeneity and the companies' lack of resources and know-how have been preventing organizations from moving ahead in that direction. Today, the OMG's model-driven architecture (MDA) makes available an open approach to write specifications and develop applications, separating the application and business functionality from the platform technology. As well, the service-oriented architecture (SOA) establishes a software architectural concept that defines the use of services to support the requirements of software users, making them available as independent services accessible in a standardized way. Together, these two architectures seem to provide a suitable framework to improve companies' competitiveness through the adoption of a standard-based extended environment, challenging and enhancing the interoperability between computer systems and applications in industry. The paper, after illustrating the general motivations industrial SMEs have to adopt open architectures to achieve interoperability for extended and collaborative enterprise practices, presents the emerging model-driven and service-oriented architectures. Then, it describes an innovative case study under validation by industry, proposing a standard-based extendable platform to support an interoperable environment through the adoption of MDA and SOA. The paper finishes with discussion and concluding remarks concerning the empirical results obtained from the pilot demonstrator.
Article
Small to medium sized public organizations (SMPOs) share some of their e-Government requirements with their larger counterparts, such as the pending needs for interoperability, security and user friendliness. Additionally, they have some specific needs that are either unique in their context or more demanding due to their characteristics. These are cost and resources considerations, enhanced accessibility and greater scalability due to the larger number of citizens and businesses served and automated processing because of the restricted number of trained personnel. This paper first proposes an architecture for a secure e-Government platform based on Web Services, which addresses the above requirements. Secondly, a specific service is built upon the proposed platform, in which a municipality generates and securely delivers a digital birth certificate to a citizen or another municipality.
Article
Interoperability of heterogeneous applications is defined as the ability for multiple software applications written in different programming languages, running on different platforms with different operating systems, to communicate and interact with one another over different computer networks. The emerging middleware technologies, including CORBA, COM/DCOM, and Enterprise JavaBeans, offer an industrial de facto standard communication infrastructure to support the interoperability of heterogeneous applications in components. However, the implementation of a component suffers from high interaction complexities in the component that seriously degrade application independence. Software components should be built to be independent of the context in which they are used, allowing them to be reused in many different computing environments. In this paper, we present an adapter to isolate, encapsulate, and manage a component's interactions outside the component. The dynamic interface binding was designed to allow an adapter to examine the signature of the requested services at runtime, such as operation names, parameter order, parameter types, and parameter sizes. The interfaces of interconnecting components are bound at runtime. In addition, the interface language mapping allows an interface in a specific programming language to be automatically generated from an IDL interface. The use of adapters increases the reusability of components and also simplifies the integration of the components into an application.
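The adapter idea above (interactions kept outside the component, with operation signatures examined and bound at runtime) can be sketched briefly. This is an illustrative stand-in for the paper's IDL-driven binding; the class names and the name-mapping table are assumptions.

```python
# Illustrative adapter sketch: the requested operation name is translated
# to the component's native name and bound at runtime, after checking the
# signature's arity. All names below are hypothetical examples.
import inspect

class Adapter:
    def __init__(self, component, name_map=None):
        self.component = component
        self.name_map = name_map or {}   # requested name -> component name

    def invoke(self, operation, *args):
        """Bind and call the matching component operation at runtime."""
        target = self.name_map.get(operation, operation)
        method = getattr(self.component, target, None)
        if method is None:
            raise AttributeError(f"no operation {operation!r}")
        params = inspect.signature(method).parameters
        if len(params) != len(args):
            raise TypeError(f"{operation!r} expects {len(params)} args")
        return method(*args)

class CelsiusSensor:
    """Example component with its own native interface."""
    def read_celsius(self):
        return 21.0
```

Because the component never sees the caller's naming conventions, it stays context-independent and reusable, which is the design goal the abstract emphasizes.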
Article
The need for managing distributed systems is prominent in e-government. Different applications and platforms that cover the overall range of the e-government implementation area need to interoperate in order to provide integrated governmental services to the citizens. This paper proposes an ontologically principled service-oriented architecture for the administration and integration of distributed nodes in an e-government network. The goal of the proposed design is to improve effectiveness and coherence by taking advantage of the enabling technologies of service-oriented computing, web services and ontologies. A two-level semantic mediator model is proposed to both provide an integrated description of entities and map actual information to them. This architecture was used on a prototype system developed for managing distributed educational directorates in the prefecture of Achaia, Greece. The pilot use of the system led to efficient decision making since it managed to mine information that was previously ‘buried’ in the local governmental infrastructure nodes.
Article
Web services have acquired enormous popularity among software developers. This popularity has motivated developers to publish a large number of Web service descriptions in UDDI registries. Although these registries provide search facilities, they are still rather difficult to use and often require service consumers to spend too much time manually browsing and selecting service descriptions. This paper presents a novel search method for Web services called WSQBE that aims at both easing query specification and assisting discoverers by returning a short and accurate list of candidate services. In contrast with previous approaches, the WSQBE discovery process is based on an automatic search space reduction mechanism that makes this approach more efficient. Empirical evaluations of WSQBE's search space reduction mechanism, retrieval performance, processing time and memory usage, using a registry with 391 service descriptions, are presented.
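The two-stage shape of such a discovery process (first prune the search space cheaply, then rank only the survivors) can be sketched as follows. The pruning key and the scoring function are illustrative assumptions, not WSQBE's actual mechanism.

```python
# Hedged sketch of query-by-example discovery: stage 1 reduces the search
# space by a cheap categorical filter; stage 2 ranks only the survivors
# by description similarity. Registry fields are illustrative assumptions.

def tokens(text: str):
    return set(text.lower().split())

def reduce_space(example: dict, registry: list) -> list:
    """Stage 1: keep only services sharing the example's category."""
    return [s for s in registry if s["category"] == example["category"]]

def rank(example: dict, candidates: list) -> list:
    """Stage 2: rank survivors by description-token Jaccard overlap."""
    q = tokens(example["description"])
    def score(s):
        d = tokens(s["description"])
        return len(q & d) / len(q | d) if q | d else 0.0
    return sorted(candidates, key=score, reverse=True)
```

The efficiency argument is that the expensive similarity computation runs only over the reduced candidate set, not the whole registry.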
Article
Originally coming from the business world, service-oriented architecture (SOA) paradigm is expanding its range of application into several different environments. Industrial automation is increasingly interested on adopting it as a unifying approach with several advantages over traditional automation. In particular, the paradigm is well indicated to support agile and reconfigurable supply chains due to its dynamic nature. In this domain, the main goals are short time-to-market, fast application (re)configurability, more intelligent devices with lifecycle support, technology openness, seamless IT integration, etc. The current research challenges associated to the application of SOA into reconfigurable supply chains are enumerated and detailed with the aim of providing a roadmap into a major adoption of SOA to support agile reconfigurable supply chains.
Article
Given the current explosion of information resources available to decision makers, achieving semantic interoperability between a data source (e.g., a database) and a data receiver (e.g., a decision maker) is more critical than ever. As decision makers interact with unfamiliar sources that have been independently created and maintained, they need to expend non-trivial cognitive effort to understand the meaning of the information contained within these sources. To address this problem, an architecture called the Context Interchange Architecture is proposed. The central component in this architecture is the context mediator, an intelligent agent which facilitates source-receiver interoperability by enabling the receiver to issue queries and to be presented with answers in a manner that is consistent with the receiver's preferences, goals and knowledge. As a result, a convenient and consistent interface is presented to the receiver, reducing the cognitive effort required for the interaction. In this paper, the theoretical foundation for this architecture, based on the philosophical disciplines of Ontology and Semantics, is presented. A key result of this paper is the formal definition of the external behavior of the context mediator. Such a formal characterization provides a basis for the subsequent design of the knowledge representation and reasoning processes internal to the mediator.
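The mediator role described above, translating answers from the source's context into the receiver's, can be sketched with a deliberately tiny notion of "context". The contexts (currency and scale factor) and the exchange rates are assumptions for illustration; the actual Context Interchange Architecture is formally defined over ontologies.

```python
# Illustrative context-mediation sketch: each party's context records how
# it expresses values; the mediator normalises and re-expresses answers.
# Context contents and rates below are hypothetical examples.

contexts = {
    "nyse_feed":  {"currency": "USD", "scale": 1},      # source context
    "tokyo_desk": {"currency": "JPY", "scale": 1000},   # receiver context
}

rates = {("USD", "JPY"): 110.0, ("JPY", "USD"): 1 / 110.0}

def mediate(value: float, source: str, receiver: str) -> float:
    """Translate a value from the source's context to the receiver's."""
    src, rcv = contexts[source], contexts[receiver]
    amount = value * src["scale"]                        # normalise scale
    if src["currency"] != rcv["currency"]:
        amount *= rates[(src["currency"], rcv["currency"])]
    return amount / rcv["scale"]
```

The receiver never needs to know the source's conventions, which is the reduction in cognitive effort the abstract argues for.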
Article
One of the most significant difficulties in developing Service-Oriented Architecture (SOA) involves meeting its security challenges, since the responsibilities for SOA security rest with both the service providers and the consumers. In recent years, many solutions to these challenges have been implemented, such as the Web Services Security Standards, including WS-Security and WS-Policy. However, those standards are insufficient for the new generation of Web technologies, including Web 2.0 applications. In this research, we propose an intelligent SOA security framework by introducing its two most promising services: the Authentication and Security Service (NSS), and the Authorization Service (AS). The suggested autonomic and reusable services are constructed as an extension of the WS-* security standards, with the addition of intelligent mining techniques, in order to improve performance and effectiveness. In this research, we apply three different mining techniques: Association Rules, which help to predict attacks; the Online Analytical Processing (OLAP) Cube, for authorization; and clustering mining algorithms, which facilitate access control rights representation and automation. Furthermore, a case study is explored to depict the behavior of the proposed services inside an SOA business environment. We believe that this work is a significant step towards achieving dynamic SOA security that automatically controls access to new versions of Web applications, including analyzing and dropping suspicious SOAP messages and automatically managing authorization roles.
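The association-rule idea the abstract applies to attack prediction can be sketched on toy session logs: if event A frequently co-occurs with event B in observed sessions, the rule A -> B can flag a likely attack early. The event names, thresholds, and log shape are illustrative assumptions, not the paper's framework.

```python
# Hedged sketch of single-item association-rule mining over security logs.
# support(A -> B) = sessions containing both / all sessions;
# confidence(A -> B) = sessions containing both / sessions containing A.
from itertools import permutations

def mine_rules(sessions, min_support=0.5, min_confidence=0.8):
    """Return {(a, b): confidence} for rules a -> b above both thresholds."""
    n = len(sessions)
    counts, pair_counts = {}, {}
    for s in sessions:
        items = set(s)
        for a in items:
            counts[a] = counts.get(a, 0) + 1
        for a, b in permutations(items, 2):
            pair_counts[(a, b)] = pair_counts.get((a, b), 0) + 1
    rules = {}
    for (a, b), c in pair_counts.items():
        if c / n >= min_support and c / counts[a] >= min_confidence:
            rules[(a, b)] = c / counts[a]
    return rules
```

A monitoring service could match incoming event streams against the mined rules and drop or escalate sessions whose prefix triggers a high-confidence attack rule.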
Article
Many organisations are evaluating their supply chains because they are perceived to be an area for both cost cutting and increasing competitiveness. The objective is apparently very simple: optimise the supply chain via effective and efficient operating practices. This paper demonstrates, via the statistical analysis of 32 industrial case studies, that the route to this desired fully integrated, effective supply chain is long established. The solution has been renamed, repackaged and adapted many times over the years, but what remains constant are the underlying principles of simplified material flow. The output of this paper includes the vision, design principles, and rules for action needed to enable effective supply chain integration.
Article
This paper empirically examines the impact of environmental uncertainty, intra-organizational facilitators, and inter-organizational relationships on information sharing and information quality in supply chain management. Based on data collected from 196 organizations, multiple regression analyses are used to test the factors impacting information sharing and information quality, respectively. It is found that both information sharing and information quality are influenced positively by trust in supply chain partners and shared vision between supply chain partners, but negatively by supplier uncertainty. Top management has a positive impact on information sharing but no impact on information quality. The results also show that information sharing and information quality are not impacted by customer uncertainty, technology uncertainty, commitment of supply chain partners, and IT enablers. Moreover, a discriminant analysis reveals that supplier uncertainty, shared vision between supply chain partners and commitment of supply chain partners are the three most important factors in discriminating between organizations with high levels of information sharing and information quality and those with low levels.
Article
This paper investigates the problem of deploying network traffic monitors with optimized coverage and cost in an IP network. Deploying a network-wide monitoring infrastructure in operational networks is necessary for practical reasons. We investigate two representative solutions, a router-based solution called NetFlow and an interface-based solution called CMON. Several cost factors are associated with deploying either NetFlow or CMON in a network. We argue that enabling monitoring to cover a major portion of traffic, instead of the entire traffic, will achieve significant cost savings while still giving operators enough insight into their network. We use NetFlow as an example and develop a technique to achieve the optimal cost-coverage tradeoff. Specifically, we aim to solve the Optimal NetFlow Location Problem (ONLP) for a given coverage ratio. We analyze various cost factors of enabling NetFlow in such a network and model the problem as an Integer Linear Program (ILP). We develop two greedy heuristics to cope with large-scale instances, given the problem's NP-hard nature. The performance of the ILP and heuristics is demonstrated by numerical results, and the LM heuristic is able to achieve sub-optimal solutions within 1–2% of the optimal solutions in a mixed router environment. It is observed that we can achieve 55% cost savings by covering 95% instead of 100% of the network traffic. We then extend our methodology to deploying CMON into such a network. The associated costs of deploying NetFlow and CMON are compared. The results demonstrate that CMON is more cost-effective when a small coverage ratio is desired because of its more modular nature.
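A greedy cost-coverage heuristic of the kind the abstract describes can be sketched as follows. This is a simplified illustration, not the authors' LM heuristic; the router names, traffic volumes and costs are made-up assumptions:

```python
def greedy_monitor_placement(router_traffic, router_cost, coverage_ratio):
    """Greedily enable monitoring on routers until the target fraction of
    total traffic is covered, preferring the best traffic-per-cost ratio.

    router_traffic: {router: traffic volume observed at that router}
    router_cost:    {router: cost of enabling monitoring there}
    """
    total = sum(router_traffic.values())
    target = coverage_ratio * total
    chosen, covered, cost = [], 0.0, 0.0
    # Visit routers in decreasing order of traffic covered per unit cost.
    ranked = sorted(router_traffic,
                    key=lambda r: router_traffic[r] / router_cost[r],
                    reverse=True)
    for r in ranked:
        if covered >= target:
            break
        chosen.append(r)
        covered += router_traffic[r]
        cost += router_cost[r]
    return chosen, covered / total, cost


# Hypothetical network: four routers, 100 units of traffic in total.
traffic = {"a": 60, "b": 25, "c": 10, "d": 5}
costs = {"a": 5, "b": 4, "c": 3, "d": 1}
chosen, ratio, cost = greedy_monitor_placement(traffic, costs, 0.9)
```

In this toy instance, settling for 90% coverage leaves router `c` unmonitored and saves 3 of the 13 cost units that full coverage would require, mirroring the cost-coverage tradeoff the paper quantifies.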
Article
Ontologies are used within the context of Spatial Data Infrastructures to denote a formally represented knowledge that is used to improve data sharing and information retrieval. Given the increasing relevance of semantic interoperability in this context, this work presents the specification and development of a Web Ontology Service (WOS), based on the OGC Web Service Architecture specification, whose purpose is to facilitate the management and use of lexical ontologies. Additionally, this work shows how to integrate this service with Spatial Data Infrastructure discovery components in order to obtain a better classification of resources and an improvement in information retrieval performance.
Article
Enterprise integration has recently gained great attention. The paper deals with an essential activity enabling seamless enterprise integration: similarity-based schema matching. To this end, we present a supervised approach to measuring semantic similarity between XML schema documents and, more importantly, a novel approach to augmenting reliably labeled training data from a few given labeled samples in a semi-supervised manner. Experimental results reveal that the proposed method is very cost-efficient and reliably predicts semantic similarity.
Article
Governments worldwide are increasingly using Web-based business models to enhance their service delivery. Yet the concept of the business model is unexplored within the context of e-government. Drawing upon the literature on e-commerce, we develop a taxonomy for analyzing Web-based business models for e-government. Based on a systematic survey of 59 e-government Web sites in the Netherlands, our findings indicate that most of the Web sites use the content provider or direct-to-customer business models, while only a few use novel business models. Overall, the concept of the business model is appealing and useful in the public sector. Specifically, it complements research on Web site quality by analyzing and describing Web sites using atomic e-government business models and suggesting improvements through combinations of business models.
Article
The Government of Canada (GoC) has implemented several standardization initiatives toward establishing e-government in order to systematize the capture, description, organization and dissemination of data and information. This study examines the GoC's metadata strategy through the adoption of a Dublin Core (DC)-based metadata scheme toward establishing one unified metadata framework. The study examines the credibility of DC in relation to interoperability, application profiles, and controlled vocabularies and further provides a discussion on the current problems associated with metadata and possible improvements across government agencies in the GoC.
Article
Schema matching is a critical step for discovering semantic correspondences among elements in many data-sharing applications. Most existing schema matching algorithms produce scores between schema elements, resulting in the discovery of only simple matches. Such results partially solve the problem. Identifying and discovering complex matches is considered one of the biggest obstacles to completely solving the schema matching problem. Another obstacle is the scalability of matching algorithms on large numbers of large-scale schemas. To tackle these challenges, in this paper we propose a new XML schema matching framework based on Prüfer encoding. In particular, we develop and implement the XPrüM system, which consists mainly of two parts: schema preparation and schema matching. First, we parse XML schemas and represent them internally as schema trees. Prüfer sequences are constructed for each schema tree and employed to build a sequence representation of the schemas. We capture schema tree semantic information in Label Prüfer Sequences (LPS) and schema tree structural information in Number Prüfer Sequences (NPS). Then, we develop a new structural matching algorithm exploiting both LPS and NPS. To cope with complex matching discovery, we introduce the concept of compatible nodes to identify semantic correspondences across complex elements first; the matching process is then refined to identify correspondences among simple elements inside each pair of compatible nodes. Our experimental results demonstrate the performance benefits of the XPrüM system.
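The Prüfer encoding underlying XPrüM is the classic bijection between labeled trees and sequences: repeatedly delete the smallest-numbered leaf and record its neighbour. The following is the textbook construction, not the XPrüM implementation, which additionally pairs node labels with the sequence to form the LPS and NPS:

```python
def prufer_sequence(edges, n):
    """Compute the Prüfer sequence of a labeled tree on nodes 1..n.

    Repeatedly deletes the smallest-numbered leaf and records the label
    of its single neighbour; stops when two nodes remain.
    """
    adj = {i: set() for i in range(1, n + 1)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seq = []
    for _ in range(n - 2):
        leaf = min(v for v in adj if len(adj[v]) == 1)
        (nb,) = adj[leaf]          # the leaf's single neighbour
        seq.append(nb)
        adj[nb].remove(leaf)
        del adj[leaf]
    return seq


# A star of leaves 1,2,3 around node 4, then a path 4-5-6.
seq = prufer_sequence([(1, 4), (2, 4), (3, 4), (4, 5), (5, 6)], 6)
```

Because the sequence is linear, two schema trees can be compared by sequence-alignment techniques rather than by costly tree-edit operations, which is the scalability advantage such encodings offer.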
Article
Intelligent Wireless Web (IWW) employs the capabilities of high-speed wireless networks and exploits parallel advancements in Internet-based technologies such as the Semantic Web, Web Services, agent-based technologies, and context awareness. Considering its great potential for application in business systems, we have devised an innovative model, based on IWW services, for a typical mobile real-time supply chain coordination system, which has been developed and tested in a real operational environment. Our article investigates the proposed system as follows: at the start, the building blocks of the IWW are discussed in detail. Then, we explain the basic concepts of mobile real-time supply chain coordination and concentrate on the motivations for implementing such a system. The vision of intelligent wireless web services, as discussed in this paper, centers on the need to provide mobile supply chain members with highly specific data and services in real time, on an as-needed basis and with flexibility for the user. In this regard, we investigate nine enabling technologies of the IWW for our system and discuss how, by exploiting the convergence and synergy between different technologies, it has become possible to deliver intelligent wireless web support to mobile real-time supply chain coordination. Afterwards, a practical framework is established in four phases. This system has been implemented in the laboratory and has passed its evaluation successfully. Further details will be announced in the near future in another research article.
Article
The paper defines and clarifies basic concepts of enterprise architectures. It then presents an overview of architectures for enterprise integration developed since the mid-1980s. The main part of the paper focuses on recent developments in architectures for enterprise interoperability. The main initiatives and existing works are presented. Future trends and some research issues are discussed, and conclusions are given at the end of the paper.
Article
The Business Process Modelling Notation (BPMN) is a standard for capturing business processes in the early phases of systems development. The mix of constructs found in BPMN makes it possible to create models with semantic errors. Such errors are especially serious, because errors in the early phases of systems development are among the most costly and hardest to correct. The ability to statically check the semantic correctness of models is thus a desirable feature for modelling tools based on BPMN. Accordingly, this paper proposes a mapping from BPMN to a formal language, namely Petri nets, for which efficient analysis techniques are available. The proposed mapping has been implemented as a tool that, in conjunction with existing Petri net-based tools, enables the static analysis of BPMN models. The formalisation also led to the identification of deficiencies in the BPMN standard specification.
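The flavour of such a mapping can be sketched for the simplest case, a linear sequence of BPMN tasks: each task becomes a Petri net transition and each sequence flow a place, after which the token game gives a trivial completion check. This is an illustrative simplification; the paper's mapping also covers gateways, events and sub-processes:

```python
def bpmn_sequence_to_petri_net(tasks):
    """Map a linear BPMN task sequence to a Petri net: one transition per
    task, one place per sequence flow (plus start and end places)."""
    places = [f"p{i}" for i in range(len(tasks) + 1)]
    transitions = [
        {"name": t, "consume": places[i], "produce": places[i + 1]}
        for i, t in enumerate(tasks)
    ]
    return places, transitions


def fires_to_completion(places, transitions):
    """Play the token game from the start place; True if the end place
    ends up marked (a trivial soundness check for the linear case)."""
    marking = {p: 0 for p in places}
    marking[places[0]] = 1          # one token on the start place
    for t in transitions:
        if marking[t["consume"]] == 0:
            return False            # transition not enabled: model is stuck
        marking[t["consume"]] -= 1
        marking[t["produce"]] += 1
    return marking[places[-1]] == 1


places, transitions = bpmn_sequence_to_petri_net(["receive", "check", "ship"])
```

Once a model is in Petri net form, standard analysis tools can check properties such as the absence of deadlocks, which is precisely why the paper targets Petri nets as the formal language.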
Conference Paper
Web services provision is the process of assigning particular services to the constituent tasks of business processes. It describes a promising scenario where services are dynamically chosen and invoked in a business process according to their functional and non-functional capabilities. It also introduces many challenging problems and has received much attention recently. In this paper, we distinguish three types of provision: independent provision, cooperative provision and multiple provisioning. For each type, we consider three major phases, that is, service discovery, service selection and service contracting. Following this rule, we investigate existing works in web services provision and shed some light on potential research directions.
Article
Enterprise networking refers to any kind of organization structure in which two or more geographically dispersed business entities need to work in interaction. This can happen within a single distributed enterprise (networked enterprise) or among several enterprises (network of enterprises), including the extended enterprise or virtual organizations. This concerns any kind of organization, e.g. industrial firms, public organizations or large government agencies. Enterprise interoperability is a sine qua non for enterprise integration and networking. It largely relies on information and communication technologies (ICT), especially Internet computing. The paper uses the European Interoperability Framework (EIF) as a foundational baseline, first to discuss technical, semantic and organizational aspects of enterprise interoperability and networking, and finally to address some open research issues.
Article
Modern medical information management is a knowledge intensive activity requiring a high degree of interoperability across various health management entities. Ontology-based multi-agent systems provide a framework for interactions in a distributed medical systems environment without the limitations of a more traditional client server approach. In this paper, we describe electronic Medical Agent System (eMAGS) a multi-agent system with an ontology based on an accepted public health message standard, Health Level Seven (HL7), to facilitate the flow of patient information across a whole healthcare organisation.
Conference Paper
With the increasing growth in popularity of Web services, matchmaking of relevant Web services becomes a significant challenge. Commonly, a Web service is described by WSDL and published on UDDI registries. UDDI provides limited search facilities, allowing only a keyword-based search of businesses, services, and the so-called tModels based on names and identifiers. This category-based keyword-browsing method is clearly insufficient. Semantic Web services use DAML-S instead of WSDL to represent the capabilities of Web services. This improvement enables software agents or search engines to automatically find appropriate Web services via ontologies and reasoning-based methods. However, the high cost of formally defining heavy and complicated services makes widespread adoption of this improvement unlikely. To cope with these limitations, we have developed a suite of methods that assess the similarity of Web services to achieve matchmaking. In particular, we present a conceptual model that classifies the properties of Web services into four categories. For each category, a similarity assessment method is given. In the Web service matchmaking process, these similarity assessment methods can be used together or individually. Experiments highlight the complementary contributions that our work makes to facilitating Web service matchmaking.
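A sketch of the category-wise similarity idea, assuming a simple token-overlap measure for the lexical category and a weighted average as the aggregator. Both are illustrative stand-ins, not the authors' actual assessment methods:

```python
def jaccard(a, b):
    """Token-set overlap, a common lexical similarity for service names
    and descriptions; returns a score in [0, 1]."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0


def aggregate_similarity(scores, weights=None):
    """Combine per-category similarity scores (each in [0, 1]) for a pair
    of Web services into one matchmaking score via a weighted average."""
    if weights is None:
        weights = {c: 1.0 for c in scores}   # equal weights by default
    total_w = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_w


# Hypothetical pair of services: score one category lexically, assume the
# functional category was scored perfectly by some other method.
lexical = jaccard("hotel booking service", "hotel reservation service")
overall = aggregate_similarity({"lexical": lexical, "functional": 1.0})
```

Keeping the categories separate, as the conceptual model does, lets a matchmaker use the assessors together (as above) or individually, e.g. ranking on functional similarity alone.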
Article
Enterprise integration encourages the development of beneficial applications. Enterprise architects tend to favor practices and approaches based on a service-oriented architecture (SOA). Many approaches to connecting software systems, including but not limited to CORBA, many Java-based integration systems, and .NET, generally follow the best practices of SOA.