Chapter

Putting Interoperability on the Map: Towards a Framework of Interoperability Approaches and Tools


Abstract

Information sharing and interoperability are key ingredients for any system that participates in Service Oriented Architectures (SOA) and wants to communicate and exchange information with other partners. Although many technologies support interoperability, there is apparently no relevant research on how to extract and aggregate a system's interoperability requirements or to identify the architecture components and tools needed to meet them. This chapter reviews the principles and constraints that affect architecture design, surveys research efforts on interoperability infrastructures, and proposes a set of architecture components and tools that can enable, support, and maintain interoperability in heterogeneous, dynamic, and constantly changing environments.


References
Article
Full-text available
Among the defining purposes of e-Government, highly agile, citizen-centric, accountable, transparent, effective, and efficient government operations and services have high rank. To reach these goals, the integration of government information resources and processes, and ultimately the interoperation of independent e-Government information systems, appears essential. Yet most integration and interoperation efforts meet serious challenges and constraints. This article contributes to the development of a research framework on integration and system interoperation in e-Government. We propose future research projects to study the foci and purposes, limitations and constraints, as well as processes and outcomes of integration and interoperation in electronic Government. To guide that effort, we suggest needs-and-wants theory as one promising theoretical lens among others.
Article
Full-text available
Despite the numerous difficulties, many agencies and institutions in the United States and worldwide practice sharing of geographic information systems (GIS) and data, amounting to local, regional, national and international spatial data infrastructures. While the evidence on coordinated GIS development and spatial data sharing accumulates, there have been only a few systematic evaluations of the mechanisms and factors that facilitate or obstruct interorganizational GIS efforts. Moreover, the issue of "interoperability" has been addressed primarily in technical terms, with its "soft" or organizational side mostly neglected. This paper presents results from several recent studies in the U.S. context. It explores the nature and effectiveness of interorganizational interaction, coordination and implementation processes, and assesses the benefits from achieving organizational interoperability.
Article
Full-text available
Interoperable enterprise systems (be they supply chains, extended enterprises, or any form of virtual organizations) must be designed, controlled, and appraised from a holistic and systemic point of view. Systems interoperability is a key to enterprise integration, which recommends that the IT architecture and infrastructure be aligned with business process organization and control, themselves designed according to a strategic view expressed in an enterprise architecture. The paper discusses architectures and methods to build interoperable enterprise systems, advocating a mixed service and process orientation, to support synchronous and/or asynchronous operations, both at the business level (business events, business services, business processes) and at the application level (workflow, IT and Web services, application programs).
Article
Full-text available
Web services are expected to be the key technology in enabling the next installment of the Web in the form of the Service Web. In this paradigm shift, Web services would be treated as first-class objects that can be manipulated much like data is now manipulated using a database management system. Hitherto, Web services have largely been driven by standards. However, there is a strong impetus for defining a solid and integrated foundation that would facilitate the kind of innovations witnessed in other fields, such as databases. This survey focuses on investigating the different research problems, solutions, and directions to deploying Web services that are managed by an integrated Web Service Management System (WSMS). The survey identifies the key features of a WSMS and conducts a comparative study on how current research approaches and projects fit in.
Article
Full-text available
Offers guidelines for evaluating whether centralization or decentralization of application software is most appropriate for a particular organization. Advantages of centralization and decentralization; Hardware and software solutions; Elements of centralization and decentralization; Application modularity; Feasibility of a uniform software configuration.
Article
Full-text available
The "Internet of Things," once reality, will have to rely on a global IT infrastructure that provides information about all those "things" in a secure and reliable manner. The EPCglobal Network is a proposal for a widely distributed information system to offer such services. But it may introduce more challenges concerning security, privacy, and political control than was initially anticipated. If the vision of many RFID proponents comes true, more and more common objects will soon acquire a cyber presence. Objects will be equipped with RFID tags containing identification data and possibly some additional information about the object in question (data on tag). To keep tag costs low, one may often just store an identifier and use it as a key to access databases containing the actual object information (data on network). This second approach is typical of "EPC tags": RFID tags that aim to replace the conventional barcode system. They use an Electronic Product Code (EPC, see Figure 1), which is globally unique, as a key to retrieve information from the EPCglobal Network, envisioned as a large distributed system of databases. The EPC standard represents a numbering framework that is independent of specific hardware features, such as tag generation or specific radio frequency. The databases comprising the EPCglobal Network are to be run by manufacturers, logistic providers, retailers, or third parties, and can be accessed via special web services called EPC Information Services (EPCIS). The network architecture is designed and administered by the standardization consortium EPCglobal, which is a joint venture of GS1 U.S. (formerly Uniform Code Council) and GS1 (formerly EAN International). By improving the information flow as objects pass from suppliers to manufacturers, distributors, retail stores, and customers, the EPCglobal Network aims to facilitate cooperation within supply chains and thus to make them more efficient. Once established, it could also be used to support a wide range of applications in the area of ubiquitous computing. An often-cited example is the "smart home," in which "intelligent" cupboards and fridges could be realized using RFID technology. By scanning the RFID tags on objects and using the EPCglobal Network for information retrieval, such devices can identify their current content and offer new services like food counseling or automated replenishing of goods. As a result of this broadened use of the EPCglobal Network, its security context would change from closed supply chains to the rather open environments of ubiquitous computing, just like the security context of the Internet was changed by moving from relatively closed groups of fellow researchers to the global environment it represents today. In this article, we first describe the EPCglobal Network architecture, as currently specified. We then discuss its security and privacy risks, as well as possible countermeasures. We conclude with suggestions on how to improve existing design proposals, once appropriate security and privacy requirements have been established.
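To make the data-on-network pattern concrete, here is a minimal sketch: the tag carries nothing but an identifier, and all object information is looked up in a networked registry. The EPC URI and the in-memory registry are illustrative assumptions standing in for the distributed EPCIS databases; this does not use the real EPCglobal/EPCIS interfaces.

```python
# Minimal sketch of the "data on network" pattern: the RFID tag stores only an
# identifier (an EPC), which is used as a key to look up object data elsewhere.
# The URI layout and the registry below are illustrative assumptions, not the
# real EPCglobal Network or EPCIS interfaces.
from dataclasses import dataclass
from typing import Optional

@dataclass
class EpcRecord:
    manufacturer: str
    product: str
    serial: str

# Hypothetical in-memory stand-in for the distributed EPCIS databases.
FAKE_EPCIS_REGISTRY = {
    "urn:epc:id:sgtin:0614141.107346.2017": EpcRecord("Acme Corp", "Widget", "2017"),
}

def resolve_epc(epc_uri: str) -> Optional[EpcRecord]:
    """Use the EPC read from a tag as a key into the networked registry."""
    return FAKE_EPCIS_REGISTRY.get(epc_uri)

print(resolve_epc("urn:epc:id:sgtin:0614141.107346.2017"))
print(resolve_epc("urn:epc:id:sgtin:0000000.000000.0000"))  # unknown object -> None
```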
Article
Full-text available
A key element to realizing the Semantic Web is developing a suitably rich language for encoding and describing Web content. Such a language must have a well-defined semantics, be sufficiently expressive to describe the complex interrelationships and constraints between Web objects, and be amenable to automated manipulation and reasoning with acceptable limits on time and resource requirements. A key component of the Semantic Web services vision is the creation of a language for describing Web services. DAML-S is such a language: it is a DAML+OIL ontology for describing Web services that a coalition of researchers created with support from DARPA.
Article
Full-text available
Information exchange across various networks is made possible with the help of ontologies. This will be possible with the development of a joint standard for specifying and exchanging ontologies. A standard named OIL is proposed to achieve this purpose. Different levels of complexity can be achieved, as OIL is properly grounded in Web languages and its inner layers provide for efficient reasoning support based on FaCT. The application of ontologies is possible in the areas of knowledge management, Web commerce and e-business.
Article
Full-text available
In a virtual enterprise (VE), a company assembles a temporary consortium of partners and services for a certain purpose. The general rationale for forming the VE is to reduce costs and time to market while increasing flexibility and access to new markets and resources. As much as possible, individual companies focus on core competencies and mission-critical operations, outsourcing everything else. This paper examines some of the technical reasons why the VE still remains out of reach and shows the work that remains to be done.
Article
Full-text available
The article focuses on an architectural framework that permits the flexibility, interoperability and openness needed for electronic-commerce applications. Supply-chain management forces companies to streamline the ways they manufacture, distribute, and sell products and ultimately improve the way organizations conduct business. Multiple enterprises within a shared market segment collaboratively plan, implement, and manage the flow of goods, services and information along the value system in a way that increases customer-perceived value and optimizes the efficiency of the chain. Company value chains are transformed into integrated value systems if they are designed to act as an extended enterprise, creating and enhancing customer-perceived value by means of cross-enterprise collaboration. An electronic-commerce application should eliminate the gaps between ordering, distribution, and payment, enabling the development of interoperable links to record-keeping and accounting information systems.
Chapter
This chapter focuses on security concerns that arise or are amplified for the agile enterprise, particularly concerns raised by service-oriented architecture (SOA). It assumes that attention has already been given to such concerns as physical security, threats of intrusion from the World Wide Web, and virus protection. Its purpose is to provide an understanding of the requirements and approaches for optimizing security for services, from an enterprise perspective, so that management can plan for and commit to the necessary enterprise transformation. With the advent of the public Internet and the World Wide Web, security risks have increased dramatically. Systems are exposed to public access and email messages can carry or link to corrupting software. Automation and electronic communications have added new dimensions to security concerns. Electronic integration of services, extending beyond the walls of the enterprise, has created new security exposures. Fortunately, SOA technology and related industry standards have created new opportunities for accountability and control.
Article
The term electronic government (e-government) mainly refers to the usage of information and communications technology (ICT) to modify structures and procedures of government agencies. Acknowledging the necessity of utilizing the new electronic, information, and communication technologies, the movement toward implementation of e-government in Iran has recently received the attention of authorities and policy makers. Public administrations have been very much concerned about the architecture of e-government, especially because of the boost of e-government that has taken place in recent years. The paper seeks to provide a set of heuristic principles affecting the overall architecture of e-government with respect to Iranian government-to-government (G2G) context requirements, which might be applicable to other developing countries with some customization. It is worth mentioning that the grounded action research method, which involves both inductive and deductive thinking, was applied to develop a systematic theory from the data.
Article
It is no longer sufficient for a winning organisation to operate in isolation, however effective it may be in performing its core business. To survive, let alone win, it must be part of one or more supply chains producing world class performance. Each company in the chain must be internally "lean" but additionally must operate in a "seamless" environment in which all information relevant to the efficient operation of the total system is available on time and in an undistorted form. The term "predator" has been coined in the literature to describe the supply chain leader with the vision, drive, and determination to re-engineer the entire supply chain so as to satisfy end-customer needs. The paper reviews the techniques available to "predators" seeking to gain competitive advantage for their supply chains, including industrial engineering, operations engineering, production engineering, and information technology. Not all conceivable improvements can be implemented overnight, however desirable they might appear, hence the advocacy of simulation models within a decision support system so that top management can prioritise proposed improvement programmes against the relevant performance metric. In the example used to indicate the approach, the technological, organisational, and attitudinal problems to be solved by top management in achieving the seamless supply chain are all highlighted.
Although information exchange among trading partners is consistently mentioned as a key requirement of successful supply chain management implementation, research on information exchange is scarce. This lack of research provides little guidance and support for those managers interested in improving their logistics operations through increased information exchange. The main goal of this paper is to identify potential antecedents of information exchange. Questionnaires were sent to logistics managers at manufacturing firms in several industries. The results of this exploratory study are detailed and the implications for logistics managers discussed.
Article
Companies in the global market are seeking to improve their competitiveness through collaborative work and partnerships, which motivates them to look for enhanced interoperability between computer systems and applications. However, the high degree of system heterogeneity and the companies' lack of resources and know-how have been preventing organizations from moving ahead in that direction. Today, the OMG's model-driven architecture (MDA) makes available an open approach to write specifications and develop applications, separating the application and business functionality from the platform technology. As well, the service-oriented architecture (SOA) establishes a software architectural concept that defines the use of services to support the requirements of software users, making them available as independent services accessible in a standardized way. Together, these two architectures seem to provide a suitable framework to improve companies' competitiveness through the adoption of a standard-based extended environment, challenging and enhancing the interoperability between computer systems and applications in industry. The paper, after illustrating the general motivations that industrial SMEs have to adopt open architectures to achieve interoperability for extended and collaborative enterprise practices, presents the emerging model-driven and service-oriented architectures. It then describes an innovative case study under validation by industry, proposing a standard-based extendable platform to support an interoperable environment through the adoption of MDA and SOA. The paper finishes with discussion and concluding remarks concerning the empirical results obtained from the pilot demonstrator.
Article
Small to medium sized public organizations (SMPOs) share some of their e-Government requirements with their larger counterparts, such as the pending needs for interoperability, security and user friendliness. Additionally, they have some specific needs that are either unique in their context or more demanding due to their characteristics. These are cost and resource considerations, enhanced accessibility, greater scalability due to the larger number of citizens and businesses served, and automated processing because of the restricted number of trained personnel. This paper first proposes an architecture for a secure e-Government platform based on Web Services, which addresses the above requirements. Secondly, a specific service is built upon the proposed platform, in which a municipality generates and securely delivers a digital birth certificate to a citizen or another municipality.
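The secure-delivery step invites a small illustration. The sketch below is a minimal stand-in, assuming a pre-shared secret between the issuing and receiving parties and using an HMAC to protect the certificate's integrity; the platform described above would instead rely on Web Services security standards and public-key signatures, and all names and values here are hypothetical.

```python
# Minimal sketch of integrity-protected delivery of a digital certificate.
# Assumes a pre-shared secret between the two parties and uses an HMAC; the
# actual platform would use WS-Security and public-key signatures.
import hashlib
import hmac
import json

SHARED_SECRET = b"demo-secret-between-municipalities"  # hypothetical

def issue_certificate(payload: dict) -> dict:
    """Municipality A serializes the certificate and attaches an HMAC tag."""
    body = json.dumps(payload, sort_keys=True)
    tag = hmac.new(SHARED_SECRET, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def verify_certificate(message: dict) -> bool:
    """The receiver recomputes the tag and compares it in constant time."""
    expected = hmac.new(SHARED_SECRET, message["body"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

cert = issue_certificate({"type": "birth certificate", "name": "Jane Doe", "born": "2001-05-17"})
print(verify_certificate(cert))                      # True
cert["body"] = cert["body"].replace("Jane", "John")  # tampering is detected
print(verify_certificate(cert))                      # False
```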
Article
Interoperability of heterogeneous applications is defined as the ability for multiple software applications written in different programming languages, running on different platforms with different operating systems, to communicate and interact with one another over different computer networks. The emerging middleware technologies, including CORBA, COM/DCOM, and Enterprise JavaBeans, offer an industrial de facto standard communication infrastructure to support the interoperability of heterogeneous applications in components. However, the implementation of a component suffers from high interaction complexities that seriously degrade application independence. Software components should be built to be independent of the context in which they are used, allowing them to be reused in many different computing environments. In this paper, we present an adapter to isolate, encapsulate, and manage a component's interactions outside the component. Dynamic interface binding was designed to allow an adapter to examine the signature of the requested services at runtime, such as operation names, parameter order, parameter types, and parameter sizes. The interfaces of interconnecting components are bound at runtime. In addition, interface language mapping allows an interface in a specific programming language to be automatically generated from an IDL interface. The use of adapters increases the reusability of components and also simplifies the integration of the components into an application.
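As a concrete illustration of the adapter idea, here is a minimal sketch in Python rather than a CORBA/COM/EJB setting: the adapter holds a hypothetical operation map, checks the requested signature at runtime, and forwards the call to whatever method the wrapped component actually exposes. The component and operation names are invented for the example.

```python
# Minimal sketch of an adapter that manages a component's interactions outside
# the component and binds requested operations to its interface at runtime.
import inspect

class LegacyInventory:
    """A component with its own interface and naming conventions."""
    def fetch_item(self, item_id: str) -> dict:
        return {"id": item_id, "stock": 42}

class Adapter:
    """Maps a requested operation name to the wrapped component's own method,
    checks the call signature at runtime, and forwards the call."""
    def __init__(self, component, operation_map):
        self._component = component
        self._operation_map = operation_map  # requested name -> component method name

    def invoke(self, operation: str, *args, **kwargs):
        target_name = self._operation_map.get(operation, operation)
        target = getattr(self._component, target_name)
        inspect.signature(target).bind(*args, **kwargs)  # reject ill-formed calls
        return target(*args, **kwargs)

# The caller is written against a neutral operation name ("getItem"); only the
# adapter knows that this binds to LegacyInventory.fetch_item.
adapter = Adapter(LegacyInventory(), {"getItem": "fetch_item"})
print(adapter.invoke("getItem", "SKU-7"))  # {'id': 'SKU-7', 'stock': 42}
```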
Article
The need for managing distributed systems is prominent in e-government. Different applications and platforms that cover the overall range of the e-government implementation area need to interoperate in order to provide integrated governmental services to the citizens. This paper proposes an ontologically principled service-oriented architecture for the administration and integration of distributed nodes in an e-government network. The goal of the proposed design is to improve effectiveness and coherence by taking advantage of the enabling technologies of service-oriented computing, web services and ontologies. A two-level semantic mediator model is proposed to both provide an integrated description of entities and map actual information to them. This architecture was used on a prototype system developed for managing distributed educational directorates in the prefecture of Achaia, Greece. The pilot use of the system led to efficient decision making since it managed to mine information that was previously ‘buried’ in the local governmental infrastructure nodes.
Article
Web services have acquired enormous popularity among software developers. This popularity has motivated developers to publish a large number of Web service descriptions in UDDI registries. Although these registries provide search facilities, they are still rather difficult to use and often require service consumers to spend too much time manually browsing and selecting service descriptions. This paper presents a novel search method for Web services called WSQBE that aims at both easing query specification and assisting discoverers by returning a short and accurate list of candidate services. In contrast with previous approaches, the WSQBE discovery process is based on an automatic search space reduction mechanism that makes this approach more efficient. Empirical evaluations of WSQBE's search space reduction mechanism, retrieval performance, processing time and memory usage, using a registry with 391 service descriptions, are presented.
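To make the two-stage idea tangible, here is a toy sketch under invented assumptions: a coarse categorization step first prunes the registry, and only the surviving descriptions are ranked against the example query by word overlap. It illustrates the general search-space-reduction pattern, not the WSQBE algorithm itself.

```python
# Toy sketch of query-by-example discovery with search space reduction:
# stage 1 prunes the registry to one coarse category, stage 2 ranks only the
# surviving service descriptions by word overlap with the example query.
SERVICE_REGISTRY = [
    {"name": "ZipWeather",   "category": "weather", "description": "forecast temperature by zip code"},
    {"name": "CityForecast", "category": "weather", "description": "five day weather forecast for a city"},
    {"name": "FxRates",      "category": "finance", "description": "currency exchange rate lookup"},
]

def categorize(query: str) -> str:
    """Crude stand-in for the automatic search space reduction step."""
    return "weather" if "forecast" in query or "weather" in query else "finance"

def rank(query: str, registry):
    query_terms = set(query.lower().split())
    candidates = [s for s in registry if s["category"] == categorize(query)]
    scored = [(len(query_terms & set(s["description"].split())), s["name"]) for s in candidates]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

print(rank("weather forecast for my city", SERVICE_REGISTRY))  # ['CityForecast', 'ZipWeather']
```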
Article
Originally coming from the business world, the service-oriented architecture (SOA) paradigm is expanding its range of application into several different environments. Industrial automation is increasingly interested in adopting it as a unifying approach with several advantages over traditional automation. In particular, the paradigm is well suited to support agile and reconfigurable supply chains due to its dynamic nature. In this domain, the main goals are short time-to-market, fast application (re)configurability, more intelligent devices with lifecycle support, technology openness, seamless IT integration, etc. The current research challenges associated with the application of SOA to reconfigurable supply chains are enumerated and detailed with the aim of providing a roadmap towards a wider adoption of SOA to support agile reconfigurable supply chains.
Article
Given the current explosion of information resources available to decision makers, achieving semantic interoperability between a data source (e.g., a database) and a data receiver (e.g., a decision maker) is more critical than ever. As decision makers interact with unfamiliar sources that have been independently created and maintained, they need to expend non-trivial cognitive effort to understand the meaning of the information contained within these sources. To address this problem, an architecture called the Context Interchange Architecture is proposed. The central component in this architecture is the context mediator, an intelligent agent which facilitates source-receiver interoperability by enabling the receiver to issue queries and to be presented with answers in a manner that is consistent with the receiver's preferences, goals and knowledge. As a result, a convenient and consistent interface is presented to the receiver, reducing the cognitive effort required for the interaction. In this paper, the theoretical foundation for this architecture, based on the philosophical disciplines of Ontology and Semantics, is presented. A key result of this paper is the formal definition of the external behavior of the context mediator. Such a formal characterization provides a basis for the subsequent design of the knowledge representation and reasoning processes internal to the mediator.
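A minimal sketch of the mediation idea follows, assuming a toy source, receiver and conversion table: the mediator translates answers from the source's context into the receiver's, so the receiver can query without even being aware of the mismatch. This illustrates the general pattern only, not the formal Context Interchange mediator defined in the paper.

```python
# Minimal sketch of a context mediator: the source reports values in its own
# context, the receiver declares its preferred context, and the mediator
# converts answers so the receiver never sees the mismatch. The contexts and
# the conversion table are illustrative assumptions.
SOURCE_CONTEXT = {"currency": "EUR"}
RECEIVER_CONTEXT = {"currency": "USD"}
CONVERSION = {("EUR", "USD"): 1.10, ("USD", "EUR"): 1 / 1.10}

def source_query(product: str) -> float:
    """The underlying data source answers in its own context (EUR)."""
    return {"laptop": 900.0}[product]

def mediated_query(product: str) -> float:
    """The mediator rewrites the answer into the receiver's context."""
    value = source_query(product)
    src, dst = SOURCE_CONTEXT["currency"], RECEIVER_CONTEXT["currency"]
    return value if src == dst else round(value * CONVERSION[(src, dst)], 2)

print(mediated_query("laptop"))  # 990.0 USD, although the source stores 900.0 EUR
```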
Article
One of the most significant difficulties with developing Service-Oriented Architecture (SOA) involves meeting its security challenges, since the responsibilities of SOA security rest on both the service providers and the consumers. In recent years, many solutions to these challenges have been implemented, such as the Web Services Security Standards, including WS-Security and WS-Policy. However, those standards are insufficient for the new generation of Web technologies, including Web 2.0 applications. In this research, we propose an intelligent SOA security framework by introducing its two most promising services: the Authentication and Security Service (NSS), and the Authorization Service (AS). The suggested autonomic and reusable services are constructed as an extension of WS-* security standards, with the addition of intelligent mining techniques, in order to improve performance and effectiveness. In this research, we apply three different mining techniques: association rules, which help to predict attacks; the Online Analytical Processing (OLAP) cube, for authorization; and clustering mining algorithms, which facilitate access control rights representation and automation. Furthermore, a case study is explored to depict the behavior of the proposed services inside an SOA business environment. We believe that this work is a significant step towards achieving dynamic SOA security that automatically controls access to new versions of Web applications, including analyzing and dropping suspicious SOAP messages and automatically managing authorization roles.
Article
Many organisations are evaluating their supply chains because they are perceived to be an area for both cost cutting and increasing competitiveness. The objective is apparently very simple: optimise the supply chain via effective and efficient operating practices. This paper will demonstrate via the statistical analysis of 32 industrial case studies that the route to this desired fully integrated, effective supply chain is long established. The solution has been renamed, repackaged and adapted many times over the years, but what remains constant are the underlying principles of simplified material flow. The output of this paper includes the vision, design principles, and rules for action needed to enable effective supply chain integration.
Article
This paper empirically examines the impact of environmental uncertainty, intra-organizational facilitators, and inter-organizational relationships on information sharing and information quality in supply chain management. Based on the data collected from 196 organizations, multiple regression analyses are used to test the factors impacting information sharing and information quality respectively. It is found that both information sharing and information quality are influenced positively by trust in supply chain partners and shared vision between supply chain partners, but negatively by supplier uncertainty. Top management has a positive impact on information sharing but has no impact on information quality. The results also show that information sharing and information quality are not impacted by customer uncertainty, technology uncertainty, commitment of supply chain partners, and IT enablers. Moreover, a discriminant analysis reveals that supplier uncertainty, shared vision between supply chain partners and commitment of supply chain partners are the three most important factors in discriminating between the organizations with high levels of information sharing and information quality and those with low levels of information sharing and information quality.
Article
This paper investigates the problem of deploying network traffic monitors with optimized coverage and cost in an IP network. Deploying a network-wide monitoring infrastructure in operational networks is necessary for practical reasons. We investigate two representative solutions, a router-based solution called NetFlow and an interface-based solution called CMON. Several cost factors are associated with deploying either NetFlow or CMON in a network. We argue that enabling monitoring to cover a major portion of traffic instead of the entire traffic will achieve significant cost savings while still giving operators enough insight into their network. We use NetFlow as an example and develop a technique to achieve the optimal cost-coverage tradeoff. Specifically, we aim to solve the Optimal NetFlow Location Problem (ONLP) for a given coverage ratio. We analyze the various cost factors of enabling NetFlow in such a network. We model the problem as an Integer Linear Program (ILP). Given the problem's NP-hard nature, we develop two greedy heuristics to cope with large-scale instances. The performance of the ILP and heuristics is demonstrated by numerical results, and the LM heuristic is able to achieve sub-optimal solutions within 1–2% of the optimal solutions in a mixed router environment. It is observed that we can achieve 55% cost savings by covering 95% instead of 100% of the network traffic. We then extend our methodology to deploying CMON in such a network. The associated costs of deploying NetFlow and CMON are compared. The results demonstrate that CMON is more cost-effective when a small coverage ratio is desired because of its more modular nature.
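The coverage-cost tradeoff can be illustrated with a generic greedy sketch: keep adding the monitoring location with the best marginal coverage per unit cost until the target coverage ratio is reached. The topology, costs and flow sets below are invented, and this is not the paper's ILP formulation or its LM heuristic.

```python
# Greedy sketch of the coverage-cost tradeoff: pick monitoring locations with
# the best marginal coverage per unit cost until a target fraction of traffic
# is covered. Invented numbers; not the ILP or the LM heuristic from the paper.
ROUTERS = {          # router -> (deployment cost, set of traffic flows it observes)
    "r1": (5,  {"f1", "f2", "f3", "f4"}),
    "r2": (4,  {"f5", "f6"}),
    "r3": (20, {"f7"}),
}
ALL_FLOWS = {"f1", "f2", "f3", "f4", "f5", "f6", "f7"}

def greedy_placement(target_ratio: float):
    covered, chosen, total_cost = set(), [], 0
    while len(covered) / len(ALL_FLOWS) < target_ratio:
        best = max(
            (r for r in ROUTERS if r not in chosen),
            key=lambda r: len(ROUTERS[r][1] - covered) / ROUTERS[r][0],
        )
        chosen.append(best)
        total_cost += ROUTERS[best][0]
        covered |= ROUTERS[best][1]
    return chosen, total_cost, len(covered) / len(ALL_FLOWS)

# Covering ~86% of the flows costs 9, while insisting on 100% coverage costs 29.
print(greedy_placement(0.85))
print(greedy_placement(1.0))
```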
Article
Ontologies are used within the context of Spatial Data Infrastructures to denote a formally represented knowledge that is used to improve data sharing and information retrieval. Given the increasing relevance of semantic interoperability in this context, this work presents the specification and development of a Web Ontology Service (WOS), based on the OGC Web Service Architecture specification, whose purpose is to facilitate the management and use of lexical ontologies. Additionally, this work shows how to integrate this service with Spatial Data Infrastructure discovery components in order to obtain a better classification of resources and an improvement in information retrieval performance.
Article
Enterprise integration has recently gained greater attention than ever before. The paper deals with an essential activity enabling seamless enterprise integration, that is, similarity-based schema matching. To this end, we present a supervised approach to measure semantic similarity between XML schema documents and, more importantly, address a novel approach to augment reliably labeled training data from a given few labeled samples in a semi-supervised manner. Experimental results reveal that the proposed method is very cost-efficient and reliably predicts semantic similarity.
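As a tiny illustration of the kind of element-level similarity such matchers start from, the sketch below tokenizes element names and compares token sets with the Jaccard coefficient. The element names are invented, and the supervised training and semi-supervised label augmentation described above are not reproduced here.

```python
# Tiny sketch of element-level similarity for schema matching: split element
# names into tokens and compare token sets with the Jaccard coefficient.
import re

def tokens(name: str) -> set:
    """Split camelCase / snake_case element names into lowercase tokens."""
    return {t.lower() for t in re.findall(r"[A-Za-z][a-z]*", name)}

def jaccard(a: str, b: str) -> float:
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb)

print(jaccard("CustomerName", "customer_full_name"))  # ~0.67 -> probable match
print(jaccard("CustomerName", "InvoiceTotal"))        # 0.0   -> no match
```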
Article
Governments worldwide are increasingly using Web-based business models to enhance their service delivery. Yet the concept of the business model is unexplored within the context of e-government. Drawing upon the literature on e-commerce, we develop a taxonomy for analyzing Web-based business models for e-government. Based on a systematic survey of 59 e-government Web sites in the Netherlands, our findings indicate that most of the Web sites use the content provider or direct-to-customer business models, while only a few are using novel business models. Overall, the concept of business model is appealing and useful in the public sector. Specifically, it complements research on Web site quality by analyzing and describing Web sites using atomic e-government business models and suggesting improvements by using combinations of business models.
Article
The Government of Canada (GoC) has implemented several standardization initiatives toward establishing e-government in order to systematize the capture, description, organization and dissemination of data and information. This study examines the GoC's metadata strategy through the adoption of a Dublin Core (DC)-based metadata scheme toward establishing one unified metadata framework. The study examines the credibility of DC in relation to interoperability, application profiles, and controlled vocabularies and further provides a discussion on the current problems associated with metadata and possible improvements across government agencies in the GoC.
Article
Schema matching is a critical step for discovering semantic correspondences among elements in many data-sharing applications. Most existing schema matching algorithms produce scores between schema elements, resulting in the discovery of only simple matches. Such results partially solve the problem. Identifying and discovering complex matches is considered one of the biggest obstacles to completely solving the schema matching problem. Another obstacle is the scalability of matching algorithms on large numbers of large-scale schemas. To tackle these challenges, in this paper we propose a new XML schema matching framework based on the use of Prüfer encoding. In particular, we develop and implement the XPrüM system, which consists mainly of two parts: schema preparation and schema matching. First, we parse XML schemas and represent them internally as schema trees. Prüfer sequences are constructed for each schema tree and employed to construct a sequence representation of schemas. We capture schema tree semantic information in Label Prüfer Sequences (LPS) and schema tree structural information in Number Prüfer Sequences (NPS). Then, we develop a new structural matching algorithm exploiting both LPS and NPS. To cope with complex matching discovery, we introduce the concept of compatible nodes to identify semantic correspondences across complex elements first; then the matching process is refined to identify correspondences among simple elements inside each pair of compatible nodes. Our experimental results demonstrate the performance benefits of the XPrüM system.
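The core encoding step can be shown in a few lines: a labeled tree (such as a parsed schema tree with numbered nodes) is flattened into its Prüfer sequence by repeatedly deleting the smallest-numbered leaf and recording its neighbor. The example tree is invented; the LPS/NPS split and the matching algorithms of XPrüM are not reproduced here.

```python
# Sketch of the Prüfer encoding that sequence-based schema matchers build on:
# repeatedly delete the smallest-numbered leaf and record its neighbor until
# only two nodes remain, yielding a sequence of length n - 2.
def prufer_sequence(adjacency):
    """Return the Prüfer sequence of a tree given as {node: set of neighbors}."""
    adj = {node: set(neighbors) for node, neighbors in adjacency.items()}
    sequence = []
    while len(adj) > 2:
        leaf = min(node for node, nbrs in adj.items() if len(nbrs) == 1)
        (parent,) = adj[leaf]
        sequence.append(parent)
        adj[parent].discard(leaf)
        del adj[leaf]
    return sequence

# Example tree:      4
#                   / \
#                  3   5
#                 / \
#                1   2
tree = {1: {3}, 2: {3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4}}
print(prufer_sequence(tree))  # [3, 3, 4]
```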
Article
The Intelligent Wireless Web (IWW) employs the capabilities of high speed wireless networks and exploits the parallel advancements in Internet-based technologies such as the Semantic Web, Web Services, agent-based technologies, and context awareness. Considering its great potential to be applied in business systems, we have devised an innovative model, based on IWW services, for a typical mobile real-time supply chain coordination system which has been developed and tested in a real operational environment. Our article investigates the proposed system as follows: at the start, the building blocks of the IWW are discussed in detail. Then, we fully explain the basic concepts of mobile real-time supply chain coordination and concentrate on the motivations to implement such a modern system. The vision of intelligent wireless web services, as discussed in this paper, centers on the need to provide mobile supply chain members highly specific data and services in real time on an as-needed basis, with flexibility of use for the user. In this regard, we investigate nine enabling technologies of the IWW for our system and discuss how, by exploiting the convergence and synergy between different technologies, it has become possible to deliver intelligent wireless web support to mobile real-time supply chain coordination. Afterwards, a practical framework is clearly established in four phases. This system has been implemented in the laboratory and has passed the evaluation processes successfully. Further details will be announced in the near future in another research article.
Article
The paper defines and clarifies basic concepts of enterprise architectures. Then an overview of architectures for enterprise integration developed since the middle of the 1980s is presented. The main part of the paper focuses on the recent developments on architectures for enterprise interoperability. The main initiatives and existing works are presented. Future trends and some research issues are discussed, and conclusions are given at the end of the paper.
Article
The Business Process Modelling Notation (BPMN) is a standard for capturing business processes in the early phases of systems development. The mix of constructs found in BPMN makes it possible to create models with semantic errors. Such errors are especially serious, because errors in the early phases of systems development are among the most costly and hardest to correct. The ability to statically check the semantic correctness of models is thus a desirable feature for modelling tools based on BPMN. Accordingly, this paper proposes a mapping from BPMN to a formal language, namely Petri nets, for which efficient analysis techniques are available. The proposed mapping has been implemented as a tool that, in conjunction with existing Petri net-based tools, enables the static analysis of BPMN models. The formalisation also led to the identification of deficiencies in the BPMN standard specification.
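To give a flavor of such a mapping, the sketch below translates a trivially linear BPMN fragment into a Petri net, with one transition per task and one place per sequence flow, and plays a token through it. The fragment and the encoding are invented for illustration; the mapping defined in the paper covers gateways, events and the many other BPMN constructs.

```python
# Toy sketch of mapping a linear BPMN fragment (start -> Task A -> Task B -> end)
# onto a Petri net: each task becomes a transition, each sequence flow a place.
tasks = ["Receive order", "Ship goods"]
places = [f"p{i}" for i in range(len(tasks) + 1)]        # p0 = start, p2 = end
transitions = [
    {"name": task, "consumes": places[i], "produces": places[i + 1]}
    for i, task in enumerate(tasks)
]

def run(marking):
    """Fire enabled transitions until none remain; return the final marking."""
    fired = True
    while fired:
        fired = False
        for t in transitions:
            if t["consumes"] in marking:
                marking = (marking - {t["consumes"]}) | {t["produces"]}
                print(f"fired {t['name']!r}, marking now {sorted(marking)}")
                fired = True
    return marking

# A single token on the start place plays the role of the BPMN start event.
final = run({"p0"})
print("process completed:", final == {"p2"})
```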
Article
Modern medical information management is a knowledge intensive activity requiring a high degree of interoperability across various health management entities. Ontology-based multi-agent systems provide a framework for interactions in a distributed medical systems environment without the limitations of a more traditional client-server approach. In this paper, we describe the electronic Medical Agent System (eMAGS), a multi-agent system with an ontology based on an accepted public health message standard, Health Level Seven (HL7), to facilitate the flow of patient information across a whole healthcare organisation.
Article
Enterprise integration is a way that encourages the development of beneficial applications. Enterprise architects tend to favor practices and approaches based on a service-oriented architecture (SOA). Many approaches to connecting software systems, including but not limited to Corba, many Java-based integration systems, and .NET, generally follow the best practices of SOA.
Article
The semantic Web is a compelling vision, in which the World Wide Web will include a notion of meaning in data and services. Intelligent agents will exchange information and rules for how to interact with that information, with or without human intervention; appointments will be automatically scheduled; and automated agents will select and invoke services. Information will be easy to find without depending solely on keywords. In part one of this column, the author proposes several reasons that this vision hasn't yet been adopted despite substantial research funding in the US and European Union (EU). These reasons provide the foundation for a new approach, which is proposed in part two.
Article
This paper considers the interaction of HTTP with several transport protocols, including TCP, Transaction TCP, a UDP-based request-response protocol, and HTTP with persistent TCP connections. We present an analytic model for each of these protocols and use that model to evaluate the network overhead of carrying HTTP traffic across a variety of network characteristics. This model includes an analysis of the transient effects of TCP slow-start. We validate this model by comparing it to network packet traces measured with two protocols (HTTP and persistent HTTP) over local and wide-area networks. We show that the model is accurate within 5% of measured performance for wide-area networks, but can underestimate latency when the bandwidth is high and delay is low. We use the model to compare the connection-setup costs of these protocols, bounding the possible performance improvement. We evaluate these costs for a range of network characteristics, finding that setup optimizations are relatively unimportant for current modem, ISDN, and LAN users but can provide moderate to substantial performance improvement over high-speed WANs. We also use the model to predict performance over future network characteristics.
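A back-of-the-envelope sketch in the same spirit: one round trip for TCP connection setup, one for the request, and a response paced by slow-start, with the congestion window doubling each round trip until the transfer completes or the bandwidth-delay product is reached. The constants are invented and the formula is a rough over-estimate that counts each slow-start round as a full RTT on top of serialization time; it is not the paper's model.

```python
# Rough latency estimate for one HTTP exchange over a fresh TCP connection:
# setup RTT + request RTT + slow-start rounds + serialization time.
def http_over_tcp_latency(response_bytes, rtt_s, bandwidth_bps, mss=1460, init_cwnd=1):
    segments_left = -(-response_bytes // mss)                  # ceil(bytes / MSS)
    cwnd = init_cwnd
    max_cwnd = max(1, int(bandwidth_bps * rtt_s / (8 * mss)))  # cap near the BDP
    latency = 2 * rtt_s                                        # connection setup + request
    while segments_left > 0:
        latency += rtt_s                                       # one slow-start round
        segments_left -= min(cwnd, segments_left)
        cwnd = min(cwnd * 2, max_cwnd)
    latency += response_bytes * 8 / bandwidth_bps              # serialization time
    return latency

# A 20 KB page over a 70 ms RTT, 1.5 Mbit/s link: slow-start rounds dominate.
print(round(http_over_tcp_latency(20_000, 0.070, 1_500_000), 3))  # ~0.53 s
```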