Current research on distributed knowledge processes suggests a critical conflict between knowledge processes in groups and the technologies built to support them. The conflict centers on observations that authentic and efficient knowledge creation and sharing is deeply embedded in an interpersonal, face-to-face context, but that technologies to support distributed knowledge processes rely on the assumption that knowledge can be made mobile outside these specific contexts. This conflict is of growing national importance as work patterns change from same-site to separate-site collaboration, and millions of government and industrial dollars are invested in establishing academic-industry alliances and building infrastructures to support distributed collaboration and knowledge. We describe our multi-method approach for studying the tension between embedded and mobile knowledge in a project funded by the U.S. National Science Foundation's program on Knowledge and Distributed Intelligence. This project examines knowledge processes and technology in distributed, multidisciplinary scientific teams in the National Computational Science Alliance (Alliance), a prototypical next-generation enterprise. First, we review evidence for the tension between embedded and mobile knowledge in several research literatures. Then we present our three-factor conceptualization that considers how the interrelationships among characteristics of the knowledge shared, the group context, and the communications technology contribute to the tension between embedded and mobile knowledge. Based on this conceptualization, we suggest that this dichotomy does not fully explain distributed multidisciplinary knowledge processes. Therefore, we propose some alternate models of how knowledge is shared. Finally, we describe the data collection methodologies and present the current status of the project.
One major challenge to successful aging is the capability to preserve health or, from another perspective, to avoid disease. Unfortunately, a large percentage of elderly people live with chronic diseases or disabilities. Home care technologies and other emerging technologies have the potential to play a major role in a home-based approach to health care. The main goal of the research presented in this paper is to develop a cost-effective, user-friendly telemedicine system to serve elderly and disabled people in the community, to establish a continuous communication link between patients and caregivers, and to allow physicians to offer help when needed. Hence, we present here an integrated, user-friendly model of a smart home which provides telemedicine for the elderly at home. The paper focuses on implementation details and practical considerations of integrating the diverse technologies into a working system.
Adaptability has become one of the major research topics in the area of workflow management. Today's workflow management systems have problems dealing with both ad-hoc changes and evolutionary changes. As a result, the workflow management system is not used to support dynamically changing workflow processes or the workflow process is supported in a rigid manner, i.e., changes are not allowed or handled outside of the workflow management system. In this paper, we focus on a notorious problem caused by workflow change: the dynamic change bug (Ellis et al.; Proceedings of the Conference on Organizational Computing Systems, Milpitas, California, ACM SIGOIS, ACM Press, New York, 1995, pp. 10–21). The dynamic change bug refers to errors introduced by migrating a case (i.e., a process instance) from the old process definition to the new one. A transfer from the old process to the new process can lead to duplication of work, skipping of tasks, deadlocks, and livelocks. This paper describes an approach for calculating a safe change region. If a case is in such a change region, the transfer is postponed.
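The migration decision described above can be sketched in a few lines. The Python fragment below is illustrative only: it assumes the change region has already been computed as a set of task identifiers, whereas the paper's actual construction of the safe change region is considerably more involved.

```python
# Illustrative sketch of the migration rule: a case may be transferred to the
# new process definition only while none of its active tasks lie inside the
# computed change region; otherwise the transfer is postponed.

def can_migrate(active_tasks: set, change_region: set) -> bool:
    """Return True if the case is outside the change region and may migrate now."""
    return active_tasks.isdisjoint(change_region)

# Hypothetical example: the process change affects tasks B and C.
change_region = {"B", "C"}
print(can_migrate({"A"}, change_region))       # case outside the region: migrate
print(can_migrate({"B", "D"}, change_region))  # case inside the region: postpone
```

The point of the check is purely local: no global analysis of the running case is needed at transfer time, only a set intersection against the precomputed region.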
The UK National Health Service (NHS) is embarking on its largest-ever investment programme in Information Technology (IT). The
National Programme for IT (NPfIT) in the NHS is the biggest civil IT project in the world and seeks to revolutionise the way
care is delivered, drive up quality and make more effective use of resources of the NHS. Despite these high expectations,
the NHS has historically experienced some high profile IT failures and the sponsors of the programme admitted that there remain
a number of critical barriers to the implementation of the programme. Clinicians’ reluctance to accept new IT systems at a
local level is seen to be a major factor in this respect. Focusing on such barriers, this paper reports research that explored
and explained why such reluctance occurs in the NHS. The main contribution of this research derives from the distinctive approach
based on Kelly’s Personal Construct Theory (PCT) to understand the ‘reluctance’. The argument presented in the paper indicates
that such reluctance should be viewed not as deliberate resistance on the part of clinicians, but as their inability to change
their established group personal constructs related to ISDD activities. Therefore, this paper argues that the means of reducing
the ‘reluctance’ should be creative rather than corrective or normative. The research took place in an NHS Trust
and the paper pays considerable attention to technological, behavioural and clinical perspectives that emerged from the study.
The research was conducted as a case study in an NHS Trust and data were collected from two local NHS IT projects. The main research
participants in this study were: (a) IT professionals including IT project managers and senior IT managers; and (b) senior
Keywords: Healthcare information systems development and delivery; National Health Service; Reluctance; Cognitions; Personal constructs theory
This research represents a theoretical extension of the Technology Acceptance Model (TAM), which IS researchers have used
to explain technologies’ perceived usefulness and individuals’ intention to use them. The authors developed a model, referred
to as the Mobile Wireless Technology Acceptance Model (MWTAM), to test the relationship between theoretical constructs spanning
technological influence processes (Perceived Ubiquity and Perceived Reachability) and cognitive influence processes (Job
Relevance, Perceived Usefulness, and Perceived Ease of Use) and their impact on Behavioral Intention. MWTAM is assessed using
data collected from an online survey and analyzed using AMOS 5.0. Results provide evidence to support MWTAM as both the technological
and cognitive influence processes accounted for 58.7% of the variance explained in an individual’s Behavioral Intention toward
using mobile wireless technology. Additionally, the path coefficients between constructs ranged from 0.241 to 0.572 providing
further evidence to support the theoretical extension of TAM.
This article presents a ticket-based access model for mobile services. The model supports efficient authentication of users, services and service providers across different domains. Tickets are used to verify the correctness of the requested service as well as to direct billing information to the appropriate user. Service providers can avoid roaming to multiple service domains, contacting only a Credential Centre to certify the user's ticket, since tickets carry all the authorization information needed for the requested services. The user can preserve anonymity and read a clear record of charges in the Credential Centre at any time. Furthermore, the identity of misbehaving users can be revealed by a Trusted Centre.
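The local-verification idea above can be sketched with a signed ticket. The field names, the use of an HMAC tag, and the shared-key arrangement below are illustrative assumptions for the sketch, not the paper's actual protocol, which involves certificates and multiple parties.

```python
import hashlib
import hmac
import json

# Hypothetical stand-in for the Credential Centre's signing key.
CENTRE_KEY = b"demo-secret"

def issue_ticket(user_alias: str, service: str) -> dict:
    """Credential Centre side: sign the authorization details into the ticket.
    A pseudonymous alias stands in for the anonymity property."""
    body = {"user": user_alias, "service": service}
    payload = json.dumps(body, sort_keys=True).encode()
    body["tag"] = hmac.new(CENTRE_KEY, payload, hashlib.sha256).hexdigest()
    return body

def verify_ticket(ticket: dict) -> bool:
    """Service-provider side: verify locally, without contacting the user's
    home domain, because the ticket carries its authorization information."""
    body = {k: v for k, v in ticket.items() if k != "tag"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(CENTRE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(ticket["tag"], expected)

ticket = issue_ticket("anon-42", "video-stream")
print(verify_ticket(ticket))        # a genuine ticket verifies
ticket["service"] = "premium"       # any tampering invalidates the tag
print(verify_ticket(ticket))
```

The design point the sketch captures is that verification needs only the Centre's key and the ticket itself, so the provider never roams to the user's home domain.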
E-Health systems, through their use of Internet and wireless technologies, offer the possibility of near real-time data integration
to support the delivery and management of health care. In practice, the wide range of choice in technologies, vendors, protocols,
formats, and information representations can make even simple exchanges of information between systems problematic. Much of
the focus on healthcare interoperability has been on resolving interoperability issues in system-to-system information exchanges.
But issues around people-to-people and people-to-system interactions are just as important to address from an
interoperability point of view. In this paper, we identify interoperability deficiencies in collaborative care delivery and
develop a methodology in two parts. In the first part, an ontology is developed to represent collaborative care delivery.
In the second part, the ontology is used to design an architecture for interoperable clinical information system design. We
then use a case study in palliative care to provide a proof of concept of the methodology. The case study provides an inventory
of the interoperability requirements for palliative care and a perspective on the design and implementation of a people-oriented
clinical information system that supports collaborative health care delivery in palliative care.
Keywords: Collaborative care delivery; Ontology; Process interoperability; People and process interoperability
This paper looks at the challenges associated with consolidating and leveraging patient information recorded at various points
in a distributed, multi-jurisdictional health care system. We draw on insights from two ethnographic case studies to illuminate
varied issues related to interoperability of information management systems. Our first case study is an investigation of duplicate
medical charts which exist in several ambulatory care clinics located on the same campus as an acute care hospital. The second
case study is an ongoing exploratory project intended to develop an understanding of information collection, storage and handover
procedures in the pre-hospital care chain, a health care domain that includes varied actors and organizations with different
information needs. Whereas findings from our case studies show that achievement of interoperability will be difficult, our
analysis suggests ways to begin to overcome these challenges.
Keywords: Socio-technical issues; Interoperability; Handovers in care; Data handovers; Ghost charts; Shadow records; Duplicate medical charts; Electronic health records
Insider attacks are often subtle and slow, or preceded by behavioral indicators such as organizational rule-breaking which
provide the potential for early warning of malicious intent; both these cases pose the problem of identifying attacks from
limited evidence contained within a large volume of event data collected from multiple sources over a long period. This paper
proposes a scalable solution to this problem by maintaining long-term estimates that individuals or nodes are attackers, rather
than retaining event data for post-facto analysis. These estimates are then used as triggers for more detailed investigation.
We identify essential attributes of event data, allowing the use of a wide range of indicators, and show how to apply Bayesian
statistics to maintain incremental estimates without global updating. The paper provides a theoretical account of the process,
a worked example, and a discussion of its practical implications. The work includes examples that identify subtle attack behaviour
in subverted network nodes, but the process is not network-specific and is capable of integrating evidence from other sources,
such as behavioral indicators, document access logs and financial records, in addition to events identified by network monitoring.
Keywords: Insider; Behavioural; Subtle attack; Intrusion detection; Security; Evidence; Network; Bayesian
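The incremental estimation process this abstract describes can be sketched as a per-node log-odds update: each node keeps a single running number rather than its event history, and each indicator folds in its likelihood ratio. The prior and the indicator probabilities below are hypothetical, not drawn from the paper.

```python
import math

class NodeEstimate:
    """Running Bayesian estimate that a node (or individual) is an attacker.
    Only the log-odds is retained; event data is discarded after each update,
    and no global recomputation is ever needed."""

    def __init__(self, prior: float = 0.001):
        self.log_odds = math.log(prior / (1.0 - prior))

    def observe(self, p_given_attacker: float, p_given_benign: float) -> None:
        """Fold one event into the estimate via its log likelihood ratio."""
        self.log_odds += math.log(p_given_attacker / p_given_benign)

    @property
    def probability(self) -> float:
        return 1.0 / (1.0 + math.exp(-self.log_odds))

node = NodeEstimate(prior=0.001)
for _ in range(5):  # five weakly suspicious events, each 6x likelier under attack
    node.observe(p_given_attacker=0.3, p_given_benign=0.05)
print(round(node.probability, 3))  # estimate has risen enough to trigger investigation
```

Accumulated evidence from subtle, slow indicators thus surfaces as a threshold crossing on the stored estimate, which is what makes the scheme scalable over long observation periods.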
Wireless sensor networks (WSNs) enable smart environments to create pervasive and ubiquitous applications, which give context-aware
and scalable services to the end users. In this paper, we propose an architecture and design of a web application for sensor
network monitoring. Further, the variation in received signal strength indicator (RSSI) values is used for knowledge extraction.
Experiments are conducted in an indoor room environment to determine the activities of a person. For instance, a WSN consisting
of Moteiv’s Tmote Sky sensors is deployed in a bedroom to determine the sleeping behavior and other activities of a person.
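The RSSI-based knowledge extraction mentioned above can be sketched as a simple variance test over a window of link samples: movement near the nodes perturbs the radio channel, so fluctuating RSSI suggests activity. The sample values and the threshold below are hypothetical, and the paper's actual extraction is richer than this.

```python
import statistics

def detect_activity(rssi_window, threshold: float = 2.0) -> bool:
    """Classify a window of RSSI samples (dBm) as activity vs. idle.
    A high standard deviation indicates a perturbed channel, i.e. movement."""
    return statistics.stdev(rssi_window) > threshold

idle = [-71, -70, -71, -70, -71, -70]    # stable link: likely no movement
active = [-71, -62, -75, -58, -80, -66]  # fluctuating link: likely movement
print(detect_activity(idle), detect_activity(active))
```

A deployment would slide this window over a continuous RSSI stream per link and combine links to localize where in the room the activity occurs.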
Healthcare industries have not only been criticized for being slow in adopting IT (Barnes 2001) but also face tremendous challenges in developing and deploying healthcare information systems (HIS) successfully (Teoh and Cai 2009). In view of these challenges, this study aims to articulate the theory and practice of Enterprise Systems (ES), which are
perceived to produce an extensive suite of strategic, managerial, and operational benefits in the healthcare setting. Healthcare
institutions have begun to explore the possibilities of exploiting ES as a means to facilitate the delivery of high-quality
and integrated patient care. In particular, one of the benefits of ES is that it leads to better resource management in terms
of assets and manpower allocation. In our study, empirical data was collected and analyzed based on an in-depth case study
of two ES implementations at Alexandra Hospital in Singapore. Our findings contribute to the ES research on how an organization
manages its resource portfolio and activity during the implementation of a healthcare information system in a hospital. Theoretically,
we adapted and extended Sirmon et al.’s (2007) Dynamic Resource Management Model of Value Creation and integrated it with the Technochange Life Cycle framework proposed
by Markus (2004). Finally, this paper adds value by inductively deriving eight key resource management activities and seven key resources
that correspond to the phases of the technochange life cycle.
Keywords: Healthcare information systems (HIS); Enterprise systems (ES); Resource management activities (RMA); Resource-based view (RBV)
Mobile computing is undoubtedly one of the predominant computer usage paradigms in operation today. The implications of what
might be cautiously termed a usage paradigm shift have still not crystallised fully, either for society, or those envisaging
a new raft of applications and services for mobile users. However, fundamental to the current and future success of mobile
computing are mobile telecommunications networks. Such networks have been a success story in their own right in recent years,
both as traditional voice carriers and, increasingly importantly, as a conduit of mobile data. The potential for new mobile
data applications is immense, but, crucially, this potential is severely compromised by two factors inherent in mobile computing:
limited bandwidth and computationally restricted devices. Hence, the academic and commercial interest in harnessing intelligent
techniques as a means of mitigating these concerns, and ensuring the user experience is a satisfactory one. In this paper,
the broad area of intelligence in telecommunications networks is examined, and issues relating to the deployment of intelligent
technologies are explored. In particular, the potential of intelligent agents is identified as a viable mechanism for realising
a full end-to-end deployment of intelligence throughout the network, including possibly the most crucial component: the end
user’s device. As an illustration of the viability of this approach, a brief description of a mobile blogging application is presented.
As a result of technological advances, a new type of software system has emerged: a large number of distributed software
components are networked together through a task flow structure, and each component may have alternative algorithms among
which it can choose to process tasks. However, the increased complexity and vulnerability to adverse events of such systems
give rise to the need for more sophisticated yet scalable control mechanisms. In this study a control mechanism is designed
to meet this need. First, stress environments are implicitly modeled by quantifying the resource availability of the system
through sensors. Second, a mathematical programming model is built with the resource availability incorporated and with the
stability in system behavior assured. Third, a multi-tier auction market is designed to solve the programming model by distributing
computation and communication overheads. By periodically opening the auction market, the system can achieve desirable performance
adaptively to changing stress environments while assuring stability and scalability properties. The control mechanism devised
in this paper contributes to the efforts of managing the ever-increasing complexity of modern software systems.
Keywords: Adaptive software components; Distributed control; Stability; Scalability
This paper is an exploratory study into the role of software piracy in the decision to adopt a video game console. The paper
takes a rational choice perspective, where actors evaluate the deterrent cost of moral transgression before acting, to explore
how users with different levels of video game usage intensity approach the adoption decision, on the grounds that more experienced
users can better assess the costs and benefits of moral transgression. The study used focus groups and a literature review
to develop a set of factors based on the Theory of Planned Behavior. The resulting factors were operationalized in an online
survey of 285 subjects of a variety of ages and incomes. The ability to pirate console software was significant for adopters
but not non-adopters. Perceived deterrence was associated with greater system use, as measured by hours of console use per
Keywords: Console; Computer games; Adoption determinants; Piracy; Rational choice
Environmental concerns have led to a significant increase in the number and scope of compliance imperatives governing electrical,
electronics, and IT products across global regulatory environments. This is, of course, in addition to general compliance
and risk issues generated by the Sarbanes-Oxley Act, data protection and information privacy legislation, ethics and integrity
regulations, IT governance concerns, and so on. While the latter dimensions of enterprise-wide governance, compliance, and
risk (GRC) are far from straightforward, the complexity and geographical diversity of environment-based regulatory sources
cause considerable problems for organisations in the electrical, electronics and IT sectors. Although a variety of enterprise-level
information systems are presently available to help manage compliance and reduce risk across all areas, a majority of firms
still employ ad hoc solutions. This paper focuses on the much-underexplored issue of environmental compliance and risk.
The first objective of this exploratory study is to delineate the problems facing GRC and Environmental Health and Safety
(EH&S) functions in dealing with environmental regulations globally and to identify how these problems are being solved using
Environmental Compliance Management Systems (ECMS). The second objective is to propose a process-based conceptual model and
related IS framework on the design and adoption of ECMS that will inform future research and, it is hoped, the IS adoption
decisions of GRC and EH&S practitioners.
Despite the proliferation of Internet retailing and the relative novelty and complexity of this phenomenon, very little theory-guided qualitative research has been conducted to improve our understanding of the adoption of Internet retailing by SMEs. This study presents a theoretical framework for analyzing the adoption of Internet retailing by SMEs. Organizational readiness (IT sophistication, financial resources, and customer readiness), perceived benefits of Internet retailing, and environmental factors are proposed to be the key drivers of adoption of Internet retailing. This research was designed using a qualitative approach through in-depth case studies selected from firms in Hong Kong, where there is a proliferation of SMEs. The contextual meaning and practical manifestation of the key adoption factors were captured through the case studies. Our findings provide preliminary support for the proposed research framework, and contribute towards a better conceptual and practical understanding of the main factors driving SMEs to adopt Internet retailing. Contrary to popular belief, customer readiness for Internet shopping was not found to be a significant factor influencing SMEs' decision to adopt Internet retailing.
Key supplier-side factors that affect the usage level of mobile Internet were identified and the procedural mechanism among
the independent and dependent variables was investigated. For this, a research model was introduced to describe associations
among four external variables (access quality, service variety, cost rationality, and ease of use), an intermediate variable
(usefulness) and a dependent variable (the usage level of mobile Internet). Through the on-line survey, data were gathered
from actual mobile Internet users. Confirmatory factor analysis and path analysis were applied to test the overall integrity
of the research model and of proposed hypotheses. All four external variables affected user perceptions on the usefulness
of mobile Internet. Among them, service variety and cost rationality had a relatively larger influence on perceived usefulness.
Perceived usefulness of the mobile Internet had a positive effect on its usage, confirming the important role of usefulness
as a significant mediator between the four external variables and the dependent variable. Meanwhile, the cost rationality
was the only external variable with a direct influence on mobile Internet usage. Theoretical and practical implications of the study
results are discussed.
Keywords: Mobile internet; Mobile data services; Mobile internet services; Post-adoption usage; Technology acceptance
Our society is becoming increasingly more IT-oriented, and the images and sounds that reflect our daily life are being stored
mainly in digital form. This digital personal life can be part of the home multimedia content, and users demand to access
and possibly share this content (such as photographs, videos, and music) in a ubiquitous way: from any location and with
any device. The purpose of this article is twofold. First, we introduce the Feel@Home system, whose main objective is to enable
the previously mentioned vision of a ubiquitous digital personal life. Second, we describe the security architecture of Feel@Home,
analyzing the security and privacy requirements that identify which threats and vulnerabilities must be considered, and deriving
the security building blocks that can be used to protect both IMS-based and VPN-based solutions.
Keywords: Digital home; Content sharing; Multimedia; Security; Privacy
This paper describes the rationale, design and implementation of a system for increasing the status of women in developing communities.
AIR (Advancement through Interactive Radio) gives female community radio listeners a voice with which to respond to programming
and to create programming content. We first describe the cost of excluding women from Information and Communication Technologies
(ICT) for development, and explore how community radio represents an opportunity for inclusion. We draw upon feasibility studies
and site visits in Southeast Kenya to support the introduction of a mechanism that enables women to “talk back” to the community
radio station. Using the principles of Participatory Action Research (PAR), we argue that women will be more likely to benefit
from technology-mediated opportunities for development if they themselves produce information that contributes to their advancement,
rather than simply consuming information provided by others. Finally, we describe the design and implementation of a simple
communications device that supports this model for use in communities that are, and will remain for some time, off the electrical
and cellular grid. This hand-held device enables women to record voice feedback and news for community radio. This feedback
is then routed asynchronously back to the radio station through a probabilistic, delay-tolerant network, where the feedback
can inform subsequent broadcasts and facilitate additional discussion. We conclude with a technical summary of the AIR prototype.
In this study, a method of implementation of cyber shopping services within a smart home environment with the use of sensor
technology is proposed. This proposed model uses Radio Frequency Identification tag, Ubiquitous Sensor Network and TinyOS
in order to transmit information to mobile interfaces. Resident and product information is collected and classified by the
proposed categorization module, which then generates advertisements for the products residents need. This proposed method of implementation
provides an effective and convenient cyber shopping service environment to residents within their smart home environment and
defines the service providers who support the cyber shopping.
Many countries have devoted increasing attention to information infrastructures. However, a gap in digitalization exists among
different government agencies, causing unequal opportunities for accessing infrastructures, information, and communication
technologies. This paper, based on Gowin’s Vee structure, is an empirical study of the digital divide in the context of local
governments in Taiwan. A model for identifying and measuring the aforementioned digital divide is constructed in this paper. We
first refer to grounded theory to draft a framework for measuring the digital divide in local governments. Then, through
the use of a questionnaire distributed to experts implemented alongside the analytic hierarchy process (AHP), we generate
five dimensions (including ICT infrastructure, human resources, external environment, internals of organization, and information)
and 42 measures. Finally, we measure the actual levels of the digital divide in local governments with the resulting digital
divide evaluation model. This paper aims to generate results that can serve as a reference for government agencies (at all
levels) in the formulation of their digitalization strategies. Moreover, the digital divide evaluation model constructed in
this study goes beyond existing measures and may serve as a reference for academics in the examination of methods to narrow
the digital divide in various levels of governmental bodies. Taken together, the features of integration, comprehensiveness,
and wide applicability of this proposed model can be considered the theoretical contributions to digital divide and local
government hierarchy research.
Keywords: Digital governments; Digital divide; Evaluation model; Grounded theory; Expert questionnaire; Analytic hierarchy process
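The AHP step used above to weight the dimensions can be sketched with the common row geometric-mean approximation to the priority vector. The 3x3 pairwise comparison matrix below is hypothetical, not the study's expert data, and the actual study weights five dimensions and 42 measures.

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the row geometric mean, normalized to sum to 1."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Entry [i][j] records how much more important criterion i is than criterion j
# on Saaty's 1-9 scale; the reciprocal fills the mirrored entry.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # weights sum to 1, ordered by importance
```

In the study's setting, one such matrix per expert (over the five dimensions) would be aggregated and the resulting weights attached to the digital-divide measures.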
In order to propose a new cognitive map (CM) inference mechanism that does not require artificial assumptions, we developed
a case-based reasoning (CBR) based mechanism called the CBRMCM (Case-Based Reasoning based Multi-agent Cognitive Map). The
key idea of the CBRMCM mechanism involves converting all of the factors (nodes) that constitute the CM into intelligent agents
that determine their own status by checking status changes and relationships with other agents, and then report the results
to other related node agents. Furthermore, the CBRMCM is deployed when each node agent references the status of other related
nodes to determine its own status value. This approach eliminates the artificial fuzzy value conversion and the numerical
inference function that were required for obtaining CM inference. Using the CBRMCM mechanism, we have demonstrated that the
task of analyzing a sales opportunity could be systematically and intelligently solved and thus, IS project managers can be
provided with robust decision support.
Keywords: Cognitive Map (CM); Case-Based Reasoning (CBR); Case-Based Reasoning based Multi-agent Cognitive Map (CBRMCM); Sales opportunity assessment cases
Today, multi-agent technologies are widely used for realizing prevalent ubiquitous computing applications, in which service
discovery is a critical task for finding a particular service instance. While JADE is a popular system for multi-agent applications,
its directory facilitator (DF) used for service discovery employs a sequential search approach, which shows degraded performance
when the number of registered services becomes large. This paper proposes a new DF scheme employing the category-based classification
and search approach. It greatly reduces the search space and allows accurate matchmaking. The DF implemented with the proposed
approach and JADE-DF are compared in terms of query response time and memory space requirements. The comparison demonstrates that the proposed
DF allows faster query processing than JADE-DF and requires smaller memory space, especially for a large number of services
and queries of multiple parameters.
Keywords: Agent platform; Category based search; Context-awareness; Middleware; Service discovery
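The category-based search contrasted above with JADE's sequential DF scan can be sketched as a bucketed index: registrations are grouped by service category, so a query inspects only one bucket instead of every registration. The class and method names below are illustrative, not JADE's API.

```python
from collections import defaultdict

class CategoryDF:
    """Toy directory facilitator with category-based buckets.
    Search cost scales with the bucket size, not the total registration count."""

    def __init__(self):
        self._by_category = defaultdict(list)

    def register(self, category: str, name: str, properties: dict) -> None:
        self._by_category[category].append((name, properties))

    def search(self, category: str, **required) -> list:
        """Match only within the requested category's bucket; all given
        property constraints must be satisfied exactly."""
        return [name for name, props in self._by_category[category]
                if all(props.get(k) == v for k, v in required.items())]

df = CategoryDF()
df.register("printing", "printer-1", {"color": True})
df.register("printing", "printer-2", {"color": False})
df.register("sensing", "temp-1", {"unit": "C"})
print(df.search("printing", color=True))  # scans only the "printing" bucket
```

A sequential DF would instead test every registration against the query; bucketing by category is what drives the paper's reported gains for large registries and multi-parameter queries.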
To properly implement and use electronic commerce systems, companies need to evaluate a considerable amount of information from their business environment in activities such as environmental scanning. This information is needed both for strategic management and for operational decisions. However, the relevant information may be difficult to find and interpret due to information overload and the fact that the information may be in many locations. Fortunately, most of the information is on the Web (Internet, extranets). To overcome the search problem, especially in large organizations, one should consider the use of software agents. This paper presents an overview of the utilization of such agents today and in the future by describing a prototype system designed for information monitoring and collection from Web sources for the pulp and paper industry. The paper also describes the use of other agents that can supplement the system in specific electronic commerce application areas. Finally, some implementation issues are discussed.
Although next-generation information network infrastructure is prerequisite for continued economic growth, the United States
is losing ground in important areas relative to peer countries. Businesses and regulators have grown concerned that the U.S.
lacks the correct regulatory and business incentives to upgrade the existing network. Due to the complex and dynamic nature
of the interdependencies in the ICT value network, traditional methods of public policy and management analysis have proven
inadequate to fully understand the issues and possible solutions. This paper discusses a novel Genetic Programming (GP) approach
to the problem. Although only a first step towards addressing the problem, the GP discovered several interesting results stemming
from the complex interactions. For example, telecommunications companies would actually be hurt by the option to charge discriminatory
prices but application providers would benefit.
Keywords: Evolutionary algorithms; Internet; Network infrastructure; Net neutrality