This special issue of Information Systems Journal (ISJ) is devoted to questions of the philosophy and epistemology of information systems (IS) research. Epistemology is the discipline of questioning the truth claims of science: how and what can we know, and how do the knowledge claims of science relate to our broader society? The term ‘philosophy’ means that the level of questioning is not about a particular research method or technique, but includes a self-reflective questioning across other realms of enquiry, ranging from ethics to aesthetics, and locating our ideas and methods within important trends in the history of the philosophy of science. We are pleased that ISJ is the first journal in IS to publish a special issue on such ambitious topics. Since its beginning, ISJ has been open to a wide variety of research methods, especially qualitative ones, and a wide variety of approaches, such as interpretivism and critical social theory, that did not always find a home in other journals. ISJ has also been open to quantitative methods and positivist approaches, showing an impressive commitment to pluralism. We see this special issue as one more way in which ISJ serves to link the technical dimensions of IS to organizational and management issues, including their social and intellectual context. Our goals for this special issue, as stated in the Call for Papers, were: (1) to discuss whether the theories used in the IS field are specific to our field; (2) to illustrate the applicative nature of our field, by challenging the classical empiricist view and the idea of design; (3) to explore the extent to which our object of research, i.e. the IS, exists independently of our observation of it; and (4) to question the definition of science, especially the taken-for-granted ideas of ‘falsification’ and ‘paradigms’, through a comparison with other academic fields. We received 18 submissions for this special issue. Of those, four were selected by the review process and are included here.
There are a number of approaches open to an information system (IS) development organization wishing to evaluate the potential relevance of an IT product for use in their own organization. This work considers method support for evaluation of CASE tools, as a complex example of an IT product. Our interest is in evaluation for selection, namely the evaluation of products prior to experience of their use in the defined context. Certain weaknesses are apparent in existing evaluation methods proposed in the literature for evaluation for selection. Our primary concern in this paper is to present a new method which, we claim, addresses these weaknesses. We provide a rationale and a detailed description of the new method, which is referred to as 2G. The method specifically addresses the important early phases of an evaluation, during which an evaluation framework is to be established. The method is to be used within the intended usage context for the IT product, so the resulting evaluation framework will be situated rather than general. We identify weaknesses in previously available methods, and discuss the advantages offered by the 2G method. The method has a solid basis in underlying theory, something which has influenced our presentation of it. Equally importantly, our presentation is strongly influenced by our experience with the method in actual usage situations, in small and medium IS development organizations. We claim that 2G is a practical candidate for any IS development organization wishing to undertake evaluations of CASE tools prior to adoption.
The most important goal of computer software training and interface design has been to help users learn and use software effectively. While research has provided useful information about how to achieve this outcome, it appears to have overlooked the contribution that ‘fun’ and ‘motivation’ can make to learning. This study develops and tests a model of the relationship between intrinsically motivating software instruction and two outcome variables: learning performance and perceived ease of software use. The researchers manipulated two design variables to create learning environments that were either high or low in intrinsic motivation. One variable, the computer interface, consisted of either a direct manipulation interface (the Macintosh Finder), defined as being high in intrinsic motivation, or a command-based interface (disk operating system, DOS), which was less motivating. The second design variable, training method, also consisted of two categories: exploration training (more motivating) and instruction-based training (less motivating). Additionally, the study evaluated the motivational impact of one individual characteristic, cognitive spontaneity. Results of a laboratory experiment indicated that subjects using the direct manipulation interface found the learning experience more intrinsically motivating and performed better in hands-on computer tasks than subjects using the command-based interface. Also, subjects who were high in cognitive spontaneity were more likely to report the learning experience as intrinsically motivating. Learners who reported high levels of intrinsic motivation performed better in hands-on tasks than those who did not, and perceived the software as easier to use. Exploration training, however, failed to exhibit a positive relationship to intrinsic motivation. These results suggest that an intrinsically motivating instructional environment can lead to improved software learning and performance.
This is explained in terms of four key dimensions spanning multiple theories of intrinsic motivation: challenge, curiosity, control and fantasy.
Information is fundamental to the discipline of information systems, yet there is little agreement about even this basic concept. Traditionally, information has been seen as ‘processed data’, while more recently soft, interpretive approaches have taken information to be ‘data plus meaning’. This paper provides a coherent and consistent analysis of data, information, meaning and their interrelations. It is particularly concerned with the semantic and pragmatic dimensions of information, and integrates the work of Maturana and Habermas into a framework provided by Dretske's theory of semantic information. The results show that meaning is generated from the information carried by signs. Information is objective, but inaccessible to humans, who exist exclusively in a world of meaning. Meaning is intersubjective, that is, based on shared agreement and understanding, rather than purely subjective. Information, and information processing systems, exist within the wider context of meaning or sense-making, and the IS discipline needs to take account of this.
We have found two main causes of inadequacies in a decision support system that we designed: in general, a lack of direct user participation and a lack of designer accountability in the design process. In an attempt to solve the problem, we have developed a new design method, the User Participation and Designer Accountability (UPDA) method. It involves users' direct manipulation of colour cards, which contain the contents of simple design charts. These cards are also made accessible to everyone involved in the design process and form the basis upon which the designer can be questioned. We have applied the UPDA method and so far the reactions have been very positive.
Knowledge systems development and use have been significantly encumbered by the difficulties of eliciting and formalizing the expertise upon which knowledge workers rely. This paper approaches the problem from an examination of the knowledge competencies of knowledge workers in order to define a universe of discourse for knowledge elicitation. It outlines two categories and several types of knowledge that could serve as the foundations for the development of a theory of expertise.
Speech act theory focuses on pragmatic language qualities of making assertions, directions, promises, declarations and expressions. However, assigning speech acts to one of a few categories is not without controversy. The process is context dependent and, hence, individuals with divergent contextual views may categorize speech acts differently. Different categories, in turn, imply different speech act interpretations. The difficulty of disparate speech act interpretations can often be resolved by, for example, a process of negotiation. Nonetheless, it is easy to envisage situations in which agreement by negotiation is not possible. Such would be the case when a researcher studies transcribed speech act performances. We propose that Ballmer and Brennenstuhl's (1981) speech act classification method, which relies on an extensive speech act verb lexicon with sequencing and contextual information, can reduce disagreement among individuals who singly or together analyse and ascribe meaning to speech acts. We base this proposition on the results of exploratory research involving alternative knowledge acquisition methods. Our exploratory results suggest that Ballmer and Brennenstuhl's lexicon provides several promising directions for future research on speech act use in information systems.
The purpose of this paper is to show that a number of basic issues have not been adequately addressed in existing office information systems research. Prominent among these are the nature and role of offices, the goals of office information systems development, and the nature of its organizational and managerial consequences. It is proposed that office information systems should be analysed as social action systems the behaviour of which is strongly affected by socially determined forces and constraints such as the behaviour-channelling influences of authority, norms, customs, habits and precedence. Four types of social action are discussed: instrumental, strategic, communicative and discursive. Three contexts for perceiving and analysing the effects of social action in offices are introduced: technology, language and organization. Office information systems changes affect elements and relationships in these three contexts in different ways. By cross-relating social action types and contexts, nine classes of object systems are identified. Each object system class implies a different category of effectiveness concerns which in turn implies different office information system design requirements. The paper notes that the existing research literature has primarily been concerned with only three of the nine object systems. For more effective office information systems development, however, the other systems also need to be considered. The paper concludes by exploring how this may be done.
Action research is located within a spectrum of problem-solving activity that ranges from pure basic scientific research, the purpose of which is to add to knowledge, to human action designed to achieve a purpose without thought of contributing to knowledge. Action research both adds to knowledge and applies it in practice. Action research is particularly appropriate for information systems development. A number of alternative methodologies for action research exist that make differing philosophical assumptions. A body of theory exists that enables a choice to be made among methodologies. This theory is useful in analysing information system problems.
CASE research to date has been dominated by positivistic enquiry, particularly tool building, surveys and normative writings. In contrast, there is a growing community of IS researchers developing models of IS practice that highlight the complex relationship between context and process that has to be mastered in order to develop viable information systems. This paper bridges the gap between the two bodies of knowledge by presenting a phenomenological study of CASE tool usage in a large UK manufacturing company over a four-year period, which shows that organizational context, tool features and usage are inextricably linked. The lessons arising from the work are presented and grounded in interpretive IS theory. The results of the work are clearly in accord with this theory, thus showing the importance of interpretive CASE research, as a complement to positivist thinking, in bringing to light the human and organizational issues that strongly influence systems development practice.
While theories abound concerning knowledge transfer in organisations, little empirical work has been undertaken to assess any possible relationship between repositories of knowledge and those responsible for the use of knowledge. This paper develops a knowledge transfer framework, based on an empirical analysis of part of the UK operation of a Fortune 100 corporation, which extends existing knowledge transfer theory. The proposed framework integrates knowledge storage and knowledge administration within a model of effective knowledge transfer. This integrated framework encompasses five components: the actors engaged in the transfer of knowledge, the typology of organisational knowledge that is transferred between the actors, the mechanisms by which the knowledge transfer is carried out, the repositories where explicit knowledge is retained and the knowledge administrator equivalent whose function is to manage and maintain knowledge. The paper concludes that a ‘hybridised’ approach to knowledge transfer, revealed by the framework, offers some promise in organisational applications.
Reorganization of the UK primary health care system to create an internal market for health services depends upon local family doctors (general practitioners) taking on budgetary responsibilities and purchasing services from hospitals. These budgets will be monitored by local committees. The success of the internal market is heavily dependent upon computerized management information systems. This paper investigates the introduction of information systems into an organization that is ill prepared for change.
This paper presents a knowledge-focused perspective for the development of a model to explain the diffusion and adoption of complex integrating technologies. Business process re-engineering (BPR) is used as the example to illustrate the model. However, while BPR is used to illustrate our argument, the model that is developed is relevant to understanding the innovation processes surrounding any complex IT-based innovation. It is argued that the strength of this diffusion model is that it focuses not on the spread of particular technological artifacts (whether it is BPR or any other IT-based innovation), but on the spread of the ideas and knowledge underpinning the technology. In particular, the model draws attention to the ways in which technology suppliers commodify knowledge and present ‘packaged’ solutions. This creates problems for potential users who need to unpack this knowledge and integrate it with existing organizational knowledge. The diffusion and adoption of innovations is thus seen as a process of integrating knowledge across disparate communities. Such knowledge integration, however, is difficult. This can help to explain the apparent contradiction between the limited success rate of BPR and its widespread diffusion among western firms.
Electronic Data Interchange Systems (EDI) are increasingly being used by business firms to improve operations and customer service. One of the major motivations for business organizations using EDI is to gain a strategic advantage in the marketplace. Although EDI has been implemented by many organizations, unfortunately not all have gained the same level of expected advantage or envisioned benefits.
In this study we focus on the impact of EDI implementation commitment and implementation success on competitive advantage and firm performance. We study two categories of companies: companies that initiate the development of EDI and are known as hub companies and those that are their non-hub counterparts. Findings indicate that non-hub firms may not reap the same level of expected benefits resulting from EDI technology adoption and implementation as hub firms.
The use of computers in organizations may be influenced by both technical and social factors. In addition, the way in which these factors operate in any specific organization may be affected by the particular type of computer being used. This paper reports the results of semi-structured interviews with users of Apple Macintosh computers in three different organizations. The relevance of factors identified in earlier interviews with academic Macintosh users to the use of microcomputers in the business context is considered. It was found that, while there was considerable individual variation in attitudes to microcomputer use, many of the factors identified in the earlier study were also present in the business organizations. Some reasons for the attitudes found are discussed.
The main thrust of the article is to distinguish between four types of IT sourcing decision: total outsourcing; multiple-supplier sourcing; joint venture/strategic alliance sourcing; and insourcing. To illustrate each type, detailed case histories are used that analyse the reasons why specific IT sourcing decisions were adopted. Here, we consider total outsourcing at the London Stock Exchange; multiple-supplier sourcing at ICI plc; joint venture project sourcing at CRESTCo Ltd; and insourcing at the Royal Bank of Scotland. The trend towards outsourcing is increasing in all industrial and commercial sectors. However, client organizations need to become more aware of some of the pitfalls, particularly in respect of large-scale outsourcing deals to single or multiple suppliers. This is because the move to IT outsourcing engenders the need to develop new capabilities and skills to manage complex commercial contracts. We therefore conclude that, while many IT outsourcing contracts followed rationalization, cost-cutting and disappointing results from in-house IT provision, short-termism and current uncertainties over market, business and political conditions pose problems for many organizations in deciding future outsourcing arrangements.
As business becomes more international in focus, there is a growing globalization of the information systems (IS) function. The need for compatible standards and procedures within these international networks means that the systems analyst will play a key role in this globalization process. The developing body of research into the effects of cultural differences on the design and development of IS is therefore timely. This article reports research into the way in which ‘excellent’ systems analysts are perceived in Canada and Singapore. The research method adopted the RepGrid technique from Kelly's Theory of Personal Constructs, which served as the vehicle to facilitate the elicitation of participant comments. Seventeen interviews were conducted in Singapore compared with 53 in Canada. The results suggest that, in terms of the overall approach and priorities that systems analysts place on their work, there is evidence of a commonality, which unites the profession in these two countries. This may be evidence of convergence, which occurs as the result of common education, training and socialization of new entrants into the international occupational community of systems analysts. However, there is also strong evidence of divergence between these two countries based on the different emphasis on the way in which systems analysts will play their roles in each country. As a reflection of the culture of each society, the Singaporean systems analysts are more likely to rely on expertise to influence clients, whereas the Canadian systems analysts rely more on encouraging the client to participate in the design effort.
There appears to be a general consensus within the information systems literature that formal specification of software systems is an inappropriate response to the perceived general failure of information systems to meet user requirements. Such views would seem to be based primarily on the difficulty of constructing formal specifications – and on the difficulty of understanding such specifications once constructed. Research into the applicability of formal methods has therefore tended to concentrate on the needs and the context of software developers specializing in critical and extremely complex software such as operating systems, transaction processing monitors, or nuclear reactor protection. More recently, however, formal methods have been applied successfully in more conventional and commercial areas, such as the development of a CASE tool, indicating that many of the perceived disadvantages of formal methods are merely myths.
This paper discusses the differing research directions of the information systems and software engineering disciplines and suggests that significant benefits may result from a synthesis of the two approaches. We further suggest that there is a serious danger that approaches which have been shown to have value in one of the two domains are automatically being ignored in the other as being ‘irrelevant’. While each of the two areas ignores the contribution of the other, software systems will continue to be sub-optimal (in terms of relevance, as well as quality). We argue the relevance of formal specifications to the information systems discipline, illustrating the argument with a case study based within the IS domain.
Nijssen's Information Analysis Method (NIAM) provides a powerful grammar for generating conceptual schema diagrams. We evaluate this grammar systematically using an ontological model proposed by Bunge, Wand and Weber. Our analysis shows NIAM's grammar has many desirable features. Nevertheless, we conclude it is deficient in three respects. First, we argue that construct overload occurs in NIAM's grammar. Multiple real-world features are represented by the same grammatical construct in NIAM. As a result, we predict users of NIAM will sometimes be confused about which characteristics of the real world are being represented by NIAM's overloaded grammatical constructs. Second, we argue that construct redundancy occurs in NIAM. Multiple grammatical constructs can be used to represent the same real-world feature. As a result, we predict users of NIAM will choose to employ only one of the grammatical constructs provided to represent a particular real-world feature. Alternatively, they will equivocate in their choice of a grammatical construct and thus undermine their effectiveness and efficiency in using NIAM. Third, we argue NIAM has construct deficit. Some ontological constructs have no corresponding grammatical construct in NIAM. As a result, we predict users of NIAM will employ other grammars to represent real-world features that cannot be represented by NIAM's grammar. We predict that using multiple grammars to represent real-world features, however, will evoke other problems associated with attaining consistent representations among the scripts generated via the grammars.
This paper compares the relative advantages of two models (a) a two-factor (i.e. source and channel) choice model; and (b) a theory of planned behaviour (TPB)–based acceptance model, developed to explain electronic government (e-government) service adoption. The models were empirically validated in the government-to-citizen (G2C) anti/counter-terrorism (ACT) service domain by a telephone survey administered to a sample of 500 US residents systematically drawn from the mainland USA. The structured telephone survey questionnaire measured respondents' intentions to use Web-based ACT services and their beliefs and attitudes towards various ACT service providers (e.g. the Federal Bureau of Investigation, Department of Homeland Security, local police, non-governmental organisations (NGOs) and news companies) and media (e.g. Web, email, telephone, TV, newspapers, postal mail). The results of multiple-regression analysis of the 500 interview responses show that the source–channel choice model can explain the service usage intentions as well as, or better than, the TPB-based acceptance model. Both service source and channel exerted consistent and substantive effects on citizens' intention to use ACT services, explaining, on average, more than 43% of variance in the dependent variable. In contrast, subjective norms in the TPB-based model seemed to have a marginal effect. Privacy protection and integrity of the service source appeared to be key antecedents of source preference, while the influence of channel attributes on channel preference varied widely depending on the type of services. This study provides an empirical basis for the validity and applicability of the source–channel choice model and offers insightful and prescriptive knowledge for e-government initiatives and service providers.
Factors which help to integrate business and information systems strategy are identified, a linchpin being the identification of what might be termed an information strategy. Information is defined and the implications of this definition for the provision of a flexible information environment are identified. The use of soft systems methodology in identifying key information needs and flows is summarized and an extension of this approach, which utilizes the concept of developing alternative future scenarios and associated business strategies, is introduced.
Information system (IS) security continues to present a challenge for executives and professionals. A large part of IS security research is technical in nature with limited consideration of people and organizational issues. The study presented in this paper adopts a broader perspective and presents an understanding of IS security in terms of the values of people from an organizational perspective. It uses the value-focused thinking approach to identify ‘fundamental’ objectives for IS security and ‘means’ of achieving them in an organization. Data for the study were collected through in-depth interviews with 103 managers about their values in managing IS security. Interview results suggest 86 objectives that are essential in managing IS security. The 86 objectives are organized into 25 clusters of nine fundamental and 16 means categories. These results are validated by a panel of seven IS security experts. The findings suggest that for maintaining IS security in organizations, it is necessary to go beyond technical considerations and adopt organizationally grounded principles and values.
The aim of this paper is to initiate a debate on quality assurance and the application of ISO 9001 in the system development field. It is claimed that unquestioning application of ISO 9001 can be a backwards step in the practice and research of system development. Five aspects of quality are identified in order to design and evaluate the quality of information systems: technical quality, use quality, aesthetic quality, symbolic quality and organizational quality. We find that the standard solely emphasizes technical quality. The standard promotes the ideal of linear, phase-oriented system development, based on a fixed requirement specification and a document-driven process. This is in contrast to most research findings, which criticize one-sided adoption of phase orientation in system development.