Article

Voluntarism and the Fruits of Collaboration: The IBM User Group, Share

Authors:
Atsushi Akera

Abstract

Technology and Culture 42.4 (2001) 710-736

In November 1956, Paul Armer went before an audience of data processing managers to expound upon the virtues of cooperation. Armer was one of the founders of Share, a computer user group made up of customers of the International Business Machines Corporation (IBM). Share was indeed a cooperative organization, if a somewhat curious one. The idea to create the first nationwide computer user group originated with a group of computing center directors intent on improving the operations of their own facilities. They envisioned an organization that could set technical standards, and much more: among their concerns were a shortage of skilled programmers, high labor costs, and, most important, the inefficiency inherent in the fact that firms that had purchased an IBM mainframe still had to write their own programs to perform basic computing functions, a situation that resulted in a massive duplication of programming effort.

Share's most important work took place between 1955 and 1958, at a time when scientific and engineering installations still made up the majority of customers for IBM's new computers. The group was created just ten years after the first electronic computer was built, two years after the release of IBM's first mainframe, and a year or two before the release of Fortran. At the time, many groups, both within and without the universities, were beginning to articulate new programming techniques. Share's principal contributions lay in developing early operating procedures, operating systems, and a body of knowledge that would become known as systems programming. Equally important, Share gave a group of early programmers a forum for establishing their work as a new field of knowledge, a body of practice, and a nascent profession.

Share appealed to voluntarism to justify what was, in effect, a collaboration among some highly competitive American corporations. Its organizers made decisions that reveal the broader entanglements among esoteric knowledge, institutional loyalties, and professionalization strategies, all within the larger context of a technology-driven cold war economy. The early history of Share provides an opportunity to reexamine traditional narratives of professionalization. The group's unusual mission and makeup remind us that professionalization involves historically specific strategies. Antecedent technical practices, the institutional dispersion of knowledge, and expectations about corporate propriety all influenced how computer programmers pursued the goal of professional status. So did the cold war economy, especially a labor market that allowed young men in search of upward mobility to turn to technical careers. This study also pays close attention to Share's organizational tactics in order to reveal the contingencies of professionalization. It is significant, for instance, that Share's founders did not set out to establish a professional society; rather, they sought a collaboration that would reduce their programming costs. In the end, Share emerged as an important intermediary between IBM and its end users. The organization was forced into an ambiguous position as both a corporate collaboration and a voluntary society in order to secure resources for its efforts. Institutional commitments and loyalties would eventually limit the scope of Share's activities, reproducing a limited path to professionalization similar to that pursued by engineers.
Nevertheless, this article embraces what sociologists call a process-oriented approach to the study of professionalization in order to weigh what was and was not determined about the computer programmer's opportunities. Finally, in this account technical knowledge emerges as an important site for reworking the social relations through which new professions emerge. How does a group of specialists go about constructing its initial occupational identity? As was the case with so many postwar areas of expertise, computer programming drew its recruits from a variety of established occupations. One of the goals of an organization such as Share was to resolve competing claims of competence by establishing a coherent set of occupational identities. Technical knowledge provided an organizing principle around which to construct these new identities. Share mediated the process both by advancing certain kinds of knowledge and by deciding what groups could go on to claim professional competence. Moreover, the distinctions that Share helped to establish between operators and programmers, programmers and end users, and systems and applications programmers would continue to...


... In surveying the relevant historiography, Hessler mentions Williams (1998), Rose (1995) and Goldstein (1997). Related to this typology of groups are the user groups that emerged around specific computers. On computing intermediaries, there is also Akera's (2001) article on the role of a 1950s IBM user group SHARE. Apparently, SHARE established, among other things, a terminology and a classificatory framework for software (cf. ...
... Hobbyists "produced" the machine -by defining their own characteristics -and used them. Douglass (1992) also Akera 2001;Haigh 2001). Software is a technology on its own: to put forward software programming as a mediator between hardware and users does not shed light on the concept of intermediary actor as put forward above. ...
Chapter
Full-text available
Introduction: One of the major issues of the "Tensions of Europe" (ToE) network concerns the influence of technology on the process of economic, social and political integration in 20th century Europe. This is clearly reflected in the three main lines of investigation followed across the different themes: the linking of infrastructures, the circulation of artefacts and services, and the circulation of knowledge. At the same time, this process - and the role of technology in it - is not seen as deterministic or uncontested. As the name of the network indicates, the need is well recognised to examine the tensions resulting from the possible divergence of technological possibilities and socioeconomic realities. It seems fairly obvious that information systems and technologies (IST) have played an important linking role even before the advent of the Internet. Thus, for example, the possibilities offered by IST have strongly influenced the way managers were able to exercise control and therefore constituted an important factor in the organisation of large-scale enterprises and their geographic extension. The same is true for governments and their statistical apparatus, for instance. The recent integration of computer networks and electronic data exchange facilitated the creation of common databases and policies among governments, speeding up developments which had started earlier. It also created new possibilities for business, for example enabling companies to develop new organisational practices (e.g. just-in-time). The popularisation of the Internet has also created new forms of bilateral and multilateral communications among individuals (e-mail and "chat") and consumption (e-commerce). At the same time, there are a number of barriers that complicate these linkages and interchanges. The most obvious one is language, because it is the medium used for the storage and the dissemination of information. This is of particular importance in Europe, where there are not only the "official" national languages, but where regional languages have also been growing in importance over the last decades of the 20th century. A second barrier concerns national standards and industrial policies. Their influence can be seen in the development of the computer industry in Europe, where national efforts and rules might have played an important role in preventing European companies from becoming more competitive internationally. A third barrier, which usually receives less attention but is of considerable importance, derives from the national and cultural differences among the users of information systems and technologies, in terms of individuals, organisations and society as a whole (cf. e.g. Hofstede 1980/2001; D'Iribarne 1998; Whitley 1999). This theme of the network has therefore moved from producer-centred accounts of the development of IST towards the user dimension. It has looked at how the persisting differences in Europe have shaped the use of IST and either facilitated or hampered the process of economic, social and political European integration during the 20th century. More particularly, we have tried to examine the contribution of IST to shaping organisations, society and the individuals which form part of both. Among the issues here are, for example, the influence of IST on the balance between centralised and decentralised forms of control, or the differences in the use of IST driven by a variety of parameters, including (national) culture and gender.
This is based on the growing recognition that human-made artefacts are not neutral, but always include a symbolic dimension in addition to their material characteristics, leaving it to a certain extent to the user to fill them with meaning (cf. Pinch 1996). This opens the possibility for different, sometimes competing interpretations of technology - interpretations influenced to a considerable degree by the characteristics of the users, e.g. their gender or their cultural and national background. Regarding information systems and technologies (IST), the potential for modifications resulting from national/cultural differences appears particularly high - both from organisational and individual perspectives. For example, one might ask to what extent the introduction of the same technology (e.g. a new filing system or standardised business software) in different countries is likely to reduce the existing differences and lead to an increasing similarity among organisations, even those operating within different socioeconomic contexts. This seems even more of an issue when these organisations are part of the same entity (e.g. a multinational) or have frequent interchanges of information (e.g. in a buyer-supplier relationship). In this context, it is important to recognise that individuals use the same technologies, e.g. a computer, in both a private and a work setting. If nationally or culturally determined user patterns are reproduced in the office, this might reinforce rather than eliminate existing differences. If, by contrast, a technology is first introduced at work, this will shape the ways it is used at home - thus possibly leading not only to a convergence of organisations, but also of individual behaviour. Such a technology-driven process of convergence might be facilitated by the emergence of transnational communities of actors, both within the organisation and outside. As a result, the application of IST might be defined more by the rules of a professional group than by the national/cultural setting in which it takes place. Another important phenomenon regarding IST therefore concerns the importance of intermediaries in shaping the use and interpretation of these technologies. This intermediation can be more or less personal. Regarding the use of the Internet for example, the billing structure is an important determinant of usage and might displace users from one form of communication to another (e.g. voice to text/data). Probably the most personal form of intermediation concerns the role of consultants in overcoming the tensions between new technologies and the organisation and its members. They have played this role in the introduction of new filing systems at the beginning of the 20th century (an early form of information storage and knowledge management) and continue to play it nowadays - at a much larger scale - in the introduction of complex IT-based systems of information capture and exchange (such
... Related to this typology of groups are the user groups that emerged around specific computers. On computing intermediaries, there is also Akera's (2001) article on the role of a 1950s IBM user group SHARE. Apparently, SHARE established, among other things, a terminology and a classificatory framework for software (cf. ...
... In his recent article on the emergence of computer programmers, Ensmenger (2003: 155) focuses on the "boundary disputes" of the 1950s and the 1960s about the role of early computer programmers as "mediators" between the computer and the existing structures and practices of organisations (cf. also Akera 2001; Haigh 2001). Software is a technology on its own: to put forward software programming as a mediator between hardware and users does not shed light on the concept of intermediary actor as put forward above. ...
... This customer association, and the many others that followed, greatly participated in the early circulation of basic suites of programs. On this topic, see Akera (2001;2008, 249-274). ...
Book
Full-text available
A laboratory study that investigates how algorithms come into existence. Algorithms—often associated with the terms big data, machine learning, or artificial intelligence—underlie the technologies we use every day, and disputes over the consequences, actual or potential, of new algorithms arise regularly. In this book, Florian Jaton offers a new way to study computerized methods, providing an account of where algorithms come from and how they are constituted, investigating the practical activities by which algorithms are progressively assembled rather than what they may suggest or require once they are assembled. Drawing on a four-year ethnographic study of a computer science laboratory that specialized in digital image processing, Jaton illuminates the invisible processes that are behind the development of algorithms. Tracing what he terms a set of intertwining courses of actions sharing common finalities, he describes the practical activity of creating algorithms through the lenses of ground-truthing, programming, and formulating. He first presents the building of ground truths, referential repositories that form the material basis for algorithms. Then, after considering programming's resistance to ethnographic scrutiny, he describes programming courses of action he attended at the laboratory. Finally, he offers an account of courses of action that successfully formulated some of the relationships among the data of a ground-truth database, revealing the links between ground-truthing, programming, and formulating activities—entangled processes that lead to the shaping of algorithms. In practice, ground-truthing, programming, and formulating form a whirlwind process, an emergent and intertwined agency. The open access edition of this book was made possible by generous funding from Arcadia – a charitable fund of Lisbet Rausing and Peter Baldwin.
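The ground-truthing/programming/formulating triad that Jaton describes can be made concrete with a deliberately toy example. The sketch below is my own construction, not code from the laboratory Jaton studied; the data, function names, and threshold values are all invented. It builds a small ground-truth repository of synthetic images with known target regions, writes a candidate detection routine, and then "formulates" the relationship between the two by tuning the routine against the repository.

```python
# Illustrative sketch (not from Jaton's study): a toy ground truth of synthetic
# images with known target regions, a candidate detection routine, and an
# evaluation loop tying the two together. All names and values are invented.
import numpy as np

rng = np.random.default_rng(0)

def make_ground_truth(n_images=20, n=32):
    """Build referential pairs (image, label mask) that define what counts as correct."""
    images, labels = [], []
    for _ in range(n_images):
        label = np.zeros((n, n), dtype=bool)
        r, c = rng.integers(4, n - 8, size=2)
        label[r:r + 6, c:c + 6] = True            # known "salient" patch
        image = rng.normal(0.0, 1.0, (n, n)) + 3.0 * label
        images.append(image)
        labels.append(label)
    return images, labels

def detect(image, threshold):
    """Candidate algorithm: flag pixels brighter than a threshold."""
    return image > threshold

def iou(pred, label):
    """Intersection-over-union score against the ground truth."""
    inter = np.logical_and(pred, label).sum()
    union = np.logical_or(pred, label).sum()
    return inter / union if union else 1.0

images, labels = make_ground_truth()
# "Formulating": search for the threshold that best reproduces the ground truth.
best = max(
    (np.mean([iou(detect(img, t), lab) for img, lab in zip(images, labels)]), t)
    for t in np.linspace(0.5, 3.0, 26)
)
print(f"best mean IoU {best[0]:.2f} at threshold {best[1]:.2f}")
```

The point of the sketch is only that the ground-truth repository, not the detection code alone, defines what the algorithm is supposed to do; change the repository and the "best" algorithm changes with it.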
... The first demonstration of time reversal dates back to 1965, when Parvulescu and Clay [1965] time-reversed a signal generated by a hydrophone in the ocean to improve underwater communication. Since then, various time reversal techniques have been developed and applied across a broad range of disciplines [Jackson and Dowling, 1991, Prada et al., 1991, Fink, 1992, Cassereau and Fink, 1992, Wu et al., 1992, Derode et al., 1995, 2001, Prada, 2002, Fink et al., 2003]. There are two overarching categories of time reversal, both of which begin by recording a wave field propagating forward in time in a physical medium. ...
Thesis
Elastic waves are used across a broad spectrum in medicine to non-invasively probe and image tissues up to centimeters deep. Ultrasonic (US) imaging is the most well-known modality used to image acoustic density and velocity contrasts. US is useful for imaging overall structure in tissue; however, acoustic contrasts are relatively low in biological tissue. In contrast, optical properties are highly specific, but imaging with light is typically limited to depths of about 1 mm. Therefore, we aim to create images of both acoustic and optical properties centimeters deep in tissue non-invasively. Laser-induced acoustic waves are generated by the absorption of nanosecond pulses of laser light by biological tissue. The transient thermoelastic expansion that results propagates as a pressure wave. The wavelength of the source laser can be tuned such that absorption occurs either primarily at the tissue surface, generating a laser-ultrasound (LUS) wave, or deep inside to create photoacoustic (PA) waves due to absorption by embedded tissue chromophores. LUS waves can be thought of as mini-explosions at the tissue surface, much like the man-made explosions used by seismologists to image the subsurface of the earth. Likewise, PA waves are analogous to mini-earthquakes, originating from below the surface and propagating to the boundary where they are detected. We obtain optical absorption properties by inverting for the location of PA sources, and structural images by reconstructing the location of LUS scattering/reflection. The reconstruction methods utilized for PA and LUS are accordingly inspired by seismology. Reverse-time migration is adapted for reflection-mode LUS imaging to improve the imaging aperture and reduce artifacts. The velocity model is optimized in the LUS reconstruction, and subsequently applied to reconstruct PA images with time-reversal. Laser-generated PA and LUS waves are broadband, allowing for high-resolution images to be reconstructed. Detection using contacting piezoelectric transducers allows real-time imaging with high sensitivity; however, the frequency bandwidth is relatively narrow. Optical detectors provide an alternative to piezoelectric transducers when a small sensor footprint, large frequency bandwidth, or non-contacting detection is required. We introduce a fully non-contact gas-coupled laser acoustic detector (GCLAD) for medical imaging that utilizes optical beam deflection. We describe the underlying principles of GCLAD and derive a formula for quantifying the surface displacement from a remote, line-integrated GCLAD measurement. We quantify the surface displacement with GCLAD in a LUS experiment, which shows 94% agreement with line-integrated data from a commercial laser-Doppler vibrometer point detector. We further demonstrate the feasibility of PA imaging of an artery-sized absorber using GCLAD 5.8 cm from a phantom surface. Additionally, we advance all-optical imaging techniques using a laser-Doppler vibrometer point-detector. While previous reflection-mode all-optical systems use a confocal source and detection beam, we introduce nonconfocal acquisition to obtain angle-dependent data. We demonstrate that nonconfocal acquisition with a single source improves the signal-to-noise of low-amplitude PA and LUS signals using a normal-moveout processing technique. Incorporating multiple sources in this geometry allows us to apply reverse-time migration to reconstruct LUS images.
We demonstrate this methodology with both a numerical model and a tissue phantom experiment to image a steep-curvature vessel with a limited aperture 2 cm beneath the surface. Nonconfocal imaging improves focusing by 30% and 15% in the numerical and experimental LUS images, respectively, compared to images acquired with a single LUS source. PA images are straightforward to acquire with the all-optical system by tuning the source wavelength or the surface properties, which we reconstruct with time reversal. Therefore, we demonstrate broadband high-resolution PA and LUS imaging with this all-optical system. Subsequently, we demonstrate PA and LUS imaging of atherosclerotic plaque ex vivo. We apply our nonconfocal PA and LUS acquisition and reconstruction techniques to a fixed human carotid artery embedded in an agar tissue phantom. The LUS image provides structural information about acoustic contrasts, in which we distinguish between the layers of the artery wall and detect calcification. PA imaging is sensitive to optical absorbers, such as lipids and hemoglobin. In this ex vivo example, we image a synthetic absorber analogous to hemoglobin in the artery. We compare our nonconfocal LUS approach to confocal LUS imaging and see a significant improvement in contrast and resolution and a reduction in artifacts. Further, we observe that LUS aids in the interpretation of PA images, specifically to identify reflection artifacts. We validate our results with both x-ray computed tomography and histology. Finally, we introduce a method for removing reflection artifacts in PA imaging altogether. We adapt a method known as Marchenko imaging, developed for two-way imaging problems in seismology, to the PA source inversion problem. Iterative convolutions of PA data with nonconfocal LUS predict reflection artifacts, which we subtract from the PA image. We eliminate dominant artifacts in numerical data using a single iteration of the Marchenko scheme. Overall, we present novel methods for all-optical PA and LUS imaging including instrumentation, acquisition, image processing, and reconstruction. We present a new non-contact optical detector for medical imaging. We demonstrate unconventional experimental methods, combined with powerful imaging methods inspired by seismology, to improve the resolution and contrast of LUS images and reduce artifacts in PA and LUS imaging. Furthermore, we demonstrate the potential for all-optical PA and LUS imaging to address the clinical need for non-invasive, multi-component imaging of vulnerable atherosclerotic plaque.
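As a rough illustration of the time-reversal principle underlying the PA reconstructions described above, the following sketch simulates a single photoacoustic point source in a homogeneous medium and refocuses it by delay-and-sum backprojection of the recorded traces. This is a simplified toy model of my own (invented geometry, constant sound speed, no attenuation or directivity), not the thesis code and not the reverse-time-migration or Marchenko schemes it develops.

```python
# Minimal sketch of time-reversal / delay-and-sum refocusing of a photoacoustic
# source (toy model, not the thesis code): a point source emits a short pulse,
# a surface detector array records it, and the recordings are summed at the
# retarded time for every candidate image pixel. All parameters are invented.
import numpy as np

c = 1500.0                     # assumed speed of sound, m/s
fs = 20e6                      # sampling rate, Hz
t = np.arange(0, 40e-6, 1 / fs)

source = np.array([0.0, 0.03])                      # true source 3 cm deep
sensors = np.stack([np.linspace(-0.02, 0.02, 64),   # 64-element line array at the surface
                    np.zeros(64)], axis=1)

def pulse(tau):
    """Band-limited pulse centred on the arrival time tau."""
    return np.exp(-((t - tau) * 2e6) ** 2)

# Forward step: record the wave field at each sensor (1/r spreading ignored).
arrivals = np.linalg.norm(sensors - source, axis=1) / c
traces = np.array([pulse(tau) for tau in arrivals])

# Backward step: for each pixel, sum the traces at the travel time from that
# pixel to each sensor; the summed energy refocuses at the true source.
xs = np.linspace(-0.02, 0.02, 81)
zs = np.linspace(0.0, 0.05, 101)
image = np.zeros((zs.size, xs.size))
for i, z in enumerate(zs):
    for j, x in enumerate(xs):
        delays = np.linalg.norm(sensors - np.array([x, z]), axis=1) / c
        idx = np.clip(np.round(delays * fs).astype(int), 0, t.size - 1)
        image[i, j] = traces[np.arange(64), idx].sum()

zi, xi = np.unravel_index(image.argmax(), image.shape)
print(f"reconstructed source at x={xs[xi]*100:.1f} cm, z={zs[zi]*100:.1f} cm")
```

Running the sketch places the image maximum at the known source position; the thesis methods extend this basic idea to heterogeneous velocity models, limited apertures, and artifact removal.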
... Until the 1960s, the circulation of source code was the only system used in the computer industry (Akera, 2001). At that time, IBM dominated the market, and its executives as well as its lawyers considered that the source code of programs could be protected neither by copyright nor by patent (Campbell-Kelly, 2005). ...
Article
Full-text available
How have the Creative Commons ownership rules, used by free websites like Wikipedia or Flickr, in 3D printer projects, and in alternative kitchen gardens, been developed? Internet users and technological experimentation communities rely heavily on these free tools, but the ideologies of the public domain and online communities that allowed their birth often remain obscure. In this article we used American legal doctrine, the scientific literature and specialized press archives. From these sources we analyzed the links between Copyright reforms and the institutionalization of the activity of free software developers, which lies at the origin of Creative Commons licenses. The case of intangible goods property applied to tangible goods shows how community members and institutions legitimize their IT practices by means of several producers of norms, such as States or communities.
... The open-source software (OSS) movement traces back to the 1950s,[1] when voluntary organizations such as Share - an intermediary between IBM and its users - pursued collaboration and reduced programming costs by leveraging diverse expertise and practices of academic exchange.[2] However, notable developments, such as the launch of the GNU Project in 1984[3] and Netscape's decision to release the source code for its Internet browser in 1998,[1] significantly increased interest in the movement. More recently, examples such as the Linux operating system,[4] WordPress for blogging,[5] and Google Chrome for Internet browsing[6] have further demonstrated the potential for successful open-source projects. ...
Article
In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation.
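The "modular organization" described in this abstract can be illustrated with a generic sketch. The class and function names below are invented for illustration and are not PLACE's actual API; the point is only the pattern: each instrument is an interchangeable module with configure/update/cleanup phases, a scan loop drives the modules together, and the resulting records flow directly into Python-based analysis.

```python
# Generic sketch of a modular lab-automation pattern (not the actual PLACE API;
# names are invented): each instrument is a self-contained module with
# configure/update/cleanup phases, and a scan loop drives all modules in
# lockstep and gathers their data for later analysis.
from abc import ABC, abstractmethod
import numpy as np

class Instrument(ABC):
    """Base class every instrument module implements."""
    @abstractmethod
    def configure(self, config: dict) -> None: ...
    @abstractmethod
    def update(self, step: int) -> dict: ...
    def cleanup(self) -> None:
        pass

class MockStage(Instrument):
    """Pretend motion stage that moves a fixed increment per scan step."""
    def configure(self, config):
        self.start = config["start_mm"]
        self.increment = config["increment_mm"]
    def update(self, step):
        return {"position_mm": self.start + step * self.increment}

class MockDigitizer(Instrument):
    """Pretend acquisition card returning a noisy trace at each step."""
    def configure(self, config):
        self.samples = config["samples"]
    def update(self, step):
        return {"trace": np.random.normal(size=self.samples)}

def run_scan(instruments, configs, steps):
    """Configure every module, step them together, collect records, then clean up."""
    for name, inst in instruments.items():
        inst.configure(configs[name])
    records = [
        {name: inst.update(step) for name, inst in instruments.items()}
        for step in range(steps)
    ]
    for inst in instruments.values():
        inst.cleanup()
    return records

data = run_scan(
    {"stage": MockStage(), "digitizer": MockDigitizer()},
    {"stage": {"start_mm": 0.0, "increment_mm": 0.5},
     "digitizer": {"samples": 1024}},
    steps=5,
)
# Post-processing in the same language: e.g. average trace energy per step.
print([float(np.mean(rec["digitizer"]["trace"] ** 2)) for rec in data])
```

Because each module only has to honor the configure/update/cleanup contract, swapping a mock instrument for a real driver, or adding a new instrument to the scan, does not require touching the scan loop or the analysis code.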
... The group called itself SHARE, a word that its first Secretary said "describes very well the objectives of the group" (Mapstone & Bernstein, 1980, p. 363). The user group's name conveyed the values of "cooperation and communication" (Armer, 1956, p. 2), as well as voluntarism and collaboration (Akera, 2001), and reflected an important aspect of the early computing industry. ...
Article
Full-text available
Attempts by the state and the entertainment industry to impose the term “piracy” on practices of digital file sharing have been challenged by academics and activists alike. The notion of “file sharing,” however, seems to have escaped our attention. By placing that term in the context of the history of computing, where sharing of different kinds has always been a central feature, and by drawing on a nuanced understanding of the many meanings of “sharing,” this article shows that “file sharing,” unlike “piracy,” is a bottom-up term that has emerged from the field itself. The article shows that those who oppose the term “file sharing” certainly have good strategic reason to do so: sharing is by definition a positive social value and bestows a warm glow upon that which it touches. It is argued, though, that we should not allow the “war on piracy” metaphor to gain the ascendancy—not only because “piracy” is a such a negative term, and not only for strategic reasons, but also, and mainly, because when we call file sharing “file sharing” we are issuing a critical challenge to the current copyright regime.
... (ii) We would particularly highlight the increased role and importance of a new empirical phenomenon (and feature of late capitalism) in the last decade or two in which suppliers have developed mechanisms to sustain a more or less permanent relationship with their existing and potential customers (Sørensen 1996). The rise of the "software package user group" exemplifies this most visibly, where users formally gather and organize to interact with and exert influence over technology vendors (Akera 2001, von Hippel 2005). These are key settings for study in their own right as they exhibit interesting dynamics. ...
Article
Full-text available
The single site implementation study is an invaluable tool for studying the large-scale enterprise solution. Together with constructivist frameworks and ethnographic approaches it has allowed the development of rich local pictures of the immediate and adaptive response by user organizations to the take-up of what are, today, often generic packaged systems. However, to view the packaged enterprise solution principally at the place where the user encounters it also has limitations. It produces somewhat partial understandings of these complex artifacts. In particular, it downplays important influences from other sites and time frames. This paper argues that if we are to understand the full implications of enterprise solutions for organizations then we should study their "biography." This idea points to how the career of workplace technology is often played out over multiple time frames and settings. To understand its shaping therefore requires scholars to go beyond the study of technology at a single locale or moment and, rather, attempt to follow it through space and time. The paper develops two ideas to aid this kind of study. We discuss better spatial metaphors that might help us explore the hybrid and extended spaces in which packaged software systems develop and evolve. We also review improved temporal understandings that may capture the multiple contemporary and historical time frames at play. The paper concludes by discussing some possible research directions that a focus on the biography of a technology might allow.
... Prosocial behaviors through the internet occurred when the internet was just a message board. IBM's release of the source code for its operating system, as well as the SHARE user group (i.e., a volunteer-run association providing enterprise technology professionals with education and training), are both early examples of open source software systems (Fisher, McKie, & Macke, 1983; Akera, 2001). These systems guided the practice of sharing free resources over the internet, representing the first occurrence of prosocial behaviors in the cyber context. ...
Chapter
Full-text available
Prosocial behaviors in the cyber context (i.e., the internet, text messages) can be traced back to when the internet was just a message board, used to share open source software. Following these early investigations of prosocial behaviors, clinicians recognized that the internet might remove barriers to help seeking. Recent investigations have provided support for the internet as a place to seek help among various populations. Prosocial behaviors in the cyber context also have benefits for the givers as well, including health benefits, personal satisfaction, and reputational increases. This chapter draws on multidisciplinary research to review prosocial behaviors in the cyber context.
Article
This is the story of how Unix users hacked the European data networks competition in the 1980s, by using and appropriating common resources for their own experimental and operational needs while planting the seeds of participatory digital culture, both within and outside of the European Union's tech policy framework, both in prominent roles and behind the scenes of the "standards wars". I analyze how they were acknowledged among international computer networks, focusing on EUNET (1982–1992), a pioneering Internet provider in Europe. I show that Unix culture, beyond technical arguments, was reframed through their institutional relationships, when EUNET were courted by the EU tech programs to help develop a digital infrastructure for the knowledge sector. Eventually, the organization contributed to but also profited from this alliance, before turning towards the new business of computerized telecom networks, illustrating the transfers between knowledge institutions and the knowledge economy.
Book
In this book the diverse objects of the Whipple Museum of the History of Science's internationally renowned collection are brought into sharp relief by a number of highly regarded historians of science in fourteen essays. Each chapter focuses on a specific instrument or group of objects, ranging from an English medieval astrolabe to a modern agricultural 'seed source indicator' to a curious collection of plaster chicken heads. The contributors employ a range of historiographical and methodological approaches to demonstrate the various ways in which the material culture of science can be researched and understood. The essays show how the study of scientific objects - including instruments and models - offers a window into cultures of scientific practice not afforded by textual sources alone. This title is also available as Open Access on Cambridge Core.
Chapter
This chapter examines the historical dimension of gender bias in the US computing workforce. It offers new quantitative data on the computing workforce prior to the availability of US Census data in the 1970s. Computer user groups (including SHARE, Inc., and the Mark IV software user group) are taken as a cross-section of the computing workforce. A novel method of gender analysis is developed to estimate women’s and men’s participation in computing beginning in the 1950s. The data presented here are consistent with well-known NSF statistics that show computer science undergraduate programs enrolling increasing numbers of women students during 1965–1985. These findings challenge the “making programming masculine” thesis and serve to correct the unrealistically high figures often cited for women’s participation in early computer programming. Gender bias in computing today is traced not to 1960s professionalization but to cultural changes in the 1980s and beyond.
Chapter
In this book the diverse objects of the Whipple Museum of the History of Science's internationally renowned collection are brought into sharp relief by a number of highly regarded historians of science in fourteen essays. Each chapter focuses on a specific instrument or group of objects, ranging from an English medieval astrolabe to a modern agricultural 'seed source indicator' to a curious collection of plaster chicken heads. The contributors employ a range of historiographical and methodological approaches to demonstrate the various ways in which the material culture of science can be researched and understood. The essays show how the study of scientific objects - including instruments and models - offers a window into cultures of scientific practice not afforded by textual sources alone. This title is also available as Open Access on Cambridge Core.
Article
In this paper we pose a simple question: For firms that own or sponsor a computer or smartphone standard, is it better to make the standard open or closed? In other words, has openness paid as a business strategy? We explore the issue by examining the history of operating systems in computing and mobile phones, and rely on four different notions of openness: open systems, open innovation, open-source software, and open governance. We conclude that the truly successful operating systems have been those whose owner or sponsor has managed to combine some degree of openness with some measure of control.
Chapter
There is a decade of writing operating systems before the 'classic' period of the mid-1960s, when such complex operating systems as Multics or OS/360 were developed and the theoretical principles for designing an operating system were first outlined. The few accounts of these early systems mostly focus on those developed for the IBM machines that dominated the market, but even there, there is a greater variation of systems than one would expect. During this period, running roughly from 1954 to 1964, neither the notion nor the name of 'operating system' had stabilized. Some used the term 'monitor', others 'supervisor', yet others 'director' or 'executive'. These systems were still very closely tied up with the hardware; in particular, since processor memory was at a premium, the organization of the communication between the processor and external memory devices was a crucial issue. Magnetic tapes (and later disk drives) made operating systems really worthwhile, because they allowed for faster I/O communication than punched cards or paper tape. The early operating systems were also deeply entangled with programming systems. Programming languages, (macro) assembler systems, routine libraries, editing and debugging tools etc., were often, though not always and not necessarily, integral parts of early operating systems. Therefore, the question of what an operating system exactly is, and how it would differentiate itself from these other tools, was neither an easy nor an unequivocal question during this early period. An operating system not only incorporates a vision on how to access the computer, but also on how to access the (variety of) user(s). These visions were influenced by local practices and design philosophies and often changed while accumulating experience in using the computer.
Article
The autonomous genesis of data networks in France and Europe (1978-1992): an extra-institutional story? From the 1970s, part of the young community of French computer engineers with a particular interest in American technologies started working to import data network solutions that contrasted with local IT standards. The FNET network, the French local branch of a UUCP protocol-based network infrastructure linking the United States to Europe via the central EUNET node in the Netherlands, was one such solution. This international network of "Unix machines" was among the first, most active and largest to connect the IT R&D communities, and foreshadowed the first Internet connections in Europe. Based on a collective and self-managed organizational model, the unplanned project was deployed through horizontal scientific and industrial associative ties. It cultivated a spirit of independence from institutions, displayed collective representations and imagery of pioneering, but relied, in fact, on public research and telecommunications infrastructures. The history of Unix networks in Europe in the 1980s allows us to re-examine the autonomist utopias celebrated in the history of computer networks, in light of the institutional framework in which they took place.
Article
This paper examines the cultural climate faced by women in the American computer industry from the 1960s to the early 1980s, a period in which the percentage of the industry workforce that was female almost tripled. Drawing on a comprehensive study of articles and advertisements in the trade journal Datamation, sources from IBM, Control Data, and the Burroughs Corporation, and the records of the user group SHARE, Inc, the study argues that the cultural climate of the industry shifted radically in the early 1970s, from hostility in the 1960s to a more open one in the late 1970s and early 1980s.
Article
Full-text available
Open source software is developed by volunteer developers who use publicly available source code, and it is distributed freely under licenses. Open source software can be regarded as a new way of production in the sense that it exploits intellectual property rights not for exclusive benefits but for public usage; it limits direct private profits but opens possibilities for indirect ones; and it takes advantage of dispersed production networks. This paper has two main objectives: we try to explain that the open source way has existed since the beginning of computing, and we survey related research in economics after the renowned work of Lerner and Tirole (2002). In this paper we deal mainly with developers' motivations, development processes and management, market competition, licenses, policies and implications for other industries.
Article
Comment: Mediating Innovation: Reflections on the Complex Relationships of User and Supplier - Volume 7 Issue 3 - Steven W. Usselman
Chapter
Full-text available
Playfulness was at the heart of how European players appropriated microcomputers in the last quarter of the twentieth century. Although gaming has been important for computer development, that is not the subject of Hacking Europe. Our book’s main focus is the playfulness of hacker culture. The essays argue that no matter how detailed or unfinished the design projecting the use of computers, users playfully assigned their own meanings to the machines in unexpected ways. Chopping games in Warsaw, hacking software in Athens, creating chaos in Hamburg, producing demos in Turku, or partying with computing in Zagreb and Amsterdam—wherever computers came with specific meanings that designers had attached to them—local communities throughout Europe found them technically fascinating, culturally inspiring, and politically motivating machines. They began tinkering with the new technology with boundless enthusiasm and helped revolutionize the use and meaning of computers by incorporating them into people’s daily lives. As tinkerers, hackers appropriated the machine and created a new culture around it. Perhaps best known and most visible were the hacker cultures that toyed with the meaning of ownership in the domain of information technology. In several parts of Europe, hackers created a counterculture akin to the squatter movement that challenged individual ownership, demanded equal access, and celebrated shared use of the new technological potential. The German Chaos Computer Club best embodied the European version of the political fusion of the counterculture movement and the love of technology. Linguistically, in Dutch, the slang word “kraken,” the term used for both hacking and squatting, pointedly expressed such creative fusion that is the subject of this book.
Article
Full-text available
Along with the international trends in history of computing, Dutch contributions over the past twenty years moved away from a focus on machinery to the broader scope of use of computers, appropriation of computing technologies in various traditions, labour relations and professionalisation issues, and, lately, software. It is only natural that an emerging field like computer science sets out to write its genealogy and canonise the important steps in its intellectual endeavour. It is fair to say that a historiography diverging from such "home" interest started in 1987 with the work of Eda Kranakis – then active in The Netherlands – commissioned by the national bureau for technology assessment, and Gerard Alberts, turning a commemorative volume of the Mathematical Center into a history of the same institute. History of computing in The Netherlands made a major leap in the spring of 1994 when Dirk de Wit, Jan van den Ende and Ellen van Oost defended their dissertations, on the roads towards adoption of computing technology in banking, in science and engineering, and on the gender aspect in computing. Here, history of computing had already moved from machines to the use of computers. The three authors joined Gerard Alberts and Onno de Wit in preparing a volume on the rise of IT in The Netherlands, the sequel of which is now in preparation by a team led by Adrienne van den Bogaard. Dutch research reflected the international attention for professionalisation issues (Ensmenger, Haigh) very early on in the dissertation by Ruud van Dael, Something to do with computers (2001), revealing how occupations dealing with computers typically escape the pattern of closure by professionalisation as expected by the, thus outdated, sociology of professions. History of computing not only takes use and users into consideration, but finally, as one may say, confronts the technological side of putting the machine to use, software, head on. The groundbreaking works of the 2000 Paderborn meeting and by Martin Campbell-Kelly resonate in work done in The Netherlands and recently in a major research project sponsored by the European Science Foundation: Software for Europe. The four contributions to this issue offer a true cross-section of ongoing history of computing in The Netherlands. Gerard Alberts and Huub de Beer return to the earliest computers at the Mathematical Center. As they do so under the perspective of using the machines, the result is, let us say, remarkable. Adrienne van den Bogaard compares the styles of software as practiced by Van der Poel and Dijkstra: so much had these two pioneers in common, so different the consequences they took. Frank Veraart treats us to an excerpt from his recent dissertation on the domestication of microcomputer technology: appropriation of computing technology is shown by the role of intermediate actors. Onno de Wit, finally, gives an account of the development, prior to the internet, of a national data communication network among large scale users and its remarkable persistence under competition with new network technologies.
Article
Users’clubs: customers’trade unions, marketing tools and ancestors of free software Computer user groups played an important role in the evolution of IT systems and professional practices. This was by contributing to specifying needs and products, constituting spaces for exchange and developing software, criticizing suppliers’ methods and approaches or exerting pressures on them, in order to change order of priorities as well as projects. In the French context, they also promoted “their” manufacturer’s offer towards clients, consultants and media. Finally, as they were concerned by defending their members’ data processing investment, they lobbied stockholders and administrations, by promoting the continuity of product ranges, if not of suppliers.
Article
The Bermuda Principles (1996) have been celebrated as a landmark for data sharing and open science. However, the form that data sharing took in genomics was a result of specific technological practices. Biologists developed and adopted technologies of the nascent World Wide Web and Free and Open Source Software (FOSS) communities for sharing biological information. These technologies supported decentralized, collaborative, and nonproprietary modes of production in biology. Such technologies were appealing not merely because they were expedient for genomic work but because they also offered a way of promoting a particular form of genomic practice. As the genome sequencing centers scaled up their sharing efforts, a small group of computer-savvy biologists used these tools to promote the interests of the public genome sequencing effort. The agreements at Bermuda should be understood as part of this attempt to foster a particular form of genomic work.
Article
Full-text available
By examining mobility in remote Arctic areas, we analyze how challenging environmental conditions – while affecting technology performance – evoke people's creativity and efforts as technology users. Based on historical materials and ethnographic observations of user inventiveness in the transport sector in the Russian North, we define and document a phenomenon of "proximal design," in three different modes: 1) the proximal complementation of "distant design" machines (trucks and military equipment) to ascertain their reliability; 2) the emergence of a new type of homemade all-terrain vehicles called "karakats," made from salvaged parts to specialize in times and locations where other vehicles turn unreliable; and 3) the traditional craft of sledge-making by nomadic reindeer herders of the Yamal Area, where even materials are proximally collected and shaped. Our main argument is that continuous tuning, modification and redesign of technology carried out by immediate users in situ makes it possible for humans and machines to function in extreme settings, and that this can also lead to the emergence of enduring design principles. We outline key characteristics of proximal design such as constraining environment, inventiveness by necessity, flexible construction, personalization and symbolic meaning, and social embeddedness of making/maintaining practices.
Article
In the second half of the 1970s, established computer firms and new IT start-ups chose alternative paths to offer commercial access control systems to organizational mainframe computer users. These developments in effect launched the computer security software products industry with IBM's Resource Access Control Facility (RACF) and SKK's Access Control Facility 2 (ACF2).
Article
Historians have demonstrated how systems like Usenet and Minitel fostered the social practices that we now associate with the TCP/IP Internet, but no one has considered networked computing in education. From 1965 to 1975, Minnesota implemented interactive computing at its public schools and universities with time-sharing systems - networks of teletypewriter terminals connected to computers via telephone lines. These educational networks, created with different priorities from military-sponsored networks, were user-oriented from the start and encouraged software sharing and collaboration. Focusing on the educational setting gives us a history of the Internet firmly grounded in the social and political movements of the long 1960s.
Article
This chapter adds certain moderate adjustments to the body of work promulgated by the National Academy of Science, and also the work of David Mowery, that deals with the notion of modern computing being the product of massive public investment and government funding. The goal of the chapter, however, is to suggest that private enterprise and private capital—and not just government funding—played certain roles in influencing computing. IBM, in particular, is given more focus here to determine how the firm has contributed to the emergence and refinement of the storage capacity of the computer from the end of World War II until the development of the System/360. The conclusion arrived at is that while certain activities of the government may have benefited IBM, the government also drew IBM away from various opportunities that might have allowed them to blossom without the required intervention from the public sector.
Article
This article argues that an information ecosystem emerged rapidly after World War II that made possible the movement of knowledge about computing and its uses around the world. Participants included engineers, scientists, government officials, business management, and users of the technology. Vendors, government agencies, the military, and professors participated regardless of such barriers as languages, cold war politics, or varying levels of national economic prosperity.
Article
Full-text available
What does knowledge do—the pursuit of it, the having and exposing of it, the receiving again of knowledge of what one already knows? How, in short, is knowledge performative, and how best does one move among its causes and effects? — Eve Kosofsky Sedgwick James Hixon, reflecting on these questions, notes that, "Sedgwick encourages us to think about knowledge as something that does, rather than something that is." But software is easily, if incompletely, understood as a kind of crystallized knowledge; most of our study of software, in computer science and beyond, is predicated on an understanding of it as a thing that does, with relatively little attention paid to what—or how—it is. In this project, we propose to ignore what it is that software does. Instead, we focus on what software is and the nature of its making: its poetics. Thus, the locus of our study is not software-in-execution but software-in-creation, or source code.
Article
This paper describes how information technology (IT) spread around the world, discussing the research and historiographical challenges for historians looking at this topic. It discusses patterns of adoption, spread of knowledge about IT, expanding modes of use, and implications for study of the diffusion of technologies.
Article
Full-text available
Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.
Article
The article discusses how historians can tell stories about software without focusing solely on the code itself. Software is the soul. The essence of code is its immateriality. An invisible spark of life, it controls the operation of the machine and can transmigrate from one host to another. It is bound not by the laws of this world but by those of another. The distinction between hardware and software partitions the careers, journals, conferences, interest groups, and businesses of computing into separate camps. In recent years it has also shaped the work of historians of computing, as software history has become an increasingly popular area of research. Technologies such as FPGAs, virtual machines, APIs, and microcoded instruction sets complicate the simple picture of programs directly manipulating hardware. Second, the recent Turing Centenary indicates that the founding insight of theoretical computer science is that hardware and software are, from the viewpoint of computability, almost entirely interchangeable.
Article
In 1965, four undergraduates at the University of Waterloo wrote Watfor, a fast student-oriented Fortran compiler for the school's IBM 7040, largely because the available Fortran compiler was slow and offered weak diagnostic and debugging tools. This article describes the birth and evolution of the Watfor family and explores how it fits into the University of Waterloo's unique-within-Canada cooperative education program and pedagogical philosophy.
Article
Purpose – The purpose of this study is to increase understanding of service innovation in networks. Especially the most loosely coupled forms of innovation networks, innovation communities, can be valuable in service innovation, but may not be manageable in the traditional sense. Rather, they may require orchestration characterized by discreet guidance that also accommodates the specific nature of services. Through informed orchestration, it is possible to deal with several contingencies, and influence the absorptive capacity at the network level to generate new service innovations. Design/methodology/approach – These issues are examined through literature review and a case study. Findings – The findings suggest that individual orchestration mechanisms may be more closely connected to certain contingencies than others, and that both orchestration mechanisms and contingency factors have a role in absorptive capacity development within service innovation networks. Research limitations/implications – While the case study approach limits the possibility to make wide generalizations, the in-depth insights provide valuable knowledge. Practical implications – There has been a shift from inter-firm competition towards competition between networks of organizations, increasing relevance of absorptive capacity at the network level. Originality/value – Despite the recent increase in service innovation literature, research on service innovation taking place in networks is scant. Knowledge on some aspects can be derived from more traditional notions on technological innovation, but both the distinctive features of services and central characteristics of innovation networks make it necessary to reconsider some of the established views. In particular, managing – or rather orchestrating – service innovation is still a challenging area.
Article
The International Business Machines Corporation adapted early on to the opportunities created by the cold war economy in the United States. This account of IBM's adjustment to the circumstances of that time unveils the detailed process by which a firm situated outside the traditional defense industries forged new institutional allegiances between business and government and between science and industry. Beginning in 1949, IBM's Applied Science Department, under the leadership of Cuthbert Hurd, enabled the company to enter new technical markets that had been created by federal research and defense expenditures. But there were also broader consequences to IBM's decision to embrace scientific culture, among them the transformation of its traditional sales and product development strategies in ways that were not indisputably functional.
Article
This Article tells the story of the contest over the meaning of "open systems" from 1980 to 1993, a contest to create a simultaneously moral and technical infrastructure within the computer industry.
Article
The historiography of computing has until now considered real-time computing in banking as predicated on the possibilities of networked ATMs in the 1970s. This article reveals a different story. It exposes the failed bid by Barclays and Burroughs to make real time a reality for British banking in the 1960s.
Article
Compared to some of the other early programming languages, Algol 60 was not particularly successful in practical terms. This fact presents something of a puzzle: how did a language which was a relative failure in practical terms later come to be regularly described as the most influential of early programming languages? This chapter suggests an answer to this question, arguing that what changed the face of programming was not simply the Algol 60 language, but rather a coherent and comprehensive research programme within which the Algol 60 report had the status of a paradigmatic achievement, in the sense defined by the historian of science Thomas Kuhn. This research programme established the first theoretical framework for studying not only the design of programming languages, but also the process of software development, the subject of the next chapter.
Article
The Hagley Museum and Library has served as a center for research in the history of technology since the 1960s. Yet even scholars who have done extensive research at the library may be unaware of the breadth and diversity of its archival holdings. This essay introduces recent additions to the library's holdings and provides suggestions for further research in older archival collections. Emerging themes in the history of technology, such as the environment, internationalization, safety, users, and gender, frame the discussion.
Article
In 1995 AOL announced that it would be converting its pricing plan from an hourly rate that ranged from $3 to $6 an hour to a flat monthly rate of $15.95. The increase in member subscriptions was expected to be significant, and a wave of concern swept through the large population of remote-staff volunteers, whose duties included monitoring electronic bulletin boards, hosting chat rooms, enforcing the Terms of Service agreement (TOS), guiding AOL users through the online community, and even creating content using AOL's own program, RAINMAN (Remote Automated Information Manager), the text scripting language and publishing tool that allowed remote staffers to update and change content on AOL. Chief among the remote-staff volunteers' concerns was the initiative to convert many of the volunteer accounts from overhead accounts, which had access to tools and privileges that put remote-staff volunteers' duties on par with those of in-house employees, to unbilled or discounted accounts. In a meeting held over electronic chat to address the emerging concerns of remote-staff volunteers, Bob Marean, a representative for AOL, confronted over 450 remote-staff volunteers.
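As a rough, purely illustrative aside (not part of the abstract above), the significance of the flat rate can be seen by working out when $15.95 a month undercut the old hourly pricing; the short Python sketch below simply reuses the $3 to $6 per hour range and the $15.95 monthly fee quoted above.

# Illustrative back-of-the-envelope calculation (figures taken from the abstract above):
# how many hours per month a member needed to be online before the flat rate
# became cheaper than the old hourly pricing.

FLAT_MONTHLY_FEE = 15.95          # new flat rate, dollars per month
HOURLY_RATES = (3.00, 6.00)       # old pricing ranged from $3 to $6 per hour

for rate in HOURLY_RATES:
    breakeven_hours = FLAT_MONTHLY_FEE / rate
    print(f"At ${rate:.2f}/hour, the flat rate pays off after "
          f"{breakeven_hours:.1f} hours per month.")

# Output:
# At $3.00/hour, the flat rate pays off after 5.3 hours per month.
# At $6.00/hour, the flat rate pays off after 2.7 hours per month.

At only a few hours of use per month the flat rate was the cheaper option, which is consistent with the expectation of a significant increase in member subscriptions.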
Article
The computer-services and software industry used to be conveniently divided into three main sectors: mass-market software vendors, enterprise software vendors, and computer services. The three sectors were distinct, because personal computers, corporate mainframes, and online computer networks operated in relative isolation. The arrival of the Internet effectively connected everything, facilitating the entry of mass-market vendors into enterprise software and of both mass-market and enterprise software vendors into computer services. As the turbulence of the first decade of the Internet era subsides, a gradual transition from traditional software products to "Web services" is taking place.
Chapter
From the beginning computer science has been a contentious subject, with practitioners disagreeing on whether computers and computing could (or indeed should) be the subject of a science and, if so, what kind of science it should be. As the subject has developed, it has grounded computing in a body of profound and elegant mathematical knowledge relating abstract machines to computational processes, thereby creating the field of “applied abstract algebra” and bringing the most advanced mathematics of the twentieth century to bear on the defining technology of our age. That theory has led the development of powerful tools for programming and for monitoring the operations of computers, thus placing the power of the computer in the hands of professional and non-professional users alike. Software engineering has generally followed suit, taking as its (as yet unfulfilled) goal the automation of the processes of designing and producing the software systems that automate processes in the world. Indeed, much of its literature is redolent with the language of machine-based production. Alongside this mainstream of development has flowed a current critical of its focus on the computer and aimed at a broader view of the subject and of humans both as agents and subjects of automation. These critics argue that, properly construed, informatics (informatique, Informatik, informatica) extends beyond the computer and its operations to include its social context and that a commensurate theory of informatics would reach beyond mathematics to encompass social and political theory. Despite the failure so far to create a viable alternative to computer science as currently construed, recent trends in software engineering suggest how these two streams of thought may be converging.
Article
Generalized report generation and file maintenance programs were widely used in the 1950s, standardized by the Share user group with 9PAC and Surge. By the 1960s the first recognizable DBMS systems, such as IMS and IDS, had evolved to address the challenges of disk drives and MIS projects. Finally, in the late 1960s Codasyl's Data Base Task Group formulated the DBMS concept itself.
Article
This paper is a history of Texaco's Corporate IT Function (IT) from its inception until Chevron acquired Texaco in 2001. The four decades of Texaco IT are best characterized by a contrast between the function's performance and its resources. According to third-party measures, Texaco IT was a top performer among both oil-industry IT functions and third-party service providers. Yet starting soon after its inception, the department endured a resource squeeze. As the workload increased, IT's relative resources shrank. Throughout its history, user dissatisfaction with the unit was prevalent. We believe that the Texaco IT story is a typical account of the experiences of many large corporate IT organizations. The unit was a success by the measures of the profession, but failed in the eyes of top management and business units.
Article
The first personal-computer spreadsheet, VisiCalc, was launched in May 1979. During the next decade, the spreadsheet evolved from a simple calculating aid to an indispensable tool of modern business, employed by tens of millions of users who had little or no direct computer experience. This article describes the development of spreadsheet usability from VisiCalc through Lotus 1-2-3 to Microsoft Excel.
Article
Whenever someone asks about SHARE, the first question is usually "What do the initials mean?" The answer is that SHARE is a name and not a set of initials. The second question is usually "Just what is SHARE?" SHARE has been frequently described as a "users' cooperative." It is made up of most of the organizations who have, or plan on getting, an IBM Type 704 EDPM. Like any cooperative, SHARE was formed to be of service to its members. Its aim is to eliminate, as much as possible, redundant effort expended in using the 704. It seeks to accomplish this aim by promoting cooperation and communication among installations that use the 704.