Chester Gordon Bell
Microsoft · Microsoft Research
S.B., S.M. Electrical Engineering, M.I.T.
About
203 Publications
93,820 Reads
5,084 Citations
Introduction
Publications (203)
Computation has increased 17 orders of magnitude over the sixty years I have been visiting and celebrating gains in computational environments. The first “supercomputers” ran at a million ops per second; in 2021 the fastest computers operate at exa-ops, or 10^18 ops, with gains of 10 million occurring in the last twenty-five years by exploiting paral...
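As a quick sanity check on the quoted rates, the annualized factor below is my own arithmetic on the abstract's numbers, not a figure from the text:

# Sanity check: a 10^7 gain over 25 years implies near-doubling each year.
gain = 1e7          # factor attributed to the last twenty-five years
years = 25
annual = gain ** (1 / years)
print(f"average annual growth factor: {annual:.2f}x")  # ~1.90x per year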
In retrospect, the first era of scientific computing, 1960-1995, was defined by Seymour Cray-designed computers hosting single-memory FORTRAN programs that could be executed by a dozen or so processors operating in parallel. In 1993, the first multicomputer with a thousand independent, interconnected computers outperformed mono-memory supercomput...
Scalable and coherent shared memory has been a long-sought-after but elusive goal. In contrast to today’s popular distributed-computing models, the authors present a software-defined server architecture that is a scale-up shared-memory multiprocessor, yet uses ubiquitous commodity scale-out clusters.
The Gordon Bell Prize is awarded each year by the Association for Computing Machinery to recognize outstanding achievement in high-performance computing (HPC). The purpose of the award is to track the progress of parallel computing with particular emphasis on rewarding innovation in applying HPC to applications in science, engineering, and large-sc...
The Gordon Bell Prizes chronicle the important innovations and transitions of HPC, beginning with the first award in 1987, which demonstrated that Amdahl's Law was not impenetrable. For example, MPI provided both a model and a standard for rapid adoption. Almost every gain in parallelism has been recognized, from widely distributed workstations to...
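For reference, the barrier those first winners pushed past is Amdahl's Law; a minimal sketch (the formula is standard, the example serial fraction is my own):

# Amdahl's Law: speedup on n processors for a program with serial fraction s.
def amdahl_speedup(s: float, n: int) -> float:
    return 1.0 / (s + (1.0 - s) / n)

# Even a 1% serial fraction caps speedup near 100x regardless of processor
# count -- the limit early Bell Prize entries attacked by scaling problem
# size with the machine (Gustafson's reformulation).
for n in (10, 100, 1000, 10_000):
    print(n, round(amdahl_speedup(0.01, n), 1))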
New Scientist Gordon Bell Interview January 08, 2014
The temptation to constantly refer to the web, or lifelog every event, or wear a watch (or other personal device) that connects you with the pulse of available information is strong. But is it a good idea to be that much online? Our panelists will address this question, each presenting a short opening statement about why their position (pro or con) i...
Ideally, if enough were known about all the variables affecting heart rate, the time of the next beat could be known. Of course this means knowing: the environment, e.g. temperature, air density, wind, air quality; activity level, e.g. sleeping, sitting, standing, walking, running, biking, rowing; diet and digestive loads, including stimulants; physic...
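As a toy illustration only (hypothetical features and synthetic data, nothing from the paper), a first cut at predicting the next inter-beat interval from such variables could be a simple regression:

# Toy sketch: regress the next inter-beat interval on a few of the variables
# named above (temperature, activity, stimulant load). Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # stand-ins for the three variables
beta_true = np.array([0.02, -0.15, 0.05])
y = 0.8 + X @ beta_true + rng.normal(scale=0.02, size=200)  # interval, seconds

A = np.column_stack([np.ones(len(X)), X])     # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares
print("fitted intercept and weights:", coef.round(3))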
The “ideal supercomputer” has an infinitely fast clock and executes a single instruction stream program operating on data stored in an infinitely large and fast single memory. Backus established the von Neumann programming model with FORTRAN. Supercomputers have evolved in steps: increasing processor speed, processing vectors, adding processors for a...
A Half-century of Supercomputing History. Personal Reminiscences: Engineer, researcher, bureaucrat, supporter, investor, critic, & historian of the “amazing race.”
On visiting Livermore and seeing LARC a half century ago, I marveled at a kind of large computer built at the edge of technology and a market’s ability to pay. As a machine builder, researcher, administrator, supporter, investor, critic, and historian there’s nothing quite like supercomputers. I’ll posit the mono-memory monocomputer aka SR Cray com...
The goal of this presentation is to present a view and theory of how a dozen different kinds of computers formed from 1951-2015.
Hardware technology generations enable new classes: vacuum tubes (1951), discrete transistor circuits (1959), integrated circuits (1966), and an evolving microprocessor: 4-bit (1971) > 8 > 16 (1978) > 32 (1982) > 64-bit (199...
Gordon Bell, one of the first people to chronicle his existence digitally, explains how it has changed his life and the potential pitfalls.
How did lifelogging take off?
The secret was to be interviewed by New Scientist: your 2002 article about our MyLifeBits project was so widely read by other journalists that coverage snowballed.
This (his)story is about the establishment of players, smartphones, and tablets as three new computer classes, i.e. de facto, standards-based “platforms” for personal computing, 2001-2010. The three classes can be explained by Bell's Law, described in Appendix 1: hardware technology, networks, and interfaces allow new, smaller, more specialized compu...
Since my first visit to Livermore in 1961, seeing the LARC, recalling the elegance of the 6600, and simply observing this computer class evolve have been high points of my life as an engineer and computing observer. Throughout their early evolution, supercomputer architecture “trickled down” in various ways for use with other computers. In the...
During the 1960s a new class of low-cost computers evolved, which were given the name minicomputers. Their development was facilitated by rapidly improving performance and declining costs of integrated circuit technologies. Equally important were the entrepreneurial activities of many companies. By the mid-1980s, nearly one hundred companies had pa...
James Nicholas (“Jim”) Gray was born January 12, 1944, in San Francisco, California, and lost at sea January 28, 2007, during a trip to cast his mother’s ashes at the Farallon Islands near San Francisco. The enormity of the loss to his personal friends and to computer science— especially the database community that he helped create and lead—was qui...
Lives is a system to author and visualize stories based on a collection of biographical and historical multimedia content enhanced with event objects. Stories are constructed as hyperlinked slide-shows, which may also be visualized in a timeline. Besides manually created hyperlinks, Lives also supports the discovery of stories and media that inters...
The 2011 opening at the Computer History Museum of the world’s largest and most complete physical and cyber exhibit of computing history marks the sixth stage of a public museum’s evolution, which began in 1975 with a closet-sized exhibit in a Digital Equipment Corporation building, migrating to The Computer Museum, Boston. It now lives in a 119,0...
Gordon Bell and Allen Newell authored Computer Structures: Readings and Examples in 1971, and with them Daniel Siewiorek helped create the follow-up book Computer Structures: Principles and Examples in 1982. In this Anecdotes article, authors Bell and Siewiorek share their recollections from writing these foundational technical books. The indirect e...
Enterprise and scientific data sets double every year, forcing similar growth in storage size and power consumption. As a consequence, current system architectures used to build data warehouses are about to hit a power consumption wall. In this paper we propose a novel alternative architecture comprising a large number of so-called Amdahl blad...
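For context, the “Amdahl blade” name invokes Amdahl's classic balance rule of thumb: roughly one bit of I/O per second for every instruction per second. A toy balance check follows; the hardware figures are illustrative assumptions, not measurements from the paper:

# Amdahl number = I/O bandwidth (bits/s) / instruction rate (instructions/s).
# A balanced system per Amdahl's rule of thumb scores near 1; compute-heavy
# servers typically score far lower. Figures below are illustrative only.
def amdahl_number(io_bytes_per_s: float, instr_per_s: float) -> float:
    return (io_bytes_per_s * 8) / instr_per_s

server = amdahl_number(io_bytes_per_s=500e6, instr_per_s=50e9)   # big server: 0.08
blade  = amdahl_number(io_bytes_per_s=100e6, instr_per_s=1.6e9)  # low-power blade: 0.5
print(f"server: {server:.2f}, blade: {blade:.2f}")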
The demands of data-intensive science represent a challenge for diverse scientific communities.
Data intensive computing presents a significant challenge for traditional supercomputing architectures that maximize FLOPS, since CPU speed has surpassed the IO capabilities of HPC systems and Beowulf clusters. We present the architecture for a three-tier commodity component cluster designed for a range of data intensive computations operating on petasc...
James Nicholas Gray's understanding and experimentation gave him a special perspective. From 1995 his commitment was building indefinitely scalable tools by working on really hard data-intensive application problems with other scientific disciplines. His attention to research for both understanding and use made him a unique researcher. Jim pioneere...
Bell's general theory for the creation, evolution, and death of various price-based computer classes, explaining the history of the computing industry based on the properties of computer classes and their determinants, is presented. The theory maintains that each class establishes a horizontally structured industry composed of hardware compon...
In 1951 a man could walk inside a computer. By 2010, a computer cluster with millions of processors will have expanded to building size. In this new paper Gordon Bell explains the history of the computing industry, positing a general theory (“Bell's Law”) for the creation, evolution, and death of computer classes since 1951. Using the exponential tr...
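A minimal sketch of the exponential pattern the law posits, assuming (my simplification) a new class roughly each decade at about one-tenth the prior class's price:

# Bell's Law sketch: each decade enables a new computer class at roughly
# one-tenth the previous price point. Constants are illustrative.
def class_price(year: int, base_year: int = 1951, base_price: float = 1e6) -> float:
    decades = (year - base_year) / 10
    return base_price * 10 ** (-decades)

for year in (1951, 1961, 1971, 1981, 1991, 2001):
    print(year, f"~${class_price(year):,.0f}")   # $1,000,000 down to $10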
The International Lattice Data Grid (ILDG) is an international collaboration that creates standards to enable sharing of data produced by Lattice Quantum ChromoDynamics (QCD) simulations, which are very computationally expensive. In this paper we summarize ...
New systems may allow people to record everything they see and hear—and even things they cannot sense—and to store all these data in a personal digital archive
Petascale computers providing a factor of thirty increase in capability are projected to be installed at major Department of Energy computational facilities by 2010. The anticipated performance increases, if realized, are certain to change the implementation of computationally intensive scientific applications as well as enable new science. The ver...
One of the things that distinguishes human beings from other species is the magnitude to which we manipulate our (largely synthetically created) environments and our technologies in order to augment ourselves physically and mentally. Supporting our individual as well as collective memory has been a particularly important endeavor as we have continu...
A balanced cyberinfrastructure is necessary to meet growing data-intensive scientific needs. We believe that available resources should be allocated to benefit the broadest cross-section of the scientific community. Given the power-law distribution of problem sizes, this means that about half of funding agency resources should be spent on tier-1...
The January 2001 Communications article “A Personal Digital Store” described our efforts to encode, store, and allow easy access to all of a person’s information for personal and professional use [1]. The goals included understanding the effort to digitize a lifetime of legacy content and the elimination of paper as a permanent storage medium. We us...
The real problem is not space, but how to “make use of the record,” as Vannevar Bush said in his 1945 classic “As We May Think.” His Memex vision gave us a manifesto for creating MyLifeBits software, with specifications for links, text and audio annotations, and head-mounted cameras. Extending Memex into more media (video, sensors, etc.) includes...
History shows how abuses of the standards process have impeded progress. Over the next decade, we will encounter at least three major opportunities where success will hinge largely on our ability to define appropriate standards. That's because intelligently crafted standards that surface at just the right time can do much to nurture nascent industr...
Within five years, our personal computers with terabyte disk drives will be able to store everything we read, write, hear, and many of the images we see including video. Vannevar Bush outlined such a system in his famous 1945 Memex article [1]. For the last four years we have worked on MyLifeBits (http://www.MyLifeBits.com), a syst...
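A back-of-envelope estimate of why a terabyte can suffice; the daily rates are my assumptions in the spirit of the project's published estimates, not figures from this abstract:

# Rough lifetime storage estimate with assumed capture rates.
KB, GB, TB = 1e3, 1e9, 1e12

days = 60 * 365                          # sixty adult years
text_read    = days * 100 * 5 * KB       # ~100 pages/day, ~5 KB of text per page
text_written = days * 10 * KB            # ~10 KB/day of email, notes, documents
audio        = days * 8 * 3600 * 1 * KB  # 8 h/day of speech at ~1 KB/s compressed

total = text_read + text_written + audio
print(f"read: {text_read/GB:.0f} GB, written: {text_written/GB:.1f} GB, "
      f"audio: {audio/TB:.2f} TB, total: {total/TB:.2f} TB")  # well under 1 TB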
Passive capture lets people record their experiences without having to operate recording equipment, and without even having to give recording conscious thought. The advantages are increased capture, and improved participation in the event itself. However, passive capture also presents many new challenges. One key challenge is how to deal with the i...
There are endless survival challenges for newly created businesses. The degree to which a business successfully meets these challenges depends largely on the nature of the organization and the culture that evolves within it. That's to say that while market size, technical quality, and product design are obviously crucial factors, company failures a...
Storage trends have brought us to the point where it is affordable to keep a complete digital record of one's life, and capture methods are multiplying. To experiment with a lifetime store, we are digitizing everything possible from Gordon Bell's life. The MyLifeBits system is designed to store and manage a lifetime's worth of data. MyLifeBits enab...
Review of visualization conferences going back to a 1990 visualization keynote. The challenge for supercomputer users is at the extreme scale. For lifelogging, the challenge is to be able to show a lifetime, e.g. timelines and relationships.
Storage trends have brought us to the point where it is affordable to keep a complete digital record of one's life. The MyLifeBits system is designed to store and manage a lifetime's worth of data. To experiment with a lifetime store, we have digitized everything possible from Gordon Bell's life. These are added to his existing digital assets. We a...
The home of the future will have an all-digital network for all media, backed by multi-terabyte storage. Users will be able to keep an entire lifetime of personal media, and vast collections of media that may be of interest for future viewing, reading, or listening. MyLifeBits is a personal store for a digital life, designed to support efficient organ...
By 2047, one can imagine a body-networked computer that can capture and retrieve everything its master can hear, read, and see. It could have as much memory and processing power as its master, that is, 1,000 million operations per second and a memory of 10 terabytes. Content and all electronically encodable information will be in cyberspace. Computers are predicted...
The Story of Digital Equipment Corporation. Lessons on Innovation, Technology, and the Business Gener...
MyLifeBits is a project to fulfill the Memex vision first posited by Vannevar Bush in 1945. It is a system for storing all of one's digital media, including documents, images, sounds, and videos.
The article focuses on the U.S.-based Home Media Networks Ltd. Home media acquisition, production, storage and use are on the cusp of a radical change as personal computer and network technologies integrate all media. Most current residences contain a jumbled mix of analog and digital equipment that will be replaced by all-digital, networked media...
The payment of a June 1996 bet with Jim Gray is overdue – I bet that: “By April 1, 2001, 50% of the PCs that run a Microsoft OS will ship with:
• a videophone with 1-10 frames per second, and
• telephone-quality voice.
Focuses on the architectures of high-performance scientific computers in the United States. Definition of a computer cluster; information on the Beowulf Project; expectation of a revolutionary technology.
MyLifeBits is a project to fulfill the Memex vision first posited by Vannevar Bush in 1945. It is a system for storing all of one's digital media, including documents, images, sounds, and videos. It is built on four principles: (1) collections and search must replace hierarchy for organization (2) many visualizations should be supported (3) annotat...
After 50 years of building high performance scientific computers, two major architectures exist: (1) clusters of "Cray-style" vector supercomputers; (2) clusters of scalar uni- and multi-processors. Clusters are in transition from (a) massively parallel computers and clusters running proprietary software to (b) proprietary clusters running standard...
Finding a place to efficiently store all of one's digital materials.
CyberAll is a project to archive all my personal and professional information content including that which has been computer generated (since the mid 70s), scanned and recognized, and recorded on VHS tapes. The archive includes books, correspondence (i.e. letters, memos, and email), transactions, papers, photos and photo albums, and video taped lec...
This article appears at http://research.microsoft.com/pubs/ 2
TR published in CACM January 2001
First interview re. Cyberall project that became MyLifeBits
This paper is based on a talk given to the Catalan Institute. It covers the current rise, exploitation, and future of cyberspace and its industry.
Distributed shared memory computers (DSMs) have arrived (G. Bell, 1992; 1996) to challenge mainframes. DSMs scale to 128 processors with two to eight processor nodes. As shared memory multiprocessors (SMPs), DSMs provide a single system image and maintain a “shared everything” model. Large scale UNIX servers using the SMP architecture challenge mai...
Our first speaker is known throughout the industry as the Father of the Minicomputer, although I’m sure he has mixed feelings about this. And I guess he is the embodiment of the saying, “You only know where you’re going if you know where you’ve been.” He’s been a very significant part of the industry since the very early days, so when he talks abou...
Over the PDP-11's six-year life about 20,000 specimens have been built based on 10 species (models). Although range was a design goal, it was unquantified; the actual range has exceeded expectations (500:1 in memory size and system price). The range has stressed the basic mini(mal) computer architecture along all dimensions. The main PMS structure,...
A Telepresentation is a presentation in which the presenter and/or some of the audience members are not physically present but are telepresent – in a different location and/or at a different time. Telepresentations promise to reach a wider audience by transmitting and/or recording the presentation for viewing at a different place and/or time and ti...
The article discusses the new technology that allows people to attend business conferences without being physically present. This is about telepresentations -- a presentation in which the presenter and/or some of the audience members are not physically present but are telepresent, that is, in a different location or at a different t...
Today, telepresentations are practical and low cost. They offer enormous advantages by making presentations that are delivered via slides and talking heads widely available at low cost by allowing presenters and viewers to be telepresent. Over time, we expect presentations ranging from courses to conferences to all be easily viewed via the Web so t...
By 2047, almost all information will be in cyberspace (1984) -- including all knowledge and creative works. All information about physical objects including humans, buildings, processes, and organizations will be online. This trend is both desirable and inevitable. Cyberspace will provide the basis for wonderful new ways to inform, entertain, and e...
Proceedings of the 50th Anniversary of the ACM, held in San Jose, CA on 2 March 1997.
Invited talk for the 50th anniversary of the ACM Conference. Description of historical predictions about computers and the computing industry. Future speculations about computing. See talk at: https://www.youtube.com/watch?v=nNTTNRrtQfk
The inevitability of complete computer systems on a chip will create a microsystems industry. In addition, forecasters predict 32-Mbyte memory chips by 1999, so by 2002 we would expect a personal computer on a chip. The industry's segments include: ECAD environments tied to particular architectures and software; and fabless and chipless IP companies that supply designs for royalty...
Like everyone who knew Allen, I feel deeply honored to have known him and was influenced by him in many ways. Allen was the most thoughtful, kind, and gentle gentleman I know. His intellect, coupled with his enthusiasm and smile, virtually always led a group in the right direction. He was the role model for a scientist, teacher, husband, father, a...
The dream of the Information Superhighway is one in which audio (telephone), video (television), information (news, libraries, images) and data are combined in a single network, universally available and inexpensive. Crippled by low bandwidth, the Internet remains a crude prototype of the Information Superhighway. The telephone and cable TV industr...
Two 50-minute video lectures on the pioneers who built the first computers, from Atanasoff to Zuse.
Internet 1.0, 2.0, and 3.0: It’s the Symmetry and Bandwidth, Stupid!
Internet use is optimistically projected to reach one billion users in 2001! Cable TV and Telecommunications Industries are ignoring the bandwidth and symmetry requirements for the Internet. Various waves of use will evolve depending on the bandwidth.
Business Implications
Users are driving vendors to adopt open system standards that achieve portability and interoperability. However, given vendors' propensity to create unique products, a single operating system running on multiple platforms, which achieves essentially the same results, may be the only feasible way to meet these demands.
The pas...
Scalable, massively parallel processing computers promise to become the most cost effective approach to computing within the next decade, and the means by which to solve particular, difficult, large-scale commercial and technical problems. The commercial and technical markets are fundamentally different. Massively parallel processors may be more u...
The 1990s will be the era of scalable computers. By giving up uniform memory access, computers can be built that scale over a range of several thousand. These provide high peak announced performance (PAP) by using powerful, distributed CMOS microprocessor-primary memory pairs interconnected by a high performance switch (network). The parameters tha...
fast, low-latency networks based on ATM. Like MPPs, these networks offer size scalability (from fewer to more processors), but they also offer generation scalability (from previous to future generations) and space scalability (from multiple nodes in a box, to computers in multiple rooms, buildings, or geographic regions). Furthermore,...
The quest for the teraflops supercomputer to operate at a peak speed of 10^12 floating-point operations per second is over a decade old. Between 1987 and 1992, improvement occurred by a factor of 133. A teraflop supercomputer ($30 million) could be reached in 1996 based on the following: fast, cheap CMOS microprocessors; entrepreneurial co...
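Annualizing the quoted 133x figure is simple arithmetic on the abstract's own numbers:

# A 133x improvement over the five years 1987-1992, annualized:
factor, years = 133, 5
annual = factor ** (1 / years)
print(f"{annual:.2f}x per year")  # ~2.66x per year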
The developments in high-performance computers towards achieving the goal of a teraflops supercomputer that would operate at a peak speed of 10^12 floating-point operations per second are reviewed. The net result of the quest for parallelism as chronicled by the Gordon Bell Prize is that applications evolved 115% per year and will most li...
The quest for the Teraflops Supercomputer to operate at a peak speed of 10^12 floating-point operations per second is almost a decade old, and only one three-year computer generation from being fulfilled. The acceleration of its development would require an ultracomputer. First-generation ultracomputers are networked computers using switches that int...
Computer designers have been trying for a decade to build supercomputers that run at speeds near one teraflops (10^12 floating point operations per second). Accelerating this achievement would require the development of what I term ultracomputers, which rely heavily on parallel processing.
In my judgment, substantially more powerful computers wil...
During the last 25 years, the author has never really considered any alternative to the multiprocessor (mP) for general-purpose, cost-effective, non-minimal computers. This involvement with mPs at Digital Equipment Corporation (DEC), Carnegie Mellon University (CMU), Encore, Stardent, and Kendall Square Research (KSR) included 16 computers. Fourtee...
Shopping for Supercomputers: In the market for supercomputing? Look beyond Cray supercomputers, mainframes, Alliant minisupers, and Digital minis. Instead, discover the power built around super workstations. Six options are described.
According to Congress and the press, technical computing is in trouble. But in fact, the future looks brighter than ever. New classes of computers and new software are being created by new and existing companies. Only the growth rate for the traditional supercomputer might be slow.
The reason is straightforward. A user can often do the same computat...