George O. Strawn, PhD
National Academy of Sciences (NAS) · Policy and Global Affairs Division
68 Publications · 20,545 Reads
13,671 Citations
Publications (68)
Deep learning and generative artificial intelligence originated in 1943, before our age of digital computers. These origins focused on understanding the functioning of the brain and creating an artificial model of it. The three masterminds who initiated this now very important strand of IT are highlighted and the elements of thei...
Generative AI (GenAI) is the newest dimension of AI to catch the public’s attention. Some observers think this may be an inflection point on the road to general artificial intelligence. It’s too soon to make such judgments, but it’s not too soon to begin looking at the people who are developing it, not too soon to be seeking a basic underst...
IT has proceeded through four phases like other technologies: experimental, exotic, manufacturable, and ubiquitous. IT Professional has been in existence since IT became ubiquitous. A brief history of IT through these phases is reviewed, highlighting “Mastermind” articles relating to each phase.
Herman Hollerith was an American inventor and engineer who played a significant role in the development of computing and information processing systems. He is best known for inventing the punched card tabulating machine, which was used for processing data and performing calculations. This invention laid the foundation for modern data processing and...
This article presents my observations on how computing and computing education have evolved throughout my professional career; it concludes with some guesses about changes that may come in the future.
The design of computing machines might well be said to have begun in the early 19th century with the stories of two English titans, Charles Babbage and Ada Lovelace.1,2,3,4 However, the practice of computing machines had to wait until the 20th century when electromechanical and electronic technologies supplanted the merely mechanical technology of...
The acronym FAIR (findable, accessible, interoperable, and reusable) refers to an international movement to effectively manage scientific data. The FAIR digital object (FDO) is an approach to implement FAIR data.1,2
In addition to the previous intensive discussion on the “Data Deluge” with respect to enormous increase of available research data, the 2022 Internet-of-Things conference confirmed that in the near future there will be billions if not trillions of smart IoT devices in a very wide range of applications and locations, many of them with computational...
I began writing occasional articles about the possible effects of IT on future employment in 2016. Although that was only six years ago, it seems much longer, given all that has happened since then. Some of these happenings were “unknown unknowns,” which helps confirm Yogi’s quip that “predictions are hard, especially about the future.” This artic...
As efforts advance around the globe, the US falls behind
On August 2, 2021, a group of concerned scientists and US funding agency and federal government officials met for an informal discussion to explore the value and need for a well-coordinated US Open Research Commons (ORC): an interoperable collection of data and compute resources within both the public and private sectors that are easy to use and ac...
Deep learning is currently the most successful AI technique. This article describes the origins of deep learning, explains how it works, and offers brief biographies of three recognized pioneers of deep learning research. It concludes with a brief description of protein folding, which is arguably its most significant application to date.
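The article is a survey, but as a rough illustration of the mechanism it summarizes: a deep network is repeated layers of weighted sums followed by nonlinearities. The sketch below, with made-up layer sizes and random weights (not anything drawn from the article), shows one forward pass in Python.

```python
import numpy as np

# Minimal illustration of what a "deep" network computes: alternating
# linear maps (weight matrices) and nonlinearities. Sizes and weights
# here are arbitrary; a real network learns its weights from data.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

x  = rng.normal(size=4)        # toy 4-feature input
W1 = rng.normal(size=(8, 4))   # hidden-layer weights
W2 = rng.normal(size=(1, 8))   # output-layer weights

hidden = relu(W1 @ x)          # layer 1: weighted sum + nonlinearity
output = W2 @ hidden           # layer 2: linear readout
print(output)
```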
The 2018 paper titled “Common Patterns in Revolutionary Infrastructures and Data” has been cited frequently since we compared the current discussions about research data management with the development of large infrastructures in the past, believing, like philosophers such as Luciano Floridi, that the creation of an interoperable data domain...
The Arpanet origins of the Internet are well known. For example, see “Masterminds of the Arpanet.”1 Not as well known is the NSFnet, which was the essential step between the Arpanet and the Internet. This article will highlight the unlikely origin of the NSFnet, its development over ten years (1985–1995), and its transition to the Internet of today...
In 1946, the American Institute of Electrical Engineers (AIEE) established a Committee on Large-Scale Computing. In 1951, the Institute of Radio Engineers (IRE) established a Professional Group on Electronic Computers. In 1963, these societies merged to form the IEEE, and they combined their computing activities into a Computer Group. Finally, in 1...
In this article, the effect of the COVID-19 pandemic on public education, both now and in the near future, is considered. These effects, which the author examines, are predicted to hasten the arrival of public education's long-term future.
Much research is dependent on Information and Communication Technologies (ICT). Researchers in different research domains have set up their own ICT systems (data labs) to support their research, from data collection (observation, experiment, simulation) through analysis (analytics, visualisation) to publication. However, too frequently the Digital...
The introduction of a new technology or innovation is often accompanied by “ups and downs” in its fortunes. Gartner Inc. defined a so-called Hype Cycle to describe a general pattern that many innovations experience: technology trigger, peak of inflated expectations, trough of disillusionment, slope of enlightenment, and plateau of productivity. Thi...
Reports on employment and hiring trends in the IT industry during times of pandemics.
The FAIR principles have been widely cited, endorsed and adopted by a broad range of stakeholders since their publication in 2016. By intention, the 15 FAIR guiding principles do not dictate specific technological implementations, but provide guidance for improving Findability, Accessibility, Interoperability and Reusability of digital resources. T...
As it dawned, people thought that the 21st century would be "just another century" in the age of science and technology; however, it turned out to be the beginning of a new - as of yet unnamed - age. This brief article will highlight some of the important changes that occurred in the 21st century, changes which may help determine an appropriate nam...
In perusing the earliest computers, we have often seen different definitions of what a computer was. This is certainly understandable as the pioneers juggled concepts and available technologies. Arthur Burks wrote three books1 describing progress toward what we consider today to be an electronic digital computer: the first electronic digital comput...
This article combines the thoughts from two different books: "The Globotics Upheaval: Globalization, Robotics and the Future of Work" by Richard Baldwin (an economist) and "Moving Beyond Fear: Upending the Security Tales in Capitalism, Fascism and Democracy" by Charles Derber and Yale Magrass (sociologists).
Discusses the work of Konrad Zuse and Heinrich Billing: two German pioneers of digital computers. Zuse (1910–1995) used telephone relays as opposed to vacuum tubes as the active computing elements in his early Z-series computers, and the programs were executed from an external tape; however, the Z3 was arguably the first implementation of a Univers...
Low molecular weight organic compounds (LMWOC) represent a small but critical component of soil organic matter (SOM) for microbial growth and metabolism. The fate of these compounds is largely under microbial control, yet outside the cell, intrinsic soil properties can also significantly influence their turnover and retention. Using a chronosequenc...
Reports on the blockchain phenomenon - what it is, how it is used, and prospects for blockchain use in the future.
This article is a reflection on the work of Vannevar Bush (1890-1974), who was the grand old man of science during and a bit beyond the Second World War. His computing credentials were of the analog variety, and his career carried him well beyond computing and engineering into the highest ranks of science leadership.
This article explores the revolutionary possibility that the new IT is changing the capitalist system itself.
The author examines a US National Academies study that looks at employment in the 21st century, and what automation and other technological advances might mean for workers in the future.
The guest editors of this special issue provide readers with a unique mix of perspectives to develop a deeper understanding of the issues surrounding Moore's law and the prospects for continued exponential growth in the coming era of computing. The field is indeed too large for a thorough treatment of all its aspects within one special issue. Never...
In 2016, the world lost yet another computing pioneer, Erich Bloch. The author examines Bloch's life and accomplishments, including his work at IBM and his time as director of the NSF.
There are several possible relationships between long-term, automation-caused unemployment and short-term, offshoring-caused unemployment. The author describes some of these relationships, examines the connection between unemployment and the presidential election of 2016, and looks at two recent interviews that express concern about automation-caus...
The author reviews Homo Deus: A Brief History of Tomorrow, by historian Yuval Harari. He examines Harari's speculation about long-term consequences of automation that go beyond unemployment to ideology. In particular, Harari suggests that one potential ideology of the future is dataism, in which we trust technology with more and more of our decision...
In this installment, the authors examine the life of Norbert Wiener. They describe his early life and career, as well as his creation of and contributions to cybernetics, and his status as the 'dark hero' of the Information Age.
Marvin Minsky and Seymour Papert had closely joined careers at MIT in artificial intelligence and other fields. This article highlights some of their contributions.
The author looks at data science applied specifically to scientific disciplines, which is called data-intensive science.
The last installment of this column dealt with the specter of IT-caused unemployment. Here, the author considers a new IT-created employment opportunity - the data scientist. He looks at data, information, and knowledge and current IT job classifications to provide context, describes how big data has inspired the field of data science, and defines...
There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders—representing academia, industry, funding agencies, and scholarly publishers—have come together to design and jointly endorse a concise and measurable set of principles that we refer to as the FAIR Data Principles. The intent...
In 1970, an article by Edgar F. Codd determined the primary direction of high-level database languages for the next 40 years. The authors give an overview of the relational model for database systems, then highlight three of the masterminds who created and developed it: Codd, Michael Stonebraker, and Larry Ellison.
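For readers who have not seen the relational model in action, a minimal sketch of the idea Codd introduced: data stored as tables (relations) and queried declaratively, without specifying an access path. The table and column names below are invented for the example, and Python's built-in sqlite3 module stands in for a full database system.

```python
import sqlite3

# A tiny relational database held in memory.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
con.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                [(1, "Ada", "Research"), (2, "Grace", "Systems"), (3, "Claude", "Research")])

# A relational query: select rows and project a column, leaving the
# "how" (indexes, scan order) entirely to the database engine.
for (name,) in con.execute("SELECT name FROM employee WHERE dept = ?", ("Research",)):
    print(name)
```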
Many are attempting to predict the future of employment in a 'post-industrial' world. This future could well be determined by IT-based products and services. In this inaugural column on IT and future employment, the author examines the prediction that IT and related technologies are in the early stages of bringing about massive, systemic unemployme...
Modern information technology is transforming the collection, management, and sharing of scientific data in ways that greatly encourage convergence. Data-intensive science has evolved beyond the point at which all the information required for a research study can be centrally located, so interoperability across systems is required, with the additio...
In 1965, Gordon Moore predicted that the number of transistors on a chip would double every year for the next 10 years. Moore's law is still in effect today, with more than a billion transistors able to fit on a chip as of 2010. This article revisits Moore's law and the rise of microelectronics.
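As a back-of-the-envelope check on the "more than a billion transistors by 2010" figure, the sketch below projects transistor counts forward from the Intel 4004 of 1971 (roughly 2,300 transistors), assuming the commonly cited two-year doubling period. Both the starting point and the doubling period are illustrative assumptions, not figures from the article.

```python
# Moore's-law arithmetic: exponential doubling from an assumed baseline.
start_year, start_count = 1971, 2_300   # Intel 4004, approximate transistor count
doubling_period_years = 2               # commonly cited form of the law

for year in (1980, 1990, 2000, 2010):
    doublings = (year - start_year) / doubling_period_years
    count = start_count * 2 ** doublings
    print(f"{year}: ~{count:,.0f} transistors per chip")
```

Run as written, the 2010 projection comes out around 1.7 billion transistors, consistent with the abstract's claim.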
In 2011, the US government asked for game-changing research that would fundamentally improve the security, safety, and trustworthiness of the national digital infrastructure. Agencies have responded with novel ideas, from moving target management to tailored trustworthy spaces. The Web extra at https://youtu.be/ZdUDIDolCM4 is a video of Vint Cerf i...
Seymour Cray is universally known as the father of supercomputing. This article describes some of Cray's many contributions to supercomputing as he worked in five different corporate environments from 1951 until his death.
This sketch of Grace Murray Hopper, the first famous female computer scientist, focuses on her early programming days, creation of the first compiler, leadership in creating the Cobol language, and latter-day speaking career. It also highlights the Grace Hopper Celebration of Women in Computing conference, which, since 1994, has brought together an...
Claude Shannon helped create the digital IT revolution by contributing to both digital computing and digital communications. Learn about Shannon's contributions to digital circuit theory and information theory and the connection he initiated between information and physics.
Some algorithms make for "better" programs than others--that is, programs that execute in less time or require less memory. How can we quantify differences to determine which algorithms are better? No one has done more to answer this question than Don Knuth, who has been called the "father of the analysis of algorithms."
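As a small illustration of the kind of question the analysis of algorithms answers (not an example taken from Knuth's work), the sketch below times two ways of testing membership in a sorted list: a linear scan that does O(n) comparisons and a binary search that does O(log n). The list size and target are arbitrary.

```python
import timeit
from bisect import bisect_left

data = list(range(1_000_000))   # a large sorted list
target = 999_999                # worst case for the linear scan

def linear_contains(xs, x):
    # O(n): examine elements one by one.
    for v in xs:
        if v == x:
            return True
    return False

def binary_contains(xs, x):
    # O(log n): repeatedly halve the search range.
    i = bisect_left(xs, x)
    return i < len(xs) and xs[i] == x

print("linear:", timeit.timeit(lambda: linear_contains(data, target), number=10))
print("binary:", timeit.timeit(lambda: binary_contains(data, target), number=10))
```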
The installment highlights the masterminds who created the Web, enabled its popularization via the Web browser, and greatly extended its value by enabling Web search.
The Arpanet, which was the beginning of the Internet, was born out of the Defense Advanced Research Projects Agency (DARPA), which was created in 1958. The Arpanet was built on 56,000 bits per second telephone lines. A young MIT computer scientist named Larry Roberts settled on a network design that connected these lines to small computers, called...
IEEE IT Professional magazine has added a new column this year called Masterminds of IT, which profiles innovators, inventors, and key people in the fields of IT, computer science, and information systems. Since I'm fortunate enough to be the first editor/author of this column, I'm happy to have this opportunity to talk about it. First, I'll discus...
This installment profiles John Atanasoff, John Mauchly, and John von Neumann--the "fathers of the electronic digital computer."
In this first installment of IT Professional's new Mastermind department, which will profile innovators, inventors, and key people in the fields of IT, computer science, and information systems, George Strawn reflects on the father of computer science, Alan Turing. He focuses on Turing's early theoretical work, noting how unusual it is for a major...
These keynote sessions discuss the following: the US big data initiative in cloud; using crowdsourcing for data analytics; cloud computing transformation in services and the next generation of cloud architecture; and managing the data flood in scientific domains.
A panel of distinguished information technology executives describes how Web services, services computing, service-oriented architectures, and services-centric models are changing their organizations. The panelists are members of IT Professional Magazine's Advisory and Editorial Boards, and hold (or have held) C-level positions in government, indus...
A new data structure is presented which may be used to specify programming languages. It is called a multidimensional tree. It is an extension of the normal concept of a tree, which is a two-dimensional concept, into higher dimensions. It is shown that a string may be considered a one-dimensional tree, and a node by itself, a zero-dimensional tree....
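The paper's formal definition is not reproduced in the abstract, so the sketch below is only an illustrative reading of the degenerate cases it names: a lone node as a zero-dimensional tree, a string as a one-dimensional chain of nodes, and an ordinary tree as the two-dimensional case. The class and field names are invented for the example.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative reading of the abstract's degenerate cases; not the
# paper's formal definition of a multidimensional tree.
@dataclass
class Node:
    label: str
    next: Optional["Node"] = None                          # 1-D structure: chain to the next node
    children: List["Node"] = field(default_factory=list)   # 2-D structure: ordered subtrees

point = Node("a")                                              # 0-D: a node by itself
string_abc = Node("a", next=Node("b", next=Node("c")))        # 1-D: the string "abc" as a chain
tree = Node("expr", children=[Node("term"), Node("factor")])  # 2-D: an ordinary tree
```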
Most APL systems delay all parsing until run-time. Because the APL expression language is inherently ambiguous and because identifier binding is delayed until run-time, some run-time parsing is shown to be necessary. Nevertheless, we argue that most APL statement parsing can be done at entry-time and that there are several reasons for doing so. Then...