Article

John Tukey at Bell Labs


Abstract

John Tukey was a Bell Labs employee for forty years. In that time he influenced a great many researchers and contributed significantly to the growth and luster of the Bell Labs Statistics Research Department. His counsel was valued by senior management, and his involvement in many applied problems led to important advances, including his work on spectral analysis. I will survey this and other contributions.


... In the physical sciences, on the other hand, information is represented in units of its own: the bit, a contraction of "binary digit". The term, coined by John Tukey, [17] refers to a unit or device capable of storing one of two elements or states between which one can choose. With one bit we can represent a 0 or a 1, positive or negative, white or black, up or down, etc., depending on how we encode the information. ...
Article
Full-text available
ABSTRACT. Over the last two centuries, medicine was nourished by biochemical discoveries that advanced the understanding of pathophysiological mechanisms and facilitated the development of therapeutics. In the present century, by contrast, we have entered the era of genomics and "big data", so the study of the functions of DNA as an information-storage device is essential for understanding the new personalized, precision genomic medicine. This review analyzes DNA as a computing device with three functions: storage, expression, and transmission of the information accumulated over phylogeny in the form of nucleotide sequences. Each of these functions is described by comparison with the information handled by a computer or a society, and examples are given of pathologies that arise when one of the functions fails. The bibliographic review is broad and includes the most relevant articles, both historical and state of the art, for each topic.
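The snippet above treats the bit as a device storing one of two states. As a minimal illustration (not from either paper; the function and encoding are invented for the example), n bits jointly distinguish 2^n states:

```python
def states(n_bits: int) -> int:
    """Number of distinct values representable with n_bits bits."""
    return 2 ** n_bits

# One two-state choice per bit: 0/1, negative/positive, white/black, up/down.
encoding = {0: "white", 1: "black"}

print(states(1))    # 2
print(states(8))    # 256
print(encoding[1])  # black
```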
... In the physical sciences, on the other hand, information is represented in units of its own: the bit, a contraction of "binary digit", a term coined by John Tukey. [17] Information requires two components: one or more languages (analog or digital) and a (digital) code, which is a set of rules for assigning elements (despite the common idea of languages as codes, they are not codes in themselves). [18] To assign more than two elements to a state, a single bit does not suffice. ...
Preprint
Full-text available
Chapter
John Tukey (1915–2000) is best known to statisticians for founding the field of exploratory data analysis, for introducing the jackknife as a tool for characterizing the uncertainty in a statistic, and for guiding and contributing to research in robust methods. His work in other fields included Tukey's lemma, the fast Fourier transform, and coining the word "bit" for computer science.
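As an illustration of the jackknife idea mentioned above (the dataset and statistic are invented for the example), leaving out each observation in turn yields replicates whose spread estimates the statistic's standard error:

```python
import math

def jackknife_se(data, stat):
    """Jackknife standard error of stat(data) from leave-one-out replicates."""
    n = len(data)
    replicates = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_rep = sum(replicates) / n
    # Tukey's scaling: (n - 1)/n times the sum of squared deviations.
    var = (n - 1) / n * sum((r - mean_rep) ** 2 for r in replicates)
    return math.sqrt(var)

data = [2.1, 2.4, 1.9, 2.6, 2.3, 2.8]
mean = lambda xs: sum(xs) / len(xs)
# For the sample mean, the jackknife reproduces the classical s / sqrt(n).
print(jackknife_se(data, mean))
```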
Chapter
Exploratory data analysis (EDA) is a conceptual framework with a core set of ideas and values aimed at providing insight into data as it is presented to the working researcher (regardless of its origin), and at encouraging an understanding of probabilistic and nonprobabilistic models in a way that guards against erroneous conclusions. Because this set of goals covers experimental and nonexperimental data, clean and messy data, and data in forms that may not be properly statistically modeled, Tukey distinguished these goals from the more specific probabilistic goals of traditional "statistics," which he referred to as "confirmatory data analysis" (CDA). Clearly these practice-based and pragmatic goals are well aligned with the needs of active researchers in the psychological community (Behrens, 1997a). Although an explicit account of EDA is slowly growing in the psychology literature, the influence of Tukey's principles is as far-reaching as key works in philosophy of data analysis (Cohen, 1990, 1994; Nickerson, 2000), regression graphics (Cook & Weisberg, 1994), robustness studies (Wilcox, 1997, 2001), and computer graphics for statistical use (Scott, 1992; Wilkinson, 2005). Despite these influences, recent research regarding the training of psychological researchers suggests little advancement in psychologists' abilities to apply common techniques important to the EDA tradition, including dealing with nonlinearity, advanced graphics, or model diagnostics (Aiken, West, & Millsap, 2008). Likewise, it is not unusual that authors of papers published in refereed journals neglect detailed examination of data. To introduce the reader to EDA, the chapter is divided into four parts. First, the background, rationale, and philosophy of EDA are presented. Second, a brief tour of the EDA toolbox is presented. Third, computer software and future directions for EDA are discussed. The chapter ends with a summary and conclusion.
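One concrete tool from Tukey's EDA toolbox is the five-number summary together with the 1.5·IQR fences behind the boxplot. A minimal sketch (the data are invented; the quartile convention follows Python's `statistics.quantiles`, which differs slightly from Tukey's original hinges):

```python
import statistics

def five_number_summary(xs):
    """Minimum, lower quartile, median, upper quartile, maximum."""
    xs = sorted(xs)
    q1, q2, q3 = statistics.quantiles(xs, n=4)  # 'exclusive' method by default
    return xs[0], q1, q2, q3, xs[-1]

def tukey_fences(xs, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR] as potential outliers."""
    _, q1, _, q3, _ = five_number_summary(xs)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in xs if x < lo or x > hi]

data = [3, 4, 5, 5, 6, 6, 7, 8, 30]
print(five_number_summary(data))
print(tukey_fences(data))  # [30]
```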
Article
Abstract: The rise and development of industrial research in the United States spans much of the 20th century. Bell Labs, a component of the Bell System, is perhaps the best example. The period from their creation in 1925 to their breakup in 1984, following antitrust litigation, is of particular interest. During that era the system was run as a monopoly, with subscribers effectively supporting research and development through their monthly telephone bills, thereby giving Bell Labs great financial stability. Statistics was among the Labs' most flourishing research areas, to the great benefit of the Bell System, its customers, and statistical science. The purpose of this article is to review, explain, and illustrate the circumstances of this success, while asking how modern companies, in today's competitive context, might secure the same kind of advantages.
Article
The measurement of power spectra is a problem of steadily increasing importance which appears to some to be primarily a problem in statistical estimation. Others may see it as a problem of instrumentation, recording and analysis which vitally involves the ideas of transmission theory. Actually, ideas and techniques from both fields are needed. When they are combined, they provide a basis for developing the insight necessary (i) to plan both the acquisition of adequate data and sound procedures for its reduction to meaningful estimates and (ii) to interpret these estimates correctly and usefully. This account attempts to provide and relate the necessary ideas and techniques in reasonable detail. Part I of this article appeared in the January 1958 issue of The Bell System Technical Journal.
Article
The measurement of power spectra is a problem of steadily increasing importance which appears to some to be primarily a problem in statistical estimation. Others may see it as a problem of instrumentation, recording and analysis which vitally involves the ideas of transmission theory. Actually, ideas and techniques from both fields are needed. When they are combined, they provide a basis for developing the insight necessary (i) to plan both the acquisition of adequate data and sound procedures for its reduction to meaningful estimates and (ii) to interpret these estimates correctly and usefully. This account attempts to provide and relate the necessary ideas and techniques in reasonable detail. Part II of this article will appear in the March issue of the Journal.
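The Blackman–Tukey approach described in this paper smooths the sample autocovariance with a lag window before transforming it to the frequency domain. A minimal NumPy sketch under that reading (the function name, Hann window choice, and test signal are illustrative, not the paper's own formulation):

```python
import numpy as np

def blackman_tukey_psd(x, max_lag, fs=1.0):
    """Blackman-Tukey-style estimate: window the sample autocovariance,
    then cosine-transform the windowed lags to get a smoothed spectrum."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased sample autocovariance for lags 0..max_lag.
    acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])
    # Hann lag window tapers high-lag estimates, trading variance for bias.
    w = 0.5 * (1 + np.cos(np.pi * np.arange(max_lag + 1) / max_lag))
    r = acov * w
    # The autocovariance is even in the lag, so the transform is a cosine series.
    freqs = np.linspace(0, fs / 2, 256)
    lags = np.arange(1, max_lag + 1)
    psd = np.array([r[0] + 2 * np.sum(r[1:] * np.cos(2 * np.pi * f / fs * lags))
                    for f in freqs])
    return freqs, psd / fs

# A sinusoid at 50 Hz in noise should produce a spectral peak near 50 Hz.
rng = np.random.default_rng(0)
t = np.arange(2048) / 1000.0  # fs = 1000 Hz
sig = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)
f, p = blackman_tukey_psd(sig, max_lag=128, fs=1000.0)
print(f[np.argmax(p)])
```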
Article
Techniques for reliably estimating the power spectral density function for both small and large samples of a stationary stochastic process are described. These techniques have been particularly successful in cases where the range of the spectrum is large. The methods are resistant to a moderate amount of contaminated or erroneous data and are well suited for use with auxiliary tests for stationarity and normality. Part I is concerned with background and theoretical considerations while examples from the development and analysis of the WT4 waveguide medium will be discussed in Part II, next issue.
Article
The "Fast Fourier Transform" has now been widely known for about a year. During that time it has had a major effect on several areas of computing, the most striking example being techniques of numerical convolution, which have been completely revolutionized. What exactly is the "Fast Fourier Transform"?
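As a sketch of the answer to that question, the radix-2 Cooley–Tukey algorithm computes the discrete Fourier transform recursively in O(n log n) time rather than the O(n²) of the naive sum. This minimal implementation is illustrative, not the article's own presentation:

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT: split into even- and odd-indexed halves,
    recurse, then combine with twiddle factors.
    Requires len(x) to be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

# The transform of a constant signal concentrates everything in bin 0.
print(fft([1, 1, 1, 1]))  # [(4+0j), 0j, 0j, 0j]
```

The convolution speedup the article mentions follows directly: multiplying two FFTs pointwise and inverting gives a circular convolution in O(n log n).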
MALLOWS, C. L. and TUKEY, J. W. (1982j). An overview of techniques of data analysis, emphasizing its exploratory aspects. In Some Recent Advances in Statistics (J. Tiago de Oliveira and B. Epstein, eds.) 111–172. Academic Press, London.
THOMSON, D. J. (1977a). Spectrum estimation techniques for characterization and development of the WT4 waveguide. I. Bell System Tech. J. 56 1769–1815.
THOMSON, D. J. (1977b). Spectrum estimation techniques for characterization and development of the WT4 waveguide. II. Bell System Tech. J. 56 1983–2005.
HOGBEN, L. and CARTWRIGHT, M. The Vocabulary of Science.
MALLOWS, C. L. (ed.). The Collected Works of John W. Tukey VI.