Journal of the Royal Statistical Society Series A (General)

Online ISSN: 0035-9238
There has been a dramatic decrease in mortality over the last 100 years, particularly in the first few years of life. The rate of improvement, standardized for age, was practically constant from 1890 to 1950, slowed down from 1950 to 1975, and then speeded up again. Changes in society, medical knowledge, and biological organisms have contributed to the trend, some introducing new hazards as well as helping to prevent or cure disease. Those new hazards that have caused major epidemics (defined as epidemics that have caused or threaten to cause more than 10,000 people to die or be seriously disabled) are reviewed. Most have been due to changes in personal behaviour made possible by increased affluence; others have been due to industrial pollution or to independent changes in biological organisms. The most important have been the epidemics of lung cancer and coronary thrombosis. The first has caused nearly a million premature deaths in the last 50 years, amounting to 3½ per cent of all deaths in adults. It has been due principally to the increase in cigarette consumption and is now waning as cigarette consumption falls and the tar content of cigarette smoke is reduced. The size of the second cannot be estimated at all precisely, as the previous incidence of coronary thrombosis is too uncertain; but it must be large. Factors that have contributed to it include increased cigarette smoking, reduced physical exercise, and an increased prevalence of diabetes (due to increased consumption of refined carbohydrates). The most important, however, are those that determine the level of blood cholesterol; the recent reduction in the total consumption of fat and the relative increase in the consumption of polyunsaturated fat explain most of the recent decrease in mortality. In comparison, epidemics attributable to industrial pollution have been small, the most important being the 25,000 deaths from lung cancer, mesothelioma, and asbestosis due to the widespread use of asbestos.
The epidemic of road traffic deaths may be most remarkable for the way it has been controlled. Despite the number of motor vehicles increasing more than 200-fold, the number of deaths due to motor vehicles has increased less than 20-fold, and the death rate, which increased from 1909 to 1934, was lower in 1985 than at any time in the last 60 years apart from 1948, when the number of cars in use was restricted by the shortage of petrol. Two epidemics have been due to changes in biological organisms. A change in the influenza virus enabled it to escape immunological control in 1918 and led to the greatest world-wide epidemic in modern history, with 140,000 excess deaths in Great Britain in two years and 10 million throughout the world. A virus similar to that which caused the epidemic now causes epidemics in swine. A change in a monkey virus is the likely origin of the organism that causes AIDS. It has not yet caused major mortality in Britain, but it is certain to do so unless an effective treatment is discovered, as some 35,000 to 55,000 people are already thought to be infected, most of whom are likely to become ill. Until now the great majority of cases have occurred in male homosexuals and addicts who inject drugs intravenously, but experience in Africa and elsewhere shows that the disease can also spread by heterosexual intercourse. Knowledge of the transmission rate and of the pattern of sexual behaviour in the community is too incomplete to enable any prediction to be made of the likely future spread of the disease. This review has made extensive use of the vital statistics collected by the Registrars-General, the quality of which is due largely to the pioneering work of an early Fellow of the Society (Dr William Farr).
These have provided clues to the causation of the two big epidemics that held up the decline of the death rate and despite the rapid advance of laboratory medicine they will continue to be an essential component of society's armamentarium against disease.
We analyse response patterns to an important survey of school children, exploiting rich auxiliary information on respondents’ and non-respondents’ cognitive ability that is correlated both with response and the learning achievement that the survey aims to measure. The survey is the Programme for International Student Assessment (PISA), which sets response thresholds in an attempt to control data quality. We analyse the case of England for 2000 when response rates were deemed high enough by the PISA organisers to publish the results, and 2003, when response rates were a little lower and deemed of sufficient concern for the results not to be published. We construct weights that account for the pattern of non-response using two methods, propensity scores and the GREG estimator. There is clear evidence of biases, but there is no indication that the slightly higher response rates in 2000 were associated with higher quality data. This underlines the danger of using response rate thresholds as a guide to data quality.
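The weighting idea described in the abstract can be illustrated with a minimal sketch. Here response propensities are estimated within ability strata and respondents are weighted by their inverse, a simplified stand-in for the propensity-score method; the strata, data, and function name below are hypothetical, not taken from the paper:

```python
from collections import Counter

def nonresponse_weights(strata, responded):
    """Inverse-propensity weights estimated within strata.

    strata    : list of stratum labels, one per sampled pupil
    responded : list of booleans, True if the pupil took the test
    Returns one weight per pupil (0.0 for non-respondents).
    """
    totals = Counter(strata)
    resp = Counter(s for s, r in zip(strata, responded) if r)
    # Estimated response propensity within each stratum
    propensity = {s: resp[s] / totals[s] for s in totals}
    return [1.0 / propensity[s] if r else 0.0
            for s, r in zip(strata, responded)]

# Example: low-ability pupils respond less often, so they receive larger weights
strata    = ["low", "low", "low", "low", "high", "high", "high", "high"]
responded = [True, False, False, True, True, True, True, False]
w = nonresponse_weights(strata, responded)
# Respondent weights sum back to the sampled total within each stratum
```

Because the weights rescale respondents to represent the full sample within each stratum, they correct for the kind of ability-related response bias the paper investigates, provided the strata capture the response mechanism.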
Concentration data - which express the relative importance of the largest firms in an industry - covering almost the entire range of British industry were made available by the Board of Trade for this study, first published in 1960. The authors combined each industry's concentration ratio with the average and relative sizes of its constituent firms and plants, and so sought to determine its structural type. Then, by comparing these results with those of an earlier study, they established in which trades significant changes in concentration had occurred since 1935. Two chapters describe how the leading firms in such highly concentrated trades as sugar refining, wallpaper, matches, explosives, tinplate and oil-refining grew over the years and how they maintained their position. There is also a discussion of the relevance of such factors as mergers, nationalisation and technological change, illustrated by reference to brief case-studies of twenty trades.
Rational management depends upon the provision and proper use of information about the determinants and probable consequences of decisions. The provision of this information is a task best left to the services of a statistician who is professionally competent and himself uncommitted to policy and therefore in a position to offer objective judgment. What is required is an effective dialogue between the manager who asks pertinent questions and the statistician who provides relevant answers. How can this dialogue be made effective? Can the manager trust the statistician? Will the statistician be listened to? This paper examines the conditions necessary to ensure mutual trust between manager and information services. It offers guidelines for the behaviour of both parties to the dialogue.
Book on stationary and related stochastic processes, covering sample function properties and their applications, the geometry of Hilbert space, etc.
The author presents his discussion of experimental design with emphasis in the text on the understanding of basic principles; in study exercises following each chapter, the student can apply the principles to specific experimental situations selected from the literature of psychology and education. The first chapter presents fundamental concepts such as measures of precision, testing hypotheses, and randomization. Chi-square, t, and F distributions are discussed in Chapter 2. Chapters 3 to 16 deal with simple and complex designs, and methods of analysis of data. The mathematical level does not assume formal training beyond high-school algebra. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
This is the first volume of a two-volume work on Probability and Induction. Because the writer holds that probability logic is identical with inductive logic, this work is devoted to philosophical problems concerning the nature of probability and inductive reasoning. The author rejects a statistical frequency basis for probability in favor of a logical relation between two statements or propositions. Probability "is the degree of confirmation of a hypothesis (or conclusion) on the basis of some given evidence (or premises)." Furthermore, all principles and theorems of inductive logic are analytic, and the entire system is to be constructed by means of symbolic logic and semantic methods. This means that the author confines himself to the formalistic procedures of word and symbol systems. The resulting sentence or language structures are presumed to separate off logic from all subjectivist or psychological elements. Despite the abstractionism, the claim is made that if an inductive probability system of logic can be constructed it will have its practical application in mathematical statistics, and in various sciences. 16-page bibliography.
The univariate generalized Waring distribution was shown by Irwin (1968, 1975) to provide a useful accident model which enables one to split the variance into three additive components due to randomness, proneness and liability. The two non-random variance components, however, cannot be separately estimated. In this paper a way of tackling this problem is suggested by defining a bivariate extension of the generalized Waring distribution. Using this it is possible to obtain distinguishable estimates for the variance components and hence inferences can be made about the role of the underlying accident factors. The technique is illustrated by two examples.
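The three-way variance split can be illustrated with the law of total variance on a toy hierarchy of proneness, liability, and Poisson randomness. The discrete distributions below are hypothetical, chosen only to make the arithmetic exact; this is a sketch of the decomposition idea, not of the generalized Waring distribution itself:

```python
# Toy accident model: proneness nu -> liability lam -> count X ~ Poisson(lam).
# Var(X) = E[lam]            (randomness)
#        + E[Var(lam | nu)]  (liability)
#        + Var(E[lam | nu])  (proneness)

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

nus = [1.0, 3.0]                                   # proneness, equally likely
lams = {nu: [nu - 0.5, nu + 0.5] for nu in nus}    # liability given proneness
all_lams = [l for nu in nus for l in lams[nu]]     # each value with prob 1/4

randomness = sum(all_lams) / len(all_lams)         # E[lam] = 2.0
liability = sum(var(lams[nu]) for nu in nus) / len(nus)  # E[Var(lam|nu)] = 0.25
proneness = var(nus)                               # Var(E[lam|nu]) = 1.0, as E[lam|nu] = nu

# The three components add up to the mixed-Poisson variance E[lam] + Var(lam):
total = randomness + liability + proneness         # 3.25
```

The paper's point is that with only a univariate count the last two components are confounded; the bivariate extension is what makes them separately estimable.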
Seasonal-adjustment procedures based on regression methods applied to a mixed additive-multiplicative model are described. The procedures are based on the traditional model of trend, seasonal and irregular components but instead of assuming that the seasonal is purely additive or purely multiplicative as in the usual methods of seasonal adjustment they permit the use of a mixture of additive and multiplicative components. Tests are developed which are intended to investigate whether a purely additive, a purely multiplicative or a mixed additive-multiplicative model is required by the data. For the general case where both additive and multiplicative components are required, the seasonal component is assumed to consist of a seasonal pattern and a seasonal amplitude which are estimated separately. The trend is estimated using moving-average filters and the seasonal component is fitted by means of a stepwise regression method applied to additive and multiplicative Fourier components. Two main computer programs have been developed, the first of which tests whether a purely additive, a purely multiplicative or a mixed additive-multiplicative seasonal model is required; the second estimates the seasonal component and produces a seasonally adjusted series. The methods are applied to a number of unemployment series for several countries.
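The trend-plus-seasonal setup can be sketched in miniature: a centred moving-average trend followed by a purely additive seasonal estimated from the detrended values. The quarterly series and helper names are hypothetical, and the paper's stepwise Fourier regression and mixed-model tests are not reproduced here:

```python
def centred_ma(y, period=4):
    """Centred moving-average trend (2x4 MA for quarterly data)."""
    half = period // 2
    trend = [None] * len(y)
    for t in range(half, len(y) - half):
        window = y[t - half:t + half + 1]
        # End points of the window receive half weight, so each season counts once
        trend[t] = (0.5 * window[0] + sum(window[1:-1]) + 0.5 * window[-1]) / period
    return trend

def additive_seasonal(y, trend, period=4):
    """Average detrended value per season, centred to sum to zero."""
    buckets = [[] for _ in range(period)]
    for t, (yt, tr) in enumerate(zip(y, trend)):
        if tr is not None:
            buckets[t % period].append(yt - tr)
    means = [sum(b) / len(b) for b in buckets]
    centre = sum(means) / period
    return [m - centre for m in means]

# Synthetic quarterly series: linear trend plus a fixed additive seasonal
seas = [1.0, -0.5, -1.5, 1.0]
y = [10 + 0.5 * t + seas[t % 4] for t in range(16)]
trend = centred_ma(y)
s_hat = additive_seasonal(y, trend)   # recovers seas on this noise-free series
```

In the mixed model of the paper the seasonal would further factor into a pattern and a time-varying amplitude; this sketch covers only the purely additive special case that the proposed tests are designed to detect.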
"Companion study to the two-volume work World Urbanization, 1950-1970." Includes bibliography.
Top-cited authors
F. James Rohlf
  • State University of New York
G. E. P. Box
Paul W. Holland
  • Educational Testing Service
Ralph L. Keeney
  • Duke University
Frederick S. Hillier
  • Stanford University