
Terrence L Fine
- PhD
- Professor Emeritus at Cornell University
About
110 Publications
23,209 Reads
3,103 Citations
Additional affiliations
September 1966 - June 2010
September 1964 - August 1966
July 1963 - August 1964
Publications (110)
We present a mathematical theory of objective, frequentist chance phenomena that uses as a model a set of probability measures. In this work, sets of measures are not viewed as a statistical compound hypothesis or as a tool for modeling imprecise subjective behavior. Instead we use sets of measures to model stable (although not stationary in the tr...
By recourse to the fundamentals of preference orderings and their numerical representations through linear utility, we address certain questions raised in Nover and Hájek 2004, Hájek and Nover 2006, and Colyvan 2006. In brief, the Pasadena and Altadena games are well-defined and can be assigned any finite utility values while remaining consistent w...
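The Pasadena game, as formulated in those papers, pays (-1)^(n-1) 2^n/n if the first head occurs on toss n (probability 2^(-n)), so its expected-utility terms form the conditionally convergent series Σ (-1)^(n-1)/n. A minimal sketch, assuming that standard payoff schedule, of why the ordering of terms matters:

```python
# Partial sums of the Pasadena game's expected-utility series.
# Assumed standard form: term n is payoff (-1)**(n-1) * 2**n / n with
# probability 2**(-n), so the n-th expectation term is (-1)**(n-1) / n.
import math

def partial_sum(n_terms):
    return sum((-1) ** (n - 1) / n for n in range(1, n_terms + 1))

print(partial_sum(10**6))   # approaches ln(2) ~ 0.6931 in the natural order
print(math.log(2))

# Rearranging the same terms (two negative terms after each positive one)
# drives the partial sums toward a different limit, illustrating why the
# game has no order-independent expected utility.
def rearranged_sum(blocks):
    total, pos, neg = 0.0, 1, 2
    for _ in range(blocks):
        total += 1.0 / pos                        # one positive (odd) term
        pos += 2
        total += -1.0 / neg - 1.0 / (neg + 2)     # two negative (even) terms
        neg += 4
    return total

print(rearranged_sum(10**6))  # tends to ln(2)/2 ~ 0.3466
```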
We propose a Bayesian test for independence among signals where only a small dataset is available. Traditional frequentist approaches often fail in this case due to inaccurate estimation of either the source statistical models or the threshold used by the test statistics. In addition, these frequentist methods cannot incorporate prior information i...
A textbook on probability intended for electrical engineering undergraduates, although the formal development is mathematical.
Information-theoretic capacity notions for a slotted Aloha random-access system are considered in this paper, as well as joint power and retransmission controls for this protocol. The effect of the bursty nature of the arrival and transmission process on the information-carrying capability and spectral efficiency of the system is studied. The natur...
We propose a Bayesian test to assess the statistical dependence when only a small number of samples are available. Our procedure converts the problem of independence test to a parametric one through quantization and computes the likelihood of the observed cell counts under the independence hypothesis where the marginal cell probabilities are modele...
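A minimal sketch of a test in this spirit, under assumptions the truncated abstract does not spell out (binary median quantization, symmetric Dirichlet priors on joint and marginal cell probabilities, and a Bayes-factor decision): compare the marginal likelihood of the observed cell counts under independence with that under a saturated joint model.

```python
# Hedged sketch: Bayes factor for independence in a quantized 2-D sample.
# Assumptions (not from the abstract): symmetric Dirichlet(1) priors,
# median-based binary quantization, Dirichlet-multinomial marginal likelihoods.
import numpy as np
from scipy.special import gammaln

def log_dirichlet_multinomial(counts, alpha=1.0):
    """log E[prod p_i**n_i] under a symmetric Dirichlet(alpha) prior."""
    counts = np.asarray(counts, dtype=float).ravel()
    k, n = counts.size, counts.sum()
    return (gammaln(k * alpha) - gammaln(k * alpha + n)
            + np.sum(gammaln(counts + alpha) - gammaln(alpha)))

def log_bayes_factor_independence(x, y):
    """log BF of 'x independent of y' versus a saturated joint cell model."""
    qx = (x > np.median(x)).astype(int)          # crude binary quantization
    qy = (y > np.median(y)).astype(int)
    cells = np.zeros((2, 2))
    for i, j in zip(qx, qy):
        cells[i, j] += 1
    log_indep = (log_dirichlet_multinomial(cells.sum(axis=1))
                 + log_dirichlet_multinomial(cells.sum(axis=0)))
    log_joint = log_dirichlet_multinomial(cells)
    return log_indep - log_joint

rng = np.random.default_rng(0)
x = rng.normal(size=40)
print(log_bayes_factor_independence(x, rng.normal(size=40)))          # typically > 0
print(log_bayes_factor_independence(x, x + 0.1 * rng.normal(size=40)))  # well below 0
```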
In this paper, we address the issue of testing for stochastic independence and its application as a guide to selecting the standard independent component analysis (ICA) algorithms when solving blind source separation (BSS) problems. Our investigation focuses on the problem of establishing tests for the quality of separation among recovered sources...
A Chaotic Probability model is a usual set of probability measures, M, the totality of which is endowed with an objective, frequentist interpretation as opposed to being viewed as a statistical compound hypothesis or an imprecise behavioral subjective one. In the prior work of Fierens and Fine, given finite time series data, the estimation of...
We propose a new definition for conditioning in the Chaotic Probability framework that includes as a special case the conditional approach of Fierens 2003 [2] and can be given the interpretation of a generalized Markov chain. Chaotic Probabilities were introduced by Fine et al. as an attempt to model chance phenomena with a usual set of measures...
We consider in this study dynamic control policies for slotted Aloha random access systems. New performance bounds are derived when random access is combined with power control for system optimization, and we establish the existence of optimal control approaches for such systems. We analyze throughput and delay when the number of backlogged users i...
We adopt the same mathematical model of a set M of probability measures as is central to the theory of coherent imprecise probability. However, we endow this model with an objective, frequentist interpretation in place of a behavioral subjective one. We seek to use M to model stable physical sources of time series data that have highly irregular be...
We propose a nonparametric independent component analysis (ICA) algorithm for the case of instantaneous and linear mixtures. Our algorithm combines minimization of correlation among nonlinear expansions of the output signals with good initialization derived from a search guided by statistical tests for independence. Simulation results obtained from...
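A hedged two-source sketch of the general idea, not the Init-NLE algorithm itself: whiten the mixtures and grid-search the rotation angle that minimizes correlation among nonlinear expansions of the outputs (the grid search stands in for the paper's statistically guided initialization).

```python
# Hedged 2-source sketch of ICA by decorrelating nonlinear expansions.
# Not the paper's Init-NLE algorithm: a simple grid search over the rotation
# angle of the whitened mixtures replaces its initialization procedure.
import numpy as np

def whiten(x):
    x = x - x.mean(axis=1, keepdims=True)
    d, e = np.linalg.eigh(np.cov(x))
    return (e / np.sqrt(d)) @ e.T @ x            # PCA whitening

def nonlinear_corr_cost(y):
    """Sum of squared correlations between nonlinear expansions of the two outputs."""
    f = np.vstack([y[0], y[0] ** 3, np.tanh(y[0])])
    g = np.vstack([y[1], y[1] ** 3, np.tanh(y[1])])
    c = np.corrcoef(np.vstack([f, g]))[:3, 3:]
    return np.sum(c ** 2)

def separate(x, n_angles=180):
    z = whiten(x)
    best = min(
        (np.pi * k / n_angles for k in range(n_angles)),
        key=lambda t: nonlinear_corr_cost(
            np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]]) @ z),
    )
    rot = np.array([[np.cos(best), -np.sin(best)], [np.sin(best), np.cos(best)]])
    return rot @ z

rng = np.random.default_rng(1)
s = np.vstack([np.sign(rng.normal(size=5000)),       # two non-Gaussian sources
               rng.uniform(-1, 1, size=5000)])
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s           # instantaneous linear mixture
y = separate(x)
print(np.abs(np.corrcoef(np.vstack([y, s]))[:2, 2:]))  # roughly a permutation matrix
```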
We adopt the same mathematical model of a set M of probability measures as is central to the theory of coherent imprecise probability. However, we endow this model with an objective, frequentist interpretation in place of a behavioral subjective one. We seek to use M to model stable physical sources of time series data that have highly irregular be...
We consider a slotted Aloha random access system with combined coding and power/retransmission control. Under suitable conditions, if M or fewer users transmit simultaneously, they are successfully decoded with high probability, with M depending on coding rate and SINR. We show then that, among any realizable random access protocol, slotted Aloha's...
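A minimal simulation sketch of the multipacket-reception setting described above, with the number of users, M, and the transmission probability chosen only for illustration:

```python
# Hedged sketch: throughput of slotted Aloha when up to M simultaneous
# transmissions are decoded successfully (parameters are illustrative only).
import numpy as np

def throughput(n_users, p_tx, m_decodable, n_slots=200_000, seed=0):
    rng = np.random.default_rng(seed)
    transmitters = rng.binomial(n_users, p_tx, size=n_slots)   # per-slot count
    # A slot delivers all its packets if 1..M users transmitted, else none.
    delivered = np.where((transmitters >= 1) & (transmitters <= m_decodable),
                         transmitters, 0)
    return delivered.mean()          # packets per slot

for m in (1, 2, 4):
    # transmit with probability ~m/n, a natural operating point to try
    print(m, round(throughput(n_users=50, p_tx=m / 50, m_decodable=m), 3))
```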
We consider collision resolution protocols for a random access collision channel with multiplicity feedback. By using Markov decision processes, we provide performance bounds for such systems in terms of the mean number of slots required to resolve a collision of a given multiplicity. We show that recursive binary splitting is strictly suboptimal f...
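For reference, the baseline that is shown to be strictly suboptimal can be evaluated exactly: if L(n) is the mean number of slots to resolve a collision among n users under fair recursive binary splitting, then L(0) = L(1) = 1 and L(n) = 1 + E[L(I)] + E[L(n-I)] with I ~ Binomial(n, 1/2), which rearranges into a computable recursion.

```python
# Hedged sketch: mean slots for standard (unimproved) binary splitting --
# the baseline recursion, not the paper's optimized policy.
# L(0) = L(1) = 1; for n >= 2,
#   L(n) = (1 + 2**(1-n) * sum_{i<n} C(n,i) * L(i)) / (1 - 2**(1-n)).
from math import comb

def mean_resolution_slots(max_n):
    L = [1.0, 1.0]
    for n in range(2, max_n + 1):
        s = sum(comb(n, i) * L[i] for i in range(n))
        L.append((1.0 + 2.0 ** (1 - n) * s) / (1.0 - 2.0 ** (1 - n)))
    return L

for n, v in enumerate(mean_resolution_slots(10)):
    print(n, round(v, 3))     # e.g. L(2) = 5.0 for the basic splitting scheme
```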
A methodology to build interval-valued probability models is presented. It is shown that this alternative produces temporally stable models of Internet-generated communications variables.
We revisit the oft-studied asymptotic (in sample size) behavior of the parameter or weight estimate returned by any member of a large family of neural network training algorithms. By properly accounting for the characteristic property of neural networks that their empirical and generalization errors possess multiple minima, we rigorously establish...
This paper describes a neural network-based system for stabilizing Aloha random access networks. A grid of sensors is used to gather energy measurements for the neural network. The neural network has been trained to estimate the number of colliding users in a given slot. This information is used to set the parameters of the backoff algorithm so as...
We introduce a statistically based methodology for the design of neural networks when the dimension d of the network input is comparable to the size n of the training set. If one proceeds straightforwardly, then one is committed to a network of complexity exceeding n. The result will be good performance on the training set but poor generalization p...
A neural network-based system for stabilizing ALOHA random access networks is described. A grid of sensors is used to gather energy measurements for a neural network. The neural network has been trained to estimate the number of colliding users in a given slot, supporting an algorithm that tracks the number of backlogged users in the system
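The abstracts do not reproduce the backlog-tracking rule itself, so the sketch below substitutes Rivest's well-known pseudo-Bayesian stabilization, driven only by ternary (idle/success/collision) feedback; the neural estimate of collision multiplicity described above would replace that coarse feedback.

```python
# Hedged stand-in: Rivest-style pseudo-Bayesian stabilization of slotted Aloha.
# The papers above estimate collision multiplicity with a neural network; here
# only ternary feedback drives the backlog estimate and the backoff probability.
import math
import random

def simulate(arrival_rate=0.3, n_slots=100_000, seed=0):
    random.seed(seed)
    backlog, estimate, delivered = 0, 1.0, 0
    for _ in range(n_slots):
        backlog += 1 if random.random() < arrival_rate else 0   # Bernoulli arrivals
        p = min(1.0, 1.0 / max(estimate, 1.0))                  # backoff probability
        transmitters = sum(random.random() < p for _ in range(backlog))
        if transmitters == 1:                       # success
            backlog -= 1
            delivered += 1
        if transmitters <= 1:                       # idle or success
            estimate = max(arrival_rate, estimate + arrival_rate - 1.0)
        else:                                       # collision
            estimate += arrival_rate + 1.0 / (math.e - 2.0)
    return delivered / n_slots

print(simulate())   # close to the 0.3 arrival rate when the protocol is stable
```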
The error surface on which minimization is done in any feedforward neural network training algorithm is highly irregular, with multiple local minima having been observed empirically. In training schemes, this implies that several random initial points must be chosen, and the performance of the resulting trained neural network evaluated for each suc...
We attempt to use a neural network to solve the channel blind equalization problem. An equalizer is a device which by observing the channel outputs recovers the channel inputs. A blind equalizer does not require any known training sequence for the startup period. We have implemented a blind equalizer using a neural network for channel inputs of (-1...
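As a hedged stand-in only (the paper's equalizer is a neural network), the classic constant-modulus algorithm illustrates blind equalization of ±1 channel inputs without a training sequence:

```python
# Hedged stand-in for blind equalization: the constant-modulus algorithm (CMA)
# for real +/-1 inputs, which needs no training sequence, only the known unit
# modulus of the transmitted symbols.  Channel and step size are illustrative.
import numpy as np

rng = np.random.default_rng(2)
symbols = rng.choice([-1.0, 1.0], size=20_000)
channel = np.array([1.0, 0.4, -0.2])                  # illustrative FIR channel
received = np.convolve(symbols, channel)[: symbols.size]

taps, mu = np.zeros(11), 1e-3
taps[5] = 1.0                                         # center-spike initialization
for k in range(10, received.size):
    x = received[k - 10:k + 1][::-1]                  # newest sample first
    y = taps @ x
    taps -= mu * (y * y - 1.0) * y * x                # CMA gradient step

# Score with the final taps, allowing for the equalizer's unknown delay and sign.
y_out = np.array([taps @ received[k - 10:k + 1][::-1]
                  for k in range(10, received.size)])

def corr_at_delay(d, m=5000, start=1000):
    a = y_out[start:start + m]
    b = symbols[10 + start - d:10 + start - d + m]
    return abs(np.corrcoef(a, b)[0, 1])

print(max(corr_at_delay(d) for d in range(13)))   # should be close to 1 after convergence
```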
Neural networks have been used to tackle what might be termed 'empirical regression' problems. Given independent samples of input/output pairs (x_i, y_i), we wish to estimate f(x) = E[Y|X=x]. The approach taken is to choose an approximating class of networks N = {η(x;w) : w ∈ W} and within that class, by an often complex procedure, choose...
We estimate the number of training samples required to ensure that the performance of a neural network on its training data matches that obtained when fresh data is applied to the network. Existing estimates are higher by orders of magnitude than practice indicates. We narrow the gap between theory and practice by transforming the problem into dete...
We consider the relationship between the finite-dimensional distributions of a stationary time series model and its asymptotic behavior in the framework of interval-valued probability (IVP), a simple generalization of additive probability measures. By Caratheodory's theorem, the specification of a countably additive probability measure on the algeb...
We estimate the number of training samples required to ensure that the performance of a neural network on its training data matches that obtained when fresh data is applied to the network. Existing estimates are higher by orders of magnitude than practice indicates. This work seeks to narrow the gap between theory and practice by transforming the p...
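One commonly quoted variant of the VC uniform-convergence bound behind such estimates states that, with probability at least 1-δ, sup |err_train - err| ≤ sqrt((8/n)(d(ln(2n/d)+1) + ln(4/δ))) for a class of VC dimension d; the constants differ across statements of the theory, but any variant already shows the orders-of-magnitude gap the abstract describes.

```python
# Hedged sketch: smallest n making one VC-type uniform-convergence bound <= eps.
# The constants below follow one textbook variant; other statements of VC theory
# use different constants, which changes the absolute (but still huge) numbers.
import math

def vc_bound(n, vc_dim, delta):
    return math.sqrt((8.0 / n) * (vc_dim * (math.log(2.0 * n / vc_dim) + 1.0)
                                  + math.log(4.0 / delta)))

def samples_needed(vc_dim, eps=0.1, delta=0.05):
    n = vc_dim
    while vc_bound(n, vc_dim, delta) > eps:
        n *= 2                               # doubling search ...
    lo, hi = n // 2, n
    while lo + 1 < hi:                       # ... then a bisection refinement
        mid = (lo + hi) // 2
        lo, hi = (mid, hi) if vc_bound(mid, vc_dim, delta) > eps else (lo, mid)
    return hi

# A small feedforward network can easily have VC dimension in the thousands;
# the bound then demands far more data than practice suggests is necessary.
for d in (50, 500, 5000):
    print(d, samples_needed(d))
```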
Patrick Suppes’s vigorous defense of the centrality of probability and of the irreducibility of randomness or chance phenomena, especially in the province of quantum mechanical phenomena, his interests in axiomatization, and the viewpoints he developed through his fundamental work on measurement theory have encouraged and enabled him to study the i...
Much of our effort was devoted to establishing statistical performance bounds for neural networks, acting as pattern classifiers, through improvements to Vapnik-Chervonenkis theory (VCT). This theory addresses the interrelationships between the complexity of a network, the amount of training data, and the statistical reliability/performance of the t...
Our work is motivated by the study of empirical processes (such as flicker noise) that occur in stable systems yet give rise to observations with seemingly divergent time averages. Stationary models for such processes do not exist in the domain of numerical probability, as the ergodic theorems dictate the convergence of time averages of stationary...
We discuss issues of existence and stochastic modeling in regard to sequences that exhibit combined features of independence and instability of relative frequencies of marginal events. The concept of independence used here is borrowed from the frequentist account of numerical probability advanced by von Mises: A sequence is independent if certain s...
We indicate an approach to pattern classification using neural networks that clarifies and simplifies the roles of the various layers in the networks. The first layer can always be considered as quantizing, and thereby smoothing, the d-dimensional feature vector that is its input. We indicate how to determine the width w_Q of this layer in terms of...
An extension to the class of conventional numerical probability models for nondeterministic phenomena has been identified by Dempster and Shafer in the class of belief functions. We were originally stimulated by this work, but have since come to believe that the bewildering diversity of uncertainty and chance phenomena cannot be encompassed within...
We have undertaken to develop a new type of stochastic model for nondeterministic empirical processes that exhibit paradoxical characteristics of stationarity, bounded variables, and unstable time averages. By the well-known ergodic theorems of probability theory there is no measure that can model such processes. Hence we are motivated to broaden t...
Interest in lower probability has largely focussed on lower envelopes and, more particularly, on belief functions. We consider those lower probabilities that do not admit of a dominating probability measure and hence are not lower envelopes. A simple and useful family of such undominated lower probabilities is constructed. We briefly explore the ge...
This paper explores the possibilities for probability-like models of stationary nondeterministic phenomena that possess divergent but bounded time averages. A random sequence described by a stationary probability measure must have almost surely convergent time averages whenever it has almost surely bounded time averages. Hence, no measure can provi...
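A deterministic illustration of the phenomenon: a 0-1 sequence built from blocks of doubling length is bounded, yet its running relative frequency oscillates between roughly 1/3 and 2/3 forever instead of converging.

```python
# A bounded 0-1 sequence whose running relative frequency never converges:
# alternate blocks of 1s and 0s whose lengths double at every switch.
import itertools

def block_sequence(n_terms):
    bit, length, out = 1, 1, []
    while len(out) < n_terms:
        out.extend([bit] * length)
        bit, length = 1 - bit, 2 * length
    return out[:n_terms]

seq = block_sequence(1 << 20)
running = list(itertools.accumulate(seq))
for j in range(10, 21):
    n = 1 << j
    print(n, running[n - 1] / n)     # alternates near 1/3 and 2/3, never settling
```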
We present elements of a frequentist theory of statistics for concepts of upper and lower (interval-valued) probability (IVP), defined on finite event algebras. We consider IID models for unlinked repetitions of experiments described by IVP and suggest several generalizations of standard notions of independence, asymptotic certainty and estimabilit...
We consider the use of interval-valued probabilities to represent the support lent to the hypothesis that the parameter value θ lies in a subset A of the parameter set Θ when we observe x, know the likelihoods {f_θ : θ ∈ Θ}, and have some prior information concerning the parameter. Our model for prior information is that of a salient prior distribution...
We consider a parameterized family {S_α, α ∈ a}, a ⊂ R^∞, of systems or sources having stochastic outputs {x_n} that are partially described by a statistic (e.g., correlation function) σ_α(τ). If we represent α = (α_1, α_2, ..., α_n, ...), then by the system order M_α we mean the index n of...
The introduction is divided into five parts including this preface. The second part outlines the contents of the paper and indicates some areas of omission. The third part speculates on some of the reasons why the subject of this paper has been so conspicuously neglected and attempts to suggest why the time may be propitious for formal study and ra...
Comparative probability (CP) is a theory of probability in which uncertainty is measured by a CP ordering of events, rather than by a probability measure. A CP order is additive iff it has an agreeing probability measure. This paper deals with the formation of joint CP orders from given marginals, both with and without a certain independence condit...
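A hedged sketch of the additivity check itself, with illustrative three-atom orderings not taken from the paper: a set of strict comparative judgments admits an agreeing probability measure exactly when a small linear feasibility problem has a solution.

```python
# Hedged sketch: does a set of strict comparative-probability judgments on a
# 3-atom space admit an agreeing probability measure?  A small LP decides it.
# The specific judgments below are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import linprog

ATOMS = ("a", "b", "c")

def indicator(event):
    return np.array([1.0 if atom in event else 0.0 for atom in ATOMS])

def has_agreeing_measure(strict_pairs, margin=1e-6):
    """strict_pairs: list of (A, B) meaning 'A is strictly more probable than B'."""
    # Each judgment P(A) - P(B) >= margin becomes -(1_A - 1_B) . p <= -margin.
    a_ub = np.array([-(indicator(a) - indicator(b)) for a, b in strict_pairs])
    b_ub = np.full(len(strict_pairs), -margin)
    res = linprog(c=np.zeros(len(ATOMS)), A_ub=a_ub, b_ub=b_ub,
                  A_eq=np.ones((1, len(ATOMS))), b_eq=[1.0],
                  bounds=[(0.0, 1.0)] * len(ATOMS), method="highs")
    return res.success

# Judgments with an agreeing measure: {a} > {b} > {c} and {b,c} > {a}.
print(has_agreeing_measure([({"a"}, {"b"}), ({"b"}, {"c"}), ({"b", "c"}, {"a"})]))
# Judgments with no agreeing measure: {a} > {b,c} together with {b} > {a}
# forces P(c) < 0 for any additive P, so the LP is infeasible.
print(has_agreeing_measure([({"a"}, {"b", "c"}), ({"b"}, {"a"})]))
```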
We undertake to argue in favor of generalizing the formal concept of probability by replacing the usual quantitative formulation by the hitherto largely ignored comparative formulation. The theory of comparative probability (CP) is a theory of statements of the forms ‘event A is more probable than event B’, ‘events A, B are equally probable’, ‘even...
An attempt is made to enumerate the distinct antisymmetric comparative probability relations on sample spaces of n atoms. The results include an upper bound to the total number of such relations and upper and lower bounds to the size of the subset of the comparative probability relations admitting an agreeing probability measure as representation...
We undertake to discuss the question of the apparent stability of relative frequency (RF) and the concept of stochastic independence (SI) from the vantage point of the relatively new notion of computational complexity (CC) pioneered by Chaitin [1], Kolmogorov [2], Loveland [3], Martin-Löf [4] and Solomonoff [5] and as discussed in Fine [6]. This d...
The decoding of efficiently encoded messages, from either probabilistic, nonprobabilistic, or unknown message sources, is shown to be often practically impossible. If τ(S) is a running-time bound on the computational effort of a decoder Ψ accepting a codeword P for message S, and γ[K_Ψ(S)] is an upper bound to acceptable codeword lengt...
Quantum mechanics (QM) supplies quantitative probabilities for the occurrences of physically significant events. Historically the probabilistic interpretation of the Schrödinger wave function arose almost as an afterthought. When Schrödinger proposed his equation for the wave function Ψ he had an electromagnetic analogy in mind (Jammer, 1966). When...
The classical approach to probability attempts to assess unique probabilities for random events even in the absence of extensive prior knowledge or information concerning a random experiment. In its early formulation by Laplace, through the principle of nonsufficient reason, equiprobable events were identified by the absence of reasons to expect th...
This chapter focuses on the concepts of events, probability, independence, and conditional probability. The Kolmogorov setup for probability consists of a probability space (Ω, F, P) having as components a sample space, Ω; a σ-field F of selected subsets of Ω; and a probability measure or assignment, P. The sample space Ω has elements ω called the...
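For reference, the axioms in their usual form:

```latex
% Kolmogorov's axioms for a probability space (\Omega, \mathcal{F}, P):
% nonnegativity, normalization, and countable additivity.
\begin{align*}
  &\text{(i)}   && P(A) \ge 0 \quad \text{for every } A \in \mathcal{F},\\
  &\text{(ii)}  && P(\Omega) = 1,\\
  &\text{(iii)} && P\!\left(\bigcup_{n=1}^{\infty} A_n\right)
                   = \sum_{n=1}^{\infty} P(A_n)
    \quad \text{for pairwise disjoint } A_1, A_2, \ldots \in \mathcal{F}.
\end{align*}
```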
This chapter discusses the rationale behind some of the formal structure of the concepts of random events, independence, and probability. The view of probability is that it is a physical characteristic or description of the occurrence of events in the performance of an experiment. As a physical or empirical property of random events, probability sh...
The relative-frequency, complexity, classical, and logical interpretations of probability are primarily concerned with knowledge and inference. None of these interpretations lend themselves to a ready justification for the use of probability to guide behavior or to facilitate decision making. Nor are these interpretations sufficient to direct the u...
This chapter discusses axiomatic comparative probability. The concept of comparative probability (CP) has thus far received very little attention from engineers, scientists, probabilists, or philosophers. CP provides a more realistic model of random phenomena when the available data do not support reasonable estimates of quantitative probability. CP presents a wider class of models...
The probability of an event is assessed from a balance of evidence, or lack of evidence, in favor of the occurrences of each of an exhaustive set of mutually exclusive alternatives, some subset of which comprises the event. The formal domain of a theory of logical probability is generally a set of inferences between statements or propositions in a...
This chapter discusses the various theories of quantitative probability with respect to their ability to measure probability and to yield probability conclusions of value either for the characterization of chance and uncertainty or for the determination of inferences and decisions. The formal theories of comparative and quantitative probability are...
An exposition of a variety of meanings for probability and of a variety of mathematical characterizations of probability that need not be restricted to numerical probability.
The problem of pattern classification when very little is known about the pattern source is structured through a set of axioms describing desirable properties for a classifier that involve no probabilistic assumptions or hypotheses. The derived classifier is then studied under the hypothesis that the pattern source is probabilistic and its long-run...
The problem of pattern classification when very little is known about the pattern source is structured through a set of axioms describing desirable properties for a classifier that involve no probabilistic assumptions or hypotheses. The derived classifier is then studied under the hypothesis that the pattern source is probabilistic and its long-run...
Prompted by the inadequacies of the now traditional characterization of chance and uncertainty through the Kolmogorov axioms for probability and the relative frequency interpretation of probability, we propose and examine a nonstatistical approach to extrapolation. The basic problem is the association of a real number y to a sequence of real number...
An explanation is provided for the prevalence of apparently convergent relative frequencies in random sequences. The explanation is based upon the computational-complexity characterization of a random sequence. Apparent convergence is shown to be attributable to a surprising consequence of the selectivity with which relative frequency arguments are...
We analyze the performance of a dispersion instrument in which light is multiplexed both in the entrance and exit slit positions. This double multiplexing scheme allows one to recover both Fellgett’s advantage and the high throughput advantage normally attributed only to interferometric spectrometers. The spectrometer’s performance is evaluated for...
We propose a nonparametric independent component analysis (ICA) algorithm for the problem of blind source separation with instantaneous, time-invariant and linear mixtures. Our Init-NLE algorithm combines minimization of correlation among nonlinear expansions of the output signals with a good initialization derived from search guided by statist...
Research focussed on the relatively new and little explored concept of interval-valued or upper and lower probability (U/LP). The axiomatic structure of U/LP is presented in Section 3 below, but roughly we may think of it as a pair of nonnegative numbers in the unit interval that simultaneously represent the tendency for an event in question (inter...
A brief survey of developments at the foundations of communications theory is presented. New themes are being heard and they may lead to as great a change in communications theory as that ushered in 25 years ago. The new themes concern alternative mathematical and interpretive conceptions of probability, broadened insights into the nature of perfor...
Following is a continuation of the list of titles and authors: Approach to Pattern Classification When Little Is Known. By Terrence L. Fine. Fast method for probabilistic and Fuzzy Cluster Analysis Using Association Measures. By Enrique H. Ruspini. Estimation of the Information in a Two Class Sample. By G. A. Butler and H. B. Ritea. Non-Parametric...
A number of binary cyclic coding schemes for multiplex spectrometry are discussed and evaluated in terms of a linear, least mean square, unbiased estimate. The optical realization of such codes in dispersion instruments is briefly discussed. We show that there are many advantages both in the construction of the instrument and in its operation which...
A number of binary cyclic coding schemes for multiplex spectrometry are discussed and evaluated in terms of a linear, least mean square, unbiased estimate. The optical realization of such codes in dispersion instruments is briefly discussed. We show that there are many advantages both in the construction of the instrument and in its operation which...
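A hedged numerical illustration of the multiplex (Fellgett-type) advantage such weighing designs aim at, using an S-matrix derived from a Sylvester Hadamard matrix rather than the paper's binary cyclic codes: measuring (n+1)/2 spectral elements per exposure and inverting by least squares cuts the per-element noise variance to 4n/(n+1)^2 of the one-slit-at-a-time value.

```python
# Hedged illustration of the multiplex advantage of binary weighing designs.
# This uses an S-matrix built from a Sylvester Hadamard matrix, not the paper's
# binary cyclic codes; the variance comparison is the point of the sketch.
import numpy as np

def s_matrix(order_plus_one):
    """S-matrix of size n x n from a Sylvester Hadamard matrix of size n+1."""
    h = np.array([[1.0]])
    while h.shape[0] < order_plus_one:
        h = np.block([[h, h], [h, -h]])
    return (1 - h[1:, 1:]) / 2          # drop first row/col, map +1 -> 0, -1 -> 1

n = 7
S = s_matrix(n + 1)                      # each row opens (n+1)/2 = 4 of the 7 slots
# Least-squares estimate x_hat = (S^T S)^{-1} S^T y for y = S x + noise;
# the per-element noise variance is diag((S^T S)^{-1}) * sigma^2.
multiplex_var = np.diag(np.linalg.inv(S.T @ S))
print(multiplex_var)                     # each entry = 4n/(n+1)^2 ~ 0.44
print("direct (one slit at a time) variance per element: 1.0")
```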
A discrete time, nonlinear system composed of an integrator preceded by a binary quantizer with integrated negative feedback, which can model a tracking loop or a single integrating delta modulation communication system, is discussed with regard to the input-output statistics for two types of input processes: independent inputs and independent incr...
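A minimal sketch of such a single-integration delta modulator (step size and test input chosen only for illustration): the binary quantizer acts on the tracking error and the integrator accumulates the quantized decisions.

```python
# Minimal single-integration delta modulator: binary quantizer + integrator
# in a negative-feedback loop (step size and test input are illustrative).
import math

def delta_modulate(samples, step=0.1):
    v, bits, tracked = 0.0, [], []
    for x in samples:
        b = 1 if x >= v else -1      # binary quantizer acting on the tracking error
        v += step * b                # integrator driven by the quantized decision
        bits.append(b)
        tracked.append(v)
    return bits, tracked

x = [math.sin(0.05 * k) for k in range(200)]
bits, v = delta_modulate(x)
print(max(abs(a - b) for a, b in zip(x, v)))   # tracking error stays near the step size
```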
This paper exhibits an optimum strategy for the sequential estimation of, or search for, the location of the maximum M of a unimodal function, when M is initially uniformly distributed over some interval. The explicit search strategy which is found is valid for a variety of expected cost functions that add the expected cost of observation to the ex...
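Not the optimal Bayesian strategy derived in the paper, but a familiar baseline for the same task: golden-section search localizes the maximizer of a unimodal function with a fixed number of evaluations.

```python
# Not the paper's optimal Bayesian strategy: a plain golden-section search,
# shown only as a familiar baseline for locating the maximum of a unimodal f.
import math

def golden_section_max(f, lo, hi, n_evals=20):
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    for _ in range(n_evals - 2):
        if fc < fd:                   # maximum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
        else:                         # maximum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
    return (a + b) / 2.0

print(golden_section_max(lambda x: -(x - 0.37) ** 2, 0.0, 1.0))   # ~0.37
```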