Article

Density estimation on a finite regular lattice

Authors:
  • MLTechniques.com

... In other words, P(f_1, ..., f_m) is a Gibbs density defined with respect to some neighbourhood family {H_1, ..., H_m}, where H_i is the set of sites which are neighbours of site i. Mathematically, our problem is stated as follows (Granville and Rasson, 1992): given N_1, ..., N_m, the frequency distribution at sites 1, ..., m, we are searching for the discrete density (considered here as a random variable) f = (f_1, ..., f_m)^T, with f_1 + ... + f_m = 1 and f_i > 0 (i = 1, ..., m), such that the conditional likelihood L(f) = P(f | N_1, ...
Article
A new theoretical point of view is discussed here in the framework of density estimation. The discrete multivariate true density is viewed as a finite-dimensional continuous random vector governed by a Markov random field structure. Estimating the density is then a problem of maximizing a conditional likelihood under a Bayesian framework. This maximization problem is expressed as a constrained optimization problem and is solved by an iterative fixed-point algorithm. However, for time-efficiency reasons, we have been interested in an approximate estimate f̂ = Bπ of the true density f, where B is a stochastic matrix and π is the raw histogram. This estimate is obtained by developing f̂ as a function of π around the uniform histogram π0, using multivariate Taylor expansions for implicit functions (f̂ is actually an implicit function of π). The discrete setting of the problem allows us to get a simple analytical form for B. Although the approach is original, our density estimator is actually nothing else than a penalized maximum likelihood estimator. However, it appears to be more general than those proposed in the literature (Scott et al., 1980; Simonoff, 1983; Thompson and Tapia, 1990).

In a second step, we investigate the discrimination problem on the same space, using the theory previously developed for density estimation. We also introduce an adaptive bandwidth depending on the k nearest neighbours, and we have chosen to optimize the leaving-one-out criterion. We have always kept the practical implementation on a computer in mind. Our final classification algorithm compares favourably in terms of error rate and time efficiency with other algorithms tested, including multinormal (IMSL), nearest-neighbour, and convex hull classifiers. Comparisons were performed on satellite images.
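The approximate estimator f̂ = Bπ can be illustrated with a minimal sketch on a 1-D lattice. The neighbour weights and the smoothing parameter lam below are illustrative choices, not the analytical B derived in the paper:

```python
import numpy as np

def smoothing_matrix(m, lam=0.5):
    """Doubly stochastic smoothing matrix B for a 1-D lattice of m sites:
    each cell sends weight lam/2 to each of its lattice neighbours and
    keeps the rest. (Illustrative B; the paper derives B analytically.)"""
    B = np.zeros((m, m))
    for i in range(m):
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < m]
        for j in nbrs:
            B[i, j] = lam / 2
        B[i, i] = 1.0 - lam / 2 * len(nbrs)
    return B

counts = np.array([0, 3, 7, 2, 0, 0, 1, 5], dtype=float)
pi = counts / counts.sum()               # raw histogram pi
f_hat = smoothing_matrix(len(pi)) @ pi   # approximate estimate f_hat = B pi
```

Because B is doubly stochastic, f̂ remains a valid density (non-negative, summing to one), while empty cells borrow mass from their lattice neighbours.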
... He did not examine the properties of this estimator under sparse asymptotics. Other Bayesian justification for penalized likelihood estimation (and related estimators) for discrete data can be found in Thorburn (1986), Lenk (1990) and Granville and Rasson (1992). Simonoff (1983) used a penalized likelihood approach to develop the first example of an estimator demonstrated to be consistent under sparse asymptotics. ...
Article
Statistical analysis of categorical data (contingency tables) has a long history, and a good deal of work has been done formulating parametric models for such data. Unfortunately, such analyses are often not appropriate, due to sparseness of the table. An alternative to these parametric models is smoothing the table, by ‘borrowing’ information from neighboring cells. In this paper, various strategies that have been proposed for such smoothing are discussed. It is shown that these strategies have close ties to other areas of statistical methodology, including shrinkage estimation, Bayes methods, penalized likelihood, spline estimation, and kernel density and regression estimation. Probability estimates based on smoothing methods can outperform the unsmoothed frequency estimates when the table is sparse (often, dramatically so). Methods for one-dimensional tables are discussed, as well as generalizations to higher-dimensional tables. Attempts to use smoothed probability estimates in statistical functionals are identified. Finally, potential future work in categorical data smoothing is also mentioned.
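One of the simplest members of the shrinkage/Bayes family of strategies mentioned above pulls the raw cell frequencies toward the uniform distribution. A minimal sketch, where the flattening constant gamma is an illustrative choice:

```python
import numpy as np

def shrink_toward_uniform(counts, gamma=0.5):
    """Shrinkage-type cell probability estimate for a sparse table:
    p_i = (n_i + gamma) / (N + m * gamma), pulling the raw frequencies
    toward 1/m. gamma is a hypothetical flattening constant."""
    counts = np.asarray(counts, dtype=float)
    m, N = counts.size, counts.sum()
    return (counts + gamma) / (N + m * gamma)

sparse = [4, 0, 0, 1, 0, 0, 0, 3]          # table with many empty cells
p_raw = np.asarray(sparse) / sum(sparse)   # unsmoothed: zeros stay zero
p_smooth = shrink_toward_uniform(sparse)   # every cell gets positive mass
```

The smoothed estimate assigns positive probability to empty cells, which is exactly where the unsmoothed frequency estimate fails in sparse tables.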
Article
A new theoretical point of view is discussed in the framework of density estimation. The multivariate true density, viewed as a prior or penalizing factor in a Bayesian framework, is modelled by a Gibbs potential. Estimating the density consists in maximizing the posterior. For time-efficiency reasons, we are interested in an approximate estimator f̂ = Bπ of the true density f, where B is a stochastic operator and π is the raw histogram. Then, we investigate the discrimination problem, introducing an adaptive bandwidth depending on the k nearest neighbours and chosen to optimize the cross-validation criterion. Our final classification algorithm, referred to as APML (approximate penalized maximum likelihood), compares favourably in terms of error rate and time efficiency with other algorithms tested, including multinormal, nearest-neighbour and convex hull classifiers.
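Choosing a k-nearest-neighbour bandwidth by a leave-one-out (cross-validation) criterion can be sketched generically as follows; the synthetic data, Euclidean distance, and candidate values of k are illustrative, and this is not the APML algorithm itself:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k):
    """Classify x by majority vote among its k nearest training points."""
    d = np.linalg.norm(X_train - x, axis=1)
    nbrs = y_train[np.argsort(d)[:k]]
    vals, cnts = np.unique(nbrs, return_counts=True)
    return vals[np.argmax(cnts)]

def loo_error(X, y, k):
    """Leave-one-out error rate: predict each point from all the others."""
    errs = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        errs += knn_predict(X[mask], y[mask], X[i], k) != y[i]
    return errs / len(y)

# Two well-separated Gaussian classes (illustrative data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
best_k = min(range(1, 10, 2), key=lambda k: loo_error(X, y, k))
```

Odd candidate values of k avoid ties in the two-class vote; the leave-one-out loop is the naive O(n²) version, adequate for a sketch.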
Article
In the framework of remote-sensing imagery, Markov random fields are used to model the distribution of points both in the 2-dimensional geometrical layout of the image and in the spectral grid. The problems of image filtering and supervised classification are investigated. The mixture model of noise developed here, together with appropriate Gibbs densities, yields a single approach and a single efficient ICM (iterated conditional modes) algorithm for both filtering and classification.
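A generic ICM iteration of the kind used for such filtering can be sketched as follows, here with a simple Gaussian noise model and a Potts-type Gibbs prior rather than the paper's mixture model of noise; the labels, beta, and sigma are illustrative:

```python
import numpy as np

def icm_denoise(obs, labels=(0, 1), beta=1.0, sigma=0.5, n_iter=5):
    """Iterated conditional modes: at each pixel, pick the label minimizing
    a Gaussian data term plus a Potts penalty on disagreeing 4-neighbours."""
    # Initialize by assigning each pixel its nearest label.
    x = np.array([[min(labels, key=lambda l: abs(o - l)) for o in row]
                  for row in obs])
    H, W = x.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                nbrs = [x[a, b]
                        for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= a < H and 0 <= b < W]
                def energy(l):
                    data = (obs[i, j] - l) ** 2 / (2 * sigma ** 2)
                    prior = beta * sum(l != n for n in nbrs)
                    return data + prior
                x[i, j] = min(labels, key=energy)
    return x

rng = np.random.default_rng(1)
clean = np.zeros((8, 8)); clean[:, 4:] = 1        # two-region test image
noisy = clean + rng.normal(0, 0.5, clean.shape)   # additive Gaussian noise
restored = icm_denoise(noisy)
```

Each sweep is a greedy coordinate-wise maximization of the posterior, which is what makes ICM fast compared with stochastic relaxation.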