Institute of Mathematical Statistics is collaborating with JSTOR to digitize, preserve, and extend access to The Annals of Mathematical Statistics. www.jstor.org


Article

The Annals of Mathematical Statistics 12/1960; DOI: 10.1214/aoms/1177705673

Source: OAI


**ABSTRACT:** In this paper, we propose a maximum margin classifier that deals with uncertainty in data input. Specifically, we reformulate the SVM framework such that each input training entity is not solely a feature vector representation, but a multi-dimensional Gaussian distribution with given probability density, i.e., with a given mean and covariance matrix. The latter expresses the uncertainty. We arrive at a convex optimization problem, which is solved in the primal form using a gradient descent approach. The resulting classifier, which we name SVM with Gaussian Sample Uncertainty (SVM-GSU), is tested on synthetic data, as well as on the problem of event detection in video using the large-scale TRECVID MED 2014 dataset, and the problem of image classification using the MNIST dataset of handwritten digits. Experimental results verify the effectiveness of the proposed classifier.
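As an illustration of the kind of primal gradient descent the abstract describes, the sketch below trains a linear classifier whose inputs are Gaussians (mean plus covariance) by minimizing the closed-form expectation of the hinge loss under a Gaussian margin, which is a standard result for E[max(0, 1 − z)] with z normal. The function names, the regularizer, and the step sizes are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np
from scipy.stats import norm

def expected_hinge_grad(w, b, mu, Sigma, y):
    """Closed-form E[max(0, 1 - y(w.x + b))] for x ~ N(mu, Sigma),
    and its gradients w.r.t. (w, b), for a linear classifier."""
    m = y * (w @ mu + b)                # mean of the margin y(w.x + b)
    s = np.sqrt(w @ Sigma @ w) + 1e-12  # std dev of the margin
    u = (1.0 - m) / s
    loss = (1.0 - m) * norm.cdf(u) + s * norm.pdf(u)
    # dL/dm = -Phi(u) and dL/ds = phi(u); chain rule through m and s
    grad_w = -norm.cdf(u) * y * mu + norm.pdf(u) * (Sigma @ w) / s
    grad_b = -norm.cdf(u) * y
    return loss, grad_w, grad_b

def fit_svm_gsu(mus, Sigmas, ys, lam=0.01, lr=0.1, iters=300):
    """Primal gradient descent on the regularized expected hinge loss."""
    w, b = np.zeros(mus.shape[1]), 0.0
    for _ in range(iters):
        gw, gb = lam * w, 0.0
        for mu, Sigma, y in zip(mus, Sigmas, ys):
            _, dgw, dgb = expected_hinge_grad(w, b, mu, Sigma, y)
            gw += dgw / len(ys)
            gb += dgb / len(ys)
        w -= lr * gw
        b -= lr * gb
    return w, b
```

Because the expected loss is convex in (w, b), plain gradient descent on the primal suffices, matching the optimization strategy the abstract outlines.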

**ABSTRACT:** In this paper, a robust support vector regression (RSVR) method with uncertain input and output data is studied. First, the data uncertainties are investigated under a stochastic framework and two linear robust formulations are derived. Linear formulations robust to ellipsoidal uncertainties are also considered from a geometric perspective. Second, kernelized RSVR formulations are established for nonlinear regression problems. Both linear and nonlinear formulations are converted to second-order cone programming problems, which can be solved efficiently by the interior point method. Simulation demonstrates that the proposed method outperforms existing RSVRs in the presence of both input and output data uncertainties.

IEEE Transactions on Neural Networks and Learning Systems 11/2012; 23(11):1690-1700. DOI: 10.1109/TNNLS.2012.2212456
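The conversion to second-order cone programming rests on a standard identity: the worst-case regression residual over an ellipsoidal input-uncertainty set has a closed form equal to the nominal residual plus a weighted norm of the weights, which is exactly a second-order cone term. A minimal numpy check of that identity (the helper name and parameters are hypothetical, not the paper's code):

```python
import numpy as np

def worst_case_residual(w, b, mu, S_half, y, gamma):
    """Closed form of  max_{||u|| <= gamma} |w.(mu + S_half @ u) + b - y|
    = |w.mu + b - y| + gamma * ||S_half^T w||,
    i.e., the second-order cone term in a robust SVR constraint."""
    return abs(w @ mu + b - y) + gamma * np.linalg.norm(S_half.T @ w)
```

Sampling perturbations on the boundary of the ellipsoid and taking the maximum residual should approach this closed form from below, which is why the robust constraints are SOCP-representable.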
##### Conference Paper: Clustering based large margin classification: a scalable approach using SOCP formulation.


**ABSTRACT:** This paper presents a novel Second Order Cone Programming (SOCP) formulation for large scale binary classification tasks. Assuming that the class conditional densities are mixture distributions, where each component of the mixture has a spherical covariance, the second order statistics of the components can be estimated efficiently using clustering algorithms like BIRCH. For each cluster, the second order moments are used to derive a second order cone constraint via a Chebyshev-Cantelli inequality. This constraint ensures that any data point in the cluster is classified correctly with a high probability. This leads to a large margin SOCP formulation whose size depends on the number of clusters rather than the number of training data points. Hence, the proposed formulation scales well for large datasets when compared to the state-of-the-art classifiers, Support Vector Machines (SVMs). Experiments on real world and synthetic datasets show that the proposed algorithm outperforms SVM solvers in terms of training time and achieves similar accuracies.

Proceedings of the Twelfth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Philadelphia, PA, USA, August 20-23, 2006; 01/2006
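The per-cluster constraint the abstract describes follows from the Chebyshev-Cantelli inequality: if the margin y(w·x + b) has mean m > 0 and variance s², then P[y(w·x + b) ≤ 0] ≤ s²/(s² + m²), so requiring m ≥ κ·s with κ = sqrt(η/(1−η)) caps the misclassification probability at 1 − η for any distribution with those moments. A minimal sketch of that check, one constraint per cluster (the function names and the test setup are illustrative assumptions, not the paper's code):

```python
import numpy as np

def cantelli_kappa(eta):
    """kappa such that  y(w.mu + b) >= kappa * sqrt(w^T Sigma w)
    guarantees P[y(w.x + b) <= 0] <= 1 - eta for ANY distribution
    with mean mu and covariance Sigma (Chebyshev-Cantelli)."""
    return np.sqrt(eta / (1.0 - eta))

def cluster_constraint_ok(w, b, mu, Sigma, y, eta):
    """The second-order cone constraint for one cluster; the SOCP has
    one such constraint per cluster instead of one per data point."""
    return y * (w @ mu + b) >= cantelli_kappa(eta) * np.sqrt(w @ Sigma @ w)
```

Since the bound is distribution-free, satisfying the constraint with the estimated cluster moments is what lets the formulation replace per-point constraints with per-cluster ones, shrinking the problem size.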
