Article

# Multivariate Chebyshev Inequalities

The Annals of Mathematical Statistics 12/1960; DOI: 10.1214/aoms/1177705673

Source: OAI


**ABSTRACT:** In this paper we propose an output-feedback Model Predictive Control (MPC) algorithm for linear discrete-time systems affected by a possibly unbounded additive noise and subject to probabilistic constraints. In case the noise distribution is unknown, the chance constraints on the input and state variables are reformulated by means of the Chebyshev-Cantelli inequality. The recursive feasibility of the proposed algorithm is guaranteed, and the convergence of the state to a suitable neighborhood of the origin is proved under mild assumptions. Implementation issues are thoroughly addressed, showing that, with a proper choice of the design parameters, the computational load can be made similar to that of a standard stabilizing MPC algorithm. Two examples are discussed in detail, with the aim of providing insight into the performance achievable by the proposed control scheme.

08/2014
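As background for the Chebyshev-Cantelli reformulation mentioned in the abstract above: for any random variable g with mean mu and standard deviation sigma, Cantelli's one-sided inequality gives P(g - mu >= t) <= sigma^2 / (sigma^2 + t^2), so the chance constraint P(g <= 0) >= 1 - eps is guaranteed, for any distribution with those moments, whenever mu + sqrt((1 - eps)/eps) * sigma <= 0. A minimal Python sketch of this distribution-free tightening (function name and numbers are illustrative, not from the paper):

```python
import math
import random

def cantelli_tightening(mu, sigma, eps):
    """Deterministic surrogate for the chance constraint P(g <= 0) >= 1 - eps.

    By the Chebyshev-Cantelli inequality, the constraint holds for ANY
    distribution with mean mu and std sigma whenever
    mu + kappa * sigma <= 0, with kappa = sqrt((1 - eps) / eps).
    """
    kappa = math.sqrt((1.0 - eps) / eps)
    return mu + kappa * sigma  # constraint certified if this is <= 0

# Monte-Carlo check with one admissible distribution (Gaussian):
random.seed(0)
mu, sigma, eps = -2.5, 1.0, 0.1
margin = cantelli_tightening(mu, sigma, eps)  # approx 0.5 > 0: not certified
samples = [random.gauss(mu, sigma) for _ in range(100_000)]
violation = sum(g > 0 for g in samples) / len(samples)
print(f"surrogate value: {margin:.3f} (<= 0 certifies the constraint)")
print(f"empirical violation probability: {violation:.4f}")
```

Note the conservatism that motivates the distribution-free approach: here the Gaussian violates the constraint far less often than eps = 0.1, yet the moment-based surrogate does not certify it, because the bound must cover every distribution sharing this mean and variance.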

**ABSTRACT:** This paper presents a new feature selection framework based on the L0-norm, in which data are summarized by the moments of their class-conditional densities. However, the discontinuity of the L0-norm makes it difficult to find the optimal solution. We apply a proper approximation of the L0-norm, together with a bound on the misclassification probability involving the mean and covariance of the dataset, to derive a robust difference of convex functions (DC) program formulation, and use the DC optimization algorithm to solve the problem effectively. Furthermore, a kernelized version of this problem is also presented in this work. Experimental results on both real and synthetic datasets show that the proposed formulations can select fewer features than the traditional Minimax Probability Machine and L1-norm-based methods.

International Journal of Computational Intelligence Systems 03/2014; 7(1):12-24. · 0.45 Impact Factor
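To illustrate why the L0-norm needs approximating: it counts nonzero entries, so it is discontinuous and flat almost everywhere, and gradient-based optimization gets no signal from it. One commonly used smooth surrogate (not necessarily the one adopted in the paper above) replaces each count with 1 - exp(-alpha * |w_i|), which is concave in |w_i| and hence amenable to DC programming. A small sketch:

```python
import math

def l0_approx(w, alpha=5.0):
    """Smooth concave surrogate for the L0-norm: sum(1 - exp(-alpha*|w_i|)).

    As alpha grows, each term tends to 1 for w_i != 0 and stays 0 for
    w_i == 0, so the sum approaches the exact number of nonzero entries.
    """
    return sum(1.0 - math.exp(-alpha * abs(wi)) for wi in w)

w = [0.0, 0.8, -1.2, 0.0, 0.05]
for alpha in (1.0, 5.0, 50.0):
    print(f"alpha={alpha:5.1f}: approx L0 = {l0_approx(w, alpha):.3f}")
print("exact L0 =", sum(wi != 0 for wi in w))  # counts 3 nonzeros
```

The trade-off visible here is typical: small alpha gives a smoother, easier objective but a looser count, while large alpha recovers the count at the cost of near-discontinuity.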

**ABSTRACT:** In this paper, a robust support vector regression (RSVR) method with uncertain input and output data is studied. First, the data uncertainties are investigated under a stochastic framework and two linear robust formulations are derived. Linear formulations robust to ellipsoidal uncertainties are also considered from a geometric perspective. Second, kernelized RSVR formulations are established for nonlinear regression problems. Both the linear and nonlinear formulations are converted to second-order cone programming problems, which can be solved efficiently by the interior point method. Simulations demonstrate that the proposed method outperforms existing RSVRs in the presence of both input and output data uncertainties.

IEEE Transactions on Neural Networks and Learning Systems 11/2012; 23(11):1690-1700. · 4.37 Impact Factor
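The link between ellipsoidal uncertainty and second-order cone programming mentioned in the abstract above rests on a standard identity: a linear constraint (a + P u)^T x <= b required for all ||u||_2 <= 1 is equivalent to the single cone constraint a^T x + ||P^T x||_2 <= b, because the supremum of u^T (P^T x) over the unit ball is exactly ||P^T x||_2. A self-contained numerical check of that identity (names and data are illustrative, not from the paper):

```python
import math
import random

def worst_case_value(a, P, x):
    """sup over ||u||_2 <= 1 of (a + P u)^T x  =  a^T x + ||P^T x||_2.

    This closed form is what turns an ellipsoidal-robust linear
    constraint into a second-order cone constraint.
    """
    ax = sum(ai * xi for ai, xi in zip(a, x))
    Ptx = [sum(P[i][j] * x[i] for i in range(len(x))) for j in range(len(P[0]))]
    return ax + math.sqrt(sum(v * v for v in Ptx))

random.seed(1)
a = [1.0, -2.0]
P = [[0.3, 0.0], [0.1, 0.2]]
x = [2.0, 1.0]
wc = worst_case_value(a, P, x)

# Sampled values of (a + P u)^T x over points u in the unit ball
# should never exceed the closed-form worst case:
sampled = []
for _ in range(10_000):
    d = [random.gauss(0, 1), random.gauss(0, 1)]
    n = math.sqrt(sum(v * v for v in d)) or 1.0
    u = [v / n * random.random() for v in d]  # a point inside the unit ball
    val = sum((a[i] + sum(P[i][j] * u[j] for j in range(2))) * x[i]
              for i in range(2))
    sampled.append(val)
print(f"worst case (closed form): {wc:.4f}, sampled max: {max(sampled):.4f}")
```

With this identity in hand, the robust regression problem reduces to minimizing a loss subject to finitely many cone constraints, which interior-point solvers handle directly.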
