A.-K. Seghouane

Australian National University, Canberra, Australian Capital Territory, Australia

Publications (28) · 24.03 Total Impact Points

  • A.-K. Seghouane
    ABSTRACT: A fundamental question in functional MRI (fMRI) data analysis is whether to declare pixels activated or non-activated with respect to the experimental design. A new statistical test for detecting activated pixels in fMRI data is proposed. The test is based on comparing the dimensions of the parametric models fitted to each voxel's fMRI time series with and without the controlled activation-baseline pattern. A corrected variant of the Akaike information criterion is used for this comparison. The test has the advantage of not requiring any user-specified threshold. The effectiveness of the proposed fMRI activation detection method is illustrated on real experimental data.
    Statistical Signal Processing Workshop (SSP), 2012 IEEE; 01/2012
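    A minimal sketch of the idea, assuming Gaussian noise and ordinary least squares fits: each voxel's series is scored under a baseline-only design and a baseline-plus-activation design, and the voxel is declared activated when the richer model wins on AICc. The designs `X0`/`X1` and the boxcar regressor are illustrative, not the paper's exact construction.

    ```python
    import numpy as np

    def aicc_ols(y, X):
        """AICc of a Gaussian linear model fitted by least squares."""
        n, k = X.shape
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
        p = k + 1                      # regression coefficients + noise variance
        return -2 * loglik + 2 * p * n / (n - p - 1)

    def is_activated(y, X0, X1):
        """Declare a voxel activated if the activation model has lower AICc."""
        return aicc_ols(y, X1) < aicc_ols(y, X0)

    # Toy usage: 100 scans, constant baseline, boxcar activation regressor.
    n = 100
    box = (np.arange(n) // 10 % 2).astype(float)
    X0 = np.ones((n, 1))
    X1 = np.column_stack([np.ones(n), box])
    y = 0.8 * box + np.random.default_rng(0).normal(size=n)
    print(is_activated(y, X0, X1))
    ```

    Note that the comparison needs no user-specified threshold; the penalty terms of the criterion play that role.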
  • A.-K. Seghouane
    ABSTRACT: Estimation of the expected Kullback-Leibler information is the basis for deriving the Akaike information criterion (AIC) and its corrected version AICc. Both criteria were designed for selecting multivariate regression models, with AICc the more appropriate in small-sample settings. In the work presented here, two new small-sample AIC corrections are derived for multivariate regression model selection. The proposed corrections are based on asymptotic approximations of bootstrap-type estimates of the Kullback-Leibler information, and are of particular interest when the computational cost of the bootstrap itself is not justified. As is the case for AICc, the proposed criteria are asymptotically equivalent to AIC. Simulation results demonstrate that in small-sample settings, one of the proposed criteria provides better model choices than other available model selection criteria, and therefore serves as an effective tool for selecting a model of appropriate order. Asymptotic justifications for the proposed criteria are provided in the Appendix.
    IEEE Transactions on Aerospace and Electronic Systems 05/2011; · 1.30 Impact Factor
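    A sketch of the quantities involved, assuming Gaussian errors: the standard AIC for a multivariate regression, together with a naive bootstrap bias-corrected variant of the kind the paper's closed-form corrections approximate analytically. Function names and the resampling scheme are assumptions for illustration.

    ```python
    import numpy as np

    def fit(Y, X):
        """Least-squares coefficients and ML residual covariance."""
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)
        E = Y - X @ B
        return B, E.T @ E / Y.shape[0]

    def gauss_loglik(Y, X, B, Sigma):
        n, q = Y.shape
        E = Y - X @ B
        _, logdet = np.linalg.slogdet(Sigma)
        quad = np.trace(E @ np.linalg.inv(Sigma) @ E.T)
        return -0.5 * (n * q * np.log(2 * np.pi) + n * logdet + quad)

    def aic(Y, X):
        n, q = Y.shape
        k = X.shape[1]
        B, Sigma = fit(Y, X)
        p = q * k + q * (q + 1) / 2        # mean and covariance parameters
        return -2 * gauss_loglik(Y, X, B, Sigma) + 2 * p

    def bootstrap_criterion(Y, X, n_boot=200, seed=0):
        """-2 x (log-likelihood corrected by a resampled optimism estimate)."""
        rng = np.random.default_rng(seed)
        n = Y.shape[0]
        B, Sigma = fit(Y, X)
        optimism = 0.0
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)
            Bb, Sb = fit(Y[idx], X[idx])
            # in-sample fit minus fit evaluated on the original data
            optimism += (gauss_loglik(Y[idx], X[idx], Bb, Sb)
                         - gauss_loglik(Y, X, Bb, Sb)) / n_boot
        return -2 * (gauss_loglik(Y, X, B, Sigma) - optimism)
    ```

    The paper's contribution is precisely to avoid the resampling loop above: its corrections approximate such bootstrap estimates asymptotically, in closed form.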
  • A.-K. Seghouane, Ju Lynn Ong
    ABSTRACT: Computed tomographic colonography (CTC) is a promising alternative to traditional invasive colonoscopic methods for detecting and removing cancerous growths, or polyps, in the colon. Existing algorithms for CTC typically use a classifier to discriminate between true and false positives generated by a polyp candidate detection system. However, these classifiers often suffer from the curse of dimensionality: the performance of a classifier degrades markedly as the number of features it uses increases. An increase in the number of features also raises computational complexity and storage demands. This paper demonstrates the benefits of feature selection, with the aim of increasing specificity while preserving sensitivity in a polyp detection system. It also compares an individual-feature (F-score) method and a mutual information (MI) method for feature selection on a polyp candidate database, in order to select a subset of features for optimum CAD performance. Experimental results show that SVM+MI performs better when few features are used, whereas SVM+F-score dominates when the 30-50 best-ranked features are used. Overall, the AUC reaches 0.8-0.85 for the top-ranked 20-40 features under either method, compared with 0.65-0.7 in the worst case when all 100 features are used.
    Image Processing (ICIP), 2010 17th IEEE International Conference on; 10/2010
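    The comparison lends itself to a compact sketch with scikit-learn; synthetic data stands in for the polyp candidate database, and all parameter values are illustrative.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import f_classif, mutual_info_classif
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=100,
                               n_informative=20, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    def auc_top_k(score_fn, k):
        """Rank features on the training set, keep the top k, score an SVM."""
        scores = score_fn(Xtr, ytr)
        if isinstance(scores, tuple):      # f_classif returns (F, p-values)
            scores = scores[0]
        top = np.argsort(scores)[::-1][:k]
        clf = SVC(probability=True).fit(Xtr[:, top], ytr)
        return roc_auc_score(yte, clf.predict_proba(Xte[:, top])[:, 1])

    for k in (20, 40, 100):
        print(k, auc_top_k(f_classif, k), auc_top_k(mutual_info_classif, k))
    ```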
  • A.-K. Seghouane
    ABSTRACT: A new algorithm for maximum likelihood blind image restoration is presented in this paper. It is obtained by modeling the original image and the additive noise as multivariate Gaussian processes with unknown covariance matrices. The blurring process is specified by its point spread function, which is also unknown. Estimates of the original image and the blur are derived by alternating minimization of the Kullback-Leibler divergence. The algorithm has the advantage of providing closed-form expressions for the parameters to be updated and of converging after only a few iterations. A simulation example illustrating the effectiveness of the proposed algorithm is presented.
    Image Processing (ICIP), 2010 17th IEEE International Conference on; 10/2010
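    The alternating structure can be sketched as follows; the closed-form Wiener-style updates below are a stand-in for the paper's KL-divergence-derived ones, and the initialization and regularization constant are assumptions.

    ```python
    import numpy as np

    def blind_restore(y, n_iter=20, reg=1e-2):
        """Alternate closed-form updates of the image and blur spectra."""
        Y = np.fft.fft2(y)
        H = np.ones_like(Y)                              # initial blur: identity
        for _ in range(n_iter):
            X = np.conj(H) * Y / (np.abs(H) ** 2 + reg)  # image step, blur fixed
            H = np.conj(X) * Y / (np.abs(X) ** 2 + reg)  # blur step, image fixed
        return np.real(np.fft.ifft2(X))
    ```

    Each step is solved exactly rather than by gradient descent, mirroring the closed-form, fast-converging character the abstract describes.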
  • A.-K. Seghouane, Ju Lynn Ong
    ABSTRACT: Existing polyp detection methods rely heavily on curvature-based characteristics to differentiate between lesions. However, as curvature is a local feature and a second-order differential quantity, simply inspecting the curvature at a point is not sufficient. In this paper, we propose to inspect a local neighbourhood around a candidate point using curvature maps. The candidate point is pre-identified as the geodesic centroid of a surface patch containing vertices with positive point curvature values, corresponding to convex protrusions. Geodesic rings are then constructed around this candidate point, and the point curvatures along these rings are accumulated to produce curvature maps. From this, a cumulative shape property S for a given neighbourhood radius can be computed and used to identify bulbous polyps, which typically have a high S value, and their corresponding 'neck' regions. We show that a threshold of S > 0.48 is sufficient to discriminate between polyps and non-polyps with 100% sensitivity and specificity for bulbous polyps > 10 mm.
    Image Processing (ICIP), 2010 17th IEEE International Conference on; 10/2010
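    One plausible reading of the accumulation step, as a sketch (the exact definition of S is in the paper; the convex-fraction form below is an assumption):

    ```python
    import numpy as np

    def cumulative_shape_score(curvature, geo_dist, radius, n_rings=10):
        """Accumulate per-vertex curvature over geodesic rings around a
        candidate point and return the convex fraction of the total."""
        edges = np.linspace(0.0, radius, n_rings + 1)
        total = convex = 0.0
        for lo, hi in zip(edges[:-1], edges[1:]):
            ring = (geo_dist >= lo) & (geo_dist < hi)
            total += np.abs(curvature[ring]).sum()
            convex += curvature[ring].clip(min=0).sum()
        return convex / total if total > 0 else 0.0
    ```

    A bulbous protrusion, being convex almost everywhere in its neighbourhood, scores near 1 under such a measure, which is the behaviour a threshold like S > 0.48 exploits.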
  • Ju Lynn Ong, A.-K. Seghouane
    ABSTRACT: Existing polyp detection methods rely heavily on curvature-based characteristics to differentiate between lesions. However, as curvature is a local feature and a second-order differential quantity, noise caused by small bumpy structures and incoherent curvature fields of a discretized volume or surface can greatly increase the number of false positives (FPs) detected. This paper investigates a spectral compression and curvature tensor smoothing algorithm that aims to reduce the number of FPs detected while preserving true positives. Simulation results give 96% sensitivity for polyps > 10 mm while reducing FPs by 92%.
    Biomedical Imaging: From Nano to Macro, 2009. ISBI '09. IEEE International Symposium on; 08/2009
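    The spectral step can be sketched as a graph-Laplacian low-pass of a per-vertex signal such as a curvature field; the combinatorial Laplacian and the mode count are illustrative choices, and the paper's full pipeline (including the tensor smoothing) is more elaborate.

    ```python
    import numpy as np

    def spectral_smooth(signal, adjacency, n_modes=20):
        """Project a per-vertex signal onto the lowest-frequency eigenvectors
        of the mesh's graph Laplacian and reconstruct."""
        L = np.diag(adjacency.sum(1)) - adjacency   # combinatorial Laplacian
        _, vecs = np.linalg.eigh(L)
        U = vecs[:, :n_modes]                       # low-frequency basis
        return U @ (U.T @ signal)
    ```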
  • A.-K. Seghouane, J.L. Ong
    ABSTRACT: Describing lesion shapes with curvature-based features is a widespread approach in polyp detection methods, motivated by the need to compactly and accurately encode the different shape forms present on the colon wall. However, the colon wall presents a number of small convex shapes that resemble polyps, which drastically increases the number of false positives (FPs) detected. In this paper, a method based on multiresolution shape processing using the spherical wavelet transform is proposed. The method aims to reduce the number of FPs detected while preserving true positives. Simulation results illustrating the effectiveness of the method are presented.
    Statistical Signal Processing, 2009. SSP '09. IEEE/SP 15th Workshop on; 01/2009
  • A.-K. Seghouane
    ABSTRACT: Alternating minimization of the information divergence is used to derive an effective algorithm for maximum likelihood (ML) factor analysis. The proposed algorithm is derived as an iterative alternating projections procedure between a model family of probability distributions defined by the factor analysis model and a desired family of probability distributions constrained to be concentrated on the observed data. The algorithm has the advantage of being simple to implement and stable in convergence. A simulation example illustrating the effectiveness of the proposed algorithm for ML factor analysis is presented.
    Machine Learning for Signal Processing, 2008. MLSP 2008. IEEE Workshop on; 11/2008
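    The resulting iteration has the same fixed points and closed-form structure as classical EM for the factor analysis model x = Lf + e, with f ~ N(0, I) and diagonal noise covariance Psi; below is a minimal EM sketch (the paper derives its algorithm instead as alternating I-projections).

    ```python
    import numpy as np

    def factor_analysis(X, n_factors, n_iter=100):
        n, d = X.shape
        X = X - X.mean(0)
        S = X.T @ X / n                              # sample covariance
        L = np.linalg.cholesky(S + 1e-6 * np.eye(d))[:, :n_factors]
        Psi = np.diag(S).copy()
        for _ in range(n_iter):
            # E-step: posterior of the factors given (L, Psi)
            G = np.linalg.inv(np.eye(n_factors) + (L.T / Psi) @ L)
            B = G @ (L.T / Psi)                      # E[f|x] = B x
            Ezz = G + B @ S @ B.T
            # M-step: closed-form updates
            L = S @ B.T @ np.linalg.inv(Ezz)
            Psi = np.diag(S - L @ B @ S)
        return L, Psi
    ```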
  • A.-K. Seghouane
    ABSTRACT: The Akaike information criterion, AIC, and its corrected version, AICc, are two methods for selecting normal linear regression models. Both criteria were designed as estimators of the expected Kullback-Leibler information between the model generating the data and the approximating candidate model. In this paper, a new corrected variant of AIC is derived for small-sample linear regression model selection. The proposed variant is based on an asymptotic approximation of bootstrap-type estimates of the Kullback-Leibler information. Simulation results illustrating the better performance of the proposed correction, when applied to polynomial regression, in comparison to AIC, AICc, and other criteria are presented. Asymptotic justifications for the proposed criterion are provided in the Appendix.
    Machine Learning for Signal Processing, 2008. MLSP 2008. IEEE Workshop on; 11/2008
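    For reference, the two baseline criteria in closed form, with a polynomial regression usage of the kind the simulations consider (the proposed bootstrap-based variant itself is in the paper):

    ```python
    import numpy as np

    def aic_aicc(y, X):
        """AIC and its standard small-sample correction for Gaussian OLS."""
        n, k = X.shape
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.sum((y - X @ beta) ** 2) / n
        p = k + 1                                  # coefficients + variance
        aic = n * np.log(sigma2) + 2 * p
        aicc = aic + 2 * p * (p + 1) / (n - p - 1)
        return aic, aicc

    # Select a polynomial order by minimizing AICc on noisy quadratic data.
    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 25)
    y = 1 + 2 * x - x ** 2 + 0.3 * rng.normal(size=x.size)
    orders = range(1, 8)
    scores = [aic_aicc(y, np.vander(x, d + 1))[1] for d in orders]
    print("selected order:", list(orders)[int(np.argmin(scores))])
    ```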
  • Ju Lynn Ong, A.-K. Seghouane, K. Osborn
    ABSTRACT: As an alternative to current methods, which consider only the mean values of shape features to globally characterize a candidate polyp shape, probability density functions (PDFs) of feature variables constructed from Gaussian and mean curvatures are used to characterize the global shape of a candidate lesion. The decision on whether this candidate lesion is a polyp is made by comparing the density functions of the considered shape feature variables to reference PDFs of the same variables obtained from a pre-constructed polyp/non-polyp database. The Kullback-Leibler divergence is used as a dissimilarity measure to compare these PDFs and make a decision based on closeness. Experiments carried out on real data illustrate the effectiveness of the proposed method in comparison to existing ones.
    Biomedical Imaging: From Nano to Macro, 2008. ISBI 2008. 5th IEEE International Symposium on; 06/2008
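    A minimal sketch of the decision rule, using histogram PDFs of a scalar shape feature; the bin edges and the smoothing constant are illustrative assumptions.

    ```python
    import numpy as np

    def hist_pdf(values, bins):
        p, _ = np.histogram(values, bins=bins, density=True)
        p = p + 1e-9                     # avoid log(0) in the divergence
        return p / p.sum()

    def kl(p, q):
        """Kullback-Leibler divergence between two discrete PDFs."""
        return float(np.sum(p * np.log(p / q)))

    def classify(candidate, polyp_ref, nonpolyp_ref, bins):
        """Label a candidate by KL closeness to the reference PDFs; `bins`
        must be one shared array of bin edges so the PDFs are comparable."""
        p = hist_pdf(candidate, bins)
        d_pol = kl(p, hist_pdf(polyp_ref, bins))
        d_non = kl(p, hist_pdf(nonpolyp_ref, bins))
        return "polyp" if d_pol < d_non else "non-polyp"
    ```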
  • M. Kleinsteuber, A.-K. Seghouane
    ABSTRACT: The Cramer-Rao bound (CRB) plays an important role in direction of arrival (DOA) estimation because it is routinely used as a benchmark for comparing proposed estimation algorithms. In this correspondence, using well-known techniques of global analysis and differential geometry, four necessary conditions for the maximum of the log-likelihood function are derived, two of which appear to be new. The CRB is derived for the general class of sensor arrays composed of multiple arbitrary, widely separated subarrays in a concise way, via a coordinate-free form of the Fisher information. The result derived in [1] is confirmed.
    IEEE Transactions on Signal Processing 03/2008; · 2.81 Impact Factor
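    The benchmark itself, stated generically for an unbiased estimator $\hat{\theta}$ of the DOA parameters (standard definitions, not the paper's coordinate-free derivation):

    ```latex
    \operatorname{cov}(\hat{\theta}) \succeq F(\theta)^{-1},
    \qquad
    F_{ij}(\theta) = \mathbb{E}\!\left[
        \frac{\partial \log p(y;\theta)}{\partial \theta_i}\,
        \frac{\partial \log p(y;\theta)}{\partial \theta_j}
    \right]
    ```

    The CRB is thus the inverse of the Fisher information matrix $F(\theta)$, and the paper's contribution is a concise, coordinate-free computation of $F$ for widely separated subarrays.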
  • A.-K. Seghouane
    ABSTRACT: Image restoration necessitates the choice of a regularization parameter that controls the trade-off between fidelity to the blurred, noisy observed image and the smoothness of the restored image. The choice of this parameter, for which several estimators have been proposed, is crucial for the quality of the restored image. In this letter, two estimators for choosing the regularization parameter are proposed. One is a simple closed-form approximation to the minimum of a selection criterion, and the other is an approximation to the minimum of a mean squared error (MSE)-based criterion.
    IEEE Signal Processing Letters 02/2008; · 1.67 Impact Factor
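    The setting can be sketched with a grid search over the regularization parameter of a Fourier-domain Tikhonov restoration, scored by generalized cross-validation; GCV is a stand-in here for the letter's criteria, whose closed-form approximate minimizers avoid the grid search.

    ```python
    import numpy as np

    def gcv_restore(y, h_fft, lambdas):
        """Pick the Tikhonov parameter minimizing GCV; h_fft is the blur's
        frequency response, same shape as y."""
        Y = np.fft.fft2(y)
        n = y.size
        best = None
        for lam in lambdas:
            W = np.conj(h_fft) / (np.abs(h_fft) ** 2 + lam)  # restoration filter
            resid = np.sum(np.abs((1 - h_fft * W) * Y) ** 2) / n
            trace = np.real(np.sum(1 - h_fft * W)) / n       # normalized residual trace
            score = resid / trace ** 2                       # GCV value
            if best is None or score < best[0]:
                best = (score, lam, W)
        _, lam, W = best
        return np.real(np.fft.ifft2(W * Y)), lam
    ```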
  • A.-K. Seghouane
    ABSTRACT: A small-sample version of KIC for the selection of least absolute deviations regression models is proposed. In contrast to KIC, the proposed criterion, named KICL1 (where L1 stands for absolute deviation), provides an exactly unbiased estimator of the expected Kullback symmetric divergence, assuming that the errors have a double exponential distribution and that the true model is correctly specified or overfitted. Simulation results showing that KICL1 performs slightly better than KIC are presented.
    Signal Processing and Its Applications, 2007. ISSPA 2007. 9th International Symposium on; 03/2007
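    A sketch of the two ingredients, assuming Laplace (double exponential) errors: an LAD fit by iteratively reweighted least squares, scored with the large-sample KIC penalty of three per parameter (the exactly unbiased small-sample penalty of KICL1 is derived in the paper).

    ```python
    import numpy as np

    def lad_fit(X, y, n_iter=50, eps=1e-8):
        """Least absolute deviations fit via iteratively reweighted LS."""
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        for _ in range(n_iter):
            w = 1.0 / np.maximum(np.abs(y - X @ beta), eps)
            beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        return beta

    def kic_lad(X, y):
        n, k = X.shape
        beta = lad_fit(X, y)
        b = np.mean(np.abs(y - X @ beta))    # ML estimate of the Laplace scale
        loglik = -n * (np.log(2 * b) + 1)
        return -2 * loglik + 3 * (k + 1)     # large-sample KIC-style penalty
    ```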
  • A.-K. Seghouane, S.-I. Amari
    ABSTRACT: The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as an asymptotically unbiased estimator of a function used for ranking candidate models, a variant of the Kullback-Leibler divergence between the true model and the approximating candidate model. Despite the Kullback-Leibler divergence's computational and theoretical advantages, its lack of symmetry can become inconvenient in model selection applications: simple examples show that reversing the roles of the arguments in the Kullback-Leibler divergence can yield substantially different results. In this paper, three new functions for ranking candidate models are proposed. These functions are constructed by symmetrizing the Kullback-Leibler divergence between the true model and the approximating candidate model, using the arithmetic, geometric, and harmonic means. It is found that the original AIC criterion is an asymptotically unbiased estimator of all three functions. Using one of the proposed ranking functions, a new bias correction to AIC is derived for univariate linear regression models. A simulation study based on polynomial regression is provided to compare the proposed ranking functions with AIC and the newly derived correction with AICc.
    IEEE Transactions on Neural Networks 02/2007; · 2.95 Impact Factor
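    Writing K(f, g) for the Kullback-Leibler divergence from f to g, the three symmetrized ranking functions correspond to the three means of K(f, g) and K(g, f) (the notation below is assumed, not the paper's):

    ```latex
    J_{\mathrm{avg}}(f,g) = \tfrac{1}{2}\bigl[K(f,g) + K(g,f)\bigr], \qquad
    J_{\mathrm{geo}}(f,g) = \sqrt{K(f,g)\,K(g,f)}, \qquad
    J_{\mathrm{har}}(f,g) = \frac{2\,K(f,g)\,K(g,f)}{K(f,g) + K(g,f)}
    ```

    All three reduce to K itself when K(f, g) = K(g, f), which is consistent with AIC being an asymptotically unbiased estimator of each.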
  • A.-K. Seghouane
    ABSTRACT: In this paper, a new small-sample model selection criterion for vector autoregressive (VAR) models is developed. The proposed criterion is named KICvc, where the notation vc stands for vector correction, and it can be considered an extension of KIC to VAR models. KICvc adjusts KIC to be an unbiased estimator of a variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Furthermore, KICvc provides better VAR model-order choices than KIC in small samples. Simulation results show that the proposed criterion selects the model order more accurately than other asymptotically efficient methods when applied to VAR model selection in small samples. As a result, KICvc serves as an effective tool for selecting a VAR model of appropriate order. A theoretical justification of the proposed criterion is presented.
    Circuits and Systems I: Regular Papers, IEEE Transactions on 11/2006; · 2.24 Impact Factor
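    A minimal sketch of the selection task: fit VAR(p) by least squares for each candidate order and keep the order minimizing a KIC-style score (penalty of three per coefficient, a large-sample stand-in for the paper's small-sample KICvc).

    ```python
    import numpy as np

    def var_residual_cov(X, p):
        """Least-squares VAR(p) fit on X (T x q); ML residual covariance."""
        T, q = X.shape
        Y = X[p:]
        Z = np.hstack([X[p - i - 1:T - i - 1] for i in range(p)])  # lagged blocks
        A, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        E = Y - Z @ A
        return E.T @ E / Y.shape[0]

    def select_order(X, max_p=8):
        T, q = X.shape
        scores = []
        for p in range(1, max_p + 1):
            _, logdet = np.linalg.slogdet(var_residual_cov(X, p))
            n = T - p
            scores.append(n * logdet + 3 * p * q * q)   # KIC-style penalty
        return int(np.argmin(scores)) + 1
    ```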
  • A.-K. Seghouane
    ABSTRACT: A question crucial to most problems in signal processing is the selection of the order of a candidate model. Among the existing criteria, the two most popular model selection criteria in the signal processing literature have been Akaike's criterion (AIC) and the Bayesian information criterion (BIC). These criteria are similar in form in that they consist of data and penalty terms. Different approaches have been used to derive them; however, none takes into account prior information concerning the parameters of the model. In this paper, a new approach for model selection, one that takes into account prior information on the model parameters, is proposed. Using the proposed approach, and depending on the nature of the prior on the model parameters, two new information criteria are derived for univariate linear regression model selection. The term "information criteria" is used because their derivation is based on the Kullback-Leibler divergence.
    Sensor Array and Multichannel Processing, 2006. Fourth IEEE Workshop on; 08/2006
  • H. Belkacemi, A.-K. Seghouane
    ABSTRACT: The Kullback information criterion (KIC) is a recently developed tool for statistical model selection. KIC serves as an asymptotically unbiased estimator of the Kullback symmetric divergence, known as the J-divergence. A corrected version of KIC, denoted KICc, has also been proposed to correct the bias of KIC; this version tends to overfit as the sample size increases. In this paper we propose an alternative to KICc, the KICu criterion, which is an unbiased estimator of the Kullback symmetric divergence. It provides better model choices than KICc for moderate to large sample sizes.
    Statistical Signal Processing, 2005 IEEE/SP 13th Workshop on; 08/2005
  • A.-K. Seghouane
    ABSTRACT: The Kullback information criterion, KIC, and its univariate bias-corrected version, KICc, are two recently developed criteria for model selection. Here, a small-sample model selection criterion for vector autoregressive models is developed. The proposed criterion is named KICvc, where the notation "vc" stands for vector correction, and it can be considered an extension of KIC to vector autoregressive models. KICvc is an unbiased estimator of a variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Simulation results show that the proposed criterion estimates the model order more accurately than other asymptotically efficient methods when applied to vector autoregressive model selection in small samples.
    Acoustics, Speech, and Signal Processing, 2005. Proceedings. (ICASSP '05). IEEE International Conference on; 04/2005 · 4.63 Impact Factor
  • A.-K. Seghouane
    ABSTRACT: The Kullback information criterion, KIC, and its multivariate bias-corrected version, KICvc, are two recently developed criteria for model selection. Both can be viewed as estimators of the expected Kullback symmetric divergence. In this paper, a new criterion is proposed for selecting a well-fitted model in an extrapolation setting. The proposed criterion, named PKIC (where "P" stands for prediction), is derived as an exactly unbiased estimator of an adapted cost function based on the Kullback symmetric divergence and the future design matrix. PKIC is an unbiased estimator of its cost function assuming that the true model is correctly specified or overfitted. A simulation study illustrating that model selection with PKIC performs well in some extrapolation cases is presented.
    Neural Networks, 2005. IJCNN '05. Proceedings. 2005 IEEE International Joint Conference on; 01/2005
  • A.-K. Seghouane, M. Bekara
    ABSTRACT: The Kullback information criterion (KIC) is a recently developed tool for statistical model selection. KIC serves as an asymptotically unbiased estimator of a variant (within a constant) of the Kullback symmetric divergence, also known as the J-divergence, between the generating model and the fitted candidate model. In this paper, a bias correction to KIC is derived for linear regression models. The correction is of particular use when the sample size is small or when the number of fitted parameters is a moderate to large fraction of the sample size. For linear regression models, the corrected criterion, called KICc, is an exactly unbiased estimator of the variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Furthermore, when applied to polynomial regression and autoregressive time-series modeling, KICc estimates the model order more accurately than other asymptotically efficient methods. Finally, KICc is tested on real foreign currency exchange rate data for forecasting, with encouraging results in comparison to classical techniques.
    IEEE Transactions on Signal Processing 01/2005; · 2.81 Impact Factor

Publication Stats

143 Citations
24.03 Total Impact Points

Institutions

  • 2006–2012
    • Australian National University
      • College of Engineering & Computer Science
      Canberra, Australian Capital Territory, Australia
  • 2005–2008
    • National ICT Australia Ltd
      Sydney, New South Wales, Australia
  • 2004–2005
    • National Institute for Research in Computer Science and Control
      Le Chesnay, Île-de-France, France
  • 2003
    • French National Centre for Scientific Research
      Paris, Île-de-France, France