Temporal Synchronization of MIMO Wireless Communication in the Presence of Interference

MIT Lincoln Lab., Lexington, MA, USA
IEEE Transactions on Signal Processing (Impact Factor: 2.79). 04/2010; 58(3):1794 - 1806. DOI: 10.1109/TSP.2009.2037067
Source: IEEE Xplore


In wireless communication systems that potentially operate in interference, acquisition and temporal alignment of a transmitted signal by a receiver can be the most fragile component of the link. In this paper, synchronization detection in the presence of interference for multiple-input multiple-output (MIMO) communication is discussed. Here, synchronization denotes signal acquisition and timing estimation at the receiver, and is formulated as a binary statistical hypothesis test. Transmit sequences from multiple antennas are received by multiple antennas in noisy environments with spatially correlated noise (interference). Flat-fading and frequency-selective channel models for both the interference and the signal of interest are considered. By applying well-known multiple-antenna approaches to the MIMO synchronization problem, a number of new synchronization test statistics are introduced. These test statistics are motivated by minimum-mean-square-error (MMSE) beamforming, the generalized-likelihood ratio test (GLRT), least-squares (LS) channel estimation, and spatial invariance. Test statistics appropriate for orthogonal-frequency-division-multiplexing (OFDM) systems are also considered, including statistics that take advantage of cyclic prefixes and of pilot sequences within an OFDM symbol. The performance of the various test statistics, in terms of the probability of a missed detection at a given probability of false detection, is shown to vary by multiple orders of magnitude in the presence of interference.
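To make the hypothesis-test framing concrete, the following is a minimal NumPy sketch of a GLRT-style detection statistic for a known MIMO training sequence in noise with unknown spatial covariance. This is an illustrative formulation (the standard determinant-ratio GLRT with an unknown channel matrix and unknown noise covariance), not necessarily the paper's exact statistic; all dimensions and variable names here are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

M, Nt, Ns = 4, 2, 32  # receive antennas, transmit antennas, training length
# Known complex training sequence, one row per transmit antenna
S = (rng.standard_normal((Nt, Ns)) + 1j * rng.standard_normal((Nt, Ns))) / np.sqrt(2)

# Orthogonal projector onto the complement of the row space of S (Ns x Ns)
P = S.conj().T @ np.linalg.inv(S @ S.conj().T) @ S
P_perp = np.eye(Ns) - P

def glrt_stat(Y):
    """Determinant-ratio GLRT statistic for the model Y = H S + N with
    unknown channel H and unknown spatial noise covariance.
    Always >= 1; larger values favor the signal-present hypothesis."""
    full = Y @ Y.conj().T                 # sample covariance, all energy
    resid = Y @ P_perp @ Y.conj().T       # energy outside the training subspace
    return np.real(np.linalg.det(full) / np.linalg.det(resid))

# Simulate one noise-only snapshot and one signal-plus-noise snapshot
N0 = (rng.standard_normal((M, Ns)) + 1j * rng.standard_normal((M, Ns))) / np.sqrt(2)
H = rng.standard_normal((M, Nt)) + 1j * rng.standard_normal((M, Nt))
Y1 = H @ S + N0

print(glrt_stat(N0), glrt_stat(Y1))  # statistic is markedly larger under H1
```

In a synchronization context, this statistic would be evaluated at each candidate delay and compared to a threshold chosen for a target false-detection probability; the spatially correlated interference of the paper is absorbed into the unknown noise covariance that the determinant ratio implicitly estimates.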

Cited by:
  • "In this context, it is well established (see, e.g., [4]) that the generalized maximum-likelihood test consists in comparing the following statistic η_N to a threshold."
    ABSTRACT: This paper addresses the behaviour of a classical multi-antenna GLRT that detects the presence of a known signal corrupted by a multipath propagation channel and by additive white Gaussian noise with an unknown spatial covariance matrix. The paper focuses on the case where the number of sensors M is large and of the same order of magnitude as the sample size N, a context modeled by the large-system asymptotic regime in which M and N both go to infinity in such a way that M/N goes to c, for c in (0, infinity). The purpose of this paper is to study the behaviour of the GLRT test statistic in this regime, and to show that the corresponding theoretical analysis accurately predicts the performance of the test when M and N are of the same order of magnitude.
    Preview · Article · Dec 2014
  • "In the case of a flat-fading channel, the sampling period is much greater than the multipath delay spread, and a single channel filter tap is sufficient to represent the channel. Therefore, the MIMO channel matrix at relative delay τ, H_τ, will be assumed to be of the form [1] H_τ = H for τ = τ_0 and H_τ = 0 otherwise, where τ_0 is the correct delay in terms of the receiver's clock and 0 denotes the null matrix. This is the key assumption on which the results of this paper are built."
    ABSTRACT: This paper studies the performance of the generalized likelihood ratio test (GLRT) proposed in [1] in the context of data-aided timing synchronization for multiple-input multiple-output (MIMO) communication systems and flat-fading channels. The asymptotic performance of the GLRT is derived, and an upper bound on the detection probability is provided and shown to serve well as a benchmark for a sufficiently large number of observations. Computer simulations are presented to corroborate the developed analytical performance benchmarks. In addition, the choice of some system design parameters to improve synchronization performance is discussed via computer simulations.
    Full-text · Conference Paper · Jan 2012
  • ABSTRACT: Cyclic-delay diversity (CDD) schemes require only one inverse discrete Fourier transform (IDFT) operation at the transmitter. As a result, they provide a low-complexity means of increasing the transmission diversity in multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) systems. In this paper, we develop a novel timing synchronization method for CDD-OFDM systems. In our proposed method, the channel correlation between transmit antennas is exploited to enhance timing synchronization accuracy. Simulation results verify the superiority of the proposed timing synchronization method.
    No preview · Conference Paper · Sep 2011