Matthew Fickus

Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, United States

Publications (69) · Total impact: 74.36

  • ABSTRACT: Hyperspectral data is commonly used by astronomers to discern the chemical composition of stars. Unfortunately, conventional hyperspectral platforms require long exposure times, which can hamper their use in applications like celestial navigation. We propose a compressed sensing platform that exploits the spatial sparsity of stars to quickly sample the hyperspectral data. We leverage certain combinatorial designs to devise coded apertures, and then we apply block orthogonal matching pursuit to quickly reconstruct the desired imagery.
    No preview · Article · Nov 2015 · IEEE Signal Processing Letters
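The reconstruction step named above can be sketched in a few lines. The following is a minimal, illustrative block orthogonal matching pursuit, not the authors' implementation; the matrix, block size, and dimensions are arbitrary assumptions:

```python
import numpy as np

def block_omp(A, y, block_size, n_select):
    """Greedy block-sparse recovery: at each step, pick the block of
    columns of A most correlated with the residual, then re-fit the
    coefficients on all selected blocks by least squares."""
    m, n = A.shape
    blocks = [np.arange(i, i + block_size) for i in range(0, n, block_size)]
    selected, residual = [], y.copy()
    for _ in range(n_select):
        # score each unselected block by the norm of A_block^T residual
        scores = [np.linalg.norm(A[:, b].T @ residual) if i not in selected
                  else -1.0 for i, b in enumerate(blocks)]
        selected.append(int(np.argmax(scores)))
        cols = np.concatenate([blocks[i] for i in selected])
        coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
        residual = y - A[:, cols] @ coef
    x = np.zeros(n)
    x[cols] = coef
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 60)) / np.sqrt(50)
x_true = np.zeros(60)
x_true[10:15] = rng.standard_normal(5)      # a single active block of size 5
y = A @ x_true
x_hat = block_omp(A, y, block_size=5, n_select=2)
print(np.linalg.norm(y - A @ x_hat))
```

Once the active block is among those selected, the least-squares re-fit drives the residual to machine precision.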
  • ABSTRACT: An equiangular tight frame (ETF) is a set of unit vectors whose coherence achieves the Welch bound, and so is as incoherent as possible. Though they arise in many applications, only a few methods for constructing them are known. Motivated by the connection between real ETFs and graph theory, we introduce the notion of ETFs that are symmetric about their centroid. We then discuss how well-known constructions, such as harmonic ETFs and Steiner ETFs, can have centroidal symmetry. Finally, we establish a new equivalence between centroid-symmetric real ETFs and certain types of strongly regular graphs (SRGs). Together, these results give the first proof of the existence of certain SRGs, as well as the disproofs of the existence of others.
    Preview · Article · Sep 2015
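The Welch bound invoked in this and several abstracts below is easy to check numerically on the smallest real ETF, the three-vector "Mercedes-Benz" frame in the plane; this is a standard illustrative example, not taken from the paper:

```python
import numpy as np

# Mercedes-Benz frame: 3 unit vectors in R^2 at 120-degree angles.
# It is an equiangular tight frame, so its coherence should equal the
# Welch bound sqrt((N - M) / (M * (N - 1))) with M = 2, N = 3.
angles = [np.pi / 2, np.pi / 2 + 2 * np.pi / 3, np.pi / 2 + 4 * np.pi / 3]
F = np.array([[np.cos(t), np.sin(t)] for t in angles])  # rows are unit vectors

G = F @ F.T                                    # Gram matrix of the frame
coherence = np.max(np.abs(G - np.diag(np.diag(G))))  # largest off-diagonal |<f_i, f_j>|
M, N = 2, 3
welch = np.sqrt((N - M) / (M * (N - 1)))
print(coherence, welch)                        # both equal 0.5

# tightness: F^T F is a multiple of the identity, here (N/M) I = 1.5 I
print(np.allclose(F.T @ F, (N / M) * np.eye(M)))
```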
  • Matthew Fickus · Cody E. Watson
    ABSTRACT: An equiangular tight frame (ETF) is a set of unit vectors whose coherence achieves the Welch bound, and so is as incoherent as possible. They arise in numerous applications. It is well known that real ETFs are equivalent to a certain subclass of strongly regular graphs. In this note, we give some alternative techniques for understanding this equivalence. In a later document, we will use these techniques to further generalize this theory.
    No preview · Article · Aug 2015
  • M. Fickus · D.G. Mixon
    ABSTRACT: We consider the fundamental problem of determining a low-rank orthogonal projection operator P from measurements of the form Px. First, we leverage a nonembedding result for the complex Grassmannian to establish and analyze a lower bound on the number of measurements necessary to uniquely determine every possible P. Next, we provide a collection of particularly few measurement vectors that uniquely determine almost every P. Finally, we propose manifold-constrained least-squares optimization as a general technique for projection retrieval.
    No preview · Article · Jul 2015
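A minimal sketch of projection retrieval from measurements of the form Px: fit a matrix by least squares, then round it onto the set of rank-r orthogonal projections via an eigendecomposition. This rounding step is a crude stand-in for the manifold-constrained optimization the abstract proposes, and the dimensions and probe vectors are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
M, r = 6, 2

# a random rank-r orthogonal projection P = U U^T
U, _ = np.linalg.qr(rng.standard_normal((M, r)))
P = U @ U.T

# measurements: y_i = P x_i for a handful of probe vectors
X = rng.standard_normal((M, 10))
Y = P @ X

# least-squares estimate of P, then round onto the rank-r projections
# by keeping the top-r eigenspace
P_ls = Y @ np.linalg.pinv(X)
S = (P_ls + P_ls.T) / 2                 # symmetrize
w, V = np.linalg.eigh(S)
Vr = V[:, np.argsort(w)[-r:]]           # top-r eigenvectors
P_hat = Vr @ Vr.T

print(np.allclose(P_hat, P, atol=1e-8))
```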
  • ABSTRACT: An equiangular tight frame (ETF) is a set of unit vectors whose coherence achieves the Welch bound, and so is as incoherent as possible. ETFs arise in numerous applications, including compressed sensing. They also seem to be rare: despite over a decade of active research by the community, only a few construction methods have been discovered. One known method constructs ETFs from combinatorial designs known as balanced incomplete block designs. In this short paper, we provide an updated, more explicit perspective of that construction, laying the groundwork for upcoming results about such frames.
    No preview · Article · Jul 2015
  • Matthew Fickus · Dustin G. Mixon
    ABSTRACT: A Grassmannian frame is a collection of unit vectors which are optimally incoherent. The most accessible (and perhaps most beautiful) of Grassmannian frames are equiangular tight frames (ETFs); indeed, there are infinite families of known ETFs, whereas only finitely many non-ETF Grassmannian frames are known to date. This paper surveys every known construction of ETFs and tabulates existence for sufficiently small dimensions.
    Preview · Article · Apr 2015
  • Matthew Fickus · Justin D. Marks · Miriam J. Poteet
    ABSTRACT: The Schur-Horn theorem is a classical result in matrix analysis which characterizes the existence of positive semidefinite matrices with a given diagonal and spectrum. In recent years, this theorem has been used to characterize the existence of finite frames whose elements have given lengths and whose frame operator has a given spectrum. We provide a new generalization of the Schur-Horn theorem which characterizes the spectra of all possible finite frame completions. That is, we characterize the spectra of the frame operators of the finite frames obtained by adding new vectors of given lengths to an existing frame. We then exploit this characterization to give a new and simple algorithm for computing the optimal such completion.
    Full-text · Article · Aug 2014 · Applied and Computational Harmonic Analysis
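The majorization condition at the heart of the Schur-Horn theorem is easy to check numerically. Below, a frame operator spectrum (zero-padded to the size of the Gram matrix, whose diagonal holds the squared vector lengths) is tested against prescribed lengths; the function name and the examples are illustrative, not from the paper:

```python
import numpy as np

def majorizes(lam, d, tol=1e-12):
    """Schur-Horn condition: a Hermitian matrix with spectrum lam and
    diagonal d exists iff lam majorizes d, i.e. the sums agree and every
    partial sum of the sorted spectrum dominates that of the diagonal."""
    lam, d = np.sort(lam)[::-1], np.sort(d)[::-1]
    if abs(lam.sum() - d.sum()) > tol:
        return False
    return bool(np.all(np.cumsum(lam) >= np.cumsum(d) - tol))

# can three unit-norm vectors in R^2 have a tight frame operator with
# spectrum (1.5, 1.5)?  Compare the zero-padded spectrum against the
# squared lengths (the diagonal of the 3x3 Gram matrix).
lam = np.array([1.5, 1.5, 0.0])
d = np.array([1.0, 1.0, 1.0])
print(majorizes(lam, d))                          # True: e.g. the Mercedes-Benz frame

# a vector of squared length 2 is impossible here: the largest diagonal
# entry of the Gram matrix cannot exceed its largest eigenvalue
print(majorizes(lam, np.array([2.0, 0.5, 0.5])))  # False
```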
  • ABSTRACT: We propose a methodology for the design of features mimicking the visual cues used by pathologists when identifying tissues in hematoxylin and eosin (H&E)-stained samples. Background: H&E staining is the gold standard in clinical histology; it is cheap and universally used, producing a vast number of histopathological samples. While pathologists accurately and consistently identify tissues and their pathologies, it is a time-consuming and expensive task, establishing the need for automated algorithms for improved throughput and robustness. Methods: We use an iterative feedback process to design a histopathology vocabulary (HV), a concise set of features that mimic the visual cues used by pathologists, e.g. “cytoplasm color” or “nucleus density”. These features are based in histology and understood by both pathologists and engineers. We compare our HV to several generic texture-feature sets in a pixel-level classification algorithm. Results: Results on delineating and identifying tissues in teratoma tumor samples validate our expert knowledge-based approach. Conclusions: The HV can be an effective tool for identifying and delineating teratoma components from images of H&E-stained tissue samples.
    No preview · Article · Jun 2014
  • ABSTRACT: The restricted isometry property (RIP) is an important matrix condition in compressed sensing, but the best matrix constructions to date use randomness. This paper leverages pseudorandom properties of the Legendre symbol to reduce the number of random bits in an RIP matrix with Bernoulli entries. In this regard, the Legendre symbol is not special: our main result naturally generalizes to any small-bias sample space. We also conjecture that no random bits are necessary for our Legendre symbol-based construction.
    Full-text · Article · Jun 2014 · Constructive Approximation
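The Legendre symbol sequence underlying the construction can be generated with Euler's criterion. The sketch below fills a small ±1 matrix from that sequence; it illustrates the pseudorandom ingredient only, not the paper's exact construction (which, per the abstract, still uses some random bits), and the prime and matrix sizes are arbitrary:

```python
import numpy as np

def legendre_symbol(a, p):
    """Legendre symbol (a/p) for an odd prime p, via Euler's criterion:
    a^((p-1)/2) mod p, mapped to {-1, 0, +1}."""
    s = pow(a, (p - 1) // 2, p)
    return -1 if s == p - 1 else s

p = 103  # an odd prime
# chi[n] = (n/p) for n = 1, ..., p-1: a deterministic +/-1 sequence
# with small-bias, pseudorandom behavior
chi = np.array([legendre_symbol(n, p) for n in range(1, p)])
print(chi.sum())   # quadratic residues and non-residues balance: sum is 0

# fill a 6 x 17 "Bernoulli-like" matrix by reading the sequence
# row-major and normalizing so every column is a unit vector
m, n = 6, 17
A = chi[: m * n].reshape(m, n) / np.sqrt(m)
print(np.linalg.norm(A, axis=0))   # all columns have unit norm
```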
  • ABSTRACT: We propose a new mathematical and algorithmic framework for unsupervised image segmentation, which is a critical step in a wide variety of image processing applications. We have found that most existing segmentation methods are not successful on histopathology images, which prompted us to investigate segmentation of a broader class of images, namely those without clear edges between the regions to be segmented. We model these images as occlusions of random images, which we call textures, and show that local histograms are a useful tool for segmenting them. Based on our theoretical results, we describe a flexible segmentation framework that draws on existing work on nonnegative matrix factorization and image deconvolution. Results on synthetic texture mosaics and real histology images show the promise of the method.
    Full-text · Article · May 2014 · IEEE Transactions on Image Processing
  • Lindsay N. Smith · Matthew Fickus
    ABSTRACT: In certain real-world applications, one needs to estimate the angular frequency of a spinning object. We consider the image processing problem of estimating this rate of rotation from a video of the object taken by a camera aligned with the axis of rotation. For many types of spinning objects, this problem can be addressed with existing techniques: simply register two consecutive video frames. We focus, however, on objects whose shape and intensity changes greatly from frame to frame, such as spinning plumes of plasma that emerge from a certain type of spacecraft thruster. To estimate the angular frequency of such objects, we introduce the Geometric Sum Transform (GST), a new rotation-based generalization of the discrete Fourier transform (DFT). Taking the GST of a given video produces a new sequence of images, the most coherent of which corresponds to the object’s true rate of rotation. After formally demonstrating this fact, we provide a fast algorithm for computing the GST which generalizes the decimation-in-frequency approach for performing a Fast Fourier Transform (FFT). We further show that computing a GST is, in fact, mathematically equivalent to computing a system of DFTs, provided one can decompose each video frame in terms of an eigenbasis of a rotation operator. We conclude with numerical experimentation.
    Preview · Article · Feb 2014 · Advances in Computational Mathematics
  • Matthew Fickus · Dustin G. Mixon · Aaron A. Nelson · Yang Wang
    ABSTRACT: In many applications, signals are measured according to a linear process, but the phases of these measurements are often unreliable or not available. To reconstruct the signal, one must perform a process known as phase retrieval. This paper focuses on completely determining signals with as few intensity measurements as possible, and on efficient phase retrieval algorithms from such measurements. For the case of complex M-dimensional signals, we construct a measurement ensemble of size 4M-4 which yields injective intensity measurements; this is conjectured to be the smallest such ensemble. For the case of real signals, we devise a theory of "almost" injective intensity measurements, and we characterize such ensembles. Later, we show that phase retrieval from M+1 almost injective intensity measurements is NP-hard, indicating that computationally efficient phase retrieval must come at the price of measurement redundancy.
    Preview · Article · Jul 2013 · Linear Algebra and its Applications
  • John Jasper · Dustin G. Mixon · Matthew Fickus
    ABSTRACT: An equiangular tight frame (ETF) is a set of unit vectors in a Euclidean space whose coherence is as small as possible, equaling the Welch bound. Also known as Welch-bound-equality sequences, such frames arise in various applications, such as waveform design and compressed sensing. At the moment, there are only two known flexible methods for constructing ETFs: harmonic ETFs are formed by carefully extracting rows from a discrete Fourier transform; Steiner ETFs arise from a tensor-like combination of a combinatorial design and a regular simplex. These two classes seem very different: the vectors in harmonic ETFs have constant amplitude, whereas Steiner ETFs are extremely sparse. We show that they are actually intimately connected: a large class of Steiner ETFs can be unitarily transformed into constant-amplitude frames, dubbed Kirkman ETFs. Moreover, we show that an important class of harmonic ETFs is a subset of an important class of Kirkman ETFs. This connection informs the discussion of both types of frames: some Steiner ETFs can be transformed into constant-amplitude waveforms making them more useful in waveform design; some harmonic ETFs have low spark, making them less desirable for compressed sensing. We conclude by showing that real-valued constant-amplitude ETFs are equivalent to binary codes that achieve the Grey-Rankin bound, and then construct such codes using Kirkman ETFs.
    Full-text · Article · Jun 2013 · IEEE Transactions on Information Theory
  • ABSTRACT: The local histogram transform of an image is a data cube that consists of the histograms of the pixel values that lie within a fixed neighborhood of any given pixel location. Such transforms are useful in image processing applications such as classification and segmentation, especially when dealing with textures that can be distinguished by the distributions of their pixel intensities and colors. We, in particular, use them to identify and delineate biological tissues found in histology images obtained via digital microscopy. In this paper, we introduce a mathematical formalism that rigorously justifies the use of local histograms for such purposes. We begin by discussing how local histograms can be computed as systems of convolutions. We then introduce probabilistic image models that can emulate textures one routinely encounters in histology images. These models are rooted in the concept of image occlusion. A simple model may, for example, generate textures by randomly speckling opaque blobs of one color on top of blobs of another. Under certain conditions, we show that, on average, the local histograms of such model-generated textures are convex combinations of more basic distributions. We further provide several methods for creating models that meet these conditions; the textures generated by some of these models resemble those found in histology images. Taken together, these results suggest that histology textures can be analyzed by decomposing their local histograms into more basic components. We conclude with a proof-of-concept segmentation-and-classification algorithm based on these ideas, supported by numerical experimentation.
    Preview · Article · May 2013 · Applied and Computational Harmonic Analysis
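The observation that local histograms can be computed as systems of convolutions can be made concrete: for each pixel value, convolve its indicator image with a flat window. A minimal, unoptimized sketch; the bin count, window radius, and edge-replicating boundary handling are arbitrary choices:

```python
import numpy as np

def local_histograms(img, n_bins, radius):
    """Local histogram transform of a small integer-valued image: for
    each bin b, "convolve" the indicator image (img == b) with a flat
    (2*radius+1)^2 window, so cube[b, i, j] is the fraction of pixels
    near (i, j) taking value b.  Plain nested loops, for clarity."""
    H, W = img.shape
    cube = np.zeros((n_bins, H, W))
    padded = np.pad(img, radius, mode="edge")
    for b in range(n_bins):
        ind = (padded == b).astype(float)
        for i in range(H):
            for j in range(W):
                cube[b, i, j] = ind[i:i + 2*radius + 1,
                                    j:j + 2*radius + 1].sum()
    return cube / (2*radius + 1) ** 2

rng = np.random.default_rng(2)
img = rng.integers(0, 4, size=(16, 16))
cube = local_histograms(img, n_bins=4, radius=2)
print(np.allclose(cube.sum(axis=0), 1.0))  # a histogram sits at every pixel
```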
  • ABSTRACT: Digital fingerprinting is a framework for marking media files, such as images, music, or movies, with user-specific signatures to deter illegal distribution. Multiple users can collude to produce a forgery that can potentially overcome a fingerprinting system. This paper proposes an equiangular tight frame fingerprint design which is robust to such collusion attacks. We motivate this design by considering digital fingerprinting in terms of compressed sensing. The attack is modeled as linear averaging of multiple marked copies before adding a Gaussian noise vector. The content owner can then determine guilt by exploiting correlation between each user's fingerprint and the forged copy. The worst case error probability of this detection scheme is analyzed and bounded. Simulation results demonstrate that the average-case performance is similar to the performance of orthogonal and simplex fingerprint designs, while accommodating several times as many users.
    No preview · Article · Mar 2013 · IEEE Transactions on Information Theory
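The averaging attack and correlation detector described above are easy to simulate. The sketch below uses random unit-norm fingerprints as a simple stand-in for the paper's ETF design; the user count, dimension, colluder set, and noise level are all arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n_users, dim, n_colluders = 50, 400, 3

# unit-norm Gaussian fingerprints (a stand-in for the ETF design)
F = rng.standard_normal((n_users, dim))
F /= np.linalg.norm(F, axis=1, keepdims=True)

# forgery: colluders average their marked copies, then noise is added
colluders = [0, 1, 2]
forgery = F[colluders].mean(axis=0) + 0.05 * rng.standard_normal(dim)

# detection: correlate every user's fingerprint with the forgery and
# accuse the highest-scoring users
scores = F @ forgery
accused = set(np.argsort(scores)[-n_colluders:])
print(accused == set(colluders))
```

Colluders score near 1/3 (the averaging weight), while innocent users score near zero, so the top correlations identify the colluding set.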
  • Matthew Fickus · Melody L. Massar · Dustin G. Mixon
    ABSTRACT: Filter banks are fundamental tools of signal and image processing. A filter is a linear operator which computes the inner products of an input signal with all translates of a fixed function. In a filter bank, several filters are applied to the input, and each of the resulting signals is then downsampled. Such operators are closely related to frames, which consist of equally spaced translates of a fixed set of functions. In this chapter, we highlight the rich connections between frame theory and filter banks. We begin with the algebraic properties of related operations, such as translation, convolution, downsampling, the discrete Fourier transform, and the discrete Z-transform. We then discuss how basic frame concepts, such as frame analysis and synthesis operators, carry over to the filter bank setting. The basic theory culminates with the representation of a filter bank’s synthesis operator in terms of its polyphase matrix. This polyphase representation greatly simplifies the process of constructing a filter bank frame with a given set of properties. Indeed, we use this representation to better understand the special case in which the filters are modulations of each other, namely Gabor frames.
    No preview · Article · Jan 2013
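The connection between filter banks and frames can be made concrete with the two-channel Haar bank: the analysis operator stacks circular translates (by the downsampling factor) of the filters, and inner products with these translates give the downsampled channel outputs, up to a time reversal of the filters. For Haar filters the resulting frame is an orthonormal basis; a small numerical check with arbitrarily chosen signal length:

```python
import numpy as np

N = 8
h0 = np.array([1.0, 1.0]) / np.sqrt(2)   # lowpass Haar filter
h1 = np.array([1.0, -1.0]) / np.sqrt(2)  # highpass Haar filter

def translates(h, N, step):
    """Circular translates of h by multiples of `step` in R^N, as rows."""
    row = np.zeros(N)
    row[: len(h)] = h
    return np.array([np.roll(row, k) for k in range(0, N, step)])

# analysis operator of the 2-channel bank with downsampling by 2
A = np.vstack([translates(h0, N, 2), translates(h1, N, 2)])

# the Haar bank is an orthonormal basis: analysis followed by synthesis
# (the adjoint) is the identity, i.e. the frame is tight with bound 1
print(np.allclose(A.T @ A, np.eye(N)))
print(np.allclose(A @ A.T, np.eye(N)))
```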
  • Matthew Fickus · Dustin G. Mixon · Miriam J. Poteet
    ABSTRACT: Broadly speaking, frame theory is the study of how to produce well-conditioned frame operators, often subject to nonlinear application-motivated restrictions on the frame vectors themselves. In this chapter, we focus on one particularly well-studied type of restriction, i.e., having frame vectors of prescribed lengths. We discuss two methods for iteratively constructing such frames. The first method, called spectral tetris, produces special examples of such frames, and only works in certain cases. The second method combines the idea behind spectral tetris with the classical theory of majorization; this method can build any such frame in terms of a sequence of interlacing spectra, called eigensteps.
    No preview · Article · Jan 2013
  • ABSTRACT: In many areas of imaging science, it is difficult to measure the phase of linear measurements. As such, one often wishes to reconstruct a signal from intensity measurements, that is, perform phase retrieval. In this paper, we provide a novel measurement design which is inspired by interferometry and exploits certain properties of expander graphs. We also give an efficient phase retrieval procedure, and use recent results in spectral graph theory to produce a stable performance guarantee which rivals the guarantee for PhaseLift in [Candes et al. 2011]. We use numerical simulations to illustrate the performance of our phase retrieval procedure, and we compare reconstruction error and runtime with a common alternating-projections-type procedure.
    Preview · Article · Oct 2012 · SIAM Journal on Imaging Sciences
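The alternating-projections baseline mentioned in the abstract can be sketched generically: alternately impose the measured magnitudes and project back onto the range of the measurement matrix. This is a textbook Gerchberg-Saxton-style loop with random restarts, not the paper's interferometry-inspired design; all dimensions and iteration counts are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
M, n_meas = 10, 60
A = rng.standard_normal((n_meas, M))
x_true = rng.standard_normal(M)
y = np.abs(A @ x_true)                 # magnitudes only: the phases are lost

Apinv = np.linalg.pinv(A)
best = np.inf
for trial in range(20):                # random restarts help escape stalls
    x = rng.standard_normal(M)
    for it in range(500):
        z = y * np.sign(A @ x)         # keep current signs, measured magnitudes
        x = Apinv @ z                  # least-squares pull-back onto range(A)
    # a real signal is only recoverable up to a global sign
    err = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
    best = min(best, err)

print(best / np.linalg.norm(x_true))   # relative error of the best restart
```

Once the sign pattern locks onto that of the true measurements, the pull-back recovers the signal exactly, which is why the best restart typically reaches machine precision at this level of oversampling.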
  • Matthew Fickus · John Jasper · Dustin G. Mixon · Jesse Peterson
    ABSTRACT: In the field of compressed sensing, a key problem remains open: to explicitly construct matrices with the restricted isometry property (RIP) whose performance rivals those generated using random matrix theory. In short, RIP involves estimating the singular values of a combinatorially large number of submatrices, seemingly requiring an enormous amount of computation in even low-dimensional examples. In this paper, we consider a similar problem involving submatrix singular value estimation, namely the problem of explicitly constructing numerically erasure robust frames (NERFs). Such frames are the latest invention in a long line of research concerning the design of linear encoders that are robust against data loss. We begin by focusing on a subtle difference between the definition of a NERF and that of an RIP matrix, one that allows us to introduce a new computational trick for quickly estimating NERF bounds. In short, we estimate these bounds by evaluating the frame analysis operator at every point of an epsilon-net for the unit sphere. We then borrow ideas from the theory of group frames to construct explicit frames and epsilon-nets with such high degrees of symmetry that the requisite number of operator evaluations is greatly reduced. We conclude with numerical results, using these new ideas to quickly produce decent estimates of NERF bounds which would otherwise take an eternity. Though the more important RIP problem remains open, this work nevertheless demonstrates the feasibility of exploiting symmetry to greatly reduce the computational burden of similar combinatorial linear algebra problems.
    Full-text · Article · Sep 2012 · Linear Algebra and its Applications
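The computational trick of evaluating the frame analysis operator at every point of a net for the unit sphere can be illustrated with a crude random "net": every sampled value of the analysis operator's norm is sandwiched between the extreme singular values it estimates. The frame and sample sizes are arbitrary, and a true epsilon-net would come with a covering guarantee that random points lack:

```python
import numpy as np

rng = np.random.default_rng(5)
M, N = 4, 12
F = rng.standard_normal((N, M)) / np.sqrt(M)   # a frame of N vectors in R^M (rows)

# crude stand-in for an epsilon-net: many random points on the unit sphere
X = rng.standard_normal((M, 20000))
X /= np.linalg.norm(X, axis=0, keepdims=True)

# evaluate the analysis operator at every net point; the sampled extremes
# of ||F x|| estimate the frame's extreme singular values
vals = np.linalg.norm(F @ X, axis=0)
s = np.linalg.svd(F, compute_uv=False)
print(s[-1], vals.min(), vals.max(), s[0])  # s[-1] <= min <= max <= s[0]
```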
  • ABSTRACT: In this paper we provide a rigorous proof of the convergence of an iterative voting-based image segmentation algorithm called Active Masks. Active Masks (AM) was proposed to solve the challenging task of delineating punctate patterns of cells from fluorescence microscope images. Each iteration of AM consists of a linear convolution composed with a nonlinear thresholding; what makes this process special in our case is the presence of additive terms whose role is to "skew" the voting when prior information is available. In real-world implementation, the AM algorithm always converges to a fixed point. We study the behavior of AM rigorously and present a proof of this convergence. The key idea is to formulate AM as a generalized (parallel) majority cellular automaton, adapting proof techniques from discrete dynamical systems.
    Full-text · Article · Sep 2012 · Applied and Computational Harmonic Analysis
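The parallel majority cellular automaton view can be demonstrated on a plain binary majority rule, without the paper's additive skew terms: under synchronous updates, such symmetric threshold dynamics settle into a fixed point or a 2-cycle. An illustrative run on a small torus with arbitrarily chosen size and iteration budget:

```python
import numpy as np

def majority_step(grid):
    """One synchronous update of a binary majority cellular automaton:
    each cell adopts the majority value of its 3x3 neighborhood
    (including itself), with wrap-around boundary conditions."""
    total = np.zeros_like(grid)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            total += np.roll(np.roll(grid, di, axis=0), dj, axis=1)
    return (total >= 5).astype(int)   # majority of the 9 cells

rng = np.random.default_rng(6)
g = rng.integers(0, 2, size=(20, 20))
prev = None
for _ in range(10000):
    nxt = majority_step(g)
    if prev is not None and (np.array_equal(nxt, g) or np.array_equal(nxt, prev)):
        break                         # fixed point or 2-cycle reached
    prev, g = g, nxt

# the limit of parallel symmetric threshold dynamics has period at most 2
print(np.array_equal(g, majority_step(majority_step(g))))
```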