Figure - available from: Journal of Mathematical Imaging and Vision
Examples of video frames in visual speech recognition (first two rows) and hand gesture classification (last two rows)
Source publication
Statistical analysis of dynamic systems, such as videos and dynamic functional connectivity, is often translated into a problem of analyzing trajectories of relevant features, particularly covariance matrices. As an example, in video-based action recognition, a natural mathematical representation of activity videos is as parameterized trajectories...
Citations
... More recently, the Fisher-Rao Riemannian metric has been used to separate the phase and amplitude parts of 1D functional data on [0, 1] in [28]. Alignment of functional data on the manifold, i.e., g : [0, 1] → M where M is a nonlinear manifold, has been investigated in [29]- [31]. However, the ConCon function f is defined on a product manifold domain Ω × Ω, a setting that differs significantly from these previous works. ...
Brain networks are typically represented by adjacency matrices, where each node corresponds to a brain region. In traditional brain network analysis, nodes are assumed to be matched across individuals, but the methods used for node matching often overlook the underlying connectivity information. This oversight can result in inaccurate node alignment, leading to inflated edge variability and reduced statistical power in downstream connectivity analyses. To overcome this challenge, we propose a novel framework for registering high-resolution continuous connectivity (ConCon), defined as a continuous function on a product manifold space (specifically, the cortical surface) capturing structural connectivity between all pairs of cortical points. Leveraging ConCon, we formulate an optimal diffeomorphism problem to align both connectivity profiles and cortical surfaces simultaneously. We introduce an efficient algorithm to solve this problem and validate our approach using data from the Human Connectome Project (HCP). Results demonstrate that our method substantially improves the accuracy and robustness of connectome-based analyses compared to existing techniques.
... For example, neuroscientists study dynamic functional connectivity, where one records longitudinal samples of time-varying covariances at increasingly high resolutions [10], whereas collections of multiple functional time series will generate multiple spectral density operators [33]. Furthermore, data in the form of time-varying covariance matrices is increasingly common, and has recently attracted significant interest from the statistical community [10,18,48,35] mostly due to a growing interest in studying the brain functional connectome generated from functional MRI [45]. ...
We develop a statistical framework for conducting inference on collections of time-varying covariance operators (covariance flows) over a general, possibly infinite-dimensional, Hilbert space. We model the intrinsically non-linear structure of covariances by means of the Bures-Wasserstein metric geometry. We make use of the Riemannian-like structure induced by this metric to define a notion of mean and covariance of a random flow, and develop an associated Karhunen-Loève expansion. We then treat the problem of estimation and construction of functional principal components from a finite collection of covariance flows, observed fully or irregularly. Our theoretical results are motivated by modern problems in functional data analysis, where one observes operator-valued random processes – for instance when analysing dynamic functional connectivity and fMRI data, or when analysing multiple functional time series in the frequency domain. Nevertheless, our framework is also novel in the finite-dimensional (matrix) case, and we demonstrate what simplifications can be afforded there. We illustrate our methodology by means of simulations and data analyses.
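In the finite-dimensional (matrix) case, the Bures-Wasserstein distance between two covariance matrices has the closed form d(A, B)² = tr A + tr B − 2 tr (A^{1/2} B A^{1/2})^{1/2}. The following NumPy sketch of that formula is purely illustrative, not the authors' implementation:

```python
import numpy as np

def sqrtm_spd(M):
    # Matrix square root of a symmetric positive semi-definite matrix,
    # computed via its eigendecomposition.
    w, V = np.linalg.eigh(M)
    w = np.clip(w, 0.0, None)  # guard against tiny negative eigenvalues
    return (V * np.sqrt(w)) @ V.T

def bures_wasserstein(A, B):
    # d(A, B)^2 = tr A + tr B - 2 tr (A^{1/2} B A^{1/2})^{1/2}
    rA = sqrtm_spd(A)
    cross = sqrtm_spd(rA @ B @ rA)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(d2, 0.0))  # clamp round-off below zero
```

For commuting matrices the distance reduces to the Frobenius distance between matrix square roots, which gives a quick sanity check.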
... Another metric that also provides a Riemannian symmetric structure on Sym+(n) was used in [29,36]. It was introduced directly by the quotient structure detailed in Proposition 3.1, but with the submersion π : A ∈ GL+(n) → √(AAᵀ) ∈ Sym+(n) based on the polar decomposition of A (and without the coefficient 4). ...
Symmetric Positive Definite (SPD) matrices are ubiquitous in data analysis in the form of covariance matrices or correlation matrices. Several O(n)-invariant Riemannian metrics were defined on the SPD cone, in particular the kernel metrics introduced by Hiai and Petz. The class of kernel metrics interpolates between many classical O(n)-invariant metrics and it satisfies key results of stability and completeness. However, it does not contain all the classical O(n)-invariant metrics. Therefore, in this work, we investigate super-classes of kernel metrics and we study which key results remain true. We also introduce an additional key result called cometric-stability, a crucial property to implement geodesics with a Hamiltonian formulation. Our method to build intermediate embedded classes between O(n)-invariant metrics and kernel metrics is to give a characterization of the whole class of O(n)-invariant metrics on SPD matrices and to specify requirements on metrics one by one until we reach kernel metrics. As a secondary contribution, we synthesize the literature on the main O(n)-invariant metrics, we provide the complete formula of the sectional curvature of the affine-invariant metric and the formula of the geodesic parallel transport between commuting matrices for the Bures-Wasserstein metric.
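For reference, the affine-invariant metric mentioned above yields the well-known distance d(A, B) = ‖log(A^{-1/2} B A^{-1/2})‖_F, invariant under congruence A ↦ GAGᵀ. A minimal NumPy sketch of this distance (not tied to the paper's kernel-metric framework):

```python
import numpy as np

def _eig_fun(M, fun):
    # Apply a scalar function to a symmetric matrix via its eigenvalues
    w, V = np.linalg.eigh(M)
    return (V * fun(w)) @ V.T

def affine_invariant_dist(A, B):
    # d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F
    inv_sqrt_A = _eig_fun(A, lambda w: 1.0 / np.sqrt(w))
    M = inv_sqrt_A @ B @ inv_sqrt_A
    return np.linalg.norm(_eig_fun(M, np.log), "fro")
```

The congruence invariance can be checked numerically: transforming both arguments by any invertible G leaves the distance unchanged.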
... For example, in Fig. 2, we show how the reference point can affect the computed distance between given functional data or trajectories on S² using the method in [32]. To overcome these limitations, more intrinsic methods were proposed in [6,16,37,38]. Compared with [32], these new methods represent each function as a pair, its starting point and its speed vector field renormalized by the square root of its norm, in order to 1) reduce the distortion brought by parallel translating to a faraway tangent space (of the reference point), and 2) avoid the need to choose an arbitrary reference point. ...
... We represent p using a pair (x, q), illustrated in Fig. 3 (a). The TSRVC representation is bijective: any p ∈ F can be uniquely represented by a pair (x, q), and we can reconstruct p from (x, q) using the covariant integral [38]. When convenient, we write p = (x, q) for notational simplicity, and we will explicitly point out any change of notation. ...
Manifold-valued functional data analysis (FDA) has become an active area of research motivated by the rising availability of trajectories or longitudinal data observed on nonlinear manifolds. The challenges of analyzing such data come from many aspects, including infinite dimensionality and nonlinearity, as well as time-domain or phase variability. In this paper, we study the amplitude part of manifold-valued functions on S², which is invariant to random time warping or re-parameterization. We represent a smooth function on S² using a pair of components: a starting point and a transported square-root velocity curve (TSRVC). Under this representation, the space of all smooth functions on S² forms a vector bundle, and the simple L² norm becomes a time-warping-invariant metric on this vector bundle. Utilizing the nice geometry of S², we develop a set of efficient and accurate tools for temporal alignment of functions, geodesic computation, and sample mean and covariance calculation. At their core, these tools rely on gradient-descent algorithms with carefully derived gradients. We show the advantages of these newly developed tools over their competitors with extensive simulations and real data, and demonstrate the importance of considering the amplitude part of functions rather than mixing it with phase variability in manifold-valued FDA.
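The TSRVC construction can be sketched numerically for a discretely sampled path on S²: velocities are obtained via log maps, rescaled by the square root of their norm, and parallel transported back to the starting point step by step along the path. The sketch below uses the standard closed-form geodesic transport on the unit sphere and is illustrative only, not the paper's implementation:

```python
import numpy as np

def sphere_transport(v, x, y):
    # Parallel transport of tangent vector v at x to y along the
    # minimizing geodesic on the unit sphere (assumes x != -y).
    return v - (v @ y) / (1.0 + x @ y) * (x + y)

def sphere_log(x, y):
    # Log map on the sphere: tangent at x pointing toward y, length d(x, y)
    c = np.clip(x @ y, -1.0, 1.0)
    u = y - c * x
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return np.zeros_like(x)
    return np.arccos(c) * u / nu

def tsrvc(path):
    # Discrete sketch of the transported square-root velocity curve:
    # each discrete velocity is square-root rescaled and transported
    # back to the starting point along the path.
    x0 = path[0]
    q = []
    for i in range(len(path) - 1):
        v = sphere_log(path[i], path[i + 1])  # discrete velocity at path[i]
        nv = np.linalg.norm(v)
        if nv > 0:
            v = v / np.sqrt(nv)               # square-root rescaling
        for j in range(i, 0, -1):             # chain of transports back to x0
            v = sphere_transport(v, path[j], path[j - 1])
        q.append(v)
    return x0, np.array(q)
```

For a geodesic path (an arc of a great circle), all transported velocities coincide, which gives a simple sanity check for the transport chain.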
... Another metric that also provides a Riemannian symmetric structure on SPD(n) was used in [16,31]. It was introduced directly by the quotient structure detailed in Proposition 3.1 but with the submersion √ π : A ∈ GL + (n) −→ √ AA ∈ SPD(n) based on the polar decomposition of A (and without the coefficient 4). ...
Symmetric Positive Definite (SPD) matrices are ubiquitous in data analysis under the form of covariance matrices or correlation matrices. Several O(n)-invariant Riemannian metrics were defined on the SPD cone, in particular the kernel metrics introduced by Hiai and Petz. The class of kernel metrics interpolates between many classical O(n)-invariant metrics and it satisfies key results of stability and completeness. However, it does not contain all the classical O(n)-invariant metrics. Therefore in this work, we investigate super-classes of kernel metrics and we study which key results remain true. We also introduce an additional key result called cometric-stability, a crucial property to implement geodesics with a Hamiltonian formulation. Our method to build intermediate embedded classes between O(n)-invariant metrics and kernel metrics is to give a characterization of the whole class of O(n)-invariant metrics on SPD matrices and to specify requirements on metrics one by one until we reach kernel metrics. As a secondary contribution, we synthesize the literature on the main O(n)-invariant metrics, we provide the complete formula of the sectional curvature of the affine-invariant metric and the formula of the geodesic parallel transport between commuting matrices for the Bures-Wasserstein metric.
... Leveraging advancements in shape analysis, [26] extended the square-root velocity curve in Euclidean space [25] to the manifold setting by parallel transporting all square-root velocity vectors to some reference point c ∈ M along geodesic paths. Although this generalization makes the distance function invariant under Γ, the arbitrary choice of the reference point c ∈ M can introduce distortion and uncertainty into the analysis. To avoid choosing an arbitrary reference point and transforming the functions on the manifold to a Euclidean space, more intrinsic methods were proposed in [30,31,12,4], where the space of manifold-valued functions is itself treated as an (infinite-dimensional) manifold and equipped with a Riemannian metric that can be invariant with respect to re-parameterization of functions. However, due to the complexity of the proposed metrics and of the manifold itself, these methods all face significant computational challenges in computing geodesics and the amplitude mean after alignment. ...
... We represent p using a pair (x, q), illustrated in Figure 2 (a). The TSRVC representation is bijective: any p ∈ F can be uniquely represented by a pair (x, q), and we can reconstruct p from (x, q) using the covariant integral [31]. When convenient, we write p = (x, q) for notational simplicity, and we will explicitly point out any change of notation. ...
Manifold-valued functional data analysis (FDA) has recently become an active area of research motivated by the rising availability of trajectories or longitudinal data observed on non-linear manifolds. The challenges of analyzing such data come from many aspects, including infinite dimensionality and nonlinearity, as well as time-domain or phase variability. In this paper, we study the amplitude part of manifold-valued functions on S², which is invariant to random time warping or re-parameterization of the function. Utilizing the nice geometry of S², we develop a set of efficient and accurate tools for temporal alignment of functions, geodesic computation, and sample mean calculation. At their core, these tools rely on gradient-descent algorithms with carefully derived gradients. We show the advantages of these newly developed tools over their competitors with extensive simulations and real data, and demonstrate the importance of considering the amplitude part of functions rather than mixing it with phase variability in manifold-valued FDA.
... Definition 1 (Transported Square-Root Velocity Field, TSRVF) [30]: For a shape sequence α : I → S, we define its transported square-root velocity field (TSRVF) as F_α(τ) = (α̇(τ))_{α(τ)→α(0)} ∈ T_{α(0)}(S). (Fig. 4: Examples of computing shape velocities α̇(τ) in the shape space. As earlier, the computations are performed in the space S but displayed, for convenience, in the curve space.) ...
This paper develops a generative statistical model for representing, modeling, and comparing the morphological evolution of biological cells undergoing motility. It uses the elastic shape analysis to separate cell kinematics (overall location, rotation, speed, etc.) from its morphology and represents morphological changes using transported square-root vector fields (TSRVFs). This TSRVF representation, followed by a PCA-based dimension reduction, provides a convenient mathematical representation of a shape sequence in the form of a Euclidean time series. Fitting a vector auto-regressive (VAR) model to this TSRVF-PCA time series leads to statistical modeling of the overall shape dynamics. We use the parameters of the fitted VAR model to characterize morphological evolution. We validate VAR models through model comparisons, synthesis, and sequence classifications. For classification, we use the VAR parameters in conjunction with different classifiers: SVM, Random Forest, and CNN, and obtain high classification rates. Extensive experiments presented here demonstrate the success of the proposed pipeline. These results are the first of the kind in classifying cell migration videos using shape dynamics.
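The VAR modeling step in the pipeline above operates on an ordinary Euclidean time series (the TSRVF-PCA coefficients), so a first-order version can be fit by plain least squares. The sketch below is a generic VAR(1) fit and simulator, not the paper's code:

```python
import numpy as np

def fit_var1(X):
    # Least-squares fit of x_{t+1} = A x_t + b + noise, for X of shape (T, d)
    past, future = X[:-1], X[1:]
    Z = np.hstack([past, np.ones((len(past), 1))])  # append intercept column
    coef, *_ = np.linalg.lstsq(Z, future, rcond=None)
    A, b = coef[:-1].T, coef[-1]
    return A, b

def simulate_var1(A, b, x0, T, rng=None, noise=0.0):
    # Roll the recursion forward with optional Gaussian innovations
    rng = np.random.default_rng(rng)
    xs = [np.asarray(x0, float)]
    for _ in range(T - 1):
        xs.append(A @ xs[-1] + b + noise * rng.standard_normal(len(b)))
    return np.array(xs)
```

Fitting a simulated stable trajectory and checking that one-step predictions match the observed series up to the injected noise level is a quick way to validate the fit.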
... This is accomplished using parallel transport. Definition 1 (Transported Square-Root Velocity Field, TSRVF) [12]: For a shape sequence α : I → S, define its transported square-root velocity field (TSRVF) as F_α(τ) = (α̇(τ))_{α(τ)→α(0)} ∈ T_{α(0)}(S), where α(τ) → α(0) denotes the parallel transport of α̇(τ) from α(τ) to α(0) along the path α. Similarly, define the integrated TSRVF (I-TSRVF) as H_α(τ) = ∫₀^τ F_α(s) ds. ...
... A. Future work
• We have treated the autocovariance matrices of the time series (pixels) as stationary, though more information may be obtained by capturing dynamics via covariance trajectories in the space of SPD matrices; see, e.g., [30].
• Due to the one-to-one mapping of any SPD matrix to a real vector via a differentiable transformation, one can endow the space of SPD matrices with any distribution on multivariate real numbers, such as the Gaussian distribution, and obtain a distribution over the SPD matrix via a change-of-variables construction. ...
We state theoretical properties for k-means clustering of Symmetric Positive Definite (SPD) matrices, in a non-Euclidean space, that provides a natural and favourable representation of these data. We then provide a novel application for this method, to time-series clustering of pixels in a sequence of Synthetic Aperture Radar images, via their finite-lag autocovariance matrices.
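One common way to realize k-means for SPD matrices in a non-Euclidean geometry is the log-Euclidean embedding, under which Fréchet means reduce to arithmetic means of matrix logarithms. The paper's specific metric and algorithm may differ; the Lloyd-style sketch below is illustrative only:

```python
import numpy as np

def logm_spd(S):
    # Matrix logarithm of an SPD matrix via its eigendecomposition
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def kmeans_spd(mats, k, iters=100, seed=0):
    # Lloyd's algorithm on vectorized matrix logs (log-Euclidean metric):
    # cluster means in the log domain are plain arithmetic means.
    rng = np.random.default_rng(seed)
    X = np.array([logm_spd(S).ravel() for S in mats])
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.full(len(X), -1)
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        new_labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(new_labels == j):  # keep old center if a cluster empties
                centers[j] = X[new_labels == j].mean(axis=0)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```

On well-separated scale clusters (e.g. matrices near I versus near 50·I) the log embedding separates the groups cleanly.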
... The difficulty of implementing this method depends on the particular manifold M. See [62] for applications of this method to curves in the sphere S², and [63] for applications to curves in the space of positive definite symmetric matrices. ...
This chapter reviews some past and recent developments in shape comparison and analysis of curves based on the computation of intrinsic Riemannian metrics on the space of curves modulo shape-preserving transformations. We summarize the general construction and theoretical properties of quotient elastic metrics for Euclidean as well as non-Euclidean curves before considering the special case of the square root velocity metric for which the expression of the resulting distance simplifies through a particular transformation. We then examine different numerical approaches that have been proposed to estimate such distances in practice and in particular to quotient out curve reparametrization in the resulting minimization problems.
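The square-root velocity transformation referred to above, q(t) = c′(t)/√|c′(t)|, can be sketched for discretely sampled Euclidean curves. The full elastic distance additionally minimizes over reparametrizations (e.g. by dynamic programming), which this illustrative sketch omits:

```python
import numpy as np

def srv_transform(curve, ts):
    # Square-root velocity transform q(t) = c'(t) / sqrt(|c'(t)|)
    vel = np.gradient(curve, ts, axis=0)
    speed = np.linalg.norm(vel, axis=1)
    return vel / np.sqrt(np.maximum(speed, 1e-12))[:, None]

def srv_distance(c1, c2, ts):
    # L2 distance between SRV representations; no reparametrization search
    q1, q2 = srv_transform(c1, ts), srv_transform(c2, ts)
    diff2 = ((q1 - q2) ** 2).sum(axis=1)
    dt = np.diff(ts)
    return np.sqrt((0.5 * (diff2[1:] + diff2[:-1]) * dt).sum())  # trapezoid rule
```

For two unit-speed straight lines along orthogonal directions, the SRVs are constant orthonormal vectors, so the distance over [0, 1] is √2, a convenient sanity check.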