# Javier Perez Alvaro

University of Montana | UMT · Department of Mathematical Sciences

## About

- Publications: 22
- Reads: 2,939 (a 'read' is counted each time someone views a publication summary, such as the title, abstract, and list of authors; clicks on a figure; or views or downloads the full text)
- Citations: 115

### Additional affiliations

- September 2015 - March 2016
- September 2010 - July 2015

## Publications


Zeros of rational transfer function matrices $R(\lambda)$ are the eigenvalues of associated polynomial system matrices $P(\lambda)$, under minimality conditions. In this paper we define a structured condition number for a simple eigenvalue $\lambda_0$ of a (locally) minimal polynomial system matrix $P(\lambda)$, which in turn is a simple zero $\la...
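The system-matrix viewpoint can be illustrated in the simplest constant-coefficient case, where $A(\lambda) = \lambda I - A$ is a linear pencil and $R(\lambda) = D + C(\lambda I - A)^{-1}B$ is an ordinary transfer function. A minimal numerical sketch (the matrices below are made up for illustration; the paper's setting is the more general polynomial one):

```python
import numpy as np
from scipy.linalg import eig

# Toy transfer function R(lam) = D + C (lam*I - A)^{-1} B with made-up data.
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
B = np.array([[1.0],
              [1.0]])
C = np.array([[1.0, 1.0]])
D = np.array([[1.0]])

# Rosenbrock system matrix P(lam) = [[lam*I - A, B], [-C, D]].
# Since det P(lam) = det(lam*I - A) * det R(lam), the zeros of R(lam)
# appear among the generalized eigenvalues of the pencil lam*E - F.
n = A.shape[0]
E = np.block([[np.eye(n), np.zeros((n, 1))],
              [np.zeros((1, n)), np.zeros((1, 1))]])
F = np.block([[A, -B],
              [C, -D]])

vals = eig(F, E, right=False)
finite = np.sort(vals[np.isfinite(vals)].real)
# Here R(lam) = 1 + 1/(lam-1) + 1/(lam-2), whose zeros solve lam^2 - lam - 1 = 0.
print(finite)
```

The two finite eigenvalues of the pencil are the zeros of $R(\lambda)$; the third eigenvalue is at infinity because the top-left block of $E$ carries all the dynamics.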

We construct a new family of linearizations of rational matrices R(λ) written in the general form R(λ)=D(λ)+C(λ)A(λ)−1B(λ), where D(λ), C(λ), B(λ) and A(λ) are polynomial matrices. Such a representation always exists and is not unique. The new linearizations are constructed from linearizations of the polynomial matrices D(λ) and A(λ), where each of t...

In the literature it is common to use the first and last pencils $D_1(\lambda, P)$ and $D_k(\lambda, P)$ in the "standard basis" for the vector space $\mathbb{DL}(P)$ of block-symmetric pencils to solve the symmetric/Hermitian polynomial eigenvalue problem $P(\lambda)x=0$. When the polynomial $P(\lambda)$ has odd degree, it was proven in recent y...
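For a symmetric quadratic $P(\lambda) = \lambda^2 A + \lambda B + C$, the first pencil $D_1(\lambda, P)$ has the explicit block-symmetric form $\lambda\,\mathrm{diag}(A, -C) + \begin{bmatrix} B & C \\ C & 0\end{bmatrix}$. A small sketch, with random symmetric coefficients and assuming the generic case where $A$ and $C$ are nonsingular so that $D_1$ is a linearization, checking that its eigenvalues agree with those of a companion pencil:

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 3

def sym(M):
    return (M + M.T) / 2

A, B, C = (sym(rng.standard_normal((n, n))) for _ in range(3))

# Block-symmetric pencil D_1(lam, P) = lam*X + Y for P(lam) = lam^2 A + lam B + C.
# X and Y inherit symmetry from A, B, C.
X = np.block([[A, np.zeros((n, n))], [np.zeros((n, n)), -C]])
Y = np.block([[B, C], [C, np.zeros((n, n))]])

# Eigenvalues of the pencil: det(lam*X + Y) = 0  <=>  Y v = -lam X v
pencil_vals = eig(Y, -X, right=False)

# Reference: eigenvalues via the (unstructured) first Frobenius companion pencil
Ec = np.block([[A, np.zeros((n, n))], [np.zeros((n, n)), np.eye(n)]])
Fc = np.block([[-B, -C], [np.eye(n), np.zeros((n, n))]])
ref_vals = eig(Fc, Ec, right=False)

# Match each D_1 eigenvalue to its nearest companion eigenvalue
max_mismatch = max(np.min(np.abs(ref_vals - v)) for v in pencil_vals)
print(max_mismatch)
```

Both pencils are linearizations of the same $P(\lambda)$, so their finite eigenvalues coincide up to rounding; the advantage of $D_1$ is that it preserves the block symmetry.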

In the framework of Polynomial Eigenvalue Problems (PEPs), most of the matrix polynomials arising in applications are structured polynomials (namely, (skew-)symmetric, (skew-)Hermitian, (anti-)palindromic, or alternating). The standard way to solve PEPs is by means of linearizations. The most frequently used linearizations belong to general constru...

One strategy to solve a nonlinear eigenvalue problem $T(\lambda)x=0$ is to solve a polynomial eigenvalue problem (PEP) $P(\lambda)x=0$ that approximates the original problem through interpolation. Then, this PEP is usually solved by linearization. Because of the polynomial approximation techniques, in this context, $P(\lambda)$ is expressed in a no...

In the framework of Polynomial Eigenvalue Problems, most of the matrix polynomials arising in applications are structured polynomials (namely (skew-)symmetric, (skew-)Hermitian, (anti-)palindromic, or alternating). The standard way to solve Polynomial Eigenvalue Problems is by means of linearizations. The most frequently used linearizations belong...

We construct a new family of linearizations of rational matrices $R(\lambda)$ written in the general form $R(\lambda)= D(\lambda)+C(\lambda)A(\lambda)^{-1}B(\lambda)$, where $D(\lambda)$, $C(\lambda)$, $B(\lambda)$ and $A(\lambda)$ are polynomial matrices. Such a representation always exists and is not unique. The new linearizations are constructed...

One strategy to solve a nonlinear eigenvalue problem $T(\lambda)x=0$ is to solve a polynomial eigenvalue problem (PEP) $P(\lambda)x=0$ that approximates the original problem through interpolation. Then, this PEP is usually solved by linearization. Most of the literature about linearizations assumes that $P(\lambda)$ is expressed in the monomial bas...

In the last decade, there has been a continued effort to produce families of strong linearizations of a matrix polynomial $P(\lambda)$, regular and singular, with good properties. As a consequence of this research, families such as the family of Fiedler pencils, the family of generalized Fiedler pencils (GFP), the family of Fiedler pencils with rep...

In the last decade, there has been a continued effort to produce families of strong linearizations of a matrix polynomial $P(\lambda)$, regular and singular, with good properties, such as being companion forms, allowing the recovery of eigenvectors of a regular $P(\lambda)$ in an easy way, allowing the computation of the minimal indices of a singu...

The standard approach for finding eigenvalues and eigenvectors of matrix polynomials starts by embedding the coefficients of the polynomial into a matrix pencil, known as linearization. Building on the pioneering work of Nakatsukasa and Tisseur, we present error bounds for the computed eigenvectors of matrix polynomials. Our error bounds are applic...
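The embedding step described here can be made concrete for a quadratic $P(\lambda) = \lambda^2 A_2 + \lambda A_1 + A_0$ via the first Frobenius companion pencil: eigenvectors of the pencil carry the Kronecker structure $(\lambda x;\, x)$, so eigenvectors of $P$ are read off from the trailing block. A sketch of that basic recovery (not the paper's error-bound machinery, just the computation it analyzes), with made-up random coefficients:

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(1)
n = 4
A2, A1, A0 = (rng.standard_normal((n, n)) for _ in range(3))

# First Frobenius companion pencil lam*E - F of P(lam) = lam^2*A2 + lam*A1 + A0
E = np.block([[A2, np.zeros((n, n))],
              [np.zeros((n, n)), np.eye(n)]])
F = np.block([[-A1, -A0],
              [np.eye(n), np.zeros((n, n))]])

vals, vecs = eig(F, E)

# Pencil eigenvectors have the structure (lam*x; x): the eigenvector x of
# P(lam) is recovered from the trailing n entries, then checked by residual.
resids = []
for lam, v in zip(vals, vecs.T):
    if not np.isfinite(lam):
        continue  # skip eigenvalues at infinity
    x = v[n:] / np.linalg.norm(v[n:])
    resids.append(np.linalg.norm((lam**2 * A2 + lam * A1 + A0) @ x))
max_resid = max(resids)
print(max_resid)
```

A small residual $\|P(\lambda)x\|$ for every recovered pair confirms that the generalized eigenproblem for the pencil solves the original polynomial eigenproblem.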

A strong $\ell$-ification of a matrix polynomial $P(\lambda)=\sum A_i\lambda^i$ of degree $d$ is a matrix polynomial $\mathcal{L}(\lambda)$ of degree $\ell$ having the same finite and infinite elementary divisors, and the same numbers of left and right minimal indices as $P(\lambda)$. Strong $\ell$-ifications can be used to transform the polynomial...

We present a method for solving nonlinear eigenvalue problems using rational approximation. The method uses the AAA method by Nakatsukasa, Sète, and Trefethen to approximate the nonlinear eigenvalue problem by a rational eigenvalue problem and is embedded in the state space representation of a rational polynomial by Su and Bai. The advantage of...

We revisit the numerical stability of the two-level orthogonal Arnoldi (TOAR) method for computing an orthonormal basis of a second-order Krylov subspace associated with two given matrices. We show that the computed basis is close (in a certain subspace metric sense) to a basis for a second-order Krylov subspace associated with nearby coefficient ma...

Polynomial eigenvalue problems with structured matrix polynomials arise in many applications. The standard way to solve polynomial eigenvalue problems is through the classical Frobenius companion linearizations, which may not retain the structure of the matrix polynomial. In particular, the structure of symmetric matrix polynomials can be lost,...

Computing the roots of a scalar polynomial, or the eigenvalues of a matrix polynomial, expressed in the Chebyshev basis $\{T_k(x)\}$ is a fundamental problem that arises in many applications. In this work, we analyze the backward stability of the polynomial rootfinding problem solved with colleague matrices. In other words, given a scalar polynomial p(x...
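The colleague-matrix construction referred to here is classical: for $p(x) = \sum_k a_k T_k(x)$ it builds a tridiagonal-plus-last-row matrix from the Chebyshev recurrence $x\,T_k = (T_{k-1} + T_{k+1})/2$, whose eigenvalues are the roots of $p$. A sketch of the unbalanced textbook form, checked against NumPy's Chebyshev rootfinder (the polynomial below is made up):

```python
import numpy as np

def colleague(a):
    """Colleague matrix of p(x) = sum a[k]*T_k(x), with a[-1] != 0 and deg >= 2.
    Its eigenvalues are the roots of p (classic, non-balanced form)."""
    a = np.asarray(a, dtype=float)
    n = len(a) - 1
    Cm = np.zeros((n, n))
    Cm[0, 1] = 1.0                    # x*T_0 = T_1
    for k in range(1, n - 1):
        Cm[k, k - 1] = 0.5            # x*T_k = (T_{k-1} + T_{k+1}) / 2
        Cm[k, k + 1] = 0.5
    Cm[n - 1, :] = -a[:n] / (2 * a[n])  # last row eliminates T_n via p = 0
    Cm[n - 1, n - 2] += 0.5
    return Cm

# p(x) = T_3(x) - 0.5*T_1(x) + 0.25*T_0(x)  (coefficients lowest degree first)
a = [0.25, -0.5, 0.0, 1.0]
roots = np.sort(np.linalg.eigvals(colleague(a)).real)
ref = np.sort(np.polynomial.chebyshev.chebroots(a).real)
print(np.allclose(roots, ref))
```

Working directly in the Chebyshev basis avoids the ill-conditioned conversion to monomial coefficients, which is the point of the backward stability analysis in the abstract above.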

The standard way of solving the polynomial eigenvalue problem associated with a matrix polynomial is to embed the matrix polynomial into a matrix pencil, transforming the problem into an equivalent generalized eigenvalue problem. Such pencils are known as linearizations. Many of the families of linearizations for matrix polynomials available in the...

Many applications give rise to structured matrix polynomials. The problem of constructing structure-preserving strong linearizations of structured matrix polynomials is revisited in this work and in the forthcoming ones \cite{PartII,PartIII}. With the purpose of providing a much simpler framework for structure-preserving linearizations for symmetri...

Fiedler pencils are a family of strong linearizations for polynomials expressed in the monomial basis, that include the classical Frobenius companion pencils as special cases. We generalize the definition of a Fiedler pencil from monomials to a larger class of orthogonal polynomial bases. In particular, we derive Fiedler-comrade pencils for two bas...

The polynomial eigenvalue problem for Hermite interpolation matrix polynomials is discussed. The standard approach to solve a polynomial eigenvalue problem is via linearization. In this work we introduce a new linearization for Hermite interpolation matrix polynomials expressed in the first barycentric form that is more sparse than the ones known s...

Computing roots of scalar polynomials as the eigenvalues of Frobenius companion matrices using backward stable eigenvalue algorithms is a classical approach. The introduction of new families of companion matrices allows for the use of other matrices in the root-finding problem. In this paper, we analyse the backward stability of polynomial root-fin...
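The classical approach mentioned here reduces $p(x)=0$ to an eigenvalue problem for the Frobenius companion matrix of the monic version of $p$, which is what `numpy.roots` does internally. A minimal sketch with a made-up cubic whose roots are known:

```python
import numpy as np

def frobenius_companion(p):
    """Frobenius companion matrix of the polynomial with coefficients p
    (highest degree first, p[0] != 0). Its eigenvalues are the roots."""
    p = np.asarray(p, dtype=float)
    monic = p / p[0]
    n = len(p) - 1
    Cm = np.zeros((n, n))
    Cm[0, :] = -monic[1:]        # first row carries the monic coefficients
    Cm[1:, :-1] = np.eye(n - 1)  # subdiagonal identity shifts the basis
    return Cm

p = [2.0, -3.0, -11.0, 6.0]      # 2x^3 - 3x^2 - 11x + 6 = (x-3)(2x-1)(x+2)
roots = np.sort(np.linalg.eigvals(frobenius_companion(p)).real)
ref = np.sort(np.roots(p).real)
print(np.allclose(roots, ref))
```

Since a backward stable eigensolver is applied to the companion matrix, the computed roots are the exact roots of a nearby matrix; whether they are the roots of a nearby *polynomial* is precisely the backward stability question the abstract raises.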

## Projects

Project (1)