In linear algebra, the Frobenius companion matrix of the monic polynomial

$$p(x) = c_0 + c_1 x + \cdots + c_{n-1} x^{n-1} + x^n$$

is the square matrix defined as

$$C(p) = \begin{bmatrix}
0 & 0 & \cdots & 0 & -c_0 \\
1 & 0 & \cdots & 0 & -c_1 \\
0 & 1 & \cdots & 0 & -c_2 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \cdots & 1 & -c_{n-1}
\end{bmatrix}.$$

Some authors use the transpose of this matrix, $C(p)^T$, which is more convenient for some purposes such as linear recurrence relations (see below).

$C(p)$ is defined from the coefficients of $p(x)$, while the characteristic polynomial as well as the minimal polynomial of $C(p)$ are equal to $p(x)$.[1] In this sense, the matrix $C(p)$ and the polynomial $p(x)$ are "companions".
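As a brief numerical illustration (a sketch using NumPy, with the arbitrarily chosen cubic $p(x) = x^3 - 2x^2 - 5x + 6$ as an assumed example), the companion matrix can be assembled directly from the coefficients, and its characteristic polynomial recovers $p(x)$:

```python
import numpy as np

# p(x) = x^3 - 2x^2 - 5x + 6, so (c_0, c_1, c_2) = (6, -5, -2)
c = np.array([6.0, -5.0, -2.0])
n = len(c)

C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)   # 1's on the subdiagonal
C[:, -1] = -c                # last column holds -c_0, ..., -c_{n-1}

# np.poly returns the characteristic polynomial of a square matrix,
# highest-degree coefficient first; here it reproduces p(x).
print(np.poly(C))            # [ 1. -2. -5.  6.]
```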

Similarity to companion matrix

Any matrix A with entries in a field F has characteristic polynomial $p(x) = \det(xI - A)$, which in turn has companion matrix $C(p)$. These matrices are related as follows.

The following statements are equivalent:

  • A is similar over F to $C(p)$, i.e. A can be conjugated to its companion matrix by matrices in $\mathrm{GL}_n(F)$;
  • the characteristic polynomial $p(x)$ coincides with the minimal polynomial of A, i.e. the minimal polynomial has degree n;
  • the linear mapping $A : F^n \to F^n$ makes $F^n$ a cyclic $F[A]$-module, having a basis of the form $\{v, Av, \ldots, A^{n-1}v\}$; or equivalently $F^n \cong F[x]/(p(x))$ as $F[x]$-modules.

If the above hold, one says that A is non-derogatory.

Not every square matrix is similar to a companion matrix, but every square matrix is similar to a block diagonal matrix made of companion matrices. If we also demand that the polynomial of each diagonal block divides the next one, they are uniquely determined by A, and this gives the rational canonical form of A.
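The third condition above suggests a simple numerical check for whether a given matrix is non-derogatory: look for a cyclic vector. The sketch below (NumPy, with a hypothetical helper name) relies on the fact that when A is non-derogatory a randomly chosen vector is cyclic with probability 1; it is a floating-point heuristic rather than an exact algebraic test.

```python
import numpy as np

def is_probably_non_derogatory(A, trials=5, tol=1e-9):
    """Heuristic: A is non-derogatory iff some b is cyclic, i.e.
    b, Ab, ..., A^(n-1)b span the whole space.  A few random b's suffice."""
    n = A.shape[0]
    for _ in range(trials):
        b = np.random.randn(n)
        K = np.column_stack([np.linalg.matrix_power(A, k) @ b for k in range(n)])
        if np.linalg.matrix_rank(K, tol=tol) == n:
            return True
    return False

C = np.array([[0.0, 0.0, -6.0],    # a companion matrix is always non-derogatory
              [1.0, 0.0,  5.0],
              [0.0, 1.0,  2.0]])
print(is_probably_non_derogatory(C))          # True
print(is_probably_non_derogatory(np.eye(3)))  # False: the identity is derogatory for n > 1
```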

Diagonalizability

The roots of the characteristic polynomial $p(x)$ are the eigenvalues of $C(p)$. If there are n distinct eigenvalues $\lambda_1, \ldots, \lambda_n$, then $C(p)$ is diagonalizable as $C(p) = V^{-1} D V$, where D is the diagonal matrix and V is the Vandermonde matrix corresponding to the λ's:

$$D = \begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{bmatrix}, \qquad
V = \begin{bmatrix}
1 & \lambda_1 & \lambda_1^2 & \cdots & \lambda_1^{n-1} \\
1 & \lambda_2 & \lambda_2^2 & \cdots & \lambda_2^{n-1} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & \lambda_n & \lambda_n^2 & \cdots & \lambda_n^{n-1}
\end{bmatrix}.$$

Indeed, an easy computation shows that the transpose $C(p)^T$ has eigenvectors $v_i = (1, \lambda_i, \lambda_i^2, \ldots, \lambda_i^{n-1})$ with $C(p)^T v_i = \lambda_i v_i$, which follows from $p(\lambda_i) = 0$. Thus, its diagonalizing change-of-basis matrix is $V^T = [v_1 \; v_2 \; \cdots \; v_n]$, meaning $C(p)^T V^T = V^T D$, and taking the transpose of both sides gives $V\,C(p) = D\,V$, i.e. $C(p) = V^{-1} D V$.

We can read off the eigenvectors of $C(p)$ with $C(p)\,w_i = \lambda_i w_i$ from the equation $C(p)\,V^{-1} = V^{-1} D$: they are the column vectors of the inverse Vandermonde matrix $V^{-1}$. This matrix is known explicitly, giving the eigenvectors $w_i$, with coordinates equal to the coefficients of the Lagrange polynomials

$$L_i(x) = \prod_{j \ne i} \frac{x - \lambda_j}{\lambda_i - \lambda_j} = \frac{p(x)}{(x - \lambda_i)\, p'(\lambda_i)}.$$

Alternatively, the scaled eigenvectors $\tilde w_i = p'(\lambda_i)\, w_i$ have simpler coefficients, namely those of $p(x)/(x - \lambda_i)$.
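A quick numerical sketch (NumPy, using the cubic $p(x) = (x-1)(x-2)(x-3)$ as an assumed example) confirms both the relation $V\,C(p) = D\,V$ and the fact that the columns of $V^{-1}$ are eigenvectors of $C(p)$:

```python
import numpy as np

# p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3), so (c_0, c_1, c_2) = (-6, 11, -6)
c = np.array([-6.0, 11.0, -6.0])
n = len(c)
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)
C[:, -1] = -c

lam = np.array([1.0, 2.0, 3.0])            # the distinct roots
V = np.vander(lam, increasing=True)        # rows (1, λ_i, λ_i^2)
D = np.diag(lam)

print(np.allclose(V @ C, D @ V))           # True: V C(p) = D V
W = np.linalg.inv(V)                       # columns are eigenvectors of C(p)
print(np.allclose(C @ W, W @ D))           # True: C(p) w_i = λ_i w_i
```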

If $p(x)$ has multiple roots, then $C(p)$ is not diagonalizable. Rather, the Jordan canonical form of $C(p)$ contains one diagonal block for each distinct root, an m × m block with $\lambda$ on the diagonal if the root $\lambda$ has multiplicity m.
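For instance, a short SymPy sketch (with the repeated-root cubic $p(x) = (x-1)^2(x-2)$ chosen as an assumed example) exhibits the single 2 × 2 Jordan block attached to the double root:

```python
import sympy as sp

# Companion matrix of p(x) = (x-1)^2 (x-2) = x^3 - 4x^2 + 5x - 2
C = sp.Matrix([[0, 0,  2],
               [1, 0, -5],
               [0, 1,  4]])

P, J = C.jordan_form()       # C = P J P^{-1}
sp.pprint(J)                 # one 2x2 block for the double root 1, one 1x1 block for 2
```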

Linear recursive sequences

A linear recursive sequence defined by $a_k = -c_0 a_{k-n} - c_1 a_{k-n+1} - \cdots - c_{n-1} a_{k-1}$ for $k \ge n$ has the characteristic polynomial $p(x) = c_0 + c_1 x + \cdots + c_{n-1} x^{n-1} + x^n$, whose transpose companion matrix $C(p)^T$ generates the sequence:

$$\begin{bmatrix} a_{k+1} \\ a_{k+2} \\ \vdots \\ a_{k+n} \end{bmatrix}
= C(p)^T \begin{bmatrix} a_k \\ a_{k+1} \\ \vdots \\ a_{k+n-1} \end{bmatrix}.$$

The vector $v = (1, \lambda, \lambda^2, \ldots, \lambda^{n-1})$ is an eigenvector of this matrix, where the eigenvalue $\lambda$ is a root of $p(x)$. Setting the initial values of the sequence equal to this vector produces a geometric sequence $a_k = \lambda^k$ which satisfies the recurrence. In the case of n distinct eigenvalues, an arbitrary solution $a_k$ can be written as a linear combination of such geometric solutions, and the eigenvalues of largest complex norm give an asymptotic approximation.
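As an illustration (a NumPy sketch with the Fibonacci recurrence $a_k = a_{k-1} + a_{k-2}$ taken as the example), repeatedly multiplying the state vector by the transpose companion matrix advances the sequence:

```python
import numpy as np

# Fibonacci: a_k = a_{k-1} + a_{k-2}, characteristic polynomial x^2 - x - 1,
# so (c_0, c_1) = (-1, -1).
c = np.array([-1.0, -1.0])
n = len(c)
CT = np.zeros((n, n))
CT[:-1, 1:] = np.eye(n - 1)   # shift part of the transpose companion matrix
CT[-1, :] = -c                # last row holds -c_0, ..., -c_{n-1}

state = np.array([0.0, 1.0])  # (a_0, a_1)
for _ in range(10):
    state = CT @ state        # (a_k, a_{k+1}) -> (a_{k+1}, a_{k+2})
print(state)                  # [55. 89.] = (a_10, a_11)
```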

From linear ODE to first-order linear ODE system

Similarly to the above case of linear recursions, consider a homogeneous linear ODE of order n for the scalar function $y = y(t)$:

$$y^{(n)} + c_{n-1} y^{(n-1)} + \cdots + c_1 y' + c_0 y = 0.$$

This can be equivalently described as a coupled system of homogeneous linear ODEs of order 1 for the vector function $z(t) = (y(t), y'(t), \ldots, y^{(n-1)}(t))$:

$$z' = C(p)^T z,$$

where $C(p)^T$ is the transpose companion matrix for the characteristic polynomial

$$p(x) = x^n + c_{n-1} x^{n-1} + \cdots + c_1 x + c_0.$$

Here the coefficients $c_i = c_i(t)$ may also be functions, not just constants.

If $C(p)^T$ is diagonalizable, then a diagonalizing change of basis will transform this into a decoupled system equivalent to one scalar homogeneous first-order linear ODE in each coordinate.
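For constant coefficients, this decoupling amounts to the matrix exponential solution $z(t) = e^{C(p)^T t} z(0)$. A minimal sketch (NumPy/SciPy, with the assumed example $y'' + 3y' + 2y = 0$, $y(0) = 1$, $y'(0) = 0$):

```python
import numpy as np
from scipy.linalg import expm

# y'' + 3y' + 2y = 0, i.e. p(x) = x^2 + 3x + 2, with state z = (y, y')
CT = np.array([[ 0.0,  1.0],
               [-2.0, -3.0]])       # transpose companion matrix
z0 = np.array([1.0, 0.0])           # y(0) = 1, y'(0) = 0

t = 1.0
z = expm(CT * t) @ z0               # z(t) = exp(C^T t) z(0) for constant coefficients
print(z[0])                         # y(1); the exact value is 2e^{-1} - e^{-2} ≈ 0.6004
```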

An inhomogeneous equation

$$y^{(n)} + c_{n-1} y^{(n-1)} + \cdots + c_1 y' + c_0 y = f(t)$$

is equivalent to the system:

$$z' = C(p)^T z + F(t)$$

with the inhomogeneity term $F(t) = (0, \ldots, 0, f(t))$.

Again, a diagonalizing change of basis will transform this into a decoupled system of scalar inhomogeneous first-order linear ODEs.
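The same reformulation is also how such equations are commonly handed to a numerical integrator. A sketch (SciPy's solve_ivp, with the assumed example $y'' + 3y' + 2y = \sin t$):

```python
import numpy as np
from scipy.integrate import solve_ivp

# y'' + 3y' + 2y = sin(t), written as z' = C(p)^T z + F(t) with z = (y, y')
CT = np.array([[ 0.0,  1.0],
               [-2.0, -3.0]])

def rhs(t, z):
    return CT @ z + np.array([0.0, np.sin(t)])   # F(t) = (0, sin t)

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0])    # y(0) = 1, y'(0) = 0
print(sol.y[0, -1])                              # numerical approximation of y(10)
```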

Cyclic shift matrix

In the case of $p(x) = x^n - 1$, when the eigenvalues are the complex roots of unity, the companion matrix and its transpose both reduce to Sylvester's cyclic shift matrix, a circulant matrix.
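A two-line NumPy check (with n = 4 as an assumed example) shows that this companion matrix is a cyclic permutation of the identity and has the n-th roots of unity as eigenvalues:

```python
import numpy as np

n = 4
# Companion matrix of p(x) = x^n - 1: c_0 = -1 and all other c_i = 0.
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)
C[0, -1] = 1.0                     # last column is (-c_0, ..., -c_{n-1}) = (1, 0, ..., 0)

print(np.allclose(C, np.roll(np.eye(n), 1, axis=0)))  # True: a cyclic shift of the identity
print(np.linalg.eigvals(C))                           # the n-th roots of unity
```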

Multiplication map on a simple field extension

Consider a polynomial $p(x) = x^n + c_{n-1} x^{n-1} + \cdots + c_1 x + c_0$ with coefficients in a field $F$, and suppose $p(x)$ is irreducible in the polynomial ring $F[x]$. Then adjoining a root $\lambda$ of $p(x)$ produces a field extension $K = F(\lambda) \cong F[x]/(p(x))$, which is also a vector space over $F$ with standard basis $\{1, \lambda, \lambda^2, \ldots, \lambda^{n-1}\}$. The $F$-linear multiplication mapping

$$m_\lambda : K \to K \qquad \text{defined by} \qquad m_\lambda(\alpha) = \lambda\alpha$$

has an n × n matrix $[m_\lambda]$ with respect to the standard basis. Since $m_\lambda(\lambda^i) = \lambda^{i+1}$ for $i < n-1$ and $m_\lambda(\lambda^{n-1}) = \lambda^n = -c_0 - c_1\lambda - \cdots - c_{n-1}\lambda^{n-1}$, this is the companion matrix of $p(x)$:

$$[m_\lambda] = C(p).$$
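A short SymPy sketch (using $p(x) = x^3 - x - 1$ over the rationals as an assumed example) builds the matrix of multiplication by $\lambda$ column by column, reducing $x \cdot x^j$ modulo $p(x)$, and recovers the companion matrix:

```python
import sympy as sp

x = sp.symbols('x')
p = x**3 - x - 1          # irreducible over the rationals
n = 3

# Column j of [m_λ] holds the coefficients of x * x^j reduced modulo p(x),
# in the basis 1, λ, λ^2 of Q[x]/(p).
cols = []
for j in range(n):
    r = sp.expand(sp.rem(x * x**j, p, x))
    cols.append([r.coeff(x, i) for i in range(n)])
M = sp.Matrix(n, n, lambda i, j: cols[j][i])

sp.pprint(M)                                   # the companion matrix C(p)
print(sp.expand(M.charpoly(x).as_expr()))      # x**3 - x - 1
```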
Assuming this extension is separable (for example if $F$ has characteristic zero or is a finite field), $p(x)$ has distinct roots $\lambda_1, \ldots, \lambda_n$ with $\lambda_1 = \lambda$, so that

$$p(x) = (x - \lambda_1)(x - \lambda_2) \cdots (x - \lambda_n),$$

and it has splitting field $L = F(\lambda_1, \ldots, \lambda_n)$. Now $m_\lambda$ is not diagonalizable over $F$; rather, we must extend it to an $L$-linear map on $L \otimes_F K$, a vector space over $L$ with standard basis $\{1 \otimes 1,\, 1 \otimes \lambda,\, 1 \otimes \lambda^2, \ldots, 1 \otimes \lambda^{n-1}\}$, containing the vectors $w_i = (1, \lambda_i, \lambda_i^2, \ldots, \lambda_i^{n-1})$. The extended mapping is defined by $m_\lambda(\beta \otimes \alpha) = \beta \otimes \lambda\alpha$.

The matrix $[m_\lambda] = C(p)$ is unchanged, but as above, it can be diagonalized by matrices with entries in $L$:

$$C(p) = V^{-1} D V,$$

for the diagonal matrix $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ and the Vandermonde matrix V corresponding to $\lambda_1, \ldots, \lambda_n \in L$. The explicit formula for the eigenvectors (the scaled column vectors of the inverse Vandermonde matrix $V^{-1}$) can be written as:

$$\tilde w_i = (\beta_0, \beta_1, \ldots, \beta_{n-1}),$$

where $\beta_0, \ldots, \beta_{n-1} \in L$ are the coefficients of the scaled Lagrange polynomial

$$\frac{p(x)}{x - \lambda_i} = \beta_0 + \beta_1 x + \cdots + \beta_{n-1} x^{n-1} = \prod_{j \ne i} (x - \lambda_j).$$


Notes

  1. Horn, Roger A.; Johnson, Charles R. (1985). Matrix Analysis. Cambridge, UK: Cambridge University Press. pp. 146–147. ISBN 0-521-30586-1.