Keywords

self-interlacing polynomials, totally nonnegative matrices, tridiagonal matrices, anti-bidiagonal matrices, oscillatory matrices

Abstract

An $n\times n$ matrix is said to have a self-interlacing spectrum if its eigenvalues $\lambda_k$, $k=1,\ldots,n$, are distributed as follows: $$ \lambda_1>-\lambda_2>\lambda_3>\cdots>(-1)^{n-1}\lambda_n>0. $$ A method for constructing sign-definite matrices with self-interlacing spectrum from totally nonnegative ones is presented. This method is applied to bidiagonal and tridiagonal matrices. In particular, a result by O. Holtz on the spectrum of real symmetric anti-bidiagonal matrices with positive entries is generalized.
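The defining inequality chain can be verified numerically: setting $s_k=(-1)^{k-1}\lambda_k$, the condition is equivalent to $s_1>s_2>\cdots>s_n>0$. The following sketch (the function name and the sample spectra are ours, not taken from the paper) implements that check:

```python
def is_self_interlacing(eigs):
    """Check whether eigenvalues lambda_1, ..., lambda_n satisfy
    lambda_1 > -lambda_2 > lambda_3 > ... > (-1)^(n-1) lambda_n > 0.
    Equivalently, s_k = (-1)^(k-1) * lambda_k must be a strictly
    decreasing positive sequence."""
    s = [(-1) ** k * lam for k, lam in enumerate(eigs)]  # k is 0-based
    return all(a > b for a, b in zip(s, s[1:])) and s[-1] > 0

# Illustration: 3, -2, 1 is self-interlacing (3 > 2 > 1 > 0),
# while 3, -2, 2.5 is not (the chain 3 > 2 > 2.5 fails).
print(is_self_interlacing([3, -2, 1]))    # True
print(is_self_interlacing([3, -2, 2.5]))  # False
```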

abs_vol32_pp51-57.pdf (85 kB)