Efficient Evaluation of Matrix Polynomials beyond the Paterson–Stockmeyer Method

Efficient Evaluation of Matrix Polynomials beyond the Paterson–Stockmeyer Method (License CC BY 4.0), J. Sastre, J. Ibáñez, Mathematics 2021, 9(14), 1600; https://doi.org/10.3390/math9141600. ResearchGate link.

Abstract: Two general methods for evaluating matrix polynomials requiring one matrix product fewer than the Paterson–Stockmeyer method were proposed in J. Sastre, Efficient evaluation of matrix polynomials, Linear Algebra and its Applications, where the cost of evaluating a matrix polynomial is measured asymptotically by the total number of matrix products. An analysis of the stability of those methods was given, and the methods were applied to Taylor-based implementations for computing the matrix exponential, the matrix cosine and the matrix hyperbolic tangent. Moreover, a particular example was given for evaluating the degree-15 Taylor approximation of the matrix exponential with four matrix products, whereas the maximum polynomial degree attainable with the Paterson–Stockmeyer method and four matrix products is 9. Based on this example, a new family of methods for evaluating matrix polynomials more efficiently than the Paterson–Stockmeyer method was proposed, with the potential to achieve much higher efficiency, i.e., requiring fewer matrix products to evaluate a matrix polynomial of a given degree, or increasing the attainable degree for the same cost. However, the difficulty of this family of methods lies in calculating the coefficients involved in the evaluation of general matrix polynomials and approximations. In this paper, we provide a general method for evaluating matrix polynomials that requires two matrix products fewer than the Paterson–Stockmeyer method for degrees greater than 30. Moreover, we provide general methods for evaluating matrix polynomial approximations of degrees 15 and 21 with four and five matrix products, respectively, whereas the maximum degrees available for the same cost with the Paterson–Stockmeyer method are 9 and 12, respectively. Finally, practical examples for evaluating Taylor approximations of the matrix cosine and the matrix logarithm accurately and efficiently with these new methods are given.
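For reference, the Paterson–Stockmeyer baseline against which these costs are compared can be sketched in a few lines of NumPy. The sketch below is not the new evaluation family proposed in the paper; it only illustrates the standard block scheme and its product count, and the function name and block-size heuristic are our own choices for illustration.

    import numpy as np

    def paterson_stockmeyer(A, coeffs, s=None):
        """Evaluate p(A) = coeffs[0]*I + coeffs[1]*A + ... + coeffs[n]*A**n
        with the standard Paterson-Stockmeyer block scheme.

        The cost is (s - 1) + floor(n/s) matrix products, minus one when s
        divides n, and is minimized for s close to sqrt(n).
        """
        n = len(coeffs) - 1                      # polynomial degree
        if s is None:
            s = max(1, int(round(np.sqrt(n))))   # heuristic block size
        I = np.eye(A.shape[0], dtype=A.dtype)

        # Powers I, A, A^2, ..., A^s: s - 1 matrix products.
        powers = [I, A]
        for _ in range(2, s + 1):
            powers.append(powers[-1] @ A)
        X = powers[s]                            # X = A^s

        # Coefficient block B_k = sum_{j < s} coeffs[s*k + j] * A^j.
        def block(k):
            return sum(coeffs[s * k + j] * powers[j]
                       for j in range(s) if s * k + j <= n)

        # Horner recursion in X: p(A) = (...(B_r X + B_{r-1}) X + ...) X + B_0.
        r = n // s
        result = block(r)
        for k in range(r - 1, -1, -1):
            if k == r - 1 and n % s == 0:
                # Top block is coeffs[n]*I, so this step only rescales A^s.
                result = coeffs[n] * X + block(k)
            else:
                result = result @ X + block(k)   # one matrix product per step
        return result

With a degree-9 coefficient list and s = 3, the sketch performs two products for A^2 and A^3 and two more in the Horner recursion, matching the four-product count quoted above; reaching degree 15 this way would cost six products, which is the gap the new methods close.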

Efficient evaluation of matrix polynomials

Efficient evaluation of matrix polynomials, Jorge Sastre, Linear Algebra and its Applications, Vol. 539, pp. 229–250, Feb. 2018; submitted Oct. 2016, available online 2017 (an early version was submitted on Feb. 18, 2016, as AMC-S-16-00951.pdf). Preprint.

Abstract: This paper presents a new family of methods for evaluating matrix polynomials more efficiently than the state-of-the-art Paterson–Stockmeyer method. Examples of the application of the methods to the Taylor polynomial approximation of matrix functions such as the matrix exponential and the matrix cosine are given. Their efficiency is compared with that of the best existing evaluation schemes for general polynomial and rational approximations, and also with a recent method based on mixed rational and polynomial approximants. For many years, the Paterson–Stockmeyer method has been considered the most efficient general method for the evaluation of matrix polynomials. In this paper we show that this is no longer true. Moreover, for many years rational approximations have been considered more efficient than polynomial approximations, although it has recently been shown that this is often not the case in the computation of the matrix exponential and the matrix cosine. In this paper we show that, in fact, polynomial approximations provide a higher order of approximation than the state-of-the-art computational methods for rational approximations for the same cost in terms of matrix products. For an early unpublished version of this work, submitted to Appl. Math. Comput. on Feb. 18, 2016, see AMC-S-16-00951.pdf.
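To make the product counts quoted in both abstracts concrete, the short script below evaluates the standard Paterson–Stockmeyer cost formula (see, e.g., Higham, Functions of Matrices, 2008) and reports the largest degree reachable with a given number of matrix products. It is only a sanity check of those figures; the function names are our own and the script is not part of either paper's method.

    import math

    def ps_products(n):
        """Minimum number of matrix products the Paterson-Stockmeyer scheme
        needs for a degree-n matrix polynomial, minimized over the block size s."""
        best = math.inf
        for s in range(1, n + 1):
            best = min(best, (s - 1) + n // s - (1 if n % s == 0 else 0))
        return best

    def max_ps_degree(products):
        """Largest polynomial degree attainable with the given number of products."""
        n = 1
        while ps_products(n + 1) <= products:
            n += 1
        return n

    if __name__ == "__main__":
        for k in (4, 5, 9):
            print(f"{k} products -> max Paterson-Stockmeyer degree {max_ps_degree(k)}")

The script prints 9, 12 and 30; by comparison, the abstracts above report degree 15 with four products, degree 21 with five, and a saving of two products for degrees greater than 30 with the new evaluation methods.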