• SYMSAC
• 1976
Uspensky's 1948 book on the theory of equations presents an algorithm, based on Descartes' rule of signs, for isolating the real roots of a squarefree polynomial with real coefficients. Programmed in SAC-1 and applied to several classes of polynomials with integer coefficients, Uspensky's method proves to be a strong competitor of the recently discovered …
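Descartes' rule of signs, on which the algorithm rests, bounds the number of positive roots of a polynomial by the number of sign variations in its coefficient sequence. A minimal sketch (the function name is illustrative, not from the paper):

```python
def sign_variations(coeffs):
    """Count sign changes in a coefficient sequence (descending powers), zeros skipped."""
    nonzero = [c for c in coeffs if c != 0]
    return sum(1 for a, b in zip(nonzero, nonzero[1:]) if a * b < 0)

# p(x) = x^3 - 3x^2 + 2x = x(x - 1)(x - 2) has exactly two positive roots
print(sign_variations([1, -3, 2, 0]))    # 2 variations: at most 2 positive roots
# p(-x) = -x^3 - 3x^2 - 2x has no sign variations, so p has no negative roots
print(sign_variations([-1, -3, -2, 0]))
```

The rule guarantees that the number of positive roots equals the variation count minus an even number, so a count of 0 or 1 is exact — the fact that Descartes-based isolation methods exploit.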
• Computing
• 2006
Finding an upper bound for the positive roots of univariate polynomials is an important step of the continued fractions real root isolation algorithm. The revived interest in this algorithm has highlighted the need for better estimations of upper bounds of positive roots. In this paper we present a new theorem, based on a generalization of a theorem by D. …
• Given an m×n matrix A, with m ≥ n, the four subspaces associated with it are shown in Fig. 1 (see [1]). Fig. 1. The row spaces and the nullspaces of A and Aᵀ; a_1 through a_n and h_1 through h_m are abbreviations of the alignerframe and hangerframe vectors respectively (see [2]). The Fundamental Theorem of Linear Algebra tells us that N(A) is the …
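Bases for all four subspaces can be read off numerically from the singular-value decomposition; a sketch using NumPy (the example matrix and tolerance are illustrative, not from the paper):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],    # = 2 * row 1, so rank(A) = 2
              [1., 0., 1.]])
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))     # numerical rank

row_space  = Vt[:r].T          # basis for the row space of A
null_space = Vt[r:].T          # basis for N(A)
col_space  = U[:, :r]          # basis for the column space of A
left_null  = U[:, r:]          # basis for N(A^T)
```

The orthogonality statements of the Fundamental Theorem can then be checked directly: both `A @ null_space` and `A.T @ left_null` vanish to machine precision.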
In this paper an attempt is made to correct the misconception of several authors [1] that there exists a method by Uspensky (based on Vincent's theorem) for the isolation of the real roots of a polynomial equation with rational coefficients. Despite Uspensky's claim, in the preface of his book [2], that he invented this method, we show that what Uspensky …
• Mathematics and Computers in Simulation
• 2004
Let A be an m × n matrix with m ≥ n. Then one form of the singular-value decomposition of A is A = UᵀΣV, where U and V are orthogonal and Σ is square diagonal. That is, UUᵀ = I_{rank(A)}, VVᵀ = I_{rank(A)}, U is rank(A) × m, V is rank(A) × n, and Σ = \begin{pmatrix} \sigma_1 & 0 & \cdots & 0 & 0 \\ 0 & \sigma_2 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & \sigma_{rank(A)-1} & 0 \\ 0 & 0 & \cdots & 0 & \sigma_{rank(A)} \end{pmatrix} is a …
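This form of the decomposition can be checked numerically. With NumPy's convention, the U and V above correspond to the transposes of the factors `np.linalg.svd` returns (the example matrix is illustrative):

```python
import numpy as np

A = np.array([[3., 1.],
              [1., 3.],
              [0., 0.]])               # m = 3, n = 2, m >= n
U_np, s, Vt = np.linalg.svd(A, full_matrices=False)
Sigma = np.diag(s)                     # square diagonal, sigma_1 >= sigma_2

# In the convention above, U = U_np.T (rank x m) and V = Vt (rank x n),
# so A = U^T Sigma V becomes:
A_rebuilt = U_np @ Sigma @ Vt
```

Here AᵀA has eigenvalues 16 and 4, so the singular values come out as 4 and 2, and `A_rebuilt` reproduces A to machine precision.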
In this paper we compare two real root isolation methods using Descartes’ Rule of Signs: the Interval Bisection method, and the Continued Fractions method. We present some time-saving improvements to both methods. Comparing computation times we conclude that the Continued Fractions method works much faster save for the case of very many very large roots.
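The core step of the Interval Bisection method — counting sign variations after the substitution x → 1/(1 + x), which maps (0, 1) onto (0, ∞) — can be sketched in exact rational arithmetic as follows. All names are illustrative, and the sketch assumes a squarefree input whose roots never fall exactly on a bisection midpoint:

```python
from fractions import Fraction

def sign_var(seq):
    """Sign variations in a sequence, zeros skipped."""
    s = [x for x in seq if x != 0]
    return sum(1 for u, v in zip(s, s[1:]) if u * v < 0)

def shift1(c):
    """Coefficients (descending) of p(x + 1), by repeated synthetic division."""
    c = list(c)
    n = len(c)
    for i in range(1, n):
        for j in range(1, n - i + 1):
            c[j] += c[j - 1]
    return c

def var01(c):
    """Descartes bound on the number of roots of p in (0, 1):
    sign variations of (x + 1)^deg(p) * p(1/(x + 1))."""
    return sign_var(shift1(list(reversed(c))))

def halve_left(c):
    """Coefficients of p(x/2): its roots in (0, 1) are p's roots in (0, 1/2)."""
    n = len(c) - 1
    return [ci * Fraction(1, 2) ** (n - i) for i, ci in enumerate(c)]

def halve_right(c):
    """Coefficients of p((x + 1)/2): covers p's roots in (1/2, 1)."""
    return shift1(halve_left(c))

def isolate01(c, a=Fraction(0), b=Fraction(1), out=None):
    """Isolating intervals for the roots of a squarefree p in (0, 1)."""
    if out is None:
        out = []
    v = var01(c)
    if v == 0:
        return out                    # no roots in this interval
    if v == 1:
        out.append((a, b))            # exactly one root: interval isolates it
        return out
    m = (a + b) / 2
    isolate01(halve_left(c), a, m, out)
    isolate01(halve_right(c), m, b, out)
    return out

# (x - 1/4)(x - 3/4) = x^2 - x + 3/16: two roots in (0, 1),
# isolated as (0, 1/2) and (1/2, 1)
print(isolate01([Fraction(1), Fraction(-1), Fraction(3, 16)]))
```

The Continued Fractions method replaces the bisection by Möbius transformations derived from the continued-fraction expansions of the roots, which is where its speed advantage comes from.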
It is well known that Euclid's algorithm for computing the greatest common divisor (gcd) of two integer numbers is more than two thousand years old and, as it turns out, it is the oldest known algorithm. Interest in computing a gcd of two polynomials first appeared only in the sixteenth century and the problem was solved by Simon Stevin [13] simply by …
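Euclid's algorithm carries over from integers to polynomials essentially unchanged: integer remainders are replaced by polynomial remainders. A sketch over the rationals (function names are illustrative):

```python
from fractions import Fraction

def gcd_int(a, b):
    """Euclid's algorithm for integers."""
    while b:
        a, b = b, a % b
    return abs(a)

def _strip(p):
    """Drop leading zero coefficients, keeping at least one entry."""
    i = 0
    while i < len(p) - 1 and p[i] == 0:
        i += 1
    return p[i:]

def poly_rem(a, b):
    """Remainder of a divided by b; coefficients in descending order."""
    a = [Fraction(x) for x in a]
    b = [Fraction(x) for x in b]
    while len(a) >= len(b) and any(a):
        q = a[0] / b[0]
        for i in range(len(b)):
            a[i] -= q * b[i]          # leading term cancels exactly
        a = _strip(a)
    return a

def poly_gcd(a, b):
    """Monic gcd of two polynomials via Euclid's algorithm."""
    a = _strip([Fraction(x) for x in a])
    b = _strip([Fraction(x) for x in b])
    while any(b):
        a, b = b, poly_rem(a, b)
    return [x / a[0] for x in a]      # normalize to monic

print(gcd_int(1071, 462))                  # 21
print(poly_gcd([1, -3, 2], [1, -4, 3]))    # gcd of (x-1)(x-2), (x-1)(x-3) is x - 1
```

Exact rational arithmetic sidesteps the coefficient-growth issues that make naive polynomial remainder sequences over the integers expensive.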