Jesse L. Barlow

When computing eigenvalues of symmetric matrices and singular values of general matrices in finite precision arithmetic, we in general expect to compute them only with an error bound proportional to the product of machine precision and the norm of the matrix. In particular, we do not expect to compute tiny eigenvalues and singular values to high relative accuracy. …
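A small numpy experiment illustrates the phenomenon described above (the test matrix, grading, and tolerance are illustrative assumptions, not taken from the paper): a backward stable SVD delivers every singular value with absolute error on the order of machine precision times ‖A‖₂, which guarantees nothing about the relative accuracy of the tiniest ones.

```python
import numpy as np

rng = np.random.default_rng(0)
# Orthogonal factors from QR of random matrices
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))
s = np.array([1.0, 1e-4, 1e-8, 1e-16])  # widely graded singular values
A = U @ np.diag(s) @ V.T

s_computed = np.linalg.svd(A, compute_uv=False)
# Absolute errors are bounded by roughly eps * ||A||_2, so the tiny value
# 1e-16 may carry O(1) *relative* error even though its absolute error
# is far below the chosen tolerance.
abs_err = np.abs(s_computed - s)
print(np.all(abs_err <= 1e-12))
```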
The problem of nonlinear dimensionality reduction is considered. We focus on problems where prior information is available, namely semi-supervised dimensionality reduction. It is shown that basic nonlinear dimensionality reduction algorithms, such as Locally Linear Embedding (LLE), Isometric Feature Mapping (Isomap), and Local Tangent Space Alignment …
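For context on the class of algorithms being extended, here is a minimal unsupervised Isomap sketch (no prior information is used; the neighbor count and test curve are illustrative choices, not the paper's setup): build a k-nearest-neighbor graph, approximate geodesic distances by shortest paths, then embed with classical MDS.

```python
import numpy as np

def isomap_1d(X, n_neighbors=2):
    """Minimal Isomap sketch: kNN graph -> geodesic distances -> classical MDS."""
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    # Keep only the n_neighbors nearest edges per point (symmetrized)
    G = np.full((n, n), np.inf)
    np.fill_diagonal(G, 0.0)
    for i in range(n):
        for j in np.argsort(D[i])[1:n_neighbors + 1]:
            G[i, j] = G[j, i] = D[i, j]
    # Floyd-Warshall shortest paths approximate geodesic distances
    for k in range(n):
        G = np.minimum(G, G[:, [k]] + G[[k], :])
    # Classical MDS on squared geodesic distances
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (G ** 2) @ J
    w, V = np.linalg.eigh(B)
    return V[:, -1] * np.sqrt(max(w[-1], 0.0))  # leading coordinate

# Points on a quarter circle: Isomap should "unroll" them onto a line
t = np.linspace(0, np.pi / 2, 20)
X = np.column_stack([np.cos(t), np.sin(t)])
y = isomap_1d(X)
# The embedded coordinate is monotone along the curve (up to sign)
print(np.all(np.diff(y) > 0) or np.all(np.diff(y) < 0))
```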
This paper studies the solution of the linear least squares problem for a large, sparse m × n matrix A with m ≥ n by QR factorization of A and transformation of the right-hand-side vector b to Q^T b. A multifrontal-based method for computing Q^T b using Householder factorization is presented. A theoretical operation count for the K × K unbordered grid …
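The dense analogue of this computation can be sketched in a few lines (the multifrontal, sparsity-exploiting machinery of the paper is not reproduced): factor A = QR, form Q^T b, and solve the triangular system.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 8, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Thin QR: A = Q R with Q m-by-n orthonormal, R n-by-n upper triangular
Q, R = np.linalg.qr(A, mode='reduced')
# Least squares solution from the triangular system R x = Q^T b
x = np.linalg.solve(R, Q.T @ b)  # a triangular solve suffices for a sketch

# Agrees with the reference least squares solver
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))
```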
Probabilistic models of floating-point and logarithmic arithmetic are constructed using assumptions with both theoretical and empirical justification. The justification of these assumptions resolves open questions in Hamming (1970) and Bustoz et al. (1979). These models are applied to errors from sums and inner products. A comparison is made between the …
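As an illustration of why probabilistic models of rounding error are informative (the data, precision, and sample size here are illustrative assumptions, not the paper's experiments): the worst-case relative error of recursive summation grows like n·ε, while observed errors for random data are typically far smaller.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
x = rng.uniform(0.0, 1.0, n).astype(np.float32)

s32 = np.float32(0.0)
for xi in x:                      # sequential float32 summation
    s32 = np.float32(s32 + xi)
s64 = float(np.sum(x.astype(np.float64)))  # reference sum in double

eps = float(np.finfo(np.float32).eps)
rel_err = abs(float(s32) - s64) / s64
# The deterministic worst-case bound grows like n*eps; probabilistic
# models in this line of work predict typical growth much slower than
# that, which is what random data tends to show.
print(rel_err <= n * eps)
```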
Two new algorithms for one-sided bidiagonalization are presented. The first is a block version that improves execution time by improving cache utilization through BLAS 2.5 operations and more BLAS 3 operations. The second is adapted to parallel computation. When incorporated into singular value decomposition software, the second algorithm is faster …
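The paper's block algorithm itself is not reproduced here, but the cache-blocking idea behind BLAS 3 performance can be sketched on the simplest kernel (block size and matrix shapes are arbitrary choices): regroup the arithmetic into submatrix products so each block is reused while it sits in cache.

```python
import numpy as np

def blocked_matmul(A, B, bs=32):
    """Blocked product C = A @ B: identical arithmetic regrouped into
    submatrix multiplies, the cache-reuse idea underlying BLAS 3."""
    m, k = A.shape
    _, n = B.shape
    C = np.zeros((m, n))
    for i in range(0, m, bs):
        for j in range(0, n, bs):
            for p in range(0, k, bs):
                # Each small product touches only bs-by-bs blocks
                C[i:i+bs, j:j+bs] += A[i:i+bs, p:p+bs] @ B[p:p+bs, j:j+bs]
    return C

rng = np.random.default_rng(5)
A = rng.standard_normal((70, 50))
B = rng.standard_normal((50, 60))
print(np.allclose(blocked_matmul(A, B), A @ B))
```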
Bidiagonal reduction is the preliminary stage for the fastest stable algorithms for computing the singular value decomposition. However, the best error bounds on bidiagonal reduction methods are of the form A + δA = U B V^T, ‖δA‖_2 ≤ ε_M f(n) ‖A‖_2, where B is bidiagonal, U and V are orthogonal, ε_M is machine precision, and f(n) is a modestly growing function of …
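A values-only sketch of the classical Golub–Kahan Householder reduction shows what the preliminary stage produces (this is the textbook two-sided method, not the improved-bound algorithms the abstract concerns; accumulating U and V is omitted): alternating left and right Householder reflections drive A to upper bidiagonal form while preserving its singular values.

```python
import numpy as np

def householder(x):
    """Return v, beta with (I - beta v v^T) x = -sign(x[0]) ||x|| e_1."""
    v = x.astype(float).copy()
    sigma = np.linalg.norm(v)
    if sigma == 0.0:
        return v, 0.0
    if v[0] >= 0:
        v[0] += sigma
    else:
        v[0] -= sigma
    return v, 2.0 / (v @ v)

def bidiagonalize(A):
    """Golub-Kahan bidiagonal reduction, values only (a sketch)."""
    B = A.astype(float).copy()
    m, n = B.shape
    for k in range(n):
        # Left reflection: zero column k below the diagonal
        v, beta = householder(B[k:, k])
        B[k:, k:] -= beta * np.outer(v, v @ B[k:, k:])
        if k < n - 2:
            # Right reflection: zero row k right of the superdiagonal
            v, beta = householder(B[k, k+1:])
            B[k:, k+1:] -= beta * np.outer(B[k:, k+1:] @ v, v)
    return B

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 4))
B = bidiagonalize(A)
# B is (numerically) upper bidiagonal with the same singular values as A
print(np.allclose(B, np.triu(np.tril(B, 1))))
print(np.allclose(np.linalg.svd(B, compute_uv=False),
                  np.linalg.svd(A, compute_uv=False)))
```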
The ULV decomposition (ULVD) is an important member of a class of rank-revealing two-sided orthogonal decompositions used to approximate the singular value decomposition (SVD). It is useful in applications of the SVD such as principal component analysis, where we are interested in approximating a matrix by one of lower rank. It can be updated and downdated much more …
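The shape of such a two-sided orthogonal decomposition can be sketched with two QR factorizations (a true rank-revealing ULVD also uses pivoting or condition estimation to expose small singular values, which this sketch omits): A = U L V^T with U, V orthonormal and L lower triangular.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 4))

# Two QR factorizations give a ULV-style decomposition A = U L V^T.
U, R = np.linalg.qr(A, mode='reduced')      # A = U R, R upper triangular
V, Lt = np.linalg.qr(R.T, mode='reduced')   # R^T = V L^T
L = Lt.T                                    # L is lower triangular

print(np.allclose(U @ L @ V.T, A))
print(np.allclose(L, np.tril(L)))
```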