The problems of computing least squares approximations for various types of real and symmetric matrices subject to spectral constraints share a common structure. This paper describes a general procedure using the projected gradient method. It is shown that the projected gradient of the objective function on the manifold of constraints usually can be …
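As a rough illustration of the projected-gradient idea (a minimal sketch under assumed choices, not the procedure developed in the paper): a candidate matrix is kept on the manifold of symmetric matrices with a prescribed spectrum by parametrizing it as QΛQᵀ, taking a Euclidean gradient step in Q, and projecting back onto the orthogonal group. The function name, step size, and iteration count below are hypothetical.

    import numpy as np

    # Hypothetical sketch: find the symmetric matrix with prescribed eigenvalues
    # lam that is closest in Frobenius norm to a given symmetric target T.
    # The iterate X = Q diag(lam) Q^T stays on the constraint manifold because
    # Q is re-projected onto the orthogonal group after every step.
    def nearest_with_spectrum(T, lam, steps=500, eta=1e-3):
        Lam = np.diag(lam)
        Q = np.linalg.qr(np.random.randn(len(lam), len(lam)))[0]  # random orthogonal start
        for _ in range(steps):
            X = Q @ Lam @ Q.T                      # current feasible point
            G = 4.0 * (X - T) @ Q @ Lam            # Euclidean gradient of ||X - T||_F^2 in Q
            U, _, Vt = np.linalg.svd(Q - eta * G)  # take a step, then project back:
            Q = U @ Vt                             # nearest orthogonal matrix (polar factor)
        return Q @ Lam @ Q.T

    T = np.random.randn(4, 4); T = (T + T.T) / 2   # arbitrary symmetric target
    X = nearest_with_spectrum(T, [1.0, 2.0, 3.0, 4.0])
    print(np.round(np.linalg.eigvalsh(X), 8))       # prescribed spectrum is preserved exactly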
Let A ∈ R^{m×n} denote an arbitrary matrix. If x ∈ R^n and y ∈ R^m are vectors such that ω = yᵀAx ≠ 0, then the matrix B := A − ω⁻¹Axyᵀ A has rank exactly one less than the rank of A. This Wedderburn rank-one reduction formula is easy to prove, yet the idea is so powerful that perhaps all matrix factorizations can be derived from it. The formula also …
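For concreteness, a small numerical check of the rank-reduction property (an illustrative sketch; the matrix sizes, vectors, and random seed are arbitrary assumptions):

    import numpy as np

    # Verify the Wedderburn rank-one reduction: B = A - omega^{-1} A x y^T A
    # has rank exactly one less than rank(A), provided omega = y^T A x != 0.
    rng = np.random.default_rng(0)
    m, n, r = 6, 5, 3
    A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-3 matrix
    x = rng.standard_normal(n)
    y = rng.standard_normal(m)
    omega = y @ A @ x                                  # nonzero with probability one
    B = A - np.outer(A @ x, y @ A) / omega             # Wedderburn reduction
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # prints: 3 2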
Any logical procedure used to reason or infer, either deductively or inductively, so as to draw conclusions or make decisions can be called, in a broad sense, a realization process. A realization process usually assumes a recursive form in which one state is developed into another by following a certain specific rule. Such an action is qualified …
A collection of inverse eigenvalue problems is identified and classified according to their characteristics. Current developments in both the theoretical and the algorithmic aspects are summarized and reviewed in this paper. This exposition also reveals many open questions that deserve further study. An extensive bibliography of pertinent literature is …
An inverse eigenvalue problem concerns the reconstruction of a structured matrix from prescribed spectral data. Such an inverse problem arises in many applications where parameters of a certain physical system are to be determined from the knowledge or expectation of its dynamical behavior. Spectral information is entailed because the dynamical behavior …
The inverse eigenvalue problem of constructing real and symmetric square matrices M, C and K of size n × n for the quadratic pencil Q(λ) = λ²M + λC + K so that Q(λ) has a prescribed subset of eigenvalues and eigenvectors is considered. This paper consists of two parts addressing two related but different problems. The first part deals with the inverse …
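To fix notation, the following sketch (not the reconstruction method of the paper) builds a quadratic pencil from assumed random symmetric M, C, K, solves the forward quadratic eigenvalue problem through the standard companion linearization, and verifies that a computed eigenpair annihilates Q(λ):

    import numpy as np

    # Assumed illustrative data: random symmetric C, K and a diagonally shifted
    # symmetric M so that the linearization below is safely nonsingular.
    rng = np.random.default_rng(1)
    n = 4
    def rand_sym(n):
        S = rng.standard_normal((n, n))
        return (S + S.T) / 2
    M = rand_sym(n) + n * np.eye(n)
    C = rand_sym(n)
    K = rand_sym(n)

    # Companion linearization of (lam^2 M + lam C + K) v = 0 with z = [v; lam v]:
    #     [[0, I], [-K, -C]] z = lam [[I, 0], [0, M]] z
    A = np.block([[np.zeros((n, n)), np.eye(n)], [-K, -C]])
    B = np.block([[np.eye(n), np.zeros((n, n))], [np.zeros((n, n)), M]])
    lams, Z = np.linalg.eig(np.linalg.solve(B, A))

    lam, v = lams[0], Z[:n, 0]
    print(np.linalg.norm((lam**2 * M + lam * C + K) @ v))   # residual near machine precision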
Spectral decomposition provides a canonical representation of an operator on a vector space in terms of its eigenvalues and eigenfunctions. The canonical form often facilitates discussions that would otherwise be complicated and involved. Spectral decomposition is of fundamental importance in many applications. The well-known GLR theory generalizes the …
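A toy illustration of the real symmetric case (an assumed example; the GLR theory discussed in the paper treats far more general operators and pencils): the eigen-data both reconstruct the matrix and let matrix functions be evaluated directly.

    import numpy as np

    # Spectral decomposition of a real symmetric matrix: A = Q diag(lam) Q^T.
    rng = np.random.default_rng(2)
    S = rng.standard_normal((4, 4))
    A = (S + S.T) / 2                       # real symmetric matrix
    lam, Q = np.linalg.eigh(A)              # eigenvalues and orthonormal eigenvectors
    print(np.allclose(A, Q @ np.diag(lam) @ Q.T))               # canonical form reproduces A
    expA = Q @ np.diag(np.exp(lam)) @ Q.T                       # f(A) evaluated from eigen-data
    print(np.allclose(np.linalg.eigvalsh(expA), np.exp(lam)))   # spectrum maps as f(lambda)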