Large, sparse, Hermitian eigenvalue problems remain among the most computationally demanding tasks. Despite the need for a robust, nearly optimal preconditioned iterative method that can operate under severe memory limitations, no such method has emerged as a clear winner. In this research we approach the eigenproblem from a nonlinear perspective, which helps us develop two nearly optimal methods. The first extends the recent Jacobi-Davidson-Conjugate-Gradient (JDCG) method to JDQMR, improving both robustness and efficiency. The second method, Generalized-Davidson+1 (GD+1), uses the locally optimal Conjugate Gradient recurrence as a restarting technique to achieve almost optimal convergence. We describe both methods within a unifying framework and provide theoretical justification for their near optimality. The more efficient of the two can be chosen at runtime. Our extensive experiments confirm the robustness, near optimality, and efficiency of our multimethod approach over other state-of-the-art methods.