Panagiotis B. Pintelas

We present a new curvilinear algorithmic model for training neural networks, based on a modification of the memoryless BFGS method that incorporates a curvilinear search. The proposed model exploits the nonconvexity of the error surface based on information provided by the eigensystem of memoryless BFGS matrices, using a pair of directions; a …
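The curvilinear idea above can be sketched as follows: combine a descent direction with a negative-curvature direction obtained from the eigensystem, and search along a curved path rather than a straight line. This is a minimal illustrative sketch on a toy saddle, assuming the common curvilinear path x(a) = x + a²d + a·z; it is not the authors' exact algorithm.

```python
import numpy as np

# Toy indefinite quadratic f(x) = 0.5 x^T A x: the origin is a saddle point.
A = np.array([[2.0, 0.0],
              [0.0, -1.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([1.0, 1e-3])              # start near the saddle
w, V = np.linalg.eigh(A)
z = V[:, 0]                            # eigenvector of the negative eigenvalue
if z @ grad(x) > 0:                    # orient the negative-curvature direction downhill
    z = -z
d = -grad(x)                           # simple descent direction (steepest descent here)

# Backtrack along the curvilinear path x(a) = x + a^2 * d + a * z
a, fx = 1.0, f(x)
while f(x + a * a * d + a * z) >= fx and a > 1e-8:
    a *= 0.5
x_new = x + a * a * d + a * z
```

A pure line search along d alone can stall near a saddle; the a·z term lets the iterate escape along the negative-curvature direction, which is the point of using the eigensystem.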
Artificial intelligence has enabled the development of more sophisticated and more efficient student models, which represent and detect a broader range of student behavior than was previously possible. In this work, we describe the implementation of a user-friendly software tool for predicting students' performance in the course "Mathematics", which …
Concept, Resource, Order, Product (CROP) is a reference architecture for adaptive Learning Objects owned by Semantic Learning Services, developed by the second author. According to CROP, composite Objects are essentially recursive, and adaptivity is an emergent property of Learning Service communication and collaboration. CROP is formally …
Traditional educational and training practices and methods cannot deal effectively with complex training domains, nor do they focus on the trainees' individual requirements. To overcome these problems, computer programs have been used as the training medium; Virtual Training Environments (VTEs) constitute the latest generation of such programs and embody …
In this paper, we evaluate the performance of a new class of conjugate gradient methods for training recurrent neural networks which ensure the sufficient descent property. The presented methods preserve the advantages of classical conjugate gradient methods and simultaneously avoid the usually inefficient restarts. Simulation results are also presented …
Conjugate gradient methods constitute an excellent choice for efficiently training large neural networks, since they require neither the evaluation of the Hessian matrix nor the impractical storage of an approximation of it. Despite the theoretical and practical advantages of these methods, their main drawback is the use of restarting procedures in order to …
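The Hessian-free character described above can be illustrated with a nonlinear conjugate gradient method that uses only gradients. This sketch uses the standard Polak-Ribière+ formula, whose truncation to zero plus a simple descent safeguard act as an automatic restart; it is an assumption-laden illustration, not the specific sufficient-descent methods of these papers.

```python
import numpy as np

def pr_plus_cg(f, grad, x, iters=2000, tol=1e-8):
    """Polak-Ribiere+ nonlinear CG: gradients only, no Hessian storage."""
    g = grad(x)
    d = -g
    for _ in range(iters):
        gn = g @ g
        if np.sqrt(gn) < tol:
            break
        a, fx, slope = 1.0, f(x), g @ d
        while f(x + a * d) > fx + 1e-4 * a * slope and a > 1e-12:
            a *= 0.5                               # Armijo backtracking line search
        x = x + a * d
        g_new = grad(x)
        beta = max(g_new @ (g_new - g) / gn, 0.0)  # PR+ truncation ("automatic restart")
        d = -g_new + beta * d
        if g_new @ d >= 0:                         # safeguard: ensure a descent direction
            d = -g_new
        g = g_new
    return x

# Toy strictly convex quadratic: f(x) = 0.5 x^T A x - b^T x, minimizer solves A x = b.
A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
b = np.ones(5)
x_opt = pr_plus_cg(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(5))
```

Note that the loop never forms or stores A-like matrices beyond the gradient calls, which is exactly the storage advantage the abstract attributes to conjugate gradient training.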
We present a matrix-free method for the large scale trust region subproblem (TRS), assuming that the approximate Hessian is updated using a minimal-memory BFGS method, where the initial matrix is a scaled identity matrix. We propose a variant of the Moré-Sorensen method that exploits the eigenstructure of the approximate Hessian, and incorporates both the …
We present a nearly-exact method for the large scale trust region subproblem (TRS) based on the properties of the minimal-memory BFGS method. Our study concentrates on the case where the initial BFGS matrix can be any scaled identity matrix. The proposed method is a variant of the Moré-Sorensen method that exploits the eigenstructure of the approximate …
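The subproblem these two abstracts target is min gᵀp + ½pᵀBp subject to ‖p‖ ≤ Δ. A Moré-Sorensen-style solver searches for a multiplier λ ≥ 0 with B + λI positive definite and ‖p(λ)‖ = Δ. The sketch below is a dense illustrative version that uses a full eigendecomposition and bisection, and ignores the degenerate "hard case"; the papers' contribution is doing this matrix-free via the minimal-memory BFGS eigenstructure, which this toy version does not attempt.

```python
import numpy as np

def trs(B, g, delta):
    """Nearly-exact TRS: min g^T p + 0.5 p^T B p  s.t. ||p|| <= delta (easy case only)."""
    d, Q = np.linalg.eigh(B)               # B = Q diag(d) Q^T, eigenvalues ascending
    c = Q.T @ g
    p_norm = lambda lam: np.linalg.norm(c / (d + lam))
    if d[0] > 0 and p_norm(0.0) <= delta:  # interior Newton step -B^{-1} g is feasible
        return Q @ (-c / d)
    lo = max(0.0, -d[0]) + 1e-12           # smallest lam keeping B + lam*I positive definite
    hi = lo + 1.0
    while p_norm(hi) > delta:              # bracket the boundary multiplier
        hi *= 2.0
    for _ in range(200):                   # ||p(lam)|| decreases in lam: bisect
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if p_norm(mid) > delta else (lo, mid)
    lam = 0.5 * (lo + hi)
    return Q @ (-c / (d + lam))            # p(lam) = -(B + lam*I)^{-1} g

B = np.diag([-1.0, 2.0, 3.0])              # indefinite Hessian approximation
g = np.ones(3)
p = trs(B, g, 1.5)                         # boundary solution, ||p|| = delta
```

With an indefinite B, as here, the unconstrained model is unbounded below, so the solution necessarily lies on the trust-region boundary; the multiplier λ also shifts B + λI to positive definiteness.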
We present a new matrix-free method for the trust region subproblem, assuming that the approximate Hessian is updated by the limited memory BFGS formula with m = 2. The resulting updating scheme, called 2-BFGS, gives us the ability to determine, via simple formulas, the eigenvalues of the resulting approximation. …
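The tractable eigenstructure these abstracts rely on is easy to verify numerically in the simplest, memoryless (m = 1) case: starting from a scaled identity θI and applying one BFGS correction pair (s, y), the approximation acts as θI on the orthogonal complement of span{s, y}, so n − 2 eigenvalues equal θ and only two need computing. This sketch checks that property by brute force on the explicit matrix (the papers avoid ever forming it).

```python
import numpy as np

# Scaled memoryless BFGS inverse-Hessian approximation:
#   H = theta*(I - rho*s y^T)(I - rho*y s^T) + rho*s s^T,  rho = 1/(y^T s).
rng = np.random.default_rng(0)
n, theta = 6, 0.7
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)   # keeps y^T s > 0 (curvature condition)
rho = 1.0 / (y @ s)
I = np.eye(n)
H = theta * (I - rho * np.outer(s, y)) @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)

w = np.linalg.eigvalsh(H)
n_theta = int(np.sum(np.isclose(w, theta)))
print(n_theta)                         # typically n - 2: H acts as theta*I outside span{s, y}
```

For m = 2 the same reasoning confines the nontrivial spectrum to a four-dimensional subspace, which is what makes the "simple formulas" for the 2-BFGS eigenvalues possible; this toy check only covers the m = 1 case.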