Selin Damla Ahipasaoglu

Given a multivariate data set, sparse principal component analysis (SPCA) aims to extract several linear combinations of the variables that together explain as much of the variance in the data as possible, while controlling the number of nonzero loadings in these combinations. In this paper we consider eight different optimization formulations for computing a …
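A minimal sketch of the SPCA setting, using a simple truncated power iteration that keeps only the k largest loadings at each step. This is a common heuristic, not one of the eight formulations studied in the paper; the toy data and variable names are illustrative assumptions.

```python
import numpy as np

def sparse_pc(S, k, iters=200):
    """Leading sparse principal component of covariance S: power
    iteration, but after each multiplication keep only the k
    largest-magnitude entries of the loading vector (truncation),
    then renormalise."""
    p = S.shape[0]
    x = np.ones(p) / np.sqrt(p)
    for _ in range(iters):
        y = S @ x
        y[np.argsort(np.abs(y))[:-k]] = 0.0   # zero all but top k
        x = y / np.linalg.norm(y)
    return x

# Toy data: variables 0 and 1 carry most of the variance and are
# nearly collinear; variables 2 and 3 are pure noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 4))
A[:, 0] += 3 * rng.standard_normal(200)
A[:, 1] = A[:, 0] + 0.1 * rng.standard_normal(200)
S = np.cov(A, rowvar=False)
v = sparse_pc(S, k=2)
print(np.nonzero(v)[0])
```

With k = 2 the nonzero loadings land on the dominant, correlated pair of variables, which is exactly the variance-versus-sparsity tradeoff the abstract describes.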
Covariance selection seeks to estimate a covariance matrix by maximum likelihood while restricting the number of nonzero inverse covariance matrix coefficients. A single penalty parameter usually controls the tradeoff between log-likelihood and sparsity in the inverse matrix. We describe an efficient algorithm for computing a full regularization path of …
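The tradeoff the single penalty parameter controls can be written down directly: maximise log det Θ − tr(SΘ) − ρ‖Θ‖₁ over precision matrices Θ. A small sketch evaluating this objective (the example matrices are assumptions, and no solver is shown — just the objective the regularization path traces as ρ varies):

```python
import numpy as np

def penalized_loglik(Theta, S, rho):
    """Covariance-selection objective: Gaussian log-likelihood of
    precision matrix Theta given sample covariance S, minus an l1
    penalty rho * ||Theta||_1 that promotes zeros in Theta."""
    sign, logdet = np.linalg.slogdet(Theta)
    assert sign > 0, "Theta must be positive definite"
    return logdet - np.trace(S @ Theta) - rho * np.sum(np.abs(Theta))

S = np.array([[1.0, 0.3],
              [0.3, 1.0]])          # sample covariance
dense = np.linalg.inv(S)            # unpenalized MLE, no zeros
sparse = np.diag(1.0 / np.diag(S))  # diagonal (fully sparse) candidate

for rho in (0.0, 1.0):
    val_d = penalized_loglik(dense, S, rho)
    val_s = penalized_loglik(sparse, S, rho)
    print(rho, "dense" if val_d > val_s else "sparse")
```

At ρ = 0 the dense MLE wins; at ρ = 1 the penalty flips the preference to the diagonal candidate, which is the sparsity path in miniature.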
A popular discrete choice model that incorporates correlation information is the Multinomial Probit (MNP) model, where the random utilities of the alternatives are drawn from a multivariate normal distribution. Computing the choice probabilities in the MNP model is challenging when the number of alternatives is large, and simulation is a popular technique …
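A crude frequency simulator makes the setting concrete: draw correlated utilities and count how often each alternative attains the maximum. The means and covariance below are made-up illustrations, and real MNP estimation uses more sophisticated simulators than this sketch.

```python
import numpy as np

def mnp_choice_probs(mean, cov, n_draws=200_000, seed=0):
    """Monte Carlo estimate of MNP choice probabilities: sample
    correlated normal utilities and record the argmax frequency."""
    rng = np.random.default_rng(seed)
    u = rng.multivariate_normal(mean, cov, size=n_draws)
    winners = np.argmax(u, axis=1)
    return np.bincount(winners, minlength=len(mean)) / n_draws

mean = np.array([1.0, 1.0, 0.0])
cov = np.array([[1.0, 0.8, 0.0],   # alternatives 0 and 1 are close
                [0.8, 1.0, 0.0],   # substitutes (highly correlated)
                [0.0, 0.0, 1.0]])
probs = mnp_choice_probs(mean, cov)
print(probs)
```

The correlated pair splits its share almost evenly, while the independent low-mean alternative is chosen least often.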
We use convex relaxation techniques to produce lower bounds on the optimal value of subset selection problems and generate good approximate solutions. We then explicitly bound the quality of these relaxations by studying the approximation ratio of sparse eigenvalue relaxations. Our results are used to improve the performance of branch-and-bound algorithms …
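The combinatorial quantity being relaxed here is the k-sparse maximum eigenvalue. For a small matrix it can be computed exactly by enumerating support sets, which is the brute-force baseline the convex relaxations and branch-and-bound bounds are measured against (the example matrix is an assumption):

```python
import numpy as np
from itertools import combinations

def sparse_eigenvalue(S, k):
    """Exact k-sparse maximum eigenvalue: the largest eigenvalue of
    any k x k principal submatrix of S.  Enumeration is exponential
    in p, hence feasible only for small instances."""
    best = -np.inf
    for idx in combinations(range(S.shape[0]), k):
        sub = S[np.ix_(idx, idx)]
        best = max(best, np.linalg.eigvalsh(sub)[-1])
    return best

S = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.0]])
for k in (1, 2, 3):
    print(k, round(sparse_eigenvalue(S, k), 4))
```

The value is monotone in k, with k = 1 giving the largest diagonal entry and k = p the ordinary maximum eigenvalue.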
Crashing is a method for optimally shortening the project makespan by allocating additional resources to reduce the durations of one or more activities in a project network. Activity durations are, however, uncertain, and techniques from stochastic optimization, robust optimization, and distributionally robust optimization have been developed to tackle this problem. In …
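For intuition, here is the deterministic version of crashing on the simplest possible network, a purely serial project where every activity is critical: greedily shorten the cheapest-to-crash activity until the budget runs out. This toy model and its numbers are assumptions; the paper's subject is the much harder uncertain-duration case.

```python
def crash_series_project(activities, budget):
    """Greedy crashing of a serial project.  Each activity is
    (duration, min_duration, crash_cost_per_unit); returns the
    makespan (sum of durations) after spending the budget."""
    acts = [list(a) for a in activities]
    while budget > 0:
        # activities that can still be shortened within budget
        cands = [a for a in acts if a[0] > a[1] and a[2] <= budget]
        if not cands:
            break
        a = min(cands, key=lambda a: a[2])  # cheapest per time unit
        a[0] -= 1
        budget -= a[2]
    return sum(a[0] for a in acts)

acts = [(5, 3, 2.0), (4, 2, 1.0), (6, 5, 4.0)]
print(crash_series_project(acts, budget=5))
```

In a serial network the greedy rule is optimal; in a general network one must track the critical path(s), and under uncertain durations even evaluating the makespan becomes the stochastic problem the abstract describes.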
An emerging trend in feature selection is the development of two-objective algorithms that analyze the tradeoff between the number of features and the classification performance of the model built with these features. Since these two objectives are conflicting, a typical result consists of a set of Pareto-efficient subsets, each having a different cardinality …
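The Pareto-efficiency notion in play can be sketched directly: a candidate subset is dominated if another subset uses no more features yet scores at least as well. The (features, accuracy) pairs below are invented for illustration.

```python
def pareto_front(points):
    """Filter (n_features, accuracy) pairs down to the
    Pareto-efficient ones: keep a point unless some other point
    uses no more features and scores at least as well."""
    front = []
    for n, acc in points:
        dominated = any(m <= n and a >= acc and (m, a) != (n, acc)
                        for m, a in points)
        if not dominated:
            front.append((n, acc))
    return sorted(front)

pts = [(1, 0.70), (2, 0.80), (3, 0.78), (4, 0.85), (5, 0.85)]
print(pareto_front(pts))
```

The 3-feature subset is dropped (2 features already score higher) and so is the 5-feature one (4 features match it), leaving one efficient subset per surviving cardinality.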
The minimum volume ellipsoid (MVE) estimator is an important tool in robust regression and outlier detection in statistics. We develop fast algorithms for the MVE estimation problem and discuss their efficient implementation. The novelty of our approach stems from recent developments in first-order algorithms for solving the related Minimum Volume …
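The related subproblem referred to here is computing a minimum volume enclosing ellipsoid. A compact first-order method for it is Khachiyan's algorithm, sketched below; note the MVE estimator additionally chooses which subset of points to cover, whereas this sketch covers all of them.

```python
import numpy as np

def mvee(P, tol=1e-6, max_iter=2000):
    """Minimum volume enclosing ellipsoid of the rows of P via
    Khachiyan's algorithm.  Returns (A, c) such that
    (x - c)^T A (x - c) <= 1 for every point x (to tolerance)."""
    n, d = P.shape
    Q = np.column_stack([P, np.ones(n)]).T        # lifted points
    u = np.ones(n) / n                            # point weights
    for _ in range(max_iter):
        X = Q @ np.diag(u) @ Q.T
        M = np.einsum('ij,ji->i', Q.T @ np.linalg.inv(X), Q)
        j = np.argmax(M)                          # worst-covered point
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        new_u = (1.0 - step) * u
        new_u[j] += step
        done = np.linalg.norm(new_u - u) < tol
        u = new_u
        if done:
            break
    c = P.T @ u                                   # centre
    A = np.linalg.inv(P.T @ np.diag(u) @ P - np.outer(c, c)) / d
    return A, c

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
A, c = mvee(pts)
print(np.round(c, 3))
```

For the unit square the enclosing ellipsoid is the circumscribed circle: the centre converges to (0.5, 0.5) and all four vertices sit on the boundary.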
We develop and analyse a first-order algorithm for the A-optimal experimental design problem. The problem is first presented as a special case of a parametric family of optimal design problems, for which duality results and optimality conditions are given. Then two first-order (Frank-Wolfe type) algorithms are presented, accompanied by a detailed …
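A bare-bones Frank-Wolfe iteration for the A-optimal criterion, minimising trace(M(w)⁻¹) with M(w) = Σᵢ wᵢ xᵢxᵢᵀ over the probability simplex: each step moves toward the design point with the largest xᵢᵀM⁻²xᵢ. This sketch uses fixed diminishing steps and made-up design points; the paper's algorithms are more refined.

```python
import numpy as np

def a_optimal_fw(X, iters=500):
    """Frank-Wolfe for A-optimal design.  The gradient of
    trace(M(w)^-1) in w_i is -x_i^T M^-2 x_i, so the linear
    minimisation over the simplex picks the vertex (design point)
    with the largest such score; step sizes shrink as 1/(t+2)."""
    n, d = X.shape
    w = np.ones(n) / n
    for t in range(iters):
        M = X.T @ (w[:, None] * X)
        Minv2 = np.linalg.matrix_power(np.linalg.inv(M), 2)
        scores = np.einsum('ij,jk,ik->i', X, Minv2, X)  # x_i^T M^-2 x_i
        j = np.argmax(scores)
        step = 1.0 / (t + 2.0)
        w = (1.0 - step) * w
        w[j] += step
    return w

# Linear regression (intercept + slope) with candidate points -1, 0, 1.
X = np.array([[1.0, -1.0], [1.0, 0.0], [1.0, 1.0]])
w = a_optimal_fw(X)
print(np.round(w, 3))
```

For this design space the weight on the middle point decays toward zero and the mass splits evenly between the two endpoints, matching the known A-optimal design on an interval with an intercept-plus-slope model.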