Learn More
Multitask learning can be effective when features useful in one task are also useful for other tasks, and the group lasso is a standard method for selecting a common subset of features. In this paper, we are interested in a less restrictive form of multitask learning, wherein (1) the available features can be organized into subsets according to a notion of …
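As a point of reference for the group-lasso approach this abstract starts from, the following is a minimal illustrative sketch (not taken from the paper) using scikit-learn's MultiTaskLasso, whose L2,1 penalty keeps or drops each feature jointly across all tasks; the data, dimensions, and alpha value are arbitrary assumptions for the example.

    # Sketch: joint feature selection across tasks via a group-lasso-type penalty.
    import numpy as np
    from sklearn.linear_model import MultiTaskLasso

    rng = np.random.default_rng(0)
    n_samples, n_features, n_tasks = 100, 30, 4

    # Synthetic data: only the first 5 features are informative for every task.
    X = rng.standard_normal((n_samples, n_features))
    W_true = np.zeros((n_features, n_tasks))
    W_true[:5, :] = rng.standard_normal((5, n_tasks))
    Y = X @ W_true + 0.1 * rng.standard_normal((n_samples, n_tasks))

    model = MultiTaskLasso(alpha=0.1).fit(X, Y)

    # A feature is discarded for all tasks at once when its coefficients
    # (one per task) are all zero, i.e. its column norm in coef_ is zero.
    selected = np.flatnonzero(np.linalg.norm(model.coef_, axis=0) > 0)
    print("features selected for all tasks:", selected)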
Classification with a sparsity constraint on the solution plays a central role in many high dimensional signal processing applications. In some cases, the features can be grouped together, so that entire subsets of features can be selected or discarded. In many applications, however, this can be too restrictive. In this paper, we are interested in a less …
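For concreteness, the standard group-lasso-penalized estimator this abstract contrasts itself with (a generic textbook form, not the paper's proposed relaxation) selects or discards whole groups of features at once:

    \min_{w} \; \sum_{i=1}^{n} \ell\big(y_i, w^{\top} x_i\big) \;+\; \lambda \sum_{g \in \mathcal{G}} \sqrt{|g|}\, \lVert w_g \rVert_2 ,

where \mathcal{G} partitions the features into groups, w_g is the subvector of coefficients in group g, and the non-differentiability of \lVert w_g \rVert_2 at zero is what drives entire groups exactly to zero.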
Binary logistic regression with a sparsity constraint on the solution plays a vital role in many high dimensional machine learning applications. In some cases, the features can be grouped together, so that entire subsets of features can be selected or zeroed out. In many applications, however, this can be very restrictive. In this paper, we are interested …
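A minimal sketch of the plain L1-penalized binary logistic regression this abstract starts from (illustrative only, not the paper's method); the synthetic data and the regularization strength C are assumptions chosen for the example.

    # Sketch: sparse (L1-penalized) binary logistic regression.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n_samples, n_features = 200, 50

    # Synthetic data in which only the first 5 features carry signal.
    X = rng.standard_normal((n_samples, n_features))
    w_true = np.zeros(n_features)
    w_true[:5] = 2.0
    y = (X @ w_true + 0.5 * rng.standard_normal(n_samples) > 0).astype(int)

    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)

    # Nonzero coefficients mark the selected features; a smaller C means a
    # stronger penalty and hence a sparser solution.
    print("selected features:", np.flatnonzero(clf.coef_.ravel()))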
Representational Similarity Learning (RSL) aims to discover features that are important in representing (human-judged) similarities among objects. RSL can be posed as a sparsity-regularized multi-task regression problem. Standard methods, like group lasso, may not select important features if they are strongly correlated with others. To address this …
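The abstract is truncated, but a generic way to write the sparsity-regularized multi-task regression it refers to (illustrative only; the paper's exact objective may differ) is

    \min_{W \in \mathbb{R}^{d \times T}} \; \sum_{t=1}^{T} \lVert y_t - X w_t \rVert_2^2 \;+\; \lambda \sum_{j=1}^{d} \lVert W_{j,\cdot} \rVert_2 ,

where each column w_t is the regression vector for task t and the row-wise penalty (the group lasso mentioned above) forces a common set of selected features across tasks. When two informative features are strongly correlated, this penalty tends to retain only one of them, which is the failure mode the abstract points to.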
While learning is often highly specific to the exact stimuli and tasks used during training, there are cases where training results in learning that generalizes more broadly. It has been previously argued that the degree of specificity can be predicted based upon the learning solution(s) dictated by the particular demands of the training task. Here we …
With practice, humans tend to improve their performance on most tasks. But do such improvements then generalize to new tasks? Although early work documented primarily task-specific learning outcomes in the domain of perceptual learning [1-3], an emerging body of research has shown that significant learning generalization is possible under some training …