When a set of interrelated tasks must be learned from a limited amount of usable data, learning each task independently may lead to poor generalization performance. Multi-Task Learning (MTL) exploits the latent relations between tasks and overcomes data-scarcity limitations by co-learning all of these tasks simultaneously to offer improved performance. We …
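A minimal sketch of the hard-parameter-sharing idea behind MTL (this is a generic illustration, not the method of the abstract above): two related regression tasks share a weight vector, and each task keeps a small, penalized task-specific correction. All names and data here are illustrative.

```python
import numpy as np

# Two related regression tasks: same shared weights w_true,
# plus a small task-specific perturbation v_t (synthetic data).
rng = np.random.default_rng(0)
d, n = 5, 40
w_true = rng.normal(size=d)
tasks = []
for _ in range(2):
    X = rng.normal(size=(n, d))
    v_t = 0.1 * rng.normal(size=d)          # tasks differ only slightly
    tasks.append((X, X @ (w_true + v_t)))

w = np.zeros(d)                              # shared component (co-learned)
vs = [np.zeros(d) for _ in tasks]            # task-specific components
lr, lam = 0.05, 1.0                          # step size; penalty keeping v_t small
for _ in range(500):
    grad_w = np.zeros(d)
    for t, (X, y) in enumerate(tasks):
        r = X @ (w + vs[t]) - y              # residual for task t
        grad_w += X.T @ r / n                # accumulate shared gradient
        vs[t] -= lr * (X.T @ r / n + lam * vs[t])  # shrink task deviation
    w -= lr * grad_w / len(tasks)            # update shared weights jointly

for t, (X, y) in enumerate(tasks):
    print(f"task {t} training MSE: {np.mean((X @ (w + vs[t]) - y) ** 2):.4f}")
```

Because the shared component is fit on both tasks' data while each deviation is regularized toward zero, the tasks effectively pool their samples, which is the mechanism by which MTL mitigates per-task data scarcity.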
Credit scoring has become a very important issue due to the recent growth of the credit industry, and the credit department of a bank now faces a large amount of credit data. Analyzing this volume of data manually is clearly infeasible in both economic and manpower terms, so data mining techniques have been employed for this purpose. So far, many data mining methods …
One of the most important issues in dairy farming is feed management, which aims to balance available and required feed. Several problems are associated with managing cattle feed, such as how best to cover feed deficits or how to utilize surplus feed. Since the early 1970s, feed-planning decision support systems have been developed to help farmers …
Credit scoring has become a very important issue due to the recent growth of the credit industry. As its first objective, this chapter provides an academic database of the literature between … and proposes a classification scheme for the articles. Its second objective is to suggest employing the Optimally Weighted Fuzzy K-Nearest …
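The abstract cuts off before describing the optimal weighting, but the underlying classifier family is fuzzy K-nearest neighbors. Below is a hedged sketch of a plain fuzzy K-NN in the style of Keller et al. (1985), not the chapter's optimally weighted variant; the toy credit data and all names are illustrative.

```python
import numpy as np

def fuzzy_knn_predict(X_train, y_train, x, k=3, m=2.0):
    """Classify x by distance-weighted fuzzy membership over K neighbors."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]                         # K nearest neighbors
    w = 1.0 / (d[idx] ** (2.0 / (m - 1)) + 1e-12)   # fuzzifier m > 1
    classes = np.unique(y_train)
    mu = np.array([w[y_train[idx] == c].sum() for c in classes])
    mu /= mu.sum()                                  # normalized memberships
    return classes[np.argmax(mu)], mu

# Toy credit data: two features (e.g. income, debt ratio), label 0 = good risk.
X = np.array([[1.0, 0.2], [1.2, 0.1], [0.9, 0.3],
              [0.2, 0.9], [0.1, 1.1], [0.3, 0.8]])
y = np.array([0, 0, 0, 1, 1, 1])
label, mu = fuzzy_knn_predict(X, y, np.array([1.1, 0.2]))
print(label, mu)
```

Unlike crisp K-NN, the classifier returns a membership vector alongside the label, which is useful in credit scoring where the degree of confidence in a "good/bad" decision matters.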
Data mining is an emerging area of research that aims at extracting meaningful patterns from available data. This paper highlights the significance of classification in predicting new trends from voluminous data. It presents a performance analysis of various data mining algorithms, viz. BayesNet, Meta-Stacking, Naïve Bayes, Random Forest, SMO and ZeroR, in predicting …
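The kind of comparison this paper describes can be sketched with scikit-learn analogues of the Weka classifiers it names (GaussianNB for Naïve Bayes, SVC for SMO, DummyClassifier for the ZeroR baseline); the dataset below is synthetic, not the paper's, and BayesNet/Meta-Stacking have no direct one-line equivalent, so they are omitted.

```python
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic stand-in dataset; the paper's data is not reproduced here.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

models = {
    "NaiveBayes": GaussianNB(),
    "RandomForest": RandomForestClassifier(random_state=0),
    "SMO-like SVM": SVC(),
    "ZeroR baseline": DummyClassifier(strategy="most_frequent"),
}
for name, clf in models.items():
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold accuracy per model
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```

Reporting each model against the ZeroR (majority-class) baseline, as Weka studies conventionally do, makes it immediately visible whether a classifier has learned anything beyond class priors.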
This paper introduces a new and effective algorithm for learning kernels in a Multi-Task Learning (MTL) setting. Although we consider an MTL scenario here, our approach can easily be applied to standard single-task learning as well. As shown by our empirical results, our algorithm consistently outperforms traditional kernel learning algorithms such as …
We show a Talagrand-type concentration inequality for Multi-Task Learning (MTL), using which we establish sharp excess risk bounds for MTL in terms of distribution- and data-dependent versions of the Local Rademacher Complexity (LRC). We also give a new bound on the LRC for norm-regularized as well as strongly convex hypothesis classes, which applies not …