Corpus ID: 220265633

Efficient Continuous Pareto Exploration in Multi-Task Learning

@inproceedings{Ma2020EfficientCP,
  title={Efficient Continuous Pareto Exploration in Multi-Task Learning},
  author={Pingchuan Ma and Tao Du and Wojciech Matusik},
  booktitle={ICML},
  year={2020}
}
Tasks in multi-task learning often correlate, conflict, or even compete with each other. As a result, a single solution that is optimal for all tasks rarely exists. Recent papers introduced the concept of Pareto optimality to this field and directly cast multi-task learning as multi-objective optimization problems, but solutions returned by existing methods are typically finite, sparse, and discrete. We present a novel, efficient method that generates locally continuous Pareto sets and Pareto…
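As background for the multi-objective view the abstract describes, the sketch below shows one standard way to cast two-task learning as multi-objective optimization: take the minimum-norm convex combination of the two task gradients (an MGDA-style update) and stop when that combination vanishes, i.e. when the parameters are Pareto-stationary. This is only an illustrative sketch, not the paper's algorithm; the function name two_task_descent_direction, the toy quadratic objectives, and the step size are all invented for the example.

import numpy as np

def two_task_descent_direction(g1, g2):
    """Minimum-norm convex combination of two task gradients (MGDA-style).

    Returns d = alpha*g1 + (1-alpha)*g2 with alpha in [0, 1]; d == 0 (up to
    numerical precision) means the current parameters are Pareto-stationary.
    """
    diff = g1 - g2
    denom = float(diff @ diff)
    if denom == 0.0:            # identical gradients: either one works
        return g1
    # alpha minimizes ||alpha*g1 + (1-alpha)*g2||^2 over alpha in [0, 1]
    alpha = float(np.clip(((g2 - g1) @ g2) / denom, 0.0, 1.0))
    return alpha * g1 + (1.0 - alpha) * g2

# Toy usage: two conflicting quadratic objectives sharing the same parameters.
theta = np.array([1.0, 1.0])
for _ in range(200):
    g1 = 2.0 * (theta - np.array([1.0, 0.0]))   # grad of ||theta - (1, 0)||^2
    g2 = 2.0 * (theta - np.array([0.0, 1.0]))   # grad of ||theta - (0, 1)||^2
    d = two_task_descent_direction(g1, g2)
    if np.linalg.norm(d) < 1e-6:                # Pareto-stationary: stop
        break
    theta = theta - 0.1 * d
print(theta)    # lands on the Pareto set, the segment between (1, 0) and (0, 1)

With two tasks the minimizing coefficient has a closed form; with more tasks the same idea requires solving a small quadratic program over the simplex. Procedures like this return isolated Pareto-stationary points, which is exactly the "finite, sparse, and discrete" limitation the paper addresses by generating locally continuous Pareto sets.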
Citations

Safety Aware Reinforcement Learning (SARL)
Pareto Self-Supervised Training for Few-Shot Learning
Pareto Efficient Fairness in Supervised Learning: From Extraction to Tracing
Controllable Pareto Multi-Task Learning
Learning the Pareto Front with Hypernetworks
