Corpus ID: 209515718

SoftAdapt: Techniques for Adaptive Loss Weighting of Neural Networks with Multi-Part Loss Functions

@article{Heydari2019SoftAdaptTF,
  title={SoftAdapt: Techniques for Adaptive Loss Weighting of Neural Networks with Multi-Part Loss Functions},
  author={A. Ali Heydari and C. Thompson and Asif Mehmood},
  journal={ArXiv},
  year={2019},
  volume={abs/1912.12355}
}
  • A. Ali Heydari, C. Thompson, Asif Mehmood
  • Published 2019
  • Computer Science, Mathematics
  • ArXiv
  • Adaptive loss function formulation is an active area of research and has gained a great deal of popularity in recent years, following the success of deep learning. However, existing frameworks of adaptive loss functions often suffer from slow convergence and poor choice of weights for the loss components. Traditionally, the elements of a multi-part loss function are weighted equally, or their weights are determined through heuristic approaches that yield near-optimal (or sub-optimal) results. To address this problem, the paper proposes SoftAdapt, a family of methods that dynamically adjust the weights of multi-part loss functions based on live performance statistics of each loss component.
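  The core idea in the abstract is to recompute the component weights from the recent behavior of each loss term rather than fixing them by hand. Below is a minimal sketch of that softmax-over-slopes scheme, assuming a finite-difference slope estimate over a short loss history; the function name, `beta` default, and history format are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softadapt_weights(loss_histories, beta=0.1, eps=1e-8):
    """Adaptive weights for a multi-part loss via a softmax over the
    recent rate of change of each component (a SoftAdapt-style sketch).

    loss_histories: list of 1-D sequences, one per loss component,
        holding that component's most recent values (oldest first).
    beta: temperature; beta > 0 shifts weight toward components whose
        loss is decreasing most slowly (or increasing).
    """
    # Estimate each component's slope with a first-order finite
    # difference averaged over its recent history.
    slopes = np.array([np.mean(np.diff(h)) for h in loss_histories])
    # Numerically stable softmax over the slopes.
    z = beta * (slopes - slopes.max())
    w = np.exp(z)
    return w / (w.sum() + eps)

# Hypothetical usage: two loss components tracked over five steps.
recon_hist = [0.90, 0.80, 0.72, 0.66, 0.61]   # steadily decreasing
sparse_hist = [0.50, 0.49, 0.49, 0.48, 0.48]  # nearly flat
w = softadapt_weights([recon_hist, sparse_hist])
# With beta > 0 the flat component receives slightly more weight,
# nudging optimization toward the term making the least progress.
total_loss = w[0] * recon_hist[-1] + w[1] * sparse_hist[-1]
```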