Controlling Model Complexity in Probabilistic Model-Based Dynamic Optimization of Neural Network Structures

@article{Saito2019ControllingMC,
  title={Controlling Model Complexity in Probabilistic Model-Based Dynamic Optimization of Neural Network Structures},
  author={S. Saito and S. Shirakawa},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.06341}
}
  • Computer Science, Mathematics
  • A method that simultaneously optimizes both the structure of a neural network and its connection weights in a single training loop can reduce the enormous computational cost of neural architecture search. [...] We formulate a penalty term using the number of weights or units and derive its analytical natural gradient. The proposed method minimizes the objective function with the injected penalty term by stochastic gradient descent.
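
The penalized objective described in the abstract can be illustrated with a small sketch. The following is a minimal, hypothetical NumPy example, not the authors' code: Bernoulli gate probabilities over candidate units are updated by natural gradient, where the loss term's natural gradient is estimated by Monte Carlo sampling and the penalty term lambda * sum(theta), proportional to the expected number of active units, has the closed-form natural gradient lambda * theta * (1 - theta) under the Bernoulli Fisher metric. All names and hyperparameters (toy_loss, lam, eta, D) are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# Bernoulli gates over D candidate units, natural-gradient update of the
# gate probabilities, with an analytical natural gradient for the penalty.
import numpy as np

rng = np.random.default_rng(0)

D = 8                      # number of candidate units (gates) -- assumption
lam = 0.05                 # penalty strength on expected active units -- assumption
eta = 0.1                  # learning rate for distribution parameters -- assumption
theta = np.full(D, 0.5)    # Bernoulli parameters: P(unit i is active)

def toy_loss(mask):
    """Stand-in for the network loss under a binary structure mask."""
    # Pretend only the first three units are useful; the rest add noise.
    useful = mask[:3].sum()
    return (3 - useful) ** 2 + 0.01 * rng.normal()

for step in range(200):
    # Sample candidate structures from the current distribution.
    masks = (rng.random((16, D)) < theta).astype(float)
    losses = np.array([toy_loss(m) for m in masks])

    # Monte Carlo estimate of the natural gradient of E[loss]:
    # F^{-1} E[loss * d(log p)/d(theta)] = E[(loss - baseline) * (m - theta)]
    # for a Bernoulli distribution, with a mean baseline for variance reduction.
    baseline = losses.mean()
    nat_grad_loss = ((losses - baseline)[:, None] * (masks - theta)).mean(axis=0)

    # Analytical natural gradient of the penalty lam * sum(theta):
    # F^{-1} * lam = lam * theta * (1 - theta) per gate, no sampling needed.
    nat_grad_penalty = lam * theta * (1 - theta)

    # Stochastic natural-gradient descent step on the penalized objective.
    theta -= eta * (nat_grad_loss + nat_grad_penalty)
    theta = np.clip(theta, 1e-3, 1 - 1e-3)  # keep probabilities valid

print("final activation probabilities:", np.round(theta, 2))
```

Subtracting the penalty's natural gradient steadily pushes each activation probability toward zero unless the sampled losses favor keeping that unit, which is the intended trade-off between task performance and model complexity.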
