Corpus ID: 235765584

Bag of Tricks for Neural Architecture Search

Thomas Elsken, Benedikt Sebastian Staffler, Arber Zela, Jan Hendrik Metzen, and Frank Hutter
While neural architecture search methods have been successful in recent years and led to new state-of-the-art performance on various problems, they have also been criticized for being unstable, highly sensitive to their hyperparameters, and often no better than random search. To shed some light on these issues, we discuss practical considerations that help improve the stability, efficiency, and overall performance of neural architecture search.

NeuralArTS: Structuring Neural Architecture Search with Type Theory

A new framework called Neural Architecture Type System (NeuralArTS) is presented that categorizes the infinite set of network operations in a structured type system, shows how it can be applied to convolutional layers, and proposes several future directions.

Training BatchNorm Only in Neural Architecture Search and Beyond

This work proposes a novel composite performance indicator that evaluates networks from three perspectives derived from the theoretical properties of BatchNorm: expressivity, trainability, and uncertainty. It also empirically discloses that a train-BN-only supernet gives convolutions an advantage over other operators, causing unfair competition between architectures.

Best Practices for Scientific Research on Neural Architecture Search

A set of possible issues and ways to avoid them are described, leading to the NAS best practices checklist available at this http URL.

Neural Architecture Search: A Survey

An overview of existing work in this field of research is provided and neural architecture search methods are categorized according to three dimensions: search space, search strategy, and performance estimation strategy.

Neural Architecture Generator Optimization

This work is the first to investigate casting NAS as a problem of finding the optimal network generator and proposes a new, hierarchical and graph-based search space capable of representing an extremely large variety of network types, yet only requiring few continuous hyper-parameters.

Understanding and Simplifying One-Shot Architecture Search

With careful experimental analysis, it is shown that it is possible to efficiently identify promising architectures from a complex search space without either hypernetworks or reinforcement learning controllers.

A Survey on Neural Architecture Search

This survey provides a formalism which unifies and categorizes the landscape of existing methods along with a detailed analysis that compares and contrasts the different approaches.

DARTS: Differentiable Architecture Search

The proposed algorithm excels in discovering high-performance convolutional architectures for image classification and recurrent architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques.
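
The key idea in DARTS is to relax the discrete choice among candidate operations into a softmax-weighted mixture governed by continuous architecture parameters, which can then be optimized by gradient descent. A minimal toy sketch of that relaxation follows; the scalar "features" and the three-operation set are illustrative placeholders, not the paper's actual search space:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of architecture logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy candidate operations (stand-ins for conv / pool / skip choices).
OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def mixed_op(x, alphas):
    """DARTS-style continuous relaxation: every candidate operation is
    applied and the outputs are blended with softmax(alphas), so the
    architecture choice becomes differentiable in the alphas."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, OPS.values()))

def discretize(alphas):
    """After search, keep only the operation with the largest weight."""
    names = list(OPS)
    return names[max(range(len(alphas)), key=lambda i: alphas[i])]

# With equal logits, each op gets weight 1/3: (3 + 6 + 0) / 3 = 3.0.
y = mixed_op(3.0, [0.0, 0.0, 0.0])
```

In the actual method the alphas are trained jointly with network weights in a bilevel optimization, and the final discrete architecture is read off per edge as in `discretize`.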

NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search

This work proposes NAS-Bench-201, an extension of NAS-Bench-101 with a different search space, results on multiple datasets, and additional diagnostic information such as fine-grained loss and accuracy, which can give inspiration to new designs of NAS algorithms.

Evaluating the Search Phase of Neural Architecture Search

This paper finds that, on average, state-of-the-art NAS algorithms perform similarly to the random policy, and that the widely used weight-sharing strategy degrades the ranking of NAS candidates to the point of not reflecting their true performance, thus reducing the effectiveness of the search process.
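
The random policy such comparisons are made against can be sketched in a few lines; the search space and scoring function below are toy placeholders under my own assumptions, not the paper's evaluation setup:

```python
import random

def random_search(search_space, evaluate, n_trials=20, seed=0):
    """Random-policy NAS baseline: sample architectures uniformly from the
    search space and keep the best one under the evaluation function."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        # An architecture is one operation choice per position.
        arch = tuple(rng.choice(choices) for choices in search_space)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

# Toy space: 3 positions, 3 candidate ops each; toy score favors "conv".
space = [["conv", "pool", "skip"]] * 3
arch, s = random_search(space, lambda a: a.count("conv"))
```

Despite its simplicity, this baseline is what more elaborate search strategies must beat, which is precisely the comparison the paper argues is often missing.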

Shake-Shake regularization

The method introduced in this paper aims at helping deep learning practitioners faced with an overfit problem. The idea is to replace, in a multi-branch network, the standard summation of parallel branches with a stochastic affine combination.
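
Concretely, each training forward pass blends the two residual branches with a freshly drawn random coefficient instead of a fixed 0.5/0.5 sum. A minimal sketch on plain lists follows; the real method operates on convolutional branch tensors and also draws independent coefficients for the backward pass, which this sketch omits:

```python
import random

def shake_shake(branch1, branch2, training=True):
    """Combine two parallel branch outputs. During training, use a random
    convex combination alpha * b1 + (1 - alpha) * b2 with alpha ~ U(0, 1);
    at test time, fall back to the deterministic average (alpha = 0.5)."""
    alpha = random.random() if training else 0.5
    return [alpha * a + (1.0 - alpha) * b for a, b in zip(branch1, branch2)]
```

The per-pass randomness acts as a regularizer: the network cannot rely on any fixed mixture of the branches, much like dropout prevents reliance on individual units.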

NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search

This work introduces a general framework for one-shot NAS that can be instantiated to many recently introduced variants, along with a general benchmarking framework that draws on the recent large-scale tabular benchmark NAS-Bench-101 for cheap anytime evaluations of one-shot NAS methods.