Corpus ID: 237264116

NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search

@article{Tu2021NASBench360BD,
  title={NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search},
  author={Renbo Tu and Mikhail Khodak and Nicholas Roberts and Ameet S. Talwalkar},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.05668}
}
Most existing neural architecture search (NAS) benchmarks and algorithms prioritize performance on well-studied tasks, focusing on computer vision datasets such as CIFAR and ImageNet. However, the applicability of NAS approaches in other areas is not adequately understood. In this paper, we present NAS-Bench-360, a benchmark suite for evaluating state-of-the-art NAS methods on less-explored datasets. To do this, we organize a diverse array of tasks, from classification of simple…

Citations

EmProx: Neural Network Performance Estimation For Neural Architecture Search

Performance estimates of this method are comparable in accuracy to the MLP performance predictor used in NAO, while being nearly nine times faster to train.
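To give a concrete sense of what an MLP-based performance predictor involves, here is a minimal sketch that trains a small regressor from architecture encodings to validation accuracy and uses it to rank unseen candidates. The encoding dimensionality, data, and model sizes are hypothetical, not EmProx's or NAO's actual setup.

```python
# Hypothetical sketch of an MLP performance predictor: map a fixed-length
# architecture encoding to predicted validation accuracy. The encoding scheme
# and data are invented for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Pretend each architecture is encoded as a 32-dim vector (e.g., flattened
# adjacency matrix plus one-hot operation choices).
X = rng.random((500, 32))                                    # architecture encodings
y = 0.9 - 0.1 * X.mean(axis=1) + rng.normal(0, 0.01, 500)    # fake accuracies

predictor = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
predictor.fit(X[:400], y[:400])

# Rank unseen candidates by predicted accuracy instead of training each one,
# which is the point of predictor-based NAS.
preds = predictor.predict(X[400:])
top5 = np.argsort(preds)[::-1][:5]
print("top-5 candidates by predicted accuracy:", top5)
```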

On Redundancy and Diversity in Cell-based Neural Architecture Search

An empirical post-hoc analysis of architectures from popular cell-based search spaces finds that the existing search spaces contain a high degree of redundancy: architecture performance is minimally sensitive to changes in large parts of the cells, and universally adopted designs significantly increase complexity while having very limited impact on performance.

Arch-Graph: Acyclic Architecture Relation Predictor for Task-Transferable Neural Architecture Search

This work proposes Arch-Graph, a transferable NAS method that predicts task-specific optimal architectures with respect to given task embeddings, and shows its transferability and high sample efficiency across numerous tasks, beating many NAS methods designed for both single-task and multi-task search.

References

SHOWING 1-10 OF 64 REFERENCES

NAS-Bench-101: Towards Reproducible Neural Architecture Search

This work introduces NAS-Bench-101, the first public architecture dataset for NAS research, which allows researchers to evaluate the quality of a diverse range of models in milliseconds by querying the pre-computed dataset.
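To convey how a pre-computed tabular benchmark makes evaluation near-instant, the following self-contained sketch models it as a lookup table keyed by an architecture's cell specification. The spec format and metric names are invented for illustration; this is not the actual NAS-Bench-101 API or schema.

```python
# Minimal illustration of a tabular NAS benchmark: architectures map to
# pre-computed training results, so "evaluating" one is a dictionary lookup.
from typing import Dict, Tuple

# Key: (flattened upper-triangular adjacency, operation sequence) -- hypothetical format.
ArchSpec = Tuple[Tuple[int, ...], Tuple[str, ...]]

table: Dict[ArchSpec, Dict[str, float]] = {
    ((0, 1, 1, 0, 1, 1), ("conv3x3", "conv1x1", "maxpool3x3")): {
        "val_accuracy": 0.943, "test_accuracy": 0.937, "train_time_s": 1650.0,
    },
    ((0, 1, 0, 1, 1, 1), ("conv3x3", "conv3x3", "conv1x1")): {
        "val_accuracy": 0.951, "test_accuracy": 0.946, "train_time_s": 1710.0,
    },
}

def query(spec: ArchSpec) -> Dict[str, float]:
    """Return pre-computed metrics for an architecture in milliseconds."""
    return table[spec]

print(query(((0, 1, 0, 1, 1, 1), ("conv3x3", "conv3x3", "conv1x1"))))
```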

NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search

This work proposes NAS-Bench-201, an extension of NAS-Bench-101 with a different search space, results on multiple datasets, and additional diagnostic information such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms.

NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search

This work introduces a general framework for one-shot NAS that can be instantiated to many recently introduced variants, together with a general benchmarking framework that draws on the recent large-scale tabular benchmark NAS-Bench-101 for cheap anytime evaluations of one-shot NAS methods.

TransNAS-Bench-101: Improving Transferability and Generalizability of Cross-Task Neural Architecture Search

This work proposes TransNAS-Bench-101, a benchmark dataset containing network performance across seven tasks, covering classification, regression, pixel-level prediction, and self-supervised tasks, and explores two fundamentally different types of search space: a cell-level search space and a macro-level search space.

NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing

This work steps outside the computer vision domain by leveraging the language modeling task, which is the core of natural language processing (NLP), and anticipates that the benchmark will provide more reliable empirical findings for the community and stimulate progress in developing new NAS methods well suited for recurrent architectures.

Random Search and Reproducibility for Neural Architecture Search

This work proposes new NAS baselines that build on the following observations: (i) NAS is a specialized hyperparameter optimization problem; and (ii) random search is a competitive baseline for hyperparameter optimization.
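The random-search baseline referenced above is simple enough to state in a few lines: sample architectures uniformly from the search space, evaluate each, and keep the best. In the sketch below, the search space and the evaluation function are placeholders, not any paper's exact setup.

```python
# Hedged sketch of random search as a NAS baseline.
import random

OPS = ["conv3x3", "conv1x1", "maxpool3x3", "skip"]
NUM_EDGES = 6  # e.g., edges of a small cell DAG

def sample_architecture() -> tuple:
    """Draw one operation per edge uniformly at random."""
    return tuple(random.choice(OPS) for _ in range(NUM_EDGES))

def evaluate(arch: tuple) -> float:
    """Placeholder for training + validation; in practice this is the
    expensive step that tabular benchmarks replace with a lookup."""
    return random.random()

def random_search(budget: int = 100) -> tuple:
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture()
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

print(random_search(budget=20))
```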

Geometry-Aware Gradient Algorithms for Neural Architecture Search

A geometry-aware framework is presented that exploits the underlying structure of the NAS optimization problem to return sparse architectural parameters, leading to simple yet novel algorithms that enjoy fast convergence guarantees and achieve state-of-the-art accuracy on the latest NAS benchmarks in computer vision.

ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware

ProxylessNAS is presented, which directly learns architectures for large-scale target tasks and target hardware platforms; it is applied to specialize neural architectures for hardware using direct hardware metrics (e.g., latency) and provides insights for efficient CNN architecture design.
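One way to read "direct hardware metrics" is as an extra differentiable term in the search objective. The sketch below adds an expected-latency penalty over candidate operations; the per-op latencies, the weighting, and the stand-in task loss are all invented for illustration, and this is not the ProxylessNAS implementation itself.

```python
# Illustrative sketch: make latency part of the search objective by taking
# the expectation of per-op latencies under the architecture distribution.
import torch
import torch.nn.functional as F

op_latency_ms = torch.tensor([1.2, 0.4, 0.8, 0.1])  # hypothetical per-op latencies
alpha = torch.zeros(4, requires_grad=True)           # architecture logits

def expected_latency(alpha: torch.Tensor) -> torch.Tensor:
    """Differentiable expected latency under softmax(alpha)."""
    return (F.softmax(alpha, dim=-1) * op_latency_ms).sum()

task_loss = torch.tensor(0.75)   # stand-in for cross-entropy on a batch
lam = 0.01                       # hypothetical latency weight
loss = task_loss + lam * expected_latency(alpha)
loss.backward()
print("gradient w.r.t. architecture logits:", alpha.grad)
```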

DARTS: Differentiable Architecture Search

The proposed algorithm excels in discovering high-performance convolutional architectures for image classification and recurrent architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques.
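The core mechanism behind differentiable search is a "mixed" operation whose output is a softmax-weighted sum of candidate operations, so architecture parameters receive gradients alongside the network weights. The sketch below shows that relaxation with a toy set of ops; it is an assumption-level illustration, not the official DARTS code.

```python
# Minimal sketch of DARTS-style continuous relaxation: an edge's output is a
# softmax-weighted mixture of candidate ops, so the architecture parameters
# (alpha) are trainable by gradient descent. Toy candidate ops only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 1),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        # One architecture parameter per candidate op on this edge.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=-1)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

edge = MixedOp(channels=8)
out = edge(torch.randn(2, 8, 16, 16))
out.mean().backward()
print("alpha gradients:", edge.alpha.grad)
```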

Learning Transferable Architectures for Scalable Image Recognition

This paper proposes to search for an architectural building block on a small dataset and then transfer the block to a larger dataset, and introduces a new regularization technique called ScheduledDropPath that significantly improves generalization in the NASNet models.
...