Neural Architecture Search Benchmarks: Insights and Survey
Date
2023-03-08
Authors
Krishna Teja Chitty-Venkata, Murali Emani, Venkatram Vishwanath, Arun K. Somani
Journal Title
IEEE Access
Publisher
IEEE
Abstract
Neural Architecture Search (NAS), a promising and fast-moving research field, aims to automate the architectural design of Deep Neural Networks (DNNs) to achieve better performance on a given task and dataset. NAS methods have been very successful in discovering efficient models for various Computer Vision and Natural Language Processing tasks. The major obstacles to the advancement of NAS techniques are the demand for large computational resources and the fair evaluation of different search methods: differences in training pipelines and settings make it challenging to compare the efficiency of two NAS algorithms. To ease the computational burden of training neural networks and to aid the unbiased assessment of different search methods, a large number of NAS Benchmarks, which simulate architecture evaluation in seconds, have been released over the last few years. This paper provides an extensive review of several publicly available NAS Benchmarks in the literature. We provide technical details and a deeper understanding of each benchmark and point out future directions.
Type
article
Comments
This article is published as Chitty-Venkata, Krishna Teja, Murali Emani, Venkatram Vishwanath, and Arun K. Somani. "Neural Architecture Search Benchmarks: Insights and Survey." IEEE Access 11 (2023): 25217-25236.
DOI: 10.1109/ACCESS.2023.3253818.
Copyright 2023 The Author(s).
Attribution 4.0 International (CC BY 4.0).
Posted with permission.