
DNA Family: Boosting Weight-Sharing NAS With Block-Wise Supervisions


Abstract:

Neural Architecture Search (NAS), aiming at automatically designing neural architectures by machines, has been considered a key step toward automatic machine learning. One notable NAS branch is the weight-sharing NAS, which significantly improves search efficiency and allows NAS algorithms to run on ordinary computers. Despite receiving high expectations, this category of methods suffers from low search effectiveness. By employing a generalization boundedness tool, we demonstrate that the devil behind this drawback is the untrustworthy architecture rating with the oversized search space of the possible architectures. Addressing this problem, we modularize a large search space into blocks with small search spaces and develop a family of models with the distilling neural architecture (DNA) techniques. These proposed models, namely a DNA family, are capable of resolving multiple dilemmas of the weight-sharing NAS, such as scalability, efficiency, and multi-modal compatibility. Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using heuristic algorithms. Moreover, under a certain computational complexity constraint, our method can seek architectures with different depths and widths. Extensive experimental evaluations show that our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively. Additionally, we provide in-depth empirical analysis and insights into neural architecture ratings.
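
As a rough illustration of the block-wise idea, the following is a minimal PyTorch-style sketch, not the authors' released code: a small supernet is split into blocks, each block's candidate operations are trained to mimic the corresponding teacher block's output features, and a candidate architecture is rated by summing its per-block distillation errors. All names here (StudentBlock, distill_block, rate_architecture) and the two-block toy teacher are hypothetical, introduced only for illustration.

# Illustrative sketch only -- not the authors' code. It mirrors the idea of
# block-wise supervision: each supernet block is distilled against the
# matching teacher block, and a candidate is rated by its summed block errors.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_bn_relu(c_in, c_out, k):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, k, padding=k // 2, bias=False),
        nn.BatchNorm2d(c_out),
        nn.ReLU(inplace=True),
    )

class StudentBlock(nn.Module):
    """One block of the supernet: a small set of candidate operations."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.candidates = nn.ModuleList(
            [conv_bn_relu(c_in, c_out, 3), conv_bn_relu(c_in, c_out, 5)]
        )

    def forward(self, x, choice):
        return self.candidates[choice](x)

# A hypothetical two-block teacher whose feature maps supervise the students.
teacher_blocks = nn.ModuleList([conv_bn_relu(3, 16, 3), conv_bn_relu(16, 32, 3)]).eval()
student_blocks = nn.ModuleList([StudentBlock(3, 16), StudentBlock(16, 32)])

def teacher_features(x, block_idx):
    """Teacher features that feed block `block_idx`, plus its target output."""
    with torch.no_grad():
        for b in teacher_blocks[:block_idx]:
            x = b(x)
        return x, teacher_blocks[block_idx](x)

def distill_block(x, block_idx, choice, optimizer):
    """Train one candidate of one block to mimic the teacher block's output."""
    t_in, t_out = teacher_features(x, block_idx)
    loss = F.mse_loss(student_blocks[block_idx](t_in, choice), t_out)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def rate_architecture(x, choices):
    """Score a candidate (one choice per block); lower summed error is better."""
    score = 0.0
    for i, choice in enumerate(choices):
        t_in, t_out = teacher_features(x, i)
        with torch.no_grad():
            score += F.mse_loss(student_blocks[i](t_in, choice), t_out).item()
    return score

# Usage: distill block 0's second candidate for one step, then rate a path.
x = torch.randn(2, 3, 32, 32)
opt = torch.optim.SGD(student_blocks[0].parameters(), lr=0.01)
distill_block(x, block_idx=0, choice=1, optimizer=opt)
print(rate_architecture(x, choices=[0, 1]))

Because each block's search space is small, every candidate within a block can be compared against the teacher's features, which is the sense in which the DNA models can rate all architecture candidates rather than only a heuristically sampled subset.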
Page(s): 2722 - 2740
Date of Publication: 21 November 2023

PubMed ID: 37988208


I. Introduction

Neural architecture search (NAS) [1], which aims to replace human experts with machines in designing neural architectures, has been widely anticipated. Typical works include reinforcement learning approaches [2], [3], evolutionary algorithms [4], [5], and Bayesian methods [6], [7]. These methods require multiple trials (i.e., training many architectures separately to assess their quality), which is computationally unaffordable for many researchers. Recent weight-sharing NAS solutions encode the search space into a weight-sharing supernet and train all architectures in the supernet at once, significantly improving search efficiency.
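
To make this paradigm concrete, below is a minimal, purely illustrative sketch of a weight-sharing supernet trained with single-path uniform sampling; the names (MixedOp, Supernet, the toy candidate operations) are assumptions for illustration, not this paper's implementation.

# A minimal sketch of weight-sharing NAS, assuming single-path uniform
# sampling over a toy search space; names are illustrative only.
import random
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    """One searchable layer: candidate ops share this layer's input/output."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),
        ])

    def forward(self, x, choice):
        return self.ops[choice](x)

class Supernet(nn.Module):
    """Encodes the whole search space; each sampled path is one architecture."""
    def __init__(self, channels=16, num_layers=4, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.layers = nn.ModuleList([MixedOp(channels) for _ in range(num_layers)])
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, path):
        x = self.stem(x)
        for layer, choice in zip(self.layers, path):
            x = layer(x, choice)
        return self.head(x.mean(dim=(2, 3)))  # global average pooling

supernet = Supernet()
optimizer = torch.optim.SGD(supernet.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

# One training step: sample a random architecture (path) and update only the
# weights it touches, so all candidates are trained "at once" via sharing.
images, labels = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
path = [random.randrange(3) for _ in supernet.layers]
loss = criterion(supernet(images, path), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

Every sampled path reuses the same supernet parameters, which is the source of the efficiency gain; as the paper argues, it is also why architecture ratings derived from such a supernet become unreliable when the search space is oversized.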

