TY - JOUR
T1 - Zero-Shot Proxy with Incorporated-Score for Lightweight Deep Neural Architecture Search
AU - Nguyen, Thi Trang
AU - Han, Ji Hyeong
N1 - Publisher Copyright:
© 2024 by the authors.
PY - 2024/8
Y1 - 2024/8
N2 - Designing a high-performance neural network is a difficult task. Neural architecture search (NAS) methods aim to automate this process. However, the construction of a high-quality accuracy predictor, which is a key component of NAS, usually requires significant computation. Therefore, zero-shot proxy-based NAS methods have been actively and extensively investigated. In this work, we propose a new efficient zero-shot proxy, Incorporated-Score, to rank deep neural network architectures instead of using an accuracy predictor. The proposed Incorporated-Score proxy is generated by incorporating the Zen-score and entropy information of the network, and it does not need to train any network. We then introduce an optimal NAS algorithm called Incorporated-NAS that maximizes the Incorporated-Score of the neural network within the specified inference budgets. The experiments show that the network designed by Incorporated-NAS with Incorporated-Score outperforms the previously proposed Zen-NAS and achieves a new SOTA accuracy on the CIFAR-10, CIFAR-100, and ImageNet datasets at a lightweight scale.
AB - Designing a high-performance neural network is a difficult task. Neural architecture search (NAS) methods aim to automate this process. However, the construction of a high-quality accuracy predictor, which is a key component of NAS, usually requires significant computation. Therefore, zero-shot proxy-based NAS methods have been actively and extensively investigated. In this work, we propose a new efficient zero-shot proxy, Incorporated-Score, to rank deep neural network architectures instead of using an accuracy predictor. The proposed Incorporated-Score proxy is generated by incorporating the Zen-score and entropy information of the network, and it does not need to train any network. We then introduce an optimal NAS algorithm called Incorporated-NAS that maximizes the Incorporated-Score of the neural network within the specified inference budgets. The experiments show that the network designed by Incorporated-NAS with Incorporated-Score outperforms the previously proposed Zen-NAS and achieves a new SOTA accuracy on the CIFAR-10, CIFAR-100, and ImageNet datasets at a lightweight scale.
KW - efficient model
KW - neural architecture search (NAS)
KW - zero-shot NAS
UR - https://www.scopus.com/pages/publications/85202681658
U2 - 10.3390/electronics13163325
DO - 10.3390/electronics13163325
M3 - Article
AN - SCOPUS:85202681658
SN - 2079-9292
VL - 13
JO - Electronics (Switzerland)
JF - Electronics (Switzerland)
IS - 16
M1 - 3325
ER -