| Literature DB >> 35230956 |
Chen Wei, Chuang Niu, Yiping Tang, Yue Wang, Haihong Hu, Jimin Liang.
Abstract
Neural architecture search (NAS) adopts a search strategy to explore a predefined search space and find superior architectures at minimal search cost. Bayesian optimization (BO) and evolutionary algorithms (EA) are two commonly used search strategies, but they are computationally expensive, challenging to implement, and explore the search space inefficiently. In this article, we propose a neural predictor guided EA to enhance the exploration ability of EA for NAS (NPENAS) and design two kinds of neural predictors. The first predictor is a BO acquisition function for which we design a graph-based uncertainty estimation network as the surrogate model. The second predictor is a graph-based neural network that directly predicts the performance of the input neural architecture. The NPENAS variants using the two neural predictors are denoted NPENAS-BO and NPENAS-NP, respectively. In addition, we introduce a new random architecture sampling method to overcome the drawbacks of the existing sampling method. Experimental results on five NAS search spaces indicate that NPENAS-BO and NPENAS-NP outperform most existing NAS algorithms, with NPENAS-NP achieving state-of-the-art performance on four of the five search spaces.
Year: 2022 PMID: 35230956 DOI: 10.1109/TNNLS.2022.3151160
Source DB: PubMed Journal: IEEE Trans Neural Netw Learn Syst ISSN: 2162-237X Impact factor: 10.451
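The search procedure the abstract describes, an evolutionary loop in which a cheap neural predictor ranks mutated candidate architectures so that only the most promising ones are actually trained, can be sketched roughly as follows. Everything here is illustrative, not the authors' implementation: the toy operation-list search space, the synthetic accuracy function, and the nearest-neighbour stand-in for the paper's graph-based neural predictor are all assumptions made for the sketch.

```python
import random

# Toy search space: an architecture is a tuple of 5 operation choices.
OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def sample_architecture():
    return tuple(random.choice(OPS) for _ in range(5))

def mutate(arch):
    # Replace one randomly chosen position with a different operation.
    i = random.randrange(len(arch))
    new = list(arch)
    new[i] = random.choice([op for op in OPS if op != arch[i]])
    return tuple(new)

def true_accuracy(arch):
    # Stand-in for expensive training/evaluation: rewards conv3x3 ops.
    return arch.count("conv3x3") / len(arch)

def predictor(arch, history):
    # Stand-in predictor: nearest-neighbour estimate from evaluated
    # architectures (the paper uses a graph neural network instead).
    if not history:
        return 0.0
    def dist(a, b):
        return sum(x != y for x, y in zip(a, b))
    nearest = min(history, key=lambda item: dist(item[0], arch))
    return nearest[1]

def predictor_guided_ea(iterations=10, pop_size=8,
                        candidates_per_parent=4, top_k=2):
    population = [sample_architecture() for _ in range(pop_size)]
    history = [(a, true_accuracy(a)) for a in population]
    for _ in range(iterations):
        # 1) Generate many mutated candidates from the current population.
        candidates = [mutate(p) for p in population
                      for _ in range(candidates_per_parent)]
        # 2) Rank the candidates cheaply with the predictor.
        candidates.sort(key=lambda a: predictor(a, history), reverse=True)
        # 3) Truly evaluate only the top-k predicted candidates.
        for arch in candidates[:top_k]:
            history.append((arch, true_accuracy(arch)))
        # 4) Keep the best architectures found so far as the new population.
        history.sort(key=lambda item: item[1], reverse=True)
        population = [a for a, _ in history[:pop_size]]
    return max(history, key=lambda item: item[1])

random.seed(0)
best_arch, best_acc = predictor_guided_ea()
print(best_arch, best_acc)
```

The key cost saving is step 3: the predictor filters dozens of mutated candidates down to a handful that receive a full (expensive) evaluation per generation, which is the role NPENAS assigns to its graph-based predictors.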