A Generic Graph-Based Neural Architecture Encoding Scheme With Multifaceted Information.
IEEE Trans Pattern Anal Mach Intell; 45(7): 7955-7969, 2023 Jul.
Article
| MEDLINE
| ID: mdl-37015374
Neural architecture search (NAS) can automatically discover well-performing architectures in a large search space and has been shown to bring improvements to various applications. However, the computational burden of NAS is enormous, since exploring a large search space can require evaluating thousands of architecture samples. To improve the sample efficiency of search-space exploration, predictor-based NAS methods learn a performance predictor of architectures and use it to select worth-evaluating architectures. The encoding scheme of NN architectures is crucial to the predictor's generalization ability, and thus to the efficacy of the NAS process. To this end, we have designed a generic Graph-based neural ArchiTecture Encoding Scheme (GATES), a more reasonable modeling of NN architectures that mimics their data processing. Nevertheless, GATES is unaware of the concrete computing semantics of NN operations or architectures; thus, the learning of operation embeddings and weights in GATES can only exploit the information in architecture-performance pairs. We propose GATES++, which incorporates multifaceted information about NNs' operation-level and architecture-level computing semantics into its construction and training, respectively. Experiments on benchmark search spaces show that the operation-level and architecture-level information each bring improvements on their own, and that GATES++ discovers better architectures after evaluating the same number of architectures.
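The predictor-based NAS loop the abstract describes can be sketched as follows. This is a minimal, self-contained illustration, not the authors' implementation: the cell representation, the operation set, and `toy_predictor` (a fixed heuristic standing in for a learned predictor such as GATES) are all hypothetical.

```python
import random

# Illustrative operation set for a cell-based search space (assumed names).
OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def sample_architecture(rng, num_nodes=4):
    """Randomly sample a cell: each node picks an op and one earlier node as input."""
    return [(rng.choice(OPS), rng.randrange(i + 1)) for i in range(num_nodes)]

def toy_predictor(arch):
    """Stand-in for a learned performance predictor.

    A real predictor-based NAS method would train this on
    architecture-performance pairs; here a fixed scoring heuristic
    keeps the example self-contained and deterministic.
    """
    op_score = {"conv3x3": 3.0, "conv1x1": 2.0, "maxpool": 1.0, "skip": 0.5}
    return sum(op_score[op] for op, _ in arch) / len(arch)

def predictor_based_search(num_candidates=200, eval_budget=5, seed=0):
    """Cheaply score many candidates, keep only the top few for true evaluation.

    The expensive step (training each architecture to measure its accuracy)
    is applied only to `eval_budget` architectures instead of all candidates,
    which is the sample-efficiency gain the abstract refers to.
    """
    rng = random.Random(seed)
    candidates = [sample_architecture(rng) for _ in range(num_candidates)]
    ranked = sorted(candidates, key=toy_predictor, reverse=True)
    return ranked[:eval_budget]

top = predictor_based_search()
print(len(top))  # number of architectures selected for expensive evaluation
```

The key design point is that the predictor is orders of magnitude cheaper than a true evaluation, so ranking hundreds of candidates costs roughly as much as a single real training run.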
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Study type: Prognostic_studies
Language: En
Journal: IEEE Trans Pattern Anal Mach Intell
Journal subject: Medical Informatics
Year: 2023
Document type: Article
Country of publication: United States