Number of the records: 1  

Graph Embedding for Neural Architecture Search with Input-Output Information

  1.
    0560713 - ÚI 2023 RIV US eng C - Conference Paper (international conference)
    Suchopárová, Gabriela - Neruda, Roman
    Graph Embedding for Neural Architecture Search with Input-Output Information.
    Auto-ML Conf 2022: Accepted Papers: Late-Breaking Workshop. Baltimore: AutoML Conference, 2022.
    [Auto-ML 2022: International Conference on Automated Machine Learning /1./. Baltimore (US), 25.07.2022-27.07.2022]
    Grant - others: Ministry of Education, Youth and Sports - GA MŠk(CZ) LM2018140
    Institutional support: RVO:67985807
    Keywords: machine learning * neural architecture search * meta-learning * graph neural networks * representation learning
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

    BASIC INFORMATION: Auto-ML Conf 2022: Accepted Papers: Late-Breaking Workshop. Baltimore: AutoML Conference, 2022.
    CONFERENCE: Auto-ML 2022: International Conference on Automated Machine Learning /1./. Baltimore (US), 25.07.2022-27.07.2022.
    ABSTRACT: Graph representation learning has been widely used in neural architecture search as a part of performance prediction models. Existing works have focused mostly on neural graph similarity without considering functionally similar networks with different architectures. In this work, we address this issue by using meta-information about the input images and output features of a particular neural network. We extend the arch2vec model, a graph variational autoencoder for neural architecture search, to learn from this novel kind of data in a semi-supervised manner. We demonstrate our approach on the NAS-Bench-101 search space and the CIFAR-10 dataset, and compare our model with the original arch2vec on a REINFORCE search task and a performance prediction task. We also present a semi-supervised accuracy predictor, and we discuss the advantages of both variants. The results are competitive with the original model and show improved performance.
    Permanent Link: https://hdl.handle.net/11104/0333566

     
    File: 0560713-aoa.pdf
    Size: 4673.3 KB
    Commentary: OA CC BY 4.0 (in the article)
    Version: Publisher's postprint
    Access: open-access
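
The abstract above outlines the approach at a high level: arch2vec, a graph variational autoencoder over NAS-Bench-101-style architecture graphs (adjacency matrix plus one-hot operation labels), is extended so that an auxiliary predictor can be trained in a semi-supervised way alongside the reconstruction objective. The PyTorch sketch below only illustrates that general setup; the class, layer sizes, aggregation scheme, and loss weighting are assumptions for illustration and do not reproduce the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphVAEWithPredictor(nn.Module):
    """Hypothetical sketch: graph VAE over (adjacency, one-hot ops) with an auxiliary head."""

    def __init__(self, n_nodes=7, n_ops=5, hidden=128, latent=16):
        super().__init__()
        self.n_nodes, self.n_ops = n_nodes, n_ops
        self.enc1 = nn.Linear(n_ops, hidden)                  # per-node operation embedding
        self.enc2 = nn.Linear(hidden, hidden)                 # GCN-like aggregation step
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)
        self.dec_adj = nn.Linear(latent, n_nodes * n_nodes)   # reconstruct adjacency logits
        self.dec_ops = nn.Linear(latent, n_nodes * n_ops)     # reconstruct operation logits
        # auxiliary head, e.g. accuracy prediction from the latent architecture embedding
        self.head = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def encode(self, adj, ops):
        h = F.relu(self.enc1(ops))                            # (batch, nodes, hidden)
        h = F.relu(self.enc2(adj @ h + h))                    # mix each node with its neighbours
        h = h.mean(dim=1)                                     # graph-level readout
        return self.mu(h), self.logvar(h)

    def forward(self, adj, ops):
        mu, logvar = self.encode(adj, ops)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation trick
        adj_rec = self.dec_adj(z).view(-1, self.n_nodes, self.n_nodes)
        ops_rec = self.dec_ops(z).view(-1, self.n_nodes, self.n_ops)
        return adj_rec, ops_rec, mu, logvar, self.head(mu).squeeze(-1)


def semi_supervised_loss(model, adj, ops, target=None):
    """VAE terms for every graph; the regression term only where a label is available."""
    adj_rec, ops_rec, mu, logvar, pred = model(adj, ops)
    rec = F.binary_cross_entropy_with_logits(adj_rec, adj) \
        + F.cross_entropy(ops_rec.reshape(-1, model.n_ops), ops.argmax(-1).reshape(-1))
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    loss = rec + kld
    if target is not None:                                    # labelled (supervised) subset only
        loss = loss + F.mse_loss(pred, target)
    return loss


# Usage sketch: adj is a float 0/1 tensor of shape (batch, 7, 7), ops a one-hot float
# tensor of shape (batch, 7, 5); target holds accuracies (or other meta-information
# targets) for the labelled part of the batch, and is omitted for unlabelled graphs.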
     