Number of records: 1

Efficient Text Classification with Echo State Networks

  1.
    0542211 - ÚI 2022 RIV US eng C - Conference paper (international conference)
    Cabessa, Jérémie - Hernault, H. - Kim, H. - Lamonato, Y. - Levy, Y. Z.
    Efficient Text Classification with Echo State Networks.
    IJCNN 2021. The International Joint Conference on Neural Networks Proceedings. Piscataway: IEEE, 2021, pp. 1-8. ISBN 978-0-7381-3366-9.
    [IJCNN 2021: The International Joint Conference on Neural Networks /34./. Virtual (US), 18.07.2021-22.07.2021]
    CEP grant: GA ČR (CZ) GA19-05704S
    Institutional support: RVO:67985807
    Keywords: reservoir computing * echo state networks * natural language processing * text classification
    OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

    We consider echo state networks (ESNs) for text classification. More specifically, we investigate the learning capabilities of ESNs with pre-trained word embeddings as input features, trained on the IMDb and TREC datasets for sentiment and question classification, respectively. First, we introduce a customized training paradigm for processing multiple input time series (the input texts) associated with categorical targets (their corresponding classes). For sentiment tasks, we use an additional frozen attention mechanism based on an external lexicon, which requires only negligible computational cost. Within this paradigm, ESNs can be trained in tens of seconds on a GPU. We show that ESNs significantly outperform their Ridge regression baselines provided with the same embedded features. ESNs also compete with classical Bi-LSTM networks while training up to 23 times faster. These results show that ESNs are robust, efficient, and fast candidates for text classification tasks. Overall, this study falls within the context of lightweight, fast-to-train models for NLP. (A minimal illustrative code sketch of this training paradigm follows at the end of the record below.)
    Permanent link: http://hdl.handle.net/11104/0319682

     
    File name: 0542211-apre.pdf
    Size: 11.2 MB
    Version: Author's preprint
    Access: on request
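The training paradigm summarized in the abstract (a frozen ESN reservoir driven by word embeddings, with only a linear readout trained) can be illustrated with a short sketch. This is not the authors' implementation: the reservoir size, spectral radius, leak rate, the toy data, and the mean-pooling of reservoir states into a per-text feature vector are all illustrative assumptions, and the lexicon-based attention used for sentiment tasks is omitted.

```python
# Minimal illustrative sketch (not the authors' code): an echo state network
# text classifier with a frozen random reservoir and a ridge-regression readout.
# Reservoir size, spectral radius, leak rate, and mean-pooling are assumptions.
import numpy as np
from sklearn.linear_model import RidgeClassifier

rng = np.random.default_rng(0)

EMB_DIM, RES_DIM = 50, 200                           # assumed embedding / reservoir sizes
W_in = rng.uniform(-0.1, 0.1, (RES_DIM, EMB_DIM))    # fixed (untrained) input weights
W = rng.normal(0.0, 1.0, (RES_DIM, RES_DIM))         # fixed (untrained) recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # rescale spectral radius below 1

def reservoir_features(embedded_text, leak=0.3):
    """Run one text (a sequence of word-embedding vectors) through the frozen
    reservoir and mean-pool the states into a single feature vector."""
    x = np.zeros(RES_DIM)
    states = []
    for u in embedded_text:                          # u has shape (EMB_DIM,)
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x)
    return np.mean(states, axis=0)

# Toy data standing in for embedded texts: random sequences with binary labels.
texts = [rng.normal(size=(rng.integers(5, 30), EMB_DIM)) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.stack([reservoir_features(t) for t in texts])
clf = RidgeClassifier(alpha=1.0).fit(X, labels)      # only the linear readout is trained
print("training accuracy:", clf.score(X, labels))
```

In the setup described in the abstract, the inputs would instead be pre-trained word embeddings of the IMDb or TREC texts, and for sentiment tasks the uniform mean-pooling would be replaced by the frozen, lexicon-based attention weighting; only the linear readout is fitted, which is what keeps training to tens of seconds.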