Number of the records: 1  

The EsnTorch Library: Efficient Implementation of Transformer-Based Echo State Networks

  1.
    0562588 - ÚI 2024 RIV SG eng C - Conference Paper (international conference)
    Cabessa, Jérémie - Hernault, H. - Lamonato, Y. - Rochat, M. - Levy, Y. Z.
    The EsnTorch Library: Efficient Implementation of Transformer-Based Echo State Networks.
    Neural Information Processing. 29th International Conference, ICONIP 2022, Proceedings, Part VII. Singapore: Springer, 2023 - (Tanveer, M.; Agarwal, A.; Ozawa, S.; Ekbal, A.; Jatowt, A.), pp. 235-246. Communications in Computer and Information Science, 1794. ISBN 978-981-99-1647-4. ISSN 1865-0929.
    [ICONIP 2022: The International Conference on Neural Information Processing /29./. Indore / Virtual (IN), 22.11.2022-26.11.2022]
    R&D Projects: GA ČR(CZ) GA22-02067S
    Institutional support: RVO:67985807
    Keywords : reservoir computing * echo state networks * natural language processing (NLP) * text classification * transformers * BERT * python library * Hugging Face
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    https://dx.doi.org/10.1007/978-981-99-1648-1_20

    Transformer-based models have revolutionized NLP. However, these models are generally highly resource-consuming. Motivated by this consideration, several reservoir computing approaches to NLP have shown promising results. In this context, we propose EsnTorch, a library that implements echo state networks (ESNs) with transformer-based embeddings for text classification. EsnTorch is developed in PyTorch, optimized to work on GPU, and compatible with the transformers and datasets libraries from Hugging Face, the major data science platform for NLP. Accordingly, our library can make use of all the models and datasets available from Hugging Face. A transformer-based ESN implemented in EsnTorch consists of four building blocks: (1) an embedding layer, which uses a transformer-based model to embed the input texts; (2) a reservoir layer, which implements three kinds of reservoirs: recurrent, linear, or null; (3) a pooling layer, which offers three pooling strategies: mean, last, or None; and (4) a learning algorithm block, which provides six different supervised learning algorithms. Overall, this work falls within the context of sustainable models for NLP.
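
    The following is a minimal illustrative sketch of the four-block pipeline described in the abstract, written in plain PyTorch with the Hugging Face transformers library. It is not the actual EsnTorch API: the class and function names, the choice of bert-base-uncased, and all hyperparameters (reservoir size, spectral radius, leak rate, number of classes) are assumptions made for this example. It combines a recurrent reservoir with mean pooling and a simple linear readout standing in for one of the library's learning algorithms.

```python
# Minimal sketch of the four building blocks described above: embedding,
# reservoir, pooling, and readout. This is NOT the EsnTorch API; all names
# and hyperparameters below are assumptions chosen for illustration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class RecurrentReservoir(nn.Module):
    """Fixed (untrained) recurrent reservoir driven by token embeddings."""

    def __init__(self, input_dim, reservoir_dim, spectral_radius=0.9, leak_rate=0.5):
        super().__init__()
        w_in = torch.empty(reservoir_dim, input_dim).uniform_(-0.1, 0.1)
        w = torch.empty(reservoir_dim, reservoir_dim).uniform_(-1.0, 1.0)
        # Rescale the recurrent weights to the target spectral radius, the
        # usual heuristic aimed at ensuring the echo state property.
        w = w * (spectral_radius / torch.linalg.eigvals(w).abs().max())
        self.register_buffer("w_in", w_in)  # buffers: reservoir weights stay frozen
        self.register_buffer("w", w)
        self.leak_rate = leak_rate

    def forward(self, embeddings):
        # embeddings: (seq_len, input_dim) tensor for a single text.
        x = embeddings.new_zeros(self.w.shape[0])
        states = []
        for u in embeddings:
            pre = self.w_in @ u + self.w @ x
            x = (1 - self.leak_rate) * x + self.leak_rate * torch.tanh(pre)
            states.append(x)
        return torch.stack(states)  # (seq_len, reservoir_dim)


# (1) Embedding layer: any Hugging Face encoder; bert-base-uncased is an
# arbitrary choice for this sketch.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
reservoir = RecurrentReservoir(input_dim=encoder.config.hidden_size, reservoir_dim=500)

text = "Echo state networks keep training cheap: only the readout is learned."
with torch.no_grad():
    inputs = tokenizer(text, return_tensors="pt")
    token_embeddings = encoder(**inputs).last_hidden_state[0]  # (seq_len, 768)
    states = reservoir(token_embeddings)       # (2) reservoir layer
    features = states.mean(dim=0)              # (3) 'mean' pooling strategy

# (4) Learning algorithm block: only this readout would be trained (e.g. by
# ridge regression on the pooled features); a plain linear layer stands in here.
readout = nn.Linear(features.shape[0], 2)  # 2 classes: an assumption
logits = readout(features)
print(logits.shape)  # torch.Size([2])
```

    Note that in this design only the readout is trainable: the transformer embeddings and reservoir weights are fixed, which is what makes ESN training far cheaper than fine-tuning the transformer and underlies the abstract's point about sustainable NLP models.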
    Permanent Link: https://hdl.handle.net/11104/0334891
