Number of records: 1

Argument Classification with BERT plus Contextual, Structural and Syntactic Features as Text

  1.
    0562585 - ÚI 2024 RIV SG eng C - Conference paper (foreign conference)
    Mushtaq, U. - Cabessa, Jérémie
    Argument Classification with BERT plus Contextual, Structural and Syntactic Features as Text.
    Neural Information Processing. 29th International Conference, ICONIP 2022, Proceedings, Part VII. Singapore: Springer, 2023 - (Tanveer, M.; Agarwal, A.; Ozawa, S.; Ekbal, A.; Jatowt, A.), pp. 622-633. Communications in Computer and Information Science, 1794. ISBN 978-981991647-4. ISSN 1865-0929.
    [ICONIP 2022: The International Conference on Neural Information Processing /29./. Indore / Virtual (IN), 22.11.2022-26.11.2022]
    Grant CEP: GA ČR(CZ) GA22-02067S
    Institutional support: RVO:67985807
    Keywords: Argument Mining * BERT * Features as Text * NLP * Prompting * Text Classification
    OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    https://dx.doi.org/10.1007/978-981-99-1639-9_52

    In Argument Mining (AM), the integral sub-task of argument component classification refers to the classification of argument components as claims or premises. In this context, the content of the component alone does not suffice to accurately predict its class; additional lexical, contextual, and structural features are needed. Here, we propose a unified model for argument component classification based on BERT and inspired by the new prompting NLP paradigm. Our model incorporates the component itself together with contextual, structural, and syntactic features given as text, rather than in the usual numerical form. This technique enables BERT to build a customized and enriched representation of the component. We evaluate our model on three datasets that reflect a diversity of written and spoken discourses. We achieve state-of-the-art results on two datasets and 95% of the best results on the third. Our approach shows that BERT is capable of exploiting non-textual information given in a textual form.
    Permanent link: https://hdl.handle.net/11104/0334888
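
    As a rough illustration of the features-as-text idea described in the abstract above, the sketch below (Python, Hugging Face transformers) verbalizes a few hypothetical contextual and structural features and appends them to the argument component before passing the combined string to a BERT sequence classifier. The feature set, prompt wording, and label mapping are assumptions made for illustration only, not the authors' exact formulation, and the model would need to be fine-tuned on an argument mining corpus before its predictions mean anything.

    # Minimal sketch: argument component classification with BERT,
    # with contextual and structural features verbalized as plain text.
    # Feature names, prompt wording, and labels are illustrative assumptions.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    LABELS = ["claim", "premise"]  # assumed two-way component classification

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=len(LABELS)
    )  # would normally be fine-tuned on an argument mining dataset first

    def build_input(component, preceding_sentence, paragraph_position, is_first_sentence):
        """Verbalize contextual/structural features and pair them with the component."""
        features_as_text = (
            f"previous sentence: {preceding_sentence} "
            f"position in paragraph: {paragraph_position}. "
            f"first sentence: {'yes' if is_first_sentence else 'no'}."
        )
        # The component and the verbalized features are given as two text segments,
        # so BERT consumes the extra information as ordinary tokens.
        return tokenizer(component, features_as_text, truncation=True, return_tensors="pt")

    def classify(component, preceding_sentence, paragraph_position, is_first_sentence):
        inputs = build_input(component, preceding_sentence, paragraph_position, is_first_sentence)
        with torch.no_grad():
            logits = model(**inputs).logits
        return LABELS[int(torch.argmax(logits, dim=-1))]

    # Example usage (untrained head, so the output label is not meaningful):
    print(classify(
        component="School uniforms reduce peer pressure.",
        preceding_sentence="Many schools debate dress codes every year.",
        paragraph_position=2,
        is_first_sentence=False,
    ))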
