Number of the records: 1  

Exploring the impact of post-training rounding in regression models

  1.
    0585135 - ÚI 2025 RIV DE eng J - Journal Article
    Kalina, Jan
    Exploring the impact of post-training rounding in regression models.
    Applications of Mathematics. Roč. 69, č. 2 (2024), s. 257-271. ISSN 0862-7940. E-ISSN 1572-9109
    R&D Projects: GA ČR(CZ) GA22-02067S
    Institutional support: RVO:67985807
    Keywords: supervised learning * trained model * perturbations * effect of rounding * low-precision arithmetic
    OECD category: Statistics and probability
    Impact factor: 0.7, year: 2022
    Method of publishing: Open access
    https://doi.org/10.21136/AM.2024.0090-23

    Post-training rounding, also known as quantization, of estimated parameters is a widely adopted technique for reducing energy consumption and latency in machine learning models. This theoretical work examines the impact of rounding estimated parameters in key regression methods from statistics and machine learning. The proposed approach models the perturbation of parameters as an additive error taking values in a specified interval. The method is first elucidated for linear regression and is then extended, with a consistent approach throughout, to radial basis function networks, multilayer perceptrons, regularization networks, and logistic regression.
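    The setting described in the abstract can be illustrated with a minimal sketch: estimate a linear regression, round the coefficients to a fixed grid of step h (so each parameter is perturbed by an additive error in [-h/2, h/2]), and measure the effect on predictions. The data, step size, and variable names below are hypothetical illustrations, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic regression data (illustrative only)
    n = 200
    X = rng.normal(size=(n, 3))
    beta_true = np.array([1.5, -2.0, 0.75])
    y = X @ beta_true + rng.normal(scale=0.1, size=n)

    # Ordinary least squares estimate (intercept plus slopes)
    Z = np.column_stack([np.ones(n), X])
    beta_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)

    # Post-training rounding: quantize the estimated parameters to step h.
    # This perturbs each parameter by an additive error within [-h/2, h/2].
    h = 0.1
    beta_rounded = np.round(beta_hat / h) * h

    # The induced perturbation indeed stays within the stated interval.
    assert np.all(np.abs(beta_rounded - beta_hat) <= h / 2 + 1e-12)

    # Effect of rounding on the fitted values (mean squared shift)
    mse_shift = np.mean((Z @ beta_rounded - Z @ beta_hat) ** 2)
    print(float(mse_shift))
    ```

    The mean squared shift in the fitted values is bounded by the size of the rounding step, which is the kind of perturbation effect the paper analyzes theoretically.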
    Permanent Link: https://hdl.handle.net/11104/0352867

    File: 0585135-afinoa.pdf
    Downloads: 0
    Size: 184.7 KB
    Commentary: OA CC BY 4.0
    Version: Publisher’s postprint
    Access: open-access