Number of the records: 1  

NeRD: Neural field-based Demosaicking

  1.
    SYSNO: ASEP0575759
    Document Type: C - Proceedings Paper (int. conf.)
    R&D Document Type: Conference Paper
    Title: NeRD: Neural field-based Demosaicking
    Author(s): Kerepecký, Tomáš (UTIA-B) ORCID
    Šroubek, Filip (UTIA-B) RID, ORCID
    Novozámský, Adam (UTIA-B) RID, ORCID
    Flusser, Jan (UTIA-B) RID, ORCID
    Number of authors: 4
    Source Title: Proceedings of the 2023 IEEE International Conference on Image Processing (ICIP). Piscataway: IEEE, 2023. ISBN 978-1-7281-9835-4
    Pages: 1735-1739
    Number of pages: 5
    Publication form: Online - E
    Action: IEEE International Conference on Image Processing 2023 (ICIP 2023)
    Event date: 08.10.2023 - 11.10.2023
    Event location: Kuala Lumpur
    Country: MY - Malaysia
    Event type: WRD
    Language: eng - English
    Country: US - United States
    Keywords: Demosaicking; neural field; implicit neural representation
    Subject RIV: JC - Computer Hardware; Software
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    R&D Projects: GA21-03921S GA ČR - Czech Science Foundation (CSF)
    Institutional support: UTIA-B - RVO:67985556
    DOI: https://doi.org/10.1109/ICIP49359.2023.10221948
    Annotation: We introduce NeRD, a new demosaicking method for generating full-color images from Bayer patterns. Our approach leverages advancements in neural fields to perform demosaicking by representing an image as a coordinate-based neural network with sine activation functions. The inputs to the network are spatial coordinates and a low-resolution Bayer pattern, while the outputs are the corresponding RGB values. An encoder network, which is a blend of ResNet and U-net, enhances the implicit neural representation of the image to improve its quality and ensure spatial consistency through prior learning. Our experimental results demonstrate that NeRD outperforms traditional and state-of-the-art CNN-based methods and significantly closes the gap to transformer-based methods.
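    The coordinate-to-RGB mapping described in the annotation can be illustrated with a minimal sketch. This is not the paper's implementation: the layer sizes, the frequency factor omega, and the way the Bayer measurement is fed in alongside the coordinates are illustrative assumptions, and the ResNet/U-net encoder that conditions the field in the actual method is omitted entirely.

    ```python
    import numpy as np

    def siren_layer(x, W, b, omega=30.0):
        # Sine-activated linear layer, as in SIREN-style neural fields;
        # omega=30.0 is a common default, assumed here for illustration.
        return np.sin(omega * (x @ W + b))

    def nerd_forward_sketch(coords, bayer_vals, weights):
        # coords: (N, 2) pixel coordinates normalized to [-1, 1]
        # bayer_vals: (N, 1) raw sensor measurements at those coordinates
        # Concatenating the two is an illustrative choice for conditioning
        # the field on the Bayer input; the paper uses a learned encoder.
        h = np.concatenate([coords, bayer_vals], axis=1)  # (N, 3)
        for W, b in weights[:-1]:
            h = siren_layer(h, W, b)
        W_out, b_out = weights[-1]
        return h @ W_out + b_out  # (N, 3) RGB values

    # Randomly initialized hypothetical network: 3 -> 64 -> 64 -> 3.
    rng = np.random.default_rng(0)
    dims = [3, 64, 64, 3]
    weights = [
        (rng.normal(scale=1.0 / d_in, size=(d_in, d_out)), np.zeros(d_out))
        for d_in, d_out in zip(dims[:-1], dims[1:])
    ]

    coords = rng.uniform(-1, 1, size=(16, 2))
    bayer = rng.uniform(0, 1, size=(16, 1))
    rgb = nerd_forward_sketch(coords, bayer, weights)
    print(rgb.shape)  # one RGB triple per queried coordinate
    ```

    Because the image is represented as a continuous function of coordinates, the network can be queried at any pixel location of the target grid, which is what makes the neural-field formulation natural for reconstructing full-color images from a mosaicked measurement.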
    Workplace: Institute of Information Theory and Automation
    Contact: Markéta Votavová, votavova@utia.cas.cz, Tel.: 266 052 201.
    Year of Publishing: 2024

Metadata are licensed under CC0
