Number of the records: 1
NeRD: Neural field-based Demosaicking
SYSNO ASEP: 0575759
Document Type: C - Proceedings Paper (int. conf.)
R&D Document Type: Conference Paper
Title: NeRD: Neural field-based Demosaicking
Author(s): Kerepecký, Tomáš (UTIA-B) ORCID
           Šroubek, Filip (UTIA-B) RID, ORCID
           Novozámský, Adam (UTIA-B) RID, ORCID
           Flusser, Jan (UTIA-B) RID, ORCID
Number of authors: 4
Source Title: Proceedings of the 2023 IEEE International Conference on Image Processing (ICIP). Piscataway: IEEE, 2023. ISBN 978-1-7281-9835-4
Pages: pp. 1735-1739
Number of pages: 5
Publication form: Online - E
Action: IEEE International Conference on Image Processing 2023 (ICIP 2023)
Event date: 08.10.2023 - 11.10.2023
Event location: Kuala Lumpur
Country: MY - Malaysia
Event type: WRD
Language: eng - English
Country: US - United States
Keywords: demosaicking; neural field; implicit neural representation
Subject RIV: JC - Computer Hardware; Software
OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
R&D Projects: GA21-03921S GA ČR - Czech Science Foundation (CSF)
Institutional support: UTIA-B - RVO:67985556
DOI: https://doi.org/10.1109/ICIP49359.2023.10221948
Annotation: We introduce NeRD, a new demosaicking method for generating full-color images from Bayer patterns. Our approach leverages advancements in neural fields to perform demosaicking by representing an image as a coordinate-based neural network with sine activation functions. The inputs to the network are spatial coordinates and a low-resolution Bayer pattern, while the outputs are the corresponding RGB values. An encoder network, which is a blend of ResNet and U-Net, enhances the implicit neural representation of the image to improve its quality and ensure spatial consistency through prior learning. Our experimental results demonstrate that NeRD outperforms traditional and state-of-the-art CNN-based methods and significantly closes the gap to transformer-based methods.
Workplace: Institute of Information Theory and Automation
Contact: Markéta Votavová, votavova@utia.cas.cz, Tel.: 266 052 201
Year of Publishing: 2024
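
Note: The annotation describes a coordinate-based network with sine activations (a SIREN-style neural field) that maps spatial coordinates plus Bayer-derived features to RGB values. The sketch below illustrates that idea in PyTorch. The layer widths, the omega_0 frequency, the feature dimension, and the plain concatenation of Bayer features are illustrative assumptions, not the authors' exact NeRD architecture, which additionally conditions the field through a ResNet/U-Net encoder.

    # Minimal sketch of a SIREN-style coordinate field for demosaicking.
    # Hyperparameters and the feature-conditioning scheme are assumptions
    # made for illustration; they do not reproduce NeRD itself.
    import torch
    import torch.nn as nn

    class SineLayer(nn.Module):
        """Linear layer followed by a sine activation, as in SIREN."""
        def __init__(self, in_features, out_features, omega_0=30.0):
            super().__init__()
            self.omega_0 = omega_0
            self.linear = nn.Linear(in_features, out_features)

        def forward(self, x):
            return torch.sin(self.omega_0 * self.linear(x))

    class CoordinateDemosaicker(nn.Module):
        """Maps (x, y) coordinates plus per-pixel Bayer features to RGB."""
        def __init__(self, bayer_feat_dim=16, hidden=128, depth=3):
            super().__init__()
            layers = [SineLayer(2 + bayer_feat_dim, hidden)]
            layers += [SineLayer(hidden, hidden) for _ in range(depth - 1)]
            layers += [nn.Linear(hidden, 3)]  # final linear head -> RGB
            self.net = nn.Sequential(*layers)

        def forward(self, coords, bayer_feats):
            # coords: (N, 2) in [-1, 1]; bayer_feats: (N, bayer_feat_dim)
            return self.net(torch.cat([coords, bayer_feats], dim=-1))

    if __name__ == "__main__":
        # Query the field at every pixel of a 256x256 image.
        h = w = 256
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij"
        )
        coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
        feats = torch.randn(h * w, 16)  # stand-in for encoder output
        rgb = CoordinateDemosaicker()(coords, feats).reshape(h, w, 3)
        print(rgb.shape)  # torch.Size([256, 256, 3])

Because the image is represented as a continuous function of coordinates, the same trained field can in principle be queried at arbitrary resolutions; the per-pixel features here stand in for the encoder output that, per the annotation, enforces spatial consistency through learned priors.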
