Number of the records: 1  

Texture Segmentation Benchmark

  1.
    0545221 - ÚTIA 2023 RIV US eng J - Journal Article
    Mikeš, Stanislav - Haindl, Michal
    Texture Segmentation Benchmark.
    IEEE Transactions on Pattern Analysis and Machine Intelligence. Vol. 44, No. 9 (2022), pp. 5647-5663. ISSN 0162-8828. E-ISSN 1939-3539
    R&D Projects: GA ČR(CZ) GA19-12340S
    Institutional support: RVO:67985556
    Keywords : Benchmark * Image segmentation * Texture segmentation * (Un)supervised segmentation * Segmentation criteria * Scale, rotation and illumination invariants
    OECD category: Robotics and automatic control
    Impact factor: 23.6, year: 2022
    Method of publishing: Limited access
    http://library.utia.cas.cz/separaty/2021/RO/haindl-0545221.pdf
    https://ieeexplore.ieee.org/document/9416785

    The Prague texture segmentation data-generator and benchmark (mosaic.utia.cas.cz) is a web-based service designed to mutually compare and rank (currently nearly 200) different static and dynamic texture and image segmenters, to find the optimal parametrization of a segmenter, and to support the development of new segmentation and classification methods. The benchmark verifies segmenter performance characteristics on potentially unlimited monospectral, multispectral, satellite, and bidirectional texture function (BTF) data using an extensive set of over forty prevalent criteria. It also enables testing for noise robustness and for scale, rotation, or illumination invariance. It can be used in other applications, such as feature selection, image compression, and query by pictorial example. The benchmark's functionalities are demonstrated by evaluating several examples of leading previously published unsupervised and supervised image segmentation algorithms. However, these serve to illustrate the benchmark's functionality, not to review the recent state of the art in image segmentation.
    Permanent Link: http://hdl.handle.net/11104/0322145
