Number of the records: 1  

Updates on usage of the Czech national HPC center

  1.
    SYSNO: ASEP0548761
    Document Type: C - Proceedings Paper (int. conf.)
    R&D Document Type: Conference Paper
    Title: Updates on usage of the Czech national HPC center
    Author(s): Svatoš, Michal (FZU-D) RID, ORCID
    Chudoba, Jiří (FZU-D) RID, ORCID
    Vokáč, P. (CZ)
    Number of authors: 3
    Article number: 02008
    Source Title: EPJ Web of Conferences, 251. - Les Ulis : EDP Sciences, 2021 / Biscarat C. ; Campana S. ; Hegner B. ; Roiser S. ; Rovelli C.I. ; Stewart G.A.
    Number of pages: 7 pp.
    Publication form: Online - E
    Action: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2021) /25./
    Event date: 17.05.2021 - 21.05.2021
    Event location: Virtual Event
    Country: CH - Switzerland
    Event type: WRD
    Language: eng - English
    Country: FR - France
    Keywords: HPC ; computing
    Subject RIV: JD - Computer Applications, Robotics
    OECD category: Automation and control systems
    R&D Projects: EF18_046/0016013 GA MŠMT - Ministry of Education, Youth and Sports (MEYS)
    LTT17018 GA MŠMT - Ministry of Education, Youth and Sports (MEYS)
    LM2018104 GA MŠMT - Ministry of Education, Youth and Sports (MEYS)
    EF16_013/0001404 GA MŠMT - Ministry of Education, Youth and Sports (MEYS)
    Institutional support: FZU-D - RVO:68378271
    DOI: 10.1051/epjconf/202125102008
    Annotation: The distributed computing of the ATLAS experiment at the LHC has used computing resources of the Czech national HPC center IT4Innovations for several years. The submission system is based on ARC-CEs installed at the Czech Tier-2 site (praguelcg2). Recent improvements of this system are discussed here. First, the ARC-CE was migrated from version 5 to version 6, which improved reliability and scalability. Second, a shared filesystem built on top of sshfs 3.7 delivered an order-of-magnitude improvement in transfer performance and no longer represents a performance bottleneck. New Singularity containers with the full software stack easily fit within the default resource limits on the IT4I cluster filesystem. Finally, a new submission system, allowing sequential running of payloads within one job, was set up and adapted to the HPC environment, improving usage on worker nodes with a very high number of cores. Overall, the whole infrastructure provides a significant contribution to the resources provided by praguelcg2.
    Workplace: Institute of Physics
    Contact: Kristina Potocká, potocka@fzu.cz, Tel.: 220 318 579
    Year of Publishing: 2022
    Electronic address: https://www.epj-conferences.org/articles/epjconf/pdf/2021/05/epjconf_chep2021_02008.pdf
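The annotation credits an sshfs-based shared filesystem for the transfer-performance gain. A minimal sketch of such a mount is shown below; the hostname, paths, and tuning options are illustrative assumptions, not details from the record or the paper.

```shell
# Illustrative sketch of mounting a remote HPC scratch area over sshfs.
# Hostname, user, and paths are hypothetical placeholders.
# kernel_cache (FUSE) and Compression=no / Ciphers (passed through to ssh)
# are common throughput-oriented tuning choices for sshfs mounts.
sshfs -o reconnect \
      -o ServerAliveInterval=15 \
      -o kernel_cache \
      -o Ciphers=aes128-ctr \
      -o Compression=no \
      atlasuser@login.example-hpc.cz:/scratch/atlas /mnt/hpc-scratch

# Verify the mount, then unmount when finished:
mount | grep hpc-scratch
fusermount -u /mnt/hpc-scratch
```

Disabling ssh-level compression and selecting a fast cipher shifts work away from the CPU on bulk transfers, which is typically where sshfs throughput gains of the kind described in the annotation come from.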