EPJ Web of Conferences, Volume 295 (2024)
26th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2023)
Article Number: 04028
Number of pages: 8
Section: Distributed Computing
DOI: https://doi.org/10.1051/epjconf/202429504028
Published online: 06 May 2024
Open Access
