Open Access
EPJ Web Conf. Volume 214 (2019)
23rd International Conference on Computing in High Energy and Nuclear Physics (CHEP 2018)
Article Number 03033, 7 pages
Section T3 - Distributed computing
DOI: https://doi.org/10.1051/epjconf/201921403033
Published online 17 September 2019
  1. WLCG. http://wlcg.web.cern.ch/ Accessed 25th October 2018.
  2. ATLAS Collaboration. The ATLAS Experiment at the CERN Large Hadron Collider. JINST 3:S08003 (2008)
  3. CMS Collaboration. The CMS Experiment at the CERN Large Hadron Collider. JINST 3:S08004 (2008)
  4. F. H. Barreiro Megino et al. PanDA for ATLAS Distributed Computing in the next decade. Journal of Physics: Conference Series, 898(5):052002 (2017)
  5. M. Barisits, T. Beermann, V. Garonne, T. Javurek, M. Lassnig, C. Serfon. The ATLAS Data Management System Rucio: Supporting LHC Run-2 and beyond, ACAT, Seattle, 2017
  6. P. Love, T. G. Hartland, B. Douglas, J. Schovancová, A. Dewhurst. Object store characterisation for ATLAS distributed computing, The 23rd International Conference on Computing in High Energy and Nuclear Physics, Sofia, 2018, these proceedings (2019)
  7. A. Anisenkov, A. Di Girolamo, M. Alandes Pradillo. AGIS: Integration of new technologies used in ATLAS Distributed Computing. Journal of Physics: Conference Series, 898(9):092023 (2017)
  8. A. Anisenkov et al. Computing Resource Information Catalog: the ATLAS Grid Information system evolution for other communities. Nuclear Electronics and Computing, Montenegro, 2017
  9. D. Smith et al. Sharing server nodes for storage and compute, The 23rd International Conference on Computing in High Energy and Nuclear Physics, Sofia, 2018, these proceedings (2019)
  10. I. Bird, S. Campana, M. Girone, X. Espinal Curull, G. McCance, J. Schovancová. Architecture and prototype of a WLCG data lake for HL-LHC, The 23rd International Conference on Computing in High Energy and Nuclear Physics, Sofia, 2018, these proceedings (2019)
  11. HEP Software Foundation. A Roadmap for HEP Software and Computing R&D for the 2020s. arXiv:1712.06982 (2017)
  12. Django web framework. https://www.djangoproject.com/ Accessed 25th October 2018.
  13. MySQL database. https://www.mysql.com/ Accessed 25th October 2018.
  14. P. Andrade, T. Bell, J. van Eldik, G. McCance, B. Panzer-Steindel, M. Coelho dos Santos, S. Traylen, U. Schwickerath. Review of CERN Data Centre Infrastructure. Journal of Physics: Conference Series, 396(4):042002 (2012)
  15. Celery: Distributed Task Queue. http://www.celeryproject.org/ Accessed 25th October 2018.
  16. Redis. https://redis.io/ Accessed 25th October 2018.
  17. M. Cinquilli et al. CRAB3: Establishing a new generation of services for distributed analysis at CMS. Technical Report CMS-CR-2012-139, CERN (2012)
  18. OpenStack. https://www.openstack.org/ Accessed 25th October 2018.
  19. D. Thain, T. Tannenbaum, M. Livny. Distributed computing in practice: the Condor experience. Concurrency - Practice and Experience, 17(2-4):323-356 (2005)
  20. Puppet. https://puppet.com/ Accessed 25th October 2018.
  21. Packer by Hashicorp. https://www.packer.io/ Accessed 25th October 2018.
