EPJ Web of Conferences, Volume 214 (2019)
23rd International Conference on Computing in High Energy and Nuclear Physics (CHEP 2018)
Section T6 - Machine learning & analysis
Article Number 06027, 8 pages
DOI: https://doi.org/10.1051/epjconf/201921406027
Published online: 17 September 2019 (Open Access)
1. Fisher, R. A., The Use of Multiple Measurements in Taxonomic Problems, Annals of Eugenics 7, 179–188 (1936) doi:10.1111/j.1469-1809.1936.tb02137.x
2. Rosenblatt, F., The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain, Psychological Review 65(6), 386–406 (1958) doi:10.1037/h0042519
3. Albertsson, K. and others, Machine Learning in High Energy Physics Community White Paper, (2018) doi:10.1088/1742-6596/1085/2/022008
4. Abadi, M. and others, TensorFlow: Large-scale machine learning on heterogeneous systems, (2015), software available from tensorflow.org, arXiv:1603.04467
5. Oliphant, T. E., A guide to NumPy (Trelgol Publishing, USA, 2006) ISBN:9781517300074
6. Pinfold, J., The MoEDAL experiment at the LHC, EPJ Web of Conferences 145, 12002 (2017)
7. Evans, L. and Bryant, P., LHC Machine, Journal of Instrumentation 3, (2008) doi:10.1088/1748-0221/3/08/S08001
8. Adam-Bourdarios, C. and others, The Higgs boson machine learning challenge, Journal of Machine Learning Research Workshop and Conference Proceedings 42, 19–55 (2015) doi:10.1088/1742-6596/664/7/072015
9. Brun, R. and Rademakers, F., ROOT — An Object Oriented Data Analysis Framework, Nucl. Inst. & Meth. in Phys. Res. A 389, 81–86 (1997) doi:10.1016/S0168-9002(97)00048-X, see also http://root.cern.ch/
10. Chollet, F. and others, Keras, (2015), see also https://keras.io
11. Hopfield, J. J., Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences 79(8), 2554–2558 (1982) doi:10.1073/pnas.79.8.2554
12. Jordan, M. I., Serial order: A parallel distributed processing approach, Technical Report 8604 (1986) doi:10.1016/S0166-4115(97)80111-2
13. Elman, J. L., Finding structure in time, Cognitive Science 14(2), 179–211 (1990) doi:10.1016/0364-0213(90)90002-E
14. Hochreiter, S. and Schmidhuber, J., Long Short-Term Memory, Neural Computation 9(8), 1735–1780 (1997) doi:10.1162/neco.1997.9.8.1735
15. Srivastava, N. and others, Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Journal of Machine Learning Research 15, 1929–1958 (2014) doi:10.1007/978-3-642-46466-9_18
16. Fukushima, K., Neocognitron: A Self-organizing Neural Network Model for a Mechanism of Pattern Recognition Unaffected by Shift in Position, Biological Cybernetics 36, 193–202 (1980) doi:10.1007/BF00344251
17. LeCun, Y. and others, Object Recognition with Gradient-Based Learning, (1999) doi:10.1007/3-540-46805-6_19
18. Sutskever, I., Vinyals, O. and Le, Q. V., Sequence to Sequence Learning with Neural Networks, Proc. Advances in Neural Information Processing Systems 27, 3104–3112 (2014) arXiv:1409.3215v3
19. Clark, A. and others, Pillow — The friendly PIL fork, (2016) doi:10.5281/zenodo.44297
20. Caswell, T. and others, matplotlib: plotting with Python, (2018) doi:10.5281/zenodo.1482098
21. Reeve, A. and others, npTDMS, https://github.com/adamreeve/npTDMS
22. Hoecker, A. and others, TMVA — Toolkit for Multivariate Data Analysis, PoS ACAT 040 (2007) arXiv:physics/0703039v5
