The Run-by-Run Monte Carlo simulation for the ANTARES experiment

The ANTARES neutrino telescope is the largest and longest-operated underwater neutrino telescope. Data acquisition conditions in a marine environment are not stable in time: biological and physical phenomena follow a seasonal evolution, producing a periodic change of the rates registered by the neutrino telescope. Variations in the sea current velocity also affect the measured baseline value and the burst fraction on short time scales. Monte Carlo simulations of the detector response to charged particles in the proximity of the telescope should reproduce the conditions of the medium and of the acquisition setup as closely as possible. An efficient way to account for their variability is to extract the related information directly from the data runs. A Run-by-Run simulation procedure has been developed to follow the time evolution of data acquisition in ANTARES.


Introduction
The ANTARES detector is at present the largest neutrino telescope in the Northern Hemisphere [1]. It has been operated continuously since 2007, and in its full configuration since 2008. It is located at a depth of 2475 m in the Mediterranean Sea at (42°48′N, 6°10′E), 40 km off-shore from Toulon in the Gulf of Lion, Southern France.
The neutrino detector consists of 12 slender detection lines with 25 storeys each, kept taut by a top buoy. A storey hosts three optical modules (OMs), glass spheres housing 10-inch photomultiplier tubes (PMTs), and a local control module (LCM) containing the off-shore electronics. The distance between storeys on each line is 14.5 m and the first storey is located ∼100 m above the sea bed. PMTs in the storey are oriented 45° downwards in order to optimise their acceptance to Cherenkov light coming from upgoing particles. The length of a line is 450 m and the horizontal distance between neighbouring lines is 60-75 m.
A Junction Box (JB) on the sea bed connects the lines to the shore station with an electro-optical cable, providing power, transferring data to shore and distributing a clock signal responsible for the synchronisation of the different detector elements. At the shore station, filtering and triggering of data take place. The data stream is subdivided into "runs" lasting from 2 to 12 hours, depending on the data-taking setup.
Monte Carlo simulations are necessary to understand the response of the detector to physics events, to check the correct functioning of the apparatus, to optimise selection cuts for the rejection of background events and to evaluate the level of purity of the selected sample of events. The simulation chain can be subdivided into two main steps:
1. Physics simulation: neutrino interactions and muon bundles are generated in the proximity of the detector. Particles are propagated through the active volume and Cherenkov light is produced along their paths.
2. Data acquisition simulation: the behaviour of the PMTs and of the data acquisition electronics is simulated. Filtering and triggering algorithms are applied. The optical background due to environmental light (mainly bioluminescent bacteria and ⁴⁰K decay) is also added.

Physics simulation

Event generation
The instrumented volume of the apparatus is a cylinder which contains all the PMTs. A larger cylinder, called the can, surrounds the first one, extending it by three light attenuation lengths. The can defines the active volume within which Cherenkov photon emission can lead to a signal in a PMT. Outside this volume, only particle energy losses during propagation are considered. The dedicated GENHEN [2] package is suitable for the full range of neutrino studies in ANTARES, from neutrino oscillations to high-energy astrophysics: the majority of detectable neutrinos lie in a range of energies from tens of GeV, limited by the energy threshold of muon detection, to hundreds of PeV, where the absorption of neutrinos in the Earth, which strongly attenuates the upward neutrino flux, must be taken into account.
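The construction of the can volume described above can be sketched as a simple geometric extension. All numbers below are illustrative placeholders, not official ANTARES geometry constants, and the function name is invented for this example:

```python
# Sketch of the "can" construction: the instrumented cylinder is
# extended by a margin of three light attenuation lengths.
# The dimensions used here are illustrative, not the real detector values.

def can_dimensions(r_inst, h_inst, att_length, n_lengths=3):
    """Return (radius, height) of the can cylinder obtained by extending
    the instrumented cylinder by n_lengths * att_length on every side."""
    margin = n_lengths * att_length
    return r_inst + margin, h_inst + 2 * margin  # radius grows once, height twice

# Example with placeholder numbers (instrumented radius 100 m, height
# 350 m, attenuation length 55 m):
r_can, h_can = can_dimensions(r_inst=100.0, h_inst=350.0, att_length=55.0)
```

Outside the resulting cylinder, the simulation only needs to account for energy losses, not for light emission.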
Atmospheric muon bundles are generated in ANTARES with the MUPAGE software [3]. MUPAGE is based on the parametrisation of the angular and energy distributions of muons underwater or in ice as a function of the muon bundle multiplicity [4]. These parametrisations are derived from complete simulations and from data collected with the MACRO experiment at Gran Sasso, extrapolated either under the sea or under an ice layer. The usage of parametric formulas allows the production of an extremely large number of events with a reasonable CPU time consumption, the main limitation being that the primary cosmic ray flux cannot be modified.

Particles and light propagation
Long-lived particles produced by the physics generators are tracked through the water in the can volume using a GEANT-based [5] package, denoted in ANTARES as KM3 [2]. The composition and density of the sea water and its optical properties, as measured at the experimental site, are considered. All relevant physics processes are activated (energy loss, multiple scattering, radiative processes and photo-nuclear interactions).
Thanks to the homogeneity of the sea water, a photon-by-photon simulation of the Cherenkov light is not required, differently from what is done to describe photon propagation through the Antarctic ice, where the optical properties depend on the ice stratification. This means that a set of "scattering tables" can be created in advance, containing the probability of each photon to give a hit on a PMT as a function of five parameters: the distance from the muon, three angles defining the direction of the photon with respect to the muon and to the PMT, and the photon arrival time, taking diffusion and absorption phenomena into account. Muons are propagated in the can volume by MUSIC [6]. Muon tracks are followed in steps of 1 m, in which the ionisation energy loss and the Cherenkov light emission are calculated. The probability for each Cherenkov photon to reach a PMT is extracted from the scattering tables, and the probability of a catastrophic energy loss is also evaluated. Similar scattering tables are created for electromagnetic showers.
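The scattering-table idea above can be illustrated with a minimal lookup sketch: the hit probability is tabulated on a grid of the five parameters and read back at simulation time with a nearest-bin search, instead of tracking individual photons. The class, grids and probability values below are fabricated for illustration and do not reflect the real KM3 table format:

```python
import bisect

# Toy nearest-bin lookup into a precomputed hit-probability table,
# indexed by the 5 parameters named in the text. Real tables are far
# finer-grained; this sketch only shows the access pattern.

class ScatteringTable:
    def __init__(self, grids, values):
        self.grids = grids          # one sorted bin-centre axis per parameter
        self.values = values        # {(i0, i1, i2, i3, i4): hit probability}

    def _nearest(self, axis, x):
        i = bisect.bisect_left(axis, x)
        if i == 0:
            return 0
        if i == len(axis):
            return len(axis) - 1
        return i if axis[i] - x < x - axis[i - 1] else i - 1

    def hit_probability(self, *params):
        key = tuple(self._nearest(a, p) for a, p in zip(self.grids, params))
        return self.values.get(key, 0.0)

grids = [[0.0, 10.0, 20.0],        # distance from the muon track (m)
         [0.0, 90.0, 180.0],       # photon direction angle w.r.t. muon (deg)
         [0.0, 90.0, 180.0],       # second direction angle (deg)
         [0.0, 90.0, 180.0],       # angle w.r.t. the PMT axis (deg)
         [0.0, 50.0, 100.0]]       # photon arrival time (ns)
table = ScatteringTable(grids, {(0, 0, 0, 0, 0): 0.8, (1, 1, 1, 1, 1): 0.05})
p = table.hit_probability(3.0, 10.0, 20.0, 5.0, 12.0)  # falls in bin (0,0,0,0,0)
```

Because the table is built once, each photon at simulation time costs only a lookup rather than a full propagation.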
Hadronic showers are treated differently. In this case a large number of charged particles is produced at the interaction vertex. The computation of scattering tables for each particle would require an event-by-event simulation, and a huge amount of CPU time, because of the high variability of the hadronic shower composition. To overcome this problem, a Multi Particle Approximation approach is followed: each hadron is treated as equivalent to an electron, and the electron scattering tables are used in association with appropriate weights, evaluated for each hadron species from many complete photon-tracking simulations. The light yield from hadronic showers obtained with this technique is compatible with what is achieved with a detailed simulation.
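The Multi Particle Approximation described above reduces to scaling the electron-table value by a per-species weight. The weight values and the helper names below are placeholders for illustration, not the tuned ANTARES numbers:

```python
# Sketch of the Multi Particle Approximation: each hadron reuses the
# electron scattering tables, scaled by a species-dependent light weight.
# All weights here are invented placeholders.

LIGHT_WEIGHT = {"pi+": 0.90, "pi-": 0.90, "K+": 0.80, "p": 0.75, "n": 0.0}

def hadron_hit_probability(species, energy_gev, electron_hit_prob):
    """Hit probability for a hadron, approximated as the electron-table
    value at the same energy times the species weight."""
    return LIGHT_WEIGHT.get(species, 0.0) * electron_hit_prob(energy_gev)

# Toy stand-in for the electron table: probability grows with energy,
# capped at 1 (purely illustrative).
electron_prob = lambda e_gev: min(1.0, 0.01 * e_gev)
p_pion = hadron_hit_probability("pi+", 50.0, electron_prob)  # 0.90 * 0.50
```

The real weights are obtained once, from full photon-tracking simulations of each hadron species, and then reused for every event.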

Data acquisition simulation
The final step of the simulation is performed by the TriggerEfficiency program. This software adds the optical background to the hits generated by physics events, reproducing a data stream analogous to the real one, simulates the behaviour of the front-end electronics and applies the trigger algorithms for candidate event selection. The optical background can be generated and added to MC events according to a Poisson distribution, either using a fixed background rate specified by the user or, alternatively, extracting the information on the rates directly from real data.
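The fixed-rate option can be sketched as a Poisson process: for one OM, exponential inter-arrival times are drawn at the chosen rate over the event time window. The rate and window values below are illustrative, not ANTARES defaults:

```python
import random

# Minimal sketch of fixed-rate optical-background generation: hit times
# for one OM follow a Poisson process over the event time window.

def poisson_background(rate_hz, window_ns, rng):
    """Return background hit times (ns) in [0, window_ns)."""
    hits, t = [], 0.0
    rate_per_ns = rate_hz * 1e-9        # convert Hz to hits per ns
    while True:
        t += rng.expovariate(rate_per_ns)
        if t >= window_ns:
            return hits
        hits.append(t)

rng = random.Random(1)
hits = poisson_background(rate_hz=60e3, window_ns=2.2e6, rng=rng)
# With a 60 kHz rate over 2.2 ms, one expects ~130 background hits per OM.
```

The run-by-run alternative replaces the single user-supplied rate with rates measured from the data, as described in the next section.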
Two front-end ARS (Analogue Ring Sampler) chips are connected to each PMT, integrating the analogue signal from the PMT over a typical time window of 25 ns. This behaviour is simulated by summing up the number of detected photons in that window. The two ARS chips operate in a token ring scheme: after the integration time of the first ARS chip, the second takes over, and only after the integration time of the second chip does the dead time of the first play a role. The time resolution for single photo-electron signals is 1.3 ns and improves for higher amplitudes. To simulate this effect, the hit times are smeared using a Gaussian function with a width σ = 1.3 ns/√N, where N is the number of simultaneously detected photons.
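The time and amplitude smearing described in this section can be sketched as follows. The σ = 1.3 ns/√N smearing and the ~20 p.e. saturation come from the text; the charge smearing below is a plain Gaussian stand-in for the empirical ~30% function actually used in TriggerEfficiency:

```python
import math
import random

# Sketch of the simulated front-end response: Gaussian time smearing
# with sigma = 1.3 ns / sqrt(N), and an approximate ~30% Gaussian charge
# smearing with saturation near 20 p.e. (the real charge-smearing
# function is empirical and not reproduced here).

def smear_hit(t_ns, n_photons, rng):
    """Return (smeared hit time in ns, measured charge in p.e.)."""
    sigma_t = 1.3 / math.sqrt(n_photons)
    t = rng.gauss(t_ns, sigma_t)
    charge = max(0.0, rng.gauss(n_photons, 0.30 * n_photons))
    return t, min(charge, 20.0)          # saturation at ~20 p.e.

rng = random.Random(7)
t, q = smear_hit(100.0, 4, rng)          # sigma_t = 0.65 ns for 4 photons
```

Note how the time resolution improves with amplitude, as stated above: four simultaneous photons halve the single-photo-electron width.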
Different levels of trigger are considered in the ANTARES DAQ:
• The "level zero" (L0) trigger selects hits with a charge greater than a low threshold, typically 0.3 photo-electrons (p.e.). The amplitude measurement is simulated by smearing the integrated number of photons with an empirical function that produces a (roughly Gaussian) smearing of about 30%. The dynamic range of the charge integration has a saturation level corresponding to about 20 photo-electrons.
• The "first level" trigger (L1) is built up of coincidence hits in the same storey within a 20 ns time window, or single hits with a large charge amplitude, greater than a "high threshold" tuneable from 2.5 p.e. to 10 p.e.
• A trigger logic algorithm, the "level 2" trigger (L2), is then applied to the data and operates on L1 hits.
All trigger definitions are reproduced inside TriggerEfficiency, and MC events are triggered with the same trigger algorithms used for real data.
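The L0 and L1 selections above can be sketched in a few lines. The hit layout (time, charge, storey) and the specific threshold values are illustrative choices, not the real TriggerEfficiency interface; the L2 logic is omitted:

```python
# Sketch of the L0/L1 trigger logic: L0 keeps hits above a low charge
# threshold; L1 keeps L0 hits that either exceed a high threshold or
# form a same-storey coincidence within 20 ns. Hits are
# (time_ns, charge_pe, storey_id) tuples -- an invented layout.

L0_THRESHOLD = 0.3        # p.e., "typically 0.3" per the text
L1_HIGH_THRESHOLD = 3.0   # p.e., tuneable between 2.5 and 10 per the text
COINC_WINDOW = 20.0       # ns

def l0_filter(hits):
    return [h for h in hits if h[1] > L0_THRESHOLD]

def l1_filter(hits):
    hits = sorted(l0_filter(hits))
    selected = set()
    for i, (t, q, storey) in enumerate(hits):
        if q > L1_HIGH_THRESHOLD:
            selected.add(i)            # large single hit
            continue
        for j, (t2, q2, storey2) in enumerate(hits):
            if j != i and storey2 == storey and abs(t2 - t) <= COINC_WINDOW:
                selected.add(i)        # same-storey coincidence
                break
    return [hits[i] for i in sorted(selected)]

hits = [(10.0, 0.5, 1), (25.0, 0.6, 1),  # same-storey pair, 15 ns apart
        (100.0, 5.0, 2),                 # large single hit
        (200.0, 0.4, 3)]                 # isolated low hit, rejected
l1 = l1_filter(hits)
```

Applying the same selection to data and MC, as TriggerEfficiency does, keeps the trigger efficiency consistent between the two.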

Run-by-run approach
Environmental conditions in a marine environment are not stable and constant in time, and their variations affect data acquisition in an under-sea neutrino telescope like ANTARES. Biological [7] and physical [8] phenomena show evolving trends on seasonal timescales, producing a periodic change of the background optical rates registered at the detector. Short-term variations are also present, as the optical rates are modified by the sea current velocity. In addition to environmental effects, the detector elements might not collect data continuously, because of temporary or permanent malfunctioning of the optical modules or lack of connection to some part of the apparatus. Finally, the trigger algorithms applied to the data taking can be modified along the life of the detector.
A reliable Monte Carlo simulation of the detector should reproduce all these effects. An efficient way to account for the variations of the optical background is to extract the related information directly from the data. The TriggerEfficiency program takes this information from raw data files: it considers the counting rate in short segments of the data stream, about 100 ms long, and simulates the corresponding optical background according to the measured charge distribution of hits. The DAQ conditions for each run (the status of each detector element, the active triggers, the data filtering parameters) are stored in the ANTARES database, which is accessed by TriggerEfficiency and used during the simulation.
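The per-segment idea can be sketched as follows: measure the counting rate in ~100 ms slices of one OM's real data stream, then draw the simulated background segment by segment at those measured rates, so that both seasonal trends and short bursts are reproduced. The data layout and function names are invented for illustration (the real procedure also uses the measured charge distribution, omitted here):

```python
import random

# Sketch of the run-by-run background extraction: per-segment rates are
# measured from real data and the simulation follows them.

SEGMENT_NS = 100e6   # ~100 ms segments, as in the text

def measure_segment_rates(hit_times_ns, run_length_ns):
    """Counting rate (Hz) in each ~100 ms segment of one OM's stream."""
    n_segments = int(run_length_ns // SEGMENT_NS)
    counts = [0] * n_segments
    for t in hit_times_ns:
        i = int(t // SEGMENT_NS)
        if i < n_segments:
            counts[i] += 1
    return [c / (SEGMENT_NS * 1e-9) for c in counts]

def simulate_background(rates_hz, rng):
    """Poisson background hits (ns), drawn segment by segment."""
    hits = []
    for i, rate in enumerate(rates_hz):
        t, end = i * SEGMENT_NS, (i + 1) * SEGMENT_NS
        while True:
            t += rng.expovariate(max(rate, 1e-12) * 1e-9)
            if t >= end:
                break
            hits.append(t)
    return hits

# Toy usage: 5 hits in the first 100 ms of a 300 ms run -> ~[50, 0, 0] Hz.
rates = measure_segment_rates([1e6, 2e6, 3e6, 4e6, 5e6],
                              run_length_ns=3 * SEGMENT_NS)
bkg = simulate_background([50e3, 0.0, 0.0], random.Random(3))
```

Because the rates come from the same run being simulated, the MC background automatically tracks the detector and environmental conditions of that run.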
The resulting output has the same format as raw data and is processed using the available reconstruction algorithms to extract physics information. This approach has significantly improved the data/MC agreement and allows for better monitoring of the time evolution of the data acquisition.