Radioactivity Registered With a Small Number of Events

The synthesis of superheavy elements calls for the analysis of low-statistics experimental data that presumably obey an exponential distribution with unknown parameters, and for deciding whether the data originate from a single source or contain admixtures. Here we analyze predictions following from non-parametric methods, employing only such fundamental sample properties as the sample mean, the median and the mode.
e-mail: zlokazov@jinr.ru
e-mail: utyonkov@jinr.ru
© The Authors, published by EDP Sciences. This is an open access article distributed under the terms of the Creative Commons Attribution License 4.0 (http://creativecommons.org/licenses/by/4.0/).
EPJ Web of Conferences 173, 04014 (2018) https://doi.org/10.1051/epjconf/201817304014
Mathematical Modeling and Computational Physics 2017

The experiments on the synthesis of the superheavy elements yield chains of sequential decays of elements E_i, following the scheme E_i → E_{i+1} → E_{i+2} → E_{i+3}, etc. Each examined chain consists of registered decay times t_i of the corresponding element, although some chains can be incomplete. These experiments can be performed by different groups using somewhat different methods of producing the nuclei under study, of registration, and of analysis of the obtained data. The goal of the present analysis is to estimate the statistical behavior of the detected elements and to test the results of different groups for compatibility.
First, we have to answer the following question: should we analyze each individual component in the chains, or the chain as a whole? To make the discussion self-contained, let us recall the basic properties of radioactive decays [1].
• Any k decays belong to the same type, are independent, occur at non-coinciding time points, and their probability within any time interval [t_0, t_0 + ∆t] does not depend on the choice of t_0;
• The decays are 'rare events', i.e., the probability of 2 or more events within any small ∆t is infinitesimal compared with the probability of 0 or 1 events;
• Markov property of decays: the probability of observing any number of decays after any moment t does not depend on the origin of the element (i.e., on its prehistory).
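The memoryless property of the exponential decay-time distribution, which underlies the Markov property above, is easy to illustrate numerically. The following sketch (Python with NumPy; the lifetime T = 1.0 and the probe times s, u are arbitrary illustrative values, not taken from the experiments) checks that the survival probability after a moment s does not depend on s:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1.0                        # hypothetical mean lifetime (arbitrary units)
t = rng.exponential(T, 1_000_000)

# memorylessness: P(t > s + u | t > s) equals P(t > u) for any s, u >= 0
s, u = 0.7, 1.3
p_cond = np.mean(t[t > s] > s + u)
p_uncond = np.mean(t > u)
print(round(p_cond, 3), round(p_uncond, 3))   # both close to exp(-u / T)
```

Both estimates agree within the Monte-Carlo noise, as the exponential law requires.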
Thus, our problem is: given the experimental events t_iEc, where i is the number of the event, E is the name of the isotope, and c is the number of the team performing the experiment,
• estimate the parameters of each group of data t_iEc, i = 1, . . . , n_Ec, for each pair (E, c);
• test these estimates t̂_Ec for each E and c for compatibility.
However, do all the methods follow the above rules? For instance, if the third condition (the Markov property) is not satisfied, we should not analyze separate events but consider the chain as a whole (for each E and c), which may complicate the analysis. Therefore, we must first test whether the decay times t_iEc of E_j and the corresponding t_iEc of E_{j+1} are indeed uncorrelated.
Let r denote the sample correlation coefficient of the paired decay times of two successive steps, r = Σ_{i=1}^{n} (x_i − m_1)(y_i − m_2) / (n σ_1 σ_2), where m_1, m_2 and σ_1, σ_2 are the sample means and standard deviations of the two samples. In the ideal case (infinite n, absolutely exact m_1, m_2, σ_1, σ_2) we should have r = 0. However, in our case of finite data, the calculation of r will yield a nonvanishing value. Monte-Carlo tests showed that the main factor defining the spread of the sample values of r is the sample size n. A rigorous mathematical study of the distribution of sample correlation coefficients can be found in [2], but it covers only the asymptotics of normal event distributions and is described by a very cumbersome formula involving infinite series, which can hardly be used in practice.
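This Monte-Carlo behaviour can be reproduced with a short simulation. The sketch below (Python/NumPy; the lifetimes T1, T2 and the trial count are illustrative assumptions) draws pairs of independent exponential samples of size n and records the spread of the sample correlation coefficient r:

```python
import numpy as np

rng = np.random.default_rng(1)
T1, T2 = 1.0, 2.0   # hypothetical mean lifetimes of two successive decay steps

def r_spread(n, trials=10_000):
    """Sample correlation coefficients r of two independent exponential samples of size n."""
    x = rng.exponential(T1, (trials, n))
    y = rng.exponential(T2, (trials, n))
    xc = x - x.mean(axis=1, keepdims=True)
    yc = y - y.mean(axis=1, keepdims=True)
    return (xc * yc).sum(axis=1) / np.sqrt((xc**2).sum(axis=1) * (yc**2).sum(axis=1))

for n in (10, 20, 60):
    r = r_spread(n)
    print(n, round(r.min(), 2), round(r.max(), 2), round(r.mean(), 3))
```

Even though the underlying samples are independent, for small n individual values of r routinely reach beyond ±0.4, while their mean stays near zero; the spread shrinks markedly as n grows.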
However, based on the results of Table 1, we can specify an interval (e.g., [−0.4, 0.4]) and say that if the obtained sample correlation coefficient falls within this range, the tested sample does not contradict the hypothesis that the data are uncorrelated.
To verify whether successive radioactive decays are correlated, we can use data with rather good statistics, e.g., the chain 288Mc → 284Nh → 280Rg; 63 events for each decay step were selected from [3, 4]. The performed analysis gave the estimate r = −0.18. Thus, real practice shows examples that do not reject the Markov property of the successive decays.
As for the data discussed in [5] (11 events in each sample), for the transition 'element 115 → element 113' we obtained r = −0.11, and for the samples 'element 113 → element 111', r = +0.30. Therefore, we can accept the hypothesis that the successive decays are uncorrelated, and we can base our analysis on separate events in each data group rather than on the fixed chains of events.
Furthermore, we can estimate the parameters T of all the exponential probability distributions of decay times t_i,

f(t) = (1/T) exp(−t/T), t ≥ 0,

where the expectation of t_i is T and its variance is T^2. The sum of the times of m decays, S = Σ_{i=1}^{m} t_i, has the (m, T)-gamma distribution [2]. Its density function is

g(t, m, T) = t^{m−1} exp(−t/T) / ((m − 1)! T^m), t ≥ 0,

where m is a positive integer and T is a positive real. Let us consider the quantity S_m = S/m. The density of the S_m distribution is g_m(t) = m · g(mt, m, T), and its mean and variance are equal to T and T^2/m, respectively. The maximum of g_m(t) (let it be t_x) is reached at the root of the equation g_m'(t) = 0, i.e., at t_x = T(m − 1)/m. However, the distribution of the difference of two gamma-distributed random quantities has no simple closed form, so it is difficult to estimate the trustworthiness of a test based on such a difference. Now, we describe a method to solve a very important problem: is the analyzed data really a sample from a single exponential distribution, or from a mixture?
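The stated properties of S_m are easy to check by simulation. The sketch below (Python/NumPy; T = 1.5 and m = 5 are arbitrary illustrative values) verifies that S_m has mean T and variance T^2/m, and that its density peaks near t_x = T(m − 1)/m:

```python
import numpy as np

rng = np.random.default_rng(2)
T, m = 1.5, 5          # hypothetical mean lifetime and number of summed decays
trials = 200_000

# S_m = (t_1 + ... + t_m) / m for exponential decay times t_i with mean T
s_m = rng.exponential(T, (trials, m)).mean(axis=1)

print(round(s_m.mean(), 3))   # close to T
print(round(s_m.var(), 3))    # close to T**2 / m

# the density of S_m peaks near t_x = T * (m - 1) / m
hist, edges = np.histogram(s_m, bins=60)
t_x = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])
print(round(t_x, 2))
```

The histogram peak position agrees with the closed-form mode within the bin resolution.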
This method is based on checking the sample relations between such fundamental characteristics of an event distribution as the mode, the median and the mean. In the case of exponentials, these relations are uniquely defined and differ between a single component and a mixture of components.
The relation between the mean E and the median M for a single exponential is M = E · ln 2. For the ratio K = median/mean in finite samples (e.g., n = 15), the statistical test gives K = 0.7 ± 0.17 at a confidence level of 67%. However, if the data consist of the decays of two (or more) sources, K can differ from ln 2 by orders of magnitude. Therefore, K is both a simple and a rather reliable indicator of whether the data originate from a single source or from a mixture of components.
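A quick simulation illustrates the sensitivity of this indicator. In the sketch below (Python/NumPy; the lifetimes 1.0 and 1000.0 and the 50/50 mixing ratio are illustrative assumptions, not experimental values), K scatters around ln 2 for a single exponential but drops far below it for a strongly heterogeneous mixture:

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 15, 20_000

# K = median / mean for samples from a single exponential scatters around ln 2
single = rng.exponential(1.0, (trials, n))
K_single = np.median(single, axis=1) / single.mean(axis=1)
print(round(K_single.mean(), 2))   # close to ln 2 ~ 0.69

# a 50/50 mixture of two exponentials with very different lifetimes
mix = np.where(rng.random((trials, n)) < 0.5,
               rng.exponential(1.0, (trials, n)),
               rng.exponential(1000.0, (trials, n)))
K_mix = np.median(mix, axis=1) / mix.mean(axis=1)
print(round(K_mix.mean(), 2))      # far below ln 2
```

The long-lived admixture inflates the sample mean much more than the median, so K collapses, which is exactly what makes the median/mean ratio a useful purity test.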

Discussion and Conclusions
The literature covering methods for this problem is very large (e.g., one can note the excellent paper [8]), but those methods are usually oriented to the analysis of large data volumes about which a certain amount of a priori information is available. Such are all the methods based on regression techniques or likelihood ratios. To advance in the analysis of low-statistics data, the accent was put on non-parametric methods, employing only such fundamental sample properties as the sample mean, the median and the mode.
The report confirms that a radioactive process really possesses at least the weak form of the Markov property, namely the uncorrelatedness of the decays, which is a necessary condition for the correct use of the mathematical techniques. It provides an analysis of various methods of constructing optimal and easily interpretable confidence intervals using order statistics, and it suggests a simple non-parametric criterion for testing exponential data for admixtures by comparing the sample median with the sample mean. As an example, this criterion applied to the data of [9] reveals that some of them are most probably not pure.
Table 1 collects the minimum, maximum and mean values of r as functions of the sample size n.
The table demonstrates that significant deviations of the sample correlation coefficient from zero occur for finite samples (especially for n = 10 and 20).