Single photon detection and signal analysis for high sensitivity dosimetry based on optically stimulated luminescence with beryllium oxide

Single photon detection applied to optically stimulated luminescence (OSL) dosimetry is a promising approach due to the low level of luminescence light and the known statistical behavior of single photon events. Time resolved detection allows a variety of different and independent data analysis methods to be applied. Furthermore, using amplitude modulated stimulation imprints time and frequency information onto the OSL light and therefore allows for additional means of analysis. Considering the imprinted frequency information, data analysis using Fourier transform algorithms or other digital filters can be used to separate the OSL signal from unwanted light or events generated by other phenomena. This potentially lowers the detection limits of low dose measurements and might improve the reproducibility and stability of the obtained data. In this work, an OSL system based on a single photon detector, a fast and accurate stimulation unit and an FPGA is presented. Different analysis algorithms applied to the single photon data are discussed.


I. INTRODUCTION
Optically stimulated luminescence (OSL) has gained increasing importance in the field of personal dosimetry in recent years. Its principle is based on the release of small amounts of light induced by the prior absorption of ionizing radiation. In comparison to thermoluminescence, which is also a common method in personal dosimetry, the luminophore is stimulated by optical photons rather than by heat. This makes OSL easy to use, more flexible and less sensitive to changing environmental influences. For OSL, there are several possible luminophores, like CaF2 [1], BeO [2] or Al2O3 [3]. One suitable luminophore for OSL in personal dosimetry is beryllium oxide (BeO) because of its near tissue equivalent effective atomic number of 7.1 [4], [5]. The luminescence of BeO in the UV region below 370 nm can be effectively stimulated by broadband blue light. The optical properties of BeO are well known and were investigated in [6], [7] and [8]. Requirements placed on such systems are high accuracy, a very robust and stable measuring procedure, very good reproducibility and an easy and fast reading process. Therefore, a new generation of measurement systems based on the OSL of BeO was developed by the radiation physics group at TU Dresden. The emission of luminescence photons is a Poisson process; a time resolved detection of single photons thus gives access to this statistical character. Moreover, the time resolved detection in combination with modulated stimulation should be robust against noise.

J. Radtke, J. Sponner, C. Jakobi, J. Schneider, M. Sommer, T. Teichmann, W. Ullrich, J. Henniger and T. Kormoll are with the IKTP (Institute of Nuclear and Particle Physics), TU Dresden, 01069 Dresden, Germany (e-mail: radtke@asp.tu-dresden.de, sponner@asp.tu-dresden.de, jakobi@asp.tu-dresden.de, schneider@asp.tu-dresden.de, sommer@asp.tu-dresden.de, teichmann@asp.tu-dresden.de, ullrich@asp.tu-dresden.de, henniger@asp.tu-dresden.de, kormoll@asp.tu-dresden.de).

II. MATERIAL AND METHODS
BeO is most efficiently stimulated by blue light around 435 nm [2]. In this work, a blue LED (CREE XT-E Royal Blue) is used as the stimulation source. The peak wavelength of this LED is around 450 nm, while the wavelength of the luminescence light is below 370 nm [5]. A combination of different filters is used so that mainly luminescence light passes on to the sensor. A photon counting head by Hamamatsu, H10682-210, was used as the detector. This detector has an internal trigger circuit which can produce a voltage signal as often as every 20 ns. The schematics of the system are outlined in figure 1. The signal from the photon detector is sampled by an FPGA and a timestamp with 10 ns resolution is assigned to each event.
Because of their origin, the single photon signals are independent of each other and obey Poisson statistics. Consequently, the measurement becomes a statistical experiment. The character of this statistic and the associated uncertainties are well known, which allows an accurate determination of the uncertainties; these are limited only by the counting statistics themselves. The laws of Poisson statistics show that multiplication or convolution of the Poisson-distributed events has no effect on the statistical character of the measurement. Therefore, the statistical nature of the measurement is preserved despite the use of weighting for analysis purposes or dead time correction. An advantage over a current measurement is the discrete character of the single events: individual events can be separated and resolved, which allows a clear identification, and they can be digitized, so that inaccuracies of the measurement device have little influence. The events are stored in a listmode manner in memory for later analysis. This gives the opportunity to apply a variety of different and independent data analysis methods.

The luminescence process of BeO has a decay constant of around 26 µs at room temperature [9]. The stimulation light is modulated synchronously with the sampling clock. The modulation has a period of 1 ms and a duty cycle of 0.5 (cf. figure 2). These parameters ensure that the luminescence signal has decayed completely within the stimulation free phase, preventing a superposition of decay processes from successive periods. With this regime, time and frequency information is imprinted on the signal. This allows analysis in the time and frequency domain, which delivers additional options for evaluation compared to continuous wave stimulation.
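As an illustration, the modulated stimulation and the resulting time-dependent OSL count rate can be modeled as an inhomogeneous Poisson process. The following Python sketch generates simulated photon timestamps by thinning; the peak count rate RATE_MAX, the simple exponential rate model and the simulated measurement time are hypothetical assumptions chosen for illustration, while the 1 ms period, 0.5 duty cycle, 26 µs decay constant and 10 ns timestamp resolution are the values described above.

```python
import numpy as np

rng = np.random.default_rng(0)

PERIOD = 1e-3    # modulation period: 1 ms
DUTY = 0.5       # duty cycle of the stimulation
TAU = 26e-6      # BeO luminescence decay constant [9]
T_MEAS = 0.1     # simulated measurement time in s (hypothetical)
RATE_MAX = 2e5   # peak OSL count rate in 1/s (hypothetical)

def rate(t):
    """Time-dependent OSL count rate: rises towards RATE_MAX while the
    LED is on, then decays exponentially with TAU after switch-off."""
    phase = t % PERIOD
    on_level = RATE_MAX * (1.0 - np.exp(-phase / TAU))
    off_level = (RATE_MAX * (1.0 - np.exp(-DUTY * PERIOD / TAU))
                 * np.exp(-(phase - DUTY * PERIOD) / TAU))
    return np.where(phase < DUTY * PERIOD, on_level, off_level)

def sample_timestamps(t_meas, rate_fn, rate_max):
    """Draw timestamps of an inhomogeneous Poisson process by thinning:
    generate candidates at the constant rate rate_max and keep each
    candidate at time t with probability rate_fn(t) / rate_max."""
    n_cand = rng.poisson(rate_max * t_meas)
    t = np.sort(rng.uniform(0.0, t_meas, n_cand))
    keep = rng.uniform(0.0, 1.0, n_cand) < rate_fn(t) / rate_max
    return t[keep]

ts = sample_timestamps(T_MEAS, rate, RATE_MAX)
# quantize to the 10 ns timestamp resolution of the FPGA
ts = np.round(ts / 10e-9) * 10e-9
```

Such a simulated listmode stream can be fed into the same analysis chain as measured data, which is useful for checking the filters discussed below.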
Different kinds of weighting functions are used as digital filters. The rectangular weighting is adapted to the pulsed stimulation with square wave modulation and the known decay process of BeO: every count appearing during the stimulation and the first 180 µs of the decay process is weighted by a factor of 1, while signals during the rest of the period are ignored. Another weighting that is used is gate-time weighting. In this case, the signals are collected into two different gates. The first gate takes all signals during the stimulation process into account, while the second gate only considers the signals of the decay process. Thereby it is possible to recognize irregularities in one of the gates and correct them, or to use the ratio of both gates for further analysis. Considering the imprinted frequency information, data analysis using Fast Fourier Transform (FFT) algorithms or other digital filters can be used to separate the OSL signal from unwanted light or events generated by other phenomena. This can have a significant effect on the lower detection limit. When using this system for dose determination over several orders of magnitude, some properties of the detector must be corrected for. For example, there is no obvious technique for dead time correction of timestamp data convolved with a filter function. This is a challenge because the timestamp character of the signal and the modulated stimulation lead to an inhomogeneous Poisson process with a time-dependent event rate.
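The two time-domain weightings described above can be sketched directly on listmode timestamps. The following is a minimal illustration, assuming timestamps in seconds relative to the start of the first modulation period; the function names are hypothetical, while the 1 ms period, the 0.5 duty cycle and the 180 µs decay window are taken from the text.

```python
import numpy as np

PERIOD = 1e-3          # modulation period: 1 ms
STIM = 0.5 * PERIOD    # stimulation (LED on) part of each period
DECAY_WINDOW = 180e-6  # accepted part of the decay after switch-off

def rectangular_weighting(timestamps):
    """Weight events during stimulation and the first 180 us of the
    decay with a factor of 1; ignore the rest of the period."""
    phase = timestamps % PERIOD
    return int(np.count_nonzero(phase < STIM + DECAY_WINDOW))

def gate_time_weighting(timestamps):
    """Collect events into two gates: one covering the stimulation
    phase, one covering the decay phase of each period."""
    phase = timestamps % PERIOD
    gate_stim = int(np.count_nonzero(phase < STIM))
    gate_decay = int(np.count_nonzero(phase >= STIM))
    return gate_stim, gate_decay
```

Because both weightings only count (or discard) events, they preserve the Poisson character of the data, as noted above.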

III. RESULTS AND DISCUSSION
In figure 3, the count rate synchronized to the stimulation frequency is shown. The measurement without dosimeter shows steep rise and decay edges, which indicate the rise and fall of the stimulation signal in the first 500 µs. Signals with dosimeters, whether irradiated or not, reveal the typical decay curves. Different data analysis methods were applied to this signal; three representative methods are examined in more detail. The first is the simple counting method, i.e. integration over all signals for the whole measuring time, which serves as a comparison. Using the imprinted time information, the rectangular weighting is discussed in the following. Because of the periodically modulated stimulation, frequency information is also imprinted on the measured signal; this is investigated using the Fourier transform, implemented with an FFT algorithm. These three representative methods are used to discuss the strengths and weaknesses of each approach. The total measurement time was 30 s for each curve.
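The frequency-domain analysis can be sketched as follows: the timestamps are binned into a rate histogram and the FFT magnitude at the 1 kHz modulation frequency is taken as the signal estimate. This is an illustrative sketch, not the authors' implementation; the histogram bin width and the function name are assumptions.

```python
import numpy as np

PERIOD = 1e-3         # modulation period: 1 ms
F_MOD = 1.0 / PERIOD  # modulation frequency: 1 kHz
BIN = 10e-6           # histogram bin width (hypothetical choice)

def fft_signal_amplitude(timestamps, t_meas):
    """Bin the timestamps into a rate histogram and return the FFT
    magnitude at the modulation frequency. The OSL signal concentrates
    at F_MOD and its harmonics, while uncorrelated background spreads
    over the whole spectrum."""
    n_bins = int(round(t_meas / BIN))
    hist, _ = np.histogram(timestamps, bins=n_bins, range=(0.0, t_meas))
    spectrum = np.fft.rfft(hist)
    freqs = np.fft.rfftfreq(n_bins, d=BIN)
    return np.abs(spectrum[np.argmin(np.abs(freqs - F_MOD))])
```

For events correlated with the modulation, this amplitude is large compared to the evaluation of an uncorrelated background of the same total count, which is the basis for the noise suppression discussed above.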
To compare the different methods, the measurement deviation for a dose range between 2 µGy and 5 mGy was investigated. In this range, the dead time correction does not have a large impact on the results. For the analysis, a batch of 20 different BeO detectors was evaluated to compare the reproducibility of each method; the same measurements were analyzed using the various data analysis methods. Figure 4 presents the results of the dose measurements with the different methods. The rectangular weighting has the lowest uncertainty over all dose ranges, in particular for low doses. The difference between the counting method and the rectangular weighting is significant: using the imprinted time information and rectangular weighting improves the stability of dose measurements by around 20 % near the lower detection limit. For the higher dose range, the chosen method does not have a major impact on the uncertainty.
For the measuring process it is of paramount importance to separate the dose induced signal from the background signal. The background can be attributed to unwanted light detection due to static discharge, scintillation or other uncorrelated events, which manifest as spikes in the count rate. Empirical studies have revealed the character of such disturbances: the mean length of a spike is around 10 µs, and each spike consists of independently occurring events which can be described by a Poisson process. The number of events per spike varies widely, depending on origin and type; on average there are 10 to 20 detected events per disturbance. Using this information, it was possible to simulate such disturbances. Since the occurrence of the spikes is uncorrelated with the measurement process or event rate, the spike arrival times were simulated independently of the stimulation function. This was used to investigate the reproducibility and stability of the system by combining measured signals with different kinds of simulated disturbances. 10 spikes were simulated, each with 10 to 20 events, spread over a measurement time of 10 seconds with an external simulation software. Subsequently, the disturbance signal was combined with the measurement signal, taking the detector dead time into account. The impact of the disturbances was investigated for different irradiated doses and for the representative data analysis methods (figure 5). The counting method shows clear differences in comparison to the rectangular weighting. In the range of higher doses, from around 100 µGy, such disturbances have little impact on any of the methods, because of the very good statistics of the measurement signal and the high event rate.
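The spike model described above can be sketched in a few lines; the parameters (10 spikes of 10-20 events within about 10 µs, spread over 10 s, independent of the stimulation) follow the text, while the uniform event placement within a spike and the function name are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

N_SPIKES = 10       # number of simulated disturbance spikes
SPIKE_LEN = 10e-6   # mean spike length of around 10 us
T_SPREAD = 10.0     # measurement time over which the spikes are spread

def simulate_spikes():
    """Generate uncorrelated disturbance spikes: each spike starts at a
    time drawn uniformly over the measurement (independent of the
    stimulation function) and contains 10-20 events within ~10 us."""
    events = []
    for t0 in rng.uniform(0.0, T_SPREAD, N_SPIKES):
        n = rng.integers(10, 21)  # 10 to 20 events per spike
        events.append(t0 + rng.uniform(0.0, SPIKE_LEN, n))
    return np.sort(np.concatenate(events))

spike_ts = simulate_spikes()
```

Merging such simulated spike timestamps with a measured listmode stream allows the robustness of each analysis method against disturbances to be tested in a controlled way.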

IV. CONCLUSION
Through the measurement of single events instead of a current measurement, the measurement process is based on the statistical nature of a counting experiment. On the basis of the well-known character of Poisson-distributed independent events, the uncertainty can be expressed by the laws of statistics. Every single event receives a unique time stamp, whereby the maximum amount of accessible information of the measurement process is retained. This information can be used for any kind of data analysis. Using amplitude-modulated stimulation imprints time and frequency information into the OSL signal and therefore allows additional means of analysis. Compared to conventional analytical methods, this has the advantage of simple identification and discrimination of any kind of disturbance without affecting the actual signal. Thereby the signal-to-noise ratio can be optimized efficiently, which also improves the stability of the whole measurement process. The stability of the measurement, especially for short measuring times or small count rates, can be improved significantly. Investigation of the background signal shows that these disturbances can have many different sources. Disturbances which are independent of the modulation or measurement process and are caused by external sources like cosmic rays or spontaneous conversions inside the measurement device can be almost completely suppressed, as was shown in figure 5. The effect of count rate dependent parasitic events, which were identified as detector dependent afterpulsing, is the subject of further investigation. The focus for future development is set on a calibration process which is independent of the measurement process or the device used. The time stamp based dead time correction also needs further investigation; it is then to be implemented on an FPGA in order to allow a parallel evaluation of the measurement process.