Industrial activities involving Naturally Occurring Radioactive Materials (NORM) are found in some of the most important industrial sectors worldwide, such as oil and gas facilities, metal production, the phosphate industry and zircon treatment. The radioactive characterization of the materials involved in their production processes is therefore essential in order to assess the potential radiological risk for workers and the natural environment. High-resolution gamma spectrometry is a versatile non-destructive radiometric technique that makes the simultaneous determination of several radionuclides possible with little sample preparation.
However, NORM samples cover a wide variety of densities and compositions, as opposed to the standards used in gamma efficiency calibration, which are either water-based solutions or standard/reference sources of similar composition. For that reason, self-absorption effects (especially in the low-energy range) must be corrected individually for every sample. In this work, an experimental and a semi-empirical methodology for self-absorption correction were applied to NORM samples, and the results obtained were compared critically in order to establish the best practice according to the circumstances of an individual laboratory.
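A widely used experimental route to such corrections is the transmission (Cutshall) method, in which the photon transmission through the sample and through the calibration standard is measured at each energy of interest. As a minimal illustrative sketch (not the specific procedure of this work; function names and the two transmission values are assumptions for the example), the correction can be computed as:

```python
import math

def attenuation_factor(T):
    """Self-attenuation factor for a slab-like source, from the measured
    photon transmission T = I/I0 through it at a given gamma energy.
    With mu*t = -ln(T), the factor is (mu*t)/(1 - exp(-mu*t)) = -ln(T)/(1 - T).
    """
    if not 0.0 < T <= 1.0:
        raise ValueError("transmission must be in (0, 1]")
    if math.isclose(T, 1.0):
        return 1.0  # limit for a non-attenuating sample
    return -math.log(T) / (1.0 - T)

def self_absorption_correction(T_sample, T_standard):
    """Multiplicative correction to an activity obtained with the
    calibration standard's efficiency (Cutshall transmission method)."""
    return attenuation_factor(T_sample) / attenuation_factor(T_standard)

# Hypothetical low-energy case: a dense NORM sample transmits less than
# the water-based calibration standard, so the correction exceeds 1.
correction = self_absorption_correction(0.3, 0.8)
```

The correction is greater than one when the sample attenuates more strongly than the standard, which is the typical situation for dense NORM matrices at low energies.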
This methodology was applied to samples from a TiO2 factory (a NORM industry) located in the south-west of Spain, where the activity concentrations of several radionuclides from the uranium and thorium series were measured throughout the production process. These results are presented in this work.