Year
2016
Abstract
The Enrichment Meter Principle (EMP) is the simple yet powerful physical idea that underpins a common kind of uranium enrichment determination using straightforward nondestructive gamma-ray spectroscopy. EMP was developed decades ago and has been shown to be capable of high accuracy, but its correct and reliable implementation requires attention to certain details. Enrichment results are often reported erroneously and interchangeably in mass percent or atom percent, neglecting the subtle but important differences that arise from the molar masses of the uranium isotopes. When the instrument calibration is performed and applied over a narrow range of isotopic compositions, the differences between atom percent and mass percent may not be very evident, but over a wide range of isotopic fractions they can lead to a significant bias. In the present work we emphasize that a wide-range linear and proportional response requires the instrument calibration to be performed in terms of atom fraction. We performed careful measurements using certified reference material standards of uranium, spanning the range from 0.3206 to 93.2330 at%, supplied by New Brunswick Laboratory. The observed trends in the measurement results are discussed. We derived the definitive equations for atom percent and mass percent, taking into consideration the naturally occurring isotopes of uranium, namely, 234U, 235U, and 238U. We tabulated the material correction factors suitable for applying a calibration performed with one chemical form to the assay of a different chemical form, and, for the first time, we provide guidance on the uncertainty associated with these transfer factors.
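The distinction between atom percent and mass percent described above comes from weighting each atom fraction by its isotope's molar mass. The following minimal sketch (the function name and structure are illustrative, not from the paper; molar masses are standard published values) shows the conversion for 235U across the three naturally occurring isotopes:

```python
# Illustrative sketch: convert atom percent of each uranium isotope
# to the mass percent of 235U. Molar masses in g/mol are standard
# atomic mass values for 234U, 235U, and 238U.
MOLAR_MASS = {"U234": 234.0410, "U235": 235.0439, "U238": 238.0508}

def u235_mass_percent(at234, at235, at238):
    """Weight each atom percent by its molar mass, then take the
    235U share of the total mass."""
    total_mass = (at234 * MOLAR_MASS["U234"]
                  + at235 * MOLAR_MASS["U235"]
                  + at238 * MOLAR_MASS["U238"])
    return 100.0 * at235 * MOLAR_MASS["U235"] / total_mass

# Example: approximate natural uranium composition in atom percent.
nat_wt = u235_mass_percent(0.0054, 0.7204, 99.2742)
print(nat_wt)  # slightly below 0.7204, since 235U is lighter than 238U
```

Because 235U is lighter than the dominant 238U, its mass percent is always slightly below its atom percent, which is why a calibration expressed in one quantity acquires a bias when applied over a wide range in the other.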