There are two main types of errors that may be made when reporting levels of contaminants: reporting a contaminant as present when it actually is not (a false positive) and reporting a contaminant as not present when it actually is (a false negative).
It is the goal of the ESER program to minimize the error of saying something is not present when it actually is, to the extent that is reasonably practicable. To do this, a two standard deviation (2s) reporting level is used. The standard deviation is a measure of the variation about the mean. In a distribution of results for one sample, the average result, plus or minus (±) two standard deviations (2s) of that average, approximates the 95% confidence interval for that average. When a net sample result is more than 2s above zero, more than 97.5% of the time the value will have come from a distribution with an average greater than zero (Figure 5). The uncertainty of measurements in this report is denoted by following the result with a “±” 2s uncertainty term, and all results that are greater than 2s from zero are reported in the text (all data are reported in Appendix C).
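The 2s reporting rule described above can be sketched in a few lines of Python. The function name and the numeric values below are illustrative, not taken from this report:

```python
def reportable(net_result: float, s: float) -> bool:
    """Return True when a net result exceeds the 2s reporting level,
    i.e. when it is more than two standard deviations above zero."""
    return net_result > 2.0 * s

# A result of 5.3 with s = 2.1 (2s uncertainty of 4.2) is reportable;
# a result of 1.8 with the same s is not.
print(reportable(5.3, 2.1))  # True
print(reportable(1.8, 2.1))  # False
```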
Samples with true values at or barely above the 2s limit have a high probability that the level of radioactivity will be reported as less than 2s (false negative results; Figure 5). Conversely, results at or barely above the 2s reporting limit have a relatively low probability of indicating radioactivity above zero when none is actually present (false positive results; Figure 5). The probability that a sample will be reported as not being radioactive when it actually is radioactive falls as the level of radioactivity increases. The level of radioactivity at which the sample will have a less than 5% probability of being reported as not being detectably radioactive is the minimum detectable activity (MDA; Figure 6). The MDA per sample weight or volume is called the minimum detectable concentration (MDC). All results with measured levels greater than 2s and the MDC will be specifically highlighted in this report.
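The report does not state how its laboratories compute the MDA, but a widely cited formulation is Currie's approximation, which the following sketch uses with hypothetical inputs (background counts, detector efficiency, and count time are assumed parameters, not values from this report):

```python
import math

def currie_mda(background_counts: float, efficiency: float,
               count_time_s: float) -> float:
    """MDA sketch using Currie's approximation for the detection
    limit in counts, L_D ~= 2.71 + 4.65*sqrt(B), converted to an
    activity (Bq) by dividing by efficiency and count time."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit, counts
    return l_d / (efficiency * count_time_s)          # activity, Bq

def mdc(mda_bq: float, sample_mass_kg: float) -> float:
    """MDC = MDA per sample weight (here Bq/kg), as defined in the text."""
    return mda_bq / sample_mass_kg

# Hypothetical: 100 background counts, 30% efficiency, 60,000 s count.
print(currie_mda(100.0, 0.3, 60000.0))
```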
When radiological measurements are made, it is often of interest to determine whether concentrations differ between locations or periods of time. For example, if the INEEL were a significant source of offsite contamination, concentrations of contaminants would be higher at INEEL locations than at Boundary locations which, in turn, would be higher than at Distant locations due to dispersal. To investigate this, statistical tests are used. Specifically, an independent samples t-test is used to determine if there are significant differences between the average gross alpha and gross beta concentrations at INEEL, Boundary, and Distant locations. Groups are considered significantly different if the 95% confidence intervals for their averages do not overlap (t-test with α = 0.05).
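An independent-samples comparison of this kind can be sketched with Welch's t statistic, which does not assume equal variances between groups. The concentration values below are hypothetical and are not ESER measurements:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples; |t| greater
    than the critical value (~2.0-2.3 at alpha = 0.05 for small
    samples) suggests a significant difference between the means."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / na + vb / nb)

# Hypothetical gross beta concentrations at two location groups.
ineel = [2.1, 2.4, 1.9, 2.2, 2.0]
distant = [1.8, 2.0, 1.7, 1.9, 1.8]
print(round(welch_t(ineel, distant), 2))
```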
Radioactivity in Our World

Radioactivity has always been a part of the natural environment in the form of cosmic
radiation, cosmogenic radionuclides [carbon-14 (14C), beryllium-7 (7Be), and tritium (3H)], and naturally occurring radionuclides, such as potassium-40 (40K), and the thorium, uranium, and actinium series radionuclides, which have very long half-lives. Additionally, human-made radionuclides were distributed throughout the world beginning in the early 1940s. Atmospheric testing of nuclear weapons from 1945 through 1980 and nuclear power plant accidents, such as the Chernobyl accident in the former Soviet Union during 1986, have resulted in fallout of detectable radionuclides around the world. This natural and manmade global fallout radioactivity is referred to as background radiation.
Radionuclides present in our environment can give both internal and external doses (Table 1). Internal dose is received as a result of the intake of radionuclides. The major routes of intake of radionuclides for members of the public are ingestion and inhalation. Ingestion includes the intake of radionuclides from drinking milk and water and from consumption of food products. Inhalation includes the intake of radionuclides through breathing dust particles containing radioactive materials.
Natural radiation doses vary based on local geology and elevation.
During the last 100 years, research has been conducted in an attempt to understand the effects of radiation on humans and the environment. Much of this research used standard epidemiological and toxicological approaches to characterize the response of populations and individuals to high radiation doses, and a good understanding of the risks associated with high doses was achieved. At low radiation exposures, however, cellular repair mechanisms mitigate damage, so the risks at these levels are less well understood. The problem is compounded because scientists must search for effects of low-level exposures amid exposure to much larger amounts of background radiation. The only measurable increase in cancer incidence has occurred following high radiation doses; mathematical models are therefore used to predict risks from low radiation doses.
Regulatory dose limits are set well below levels where measurable health effects have been observed. The total radiation dose limit for individual members of the public as defined by the Code of Federal Regulations (10 CFR 20.1301) is 1 mSv/y (100 mrem/y), not including the dose contribution from background radiation. Limits on emissions of radionuclides to the air from DOE facilities are set such that they will not result in a dose greater than 0.1 mSv/y (10 mrem/y) to any member of the public (40 CFR 61.92). DOE drinking water criteria set a limit of 0.04 mSv/y (4 mrem/y) for the ingestion of drinking water (DOE Order 5400.5), and EPA limits on drinking water supplies specify low allowable limits for radioactive constituents (40 CFR Parts 9, 141, and 142). DOE Order 5400.5 lists derived concentration guide (DCG) values, which are the concentrations in air and water that, if a person were exposed to them continuously (ingested or inhaled under certain assumptions), would result in the dose limit. DCG values are used as a reference to ensure observed concentrations are lower than concentrations that would result in a dose near the limit. ESER Program laboratories analyze for radionuclides at levels ranging from 10 to over one million times lower than those that would result in a dose near the limits (Table B-1, Appendix B).
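A DCG comparison amounts to a simple ratio, sketched below. The tritium value of 2,000,000 pCi/L is a commonly cited DOE Order 5400.5 drinking-water DCG, and the measured concentration is hypothetical, not a value from this report:

```python
def fraction_of_dcg(measured: float, dcg: float) -> float:
    """Ratio of a measured concentration to its derived concentration
    guide (DCG); values well below 1 indicate the concentration is far
    below the level corresponding to the dose limit."""
    return measured / dcg

# Hypothetical: tritium in water at 400 pCi/L against a 2,000,000 pCi/L DCG.
print(fraction_of_dcg(400.0, 2_000_000.0))  # 0.0002
```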