R. Mitchell - S. M. Stoller Corporation
R. Wilhelmsen - CH2M-WG Idaho
M. Verdoorn - Battelle Energy Alliance
Quality assurance and quality control programs are maintained by contractors conducting environmental monitoring and by laboratories performing environmental analyses.
The purpose of a quality assurance and quality control program is to ensure precise, accurate, representative, and reliable results, and to maximize data completeness. Another key objective of a quality program is to ensure that newly collected data are comparable to previously collected data. Elements of typical quality assurance programs include, but are not limited to, the following (ASME 2001, ASME 1989, EPA 1998):
Data reported in this document were obtained from several commercial, university, government, and government contractor laboratories. In 2006, the Idaho Cleanup Project (ICP) contractor used General Engineering Laboratories (GEL) and Sanford Cohen and Associates for radiological and inorganic analyses. The Idaho National Laboratory (INL) Site Drinking Water Program used GEL for radiological analyses, Microwise Laboratories (now Energy Laboratories) of Idaho Falls for inorganic and bacteriological analyses, and Environmental Health Laboratories (now Underwriters Laboratory) for inorganic and organic analyses. In addition, the air monitoring program used Severn-Trent St. Louis, and the Liquid Effluent Program used Southwest Research Institute, for some analyses.
The Environmental Surveillance, Education and Research Program (ESER) contractor used the Environmental Assessments Laboratory located at Idaho State University for gross radionuclide analyses (gross alpha, gross beta, and gamma spectrometry). Teledyne Brown Engineering of Knoxville, TN was used for specific radionuclide analyses (e.g., strontium-90 [90Sr], americium-241 [241Am], plutonium-238 [238Pu], and plutonium-239/240 [239/240Pu]). The U.S. Department of Energy’s (DOE’s) Radiological and Environmental Sciences Laboratory (RESL) performed radiological analyses for the U.S. Geological Survey (USGS). The USGS National Water Quality Laboratory conducted nonradiological analyses. All these laboratories participated in a variety of programs to ensure the quality of their analytical data. Some of these programs are described below.
The Mixed Analyte Performance Evaluation Program (MAPEP) is administered by DOE’s RESL. Since 1994, DOE has required all laboratories performing analyses in support of the Office of Environmental Management to participate in MAPEP. The program generally distributes samples of air, water, vegetation, and soil for analysis during the first and third quarters. Both radiological and nonradiological constituents are included in the program. Results can be found at http://www.inl.gov/resl/mapep/reports.html (DOE 2005).
Comparisons of the air and water MAPEP results for the laboratories used by INL Site environmental monitoring organizations in 2006 are presented in Figure 10-1 and Figure 10-2 for gross alpha/beta and actinides. Results for all laboratories were qualified as acceptable for these analyses.
The DOE RESL participates in a traceability program administered through the National Institute of Standards and Technology (NIST). RESL prepares requested samples for analysis by NIST to confirm its ability to adequately prepare sample material to be classified as NIST traceable. NIST also prepares several alpha-, beta-, and gamma-emitting standards, generally in liquid media, for analysis by RESL to confirm its analytical capabilities. RESL maintained NIST certifications in both preparation and analysis in 2006.
To verify the quality of the environmental dosimetry program conducted by the INL contractor and the ESER contractor, the Operational Dosimetry Unit participates in International Environmental Dosimeter Intercomparison Studies. The Operational Dosimetry Unit’s past results have been within ±30 percent of the test exposure values on all intercomparisons. This is an acceptable value, consistent with those of other analyses, which range from ±20 percent to ±35 percent.
The Operational Dosimetry Unit of the INL contractor also conducts in-house quality assurance testing during monthly and quarterly environmental thermoluminescent dosimeter (TLD) processing periods. The quality assurance (QA) test dosimeters are prepared by a QA program administrator, and the delivered irradiation levels are blind to the TLD processing technician. The results of each QA test have remained within the 20 percent acceptance criterion during each testing period.
INL Site contractors participate in additional performance evaluation programs, including those administered by the International Atomic Energy Agency, the U.S. Environmental Protection Agency (EPA), and the American Society for Testing and Materials. Contractors are required by law to use laboratories certified by the state of Idaho or certified by another state whose certification is recognized by the state of Idaho for drinking water analyses. The Idaho State Department of Environmental Quality oversees the certification program and maintains a listing of approved laboratories. Where possible (i.e., the laboratory can perform the requested analysis) the contractors use such state-approved laboratories for all environmental monitoring analyses.
As a measure of the quality of data collected, the ESER contractor, the INL contractor, the ICP contractor, the USGS, and other contractors performing monitoring use a variety of quality control samples of different media. Quality control samples include blind spike samples, duplicate samples, and split samples.
Groups performing environmental sampling use blind spikes to assess the accuracy of the laboratories selected for analysis. Contractors purchase samples spiked with known amounts of radionuclides or nonradioactive substances from suppliers whose spiking materials are traceable to NIST. These samples are then submitted to the laboratories with regular field samples, using the same labeling and sample numbering system. The analytical results are expected to agree with the known value within a set of performance limits.
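The accuracy check described above, in which a laboratory's reported result is compared against the known spike value, can be sketched as follows. The function names and the 25 percent limit are illustrative assumptions; actual performance limits are set in each program's quality assurance plan.

```python
def relative_bias(reported, known):
    """Relative bias of a reported result against the known spike value."""
    return (reported - known) / known

def spike_acceptable(reported, known, limit=0.25):
    """True if the result falls within +/- limit of the known value.

    The 25 percent limit is an illustrative assumption; each program's
    QA plan specifies the actual performance limits.
    """
    return abs(relative_bias(reported, known)) <= limit

# Hypothetical blind spike with a known activity of 10.0 pCi/L
print(spike_acceptable(11.2, 10.0))  # within 12 percent -> True
print(spike_acceptable(13.5, 10.0))  # 35 percent high -> False
```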
Monitoring organizations also collect a variety of quality control samples as a measure of the precision of sampling and analysis activities. One type is a duplicate sample, where two samples are taken from a single location at the same time. A second type is a split sample, where a single sample is taken and later divided into two portions that are analyzed separately. Contractors specify in quality assurance plans the relative differences expected to be achieved in reported results for both types of quality assurance samples.
Both the ESER contractor and the INL contractor maintained duplicate air samplers at two locations during 2006. The ESER contractor operated duplicate samplers at the locations in Mud Lake and at the Experimental Field Station. The INL contractor duplicate samplers were located at the Materials and Fuels Complex and at the Van Buren Boulevard Gate. Filters from these samplers were collected and analyzed in the same manner as filters from regular air samplers. Graphs of gross beta activity for the duplicate samplers are shown in Figure 10-3 and Figure 10-4. The figures show that duplicate sample results tracked each other well.
Another measure of data quality can be made by comparing data collected simultaneously by different organizations. The ESER contractor, the INL contractor, and the state of Idaho’s INL Oversight Program collected air monitoring data throughout 2006 at four common sampling locations: the distant locations of Craters of the Moon National Monument and Idaho Falls, and on the INL Site at the Experimental Field Station and Van Buren Boulevard Gate. Data from these sampling locations for gross beta compared favorably and are shown in Figure 10-5a and Figure 10-5b.
The ESER contractor collects semiannual samples of drinking and surface water jointly with the INL Oversight Program at five locations in the Magic Valley area and two shared locations near the INL Site. Table 10-1 contains intercomparison results of the gross alpha, gross beta, and tritium analyses for the 2006 samples taken from these locations. The paired results were statistically the same for 95 percent (40 of 42) of the comparisons made.
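One common way to decide whether two independently measured results are "statistically the same" is to test whether their difference is smaller than their combined measurement uncertainty at roughly the 95 percent confidence level. The sketch below illustrates that approach; it is an assumption for illustration, not necessarily the exact statistical test behind the Table 10-1 comparisons.

```python
import math

def results_agree(x1, u1, x2, u2, k=1.96):
    """Test whether two measured values agree within their combined
    uncertainties.

    x1, x2 : reported concentrations (same units)
    u1, u2 : one-sigma measurement uncertainties
    k      : coverage factor (1.96 ~ 95 percent, assuming normal errors)
    """
    return abs(x1 - x2) <= k * math.sqrt(u1**2 + u2**2)

# Hypothetical paired gross beta results (pCi/L) with 1-sigma uncertainties
print(results_agree(3.1, 0.4, 2.6, 0.5))  # difference 0.5 vs ~1.25 -> True
print(results_agree(3.1, 0.1, 2.6, 0.1))  # difference 0.5 vs ~0.28 -> False
```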
The USGS routinely collects groundwater samples simultaneously with the INL Oversight Program. Comparison results from this sampling are regularly documented in reports prepared by the two organizations.
The ICP contractor’s Liquid Effluent Monitoring Program has specific quality assurance/quality control objectives for monitoring data. Goals are established for accuracy, precision, and completeness, and all analytical results are validated following standard EPA protocols. All effluent sample results for 2006 were usable. The Liquid Effluent Monitoring Program submits three types of quality control samples:
During 2006, four sets of performance evaluation (PE) samples were submitted to the laboratory along with routine monitoring samples. Most results were within performance acceptance limits. Table 10-2 shows the number of results outside the performance acceptance limits. The laboratory was notified of these results so it could evaluate whether corrective action was required.
The relative percent difference (RPD) between the duplicate samples is used to assess data precision. Table 10-3 shows the results for 2006. Variations in the reported concentrations in the field duplicates are most likely the result of sample heterogeneity caused by variations in the amount of solids in the sample.
The analytical results for the equipment blank sample indicated that decontamination procedures are adequate.
The goal for completeness is to collect 100 percent of all required compliance samples. During 2006, this goal was met.
The groundwater sampling activities associated with Wastewater Land Application Permit (WLAP) compliance sampling follow established procedures and analytical methodologies.
During 2006, groundwater samples were collected from all of the Idaho Nuclear Technology and Engineering Center (INTEC) and Test Area North (TAN) WLAP monitoring wells that had sufficient water. Samples were not collected from aquifer well ICPP-MON-A-167, which was dry during April and October 2006; perched well ICPP-MON-V-191, which was dry in October 2006; and perched well TSFAG-05, which was dry during both April and October 2006. All of the samples required for permit compliance were collected.
All groundwater sample results were usable, except for some April 2006 sample results that were rejected as unusable during data validation because of quality control issues. Table 10-4 shows the April 2006 groundwater sample results that were rejected. The analytical laboratory was notified of the missed holding times, and the laboratory implemented corrective action to prevent recurrence. The Liquid Effluent Monitoring QA Program did not require notifying the analytical laboratory of the other rejected results.
Field quality control samples were collected or prepared during the sampling activity in addition to regular groundwater samples. Laboratories qualified by the ICP Sample and Analysis Management Organization performed all ICP groundwater analyses during 2006. Because TAN and INTEC are regarded as separate sites, quality control samples (duplicate samples, field blanks, and equipment blanks) were prepared for each site.
Duplicate samples are collected to assess natural variability and the precision of analyses. One duplicate groundwater sample was collected for every 20 samples collected or, at a minimum, five percent of the total number of samples collected. Duplicates were collected using the same sampling techniques and preservation as regular groundwater samples. The precision goal for duplicates is a relative percent difference (RPD) of 35 percent or less between the paired samples. In 2006, for the 84 duplicate pairs with detectable results, 94 percent had RPDs less than 35 percent. This high percentage of acceptable duplicate results indicates no significant problems with laboratory operations and good overall precision.
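The relative percent difference criterion can be computed directly from each pair of results. The sketch below applies the 35 percent goal stated in the text; the function names and the example concentrations are illustrative assumptions, and pairs are assumed to contain two detected values, since RPD is not meaningful for nondetects.

```python
def relative_percent_difference(a, b):
    """RPD between a sample result and its field duplicate:
    |a - b| / mean(a, b) * 100 (both values must be detections)."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

def fraction_meeting_goal(pairs, goal=35.0):
    """Fraction of duplicate pairs whose RPD meets the precision goal.

    The 35 percent default matches the WLAP precision goal in the text;
    the example pairs below are illustrative, not 2006 data.
    """
    met = sum(1 for a, b in pairs if relative_percent_difference(a, b) <= goal)
    return met / len(pairs)

# Three hypothetical duplicate pairs (concentrations in the same units)
pairs = [(10.0, 11.0), (5.0, 5.2), (2.0, 3.5)]
print(fraction_meeting_goal(pairs))  # 2 of 3 pairs within 35 percent
```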
Field blanks are collected to assess the potential introduction of contaminants during sampling activities. They were collected at the same frequency as the duplicate samples. Results from the field blanks did not indicate field contamination.
Equipment blanks (rinsates) were collected to assess the potential introduction of contaminants from incomplete decontamination activities. They were collected by pouring analyte-free water through the sample port manifold after decontamination and before subsequent use. Again, results from the equipment blanks indicate proper decontamination procedures.
Results from the duplicate, field blank, and equipment blank (rinsate) samples indicate that laboratory procedures, field sampling procedures, and decontamination procedures were used effectively to produce high quality data.
During the April 2006 groundwater sampling event, two PE samples were analyzed for total coliform and fecal coliform. These samples were within the quality control (QC) Performance Acceptance Limits.
During the April 2006 sampling event, one PE sample was analyzed for metals. The results were as follows:
During the October 2006 groundwater sampling event, one PE sample was analyzed for metals. The metals PE sample results were within the QC Performance Acceptance Limits.
The Drinking Water Program’s completeness goal is to collect, analyze, and verify 100 percent of all compliance samples. This goal was met during 2006.
The Drinking Water Program requires that 10 percent of the samples (excluding bacteria) collected be quality assurance/quality control samples to include duplicates, field blanks, trip blanks, blind spikes, and splits. This goal was met in 2006 for all parameters.
The RPD between duplicate samples is used to assess data precision. The INL and ICP contractors met the precision goals for the Drinking Water Program in 2006, and results are shown in Table 10-5. Variations in the reported concentrations in the field duplicates are most likely the result of sample heterogeneity caused by variations in the amount of solids in the sample. Relative percent difference was not calculated if either the sample or its duplicate was reported as a nondetect.
The ESER program met its completeness goal for 2006, which requires that at least 98 percent of scheduled samples be collected and analyzed. For air sampling, less than 0.5 percent of scheduled samples failed to meet the volume required for a valid sample, due to equipment malfunctions and power outages. For most sample types, 100 percent of samples were collected as scheduled.
Spike samples were used to test the accuracy of the laboratories performing analyses for the program. During 2006, samples of air, water, and milk were submitted to each of the analytical laboratories and analyzed for gross alpha/beta, tritium, gamma-emitting radionuclides, actinides, and 90Sr. Each laboratory also conducted an internal spike sample program using standards traceable to NIST.
Precision was measured using duplicate and split samples and laboratory recounts. In 2006, 98.6 percent of the results were within the criteria specified for these types of comparisons.
Both field blanks and laboratory blanks were used by the ESER contractor and analytical laboratories to detect the presence of contamination through the sampling and analysis process. No major problems were reported in 2006.
The INL contractor analytical laboratories analyzed all Surveillance Monitoring Program samples as specified in the statements of work. These laboratories participate in a variety of intercomparison quality assurance programs, which verify all the methods used to analyze environmental samples. The programs include the DOE MAPEP and the EPA National Center for Environmental Research (NCER) Quality Assurance Program. The laboratories met the performance objectives specified by the MAPEP and NCER.
The Surveillance Monitoring Program met its completeness and precision goals. Samples were collected and analyzed as planned from all available media. The Environmental Surveillance Program submitted duplicate, blank, and control samples as required with routine samples for analyses.
The ICP contractor analytical laboratories analyzed all Waste Management Surveillance Program samples as specified in the statements of work. These laboratories participate in a variety of intercomparison quality assurance programs, which verify all the methods used to analyze environmental samples. The programs include the DOE MAPEP and the EPA NCER Quality Assurance Program. The laboratories met the performance objectives specified by the MAPEP and NCER.
PE samples for soils, vegetation, and run-off water were submitted to the contract laboratory for analysis in March 2006 for the Waste Management Surveillance Program. PE sample results showed satisfactory agreement.
The Waste Management Surveillance Program met its completeness and precision goals. Samples were collected and analyzed as planned from all available media. The Waste Management Surveillance Program submitted duplicate and blank samples to the contract laboratory as required with routine samples for analyses. In 2006, the results for these samples were within the acceptable range.
American Society of Mechanical Engineers (ASME), 1989, “NQA-3-1989: Quality Assurance Requirements for the Collection of Scientific and Technical Information for Site Characterization of High-Level Nuclear Repositories, Supplement SW-1,” American National Standard; New York.
American Society of Mechanical Engineers (ASME), 2001, “NQA-1-2000: Quality Assurance Requirements for Nuclear Facility Applications, Part I,” American National Standard; New York.
U.S. Department of Energy (DOE), 2005, “Mixed Analyte Performance Evaluation Program,” http://www.inl.gov/resl/mapep/reports.html
U.S. Environmental Protection Agency (EPA), 1998, “EPA Guidance for Quality Assurance Project Plans,” EPA QA/G-5, Appendix B.