Quality assurance and quality control programs are maintained by contractors conducting environmental monitoring, and by laboratories performing environmental analyses to ensure precise, accurate, representative, and reliable results, and to maximize data completeness. Data reported in this document were obtained from several commercial, university, government, and government contractor laboratories. To assure quality results, the laboratories participate in a number of laboratory quality check programs.
All contractors conducting environmental monitoring programs maintain specific quality assurance/quality control objectives for data. These programs use a number of quality control samples, including duplicate samples, split samples, spike samples, and field blanks, to demonstrate that data are meeting the established objectives.
Quality assurance and quality control programs are maintained by contractors conducting environmental monitoring and by laboratories performing environmental analyses.
The purpose of a quality assurance and quality control program is to ensure precise, accurate, representative, and reliable results, and to maximize data completeness. Another key objective of a quality program is to ensure that data collected at different times are comparable to previously collected data. Elements of typical quality assurance programs are described in consensus standards and guidance documents (ASME 2001, ASME 1989, EPA 1998).
Data reported in this document were obtained from several commercial, university, government, and government contractor laboratories. In 2004, the Management and Operating (M&O) contractor used the Idaho National Engineering and Environmental Laboratory (INEEL) Radiological Measurements Laboratory (RML) and General Engineering Laboratories (GEL) for radiological and inorganic analyses. The M&O Drinking Water Program used GEL for radiological analysis, Microwise Laboratories (now Energy Laboratories) of Idaho Falls for inorganic and bacteriological analyses, and Environmental Health Laboratories (now Underwriters Laboratories) for inorganic and organic analyses.
The Environmental Surveillance, Education and Research Program (ESER) contractor used the Environmental Assessments Laboratory (EAL) located at Idaho State University (ISU) for gross radionuclide analyses (gross alpha, gross beta, and gamma spectrometry) and Severn-Trent Laboratories (STL) of Richland, Washington, for specific radionuclide analyses (e.g., strontium-90 [90Sr], americium-241 [241Am], plutonium-238 [238Pu], and plutonium-239/240 [239/240Pu]). The U.S. Department of Energy's (DOE's) Radiological and Environmental Sciences Laboratory (RESL) performed radiological analyses for the U.S. Geological Survey (USGS). The USGS National Water Quality Laboratory (NWQL) conducted nonradiological analyses. All these laboratories participated in a variety of programs to ensure the quality of their analytical data. Some of these programs are described below.
The Quality Assessment Program (QAP), administered by the DOE Environmental Measurements Laboratory (EML) in Brookhaven, New York, was a performance evaluation program that tested the ability of DOE contractor and subcontractor laboratories to perform environmental radiological analyses. The EML prepared samples containing known amounts of up to 15 radionuclides in four media: simulated air filters, soil, vegetation, and water. These were distributed to participating laboratories in March and September. Participants could use any method for each analysis and were required to report their results within 90 days. The EML issued quality assessment reports twice per year that presented the identities of participating laboratories, their results, and comparisons to EML values. These reports are available, along with a searchable database of results, on the Internet at http://www.eml.doe.gov/qap/reports/ (DOE 2004a). The QAP was discontinued following the March 2004 distribution.
The Mixed Analyte Performance Evaluation Program (MAPEP) (DOE 2004b) is administered by DOE's RESL. The DOE has mandated since 1994 that all laboratories performing analyses in support of the Office of Environmental Management shall participate in MAPEP. The program generally distributes samples of air, water, and soil for analysis during the first and third quarters. Both radiological and nonradiological constituents are included in the program. Results can be found at http://www.inel.gov/resl/mapep/reports.html (DOE 2004b).
Comparisons of the air and water results for the laboratories used by INEEL environmental monitoring organizations in 2004 are presented in Figure 10-1 and Figure 10-2. QAP results are from the June 2004 distribution, and MAPEP results are from the November 2004 distribution. All results for all laboratories were qualified as acceptable with the following exceptions. For the June air analysis, the DOE EML qualified the 238U and gross beta results from GEL as "acceptable with warning." For November, STL received a "not acceptable" rating for its actinides in air analyses.
For water results in the June QAP report, GEL and STL each received an "acceptable with warning" rating for 241Am. Severn-Trent also received a "not acceptable" rating for gross alpha and gross beta in water. For the November MAPEP report, Severn-Trent received a "not acceptable" rating for 241Am and 90Sr.
The DOE RESL participates in a traceability program administered through the National Institute of Standards and Technology (NIST). RESL prepares requested samples for analysis by NIST to confirm its ability to adequately prepare sample material to be classified as NIST traceable. NIST also prepares several alpha-, beta-, and gamma-emitting standards, generally in liquid media, for analysis by RESL to confirm its analytical capabilities. RESL maintained NIST certifications in both preparation and analysis in 2004.
To verify the quality of the environmental dosimetry program conducted by the M&O contractor and the ESER contractor, the Operational Dosimetry Unit participates in International Environmental Dosimeter Intercomparison Studies. The Operational Dosimetry Unit's past results have been within ±30 percent of the test exposure values on all intercomparisons. This is an acceptable value, consistent with other analyses that range from ±20 percent to ±35 percent. The International Environmental Dosimeter Intercomparison Study was not offered for participation during 2004.
The Operational Dosimetry Unit of the INEEL M&O contractor also conducts in-house quality assurance testing during monthly and quarterly environmental thermoluminescent dosimeter (TLD) processing periods. The quality assurance (QA) test dosimeters are prepared by a QA program administrator, and the delivered irradiation levels are blind to the TLD processing technician. The results of each QA test remained within the 20 percent acceptance criterion during every testing period in calendar year 2004; at no time did any test exceed ±10 percent.
INEEL contractors participate in additional performance evaluation programs, including those administered by the International Atomic Energy Agency, the U.S. Environmental Protection Agency (EPA), and the American Society for Testing and Materials. Contractors are required by law to use laboratories certified by the State of Idaho, or certified by another state whose certification is recognized by the State of Idaho, for drinking water analyses. The Idaho State Department of Environmental Quality oversees the certification program and maintains a listing of approved laboratories. Where possible (i.e., where the laboratory can perform the requested analysis), the contractors use such state-approved laboratories for all environmental monitoring analyses.
As a measure of the quality of data collected, the ESER contractor, the M&O contractor, the USGS, and other contractors performing monitoring use a variety of quality control samples of different media. Quality control samples include blind spike samples, duplicate samples, and split samples.
Groups performing environmental sampling use blind spikes to assess the accuracy of the laboratories selected for analysis. Contractors purchase samples spiked with known amounts of radionuclides or nonradioactive substances from suppliers whose spiking materials are traceable to NIST. These samples are then submitted to the laboratories along with regular field samples, using the same labeling and sample numbering system. The analytical results are expected to agree with the known values within a set of performance limits.
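As an illustration, a blind spike result can be checked by computing its percent recovery against the known, traceable value. The 80-120 percent acceptance window below is a hypothetical example; actual limits are set in each contractor's quality assurance plan, and the function names are illustrative.

```python
# Sketch of a blind spike accuracy check. The acceptance window
# (80-120 percent recovery) is an assumed, illustrative value.

def percent_recovery(reported: float, known: float) -> float:
    """Recovery of a blind spike: the reported result expressed as a
    percent of the known (traceable) spike value."""
    return 100.0 * reported / known

def spike_acceptable(reported: float, known: float,
                     lower: float = 80.0, upper: float = 120.0) -> bool:
    """True if the recovery falls within the acceptance window."""
    return lower <= percent_recovery(reported, known) <= upper

# Example: a spike with a known value of 500 pCi/L reported at 465 pCi/L
print(spike_acceptable(465.0, 500.0))  # → True (93 percent recovery)
```

A result outside the window (e.g., a reported 380 pCi/L, 76 percent recovery) would be flagged for follow-up with the laboratory.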
Monitoring organizations also collect a variety of quality control samples as a measure of the precision of sampling and analysis activities. One type is a duplicate sample, where two samples are taken from a single location at the same time. A second type is a split sample, where a single sample is taken and later divided into two portions that are analyzed separately. Contractors specify in quality assurance plans the relative differences expected to be achieved in reported results for both types of quality assurance samples.
Both the ESER contractor and the M&O contractor maintained duplicate air samplers at two locations during 2004. The ESER contractor operated duplicate samplers at the locations in Blackfoot and Mud Lake. The M&O contractor duplicate samplers were located at Argonne National Laboratory-West and at the Van Buren Boulevard Gate. Filters from these samplers were collected and analyzed in the same manner as filters from regular air samplers. Graphs of gross beta activity for the duplicate samplers are shown in Figure 10-3 and Figure 10-4. The figures show that duplicate sample results tracked each other well.
Another measure of data quality can be made by comparing data collected simultaneously by different organizations. The ESER contractor, the M&O contractor, and the State of Idaho's INEEL Oversight Program collected air monitoring data throughout 2004 at four common sampling locations: the distant locations of Craters of the Moon National Monument and Idaho Falls, and, on the INEEL, the Experimental Field Station and Van Buren Boulevard Gate. Gross beta data from these sampling locations compared favorably.
The ESER contractor collects semiannual samples of drinking and surface water jointly with the INEEL Oversight Program at five locations in the Magic Valley area and two shared locations near the INEEL. Table 10-1 contains intercomparison results of the gross alpha, gross beta, and tritium analyses for the 2004 samples taken from these locations. The paired results were statistically the same for 98 percent (40 of 41) of the comparisons made.
The USGS routinely collects groundwater samples simultaneously with the INEEL Oversight Program. Comparison results from this sampling are regularly documented in reports prepared by the two organizations.
The M&O contractor's Liquid Effluent Monitoring Program has specific quality assurance/quality control objectives for monitoring data. Goals are established for accuracy, precision, and completeness, and all analytical results are validated following standard EPA protocols. This section applies to all surveillance groundwater and effluent monitoring.
Performance evaluation samples (submitted as field blind spikes) are required to assess analytical data accuracy. At a minimum, performance evaluation samples are required quarterly.
During 2004, five sets of performance evaluation samples were submitted to the laboratory along with routine monitoring samples. With the exception of total Kjeldahl nitrogen (TKN), no blind spike parameters routinely missed the performance acceptance limits. Of the five field blind spikes submitted for TKN, four were below the lower performance acceptance limit (the laboratory value was less than the true value). For blind spike results below the lower performance acceptance limit, the concern is that all the reported concentrations associated with that blind spike could be biased in the same direction, which could result in an unreported permit limit exceedance. For blind spike results above the upper performance acceptance limit, the concern is that the associated reported concentrations could again be biased in the same direction, which could create the appearance of a permit limit exceedance when none has occurred. The contract laboratory was contacted in November 2004 regarding the TKN results that fell outside the performance acceptance limits. The laboratory appears to have resolved the issue, because the subsequent results (the most recent, from December 2004) were well within the performance acceptance limits. Blind spikes will continue to be submitted regularly to ensure laboratory performance.
Relative percent difference (RPD) between the duplicate samples is used to assess data precision. Table 10-2 shows the results for 2004.
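The RPD statistic itself is simple: the absolute difference between the two results, expressed as a percent of the pair's mean. A minimal sketch (the function name is illustrative):

```python
# Relative percent difference between a sample and its duplicate,
# as commonly defined for duplicate-pair precision checks.

def rpd(a: float, b: float) -> float:
    """RPD: absolute difference as a percent of the pair mean."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

# Example duplicate pair: 12.0 and 10.0 mg/L
print(round(rpd(12.0, 10.0), 1))  # → 18.2
```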
The goal for completeness is to collect 100 percent of all required compliance samples. During 2004, this goal was met.
Validation performed on analytical results from the 2004 sampling efforts resulted in one rejected sample. The January total suspended solids result for CPP-773 (Idaho Nuclear Technology and Engineering Center [INTEC] effluent) was rejected for exceeding the hold time.
In addition, two biochemical oxygen demand (BOD) results were not reported. Although October BOD samples for Central Facilities Area (CFA) influent and effluent were collected and delivered to the analytical laboratory, the laboratory did not report the results because an analyst overlooked the end of the 5-day incubation period, which produced erratic sample results.
No other sampling or validation issues were identified during Calendar Year 2004.
The groundwater sampling activities associated with Wastewater Land Application Permit compliance sampling follow established procedures and analytical methodologies.
During 2004, groundwater samples were collected from all of the INTEC and Test Area North (TAN) Wastewater Land Application Permit monitoring wells (with the exception of perched well ICPP-MON-V-191, which was dry during both April 2004 and October 2004). All of the samples required for permit compliance were collected. Some of the 2004 analytical results were rejected as unusable during data validation because of quality control issues. The quality control issues were with the April coliform results from all wells, the April TKN result from one well, and some of the October metals results from several wells. Because all of the April coliform results were rejected, the impacted wells were resampled for coliform in July, and none of the July sample results were rejected. All other rejected results were attributed to either matrix spike or matrix spike duplicate recovery problems, both of which could be an issue with the analytical laboratory or could be the result of interference in the sample matrix, which is outside the laboratory's control. If this continues to be a problem, blind samples could be submitted to the laboratory for matrix spike analyses.
Field quality control samples were collected or prepared during the sampling activity in addition to regular groundwater samples. Laboratories qualified by the INEEL Sample and Analysis Management Organization performed all M&O groundwater analyses during 2004. Because TAN and INTEC are regarded as separate sites, quality control samples (duplicate samples, field blanks, and equipment blanks) were prepared for each site.
Duplicate samples are collected to assess the potential for any bias introduced by analytical laboratories. One duplicate groundwater sample was collected for every 20 samples collected or, at a minimum, five percent of the total number of samples collected. Duplicates were collected using the same sampling techniques and preservation requirements as regular groundwater samples. Duplicates have precision goals within 35 percent as determined by the relative percent difference measured between the paired samples. In 2004, for the 68 duplicate pairs with detectable results, 92 percent had RPDs less than 35 percent. This high percentage of acceptable duplicate results indicates little problem with laboratory contamination and good overall precision.
Field blanks are collected to assess the potential introduction of contaminants during sampling activities. They were collected at the same frequency as the duplicate samples. Results from the field blanks did not indicate field contamination.
Equipment blanks (rinsates) were collected to assess the potential introduction of contaminants from decontamination activities. They were collected by pouring analyte-free water through the sample port manifold after decontamination and before subsequent use. Again, results from the equipment blanks did not indicate improper decontamination procedures.
Performance evaluation (PE) samples for fecal and total coliform were submitted to the contract laboratory in July and October of 2004. The PE coliform samples either had a certified value of <1 colony/100 mL or had a QC performance acceptance limit range associated with a certified result (reported in colonies/100 mL). For four total coliform and two fecal coliform PE samples submitted in July of 2004, the laboratory reported the results as too numerous to count (TNTC). While the methods for coliform analyses allow the laboratory to report results as TNTC when the count is greater than 200 colonies/100 mL, this made it impossible to compare the results of these six samples to the QC performance acceptance limits. The laboratory was contacted and asked to report the remaining July 2004 PE sample results in colonies/100 mL. The remaining four PE samples were analyzed in July 2004 for fecal coliform. These four samples were within the QC performance acceptance limits.
During the October 2004 groundwater sampling event, four PE samples were analyzed for total coliform, and four PE samples were analyzed for fecal coliform. Only one sample (total coliform) did not meet the QC performance acceptance limit or <1 colony/100 mL criterion. The result reported by the laboratory fell below the acceptable range. Additional PE samples for coliform analyses will be submitted in calendar year 2005 to ensure the laboratory meets the performance standards.
Results from the duplicate, field blank, and equipment blank (rinsate) samples indicate that laboratory procedures, field sampling procedures, and decontamination procedures were used effectively to produce high quality data.
The two samples collected at the Radioactive Waste Management Complex and the two samples collected at the T-28 north gravel pit were collected as unfiltered grab samples. No trip blanks or duplicate samples were collected. Sample containers and preservation methods were used according to internal procedures. The data were reviewed according to internal procedures.
Visual examination reports were checked for accuracy against logbook entries before submittal to the industrial storm water coordinator.
The Drinking Water Program's completeness goal is to collect, analyze, and verify 100 percent of all compliance samples. This goal was met during 2004.
The Drinking Water Program requires that 10 percent of the samples (excluding bacteria) collected be quality assurance/quality control samples to include duplicates, field blanks, trip blanks, blind spikes, and splits. This goal was met in 2004 for all parameters.
The Drinking Water Program's precision goal states that the relative percent difference determined from duplicates must be 35 percent or less for 90 percent of all duplicates. That goal was met for 2004: the relative percent difference was less than the required 35 percent for 97 percent of all duplicates (for those with both results detected). Relative percent difference was not calculated if either the sample or its duplicate was reported as a nondetect.
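The precision check described above, including the exclusion of nondetect pairs, can be sketched as follows. The data, function names, and the use of None to mark a nondetect are illustrative assumptions, not the program's actual data handling.

```python
# Hedged sketch of a duplicate-pair precision goal check: RPD is
# computed only for pairs where both results are detected, and the
# goal requires RPD <= 35 percent for at least 90 percent of pairs.
# Data and structure are illustrative.

def rpd(a: float, b: float) -> float:
    """Relative percent difference between a sample and its duplicate."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

def precision_goal_met(pairs, limit=35.0, required_fraction=0.90):
    """pairs: list of (sample, duplicate) tuples; None marks a nondetect,
    which is excluded from the calculation."""
    detected = [(a, b) for a, b in pairs if a is not None and b is not None]
    if not detected:
        return True  # nothing to evaluate
    passing = sum(1 for a, b in detected if rpd(a, b) <= limit)
    return passing / len(detected) >= required_fraction

# Illustrative data: 11 pairs, one with a nondetect duplicate.
pairs = [(12.0, 10.0), (5.0, None), (8.0, 8.5), (100.0, 160.0), (3.0, 3.1),
         (7.0, 7.2), (50.0, 52.0), (9.0, 9.3), (20.0, 21.0), (40.0, 41.0),
         (6.0, 6.1)]
print(precision_goal_met(pairs))  # → True (9 of 10 detected pairs pass)
```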
The ESER program met its completeness goal for 2004, which requires that at least 98 percent of scheduled samples be collected and analyzed. For air sampling, less than 1.2 percent of scheduled samples failed to meet the volume required to be considered valid, because of equipment malfunctions and power outages. For most sample types, 100 percent of samples were collected as scheduled.
Spike samples were used to test the accuracy of the laboratories performing analyses for the program. During 2004, samples of air, water, milk, and soil were submitted to each of the analytical laboratories and analyzed for gross alpha/beta, tritium, gamma-emitting radionuclides, actinides, and 90Sr. Each laboratory also conducted an internal spike sample program using standards traceable to NIST.
Precision was measured using duplicate and split samples and laboratory recounts. In 2004, over 97 percent of the results were within the criteria specified for these types of comparisons.
Both field blanks and laboratory blanks were used by the ESER contractor and analytical laboratories to detect the presence of contamination through the sampling and analysis process. No major problems were reported in 2004.
The M&O contractor analytical laboratories analyzed all Environmental Surveillance Program samples as specified in the statements of work. These laboratories participate in a variety of intercomparison quality assurance programs, which verify all the methods used to analyze environmental samples. The programs include the DOE MAPEP and the EPA National Center for Environmental Research (NCER) Quality Assurance Program. The laboratories met the performance objectives specified by the MAPEP and NCER.
The Environmental Surveillance Program met its completeness and precision goals. Samples were collected and analyzed as planned from all available media. The Waste Management Surveillance Program submitted duplicate, blank, and control samples as required with routine samples for analyses.
PE samples were submitted for soils and vegetation and results received met all of the agreement criteria.
PE samples were also submitted for both 2-in. and 4-in. air samples. The trip blank from the 2-in. third quarter composites indicated 90Sr at greater than 3 sigma error, with similar levels on most of the samples in that batch. The batch was reanalyzed, and the results did not confirm the original data. The second set of analytical results is being reported; however, the contract laboratory has continued to show poor performance on 90Sr in air and other media.
PE samples were submitted to the contract laboratory for analysis in June 2004 (second quarter) and February 2005 (fourth quarter) for both waste management and site surveillance programs. For the PE samples submitted for analysis with the second quarter composites, the laboratory met the required agreement criteria on all nuclides using gamma spectrometry. For the fourth quarter PE samples, the laboratory met the required agreement criteria on all nuclides using gamma spectrometry with the exception of cobalt-60 (60Co) on one 4-in. sample. The laboratory result was biased high (141 percent). Two other PE samples showed agreement with the known activity for 60Co. Radiochemical analytical results showed warnings for 90Sr on all three PE samples submitted. Two of the 90Sr results were biased high, and one was biased low. All other radiochemical results on the PE samples showed satisfactory agreement.
Based on the results of these PE samples, 90Sr and 60Co results may be biased high; all three-sigma results are being reported. The M&O contractor will submit additional PE samples in calendar year 2005 to monitor the contract laboratory's performance.
American Society of Mechanical Engineers (ASME), 1989, "NQA-3-1989: Quality Assurance Requirements for the Collection of Scientific and Technical Information for Site Characterization of High-Level Nuclear Repositories, Supplement SW-1," American National Standard; New York.
American Society of Mechanical Engineers (ASME), 2001, "NQA-1-2000: Quality Assurance Requirements for Nuclear Facility Applications, Part I," American National Standard; New York.
U.S. Department of Energy (DOE), 2004a, Environmental Measurements Laboratory, Quality Assurance Program, http://www.eml.doe.gov/qap/reports/ .
U.S. Department of Energy (DOE), 2004b, Mixed Analyte Performance Evaluation Program, http://www.inel.gov/resl/mapep/reports.html .
U.S. Environmental Protection Agency (EPA), 1998, EPA QA/G-5, EPA Guidance for Quality Assurance Project Plans, Appendix B.