Section 10

Data Quality Assurance and Quality Control
Interim Final – December 31, 2008

Section 10.0 Introduction

Section 10.1 Project Specific Quality Assurance Project Plan (QAPP) Requirements

Section 10.2 Quality Assurance Objectives
10.2.1 Data Quality Objectives
10.2.2 QA Objectives
10.2.3 Quantitation Limits

Section 10.3 Data Quality Assurance Procedures
10.3.1 Precision and Accuracy
10.3.2 Representativeness
10.3.3 Completeness
10.3.4 Comparability

Section 10.4 Quality Control

Section 10.5 Field Equipment and Laboratory Instrument Calibration
10.5.1 Field Equipment Calibration
10.5.2 Laboratory Equipment Calibration

Section 10.6 Field QA/QC
10.6.1 Field Replicates
10.6.1.1 Discrete Sampling Replicates
10.6.1.2 Multi-Increment Soil Sampling Replicates
10.6.2 Blanks
10.6.2.1 Trip Blanks
10.6.2.2 Field Equipment Rinsate Blanks
10.6.2.3 Field Source Blank
10.6.3 Documentation
10.6.4 Chain of Custody

Section 10.7 Laboratory QA/QC
10.7.1 Method Blanks
10.7.2 Laboratory Control Samples
10.7.3 Matrix Spikes
10.7.4 Matrix Cleanup
10.7.5 Surrogates
10.7.6 Laboratory Sub-sampling Replicates
10.7.7 QA/QC Reports

Section 10.8 Corrective Action

Tables
10-1 Recommended QC Sample Frequency

10.0 DATA QUALITY ASSURANCE AND QUALITY CONTROL

The State of Hawaiʻi Department of Health (HDOH) Hazard Evaluation and Emergency Response Office (HEER Office) regards the quality of data as crucial to proper site characterization and evaluation of potential environmental hazards at a site. Data must be of sufficient quality to ensure that the overall site assessment objectives are met using the systematic planning approach (see Section 3). Ensuring that data are of sufficient quality begins during the initial planning and development of site investigation objectives and continues throughout the investigation to the final assessment of data quality. The HEER Office also regards evaluation of the suitability and usability of data as essential during both the site investigation and data quality documentation process.

The quality assurance and quality control (QA/QC) process should also be viewed as an approach to:

  1. Ensure that site characterization data are adequate to accurately define site impacts and evaluate potential environmental hazards
  2. Maximize the potential that any remedial actions at a site will be correctly selected
  3. Ensure that site management decisions are arrived at with the correct information

Not devoting proper time and resources to QA/QC at any stage of investigation may result in uncertainty as to whether conclusions or actions are sufficiently protective of human health and the environment; this deficiency may be compounded by the additional time and resources needed to repeat inadequate sampling or analysis.

10.1 PROJECT SPECIFIC QUALITY ASSURANCE PROJECT PLAN (QAPP) REQUIREMENTS

The Quality Assurance Project Plan (QAPP) is the formal project document that specifies the operational procedures and QA/QC requirements for obtaining environmental data of sufficient quantity and quality to satisfy site investigation objectives (see also Subsections 3.6 and 3.7). The QAPP is required for all data collection activities that generate data for use in decision-making. It contains information on project management, measurement and data acquisition, assessment and oversight, and data validation and usability. The QAPP integrates the Data Quality Objectives (DQO), the data collection design, and QA/QC procedures into a coherent plan to be used for collecting data of known quality and adequate for their intended use. The QAPP is typically presented as part of the Sampling and Analysis Plan (SAP) (Step 6 of systematic planning – See Subsection 3.2) and should include the following elements:

  1. Quality assurance (QA) objectives for measurement
  2. Sample chain of custody
  3. Calibration procedures
  4. Analytical methods
  5. Data reduction, validation, and reporting
  6. Internal quality control (field and laboratory checks)
  7. Performance and system audits
  8. Preventative maintenance
  9. Data measurement assessment procedures (precision, accuracy, representativeness, completeness, and comparability)
  10. Corrective actions

In addition, the QAPP should identify the contaminants of concern known or suspected to be present at the sampling location, the HEER Office Tier 1 Environmental Action Levels (EALs) (or other pertinent screening criteria) for those contaminants, and the quantitation limits needed to assess the EALs. The project-specific QAPP will also provide the required quantitation limits for these analytes in various matrices based upon their concentrations of concern. See Subsection 10.2.3 for additional information on quantitation limits.

More detailed information regarding the outline, format, and required content of the QAPP is presented in Section 18.

10.2 QUALITY ASSURANCE OBJECTIVES

QA objectives and procedures are included in the sampling strategy to assess and evaluate a wide variety of concerns from sample collection through laboratory analysis. Defining QA/QC requirements is integral to the DQO process and is detailed in Section 3.0. The DQO must be developed well before any sampling or analysis and must be clearly defined in the SAP and QAPP for each project. QA objectives and procedures must also be defined in the SAP and the QAPP and will depend on the results of the project specific DQO processes.

10.2.1 DATA QUALITY OBJECTIVES

Data Quality Objectives are discussed in detail in Section 3.0 and summarized in this subsection. Environmental data must be of the appropriate type, quantity, and quality to manage uncertainty and reach a defensible decision on appropriate response actions. To ensure that data obtained during a site investigation are adequate to identify or negate the presence of potential environmental hazards, the HEER Office recommends that the site investigation be developed using a systematic planning approach. This approach emphasizes using straightforward, clear questions to design and guide the site investigation. Consultation with the laboratory being utilized for sample analysis is also important to ensure the DQO can be met within the laboratory's capabilities.

Systematic planning involves a series of well-thought-out steps that help ensure the investigation results are adequate to characterize potential environmental hazards posed by contamination and ultimately provide sufficient information to develop appropriate response actions. The recommended steps of the systematic planning approach, presented as Figure 3-1 in Section 3.0, function to establish the DQO for a site investigation.

DQO are established based on the expected end use of the data. For example, the data needed to perform a preliminary site screening assessment will differ significantly from the data needed to fully characterize a site and select an appropriate response action. DQO and the systematic planning approach in general are essential to developing a cost effective site investigation because they assure that resources devoted to sampling and analysis are not wasted on unnecessary or unreliable data.

10.2.2 QA OBJECTIVES

QA objectives must be specified in the project specific QAPP. For each sample matrix and environmental measurement type, define QA objectives in terms of the following information:

  • Types of quality control (QC) samples and measurements involved
  • Frequency of collection and analysis of QC samples and measurements
  • How the QA objective is measured
  • Acceptance criteria or QC limits for that measurement

For example, for soil samples analyzed for semivolatile organic compounds (SVOC), a project-specific QAPP might specify that the precision will be measured as the relative percent difference (RPD) between the results of matrix spike (MS) and matrix spike duplicate (MSD) samples. The QAPP might further specify that MS/MSD samples will be collected at a frequency of 1 MS/MSD sample for every 20 environmental samples, and that the QC limit for RPD is 20 percent for all spiking compounds.

Analytical data must be evaluated for compliance with QC limits. Typically, when analytical data do not meet the QC limits, corrective action must be initiated or the data will be qualified or rejected. Corrective action may include stopping the analysis; examining instrument performance, sample preparation, and analysis information; recalibrating instruments; and re-preparing and reanalyzing samples. Examples of QC results indicating that corrective action may be necessary are provided in Subsection 10.8.

10.2.3 QUANTITATION LIMITS

A crucial QA objective is sensitivity, which is generally expressed in the form of the method quantitation limit(s) (also commonly referred to as ‘reporting limits’) for the analytical method(s) selected. The concentrations of concern will be based on risk-based criteria, regulatory limits, and other similar guidelines. In Hawaiʻi, the default screening criteria on which to base quantitation limits are the HDOH HEER Office Tier 1 Environmental Action Levels (EALs) (HDOH, 2016).

Quantitation limits reflect the influences of the sample matrix on method sensitivity and are typically higher than detection limits. Quantitation limits provide a reliable indication of the amount of material needed to produce an instrument response that can be routinely identified and reliably quantified when applying a particular analytical method to real environmental samples.

The HEER Office requires analytical methods with sensitivities appropriate to the intended data use. Whenever possible, analytical methods should be specified such that matrix-specific reporting limits are lower than any contaminant concentrations of concern. In the event that the laboratory cannot achieve reporting limits below the screening criteria, the investigation team should first contact the HEER Office and present the proposed alternative laboratory reporting limit for the Chemical of Potential Concern (COPC) in a SAP or QAPP. Advance concurrence from the HEER Office for use of reporting limits above a relevant EAL must be obtained prior to initiation of field sampling. As part of the process for obtaining concurrence, the HEER Office will require that the investigation team document the proposed reporting limits and provide the evidence, rationale, and justification for using them.

10.3 DATA QUALITY ASSURANCE PROCEDURES

Table 10-1 Recommended QC Sample Frequency

QC Type | QC Sample | Default Frequency (1)
Field QC | Soil replicates/triplicates | Depends on the number of Decision Units (DU), COPCs, and site characteristics. See Subsection 4.2.3 regarding field replicates (triplicates for MIS).
Field QC | Groundwater duplicates | 1 per day for every 10 samples
Field QC | Equipment rinsate blank | Not required routinely when effective decontamination protocols are documented in the SAP. When required (e.g., investigations for trace levels), 1 per day per type of non-disposable sampling equipment
Field QC | Trip blanks | 1 per shipping container containing volatile samples
Field QC | Source blanks | 1 per water source per investigation, if used to decontaminate equipment for re-use
Laboratory QC | Method blanks | 1 per every 20 samples
Laboratory QC | Sub-sampling replicates | 1 per every 20 samples for soil analyses of non-volatile contaminants (triplicates preferred)
Laboratory QC | MS/MSD percent recovery | 1 per every 20 samples
Laboratory QC | LCS/LCSD or blank spikes percent recovery | 1 per every 20 samples
Laboratory QC | Surrogate standard percent recovery | Every sample for organic analysis by gas chromatography
Notes:
LCS/LCSD = Laboratory Control Sample/Laboratory Control Sample Duplicate
MS/MSD = Matrix Spike/Matrix Spike Duplicate
MIS = Multi-Increment sample
(1) Based on HEER Office guidance and SW-846 Method 8000C guidance (USEPA, 2003a) pertaining to laboratory QC.

Implementing QA/QC procedures from start to finish in an investigation helps assure data that are usable and will meet and support the DQO. Procedures for Data Quality Assurance are presented within this subsection. Specifically, QA/QC parameters for precision, accuracy, representativeness, completeness, and comparability (commonly referred to as the “PARCC parameters”) must be evaluated. The parameters of precision, accuracy, and completeness are quantitative measures, while representativeness and comparability are largely qualitative.

10.3.1 PRECISION AND ACCURACY

Precision and accuracy are evaluated quantitatively by collecting the types of QC samples listed in Table 10-1. While these QC samples are primarily intended for evaluation of precision and accuracy, the results are also used as necessary information for evaluating the other quality parameters.

The default (preferred) frequencies for these QC samples are listed in Table 10-1; however, different project-specific frequencies may be proposed to best meet project DQO. If proposing different QC sampling frequencies for a specific investigation, the proposed QC sampling program and the rationale should be presented in detail in the project-specific SAP or QAPP and should receive approval from the HEER Office prior to field investigation. More detailed descriptions of the individual types of QC samples and the modes of collection and handling are presented in Subsections 10.6 and 10.7.

Precision

Precision is the degree of mutual agreement between individual measurements of the same property under similar conditions. For soil samples, combined field and laboratory precision is typically evaluated by collecting and analyzing field triplicates and then calculating the variability among the replicate results as a Relative Standard Deviation (RSD), expressed as a percent:

RSD (%) = (standard deviation of the replicate results / mean of the replicate results) × 100

Groundwater field duplicates are evaluated by determining an RPD for the duplicate pair, using the RPD formula noted below for laboratory MS/MSD precision determinations.

Laboratory analytical precision is evaluated by analyzing laboratory duplicates or MS and MSD, typically utilizing the following formula:

RPD (%) = |A − B| / [(A + B) / 2] × 100
where:
A = First duplicate concentration
B = Second duplicate concentration

The results of the analysis of each MS/MSD and sample duplicate pair are used to calculate an RPD for evaluating precision (USEPA, 2003a). Default QC limits, such as the 20 percent RPD goal noted in Subsection 10.7.2, may be used by laboratories until they develop in-house QC limits for each method, in accordance with the guidelines established in SW-846 (USEPA, 2008a).

Laboratory sub-sampling poses the greatest potential for error in soil sample analyses for non-volatile contaminants; therefore, the HEER Office recommends laboratories perform triplicate sub-sampling analyses from at least one in every 20 of these soil samples (original sub-sample plus two additional sub-sample replicates collected independently from the entire mass of soil in the sample). Laboratory sub-sampling precision is typically calculated as RSD percent (for triplicates or more). The lab sub-sampling precision measure is also helpful to compare the degree of lab sub-sampling and analysis error to the total error (i.e. the field replicate precision data representing total error from field sampling plus lab sub-sampling and analysis). Soil sub-sample replicates (as well as sub-samples for any other soil analyses for non-volatiles) are collected by the laboratory from the entire mass of available sample using a sectorial splitter or by hand Multi-Increment sampling, as described in Subsection 4.2.2. This laboratory sub-sampling QC guidance applies to soil samples collected by Multi-Increment or discrete sampling approaches.
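The two precision measures described above can be computed directly from replicate results. The following minimal Python sketch illustrates the RSD and RPD calculations; the concentrations shown are hypothetical and the function names are illustrative only:

```python
import statistics

def rsd_percent(results):
    """Relative Standard Deviation (%) of replicate results:
    sample standard deviation divided by the mean, times 100."""
    return statistics.stdev(results) / statistics.mean(results) * 100

def rpd_percent(a, b):
    """Relative Percent Difference (%) between duplicate results A and B:
    the absolute difference divided by the average of the two, times 100."""
    return abs(a - b) / ((a + b) / 2) * 100

# Hypothetical field triplicate results for one decision unit (mg/kg)
print(round(rsd_percent([12.0, 15.5, 10.8]), 1))   # ~19.1% RSD

# Hypothetical MS/MSD (or groundwater duplicate) pair (ug/L)
print(round(rpd_percent(48.0, 55.0), 1))           # ~13.6% RPD
```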

Accuracy

Laboratory accuracy is evaluated primarily through sample spiking. This includes analysis of MS and MSD samples, laboratory control samples (LCS) and laboratory control sample duplicates (LCSD) or blank spikes, and surrogate standards, supplemented by method blanks. MS and MSD samples will be prepared and analyzed at a frequency of 5 percent. LCS or blank spikes are also analyzed at a frequency of 5 percent. Surrogate standards, where available, are added to every sample analyzed for organic constituents. The results of the spiked samples are used to calculate the percent recovery for evaluating accuracy (USEPA, 2003a):

Percent Recovery (%) = [(S − C) / T] × 100
where:
S = Measured spike sample concentration
C = Sample concentration
T = True or actual concentration of the spike

Results that fall outside the project-specific accuracy goals will be further evaluated on the basis of the results of other QC samples. Table 10-1 summarizes recommended default frequencies for QC sample types. Example default precision and accuracy goals for laboratory analyses are described in Subsection 10.7.
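As an illustration, the percent recovery calculation for a matrix spike might look like the following minimal sketch; the concentrations are hypothetical and the names are illustrative only:

```python
def percent_recovery(spiked_result, sample_result, spike_added):
    """Percent recovery = 100 * (S - C) / T, where S is the measured spiked-sample
    concentration, C the native (unspiked) sample concentration, and T the true
    concentration of the spike added."""
    return 100 * (spiked_result - sample_result) / spike_added

# Hypothetical matrix spike: native sample at 2.0 mg/kg, spiked with 10.0 mg/kg,
# spiked sample measured at 11.1 mg/kg
print(round(percent_recovery(11.1, 2.0, 10.0)))   # 91 percent recovery
```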

10.3.2 REPRESENTATIVENESS

Representativeness is a qualitative measure that expresses the degree to which field data accurately and precisely represents a characteristic of a population, parameter variations at a sampling point, process condition, or environmental condition. For purposes of environmental investigation, representativeness is how well the media (e.g., soil) sampled represents impact (i.e., contamination) at the site. In the initial planning stages of an investigation, representativeness of data collected is first ensured by proper sampling design. Project planners account for the difficulty in knowing when, where, and how to collect representative samples by developing a statistical or random sampling approach; collecting adequate numbers of increments or samples to determine a representative average COPC concentration in each decision unit; collecting samples at several different phases of natural or anthropogenic cycles; sampling at different locations within the project area; collecting Multi-Increment samples as opposed to grab samples; and verifying and validating the sampling techniques. The general strategies for ensuring representativeness are described in Section 3. The specific strategy used by the investigation team for each site is to be documented in detail in the project-specific QAPP or SAP.

One measurement of representativeness is the degree to which implementation of the sampling program has ensured that results reflect the site contaminant conditions and not outside impacts related to analytical preparation, field sampling, field decontamination, sample handling, sample shipping and other aspects of field investigation. The degree to which the sampling strategy has achieved representativeness can be measured as a qualitative parameter based on the proper implementation of the sampling program and laboratory analytical program (i.e., the QA/QC program set out in the QAPP). The results of field QC samples (i.e., replicates, trip blanks, field source blanks, or equipment blanks) may indicate that compounds have been introduced into the samples, possibly to an extent that would affect representativeness of the overall investigation.

Representativeness may also be measured by how well samples were delivered to the analytical laboratory within the holding times and at the holding temperatures prescribed for the individual analyses. Potential impacts to data quality measured by the QA/QC methods include (but are not limited to) the following:

  • Insufficient quantity or cleanliness of sample collection containers, materials, or preservatives provided by the analytical laboratory prior to field work, which can introduce outside contaminants into the analytical process
  • Impurities detected in final decontamination rinse water that may not have originated from the site
  • Contaminants originating from exposure during transport of samples from the field to the analytical laboratory
  • Sample transport where delivery time to the laboratory exceeds holding time or sample temperature exceeds allowable temperature limits. Occurrence of either may indicate loss of contaminants during transport prior to extraction and analysis

Representativeness should be assessed for each matrix (media) and for each COPC. In addition to trip blanks for sites with volatile organics sampling (see Subsection 10.6.2.1) or equipment rinsate blanks and field source blanks (as described in Subsections 10.6.2.2 and 10.6.2.3), the following field QC procedures are used in evaluating representativeness:

  • Temperature measurement, usually of the samples themselves and sometimes via separate temperature blanks. These blanks are containers of analyte-free water included with the field samples, handled and transported in the same manner, and measured for temperature upon delivery to the analytical laboratory. Trip blanks sometimes double as temperature blanks
  • Chain-of-custody forms that document date and time of sampling and sample preservation for each sample

If analyses of field QC blank samples result in detected contaminants, the field procedures for decontamination, sample handling, and sample transport should be evaluated for how well procedures were followed, for any potential introduction of contaminants from outside sources, or for potential losses in the course of sample handling or transport.

10.3.3 COMPLETENESS

Completeness is a measure of the percentage of data that are valid. Data validation is performed by evaluating field and laboratory QC analyses combined with field QC logs and chain-of-custody form information to determine how well field samples were collected and analyzed in accordance with QC procedures outlined in the QAPP. Field analytical data are acceptable if log and Chain-of-Custody (COC) information show that field QC procedures were properly followed, no significant levels of analytes are detected in QC blank analyses, and none of the QC objectives that affect data usability are exceeded. Data validation is also performed to determine when data should be rejected or declared unusable due to improper field QC, detection of analytes in blanks, or laboratory QC limit exceedances. Completeness will also be evaluated as part of the data quality assessment process. This evaluation will help determine whether any limitations are associated with the decisions to be made based on the data collected.

Completeness is a percentage value, calculated to determine if an acceptable amount of usable data was obtained so that a valid scientific site assessment may be completed. The QAPP should present completeness goals (e.g., commonly 95%) to evaluate the degree of completeness. Percent completeness is calculated using the following equation:

%C = [(T − R) / T] × 100
where:
%C = percent completeness
T = total number of sample results
R = total number of rejected sample results

Completeness should, at a minimum, be determined for all field analytical results by method, and should also be evaluated by comparing the number of usable results obtained against the number of samples planned for each method and sample matrix.
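As an illustration of the completeness calculation, a minimal sketch with hypothetical counts:

```python
def percent_completeness(total_results, rejected_results):
    """%C = 100 * (T - R) / T, where T is the total number of sample results
    and R is the number of results rejected during data validation."""
    return 100 * (total_results - rejected_results) / total_results

# Hypothetical example: 2 of 42 results rejected for one method/matrix
pc = percent_completeness(42, 2)
print(round(pc, 1), pc >= 95)   # 95.2 True -> meets a 95% completeness goal
```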

10.3.4 COMPARABILITY

Comparability is a qualitative parameter that expresses the confidence with which one data set can be compared with another. It is important that data sets be comparable if they are used in conjunction with other data sets. This type of comparison arises most commonly in (but is not limited to) the following scenarios:

  • Data from the same site but collected during different investigations.
  • Data from the same site but collected during widely separated time-frames.
  • Data from the same site and investigation, but analyzed by different laboratories.

Comparability of data can be achieved by consistently following standard field and laboratory procedures and by using standard measurement units in reporting analytical data. The factors affecting comparability include sample collection and handling techniques, matrix type, and analytical method. If these aspects of sampling and analysis are carried out according to standard analytical procedures and the procedures are implemented properly, the data may be considered comparable. Comparability is also dependent upon other quality criteria, because only when precision, accuracy, and representativeness are known may data sets be compared with confidence. In some cases, additional care must be taken to evaluate comparability. For instance, groundwater samples handled in the exact same fashion, collected within the same sampling event, and analyzed by the same analytical method may not be directly comparable if one sample was filtered and the other was not.

10.4 QUALITY CONTROL

Field and laboratory QC samples and measurements must be used to verify that analytical data meet project-specific QA objectives. Field QC samples and measurements are used to assess how the sampling activities and measurements influence data quality. Similarly, laboratory QC samples are used to assess how a laboratory’s analytical program influences data quality. How well a laboratory’s QC program is set up, its past performance in implementing that program, and how well QC goals have been met also play a critical role in laboratory selection. The project-specific QAPP will provide a description of QC samples to be analyzed during the investigation for (1) each field and laboratory environmental measurement method and (2) each sample matrix type.

All laboratories that perform analytical work for investigations performed by or reviewed by the HEER Office must adhere to a QA program that is used to monitor and control all laboratory QC activities. Each laboratory must have a written QA manual that describes the QA program in detail. The laboratory QA manager is responsible for ensuring that all laboratory internal QC checks are conducted according to the laboratory’s QA manual, and the requirements included within a project-specific QAPP or SAP. The most common (and default for projects conducted and reviewed by the HEER Office) QA/QC procedures are those outlined in the United States Environmental Protection Agency’s (USEPA’s) publication entitled: “Test Methods for Evaluating Solid Waste, SW-846” (USEPA, 2008a), and laboratory standard operating procedures (SOPs). Investigators should consult the USEPA SW-846 website under the following circumstances:

  • During project planning to determine the most recent edition of any analytical method or SOP to cite in the work plan or QAPP.
  • During selection of the analytical laboratory to ensure that they are employing the most recent method or SOP cited in the SAP or QAPP.
  • If the investigation is utilizing multiple analytical laboratories.
  • When significant time has elapsed between project planning and field investigation or between stages of field investigation to ensure that a previously cited or utilized method or SOP has not been modified or superseded.

For particulate (e.g., soil or sediment) samples, laboratories should follow the USEPA lab sub-sampling guidance (USEPA, 2003b) to ensure that representative lab sub-samples are obtained for subsequent analysis.

Many of the laboratory QC procedures and requirements are described in USEPA-approved analytical methods, laboratory method SOPs, and method guidance documents. If, however, laboratory QC requirements are not specified in an analytical method, or if additional requirements beyond those included in an analytical method are necessary to ensure that project QA objectives and DQO are met, the project-specific QAPP should identify the additional laboratory QC checks to be performed. The following types of information should be included:

  • Laboratory analytical method(s) to which the internal QC check applies
  • Complete procedures for conducting the internal QC check
  • QC samples and QC measurements involved in the internal QC check
  • Complete collection and preparation procedures for the QC samples
  • Spiking analytes and concentrations
  • Control limits for the internal QC check
  • Corrective action procedures to be followed if the internal QC check is not done properly or results are outside control limits. Description of example instances that may require corrective action is presented in Subsection 10.8.

Laboratory QC procedures and requirements may include the preparation and analysis of sub-sampling replicates, method blanks, LCS, surrogate spikes, matrix duplicates, MS and MSD samples, and standard reference materials or independent check standards. Subsections 10.6 and 10.7 describe field and laboratory QC procedures, respectively. Subsection 10.7.7 includes information on data that the analytical laboratory should include in project analytical reports.

10.5 FIELD EQUIPMENT AND LABORATORY INSTRUMENT CALIBRATION

10.5.1 FIELD EQUIPMENT CALIBRATION

Investigations of soil, water, or gas phase matrices utilize a variety of field equipment, such as a photo-ionization detector (PID) or flame-ionization detector (FID) to measure volatile constituents in soil samples, water quality measurement instruments, or a flow controller to limit or regulate the flow of gas.

In general, calibrate field equipment at least daily prior to its first use. Re-calibrate or check field equipment throughout the field day to verify that it is operating properly. Record field equipment calibration and equipment field checks in a field logbook and/or on a calibration log sheet accompanying the instrument. At a minimum, record:

  1. Date and time of calibration
  2. Type and identification number of equipment being calibrated
  3. Reference standard(s) used for calibration
  4. Name or initials of person performing the calibration.

10.5.2 LABORATORY EQUIPMENT CALIBRATION

For Method 8000 analyses, laboratory instruments are typically calibrated with a linear 5-point calibration curve prior to use. A calibration is considered valid if the RSD of the calibration or response factors across the 5-point curve is less than or equal to 20 percent. Continuing calibration verifications throughout the analytical day assess whether the calibration curve has drifted as a result of instrument use (USEPA, 2008a).
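A minimal sketch of this linearity check is shown below, assuming the laboratory evaluates the RSD of the calibration factors (instrument response divided by standard concentration) across the five standards; the concentrations and responses are hypothetical:

```python
import statistics

def calibration_factor_rsd(concentrations, responses):
    """RSD (%) of the calibration factors (response / concentration) across the
    calibration standards; values <= 20% indicate an acceptably linear curve."""
    factors = [resp / conc for conc, resp in zip(concentrations, responses)]
    return statistics.stdev(factors) / statistics.mean(factors) * 100

# Hypothetical 5-point calibration (ug/mL vs. instrument response in area counts)
concs = [0.5, 1.0, 5.0, 10.0, 20.0]
areas = [5100, 10400, 49800, 101500, 207000]
rsd = calibration_factor_rsd(concs, areas)
print(round(rsd, 1), rsd <= 20)   # 1.7 True -> calibration accepted
```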

10.6 FIELD QA/QC

More than one type of field QA/QC sample may be collected simultaneously to provide a broad assessment of sample data quality. Field QA/QC sampling is typically used to evaluate the following, as well as other considerations:

  • Precision of sample collection, processing, and analysis procedures through the use of field replicates.
  • Accuracy of sample analysis procedures through the use of field replicates sent to separate laboratories.
  • Effectiveness of sample collection equipment decontamination procedures through the use of equipment blanks.
  • Sample handling and transportation procedures through the use of trip blanks– for samples in aqueous media being analyzed for volatile chemicals.

The frequency of QA/QC sample collection is strongly dependent upon a variety of factors including the sample matrix (i.e., soil, water, or gas phase), COPCs, and QA/QC questions to be answered. The number of field QA/QC samples is site and project specific and needs to be addressed in detail in the SAP or QAPP for each project. A QA/QC sample should be analyzed for the same constituents and by the same method as the primary sample.

QA/QC samples should be labeled in a manner that does not allow the analytical laboratory to identify or correlate the QA/QC sample to the primary sample. This is often referred to as the submission of “blind” samples to the analytical laboratory. For example, if only one primary sample is collected, a replicate QA/QC sample label should not identify it as a duplicate (or other) QA/QC sample. If more than one primary sample is collected, the QA/QC sample name may indicate that it is a duplicate (or other) QA/QC sample as long as correlation to the primary sample is not possible (e.g., by indicating a sample collection date or time different from the primary sample).

10.6.1 FIELD REPLICATES

Field replicate samples are duplicate or triplicate samples collected from within the same decision unit or from the same groundwater well to evaluate the precision of the sampling effort. Replicates are to be collected, preserved, stored, transported, and analyzed in the same way as primary field samples. Duplicate or triplicate samples are collectively referred to as “replicate” samples unless specifically indicated. Replicates are intended to represent the same population and are taken to provide information on precision, accuracy, and representativeness for the data collection activity (e.g., replicates provide a measure of contaminant heterogeneity for a specific decision unit). If the degree of contaminant heterogeneity exceeds established DQO in the SAP, then additional sampling and/or steps to limit errors during sample processing and analysis are typically required to provide representative sample data. A field replicate precision of approximately 10-35% is generally established as a DQO, depending on the media and contaminant.

The method for collecting replicate QA/QC samples is strongly dependent upon the sample matrix, the COPCs, and the QA/QC questions to be answered, particularly when considering Multi-Increment sample (MIS) techniques as compared to discrete sampling.

10.6.1.1 DISCRETE SAMPLING REPLICATES

In general, for discrete sampling of groundwater or soil the HEER Office recommends collecting one replicate QA/QC sample per field day per sample matrix, or 10% of all field samples, whichever is greater. At least 10% replicate QA/QC samples should be collected in each decision unit or each area of known or suspected contamination. Consider both the horizontal and vertical dimensions when planning replicate QA/QC sample locations. If small scale heterogeneity is expected at the site, additional replicate QA/QC samples may be required to assess the scale of heterogeneity. Different project-specific frequencies may be proposed to best meet project DQO. If proposing different QC sampling frequencies for a specific investigation, the proposed QC sampling program and the rationale should be presented in detail in the project-specific SAP or QAPP and discussed with the HEER Office prior to field investigation.

Co-located duplicate samples

Co-located duplicates are samples collected at the same time from a location in proximity to the primary sample. Co-located duplicate soil samples are commonly collected due to sample volume factors (i.e., the volume of sample material retrieved in the sampler is less than the volume of sample required for laboratory analysis). Minimize the distance between the primary and duplicate sample collection points; small scale heterogeneity in the contaminant distribution is more likely as the distance increases.

The co-located samples would be expected to have similar contaminant concentrations. Data quality objectives to evaluate the precision of co-located samples should be included in the SAP or QAPP, and co-located sample data compared to ensure these DQO are met.

Duplicate groundwater samples

For non-volatile groundwater contaminants collected in vials, generally two sample containers are “alternately” filled. For example, if a low-flow pump is used, the two containers would be filled by going back and forth with the discharge tubing.

For volatile groundwater contaminants, where multiple 40 ml vials are commonly used for each sample and loss of volatiles is an important concern, the primary and duplicate sets of samples are collected alternately: one vial is completely filled for the primary sample, then a duplicate vial is filled, and so on until all vials (primary and duplicate sets) for that one sample are collected. It is also important to follow appropriate vial-filling protocols to ensure minimal agitation and zero headspace for the volatile samples.

Relative Percent Difference

In certain cases, particularly for discrete sampling, only duplicates rather than triplicates may be available to evaluate precision of sampling data, though triplicates are recommended wherever feasible. In those situations where only duplicates are able to be collected, the precision of the data would be evaluated by determining the RPD.

The RPD is calculated as described in Subsection 10.3.1.

10.6.1.2 MULTI-INCREMENT SOIL SAMPLING REPLICATES

The Multi-Increment soil sampling approach relies on collection of field replicate (triplicate) samples to estimate the sampling precision, as discussed in Subsection 4.2.3. Collecting and analyzing triplicate samples allows for statistical calculation of several important quantities, including the standard deviation, RSD, and 95 percent (%) upper confidence level (UCL) of the mean. These statistical evaluations are utilized to determine the degree to which the measured levels of contaminants vary from the (estimated) mean, and this variability is taken into consideration when comparing site data to applicable HEER Office EALs (See Subsection 4.2.5).

The number of decision units where Multi-Increment sample replicates are collected will vary with each project, total number of decision units, and site characteristics. Consequently the number of DUs with replicates is site-specific and determined as part of the overall sampling strategy in the SAP. A batch-type replicate approach (similar to that used in the lab) may be applied in the field, if multiple decision units are similar (e.g., similar soil type, contaminants of concern, history of chemical use, topography, etc.). If multiple similar DUs are evaluated on a site, replicates in one DU may be used to evaluate that DU and up to 9 similar DUs. In this case, the precision data determined for contaminant(s) in one DU (e.g., RSD) would also be applied to the other DUs in the similar batch.

Standard Deviation, Relative Standard Deviation, and 95% Upper Confidence Limit of the Mean

The standard deviation is a statistical measure of the scatter, or variability, of several sample values around their mean (or average). The lower the standard deviation, the lower the variability of the sample values observed in the data. The standard deviation may be informally interpreted as the size of a “typical” deviation from the mean (or average) and may be calculated using standard equations presented in an introductory statistics book or included as software functions in programs such as Microsoft Excel.

The RSD, expressed as a percent, is a measure of precision among several sample values (the normal, duplicate, and triplicate samples in the case of Multi-Increment sampling). The RSD differs from the RPD in that it measures the precision among several sample values versus between just two sample values. The RSD can be calculated as the standard deviation of the sample replicates divided by the mean (or average) of the sample replicates, times 100%.

An RSD of 35% or less is typically a goal during environmental investigations. However, an RSD greater than 35% does not necessarily mean the data are not usable for the intended purpose. For example, an RSD somewhat greater than 35% may be acceptable if the estimated average level of contaminant(s) in the DU is much greater or much less than the relevant HDOH Tier 1 EAL.

The 95% UCL is another statistical measure of the precision for a series of sampling measurements. In this case, the normal, duplicate, and triplicate samples are used to calculate a mean (or average) value and a standard deviation. The mean and standard deviation are then used to calculate an upper limit that, with 95% confidence, is not exceeded by the true mean concentration for the individual decision unit. Formulas and spreadsheets for calculating the 95% UCL are available through websites providing statistical analysis support.
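As an illustration, a minimal sketch of a Student's-t 95% UCL calculation for triplicate results is shown below; the concentrations are hypothetical, and data sets with strong skew or larger numbers of replicates may warrant other UCL methods or dedicated statistical software:

```python
import statistics
from scipy import stats  # Student's t critical value

def ucl95_student_t(results):
    """One-sided 95% upper confidence limit of the mean for replicate results:
    UCL95 = mean + t(0.95, n-1) * s / sqrt(n), using the Student's t statistic."""
    n = len(results)
    mean = statistics.mean(results)
    s = statistics.stdev(results)
    t = stats.t.ppf(0.95, df=n - 1)
    return mean + t * s / n ** 0.5

# Hypothetical MIS triplicate results for one decision unit (mg/kg)
triplicates = [120.0, 150.0, 135.0]
print(round(statistics.mean(triplicates)), round(ucl95_student_t(triplicates)))  # 135 160
```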

10.6.2 BLANKS

Blank QA/QC samples are aliquots of a matrix known to be free of the contaminants of concern. The analytical data for blanks provide a measure of the cross-contamination that may have occurred during sample collection, sample storage and transport, or during laboratory preparation, extraction, and analysis. Compare the analytical results of the various types of blanks to each other to assess the degree to which contamination may have been introduced into the samples.

10.6.2.1 TRIP BLANKS

The purpose of a trip blank is to assess the possibility of cross contamination during sample collection, storage, and transport to the analytical laboratory. Trip blanks are typically analyzed for volatile organic compounds and accompany aqueous samples collected for volatiles analysis, because the high vapor pressures of these compounds create a potential for vapor migration into the samples. Non-aqueous samples collected using methanol preservation techniques may also require a trip blank.

Prepare trip blanks by filling sample containers with reagent grade water, then assuring that the trip blank sample containers accompany the main sample containers along every step to the analytical laboratory. Trip blanks are not opened in the field. Trip blank water should be from the same source as the method blank water used in the laboratory.

10.6.2.2 FIELD EQUIPMENT RINSATE BLANKS

The purpose of an equipment blank (also commonly referred to as a field equipment rinsate sample) is to provide a control on sample collection equipment (e.g., soil core samplers or sample tubing) that is decontaminated and reused in the field. Specifically, an equipment blank assesses sample collection equipment and/or related ambient conditions that may affect sample quality. Because the equipment blank is stored and transported with the primary samples, it is also representative of sample bottle preparation, storage, and transport conditions.

An equipment blank is collected by pouring reagent grade water over/through decontaminated equipment used in sample collection. The water is then collected in a sample container and analyzed for the contaminants of interest. Equipment blank water should be from the same source as the method blank water used in the laboratory.

The use of field equipment rinsate blanks is important for ultraclean and very low level (trace) contaminant investigations; however, in many general contaminant investigations it is not necessary as long as a specific and effective protocol (i.e., SOPs) for field decontamination of any re-used sampling tools is documented in the SAP and utilized. Collection of large Multi-Increment soil samples (rather than discrete samples) further decreases the potential for cross contamination with trace amounts of soil left on a sampling tool.

The protocol for decontamination should ensure that new sampling equipment is decontaminated (or certified clean and in original container until used) and any previously used equipment is decontaminated before reuse. The SAP or QAPP should clearly identify if the site investigation will or will not include equipment rinsate blanks, and discuss the rationale for this decision. Field (water) source blanks are required to be analyzed whenever equipment is decontaminated in the field.

Where equipment rinsate blanks are included for trace level investigations or for other reasons, the HEER Office recommends collecting one equipment blank per matrix per sampling team per day.

10.6.2.3 FIELD SOURCE BLANK

Field source blanks are collected from the water source used for decontamination rinse of equipment, and are used to assess potential for contamination in the water used for decontamination. One source blank is collected from each source of water used for decontamination.

10.6.3 DOCUMENTATION

Document the following sampling information, as applicable, for primary and QA/QC samples in the field log:

  • Time and date of sample collection
  • Name of person(s) collecting the sample
  • Location of sample
  • Sampling procedure
  • Sample identification
  • Source of blank matrix
  • Table that provides a cross reference of primary and replicate samples
  • Equipment decontamination procedure

10.6.4 CHAIN OF CUSTODY

Attach a label to the sample jars and log each sample on a chain-of-custody (COC) form. Provide at a minimum the following information on the COC:

  • Project identification
  • Sampler's name
  • Sender – company name and address
  • Destination – laboratory name and address
  • Sample identification
  • Number of sample containers per sample
  • Preservation, if any
  • Date and time of sample collection for each sample
  • Requested analytes
  • Special handling requirements, if any
  • Shipping company
  • Name and signatures of persons relinquishing custody
  • Date and time when custody is relinquished
  • Signatures of persons receiving custody
  • Date and time when custody is received

The chain-of-custody must not be broken between the sampler and the laboratory sample receiving personnel. Enter the name of the shipping company into the received custody section, if the samples need to be shipped.

10.7 LABORATORY QA/QC

An accurate estimate of the precision or accuracy of analytical results is only possible if sample results are derived within the laboratory reporting limits (RL) required by the DQO described in the SAP or QAPP. The RL represents the lowest concentration of a specific analyte that the laboratory can reliably quantify in a particular sample.

The QAPP identifies DQO for the project; the laboratory report indicates RLs for each result. Variables that affect the laboratory’s ability to achieve the RL conforming to the QAPP include: the sample matrix, naturally occurring background concentrations, and laboratory instrumentation. QA/QC requirements include following the referenced analytical method for each chemical of concern.
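The planning-stage comparison of laboratory RLs against screening levels can be sketched as follows; the analytes, reporting limits, and screening values below are placeholders for illustration only and are not actual HDOH Tier 1 EALs:

```python
# Hypothetical reporting limits (RLs) and screening levels, both in mg/kg.
reporting_limits = {"benzene": 0.005, "lead": 5.0, "TPH-diesel": 20.0}
screening_levels = {"benzene": 0.3, "lead": 200.0, "TPH-diesel": 500.0}

# Flag any analyte whose RL is not below its screening level; such cases should be
# resolved with the laboratory (or the HEER Office) before field sampling begins.
for analyte, rl in reporting_limits.items():
    level = screening_levels[analyte]
    status = "OK" if rl < level else "RL exceeds screening level - resolve before sampling"
    print(f"{analyte}: RL = {rl}, screening level = {level} -> {status}")
```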

Most analytical data from laboratories are documented in computer records or on printouts generated by the instrument data-handling computer and transferred to a centralized acquisition server. Standard logs are maintained to document the preparation of standards. The identity and number of the parent material are recorded, and each prepared standard is assigned a number that is traceable to the parent material. All data from analytical laboratories should be collected and documented in such a manner that allows the generation of data packages that can be used by an external data auditor to reconstruct the analytical process.

10.7.1 METHOD BLANKS

The laboratory analyzes method blanks for each analytical batch and uses the results to assess laboratory background or reagent contamination. An aliquot of clean matrix (extraction blank), equal in mass to the samples and known to be free of the COPCs, is used for method blank analysis. The matrix of the method blank is selected to represent the sample matrix as closely as possible. The method blank is taken through the whole analytical process and is analyzed exactly like the calibration standards, field samples, and field replicate samples. Method blank analytical results are included in the analytical report. Method blanks should be prepared and analyzed at a frequency of at least 1 per every 20 field samples (5%) of the same matrix (USEPA, 2003a).

10.7.2 LABORATORY CONTROL SAMPLES (LCS)

The laboratory analyzes an LCS to assess overall method performance; it is the primary indicator of laboratory performance. The LCS is commonly accompanied by an LCSD. The LCS and LCSD pairs should be prepared and analyzed at a frequency of at least 1 per every 20 field samples (i.e., 5%) of the same matrix (USEPA, 2003a). The LCS and LCSD are typically similar in composition to the primary samples, contain known concentrations of all analytes of interest, and undergo the same preparatory and determinative procedures as the primary samples. LCS and LCSD pairs are used to assess laboratory specific precision and accuracy or to assess the performance of an analytical method. Laboratories should have established internal QC RPD and Percent Recovery limits as defined in Subsection 10.3.1 for each method. The parameters should be developed in accordance with guidelines established in USEPA SW-846 (USEPA, 2008a). In the absence of established guidelines, RPD goals of 20% and Percent Recovery goal ranges of 70 to 130% should be used as default objectives (USEPA, 2003a).

When both an LCS and an LCSD are processed for a batch of samples, there is no significant physical distinction between the LCS and LCSD. Both the LCS and LCSD must satisfy the same recovery acceptance criteria, which are usually based on laboratory-specific control limits.

The LCS and LCSD are prepared by spiking an uncontaminated sample matrix with known amounts of analytes from a source independent from the calibration standards. Should the LCS and LCSD fail the acceptance criteria, the entire analytical batch must be re-analyzed with another LCS and LCSD pair.
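As an illustration, assuming the default goals noted above (70 to 130% recovery and 20% RPD), an acceptance check for an LCS/LCSD pair might look like the following minimal sketch; the recoveries are hypothetical and the names are illustrative only:

```python
def lcs_lcsd_acceptable(lcs_recovery, lcsd_recovery,
                        recovery_limits=(70, 130), rpd_limit=20):
    """Check an LCS/LCSD pair against default percent-recovery and RPD goals.
    RPD is computed here between the two recoveries, which is equivalent to the
    RPD between the measured concentrations when the spike amounts are identical."""
    recoveries_ok = all(recovery_limits[0] <= r <= recovery_limits[1]
                        for r in (lcs_recovery, lcsd_recovery))
    rpd = abs(lcs_recovery - lcsd_recovery) / ((lcs_recovery + lcsd_recovery) / 2) * 100
    return recoveries_ok and rpd <= rpd_limit

print(lcs_lcsd_acceptable(92.0, 101.0))   # True: recoveries in range, RPD ~9%
print(lcs_lcsd_acceptable(92.0, 135.0))   # False: LCSD recovery above 130%
```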

10.7.3 MATRIX SPIKES (MS)

An MS sample is evaluated to assess the accuracy and precision of an analytical method with respect to the sample matrix. The MS is commonly accompanied by an MSD sample. The MS and MSD samples are prepared by adding known concentrations of analytes to the sample matrix prior to sample preparation. The MS/MSD pairs should be prepared and analyzed at a frequency of at least 1 per every 20 field samples (i.e., 5%) of the same matrix (USEPA, 2003a). The concentrations of analytes in the sample matrix are known prior to the addition of matrix spike analytes.

The MS and MSD are used to identify matrix interference peaks that may co-elute with target analytes. The MS and MSD are taken through the whole analytical process. Following the analytical process, the recoveries of the spike analytes are calculated and reported for assessment of accuracy. When an MSD is analyzed, the relative percent difference between the MS and the MSD results will also be calculated and reported. The percent recoveries and the relative percent difference are used to evaluate the effect of the sample matrix on the accuracy and precision of the analysis. Matrix interference effects may result in the MS and MSD failing the acceptance criteria. However, the corresponding LCS/LCSD pair must satisfy its acceptance criteria for the analytical batch to be considered in control and acceptable.

10.7.4 MATRIX CLEANUP

Matrix cleanup methods are applied to the extracts prepared by one of the extraction methods to eliminate sample matrix interferences. Several cleanup methods may be employed depending upon the target analytes of interest. USEPA Method 3600 from SW-846 provides general guidance on selecting cleanup methods (USEPA, 2008a).

As indicated in USEPA Method 3600, the purpose of applying cleanup methods to extracts is to remove interferences and high boiling point material that may result in the following:

  • Errors in quantitation [data may be biased low because of analyte adsorption in the injection port or front of the gas chromatograph (GC) column, or biased high because of overlap with an interference peak]
  • False positives because of interference peaks falling within the analyte retention time window
  • False negatives caused by shifting the analyte outside the retention time window

Most extracts of soil require some degree of cleanup. Highly contaminated extracts (e.g., soil containing oily residue) often require a combination of cleanup methods. Following extraction and cleanup, the extract is analyzed by one of the determinative methods. If interferences still preclude analysis for the analytes of interest, additional cleanup may be required.

10.7.5 SURROGATES

Surrogate spikes involve the addition of a known concentration of a non-target analyte prior to sample preparation and analysis. The surrogate is chemically similar to the target analyte(s) and behaves similarly during extraction and analysis. The surrogate spike recovery must meet the established acceptance criteria and provides a measure of the efficiency of the preparation and analysis steps in recovering compounds that behave like the target analytes.

10.7.6 LABORATORY SUB-SAMPLING REPLICATES

Laboratory sub-sampling replicate QA/QC samples are generally employed for all soil, sediment, or other particulate samples analyzed for non-volatile contaminants (from Multi-Increment or discrete samples). The HEER Office recommends triplicate sub-sampling and determination of the RSD. Due to the typically smaller mass of discrete soil samples, there may be situations where only duplicate lab sub-samples may be feasible. This issue should be considered during the systematic planning phase of the investigation when determining DQO and coordinating with the laboratory.

The HEER Office recommends collecting laboratory sub-sampling replicate QA/QC samples at a frequency of at least one per 20 samples, or at least one if there are less than 20 samples. Replicate sub-samples are collected from the entire mass of sample available (e.g. the entire mass of sample available after drying and sieving to project-specific particle size, typically < 2mm). Sub-sampling should be performed using a sectorial splitter or by hand Multi-Increment sampling. The USEPA lab sub-sampling guidance (USEPA, 2003b) provides detailed information on sub-sampling procedures.

10.7.7 QA/QC REPORTS

The investigation team generating the data should include an experienced data reviewer or a third party data validator to review the analytical data to determine its validity and therefore usability.

The data reviewer or validator should review all QC-related information provided in the data package and project-specific laboratory report provided by the analytical laboratory. As part of the process of selecting the project analytical laboratory, the investigation team will ensure that the laboratory assigns a data analyst. The analyst should review the data to assess that:

  • Sample preparation information is correct and complete.
  • Analysis information is correct and complete.
  • The appropriate SOPs were followed.
  • Analytical results are correct and complete.
  • Quality control samples were within established control limits.
  • Documentation, including the case narrative, is complete.

The analyst will then review the analytical data package to verify that:

  • Calibration data are scientifically sound and method compliant.
  • QC samples were within established guidelines.
  • Qualitative and quantitative results are correct.
  • Documentation and the case narrative are complete.
  • The data package is complete and ready for document archiving.

The laboratory report must provide the following QA/QC information:

  • Sample temperature at time of receipt
  • Whether sample hold times were within method limits.
  • Whether samples were received in good condition.
  • Whether bubbles were present in volatile organic analysis (VOA) vials at time of receipt and size of bubbles if any.
  • Description of corrective measures taken, if any QA/QC sample results were out of laboratory control limits.

10.8 CORRECTIVE ACTION

Whenever any QC parameters are outside of the control limits or DQO specified in the SAP or QAPP, the investigation team must identify the potential origin(s) of the problem(s), and initiate any appropriate corrective action. In some cases, the corrective action may involve evaluating potential impacts that these exceedances have on data quality and therefore usability of the data.

Any investigation should include a checklist of parameters or questions related to data quality issues that may require corrective action. Example issues include (but are not limited to) the following:

  • Were any analytes, not on the initial SAP analyte suite, detected in laboratory blanks that could be attributed to laboratory contamination rather than field contamination? (e.g., solvents commonly used in analytical laboratories such as methylene chloride and acetone that were likely not used, handled, or stored at the site under investigation).
  • Were any analytes of concern detected in the Method Blank? This may indicate contamination that is unrelated to the field sample.
  • Were contaminants found in both the environmental sample and a blank sample? Such detections may be regarded as laboratory artifacts and not a result of contamination at the investigation site if the contaminant is detected in both and the concentration in the environmental sample is (a minimal sketch of this check is provided after this list):
    • less than 10 times the blank value for common laboratory contaminants (e.g., methylene chloride, acetone, 2-butanone, and phthalate esters)
    • less than five times the blank value for other potential laboratory contaminants
  • Did the RPD and Percent Recoveries for any of the QC analyses (e.g., LCS, MS/MSD) exceed the control limits initially specified in the SAP or QAPP? This may indicate sample preparation problems such as differences in spike solution preparation methods. If the control limits for a certain batch of samples being analyzed are exceeded and underlying issues not identified or resolved, the affected samples may need to be qualified or rejected.
  • Was there any matrix interference suspected or determined that required dilution of the sample for reanalysis (e.g., did the dilution cause any reanalysis reporting limit to exceed the corresponding screening or regulatory criteria)? This may result in a degree of uncertainty for contaminants that may potentially mask each other on a chromatogram, such as pesticides and polychlorinated biphenyls (PCBs), or it may cause the reporting limit to exceed the HDOH Tier 1 EAL screening criteria or cleanup criteria.
  • Were all calibration verification sample results within control limits? If any fail, recalibration of the instrument is necessary.
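The blank-comparison rule in the bullets above can be expressed as a simple check. The following minimal sketch uses hypothetical concentrations, and the list of common laboratory contaminants mirrors the examples given above:

```python
COMMON_LAB_CONTAMINANTS = {"methylene chloride", "acetone", "2-butanone", "phthalate esters"}

def likely_lab_artifact(analyte, sample_conc, blank_conc):
    """Apply the 5x/10x blank rule: a detection may be treated as a laboratory
    artifact if the analyte is also detected in the blank and the sample
    concentration is less than 10x the blank value for common laboratory
    contaminants, or less than 5x the blank value for other analytes."""
    if blank_conc <= 0:
        return False  # analyte not detected in the blank; the rule does not apply
    factor = 10 if analyte in COMMON_LAB_CONTAMINANTS else 5
    return sample_conc < factor * blank_conc

# Hypothetical results (ug/L)
print(likely_lab_artifact("acetone", 12.0, 2.0))   # True: 12 < 10 x 2
print(likely_lab_artifact("benzene", 12.0, 2.0))   # False: 12 >= 5 x 2
```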

These parameters should be evaluated before accepting the data for use in the overall site investigation. Investigation reports should also include a data quality evaluation section that addresses these issues and provides documentation and justification for accepting the data. The HEER Office may reject data that do not meet the agreed-upon level of data quality in the initially reviewed work plan or planning documents for the investigation. In more extreme cases, not evaluating data quality issues or initiating appropriate corrective action after an issue is identified may result in rejection of subsequent data sets.