In this guide, we look at the most common sources of contamination and error in an analytical process.
Table I: ASTM designations for reagent laboratory water (1)
Table II: Elemental leaching from laboratory tubing (µg/L) (2)
Figure 1: Phthalate concentrations in laboratory water sources (ppb) (3).
Table III: Grades of laboratory solvents and their uses
Table IV: Properties of common chromatography and spectroscopy solvents
Table V: Dispensing volume error associated with various size syringes (4)
Table VI: Compound carryover found in syringe washes (ppm of carryover) (4)
Table VII: Major elemental impurities found in laboratory container materials (2)
Table VIII: Elemental impurities found in nitric acid distilled in clean laboratories versus regular laboratories (2)
Figure 2: Triboelectric charge potential of common materials and particles in the laboratory.
Figure 3: Common sources of contamination found in the laboratory.
Operator, environmental, and method errors often include sources of contamination. In an increasingly exacting analytical landscape, in pursuit of parts-per-billion (ppb)-level analytes, it is important to understand not only the sources of error and contamination but also how to reduce them. In the early days of analytical instrumentation, laboratories tested for a select number of compounds or elements at parts-per-thousand levels. Modern instrumentation has increased the number of compounds and elements to be quantitated and lowered the analytical threshold to sub-parts-per-billion levels, where 1 ppb is equivalent to 1 s in 32 years! In this type of testing environment, even low parts-per-billion levels of contamination can cause large errors in quantitation. In this guide, we look at the most common sources of contamination and error in an analytical process, from the water used in the laboratory to the mistakes and errors caused by laboratory equipment and operators.
Chemical reference standards have been an important component of accurate analysis for decades. Over the years, the challenges facing chemical laboratories have changed. As a manufacturer of certified reference materials (CRMs), we field many questions during our daily interactions with customers on how best to use CRMs. Sometimes a customer is not even aware that the issue they are questioning is really a problem of contamination or error. In a common scenario, a scientist expresses concern that their standard reads too high for particular elements; during the conversation, our scientists usually discover that the customer is inadvertently allowing contamination into their analysis.
Contamination and error can occur at almost any point of the process and can then be magnified as the method and analysis run their course. Modern instrumentation has raised the bar (or, in this case, lowered the limits of detection) to the parts-per-billion (ppb) or even parts-per-trillion (ppt) level. The concept of such small measurements would have been almost inconceivable during the emergence of modern laboratory analysis.
To put this into a different context, 1 ppb expressed as a unit of time would be 1 s in 32 years, and 1 ppt would be 1 s in 320 centuries! Issues of laboratory contamination have always been problematic, but now contaminants, even in trace amounts, can severely alter results. It is hard to imagine that such small amounts of contamination can so dramatically change laboratory values.
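For readers who want to sanity-check that analogy, the conversion is simple arithmetic; here is a minimal sketch in Python (using an average 365.25-day year):

```python
# Back-of-the-envelope check of the ppb and ppt time analogies.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.16e7 s in an average year

seconds_in_32_years = 32 * SECONDS_PER_YEAR               # ~1.01e9 s
seconds_in_320_centuries = 320 * 100 * SECONDS_PER_YEAR   # ~1.01e12 s

print(f"1 s in 32 years      = {1 / seconds_in_32_years:.2e}")       # ~9.9e-10, i.e., ~1 ppb
print(f"1 s in 320 centuries = {1 / seconds_in_320_centuries:.2e}")  # ~9.9e-13, i.e., ~1 ppt
```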
Most questions about contamination come in the form of an inquiry about a particularly high result for some common contaminant. In most cases, the root of that contamination can be traced to a common source. The most common sources of standard and sample contamination are found in the laboratory: reagents, labware, laboratory environment, storage, and personnel.
Water is one of the most basic yet most essential laboratory components. Most scientists are aware that the common perception that all water is the same is untrue: there are many types, grades, and intended uses for water. Water is most often used in two ways in the laboratory: as a cleaning solution and as a transfer solution for volumetric or gravimetric calibrations or dilutions. In both of these uses, the water must be clean to avoid introducing contamination and error into the process. Poor-quality water can cause a host of problems, from creating deposits on labware to inadvertently increasing a target element or analyte concentration in solution.
The confusion starts when laboratories are unsure about which type of water they get from their water filtration system. ASTM has guidelines that designate different grades of water. Table I shows parameters for the four ASTM types of water (1).
The actual water produced by a commercial laboratory filtration system can vary in pH, solutes, and soluble silica. Critical analytical processes should always require a minimum quality of ASTM Type I water. All trace-analysis standards, dilutions, dissolutions, extractions, and digestions should be conducted with the highest purity of water. Analysts who use CRMs and perform quantitative analysis need high-quality water so as not to contaminate their CRMs, standards, and samples.
High-purity water is often produced in several stages by multiple processes that physically and chemically remove potential contaminants. Municipal water supplies often test their own water sources on an annual basis, but that does not mean the water is suitable for laboratory applications. Municipal water can become contaminated downstream of its distribution point, especially when left sitting static in pipes, tubing, and hoses. Water left stationary in a laboratory water system can leach elements and compounds from the piping and hoses. In one experiment conducted at SPEX CertiPrep, leaching solutions of either ASTM Type I water or 5% nitric acid in ASTM Type I water were run through lengths of silicone and neoprene laboratory tubing. The solutions were collected and tested by inductively coupled plasma–mass spectrometry (ICP-MS). As Table II shows, many common elements leached into the solutions from the laboratory tubing, adding potential contamination to critical analytical processes. When the tubing was used to transfer acidified solutions between vessels, the acid further increased the contamination: lead, for example, increased from 0.1 µg/L in ASTM Type I deionized (DI) water to 3 µg/L in the silicone tubing with 5% nitric acid and 2 µg/L in the neoprene tubing with 5% nitric acid.
Another potential source of laboratory water contamination is phthalates. Phthalates are ubiquitous in the environment and in the laboratory, where plastics are found everywhere: bottles, tubing, caps, and containers. In a laboratory water study performed at SPEX CertiPrep, we examined the bisphenol A and phthalate content of a variety of laboratory water sources, including bottled high performance liquid chromatography (HPLC)-grade water, liquid chromatography–mass spectrometry (LC–MS)-grade water, municipal tap water, and water from our in-house DI (ASTM Type I) water source (3). The in-house DI water was tested both from a carboy filled within the laboratory and directly from the faucet outlet. One faucet sample was collected after the system had sat overnight, and a second was collected after the system had been flushed with multiple gallons of fresh water. Phthalates were found in all of the water sources, with the highest total phthalates, about 91 ppb, found in the HPLC bottled water, as seen in Figure 1.
To reduce contamination from laboratory water sources, the first line of attack is to choose the correct water source for the given application. It is also important to realize that water quality can change over time depending on the storage conditions. Bottled water sources can leach organic and inorganic contaminants from the bottle, cap, and liners. Bottled water can also have an expiration date and a shelf life that should be checked before use. Water that has been decanted into other vessels can be exposed to many types of contaminants, including dust, microbial growth, and oxidation effects. Water left in HPLC, LC–MS, or ion chromatography (IC) systems should be changed frequently to prevent contamination and microbial growth.
An analytical laboratory often uses large amounts of various reagents, solvents, and acids of varying quality and contamination levels. These chemical components can be a large economic investment for a laboratory, but also a large source of potential contamination. Just as with water, there are different types or grades of chemicals, reagents, acids, and solvents. Some designations are set forth in standards from the U.S. Pharmacopeia (USP) or the American Chemical Society (ACS); other types or grades are designated by individual manufacturers based on intended use. Some are general laboratory grades intended for noncritical applications, while other grades, usually high in purity and low in contaminants, are designated for more critical analyses. Table III shows general descriptions of types and grades of solvents and their intended uses.
Many analytical laboratories use a variety of chromatography and spectrometry instruments such as gas chromatography (GC), GC–MS, HPLC, and LC–MS in their analyses. Solvents play a large role in these analytical techniques as either mobile phases or matrices for analysis. During sample preparation, many laboratories expose samples to a variety of solvents ahead of introduction to an analytical instrument. In some cases, the solvent is chosen to best fit the technique: LC, GC, and MS systems each have modes of analysis that benefit from specific chemical or physical properties of solvents. For example, in GC and GC–MS, the most widely used solvents have low boiling points, are eluted quickly, and do not interfere with the target analytes. Solvents with boiling points within the analytical range of the target analytes can be coeluted with the targets and hinder quantitation (see Table IV).
LC and MS solvent requirements depend on the type of analysis being performed. Normal-phase LC uses polar columns with nonpolar solvents such as hexane and cyclohexane; reversed-phase LC uses nonpolar columns, such as octadecyl (C18) columns, with polar solvents including methanol, water, and acetonitrile. If ultraviolet–visible diode-array detection (UV-vis DAD) or an equivalent detector is used, then the UV cutoff of the solvent also becomes important: the cutoffs of the mobile phase and solvents should fall outside the detection wavelengths of the target analytes. For a typical reversed-phase analysis, cutoffs range from 190 nm, the UV cutoff of acetonitrile, to 300 nm, the UV cutoff of acetone, both popular solvents for LC applications, as seen in Table IV.
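As an illustration of that selection rule, the short sketch below flags solvents whose UV cutoff would interfere at a given detection wavelength. The acetonitrile and acetone cutoffs come from the text above; the methanol and water values are typical literature figures and should be confirmed against the solvent's certificate of analysis:

```python
# Approximate UV cutoffs (nm); acetonitrile and acetone per the text,
# methanol and water are typical literature values.
UV_CUTOFF_NM = {
    "acetonitrile": 190,
    "water": 190,
    "methanol": 205,
    "acetone": 300,
}

def usable_solvents(detection_wavelength_nm: float) -> list[str]:
    """Return solvents whose UV cutoff lies below the detection wavelength."""
    return [name for name, cutoff in UV_CUTOFF_NM.items()
            if cutoff < detection_wavelength_nm]

print(usable_solvents(254))  # ['acetonitrile', 'water', 'methanol']
```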
Solvents are both a material that can become contaminated and a source of contamination. Solvents can be contaminated by particles such as dust, rust, and mold. The solvents can also be contaminated by gases, oxidation, or compounds like the phthalates found in seals and bottle closures. Some solvents have added preservatives that could add contaminants to analysis or leach additional elements from the storage containers.
Many persistent solvents found in the laboratory can cross-contaminate samples by their mere presence. These include dichloromethane, which can cause chlorine contamination, as well as dimethyl sulfoxide (DMSO) and carbon disulfide, which can add sulfur residues. Some solvents also react with air to form peroxides, which can cause both contamination and safety issues in the laboratory.
Acids are another laboratory reagent that can be both a potential danger and a potential contaminant. Acids by their nature are oxidizers, and many of the strongest acids are used in the processing of samples for inorganic analysis. Common acids for digestion and dissolution include perchloric acid, hydrofluoric acid, sulfuric acid, hydrochloric acid, and nitric acid. Many of these acids are commercially available in several grades, from general laboratory or reagent grade to high-purity trace metal–grade acids. Acid grades often reflect the number of sub-boiling distillations the acid undergoes for purification before bottling: the more an acid is distilled, the higher the purity. These high-purity acids have the lowest amount of elemental contamination but can be very costly, up to 10 times the cost of reagent-grade acids.
A question that often arises is whether high-purity acids are necessary for sample preparation if the laboratory is using a high-quality ICP-MS-grade CRM. Clean acids used in sample preparation, digestion, and preservation can be very costly, but the difference in contamination between a low-purity acid and a high-purity acid can be dramatic. High-quality standards for parts-per-billion and parts-per-trillion analyses use the highest-purity acids available to reduce all possible contamination from the acid source. As an example of potential contamination, a 5-mL aliquot of acid containing 100 ppb of Ni as a contaminant, used to dilute a sample to 100 mL, introduces 5 ppb of Ni into the sample.
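The arithmetic behind that example is a simple mass-balance dilution; a minimal sketch (function and parameter names are ours, for illustration):

```python
def contaminant_contribution_ppb(acid_volume_ml: float,
                                 acid_contaminant_ppb: float,
                                 final_volume_ml: float) -> float:
    """Concentration (ppb) added to the final solution by a contaminated acid aliquot."""
    return acid_contaminant_ppb * acid_volume_ml / final_volume_ml

# The example above: 5 mL of acid carrying 100 ppb Ni, diluted to 100 mL
print(contaminant_contribution_ppb(5, 100, 100))  # 5.0 ppb of Ni introduced
```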
To reduce contamination, it is recommended that high-purity acids be used to dilute and prepare standards and samples whenever possible. In addition to using pure acid, the chemist should check the acid's certificate of analysis to identify the elemental contamination levels present in the acid. Some laboratories prefer to use blank subtraction to negate the background contamination, but blank subtraction for acids only works in a range well above the instrument's level of detection. If blank subtraction causes an analytical result to fall below the instrument's level of detection, it should not be used.
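That decision rule is easy to encode. Below is a minimal sketch (our own naming, not a standard library routine) that applies blank subtraction only when the corrected result stays above the limit of detection (LOD):

```python
from typing import Optional

def blank_subtract(sample_ppb: float, blank_ppb: float,
                   lod_ppb: float) -> Optional[float]:
    """Blank-subtract a result, or return None if the corrected value
    would fall below the instrument's LOD (in which case it should not be used)."""
    corrected = sample_ppb - blank_ppb
    return corrected if corrected >= lod_ppb else None

print(blank_subtract(12.0, 3.0, 0.5))  # 9.0 -> usable corrected result
print(blank_subtract(0.8, 0.5, 0.5))   # None -> do not blank-subtract
```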
Volumetric measurement is a common, daily activity in most analytical laboratories. Many processes, from sample preparation to standards calculation, depend on accurate and contamination-free volumetric measurements. Unfortunately, volumetric labware, syringes, and pipettes are among the most common sources of contamination, carryover, and error in the laboratory.
The root of these errors is found in the four "I" errors of volumetrics: improper use, incorrect choice, inadequate cleaning, and infrequent calibration. These four I's can lead to error and contamination, which negate the intent of careful measurement processes.
The first two I's, improper use and incorrect choice, mean that the volumetric is not used correctly or that the wrong volumetric is chosen. Many errors can be avoided by understanding the markings displayed on volumetrics and choosing the proper tool for the job. There is a lot of information displayed on volumetric labware. Most labware, especially glassware, is designated as either Class A or Class B. Class A glassware is a higher-quality analytical class, whereas Class B glassware is lower quality with a larger uncertainty and tolerance. If a critical measurement process is needed, only Class A glassware should be used.
Other information found on labware includes the manufacturer's name, country of origin, the tolerance or uncertainty of the labware's measurement, and a series of descriptors that indicate how the glassware should be used. Labware can be marked with letters that designate the purpose of the container: a volumetric designed to contain liquid is marked with the letters TC or IN, while labware designated to deliver liquid is marked with the letters TD or EX. Sometimes additional designations, such as wait time or delivery time, are inscribed on the labware. The delivery time is the time required for the meniscus to fall from the upper volume mark to the lower volume mark. The wait time is the time needed for the meniscus to come to rest after the residual liquid has finished flowing down the wall of the pipette or vessel.
A second type of improper use and incorrect choice can be seen in the selection of pipettes and syringes for analytical measurements. Many syringe manufacturers recommend a minimum dispensing volume of approximately 10% of the total volume of the syringe or pipette. A study by SPEX CertiPrep showed that dispensing such a small percentage of a syringe's total volume creates a large amount of error. In this study, four syringes (10 µL, 25 µL, 100 µL, and 1000 µL) were used to dispense between 8% and 100% of the syringe's total volume of water. Each volume was weighed, the measurement was replicated 10 times by several analysts, and the results were averaged to calculate the average error.
The largest error rates were seen in the smaller 10- and 25-µL syringes: dispensing 20% of the 10-µL syringe's volume created 23% error, and the error only dropped below 5% as the dispensed volume approached 100%. In the larger syringes, dispensing more than 25% of the total volume kept the error at around 1%. The larger syringes came closer to the manufacturer's 10% dispensing minimum without a large amount of error, but their error also dropped as the dispensed volume approached 100%, as seen in Table V.
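The error calculation behind such a gravimetric study can be sketched as follows; the replicate masses here are hypothetical, and we assume a water density of roughly 1.0 mg/µL near room temperature:

```python
from statistics import mean

def average_percent_error(target_ul: float,
                          weighed_masses_mg: list[float],
                          density_mg_per_ul: float = 1.0) -> float:
    """Average unsigned percent error of dispensed volumes from replicate weighings."""
    volumes_ul = [m / density_mg_per_ul for m in weighed_masses_mg]
    return mean(abs(v - target_ul) / target_ul * 100 for v in volumes_ul)

# Hypothetical replicates for dispensing 2 µL (20% of a 10-µL syringe)
print(round(average_percent_error(2.0, [2.4, 2.5, 1.6, 2.3, 1.5]), 1))  # ~21.0 (%)
```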
The third “I” of volumetric error is inadequate cleaning. Many volumetrics can be subject to memory effects and carryover. In critical laboratory experiments, labware sometimes needs to be separated by purpose and use. Labware subject to high levels of organic compounds or persistent inorganic compounds can develop chemical interactions and memory effects. It is also sometimes difficult to eliminate carryover from labware and syringes even when using a manufacturer’s stated instructions. For example, many syringes are cleaned by several repeated solvent rinses before use. A study of syringe carryover by SPEX CertiPrep showed that some syringes are subject to high levels of chemical carryover despite repeated rinses.
In this study, several syringes ranging in volume from 10 µL to 1000 µL were used to dispense a 2000 µg/mL internal standard mix of deuterated compounds. The subsequent washes were collected and tested by GC–MS to determine the amount of carryover in each wash (see Table VI).
The larger syringes needed fewer rinses to reduce carryover than the smaller 10-µL syringes, which retained more than 1 ppm of carryover even after 15 rinses. The typical number of rinses employed for syringes is between three and five, which in the case of the smaller syringes would not be adequate to clear all of the carryover.
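To see why so many rinses can be required, consider a simple illustrative model (ours, not from the study) in which each rinse removes a fixed fraction of the remaining residue:

```python
import math

def rinses_needed(initial_ppm: float, target_ppm: float,
                  fraction_removed_per_rinse: float) -> int:
    """Rinses needed to bring carryover below a target, assuming each rinse
    removes a fixed fraction of the remaining residue (illustrative model only)."""
    retained = 1.0 - fraction_removed_per_rinse
    return math.ceil(math.log(target_ppm / initial_ppm) / math.log(retained))

# If each rinse removed 40% of the residue, reducing a 2000-ppm load below
# 1 ppm would take 15 rinses -- far more than the usual three to five.
print(rinses_needed(2000, 1, 0.40))  # 15
```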
The final source of error is infrequent calibration. Many laboratories have maintenance schedules for equipment such as balances and automatic pipettes but often overlook calibration of reusable burettes, pipettes, syringes, and labware. Under most normal use, labware does not need frequent calibration, but there are instances where a recalibration schedule should be used. Any glassware or labware in continuous use for years should have its calibration checked. Glass manufacturers suggest that any glassware used or cleaned at high temperatures, used with corrosive chemicals, or autoclaved should be recalibrated more frequently.
It is also suggested that, under normal conditions, soda-lime glass be checked or recalibrated every five years and borosilicate glass after 10 years of use. The error associated with volumetrics can be greatly reduced by choosing the correct volumetric for the task, using the tool properly, and making sure the volumetrics are properly cleaned and calibrated before use.
Inorganic analysts know that glassware is a source of contamination. Even clean glassware can contaminate samples with elements such as boron, silicon, and sodium, and if glassware such as pipettes and beakers is reused, the potential for contamination escalates. At SPEX CertiPrep, we studied the residual contamination in our pipettes after they were cleaned manually and automatically using a pipette washer (2).
Aliquots of 5% nitric acid were drawn through 5-mL pipettes after the pipettes were manually cleaned according to standard procedures, and the aliquots were analyzed by ICP-MS. The results showed that significant residual contamination persisted in the pipettes despite a thorough manual cleaning procedure.
The experiment was repeated using a pipette washer made especially for parts-per-trillion analysis, which repeatedly forced deionized water through the pipettes for a set time. After the pipettes were cleaned in the washer, the same aliquot of 5% nitric acid was drawn through the 5-mL pipettes and analyzed by ICP-MS. The automated washer reduced contamination significantly compared to manual cleaning: high levels of sodium and calcium contamination (almost 20 ppb) dropped to <0.01 ppb, and other common contaminants, including lead and iron, dropped from 5.4 and 1.6 ppb, respectively, to less than 0.01 ppb.
The reduction of contamination in labware can depend on the labware's material and its use. Different materials contain many types of potential elemental and organic contamination, as seen in Table VII (5). Trace inorganic analyses are best performed in polymer or high-purity quartz vessels, such as those made of fluorinated ethylene propylene (FEP), minimizing contact with borosilicate glass. Metals such as Pb and Cr are readily adsorbed onto glass but not onto plastics. On the other hand, samples containing low (parts-per-billion) levels of Hg must be stored in glass or fluoropolymer because Hg vapors diffuse through polyethylene bottles.
All laboratories believe they observe some level of laboratory cleanliness, and most chemists recognize that there are inherent levels of contamination present in every laboratory. A common belief is that these small amounts of environmental and laboratory contamination cannot truly change analytical results. To test the background level of contamination in a typical laboratory, samples of nitric acid were distilled both in a regular laboratory and in a clean-room laboratory with special air-handling systems (HEPA filters). The nitric acid distilled in the regular laboratory had high amounts of aluminum, calcium, iron, sodium, and magnesium contamination; Table VIII shows that the acid distilled in the clean room had significantly lower amounts of most contaminants (2).
Laboratory air can also contribute to contamination of samples and standards. Common sources of airborne and particulate contamination are surfaces and building materials such as ceiling tiles, paints, cements, and drywall. Surface contaminants can be found in dust and rust on shelves, equipment, and furniture. Dust contains many crustal elements, such as sodium, calcium, magnesium, manganese, silicon, aluminum, and titanium, as well as elements from human activities (Ni, Pb, Zn, Cu, As) and organic compounds like pesticides, persistent organic pollutants (POPs), and phthalates. Dust and rust particles can contaminate open containers in the laboratory or enter containers through charge transfer caused by friction, the triboelectric effect. The triboelectric effect, or triboelectric charging, occurs when a material becomes electrically charged after frictional contact with a second material. The most common example of this effect is hair sticking to a plastic comb after a static charge is created.
The polarity and strength of the electrical charge depend on the type of material and its other physical characteristics. Many materials in the laboratory carry strong positive or negative triboelectric charges, as shown in Figure 2. Materials like dust, air, skin, and lead have extreme positive charges and can be attracted to the strongly negative charge of PTFE or other plastic bottles when a bottle is opened and the resulting friction induces a charge.
Laboratory personnel can add their own contamination from laboratory coats, makeup, perfume, and jewelry. Aluminum contamination can come from laboratory glassware, cosmetics, and jewelry, and many other common elements can be introduced from lotions, dyes, and cosmetics. Even sweat and hair can cause elevated levels of sodium, calcium, potassium, lead, magnesium, and many ions. If a laboratory is seeing an unusually high level of cadmium in its samples, it could come from cigarettes, pigments, or batteries; if lead levels are out of range, the contamination could come from paint, cosmetics, or hair dyes. Figure 3 shows potential sources of common elemental contamination from outside products.
Laboratory environment and personnel contamination can be reduced by limiting use of personal care products, jewelry, and cosmetics that could contain contamination and interfere with critical analyses. Laboratory coats can collect all types of contamination and should only be worn in the laboratory to avoid cross contamination from other laboratories and the outside world. The laboratory surfaces should be kept clean. Deionized water can be used to wipe down work surfaces. Laboratory humidity can be kept above 50% to reduce static charge. An ethanol- or methanol-soaked laboratory wipe can be used to reduce static electricity as it evaporates.
Even with clean laboratory practices in place, erroneous results can still find their way into sample analysis. To eliminate some of these spurious results, replicates of blanks and sample dilutions can be used. The blank results should be averaged, and the sample run values can either be minimally selected or averaged. The difference between the two values can then be plotted against a curve established with two or more standards. A minimum of two standard points can be used if the chance of contamination is minimal, as with rare or uncommon elements; additional standard points should be considered if the potential for contamination is high, as with common elements such as aluminum, sodium, and magnesium. Multiple aliquots of blanks and dilutions can also be used to further minimize analytical uncertainty.
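A minimal sketch of that workflow, with hypothetical signal values and our own function naming: average the replicate blanks and samples, subtract, and interpolate the corrected signal on a line fitted through two or more standard points:

```python
from statistics import mean

def blank_corrected_concentration(sample_signals: list[float],
                                  blank_signals: list[float],
                                  standards: list[tuple[float, float]]) -> float:
    """Blank-correct an averaged sample signal and interpolate it on a
    calibration line built from (concentration, signal) standard points."""
    corrected = mean(sample_signals) - mean(blank_signals)
    # Least-squares line through the standard points (two points minimum)
    n = len(standards)
    c_mean = sum(c for c, _ in standards) / n
    s_mean = sum(s for _, s in standards) / n
    slope = (sum((c - c_mean) * (s - s_mean) for c, s in standards)
             / sum((c - c_mean) ** 2 for c, _ in standards))
    intercept = s_mean - slope * c_mean
    return (corrected - intercept) / slope

# Hypothetical triplicate blanks and samples with a two-point calibration
print(blank_corrected_concentration([10.2, 10.4, 10.1], [0.4, 0.5, 0.3],
                                    [(0.0, 0.0), (5.0, 2.5)]))  # ~19.67
```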
Laboratories should follow a general regime of three runs each of wash–rinse, blank, and sample runs, along with single runs of sample plus spike and of standard or spike without sample, to serve as control solutions for evaluating recovery.
Analysts must realize that the cleanliness and accuracy of their procedures, equipment, and dilutions affect the quality of the standards and samples. Many laboratories will dilute CRMs to use across an array of procedures and techniques. This in-house dilution of CRMs can be a savings to the laboratory but in the final analysis can be a source of error and contamination.
CRM manufacturers design standards for particular instruments to obtain the highest level of accuracy and performance for each technique. They also use calibrated balances, glassware, and instruments to ensure the most accurate standards are delivered to customers. Certifications such as ISO 9000, ISO 17025, and ISO 17034 assure customers that procedures are being followed to ensure the quality and accuracy of those standards. Once those CRMs are in chemists' hands, it is their responsibility to use all possible practices to keep their analyses free from contamination and error.
Patricia Atkins is a Senior Applications Scientist with SPEX CertiPrep in Metuchen, New Jersey. Direct correspondence to: patkins@spex.com
P. Atkins, Cannabis Science and Technology 1(4), 40–49 (2018).