The word “calibration” often conjures up negative images, especially in IVD development and manufacturing, where regulations mandate that laboratories and production plants have regular calibration procedures for their equipment.
The expense and downtime needed to accommodate maintenance and calibration service schedules are two recurring burdens associated with calibration, which is usually performed with manual methods or outsourced to the original equipment manufacturer or a third-party service provider. However, failing to perform calibration is not an option. Both regulatory compliance and device quality control depend on regular maintenance and calibration of all measuring equipment, and when liquid handling is involved, few measurements are more critical than accurate and reliable volumetric dispensing.
This is because the consequences of liquid delivery failure are high, including non-compliance and poor-quality devices. Therefore, it is critical that Liquid Delivery Quality Assurance (LDQA) be an integral component of laboratory and production operations. This is difficult with manual or outsourced calibration because of the disruption, lack of control, inconvenience, expense and inefficiency involved. However, because automated calibration technologies can eliminate these issues while reducing risk and controlling costs, it is now possible to make LDQA more than a random act of compliance without decreasing productivity. With the proper integration of real-time measurement verifications and appropriate software, LDQA can become a routine state of being that ensures continual, optimal quality in liquid handling processes.
Due to new, standardized technologies, automated liquid delivery quality assurance methods are now easy to integrate and maintain. Available automated methods include systems that manage equipment inventories, document and control calibration procedures, compute accuracy and precision statistics, generate pass/fail reports and facilitate record keeping and compliance. This article will explore the various options available for automating liquid delivery calibration methods and the resulting improvements to IVD operations.
Liquid handling is present in a number of critical phases of IVD development and manufacturing. For example, liquid delivery accuracy is critical during the manufacture of reagent strips used in diagnostic test kits. During strip production, dispensers fire tiny droplets of antibodies onto paper continually fed from a reel. This paper is then cut into strips and placed into test kits. If the liquid dispensing equipment malfunctions, the reagent strips might receive inaccurate antibody volumes or no antibodies at all. This can lead to defective diagnostic tests that provide inaccurate results, such as false negatives. This is highly alarming given the recent FDA report stating that laboratory tests guide up to 80% of medical decision-making.
Clinical analyzers also rely on liquid handling instrumentation that is built into these systems to deliver both samples and reagents. Volumetric accuracy and precision are critical to achieve reliable and repeatable results. Therefore, analyzer manufacturers must calibrate and verify liquid delivery processes, and these verifications should occur during design, development and qualification. Volumetric calibration should also become part of ongoing maintenance, calibration and re-qualification of analyzer systems at periodic intervals after installation in clinical laboratories.
Due to the prevalence of liquid handling, the severe consequences of instrumentation failure and the tight regulations governing operations, there is no question that IVD companies must regularly calibrate liquid handling equipment. For example, the Code of Federal Regulations (21 CFR 820.72) states that procedures must be established and maintained “to ensure that equipment is routinely calibrated, inspected, checked and maintained.” This section of the CFR further stipulates that calibration procedures “shall include specific directions and limits for accuracy and precision,” and that “there shall be provisions for remedial action to reestablish the limits and to evaluate whether there was any adverse effect on the device’s quality.” As a result, all device manufacturers must adopt and enforce strict calibration protocols.
To meet calibration requirements, IVD companies have for years relied on manual or outsourced methods for liquid delivery performance verification. However, with the advent of automated calibration methods, the inefficiencies of these alternatives have become more apparent.
For example, the most common manual calibration method is based on gravimetry, which weighs liquid quantities on analytical balances. With this method, laboratory personnel manually obtain weighing data and calculate volume using appropriate corrections such as Z-Factor tables and corrections for evaporative loss. Then accuracy and precision are calculated and the results are recorded. However, manual methods are prone to human error and are time-consuming. This is illustrated by the fact that automated performance verification of a 384 well plate can be accomplished in less than 10 minutes, while manual gravimetric calibration can require hours to complete. With manual methods, management also places a heavy reliance on operator skill, training and attention to detail in order to ensure procedural compliance.
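The arithmetic behind this manual gravimetric workflow is straightforward, which is exactly why it lends itself to software. The sketch below shows the core calculation: balance readings are converted to volumes with a Z-factor, then accuracy (systematic error) and precision (coefficient of variation) are computed. The Z-factor shown approximates water at 21 °C and standard pressure; the target volume and replicate weights are hypothetical values for illustration.

```python
# Gravimetric volume verification sketch: convert balance readings (mg)
# to volumes (uL) via a Z-factor, then compute accuracy and precision.
# Target volume and replicate weighings below are illustrative.

Z_FACTOR = 1.0032   # uL/mg, approx. value for water at ~21 C, 101.3 kPa
TARGET_UL = 100.0   # nominal dispense volume (assumption)
weights_mg = [99.6, 99.9, 100.2, 99.7, 100.0]  # hypothetical replicates

volumes = [w * Z_FACTOR for w in weights_mg]
mean_v = sum(volumes) / len(volumes)

# Accuracy: mean deviation from target, as a percentage of target.
accuracy_pct = 100.0 * (mean_v - TARGET_UL) / TARGET_UL

# Precision: sample standard deviation relative to the mean (CV, %).
variance = sum((v - mean_v) ** 2 for v in volumes) / (len(volumes) - 1)
cv_pct = 100.0 * variance ** 0.5 / mean_v

print(f"mean volume: {mean_v:.2f} uL")
print(f"accuracy:    {accuracy_pct:+.2f} %")
print(f"precision:   {cv_pct:.2f} % CV")
```

Performing these same steps by hand, including transcribing each balance reading and looking up the Z-factor in a table, is where transcription and calculation errors typically creep in.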
While automation can solve many of these drawbacks, manual calibration may be preferred when calibration is required infrequently or when laboratories are testing new methods. However, once a laboratory commits to a given calibration protocol, automation can often result in productivity gains.
Outsourcing liquid delivery calibration is another alternative to in-house automation. This strategy can be useful for organizations lacking sufficient human resources to implement and maintain an automated calibration process. However, outsourcing can lead to scheduling nightmares, downtime and loss of productivity due to instruments taken out of the laboratory. Outsourcing can also be costly, both in terms of the actual fee paid to the vendor as well as the hidden costs associated with qualifying and managing service partners. Another concern is that the quality of the calibration process can vary depending on the technician charged with performing the calibration. On the other hand, outsourcing calibration can be successful for a company that has a reliable, well-known partner that can be trusted to perform well. Regardless of service partner, the responsibility for properly investigating any irregularities or failed calibrations rests squarely on the IVD manufacturer. For many manufacturers, this is a compelling reason to take charge of their calibrations and thereby control their operations.
Whether the liquid delivery process itself is automated or manual, automatic performance verification or calibration can have several advantages over manual and outsourced service methods. First, software enables electronic calculation and documentation and reduces the risk of error, which in turn enhances efficiency by eliminating the need for repeat tests. Automatic instrument “trackability” (the ability to know precisely when a piece of equipment was last verified and which manufactured devices were produced since the last successful verification) is another key benefit. Automatic calibration software systems can also improve efficiency in scheduling and compliance through features such as automatic notification of upcoming scheduled calibration.
Together, software and automated volume measurement technologies can simplify calibration to the extent that frequent verifications can be integrated directly into the IVD manufacturing process. This gives IVD companies better control over their instrumentation and processes and greatly reduces or even eliminates the out-of-tolerance investigations, corrective actions and root cause investigations that would otherwise arise from liquid delivery failures.
Reducing the amount of labor required for calibration allows for reallocation of thinly-stretched resources to more complex and profitable projects, and can also reduce costs over the long-run. However, the more important end result of an automated calibration system is a process that is continually and painlessly controlled, instilling confidence in the quality of operations and manufactured products.
One solution for speeding calibration and enhancing productivity is an equipment management software system. These systems are used to electronically manage liquid handling instrumentation, which is especially useful for organizations with numerous pipettes, for example. Through bar coding or RFID technologies, these systems track instrumentation, schedule calibration and, based on management input, control and enforce testing protocols, such as frequency of verification and number of data points.
Other systems incorporate automated data processing and recording, providing documented calibration results and automatic pass/fail determinations. These technologies can be integrated with equipment management systems for full calibration support. To automatically process information, these systems rely on software components to enforce testing protocols, compute liquid volume measurement results and calculate summary statistics such as accuracy and precision. These software programs are available for implementations of the three most common liquid delivery calibration methods: gravimetry, single-dye photometry and Ratiometric Photometry™.
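The pass/fail logic such software enforces is simple in principle: computed accuracy and precision are compared against management-set tolerances and the outcome is recorded. A minimal sketch follows; the tolerance values and record layout are illustrative assumptions, not any vendor's format.

```python
# Automatic pass/fail determination sketch: compare computed accuracy
# and precision statistics against management-set tolerances.
# The default tolerances (2% accuracy, 1% CV) are illustrative only.

def evaluate(accuracy_pct, cv_pct, max_accuracy_pct=2.0, max_cv_pct=1.0):
    """Return a documented pass/fail record for one verification run."""
    passed = abs(accuracy_pct) <= max_accuracy_pct and cv_pct <= max_cv_pct
    return {
        "accuracy_pct": accuracy_pct,
        "cv_pct": cv_pct,
        "result": "PASS" if passed else "FAIL",
    }

print(evaluate(0.8, 0.4))    # within both tolerances
print(evaluate(-2.6, 0.4))   # systematic error out of tolerance
```

Because the same tolerances are applied identically on every run, the determination does not depend on operator judgment, which is the point of automating this step.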
Gravimetric calibration can be automated by integrating balances with computational software or building balances into automated liquid handlers. This removes human transcription and calculation error and provides automatic documentation. Traceability to national standards can be achieved as well (see Automation and Traceability section). Although this technique automates some functions, human involvement is usually necessary to monitor and periodically empty the receiving vessels on the balances to avoid overflow. Additionally, this method can be time consuming. For example, one commonly used gravimetric calibration protocol requires about 90 minutes of downtime to calibrate an eight-probe liquid handler. Gravimetry also requires a controlled environment for accurate results, which can be an impediment.
Photometric calibration, which measures the absorbance of light by a dye solution at a given wavelength to verify liquid volume, may be more suitable for automation than gravimetry because this technology is less affected by the environment and evaporation errors. This also gives photometry a distinct advantage at lower volumes. To date, single-dye photometry has only been incorporated into some internally developed automated systems and in a few manual commercial systems. Traceability is therefore limited and dependent on user expertise (see Automation and Traceability section).
Ratiometric Photometry is another approach to calibration. This patented technology, which relies on two dyes for more accurate and precise volume measurement, has been incorporated into commercially available automated systems. Based on robust dual-dye technology, these systems not only provide speed and traceability but can also be used on the benchtop. Because ratiometric photometry is more technically complex, the calculations are incorporated into software that automatically computes volumes and measures error percentages against set tolerances. These systems can calibrate single-channel pipettes in five minutes and automated liquid handlers with up to 384 well plates in less than 10 minutes, and are preferred for low-volume applications.
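The patented dual-dye calculations are proprietary, but the general ratiometric idea can be sketched: the absorbance of a dye carried in the test sample is normalized against a second dye present at known concentration in the receiving diluent, so a ratio of readings, rather than a single absolute absorbance, determines the dispensed volume. The sketch below is a simplified linear model of that idea only; the diluent volume, calibration constant and the linear relationship itself are illustrative assumptions, not the commercial algorithm.

```python
# Illustrative ratiometric volume estimate (NOT the patented algorithm).
# A test solution carries dye A; the receiving diluent carries dye B at
# known concentration. Because both dyes share the same optical path,
# their absorbance ratio cancels path-length variation between wells.

DILUENT_UL = 200.0   # known diluent volume in the well (assumption)
K_RATIO = 0.05       # hypothetical constant linking ratio to volume
                     # fraction; in practice derived from traceable dyes

def volume_from_ratio(abs_dye_a, abs_dye_b):
    """Estimate dispensed volume (uL) from two absorbance readings."""
    ratio = abs_dye_a / abs_dye_b
    # In this simplified model, the ratio is proportional to the
    # sample-to-diluent volume fraction.
    return DILUENT_UL * ratio * K_RATIO

v = volume_from_ratio(abs_dye_a=0.90, abs_dye_b=0.45)
print(f"estimated volume: {v:.1f} uL")
```

The practical consequence of the ratio is that well-to-well differences in optical path length drop out, which is one reason the technique tolerates benchtop conditions better than gravimetry.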
The fact that automated calibration facilitates traceability is a key benefit that warrants special attention. According to 21 CFR 820.72, discussed above, measurements made as part of a calibration program must be traceable to national or international standards to ensure that results are consistent across different locations and over time. Full traceability requires some estimate of the uncertainty of the measurement, which can be thought of as a statistical margin of error.
For measurements of liquid volumes in the milliliter or high microliter range, there are two common traceability approaches. The original approach, gravimetry, weighs the liquid to determine the mass of the sample, converts mass to volume using known traceable values of the liquid density, and makes other appropriate corrections. As volumes decrease in size, traceability via mass of the liquid becomes increasingly problematic. This is because solvents, even water, evaporate while being weighed and this evaporative error becomes more significant as volumes descend into the lower part of the microliter range.
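The scale of the evaporation problem is easy to illustrate: a roughly fixed evaporative loss during each weighing becomes an ever larger fraction of the sample as volumes shrink. The loss rate and weighing time below are hypothetical round numbers chosen for illustration, not measured values.

```python
# Why gravimetric traceability breaks down at small volumes: a fixed
# evaporative loss per weighing grows into a large relative error.
# The 50 nL/s loss rate and 10 s weighing time are assumptions.

EVAP_NL_PER_S = 50.0   # hypothetical evaporation rate, open vessel
WEIGH_TIME_S = 10.0
loss_nl = EVAP_NL_PER_S * WEIGH_TIME_S   # nL lost per measurement

for volume_ul in (1000.0, 100.0, 10.0, 1.0):
    error_pct = 100.0 * loss_nl / (volume_ul * 1000.0)
    print(f"{volume_ul:7.1f} uL sample -> {error_pct:5.2f} % evaporative error")
```

Under these assumptions the same 500 nL loss that is negligible at a milliliter becomes a 50% error at one microliter, which is why mass-based traceability gives way to chemical traceability at small volumes.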
The second approach, chemical traceability, is useful for smaller volumes and can be realized using photometry. Chemical traceability is based on accurate knowledge of a chemical concentration in the parent sample, followed by a measurement of the amount of chemical present in the dispensed droplet. A proper chemical species is one that does not evaporate or degrade during the measurement and produces a signal strong enough to be measured with adequate accuracy and precision. In the microliter and nanoliter range, it is important to use dyes that have a strong absorbance response and excellent stability. Several commercially available automated systems include standardized dyes to facilitate traceability, which is not possible with home-brewed systems.
To meet current requirements, traceability requires calculation and analysis of uncertainty. Doing so manually is a complicated and time-consuming process that demands specialized knowledge typically unavailable in laboratories and manufacturing facilities. Alternatively, automated systems build this intelligence into the software, calculating uncertainty automatically and ensuring that traceability meets all requirements.
In regulated environments such as IVD manufacturing, traceability and accuracy requirements usually bring the decision down to two options – gravimetry or ratiometric photometry. Each method has its place, and the four factors that most significantly influence the choice are environmental conditions, measurement rate, volume range and required uncertainty as shown in Table 2.
Gravimetry requires an environment that is relatively free of vibration, and accuracy is improved when temperature is stable and humidity is elevated. Ratiometric photometry is more forgiving of vibration and lower humidity, but does require an environment that is free of excess dust that could contaminate solutions and alter photometric readings. In the clean environment typical of IVD manufacturing, ratiometric photometry is probably the more forgiving technology, but gravimetry can be effective if volumes are large and vibration is low.
Measurement rate refers to the number of samples per minute needing measurement. For gravimetry, measurement rate depends on balance sensitivity and environmental stability, and is generally limited to a few samples per minute. For single channel dispensers that operate at a slow rate, either gravimetry or photometry can be appropriate. As measurement rates become larger, it may be necessary to move to a higher density 96 or 384 well format to attain sufficient measurement throughput, and here, ratiometric photometry has the advantage.
Volume range is one factor that can quickly and easily identify which measurement system is preferred – gravimetry works best with higher volumes while photometry has an advantage at low volumes. Commercially available photometric systems have upper volume limits in the 200 µL to 1,000 µL range, and greater volumes may not physically fit in the measurement cells. Fortunately, balances generally work quite well at volumes of hundreds of microliters or more. There is some overlap in capability in the hundreds-of-microliters range, but as volumes decrease into the mid and low microliter range, gravimetry becomes increasingly difficult and expensive. At the extreme low microliter to nanoliter range, photometry becomes the only practicable choice.
Measurement uncertainty is a final consideration. For gravimetric measurements, balance resolution is usually four or five decimal places on the gram scale (i.e., 0.0001 g or 0.00001 g resolution). Six-place balances exist but are fragile and have limited use outside a controlled calibration environment. Measurement uncertainties for four- and five-place balances in liquid measurement operations usually range from 120 nanoliters to more than 1,000 nanoliters.
For ratiometric photometry, the measurement uncertainty is usually dependent on the volume being measured. The best measurement capability can be less than one nanoliter when measuring a 100 nanoliter sample, or less than 1% of the measured volume. Photometric uncertainty grows with sample size, maintaining roughly a constant percentage, and can reach 1,000 nanoliters or more as sample volume increases into the high end of the photometric range.
While automation has many benefits, several factors must be considered when evaluating the integration of automated liquid delivery quality assurance. First, like all systems and products used in developing and manufacturing diagnostic tests, the quality of automated technologies must be ensured and continually verified. Procedures must be put in place to provide regular documentation that the automated quality assurance system is functioning properly. Additionally, while automation reduces manual labor, some degree of human involvement is usually still necessary for proper function.
Automation is not the right strategy for all operations. Thorough cost-benefit and return on investment (ROI) analyses should be conducted prior to committing to a new calibration system. Factors to consider in these analyses include capital and startup costs, cost of labor, reliability and availability as well as error rate reduction and associated error costs. In automated calibration, improvements in overall labor costs and savings due to error reduction are typically the most significant factors that offset the capital and startup costs.
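A first-pass version of such an ROI analysis can be reduced to a simple payback-period calculation: capital and startup costs divided by the net annual benefit of labor and error savings less ongoing upkeep. Every figure in the sketch below is a hypothetical input for illustration; a real analysis would use the organization's own cost data.

```python
# Simple payback-period sketch for an automated calibration system.
# All dollar figures are hypothetical inputs for illustration only.

capital_cost = 50_000.0        # system purchase and startup (assumption)
annual_labor_saved = 18_000.0  # technician time freed by automation
annual_error_saved = 12_000.0  # avoided repeat tests and investigations
annual_maintenance = 5_000.0   # ongoing system upkeep

net_annual_benefit = annual_labor_saved + annual_error_saved - annual_maintenance
payback_years = capital_cost / net_annual_benefit
print(f"payback period: {payback_years:.1f} years")
```

As the article notes, labor savings and error-cost reduction are typically the dominant terms, so the sensitivity of the payback period to those two inputs is worth examining before committing.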
If the benefits of automation outweigh the costs for a specific organization, the next step is to evaluate available calibration systems. User-intuitiveness is key. For a system to be widely implemented and successful, the interface must be easy to understand. Automated systems must also be secure and provide options for limited access to prevent non-proficient users from improperly using the technology. These types of controls are readily available in automated systems that are compliant with FDA software requirements, which is another feature to consider when evaluating automated options.
Another important characteristic is the availability of training and consultation by the system provider. Each manufacturing plant and development facility may have different needs, objectives and tolerances for error. It is beneficial to work with a vendor that can provide on-site support to help get the system up and running, instruct users on proper use and optimization tactics, and troubleshoot as need be.