Life sciences companies must perform regular calibrations of instrumentation to meet regulations, but these can be costly. Modern instrumentation simplifies the process.
To fulfill regulatory requirements and ensure quality, life sciences companies must perform traceable calibrations on instruments. But calibrations are costly and time-consuming, can cause process downtime, and pose an increased risk of contamination. Many instruments on the market today provide self-diagnostic features that give users information about the health of the device.
ISO 9001:2008 (clause 7.6), GMP, and WHO regulations and standards all require equipment and instrumentation to be calibrated or verified at specific intervals against measurement standards traceable to international or national standards. However, it is always the plant operator’s responsibility to define and execute a proper maintenance program.
The task is finding the right balance between saving operational cost by extending intervals while ensuring quality and reliability. The main issue with extended calibration cycles is the performance of instruments between calibrations.
After an instrument has been installed and operating for a period of time, the first recalibration step is to determine whether it is still operating within specifications. A failed “as-found” check can be critical and has to be investigated further for possible impact on product quality. A substantial number of FDA warning letters are issued because the remedial action taken was considered insufficient.
Calibrations are expensive but provide very clear results for the user. Even though many instruments have proven exceptional long-term stability, which exceeds the entire lifetime of the equipment, they still have to be checked regularly to avoid legal implications and possible issues. Some companies calibrate every six months (Figure 1).
The criticality of the process parameter helps define the maximum acceptable risk level. The blue line in Figure 1 shows how an instrument can deviate over time, but it is recalibrated back to initial specifications every six months. The average deviation (risk) between calibrations must always remain below the acceptable level. However, the possibility for an undetected “out-of-spec” situation gradually increases over time, resulting in an increased risk of product quality issues.
Calibration vs. Verification
Legal requirements for regular checks are commonly fulfilled with wet calibrations. A calibration of an instrument—for example, a flowmeter—involves determining and documenting the difference between the value read by the instrument and a reference value.
Traceability is accomplished by a formal comparison to a reference standard which is directly or indirectly related to national or international standards. Detected deviations between the displayed value and the reference value can be corrected after the calibration by adjusting the calibration factor. A calibration protocol is issued to document the findings.
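The as-found check and calibration-factor adjustment described above amount to a simple calculation. The sketch below illustrates it with a hypothetical flowmeter reading; the function names, tolerance, and numbers are invented for illustration, not taken from any standard or vendor procedure.

```python
# Illustrative sketch of a wet-calibration "as-found" check and
# calibration-factor adjustment. All names and numbers are hypothetical.

def as_found_error_pct(indicated: float, reference: float) -> float:
    """Percent-of-reading error of the instrument against the reference standard."""
    return (indicated - reference) / reference * 100.0

def adjusted_calibration_factor(old_factor: float, indicated: float, reference: float) -> float:
    """Scale the meter's calibration factor so it would have read the reference value."""
    return old_factor * reference / indicated

indicated = 101.3   # flowmeter reading, e.g. L/min
reference = 100.0   # traceable reference value from the calibration rig
error = as_found_error_pct(indicated, reference)

tolerance = 0.5     # acceptance limit in percent of reading (example only)
if abs(error) > tolerance:
    new_factor = adjusted_calibration_factor(1.000, indicated, reference)
    print(f"As-found error {error:+.2f}% exceeds ±{tolerance}%; new factor {new_factor:.4f}")
else:
    print(f"As-found error {error:+.2f}% within ±{tolerance}%")
```

Both the as-found error and the adjusted factor would be recorded on the calibration protocol, so the customer can see how far the meter had drifted before it was corrected.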
The downside of wet calibrations is that the instruments typically have to be removed from the process and connected to a calibration rig or a master meter. After the calibration, the instrument is then sent back to the facility to be installed again. Damage during transport or handling sometimes goes undetected and can lead to a situation where a recently calibrated instrument is not performing according to specifications.
Alternatively, a mobile calibration unit can be used to perform a calibration on site. This method typically eliminates the need for dismounting the meter under test, but it still requires the primary process loop be opened, increasing contamination risk.
An alternative way to fulfill legal requirements is in-situ verification of the device. Here, the device runs an on-board diagnostics program where all relevant components of the instrument are checked to confirm and document the instrument still meets factory conditions and that no parts have been altered, changed, or have drifted (Figure 2).
Several instrument makers offer in-situ calibration and verification, and all work in a similar fashion. The system in Figure 2 is based on Endress+Hauser’s Heartbeat Technology, which provides documented proof that a flowmeter performs according to specification.
If a device is equipped with Heartbeat Technology, all test sections are monitored continuously as part of the standard device diagnostics (sensor, front end, reference, I/O loop). When a verification is initiated, the current status of all diagnostic parameters is read and stored with a unique identifier in the failsafe memory of the flowmeter. A verification report in PDF format is generated from the diagnostic data of this snapshot. The report can be downloaded, printed, or stored externally for audit documentation.
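The snapshot-and-report flow can be modeled generically: read the current diagnostic values, freeze them under a unique identifier, and render a summary. This is an illustrative sketch only, not Endress+Hauser’s actual firmware interface; all names, test sections, and values are hypothetical.

```python
# Generic model of an in-situ verification snapshot: capture diagnostic
# results, tag them with a unique tamper-evident ID, and summarize them.
# Illustrative only -- not any vendor's actual firmware interface.
import hashlib
import json
import time
from dataclasses import dataclass

@dataclass(frozen=True)          # frozen: the stored snapshot is immutable
class VerificationSnapshot:
    timestamp: float
    results: dict                # test section -> (status, measured value)
    verification_id: str

def run_verification(read_diagnostics) -> VerificationSnapshot:
    """Read current diagnostic values and freeze them under a unique ID."""
    results = read_diagnostics()  # e.g. sensor, front end, reference, I/O loop
    payload = json.dumps(results, sort_keys=True).encode()
    vid = hashlib.sha256(payload).hexdigest()[:16]  # content-derived identifier
    return VerificationSnapshot(time.time(), results, vid)

def report_lines(snap: VerificationSnapshot) -> list[str]:
    """Render a minimal, audit-style summary of the snapshot."""
    lines = [f"Verification {snap.verification_id}"]
    lines += [f"  {section}: {status}" for section, (status, _) in snap.results.items()]
    overall = "PASS" if all(s == "pass" for s, _ in snap.results.values()) else "FAIL"
    return lines + [f"  Overall: {overall}"]

# Stubbed diagnostics read, standing in for the instrument's firmware
snapshot = run_verification(lambda: {
    "sensor": ("pass", 0.02), "front end": ("pass", 0.01),
    "reference": ("pass", 0.00), "I/O loop": ("pass", 0.03),
})
print("\n".join(report_lines(snapshot)))
```

Deriving the identifier from the result contents makes any later alteration of the stored data detectable, which is the property an audit-grade verification record needs.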
The purpose of instrument verification is to provide a tamperproof verification document confirming the status of the device, similar to a calibration certificate. These qualitative verification results have the same value as a wet calibration and can be used in an equal manner to prove the device under test is still fit for the defined operation.
The main advantage of an embedded verification is that the instrument does not have to be removed from the process, and therefore the risks of damage due to the handling and cross contamination of the process loop are eliminated. Process interruption is also not usually required as the verification tests can all be performed in the background while the instrument is still performing its intended function.
Requirements for On-Board Verification
Calibrations and verifications have to be traceable to national or international standards to fulfill regulatory requirements. Wet calibrations achieve traceability by using calibration rigs or master meters accredited according to ISO 17025. A more complex situation presents itself for devices with built-in self-verification functionality. Integrated solutions have to rely on a network of redundant components and built-in traceable references.
The entire signal chain of the instrument has to be analyzed for possible errors and their subsequent impact on the system and its measuring accuracy. Typically, a Failure Modes, Effects, and Diagnostic Analysis (FMEDA) is used during the device design phase to identify critical components in the signal chain. The analysis starts at the process-wetted parts and proceeds through the electromechanical components, the amplifier board, the main electronics, and the outputs. A proper safety measure is then assigned to every critical path or component.
Measures include digital signal processing and continuous loop checks with the help of internal reference components. For an internal component to be used as a diagnostic reference, it has to fulfill special requirements such as factory traceability and exceptional long-term stability.
For the most critical circuits, independent and redundant components are implemented to reduce the possibility of an undetected drift. Using modern technology, it is possible to design instruments with a self-diagnostics coverage of 94 percent or higher (in accordance with IEC 61508) and low expected rate of undetected failures.
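In IEC 61508 terms, the diagnostic-coverage figure quoted above is the fraction of dangerous failures that the built-in diagnostics detect. A minimal illustration of the ratio, with invented failure rates:

```python
def diagnostic_coverage(lambda_dd: float, lambda_du: float) -> float:
    """IEC 61508 diagnostic coverage: rate of dangerous failures detected
    by the diagnostics, over the total dangerous failure rate."""
    return lambda_dd / (lambda_dd + lambda_du)

# Hypothetical failure rates in FIT (failures per 1e9 hours), for illustration
lambda_dd = 470.0   # dangerous failures detected by the diagnostics
lambda_du = 30.0    # dangerous failures that go undetected
dc = diagnostic_coverage(lambda_dd, lambda_du)
print(f"Diagnostic coverage: {dc:.0%}")  # 470 / 500 = 94%
```

The design goal follows directly from the formula: pushing coverage up means moving failures from the undetected to the detected column, which is exactly what the redundant reference components accomplish.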
Verification on the Go
The benefit of built-in verification is that it can be easily initiated locally or remotely from the control system, usually with no process interruption. A meter can be verified on a daily basis, drastically reducing the unknown period between calibrations. In batch applications, a system check can be initiated from the control system prior to starting the batch to ensure all devices work properly. Such a system check greatly reduces the risk for unplanned shutdowns due to instrument failures.
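A pre-batch system check like the one described can be sketched as a simple gate in the batch sequence. The device interface here is hypothetical; in practice the verification would be triggered through the control system’s communication protocol.

```python
# Hypothetical pre-batch gate: verify every instrument before releasing
# the batch. Each device entry maps a tag name to a callable that runs
# the device's self-verification and returns True on a pass.

def pre_batch_check(devices) -> bool:
    """Return True only if every device reports a passing self-verification."""
    failures = [tag for tag, verify in devices.items() if not verify()]
    if failures:
        print("Batch blocked; failed verification:", ", ".join(failures))
        return False
    return True

# Stubbed verification calls for two instrument tags (names are invented)
devices = {"FT-101": lambda: True, "TT-202": lambda: True}
if pre_batch_check(devices):
    print("All instruments verified; batch may start")
```

Because the check runs in the background with no process interruption, gating every batch on it costs essentially nothing while catching instrument faults before product is at risk.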
Built-in verification can also save a significant amount of maintenance time and reduce the need for unnecessary calibrations. Figure 3 shows two instruments (A and B). Instrument A (blue) has to be recalibrated every six months based on the manufacturer’s recommendation. Instrument B (orange) is equipped with an embedded diagnostics and verification system and is verified bimonthly by means of an automated diagnostics system. Due to the higher test coverage of the diagnostics system, Instrument B requires wet calibration only once every 2.5 years. Instrument B generates 80 percent savings on maintenance cost while achieving a significantly higher confidence level than Instrument A.
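The 80 percent figure follows from simple counting over the 2.5-year comparison window: five wet calibrations for Instrument A versus one for Instrument B.

```python
# Back-of-the-envelope check of the maintenance-savings figure in the text.
# Only the ratio of wet-calibration counts matters here.
months = 30                 # 2.5-year comparison window
cal_a = months // 6         # Instrument A: wet calibration every 6 months
cal_b = months // 30        # Instrument B: wet calibration every 2.5 years
savings = 1 - cal_b / cal_a
print(f"A: {cal_a} calibrations, B: {cal_b} -> {savings:.0%} fewer wet calibrations")
```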
Wet calibrations are still the most frequently used method to check an instrument and demonstrate regulatory compliance. State-of-the-art instruments with embedded verification capabilities offer the chance to improve upon this practice. Performing regular verification on the instrument can extend calibration cycles by a factor of five or higher without jeopardizing quality or regulatory compliance. Shorter unknown periods between checks lead to an increased confidence level and reduced risk for critical applications.