The physical properties of the water molecule present significant challenges for the on-line measurement of trace humidity levels in process applications. These same attributes pose even greater challenges when field verification and calibration of these measurement techniques is desired or mandated.
Over the past several decades, analyzer companies have introduced newer measurement technologies into the process marketplace that address the speed-of-response and accuracy requirements of these high-value applications. The same level of innovation has not yet occurred in calibration and verification techniques, which by their very nature must be more accurate than the measurement technique they validate.
Further, vendors of some of these measurement technologies claim that the calibration of their technique will not drift, removing the need for regular calibration or verification. Because there is no perfect implementation of physics in the real world, process operators and plant engineers nevertheless expect some mechanism to be available to validate the performance of these analyzers, despite the manufacturers' claims. This expectation is driven by two factors. The first is a general distrust of analyzers: too many process and ambient interferences exist that could prevent an analyzer from reporting a moisture upset in a timely and accurate manner. The second is that moisture plays too vital a role in these high-value processes. In some cases, a process cannot start until the moisture content falls below a certain level; in other cases, an intrusion of moisture may necessitate a process shutdown to prevent further moisture from entering the process, as the water molecule is a destructive contaminant.
This presentation investigates how a plant can implement and maintain proper field validation techniques that deliver the highest level of accuracy for on-line trace moisture measurement.
Frank Gerritse, GE Oil & Gas