Televac AN 3015: Recommended Practices for Vacuum Calibration

Description

This Televac® recommended practices document describes the best practices for minimizing uncertainty when calibrating thermal conductivity (Televac® 2A and 4A) and cold cathode (Televac® 7B, 7E, 7F, 7FC, 7FCS) vacuum gauges, where a gauge includes the vacuum sensor(s) and the accompanying electronics necessary to make a pressure measurement. It also describes the best practices for an in-process verification where practical limitations make it impossible to follow the best practices for minimizing uncertainty.

Verifying the accuracy and operation of vacuum gauges is critical to maintaining processes that run under vacuum.

Disclaimer

This document presents the best practices for minimizing uncertainty while using only the most basic equipment. It does not seek to define uncertainty, nor to provide an uncertainty assessment for field verification.

This document does not address systems where explosive or condensing gases are used.

Background

Thermal conductivity and cold cathode gauges have been in use since the first half of the 20th century. Currently, thermal conductivity gauges, such as the Pirani, thermocouple, and thermistor types, offer the most cost-effective solutions for pressure measurement in the low vacuum range of 1×10⁻³ Torr to 1×10³ Torr. Thermal conductivity gauges rely on the pressure-dependent process of heat loss from a heated filament (or multiple filaments) to make a measurement.

Cold cathode gauges, such as the Penning magnetron, inverted magnetron, and double inverted magnetron types, offer the most durable options for pressure measurement in the high vacuum range of 1×10⁻⁸ Torr to 1×10⁻³ Torr. Cold cathode gauges ionize the gas to sustain a plasma discharge in crossed electric and magnetic fields and relate the measured ion current to a pressure.

Definitions

The following definitions are taken from JCGM 200:2008, International vocabulary of metrology – Basic and general concepts and associated terms (VIM). Please see the References section for more information.

Reference Standard

Measurement standard designated for the calibration of other measurement standards for quantities of a given kind in a given organization or at a given location.

Verification

Provision of objective evidence that a given item fulfills specified requirements.

Calibration

Operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties and, in a second step, uses this information to establish a relation for obtaining a measurement result from an indication.

Ideal Calibration

An ideal calibration would seek to minimize all uncertainty in the reference standard used to calibrate a unit under test. Incoming data would be taken at regular intervals across the measurement range of the unit under test, or at the pressures where the unit under test is used to control a critical process as defined by the end user. This incoming data serves as a baseline for comparison. Following this comparison, adjustments may be made to the measurement system of the unit under test, such as a change to the electronics or replacement of sensors, to maximize the agreement between the unit under test and the reference standard. After any adjustments, a further comparison would be made and the data recorded as outgoing data.

Throughout this process, the goal is to minimize uncertainty in measurement while comparing the unit under test to the reference standard. This section will present areas where uncertainty is introduced to the measurement and the best practices to minimize their contributions.
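
As a rough illustration of how incoming and outgoing data might be recorded and compared, the following Python sketch computes the deviation of a unit under test from a reference standard at each comparison point. The pressure values, the percent-of-reading error metric, and the function names are illustrative assumptions, not part of this recommended practice.

    # Sketch: compare unit under test (UUT) readings to reference standard readings.
    # The comparison points and the percent-of-reading error metric are illustrative only.

    def percent_of_reading_error(reference_torr, uut_torr):
        """Deviation of the UUT from the reference, as a percent of the reference reading."""
        return 100.0 * (uut_torr - reference_torr) / reference_torr

    def compare(points):
        """points: list of (reference_torr, uut_torr) pairs taken at each comparison pressure."""
        for reference_torr, uut_torr in points:
            error = percent_of_reading_error(reference_torr, uut_torr)
            print(f"reference {reference_torr:.3e} Torr  UUT {uut_torr:.3e} Torr  error {error:+.1f}%")

    # Incoming (baseline) data, taken before any adjustment of the UUT.
    incoming = [(1.0e-3, 1.2e-3), (1.0e-2, 1.1e-2), (1.0e-1, 1.05e-1)]
    compare(incoming)

    # After adjusting the electronics or replacing a sensor, record outgoing data the same way.
    outgoing = [(1.0e-3, 1.0e-3), (1.0e-2, 1.0e-2), (1.0e-1, 1.0e-1)]
    compare(outgoing)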

Background Gas Contribution

Both thermal conductivity gauges and cold cathode gauges make indirect pressure measurements: they measure a property of the gas and relate that property to the pressure. These properties are gas-type dependent, and as a result the pressure measurement of each gauge is also gas-type dependent. It is therefore best to perform a comparison by the leak-up method, pumping below the desired pressure and leaking up to the desired pressure. It is best to pump down to 1% or less, as read by the reference standard, of the lowest pressure to be compared, so that the residual background gas is negligible compared with the admitted test gas.
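
As a minimal arithmetic sketch of that starting point (assuming the 1% guideline above and purely illustrative comparison pressures), the base pressure target before leaking up can be computed as follows:

    # Sketch: base pressure target for the leak-up method.
    # Pump to 1% or less of the lowest pressure to be compared, per the reference standard.
    comparison_points_torr = [1.0e-1, 1.0, 1.0e1, 1.0e2]  # illustrative comparison pressures

    base_pressure_target_torr = 0.01 * min(comparison_points_torr)
    print(f"Pump to {base_pressure_target_torr:.1e} Torr or below before leaking up.")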

Test Gas Admission

Due to the gas dependence of thermal conductivity gauges and cold cathode gauges, the test gas used during the comparison should be the gas that will be used during normal operation. Often this is dry nitrogen or argon, but it could be another gas. The purity of the gas should be 99% or higher if possible. The comparison gas should come from a dry source free of condensable vapors, since condensable vapors will raise the uncertainty due to the gas dependence of the gauges. It is important that the reference standard used for the comparison has been calibrated with the specific test gas or possesses compensation for it, either automatically through the electronics or via a reference table, to minimize uncertainty. Caution: dangerous overpressures can be reached when using a light gas (hydrogen or helium, for example) due to the difference in thermal conductivity.
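
Where compensation is applied via a reference table rather than automatically in the electronics, the correction might be applied roughly as in the sketch below. The argon factor shown is a placeholder only; real correction factors are gauge specific, generally vary with pressure, and must be taken from the gauge or reference standard documentation.

    # Sketch: applying a gas correction factor from a reference table.
    # The factor below is a placeholder, not a real calibration value; actual factors
    # are gauge specific and generally pressure dependent.
    GAS_CORRECTION_FACTORS = {
        "nitrogen": 1.0,  # gauge assumed calibrated for nitrogen
        "argon": 1.3,     # illustrative placeholder only
    }

    def corrected_pressure_torr(indicated_n2_equivalent_torr, gas):
        """Convert a nitrogen-equivalent indication to an estimate for the actual test gas."""
        return indicated_n2_equivalent_torr * GAS_CORRECTION_FACTORS[gas]

    print(corrected_pressure_torr(5.0e-2, "argon"))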

It is best to leak in the gas and dwell for approximately 15 seconds at each point to ensure a stable reading. For comparisons below 1 Torr, it is best to admit the gas while the system is still being pumped; this minimizes the effect of outgassing during the comparison and allows the pressure to be controlled more easily. For comparisons above 1 Torr, no pumping is necessary, since outgassing is small compared to the leaked gas load.

Warm Up Time and Ambient Temperature Effects

Thermal conductivity gauges depend heavily on the ambient temperature of the environment, and manufacturers often use a method of thermal compensation to minimize errors. Cold cathode gauges are also sensitive to temperature, but less so than thermal conductivity gauges. To minimize uncertainty, gauges should be at the ambient temperature specified by the manufacturer for maximum accuracy, generally between 20 °C and 25 °C. At minimum, the ambient temperature should be within the operating temperature range specified by the manufacturer.

To decrease uncertainty, the gauges should be allowed to reach a stable operating temperature before the comparison. This is often called a warm-up time. Many manufacturers specify a minimum warm-up time; if none is specified, 15 minutes is generally the minimum allowable time.

For cold cathode gauges, the warm-up time is also useful for conditioning the surfaces of the gauge, which may have water vapor or other vapors condensed on them from before pump down.

Allowing cold cathode gauges to operate at low pressure allows this conditioning to take place and decreases uncertainty. Manufacturers may have specific recommendations for this time; 15 minutes is often the minimum allowable time.
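
As a small illustration of these readiness conditions, the sketch below checks the warm-up time and ambient temperature before a comparison is started; the 15-minute and 20 °C to 25 °C values are the general guidelines above and should be replaced by manufacturer-specified values where they exist.

    # Sketch: readiness check before starting a comparison.
    # Limits follow the general guidance above; use manufacturer-specified values when available.
    MIN_WARM_UP_S = 15 * 60          # 15 minute warm-up
    AMBIENT_RANGE_C = (20.0, 25.0)   # preferred ambient temperature window

    def ready_for_comparison(elapsed_since_power_on_s, ambient_temp_c):
        warmed_up = elapsed_since_power_on_s >= MIN_WARM_UP_S
        temp_ok = AMBIENT_RANGE_C[0] <= ambient_temp_c <= AMBIENT_RANGE_C[1]
        return warmed_up and temp_ok

    print(ready_for_comparison(elapsed_since_power_on_s=20 * 60, ambient_temp_c=22.5))  # True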

Geometric Concerns and Pressure Gradients

Pressure can vary throughout a chamber based on the layout of the chamber, local outgassing, and location of valves, pumps, and gauges. Pressure measurements, particularly those made with thermal conductivity gauges, can be influenced by gas flows or sources of radiative heat. Pressure measurements made by cold cathode gauges can be influenced by ion sources in the chamber.

To minimize uncertainty in measurements, reference gauges and the units under test should be located symmetrically in the chamber and should not have a line of sight, within the chamber, to other gauges, mass spectrometers, heated filaments, inlet valves, or other similar features.

Also note the correct mounting orientation for both the reference standards and the units under test. While cold cathode gauges are not orientation sensitive, thermal conductivity gauge readings above 1 Torr can vary widely with orientation.

In Process Verification

While the descriptions in the Ideal Calibration section above present the ideal comparison, vacuum users may not be able to ensure that all, or even any, of those conditions are met. Even with limited control, the goal of minimizing uncertainty in measurement while comparing the unit under test to the reference standard remains paramount. The best practices for users with minimal leak control are therefore presented below.

Place References on Chamber with a Tee Cross

Reference gauges should be placed onto the chamber with the units under test using a tee cross piece. For verification, each reference gauge should be placed on the same tee as the unit under test with a sensor of the same type. For instance, if a cold cathode gauge and a thermal conductivity gauge are being used to verify another cold cathode gauge and another thermal conductivity gauge, the reference thermal conductivity gauge should be placed on the same tee as the unit under test thermal conductivity gauge, and likewise for the two cold cathode gauges. The orientation of each gauge should be identical, and the same types of fittings should be used if possible.

Pump Down and Warm Up

If feasible, it is best to mount the reference gauges on the chamber and pump down to the ultimate pressure of the chamber while operating the gauges. The reference gauges and units under test should be allowed to operate for 15 minutes after the ultimate pressure is reached. This conditions the sensors in the same gas species as the units under test, more closely approximates the conditions inside the gauges, and allows thermal equilibrium with the ambient temperature to be reached.

If this requires running a process that may contaminate the reference gauges, it is best to skip the pump down and simply proceed to the next step after operating the thermal conductivity gauges at atmosphere for 15 minutes instead.

Vent with Test Gas

After pumping down to the ultimate pressure of the chamber and warming up the reference gauges and units under test, vent the system to atmosphere with a dry and pure (99% or higher) test gas. This will limit the amount of vapors in the chamber and minimize gas composition uncertainties.

Pump Down and Dwell Time

After venting to atmosphere, begin pumping the chamber down to the verification point closest to atmospheric pressure. Stop pumping as close to the desired pressure as possible. If the point is above 0.1 Torr, the dwell time should be approximately 15 seconds so that a stable reading can be obtained. Below 0.1 Torr, chamber outgassing begins to become significant, so continued pumping is necessary. To achieve a stable measurement, it is best to measure once the ultimate pressure of the chamber with the roughing pump has been reached.

In high vacuum, below 1×10⁻³ Torr, the comparison is best made at the ultimate pressure of the chamber. The dwell time at the ultimate pressure should be at least 15 minutes to ensure that the readings are stable and the pressure in the chamber is no longer changing. Without a controlled leak, it is unlikely that any points other than the ultimate pressure can be compared.
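
One practical way to decide that a reading has stabilized during a dwell is to require that successive readings change by less than some fraction over a polling interval, as in the sketch below. The read_pressure_torr callable, the 1% threshold, and the timeout are assumptions for illustration, not requirements of this practice.

    import time

    # Sketch: wait for a stable gauge reading during a dwell period.
    # read_pressure_torr is a placeholder for whatever interface reads the gauge;
    # the 1% stability threshold and the timeout are illustrative, not requirements.
    def wait_for_stable_reading(read_pressure_torr, threshold=0.01, interval_s=5.0, timeout_s=900.0):
        last = read_pressure_torr()
        elapsed = 0.0
        while elapsed < timeout_s:
            time.sleep(interval_s)
            elapsed += interval_s
            current = read_pressure_torr()
            if abs(current - last) <= threshold * last:
                return current  # stable to within the threshold
            last = current
        return None  # did not stabilize within the timeout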

References

For further reading on vacuum gauges, calibration, and general definitions of terms, the following references provide a good introduction.

O'Hanlon, J. F. (2003). A User's Guide to Vacuum Technology, Third Edition. John Wiley & Sons, Inc., Hoboken, NJ.

JCGM 200:2008 International vocabulary of metrology — Basic and general concepts and associated terms (VIM)

JCGM 100:2008 Evaluation of measurement data — Guide to the expression of uncertainty in measurement

Ellefson, R. E. and Miller, A. P. (2000). Recommended practice for calibrating vacuum gauges of the thermal conductivity type. J. Vac. Sci. Technol. A 18, 2568.

Tilford, C. R. (1991). Pressure and vacuum measurements, in Physical Methods of Chemistry, Chap. 2, Vol. VI (B.W. Rossiter, J. F. Hamilton, and R. C. Baetzold, eds.), Interscience, New York.