What is adiabatic heating or cooling in leak testing applications and why is it significant?

During leak testing, when a part at atmospheric pressure and ambient temperature is pressurized, the incoming air compresses the residual air in the part, causing that air to heat. The resulting air, now warmer than the part being leak tested, then cools to the temperature of the part. This is usually the dominant thermal effect in the leak test. When a part at atmosphere is evacuated, the process is reversed: the expanding air cools and then warms back toward the part temperature.
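A minimal sketch (not from the source; values and function name are assumptions) of the ideal-gas estimate behind this effect: rapid, roughly adiabatic compression obeys T2 = T1 * (P2/P1)^((g-1)/g).

```python
GAMMA = 1.4  # ratio of specific heats for air (assumed ideal diatomic gas)

def adiabatic_temp_k(t1_k, p1_abs_kpa, p2_abs_kpa):
    """Air temperature after rapid (adiabatic) compression from p1 to p2."""
    return t1_k * (p2_abs_kpa / p1_abs_kpa) ** ((GAMMA - 1) / GAMMA)

# Pressurizing from atmosphere (~101.3 kPa abs) to about 5 psig (~135.8 kPa abs)
# at 20 C (293.15 K) momentarily heats the air well above ambient:
t2 = adiabatic_temp_k(293.15, 101.3, 135.8)  # roughly 319 K (~46 C)
```

In practice the compression is only partly adiabatic, so the real temperature rise is smaller, but the warmed air still has to cool back to the part temperature before the pressure reading stabilizes.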

How does entrapped air affect leak testing?

When the test chamber is closed, a quantity of ambient-temperature air is trapped in the fixture before the test part is pressurized. If the ambient air temperature is changing faster than the fixture temperature, the entrapped air can be warmer or cooler than the fixture. The changing temperature of this trapped air affects the leak measurement in the test volume. The effect is most pronounced in low-pressure systems (below 5 psig). Better leak test methods use volume filler blocks to reduce the entrapped volume, which shortens the thermal time constant of the system and reduces this effect.
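The time-constant argument can be sketched with a first-order (Newtonian) model; the model, values, and the assumption that the time constant scales with entrapped volume are illustrative, not from the source:

```python
import math

def entrapped_air_temp_k(t_air0_k, t_fixture_k, tau_s, elapsed_s):
    """Exponential approach of the trapped air toward the fixture temperature."""
    return t_fixture_k + (t_air0_k - t_fixture_k) * math.exp(-elapsed_s / tau_s)

# Air trapped 7 K warmer than the fixture; a filler block that halves the
# entrapped volume roughly halves the time constant (10 s -> 5 s), so the
# air settles toward the fixture temperature much sooner:
t_no_filler = entrapped_air_temp_k(300.0, 293.0, 10.0, 30.0)
t_filler = entrapped_air_temp_k(300.0, 293.0, 5.0, 30.0)
```

After the same 30 s stabilization period, the filler-block case has decayed through six time constants instead of three, leaving far less residual temperature offset to disturb the measurement.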

How does test part temperature affect leak testing?

During leak testing, the rate at which a part approaches ambient temperature is proportional to the difference between the part temperature and ambient temperature. When a cool part is pressurized, the test air cools toward the part temperature. As the part warms toward ambient, the test air temperature rises with it. This rise in air temperature increases the test pressure and creates a virtual leak that can mask the actual leak. The process is reversed for a part that is warmer than ambient. For downstream leak testing of a cool part in a sealed test chamber, the effect adds to the indicated leak, since the temperature rise causes a pressure rise in the belljar that appears as a virtual leak.
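A rough sketch of the virtual-leak magnitude, under stated assumptions (constant volume, ideal gas, first-order heating; the 60 s time constant and test pressure are hypothetical): at constant volume, dP/dt = (P/T) * dT/dt, and the Newtonian model gives dT/dt = (T_ambient - T_air) / tau.

```python
def virtual_leak_rate_kpa_per_s(p_abs_kpa, t_air_k, t_ambient_k, tau_s):
    """Pressure rise rate caused by test air warming toward ambient."""
    dT_dt = (t_ambient_k - t_air_k) / tau_s  # first-order heating rate, K/s
    return (p_abs_kpa / t_air_k) * dT_dt

# A part 5 K cooler than ambient, tested at 200 kPa absolute with a 60 s
# thermal time constant, shows a pressure *rise* that can mask a real leak:
rate = virtual_leak_rate_kpa_per_s(200.0, 288.0, 293.0, 60.0)  # ~0.058 kPa/s
```

The same function returns a negative rate for a part warmer than ambient, matching the reversed case described above.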

Why is temperature compensation required in some critical leak testing applications?

Pressure changes caused by temperature changes are the greatest source of error during leak testing. The pressure change caused by a slight change in temperature is often several times larger than the pressure change resulting from the leak.

Temperature effects can be introduced by changes in the test part temperature, the ambient temperature, or the temperature of the pressurizing air, or by a change in the heating or cooling rate of the test part relative to ambient. These effects can be reduced by stabilizing the ambient and pressurizing air temperatures during leak testing and by isolating the test part from external effects (such as drafts) that alter its cooling or heating rate. Calibration and the use of an appropriate reservoir can materially reduce these temperature effects.
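A back-of-envelope sketch (the pressure, temperature, and drift values are assumed for illustration) of why temperature dominates: at constant volume, Gay-Lussac's law gives dP ≈ P_abs * dT / T.

```python
def dp_from_dt_kpa(p_abs_kpa, t_k, dt_k):
    """Pressure change produced by a temperature change dt_k at constant volume."""
    return p_abs_kpa * dt_k / t_k

# Even a 0.1 K drift on a part at ~200 kPa absolute and ~293 K shifts the
# pressure by roughly 0.068 kPa (68 Pa) - often more than the pressure decay
# produced by a reject-level leak over a short test cycle:
dp = dp_from_dt_kpa(200.0, 293.15, 0.1)
```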

In some leak testing applications (large-volume, large-surface-area components such as automobile radiators), temperature compensation may be required to achieve the desired testing accuracy. Temperature compensation does not correct for temperature changes that are consistent with those encountered during calibration; rather, it corrects for temperature changes that enter the testing process after calibration.

Temperature compensation involves sensing the temperature of both the part under test and the ambient. The computer determines a correction factor proportional to the difference between the two temperatures. This factor, applied to the leak rate calculation, offsets the effect of the part temperature.
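The correction step above can be sketched as follows; the function name, sign convention, and coefficient k_comp are hypothetical, with k_comp determined empirically during calibration rather than derived here:

```python
def compensated_leak_rate(measured_rate, part_temp_k, ambient_temp_k, k_comp):
    """Apply a correction proportional to the part-ambient temperature difference."""
    return measured_rate - k_comp * (part_temp_k - ambient_temp_k)

# A part 2 K warmer than ambient, with an empirically found k_comp of
# 0.1 leak-rate units per kelvin:
corrected = compensated_leak_rate(1.0, 295.0, 293.0, 0.1)  # 0.8
```

When part and ambient temperatures match, the correction vanishes and the measured rate passes through unchanged, which is consistent with compensation acting only on temperature differences that arise after calibration.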