Are you seeing the same regression in the manufacturing engineering field that I am? Some examples are so stunning and confounding—I scratch my head and wonder how it began. Is it poor schooling? A desire for cost-cutting that throws caution and common sense to the winds? Senility?
Here’s a case in point that I’ve run across a few times now. Engineering 101 tells us that a leak is measured in standard cubic centimeters per minute, pascal cubic meters per second, milligrams per second, or some other well-defined and clearly measurable quantity.
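These units are interconvertible once reference conditions are fixed, which is part of what makes them well defined. As a sketch only (the helper function is hypothetical, not from this article), here is the arithmetic relating pascal cubic meters per second to sccm, assuming "standard" means 0 °C and 101,325 Pa; note that some industries reference 25 °C instead, which shifts the numbers slightly:

```python
STANDARD_PRESSURE_PA = 101_325.0  # 1 atm, the assumed reference pressure

def pam3s_to_sccm(q_pam3s: float) -> float:
    """Convert a gas throughput in Pa*m^3/s to standard cm^3/min (sccm)."""
    # 1 standard cm^3 of gas corresponds to 101325 Pa x 1e-6 m^3
    # = 0.101325 Pa*m^3 of pV-throughput.
    pa_m3_per_scc = STANDARD_PRESSURE_PA * 1e-6
    scc_per_second = q_pam3s / pa_m3_per_scc
    return scc_per_second * 60.0  # per second -> per minute

# 1 Pa*m^3/s works out to roughly 592 sccm at this reference condition.
print(round(pam3s_to_sccm(1.0)))  # -> 592
```

The conversion is a fixed factor precisely because the quantity being measured is unambiguous; contrast that with a hole diameter, which fixes no flow rate at all.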
Now, in a throwback to methods developed by can makers in a previous century, I’ve encountered some who contend that a leak can be defined by the diameter of a hole. This is done in the name of “convenience.”
My goodness, doesn’t precision matter? Isn’t it obvious that the same diameter hole will produce different leak rates depending on surface finish, moisture, contamination, ambient conditions such as pressure and temperature, the length-to-diameter ratio of the hole, and many other factors?
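The diameter objection can be made quantitative with a textbook model; this is a simplified illustration, not the magazine's or any vendor's calibration method. For laminar flow of a viscous fluid through a capillary-like hole, the Hagen–Poiseuille relation gives the volumetric flow as

$$ Q \;=\; \frac{\pi\, d^{4}\,\Delta P}{128\,\mu\,L}, $$

where $d$ is the hole diameter, $L$ its length, $\mu$ the fluid viscosity, and $\Delta P$ the pressure difference across it. Even in this idealized case, $Q$ depends on the length-to-diameter ratio, the viscosity (hence the gas and its temperature), and the pressure conditions; two holes of identical diameter can leak at very different rates, and real leaks, with rough walls, contamination, or moisture, deviate from the model further still.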
You can use crude measures such as hole diameter if, and only if, you check all measurements against a transfer standard such as InterTech’s Calmaster®. Even then, engineers who are serious about testing prefer to use Calmaster® and other traceable standards in conjunction with leak-testing methods whose accuracy ranges are clearly defined and proven adequate for real-world application requirements.
A hole diameter in and of itself is not, and cannot be, traceable to NIST or similar standards. Simply put, it does not permit accurate calibration of leak measurements.
Jacques Hoffmann is president of InterTech Development Co., which designs and builds equipment for leak testing, functional testing and automated assembly.