Pressure measurement is the determination of the force applied by a fluid on a surface. Pressure is typically expressed in units of force per unit of surface area. Several techniques have been developed for the measurement of pressure and vacuum.
However, while we appreciate the importance of measurement, we must also understand that accuracy should not be compromised at any level.
In particular, when reading a complex quantity like pressure, you need to be especially cautious about accuracy.
However much care you take, certain external factors can still affect the accuracy of pressure measurements. In this article, we will review these factors and look at ways to minimize their effect.
How Is Pressure Measured?
Pressure is generally measured within a system such as an air-conditioning system, a ventilation duct, or another closed internal environment. Pressure-measuring instruments are installed at different locations to measure and display pressure in a suitable unit; they go by several names, including pressure meters, pressure gauges, and vacuum gauges.
A manometer, another such device, measures and displays pressure using the weight of a column of liquid: the pressure supported is proportional to the liquid's density and the height of the column.
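The liquid-column relation behind a manometer can be sketched as a short calculation. This is an illustrative example, not from the article; the function name and constants are my own, and the relation used is the standard hydrostatic formula P = ρgh.

```python
# Hydrostatic relation a manometer relies on: P = rho * g * h.

RHO_MERCURY = 13_595.1  # density of mercury near 0 °C, kg/m^3
G = 9.80665             # standard gravity, m/s^2

def manometer_pressure(column_height_m: float, density: float = RHO_MERCURY) -> float:
    """Pressure (Pa) indicated by a liquid column of the given height."""
    return density * G * column_height_m

# A 760 mm mercury column corresponds to about one standard atmosphere:
print(round(manometer_pressure(0.760)))  # ≈ 101325 Pa
```

Note that the column's surface area cancels out: a wider tube holds a heavier column over a proportionally larger area, so only density and height matter.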
Nowadays, a wide range of electronic pressure-measuring devices is available, many of them offering excellent accuracy.
Even with the utmost care, a few factors can still affect the accuracy of the readings.
5 Key Factors Affecting The Accuracy Of Pressure Measurement
- The Type Of Gauge Employed For Vacuum Measurement And The Type Of Gas Being Measured
Two fundamental types of pressure gauges are used for vacuum measurement: ionization gauges and thermal gauges. Both measure pressure indirectly; the difference lies in how they sense the residual gas molecules.
Ionization gauges are used for low-pressure measurement where extreme accuracy is less critical. They work by ionizing gas molecules; the resulting ions are driven onto a detector, which measures the current produced by the molecular impacts.
On the other hand, there are several types of thermal gauges, all working on the principle that gas molecules coming into contact with a hot surface transfer energy from the surface to the gas.
The rate of energy loss depends on how frequently gas molecules strike the hot surface, and that collision rate is set by the pressure of the gas, so the measured heat loss indicates the pressure.
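The link between pressure and the rate at which molecules strike a surface can be illustrated with the standard kinetic-theory impingement formula, Φ = P / √(2πmk_BT). This is a sketch I am adding for illustration; the function name is my own, and the formula is textbook kinetic theory rather than the calibration of any particular gauge.

```python
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23      # Avogadro constant, 1/mol

def impingement_rate(pressure_pa: float, temp_k: float, molar_mass_kg: float) -> float:
    """Molecular flux onto a surface (molecules per m^2 per second),
    from kinetic theory: Phi = P / sqrt(2 * pi * m * kB * T)."""
    m = molar_mass_kg / N_A  # mass of a single molecule
    return pressure_pa / math.sqrt(2 * math.pi * m * K_B * temp_k)

# Halving the pressure halves the rate at which molecules strike the
# hot element, and with it the heat they carry away:
r1 = impingement_rate(1.0, 300.0, 0.028)  # nitrogen at 1 Pa
r2 = impingement_rate(0.5, 300.0, 0.028)  # nitrogen at 0.5 Pa
print(r2 / r1)  # half of r1
```

The linear dependence of flux on pressure is exactly what lets a thermal gauge translate heat loss into a pressure reading.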
The selection of a vacuum gauge depends on a basic understanding of the principle by which each gauge works, the range of pressures it can measure, and the accuracy needed over that range.
- The Temperature Of The Surrounding Environment

Temperature has always been a crucial factor affecting pressure measurements. Usually, molecules with higher mass require larger correction factors. At a given pressure, cold air is denser than warm air, and for a fixed quantity of gas in a closed volume, pressure falls as the gas cools and rises as it warms.
In thermal-transfer gauges, this shows up as a dependence on the gas species: lighter molecules such as helium and hydrogen conduct heat away from the sensing element more readily than heavier ones, so the gauge reading must be corrected for the gas being measured.
Also, although pressure gauges are designed for particular temperature ranges, extreme temperatures can still distort the reading and produce false results.
Changes in the surrounding temperature affect the accuracy of pressure gauges in several ways.
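One common temperature effect can be compensated directly: for a fixed quantity of gas in a closed volume, pressure scales with absolute temperature. The sketch below, using names of my own choosing, applies that ideal-gas correction to refer a reading back to a reference temperature.

```python
def temperature_corrected_pressure(reading_pa: float,
                                   gauge_temp_k: float,
                                   ref_temp_k: float) -> float:
    """For a fixed volume of ideal gas, pressure scales with absolute
    temperature (Gay-Lussac's law): P_ref = P_reading * T_ref / T_gauge."""
    return reading_pa * ref_temp_k / gauge_temp_k

# A reading taken at 35 °C (308.15 K), referred back to a 20 °C reference:
print(temperature_corrected_pressure(101325.0, 308.15, 293.15))
```

This only covers the gas-law part of the temperature effect; sensor drift and material expansion in the gauge itself need separate compensation or calibration.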
- Instrument Calibration

The accuracy of a pressure gauge depends on a key factor: instrument calibration. Usually, when pressure gauges are manufactured (or any measuring device, for that matter), the calibration is not guaranteed, and no correction factor has yet been applied.
The resulting error can range from 20% to 50%. The only way to reduce it is to calibrate the gauge regularly and apply a consistent correction.
Proper and timely calibration against a reference standard can help you achieve the higher level of accuracy that demanding applications require.
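A simple form such a correction can take is a two-point (zero and span) linear fit through readings taken against a reference standard. The example below is a hypothetical sketch with invented numbers, not a procedure from the article.

```python
from typing import Callable

def make_correction(read_low: float, ref_low: float,
                    read_high: float, ref_high: float) -> Callable[[float], float]:
    """Return a function mapping raw gauge readings to corrected values,
    using a straight line through two reference comparison points."""
    slope = (ref_high - ref_low) / (read_high - read_low)
    offset = ref_low - slope * read_low
    return lambda reading: slope * reading + offset

# Hypothetical gauge that reads 2 kPa low at zero and 3 kPa high at
# 500 kPa full scale, compared against a reference standard:
correct = make_correction(-2.0, 0.0, 503.0, 500.0)
print(correct(250.0))  # corrected mid-range reading
```

A two-point fit removes offset and span error; gauges with nonlinear error need more calibration points or a higher-order fit.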
The use of digital gauges and differential-pressure measuring instruments makes operation and inspection much easier.