Force sensors are part of the standard equipment in almost every laboratory that deals with mechanical systems. Force transducers are also installed in material testing machines and other test rigs to record both the input signal for control and the force signal that is later used to evaluate the results.
Strain gauge sensors consist of a spring element, on which the strain gauges (SG) are installed. When a force is applied, deformation occurs, which the strain gauges convert to a change in resistance. It is quite common to use four strain gauges wired to make a bridge circuit, which then converts the change in resistance to a measurable electric voltage. An operating voltage (excitation voltage) must always be applied to the strain gauge sensors.
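As a rough numerical sketch, the output of an ideal full bridge with four active gauges follows V_out = V_exc · k · ε. The gauge factor, excitation voltage, and strain below are typical illustrative values, not data for any specific sensor:

```python
# Sketch: output of an ideal full Wheatstone bridge with four active
# strain gauges (two in tension, two in compression).
# All numbers are illustrative assumptions.

GAUGE_FACTOR = 2.0   # k, typical for metal-foil strain gauges
EXCITATION_V = 5.0   # excitation (operating) voltage in volts
STRAIN = 1e-3        # epsilon: 1000 um/m at nominal force (assumed)

# Each gauge changes resistance by dR/R = k * epsilon.
dR_over_R = GAUGE_FACTOR * STRAIN

# For an ideal full bridge (arms +e, -e, +e, -e) the output is
# V_out = V_exc * k * epsilon.
v_out = EXCITATION_V * dR_over_R

print(f"dR/R  = {dR_over_R:.4f}")       # 0.0020
print(f"V_out = {v_out * 1000:.1f} mV") # 10.0 mV, i.e. 2 mV/V sensitivity
```

The result is conventionally quoted as a ratiometric sensitivity in mV/V, which is why the excitation voltage must be known and stable.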
The construction of piezoelectric sensors is based on crystals that give off a charge when influenced by a force. There is a linear relationship between the charge and the force. The charge is converted into a voltage signal by relevant electronics.
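This linear charge-force relationship, and its conversion to a voltage by a charge amplifier, can be sketched as follows. The charge sensitivity and feedback capacitance are illustrative assumptions:

```python
# Sketch: linear charge output of a piezoelectric force sensor and its
# conversion to a voltage by a charge amplifier.
# Sensitivity and feedback capacitance are illustrative assumptions.

SENSITIVITY_PC_PER_N = 4.0  # charge sensitivity in pC/N (assumed)
C_FEEDBACK_NF = 10.0        # charge-amplifier feedback capacitance in nF

def output_voltage(force_n):
    """Voltage at the charge-amplifier output for a force in newtons."""
    charge_pc = SENSITIVITY_PC_PER_N * force_n           # Q = d * F (linear)
    return (charge_pc * 1e-12) / (C_FEEDBACK_NF * 1e-9)  # V = Q / C_f

for f in (100.0, 500.0, 1000.0):
    print(f"{f:7.1f} N -> {output_voltage(f) * 1000:6.1f} mV")
```

Doubling the force doubles the charge and hence the output voltage, which is the linearity referred to above.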
Force transducers are almost always supplied with a document stating the sensitivity of the sensor. At the factory, the sensor was exposed to a known force and its output signal at this force was measured. The result of this ‘calibration’ process is recorded in the relevant document. Depending on the quality and price of the sensor, either the output value at maximum force or several measured values at different forces are specified. Calibration is one of the most important steps in the production of a sensor and largely determines the value of the force transducer: a later measurement can never be more accurate than the calibration used to establish the characteristic values of the sensor.
Force transducer manufacturers use relevant loading devices to precisely generate this force, and thus create the prerequisite for these calibration measurements. HBM has loading machines available for forces between 10 N and 5 MN.
Is a Newton always a Newton?
Wednesday, 31 March 2021 | 15:00 Central European Time | 02:00 PM Eastern Time
In the webinar, HBK Product Manager Thomas Kleckers or Field Sales Engineer Chris Novak will discuss force measurement calibration, traceability, and how to meet the relevant legal requirements.
The loading devices
Loading machine requirements are extremely strict. The machines at HBM, for example, offer accuracies of up to 0.005%, relative to the particular force value. This figure is not based on the maximum force the machine can generate, but on whichever force the machine is actually applying. So at a force of 1000 N, a deviation of 0.05 N is acceptable if the machine delivers an accuracy of 0.005%. Loading devices differ according to their operating principle.
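Because the accuracy refers to the applied force rather than the machine's maximum, the permissible deviation scales with the load. A minimal sketch of the arithmetic:

```python
# Sketch: permissible deviation of a loading machine whose accuracy is
# specified relative to the applied force (not the maximum force).

ACCURACY = 0.005 / 100.0  # 0.005 %, as for the machines mentioned above

def permissible_deviation(applied_force_n):
    """Maximum acceptable deviation in N at a given applied force."""
    return applied_force_n * ACCURACY

print(permissible_deviation(1000.0))     # 0.05 N, the example in the text
print(permissible_deviation(240_000.0))  # 12.0 N at a 240 kN load
```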
Smaller forces are generated by means of weights
Weights are frequently used for smaller forces, as their mass is accurately known and the force sensors to be tested can be loaded with them. These machines are called dead load machines. The effect of these masses depends, of course, on the gravitational acceleration at the particular location. Added to this is the buoyancy experienced by the masses in the atmosphere; this buoyancy force depends on the density of the mass units and on the density of the surrounding air. HBM has a dead load machine with a maximum test force of 240 kN; the German National Metrology Institute (PTB) even has access to a dead load machine with a test force of 2 MN.
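The force generated by a dead load machine thus depends on the local gravitational acceleration and on air buoyancy. A minimal sketch, with assumed values for local gravity and the densities involved:

```python
# Sketch: force generated by a dead-weight stack, corrected for local
# gravitational acceleration and for air buoyancy.
# The numerical values are typical assumptions, not data from the article.

def dead_weight_force(mass_kg,
                      g_local=9.81260,   # local g in m/s^2 (assumed)
                      rho_air=1.2,       # air density in kg/m^3 (assumed)
                      rho_mass=8000.0):  # stainless steel, kg/m^3 (assumed)
    """Force in N: F = m * g * (1 - rho_air / rho_mass)."""
    buoyancy_factor = 1.0 - rho_air / rho_mass
    return mass_kg * g_local * buoyancy_factor

f = dead_weight_force(1000.0)  # a 1 t mass stack
print(f"{f:.2f} N")            # slightly less than m*g because of buoyancy
```

The buoyancy correction is small (about 0.015% for steel masses in air) but far from negligible at the 0.005% accuracy level discussed above.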
The geometry of these loading machines must never change under load, as otherwise the force would no longer be applied evenly to the sensor. This would cause significant measurement errors, even if the changes are not immediately visible. The masses consist of materials that are chemically very inert. Nevertheless, it is essential to make sure that changes cannot result from reactions with the environment (e.g. oxidation). Regular checks also ensure that the function of the machine is not negatively affected by incorrect maintenance or a lack of cleanliness (e.g. dust deposits on the masses).
Other loading devices generate the force hydraulically. This is particularly useful if the force has to be extremely large, as it would otherwise be necessary to have gigantic masses available on site. An extremely precise force transducer is connected between the equipment under test and the hydraulics. This precision force transducer is the reference which is used to establish the force being exerted on the equipment under test. So it is important to regularly check whether the integrated reference sensor is still working correctly.
The original kilogram
All force machines ultimately take their reference from the original kilogram, which is stored at the BIPM (Bureau International des Poids et Mesures) in Paris. Various copies of this unit of mass exist, so all masses are ultimately traced back to an artifact. It is a demanding technical challenge to draw conclusions about a mass stack with a total of 200 t from a single 1 kg mass.
It is important for measurement results to be reproducible: This is the only way to ensure that a measurement result established at a certain location can also be used at another location (within the uncertainty with which the result was established). This means that force sensors must be calibrated in the same way everywhere. A specific force must also always be the same in an international comparison. This results in the requirement for all machines to be regularly checked, i.e. compared with a measurement standard which is assumed to be accurate.
National metrology institutes are responsible for ensuring that a given force is the same everywhere; they define this force and thus provide the prerequisite for comparability. In Germany, this is the German National Metrology Institute (PTB). These facilities operate the loading machines from which all others take their reference.
To ensure that a newton is always a newton all over the world, the institutes also compare themselves with one another. In principle, the reference sensors and mass stacks could be removed from the accurate loading machines and installed at the national institutes, so that they could be compared directly with the national measurement standards. In practice, this is avoided, as removing and installing highly sensitive measurement technology introduces uncertainty, and transporting it involves risks.
This is why so-called transfer measurement devices are used. These force sensors are designed specifically for maximum measurement repeatability, even when they are removed and reinstalled. Repeatability better than 0.002% is possible, so that even extremely precise loading devices can be connected to the national standard. A loading machine is connected to the national standard by comparison measurements between the calibration machine under test and the national measurement standard, in Germany, for example, at the German National Metrology Institute (PTB). International comparisons of the national standards themselves are also carried out at the highest level.
Force sensors that meet such high standards are based on strain gauge technology, for two reasons. First, strain gauge sensors use a Wheatstone bridge circuit: skillful installation of the strain gauges ensures that many parasitic effects, such as temperature, applied bending moments, or lateral forces, are largely compensated. Second, strain gauge sensors are ideal for the static measurements common in calibration, as they show no drift and demonstrate long-term stability.
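The compensation of parasitic effects by the bridge circuit can be sketched numerically: a change that acts equally on all four arms (such as a uniform temperature change) cancels out, while a force-induced change of opposite sign in adjacent arms does not. The resistance values below are illustrative assumptions:

```python
# Sketch: a Wheatstone bridge suppresses effects that act equally on all
# four gauges, e.g. a uniform temperature change. Values are illustrative.

def bridge_output(r1, r2, r3, r4, v_exc=5.0):
    """Output voltage of a Wheatstone bridge with arm resistances in ohms."""
    return v_exc * (r1 / (r1 + r2) - r4 / (r3 + r4))

R = 350.0  # nominal gauge resistance (typical value)
print(bridge_output(R, R, R, R))  # balanced bridge -> 0 V

# A uniform temperature-induced change of +0.5 ohm on every arm
# leaves the bridge balanced:
dT = 0.5
print(bridge_output(R + dT, R + dT, R + dT, R + dT))  # still 0 V

# A force-induced change (+dR on two arms, -dR on the other two)
# does produce a signal:
dR = 0.7
print(bridge_output(R + dR, R - dR, R + dR, R - dR))
```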
The characteristics of these sensors are evaluated according to the international standard ISO 376. In this standard, the repeatability of the sensors is an important test point. The standard also specifies limits for creep, for hysteresis, and for the deviation of a sensor from its specified characteristic curve.
For ease of orientation, force transducers are divided into accuracy classes in accordance with ISO 376, with class 00 imposing the most stringent requirements. For class 00, the repeatability after dismounting and remounting (“in varying mounting positions”) may be at most 0.05% of the measured value. Even class 00 sensors are therefore not adequate for connecting accurate loading machines to a national measurement standard.
HBM transfer measurement devices: depending on their type, the sensors demonstrate repeatability in varying mounting positions of between 0.002% and 0.05%. These transfer measurement devices can be used to connect forces between 2.5 N and 5 MN to a measurement standard.
Top-class sensors are not made, they are discovered
This is why so-called “top-class” sensors are made available. The precision of these sensors cannot readily be achieved by deliberate production; instead, manufacturers screen their ongoing production for suitable examples. These then undergo extensive testing until their special aptitude can be guaranteed.
At first glance, the strict requirements may seem excessive, but you have to consider that precision is lost with every connecting measurement. The machines of an accredited calibration laboratory are first connected to the national measurement standard. On these loading machines, transfer sensors are then used to connect the machines at the next level in turn. For calibrations performed in the field to be sufficiently accurate, each connection must take place with the maximum possible accuracy. There is a noticeable general trend towards more stringent accuracy requirements for force sensors in production and testing, which will ultimately place increasing demands on the calibration of force transducers in future.
In calibration terms, “connecting” is understood to mean comparison with a higher measurement standard. So the calibration machines of the measurement technology manufacturers are connected to the national measurement standard of the German National Metrology Institute (PTB) by comparison measurements.
Standard and accuracy class
The ISO 376 standard is the pertinent international standard for the calibration of reference force transducers. The sensor being tested is installed in and removed from the loading machine three times, and each time it is removed, the sensor is rotated in the machine by 120 degrees. At least eight upward forces are used for loading, as well as downward forces in two mounting positions. The results for each force are compared with one another, so that the “repeatability in varying mounting positions” can be calculated according to a given formula. The repeatability in a mounting position is also defined, as is the hysteresis error (difference between the series of tests with increasing force and the series of tests with decreasing force for a certain force), the creep (change in the output signal over time at a constant load) and the deviation from the fitting curve (deviation of the sensor’s actual characteristic curve from the characteristic curve specified by the formula). The precision of the loading machine on which the calibration is performed is also considered.
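The “repeatability in varying mounting positions” is essentially the spread of the readings from the different mounting positions relative to their mean. The following is a simplified sketch, not the exact formula of the standard (which should be consulted for real evaluations); the readings are invented:

```python
# Sketch: repeatability in varying mounting positions, expressed as the
# spread of readings relative to their mean. Simplified form; the exact
# formulas are defined in ISO 376. Readings are invented.

def reproducibility_percent(readings):
    """Relative spread (%) of readings from different mounting positions."""
    mean = sum(readings) / len(readings)
    return (max(readings) - min(readings)) / mean * 100.0

# Output signal in mV/V at one test force, with the sensor rotated by
# 120 degrees between the three series (illustrative numbers):
readings = [2.00010, 2.00004, 2.00013]
print(f"{reproducibility_percent(readings):.4f} %")
```

A spread of a few parts in 100 000, as in this example, is the level a transfer standard must reach.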
All these parameters are calculated in a calibration certificate according to ISO 376, so that the uncertainty of the sensor is shown at the different forces. As per the standard, the certificate also indicates the accuracy class achieved across all the characteristics. If the sensor fails to meet the limit for a class in just one of the characteristics, it is assigned to the next lower class overall.
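This “worst characteristic decides” rule can be sketched as follows. The limit values in the table are illustrative placeholders, not the official ISO 376 limits:

```python
# Sketch: assigning an accuracy class as the best class whose limits ALL
# characteristics satisfy. The limit values are illustrative placeholders,
# not the official ISO 376 tables.

CLASS_LIMITS = {  # characteristic -> {class: max permissible value in %}
    "reproducibility": {"00": 0.050, "0.5": 0.10, "1": 0.20, "2": 0.40},
    "hysteresis":      {"00": 0.070, "0.5": 0.15, "1": 0.30, "2": 0.50},
    "creep":           {"00": 0.025, "0.5": 0.05, "1": 0.10, "2": 0.20},
}
CLASS_ORDER = ["00", "0.5", "1", "2"]  # most to least stringent

def assign_class(results):
    """Return the best class whose limits every characteristic satisfies."""
    for cls in CLASS_ORDER:
        if all(results[name] <= limits[cls]
               for name, limits in CLASS_LIMITS.items()):
            return cls
    return None  # outside all classes

# One characteristic slightly too poor for class 00 drags the sensor
# down to class 0.5 overall:
print(assign_class({"reproducibility": 0.06, "hysteresis": 0.05, "creep": 0.01}))
```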