## Temperature effect on the sensitivity

The temperature effect on the sensitivity is the variation in the actual output signal due to a 10 K change in temperature, determined at nominal torque and related to the sensitivity. The specified value is the maximum occurring in the nominal temperature range.

The temperature effect on the sensitivity (also called the temperature coefficient of sensitivity) is a measure of the temperature effect on the output signal with a load applied to the transducer. When determining this value, the output signal has to be corrected by subtracting the initial torque signal at the respective temperature. A stationary temperature state has to be established.

The relevant temperature is the transducer temperature. A stationary temperature state as defined at HBM means that the maximum temperature variation in a 15-minute period does not exceed 0.1 K. The amount of the deviation is given as a percentage of the actual span of the output signal with the respective torque applied (in the event of loading with the nominal torque this is the sensitivity).

The temperature effect on sensitivity results in a change in the slope of the characteristic curve (see Fig. 2). It is of particular importance when a transducer is operated at a temperature differing significantly from the reference temperature. For partial load ranges, however, it has very little effect, because the resulting deviation always acts as a percentage of the actual output signal span.

Please note that normally the temperature effect on sensitivity and the temperature effect on the zero point (TK0) are superimposed on each other.

### Example:

Consider a torque transducer with 1 kN⋅m nominal torque, let the temperature effect on the sensitivity be specified as TKC ≤ 0.1 %, the reference temperature as 23 °C and the nominal temperature range from +10 °C to +60 °C.

If the transducer is operated at a temperature of 33 °C (or 13 °C), the sensitivity deviation due to the temperature variation may amount to up to 0.1 %.

For a torque of 1 kN⋅m (nominal torque) this amounts to a deviation in the displayed value of 1 N⋅m. For a torque of 200 N⋅m, however, the deviation amounts to only 0.2 N⋅m, since TKC is always a percentage deviation referring to the actual output signal span. This is because the sensitivity is defined as the slope of the characteristic curve. Using the same transducer at 43 °C (20 K above the reference temperature) may result in a maximum deviation of up to 0.2 % in the worst case. This does not apply to usage at 3 °C, since this temperature is not within the nominal temperature range.
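The arithmetic of this example can be sketched in Python. This is an illustrative helper, not part of any transducer library; it assumes the worst-case reading of the specification, i.e. that the maximum deviation scales linearly with each full 10 K step away from the reference temperature and with the applied torque:

```python
def tkc_deviation(torque, tkc_percent, temp, ref_temp=23.0,
                  temp_range=(10.0, 60.0)):
    """Worst-case output deviation (in the same unit as torque, e.g. N·m)
    due to the temperature effect on sensitivity (TKC).

    The deviation is a percentage of the actual output signal span,
    so it scales with the applied torque, and it is specified per
    10 K change in temperature.
    """
    t_min, t_max = temp_range
    if not t_min <= temp <= t_max:
        raise ValueError("temperature outside the nominal temperature range")
    steps = abs(temp - ref_temp) / 10.0  # number of 10 K intervals
    return torque * (tkc_percent / 100.0) * steps


# Example from the text: 1 kN·m nominal torque, TKC <= 0.1 %
print(tkc_deviation(1000.0, 0.1, 33.0))  # nominal torque at 33 °C
print(tkc_deviation(200.0, 0.1, 33.0))   # partial load at 33 °C
print(tkc_deviation(1000.0, 0.1, 43.0))  # 20 K above reference
```

Note how the deviation at 200 N⋅m is five times smaller than at nominal torque, while moving a further 10 K from the reference temperature doubles it, matching the figures in the example above.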

## Temperature effect on the zero signal

The temperature effect on the zero signal is the variation, due to a 10 K change in temperature, in the unloaded transducer’s output signal related to the nominal sensitivity. The specified value is the maximum occurring in the nominal temperature range.

The temperature effect on the zero signal (also called the temperature coefficient of the zero signal) is determined by measuring the variation, due to a 10 K change in temperature, in the unloaded transducer’s actual output signal at zero torque after re-establishment of a stationary temperature state. The relevant temperature is the transducer temperature. A stationary temperature state as defined at HBM means that the maximum temperature variation in a 15-minute period does not exceed 0.1 K.

Fig. 2: Temperature effect on the sensitivity TKC and on the zero point TK0.

The temperature effect on the zero signal results in a parallel shift in the characteristic curve (see Fig. 2). It is of particular importance when a transducer is operated at a temperature differing significantly from the reference temperature. By taring or zero balancing at operating temperature, the measurement error due to the temperature effect on the zero signal can be eliminated.

Please note that normally the temperature effect on the zero point and the temperature effect on the sensitivity (TKC) are superimposed on each other.

### Example:

Consider a torque transducer with 1 kN⋅m nominal torque, let the temperature effect on the zero signal be specified as TK0 ≤ 0.05 %, the reference temperature as 23 °C and the nominal temperature range from +10 °C to +60 °C.

If the transducer is operated at a temperature of 33 °C (or 13 °C), the zero signal deviation may amount to up to 0.05 % of the nominal sensitivity. This corresponds to a deviation in the displayed value of 0.5 N⋅m. This deviation is independent of the torque with which the transducer is loaded.
Using the transducer at 43 °C may result in a maximum deviation of up to 0.1 % in the worst case.

This does not apply to usage at 3 °C, since this temperature is not within the nominal temperature range.
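The TK0 arithmetic can be sketched in the same way as for TKC. Again this is only an illustrative helper assuming worst-case linear scaling per 10 K step; the key difference from TKC is that TK0 refers to the nominal sensitivity, so the resulting deviation is independent of the applied torque:

```python
def tk0_deviation(nominal_torque, tk0_percent, temp, ref_temp=23.0,
                  temp_range=(10.0, 60.0)):
    """Worst-case zero-signal deviation (in the same unit as
    nominal_torque, e.g. N·m) due to the temperature effect on the
    zero signal (TK0).

    TK0 is related to the nominal sensitivity, so the deviation is
    a fixed amount regardless of the applied torque.
    """
    t_min, t_max = temp_range
    if not t_min <= temp <= t_max:
        raise ValueError("temperature outside the nominal temperature range")
    steps = abs(temp - ref_temp) / 10.0  # number of 10 K intervals
    return nominal_torque * (tk0_percent / 100.0) * steps


# Example from the text: 1 kN·m nominal torque, TK0 <= 0.05 %
print(tk0_deviation(1000.0, 0.05, 33.0))  # 10 K from reference
print(tk0_deviation(1000.0, 0.05, 43.0))  # 20 K from reference
```

In practice the TKC and TK0 contributions are superimposed, so a worst-case error estimate would add the two deviations; zero balancing at operating temperature removes only the TK0 part.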