All objects at a temperature above absolute zero emit radiation. The amount of radiation varies with wavelength, object temperature and surface emissivity. Hotter objects emit more radiation, and their emission peaks at shorter wavelengths.
In 1900, Max Planck derived the equations describing the spectral radiance of a blackbody; they remain the basis of infrared thermometry today.
A perfect 'blackbody' radiates all its energy according to Planck's law and is said to have emissivity ε = 1. Real surfaces emit a reduced amount of radiation and have an emissivity between 0 and 1.
From Planck's law, an object at temperature T (in kelvin) with surface emissivity ε emits spectral radiance R at wavelength λ (in µm):

$$R(\lambda, T) = \frac{\varepsilon\, c_1}{\lambda^5\left(e^{c_2/(\lambda T)} - 1\right)}$$

where c₁ ≈ 1.191 × 10⁸ W µm⁴ m⁻² sr⁻¹ and c₂ ≈ 1.439 × 10⁴ µm K are the first and second radiation constants.
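As a minimal Python sketch of this formula (the function name `spectral_radiance` and the constant names `C1`/`C2` are illustrative, not taken from the original text), assuming λ in µm, T in kelvin and radiance in W·m⁻²·sr⁻¹·µm⁻¹:

```python
import math

# First and second radiation constants for spectral radiance with wavelength
# in micrometres: c1 = 2*h*c^2, c2 = h*c/k (approximate values).
C1 = 1.191042e8   # W·µm^4·m^-2·sr^-1
C2 = 1.438777e4   # µm·K

def spectral_radiance(wavelength_um: float, temp_k: float, emissivity: float = 1.0) -> float:
    """Spectral radiance (W·m^-2·sr^-1·µm^-1) emitted by a surface at temp_k kelvin."""
    return emissivity * C1 / (wavelength_um**5 * (math.exp(C2 / (wavelength_um * temp_k)) - 1.0))
```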
Infrared thermometers and thermal imagers measure the radiation they receive over a specific band of wavelengths and apply Planck's law in reverse to calculate the temperature of the object.
Low-emissivity surfaces are reflective (for an opaque surface, emissivity = 1 − reflectivity), so an infrared measurement device receives reflected radiation from background objects as well as radiation emitted by the target.
Applied to this more realistic situation, the radiance reaching the instrument becomes the sum of the target's emission and the reflected background:

$$R_{\mathrm{measured}} = \varepsilon\, B(\lambda, T_{\mathrm{target}}) + (1 - \varepsilon)\, B(\lambda, T_{\mathrm{background}})$$

where B(λ, T) is the blackbody radiance given by Planck's law above.
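Continuing the sketch above (reusing `math`, `C1`, `C2` and `spectral_radiance`), the single-wavelength model can be inverted in closed form to recover the target temperature from the radiance the instrument receives; `target_temperature` is again an illustrative name, not a vendor API:

```python
def target_temperature(measured_radiance: float, wavelength_um: float,
                       emissivity: float, background_k: float) -> float:
    """Solve the two-component model for the target temperature (K):
    subtract the reflected background term, then invert Planck's law."""
    # Blackbody radiance that the target itself must be producing
    target_term = (measured_radiance
                   - (1.0 - emissivity) * spectral_radiance(wavelength_um, background_k)) / emissivity
    # Closed-form inversion of Planck's law at a single wavelength
    return C2 / (wavelength_um * math.log(1.0 + C1 / (wavelength_um**5 * target_term)))
```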
Modern pyrometers and thermal imagers apply emissivity and background corrections internally. The calculator below can be used to post-process data with different settings.
Enter the instrument settings and the corrected values for emissivity and background temperature, along with the measurement wavelength, to recalculate the target temperature.
| | Instrument Settings | Corrected Values |
| --- | --- | --- |
| Wavelength (µm) | | |
| Emissivity | | |
| Background Temperature (°C) | | |
| Target Temperature(s) (°C) | | |
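As a hedged illustration of the post-processing such a calculator performs (reusing the functions sketched above; the name `reprocess_reading` and the example numbers are hypothetical), the idea is to reconstruct the radiance implied by the instrument's indicated temperature and settings, then re-solve with the corrected emissivity and background temperature:

```python
def reprocess_reading(indicated_c: float, wavelength_um: float,
                      set_emissivity: float, set_background_c: float,
                      corr_emissivity: float, corr_background_c: float) -> float:
    """Re-correct one reading: rebuild the radiance the instrument must have
    received from its indicated temperature and settings, then re-solve with
    the corrected emissivity and background temperature. Temperatures in degC."""
    indicated_k = indicated_c + 273.15
    set_background_k = set_background_c + 273.15
    # Radiance implied by the instrument's own settings and displayed reading
    measured = (set_emissivity * spectral_radiance(wavelength_um, indicated_k)
                + (1.0 - set_emissivity) * spectral_radiance(wavelength_um, set_background_k))
    corrected_k = target_temperature(measured, wavelength_um,
                                     corr_emissivity, corr_background_c + 273.15)
    return corrected_k - 273.15

# Illustrative example: a 250 degC reading at 3.9 um taken with emissivity 0.95 and
# background 25 degC, re-processed with corrected emissivity 0.80 and background 30 degC.
print(round(reprocess_reading(250.0, 3.9, 0.95, 25.0, 0.80, 30.0), 1))
```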