What Am I “Seeing” With My Camera?

IR Talk
The Snell Group

Many people have a mistaken idea of what infrared cameras detect and how they process that information into a usable format for us. How do we as humans “see” the world around us? Our eyes detect REFLECTED VISIBLE electromagnetic radiation, from which our brains interpret the shapes and colors we are observing. The electromagnetic radiation we detect VISUALLY falls in a band of wavelengths from 0.4 to 0.75 microns.

Infrared cameras, or imagers, on the other hand, also detect electromagnetic radiation, but at longer wavelengths: midwave systems operate between 2.5 and 6.0 microns, while longwave systems operate between 8.0 and 15.0 microns. This radiation is emitted from a surface in proportion to the temperature of the surface and to the surface’s ability to RELEASE HEAT, or its Emissivity.
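One way to see why those bands matter is Wien’s displacement law, which says the wavelength of peak blackbody emission shifts with temperature. The sketch below is my own illustration, not from the article; the constant and function name are assumptions for the example.

```python
# Wien's displacement law: peak emission wavelength = b / T,
# where b is Wien's displacement constant.
WIEN_CONSTANT_UM_K = 2898.0  # micron-kelvins

def peak_wavelength_microns(temp_kelvin: float) -> float:
    """Wavelength (in microns) at which a blackbody at temp_kelvin emits most strongly."""
    return WIEN_CONSTANT_UM_K / temp_kelvin

# A near-ambient surface (~300 K) peaks close to 9.7 microns,
# which sits inside the 8.0-15.0 micron longwave band.
print(round(peak_wavelength_microns(300.0), 1))  # 9.7
```

This is why longwave systems are a natural fit for surfaces near room temperature, while hotter targets push the peak toward the midwave band.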

Emissivity is an efficiency factor describing how well a surface can emit energy, or heat. Surfaces with a high emissivity value (emissivity ranges from 0 to 1) emit energy well while reflecting very little. Surfaces with a low emissivity value do not emit energy well, but readily reflect the energy that comes from other surfaces around them. The radiation arriving at a surface is a combination of what is reflected, what is absorbed (and later emitted), and what is transmitted. This balance is described by the formula R + A + T = 1.
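For most solid surfaces a thermographer inspects, nothing is transmitted (T = 0), and Kirchhoff’s law equates absorptivity with emissivity, so the R + A + T = 1 balance reduces to reflectivity = 1 − emissivity. A minimal sketch of that relationship, with names of my own choosing:

```python
# For an opaque surface, transmission T = 0 and absorptivity A equals
# emissivity (Kirchhoff's law), so R + A + T = 1 becomes R = 1 - emissivity.

def reflectivity_opaque(emissivity: float) -> float:
    """Reflectivity of an opaque surface, from R + A + T = 1 with T = 0."""
    if not 0.0 <= emissivity <= 1.0:
        raise ValueError("emissivity must be between 0 and 1")
    return 1.0 - emissivity

# A high-emissivity surface (e.g. 0.95) reflects very little:
print(round(reflectivity_opaque(0.95), 2))  # 0.05
```

The same arithmetic explains why shiny, low-emissivity metals are troublesome targets: at emissivity 0.1, ninety percent of what the camera receives is reflection.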

The imager senses the quantity of radiation that strikes the surface of the camera’s detector. The detector is a grid of elements (in simple terms, pixels), from 80x80 up to 1020x780; the number of “pixels” on the detector determines the resolution of the image on the screen. The amount of energy, or radiation, that each element detects changes the electrical output of that element. Some people think the infrared camera is seeing temperatures, but it is not. The camera detects the infrared radiation emitted from a surface and then infers, or calculates, a temperature from that energy.
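The inference step can be illustrated with the Stefan-Boltzmann law. This is a simplified sketch of my own: real imagers use band-limited calibration curves for their spectral range, not this whole-spectrum formula, and the function names are assumptions.

```python
# Stefan-Boltzmann law: W = emissivity * sigma * T^4. The camera measures
# radiation (W) and works backward to a temperature; it never measures
# temperature directly.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def emitted_power(temp_kelvin: float, emissivity: float = 1.0) -> float:
    """Radiant exitance of a surface, in W/m^2."""
    return emissivity * SIGMA * temp_kelvin ** 4

def inferred_temperature(power_w_m2: float, emissivity: float = 1.0) -> float:
    """Invert the law: the temperature the processor would report."""
    return (power_w_m2 / (emissivity * SIGMA)) ** 0.25

# Round trip: a 300 K surface with emissivity 0.95 emits some power,
# and inverting with the same emissivity recovers 300 K.
w = emitted_power(300.0, emissivity=0.95)
print(round(inferred_temperature(w, emissivity=0.95), 1))  # 300.0
```

Notice that the answer depends on the emissivity the user dials in: feed the inversion the wrong emissivity and it reports the wrong temperature, which is why emissivity settings matter so much in practice.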

The processor in the imager translates those electrical signals into temperatures and into an image on the view screen, with those temperatures represented by shades of grey or by the colors of the palette the user has selected. Unfortunately for us, the imager does not know the difference between emitted energy and reflected energy. The imager sees the total energy and displays it as the surface’s emitted energy. It is up to us as thermographers to determine which pattern comes from the surface and which is reflected off it.
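Separating the two contributions is usually done with the standard emissivity-compensation balance: total radiation = emissivity × surface radiation + (1 − emissivity) × reflected background radiation. The sketch below assumes an opaque surface and uses hypothetical variable names; it is an illustration of the algebra, not any particular camera’s firmware.

```python
# Emissivity compensation for an opaque surface:
#   W_total = e * W_surface + (1 - e) * W_reflected
# Solving for W_surface recovers the emitted component the
# thermographer actually cares about.

def surface_radiation(w_total: float, w_reflected: float, emissivity: float) -> float:
    """Solve W_total = e*W_surface + (1-e)*W_reflected for W_surface (W/m^2)."""
    if not 0.0 < emissivity <= 1.0:
        raise ValueError("emissivity must be in (0, 1]")
    return (w_total - (1.0 - emissivity) * w_reflected) / emissivity

# Example: the camera reads 450 W/m^2 total, the reflected background
# contributes 400 W/m^2, and the surface emissivity is 0.9:
print(round(surface_radiation(450.0, 400.0, 0.9), 1))
```

The division by emissivity also shows why low-emissivity surfaces amplify errors: a small mistake in the reflected-background estimate gets divided by a small number.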

Remember, it is only the energy emitted from the surface that we as users are interested in, not the reflected energy. We only see the condition of a surface; we cannot see below that surface at all. Only through conduction can we detect thermal patterns from internal influences, as the energy from an internal source works its way to the surface and changes the surface temperature.
