One of the coolest things about infrared cameras is that you can point them at a scene and get a visual image of how hot or cold things are. So … what if those things are humans? Could you use an IR camera, for instance, to check people at an airplane gate for possible Covid-19 fevers?
On the plus side, it wouldn’t require any physical contact, and you get an almost instantaneous reading. Actually, you might have seen pictures of handheld IR devices called temperature guns (which work somewhat differently) used like this in China. IR sensors are also used in factories to monitor the temperature of equipment without having to stop it.
But there are some issues with using this technology to screen people for illness. To do it well, you really need to understand how infrared sensors work. So I’m going to explain all that. Besides, the physics is just super interesting. I’m a huge fan of these cameras, because they let you see the world in a different light—literally.
Everything Emits Light, Even if You Can’t See It
With science, you don’t always get what you want. But if you try sometimes, you get something even better. That’s what happened to William Herschel in 1800. While testing some light filters, Herschel used a prism to split sunlight into its component hues. Then he set up some thermometers. He knew that light falling on an object would warm it up, but he wanted to measure the effects of each color separately.
Then he noticed something strange: A thermometer at the end, beyond the red color—one that wasn’t even in the light—also warmed up. What the heck? Of course the reason was that there was light hitting that thermometer, you just couldn’t see it with human eyes. That was the discovery of what we now call infrared light.
But wait! There’s more. You can actually use the wavelength of light emitted by an object to determine its temperature. You’ve seen this when you use an electric oven. Once the element gets good and hot, say around 2,000 degrees Fahrenheit (that’s the temperature of the element, not the air in the oven that bakes your muffins), it glows a reddish-orange color:
If you see something glowing like that, you know not to touch it, right? But that’s not a foolproof system. When you’ve just turned the oven on, say after a minute or so, it might still look black—no visible light is given off—but it’s already hot enough to burn you. So, what if I take a picture with an infrared camera? This is what it looks like in the infrared spectrum:
Now you see the light. Of course, this is a false-color image. Since our eyes can’t detect infrared light, the camera basically translates, using visible colors to represent different wavelengths in the infrared range. In this palette (which you can change), yellow is hotter than orange, which is hotter than purple. (The thing you see in orange is a reflection off the top of the oven.)
How an IR Camera Determines Temperature
All objects emit electromagnetic radiation—yes, that’s what light is—over a whole range of wavelengths. If you plot the intensity of the radiation (technically the spectral power density) vs. wavelength for a given object, you get a curve like this.
If you want to play with an interactive version of this plot, check out this cool PhET simulator.
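That curve comes from Planck's radiation law, and you can reproduce it numerically. Here is a minimal Python sketch (the function names are my own) that evaluates the blackbody spectrum over a grid of wavelengths and finds the peak by brute force:

```python
import math

# Physical constants in SI units
H = 6.626e-34    # Planck constant (J·s)
C = 2.998e8      # speed of light (m/s)
KB = 1.381e-23   # Boltzmann constant (J/K)

def spectral_radiance(wavelength_m, temp_k):
    """Planck's law: blackbody spectral radiance at one wavelength."""
    x = H * C / (wavelength_m * KB * temp_k)
    if x > 700:          # exp() would overflow; radiance is effectively zero here
        return 0.0
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(x)

def peak_wavelength_um(temp_k):
    """Brute-force scan for the wavelength of maximum emission, in micrometers."""
    best_wl, best_b = 0.0, -1.0
    for nm in range(100, 40000):   # scan 0.1 μm to 40 μm in 1 nm steps
        wl = nm * 1e-9
        b = spectral_radiance(wl, temp_k)
        if b > best_b:
            best_wl, best_b = wl, b
    return best_wl * 1e6

print(peak_wavelength_um(300))   # ~9.7 μm: deep in the infrared
print(peak_wavelength_um(1200))  # ~2.4 μm: still IR, but with a visible tail
```

Run it for a room-temperature object and for something oven-element hot, and the peak slides toward shorter wavelengths as the temperature climbs, which is exactly the behavior the interactive simulator shows.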
It turns out that the highest-intensity wavelength produced—the peak in the curve above—depends on the temperature of the object. As it gets hotter, the wavelength of peak emission decreases—it moves to the left, back toward the visible spectrum.
So for something at room temperature (like 300 Kelvin), this peak wavelength is about 9.7 μm (micrometers). That puts most of the radiation in the infrared part of the spectrum. That’s why you usually can’t tell just by looking at things how warm they are.
But if you heat it up to, say, 1,200 K (like that oven element), the highest-intensity wavelength moves down to about 2.4 μm. That’s still in the infrared region, but by shifting the curve you also get more light down in the visible part of the spectrum (< 0.74 μm), so your eye can see it glowing. (Try it in the PhET simulator!)
This temperature-wavelength relationship is called Wien’s displacement law, which looks like this:

λ_peak = b / T

In this expression, λ_peak is the wavelength of the light with the maximum intensity and T is the temperature (b is just a constant). This means I can get a value for the temperature of an object just by looking at the color of light it produces.
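In code, Wien's law is a one-liner each way. Here is a sketch in Python, using the standard value of the constant b (about 2.898 × 10⁻³ meter-kelvins):

```python
B_WIEN = 2.898e-3  # Wien's displacement constant (m·K)

def peak_wavelength_um(temp_k):
    """Wavelength of maximum emission, in micrometers, for a temperature in kelvin."""
    return B_WIEN / temp_k * 1e6

def temp_from_peak_k(peak_um):
    """The inverse: what the camera does, turning a peak wavelength into a temperature."""
    return B_WIEN / (peak_um * 1e-6)

print(peak_wavelength_um(300))  # ~9.7 μm for a room-temperature object
print(temp_from_peak_k(2.4))    # ~1200 K for the glowing oven element
```

The second function is the interesting one: measure where the emission peaks, divide it into b, and out comes a temperature.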
Only most of the light is invisible, so you need an infrared camera for that. It’s basically just like a normal digital camera, but instead of having a sensor that detects visible wavelengths, this one can “see” infrared wavelengths. My IR camera can even give a temperature reading right on the image. Seriously, these things are awesome.
No Reflection on You
Oh, but there’s an issue: Wien’s law only works for radiation from a “blackbody.” What’s that? A blackbody is an object that doesn’t reflect outside light; all the light it gives off is produced by the object itself. An incandescent light bulb is a pretty good example — it glows because the filament gets super hot. (That’s why incandescents are sucky light sources. They waste a lot of energy in the infrared range that you can’t see.)
In reality, the light from most things is a mix of emission and reflection. So if we want to use that light to get the temperature of an object, we need to know the ratio. There’s an index, called emissivity, that captures this. It ranges from 0.0 for a completely reflective surface up to 1.0 for a perfect blackbody. There are tables where you can look up the emissivity of different materials.
How about an example? Suppose I take two plastic cups and fill them with ice water so that they’re both at 32 degrees F. On the outside of the cups, I put some aluminum foil—but on one of the cups I painted the foil black. Here’s what they look like sitting in my driveway:
Now, let’s take a look in the infrared region and measure their apparent temperatures.
Not only do they look different, the temperature readouts are different—which they shouldn’t be. As you can see, the camera says the black cup on the left is 48.6 F, while the silver cup is supposedly 86.1 F. That’s because the aluminum foil surface has a much lower emissivity (e = 0.04). Most of the IR light the camera sees on that cup is just reflected off the hot pavement.
But what about human skin? Fortunately, humans are very blackbodyish (I made up that word). A typical human has an emissivity between 0.95 and 0.98. So we aren’t very reflective in the infrared region. That’s good.
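Here is a rough sketch of why the foil cup fools the camera. It uses a simplified gray-body model based on the Stefan-Boltzmann law (total radiated power, rather than the full wavelength-by-wavelength spectrum a real camera handles), and the 310 K "surroundings" temperature standing in for the hot driveway is my own assumption:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant (W·m^-2·K^-4)

def power_seen(t_obj_k, t_surr_k, emissivity):
    """Radiated power reaching the camera: the object's own emission
    plus radiation from the surroundings reflected off its surface."""
    return (emissivity * SIGMA * t_obj_k**4
            + (1 - emissivity) * SIGMA * t_surr_k**4)

def apparent_temp_k(t_obj_k, t_surr_k, emissivity, camera_setting=0.95):
    """Temperature the camera reports if it assumes `camera_setting` emissivity."""
    w = power_seen(t_obj_k, t_surr_k, emissivity)
    t4 = (w - (1 - camera_setting) * SIGMA * t_surr_k**4) / (camera_setting * SIGMA)
    return t4 ** 0.25

ICE, DRIVEWAY = 273.15, 310.0  # kelvin; the driveway value is a guess
print(apparent_temp_k(ICE, DRIVEWAY, emissivity=0.95))  # ~273 K: about right
print(apparent_temp_k(ICE, DRIVEWAY, emissivity=0.04))  # ~309 K: way off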
Here’s my hand. With my high emissivity, that’s almost entirely me producing light.
Measuring Body Temperature
Now that you know how it works, let’s get back to the question. Can you use an infrared camera to see if someone has an elevated body temperature? Well, there is a problem. The camera looks at the outer surface of things. The internal temperature of a human should be around 37 C (98.6 F), but the outside skin is generally cooler than that.
Here’s a picture of me (in case you can’t tell). I played around with measurements at different spots on my face, and the highest temperature I could find was 95.3 F, in the inner corner of my eye. (As it turns out, an IR device manufacturer recommends focusing on the tear duct.)
For comparison, I stuck a thermometer in my mouth and measured my internal temperature at 97 F. I guess the question, then, is whether there is a consistent relationship between internal and external temperatures, so that you could still tell when someone has a fever.
Another consideration: I had to get the camera fairly close. In this case it was about 10 centimeters from my face. That would clearly violate the 2-meter social distancing rule. You could probably fix that with a higher-resolution camera. I’m just using one that connects to my phone (which is pretty cool if you think about it).
Just for fun I also took an IR picture of an open mouth. In this case it’s not my mouth, it’s a volunteer (one of my sons).
This seems to give a better measurement (97.2 F), but I doubt you could convince people to walk up to a camera and open up their mouths. We have a hard enough time getting people to just stay home.
Remove Your Glasses, Please
One other little issue: If you want to measure the temperature of a tear duct then anyone wearing glasses would have to remove them. Why? Because, as it turns out, while glass is transparent to visible light, it blocks infrared light. Check out this picture of my dog looking out the window.
You can see his reflection in the glass. For infrared light, the window acts just like a mirror. Now look at a human with glasses on. I’ll be the human in this case.
Those aren’t sunglasses, they’re just my normal glasses. They look dark because the infrared light from my eyes is reflected back onto me—it doesn’t get through. And on the outside, the glasses are reflecting infrared light from the room, which is colder than my body.
So, Well, Could It Work?
We saw that human bodies have a high emissivity, so that’s encouraging, and a good infrared camera can distinguish small differences in temperature. The one real issue is that it only measures the surface temperature of the skin. However, this might be OK if you are just comparing people and looking for outliers that are warmer than others.
So, yeah, I think you could use an IR camera to screen for individuals with elevated body temperatures. Of course that doesn’t necessarily mean they have Covid-19. Maybe they have an ordinary flu. Maybe they’re hot from running to catch a flight. Also, people can be infected but asymptomatic. There are a lot of ways you can get false positives and false negatives.
Does that mean it’s pointless? No, it’s a screen, not a diagnostic test. It’s not perfect, but it could be a quick, inexpensive, noninvasive way of surveying large groups of people and, at the least, flagging those with a higher probability of being infected for follow-up questioning.
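A screening setup like that could be as simple as flagging readings that sit well above the group's baseline. Here is a toy sketch in Python; the two-degree margin is purely illustrative, not a clinically validated threshold:

```python
from statistics import median

def flag_warm_readings(readings_f, margin_f=2.0):
    """Return readings more than `margin_f` degrees F above the group median."""
    baseline = median(readings_f)
    return [t for t in readings_f if t - baseline > margin_f]

# Hypothetical tear-duct readings in Fahrenheit; one person runs warm.
print(flag_warm_readings([94.8, 95.3, 95.1, 98.9, 95.0, 94.6]))  # [98.9]
```

Comparing against the crowd's median rather than a fixed cutoff sidesteps the skin-versus-core offset somewhat, since everyone in line is being measured the same way under the same conditions.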
Sure, an oral thermometer is more accurate. But can you imagine stopping each person outside a grocery store to stick a thermometer in their mouth and waiting for a reading? People just wouldn’t put up with that.