Sensitivity of Image Sensors to IR

It is probably common knowledge that the camera in your phone can see the LED on your TV remote blink on and off, whereas your eye cannot. This happens because the LED on the remote emits light in the infrared (IR) range, to which the human eye is not sensitive. But most image sensors are sensitive to IR.

This can be a problem when you take pictures. You expect the image to look exactly like what your eye sees, but your camera can see more, which may lead to unwanted effects in the image. So most cameras have a filter that reduces the effect of IR on the image. Most of the time, you can see the IR LED on your remote glow only when you point it directly at the camera. This is because the filter attenuates IR so strongly that reflected IR is very hard to see.

It may be pretty hard to guess where I am going with this. But before I get to the issue, there is another topic I want to shed some light on.

Most image sensors on the market use the Bayer pattern, an arrangement of red, green and blue filters over the pixels in a particular order so that they combine to give a complete color image. So when you analyze an image, it contains three channels (three two-dimensional matrices), one for each color. If you take an image of a red object, it appears white in the red channel and black in the other channels. When I say it appears white, I just mean that it has a high value, close to 255 for 8-bit images. Similarly, a blue object appears white in the blue channel and black in the other two channels.
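If you want to see this for yourself, here is a minimal MATLAB sketch (not taken from my script; the filename is just a placeholder) that splits a photo into its three channels and displays each one as a grayscale image:

```matlab
% Read an image and separate its three colour channels.
img = imread('red_object.jpg');   % placeholder filename

R = img(:,:,1);   % red channel (a 2-D matrix)
G = img(:,:,2);   % green channel
B = img(:,:,3);   % blue channel

% A red object should look bright (near 255) in R and dark in G and B.
figure, imshow(R), title('Red channel');
figure, imshow(G), title('Green channel');
figure, imshow(B), title('Blue channel');
```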

Most people are content to verify that their camera can indeed see the IR LED on a remote glow. But I went one step further and looked at the red, green and blue channels of the image. I was surprised by what I saw.

One would expect the blue and green filters of the Bayer pattern to eliminate the IR light. But what I saw was that the IR spot was consistently bright in all three channels. Initially I thought I had made a mistake, so I took the image again and checked the code, but I got the same result.
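One quick sanity check, sketched below (this is not my original code; the crop coordinates are made up and need to be adjusted to wherever the LED sits in your photo), is to crop a small window around the glowing LED and compare its mean brightness in each channel:

```matlab
% Compare the brightness of the LED spot in each colour channel.
img = imread('remote.jpg');        % placeholder filename
roi = img(200:240, 300:340, :);    % assumed crop around the LED

for c = 1:3
    ch = double(roi(:,:,c));
    fprintf('Channel %d mean: %.1f\n', c, mean(ch(:)));
end
```

If the IR really does pass all three colour filters, the three means come out comparably high.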

I haven’t been able to find a satisfactory explanation for this. In the Bayer pattern, light is filtered according to its wavelength, and by that line of thought the IR radiation should have been filtered out as well. What am I missing here? I would be glad to hear what you think.

[Images: original photo, and its blue, green and red channels]


Relight

[Image: 3_color]

The image looks interesting, right? So how did I do it? Pretty easy! Just use three light sources: red, green and blue. Wrong! This is done using a slightly more sophisticated procedure. I took three images under different lighting conditions. Then I wrote a MATLAB script to extract the red channel from the first image, the green channel from the second and the blue channel from the third. Combine these three and you get wonderful effects.
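The core of the procedure is only a few lines. Here is a rough sketch of it (the filenames are placeholders, and the script linked at the end adds dialog boxes on top of this); the three photos must be the same size, i.e. taken from a fixed camera position:

```matlab
% Combine channels from three differently lit photographs.
img1 = imread('light_pos1.jpg');   % placeholder filenames
img2 = imread('light_pos2.jpg');
img3 = imread('light_pos3.jpg');

% Red from the first shot, green from the second, blue from the third.
combined = cat(3, img1(:,:,1), img2(:,:,2), img3(:,:,3));

imshow(combined);
imwrite(combined, 'combined.jpg');
```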

[Images: green, blue and red channels, and the combined image]

The first set of photos here was taken with a table lamp that does not give a focused beam: switch on the light, take a picture, move the lamp, and repeat.

[Images: blue, green and red channels, and the combined image]

The second set of pictures was taken with the same table lamp but with a convex lens in front of it. That gives a somewhat focused beam, and the effect is clearly more impressive.

The MATLAB code for this is available on GitHub. I am comfortable using MATLAB’s GUI builder for dialog boxes; for those, I have uploaded the ‘.m’ files to GitHub, but I have yet to find a way to upload the ‘.fig’ files. Apologies for that.

https://github.com/nakul13/Computational-Photography