If you have ever written, or are planning to write, any kind of code that deals with image processing, you should complete the quiz below. If you answer one or more questions with a yes, there’s a high chance that your code is doing the wrong thing and will produce incorrect results. This might not be immediately obvious to you, because these issues can be subtle and they’re easier to spot in some problem domains than in others.
So here’s the quiz:
I don’t know what gamma correction is (duh!)
Gamma is a relic from the CRT display era; now that almost everyone uses LCDs, it’s safe to ignore it.
Gamma is only relevant for graphics professionals working in the print industry where accurate colour reproduction is of great importance—for general image processing, it’s safe to ignore it.
I’m a game developer, I don’t need to know about gamma.
The graphics libraries of my operating system handle gamma correctly.
The popular graphics library <insert name here> I’m using handles gamma correctly.
Pixels with RGB values of (128, 128, 128) emit about half as much light as pixels with RGB values of (255, 255, 255). (See the quick sketch right after the quiz.)
It is okay to just load pixel data from a popular image format (JPEG, PNG, GIF etc.) into a buffer using some random library and run image processing algorithms on the raw data directly.
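As promised, here is a minimal Python sketch for the (128, 128, 128) statement. It assumes the display follows the standard sRGB transfer function, which is a reasonable approximation for typical monitors; the exact figures are illustrative, not measured.

```python
def srgb_to_linear(value_8bit):
    """Convert an 8-bit sRGB component to linear light in the 0.0-1.0 range."""
    c = value_8bit / 255.0
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

mid_grey = srgb_to_linear(128)   # ~0.216
white    = srgb_to_linear(255)   # 1.0

print(f"RGB 128 emits about {mid_grey / white:.0%} of the light of RGB 255")
# -> roughly 22%, not the 50% you might expect
```

Run it and you’ll see that a mid-grey pixel emits only about a fifth of the light of a white pixel. Why that is, and why it matters for your image processing code, is what the rest of this article is about.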
Don’t feel bad if you have answered most of these questions with a yes! I myself would have answered yes to most of them just a week ago. Somehow, the topic of gamma flies under the radar of most computer users (including programmers writing commercial graphics software!), to the extent that most graphics libraries, image viewers, photo editors and drawing software of today still don’t get gamma right and produce incorrect results.
So keep on reading, and by the end of this article you’ll be more knowledgeable about gamma than the vast majority of programmers!
Given that vision is arguably the most important sensory input channel for human-computer interaction, it is quite surprising that gamma correction is one of the least talked about subjects among programmers; it is mentioned rather infrequently in technical literature, including computer graphics texts. The fact that most computer graphics textbooks don’t explicitly mention the importance of correct gamma handling, or discuss it in practical terms, does not help matters at all (my CG textbook from uni falls squarely into this category, I’ve just checked). Some books mention gamma correction in passing, in somewhat vague and abstract terms, but then neither provide concrete real-world examples of how to do it properly, nor explain the implications of not doing it properly, nor show image examples of incorrect gamma handling.
I came across the need for correct gamma handling while writing my ray tracer, and had to admit that my understanding of the topic was rather superficial and incomplete. So I spent a few days reading up on it online, but it turned out that many articles about gamma are not much help either: some are too abstract and confusing, some contain plenty of interesting but otherwise irrelevant detail, and others lack image examples or are simply incorrect or hard to understand. Gamma is not a terribly difficult concept to begin with, but for some mysterious reason it’s not that easy to find articles on it that are correct, complete and explain the topic in clear language.
Alright, so this is my attempt to offer a comprehensive explanation of gamma, focusing just on the most important aspects and assuming no prior knowledge of it.
The image examples in this article assume that you are viewing this web page in a modern browser on a computer monitor (CRT or LCD, it doesn’t matter). Tablets and phones are generally much less accurate than monitors in this regard, so try to avoid viewing the examples on those. You should also view the images in a dimly lit room, so no direct lights or flare on your screen, please.