A few weeks ago I put a link into the Sunday Newsletter about a 2014 BBC White Paper on High Dynamic Range video. It’s a dense but fascinating read, partly because the Introduction does a nice job comparing modern camera sensors to the capabilities of the human eye. It also has an interesting discussion about the influence of CRTs’ native response on the choice of video’s ‘2.4 gamma curve’. From the summary of Section 2:
for many years the dynamic range of television displays was limited to about 100:1 by CRT technology. A non-linear “gamma” curve was used to equalise the effect of noise at different brightness levels in analogue TV systems.
Elsewhere the White Paper discusses the non-linear “gamma” curve in CRTs:
Early television engineers took advantage of the non-linear characteristic of CRT displays to achieve [uniformity of noise], since the non-linearity of a CRT closely approximates a power law of 2.4
In other words, CRTs’ native gamma response (nearly) perfectly matched engineers’ need for uniform noise in natural-looking video images. Then digital television came along:
With the advent of digital TV the same gamma curve also allowed video to be quantised to 8 bits without significant contouring.
As the White Paper explains: at 8 bits and above, the steps between adjacent brightness levels are narrow enough that the eye (usually) can’t see banding artifacts when using the same “gamma” curve as CRTs. Our Engineering Overlords declared the technology sufficient and made 8-bit the minimum requirement for television delivery, but they never thought to explicitly define the “gamma” curve, since we hadn’t yet entered the age of digital displays.
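Here’s a quick sketch of why that works (my own illustration, not code from the White Paper): a ~2.4 power law spends more of the 256 code values on dark tones, exactly where the eye is most sensitive to banding, so the luminance step near black is far smaller than a flat 8-bit linear encoding would give.

```python
# Sketch: why 8-bit gamma-encoded video avoids the contouring that
# 8-bit *linear* video would show in the shadows.
GAMMA = 2.4  # approximate CRT power law cited in the White Paper

def linear_from_code(code: int, levels: int = 256) -> float:
    """Decode an 8-bit gamma-encoded code value to linear light (0..1)."""
    return (code / (levels - 1)) ** GAMMA

# Luminance step between two adjacent *dark* code values under gamma encoding:
dark_step_gamma = linear_from_code(26) - linear_from_code(25)

# A linear encoding spends the same step size everywhere, shadows included:
dark_step_linear = 1 / 255

print(f"gamma-encoded step near black:  {dark_step_gamma:.6f}")
print(f"linear-encoded step everywhere: {dark_step_linear:.6f}")
```

The gamma-encoded step near black comes out roughly ten times finer than the linear one, which is why 8 bits is enough once the curve is applied.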
What happened on the computer side of video displays?
Why didn’t computer displays match video displays, especially since the early computer displays were CRTs?!? To find out, I put on my Google Gloves, and up popped this blog post from 2006: The Gamma Question: 1.8 or 2.2?
the standard CRT monitor built into the Mac wasn’t anything special either, still having a native gamma somewhere near the 2.5 mark. . . Apple specified how their QuickDraw graphics libraries recorded pixel values to pull the native gamma of the monitor down to 1.8. This made it so that a user adjusting an image on the Mac monitor created pixel values recorded by QuickDraw that printed as a reasonable match to the monitor image. This worked so successfully in fact that the 1.8 gamma became regarded as the gamma of the Mac monitor itself
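The arithmetic behind that trick is simple, and worth a back-of-the-envelope sketch (the gamma values are the blog post’s approximations; the lookup-table exponent is my inference, not documented Apple code):

```python
# Sketch of the gamma correction described above: a system LUT pre-distorts
# pixel values so that a CRT with native gamma ~2.5 behaves, end to end,
# like a display with gamma 1.8.
NATIVE_GAMMA = 2.5   # the Mac CRT's native response (per the blog post)
TARGET_GAMMA = 1.8   # the effective "Mac gamma" Apple wanted

# To make the whole chain behave like v**1.8, the LUT must raise pixel
# values to the ratio of the two exponents:
lut_exponent = TARGET_GAMMA / NATIVE_GAMMA   # = 0.72

def display_luminance(v: float) -> float:
    """Pixel value -> emitted light: LUT correction, then the CRT's response."""
    corrected = v ** lut_exponent      # what the LUT feeds the CRT
    return corrected ** NATIVE_GAMMA   # what the CRT actually emits

# The chain collapses to v**1.8, the famous "Mac gamma":
v = 0.5
assert abs(display_luminance(v) - v ** TARGET_GAMMA) < 1e-12
```

So “gamma 1.8” was never a property of the tube itself; it was the product of the tube’s response and a deliberate software correction in front of it.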
And what were Apple computers matching their CRT display output to?
Freaking Black & White LASER PRINTERS.
Apple. Broke. The CRT. On purpose. To match QuickDraw to the printed page.
And thus began the Gamma Wars
Yes, I heard this story many years ago and still, forgive me if I can’t stop laughing. Apple has had an over-sized impact on our industry for MUCH longer, and in more ways, than most of us realized.
Friggin’ QuickDraw. Crazy how this stuff happens, right?