
GEEK OUT: The Non-Technical Technical Guide to Sony OLED Monitors

OLED monitors are about to change the way we view images, both at work and at home. Prepare to look better than you ever have before.

By Art Adams | March 06, 2012

The first time I laid eyes on a professional Sony OLED monitor I knew my professional life had changed. In a few years I'm sure we'll take this technology for granted, but right now it looks AMAZING compared to any other monitoring system I use on a regular basis. For a slightly-technical-but-mostly-educational look at why, read on...

Until recently I knew nothing about LCD and OLED monitor technology, and being somewhat curious about such things I sat down with Sony product manager Gary Mandle to see if I could understand this seismic shift in digital monitoring.


This is how an OLED monitor works. Any questions?


First, though, a little history. If you already know the fundamentals of how CRT and LCD displays function you can skip to page 2. If not, it's a good idea to read this page first so you can appreciate how different OLED displays are from anything that has come before.

The first successful broadcast television receivers were monochrome. They contained a vacuum-sealed glass tube through which an electron beam was fired at rows of phosphor dots placed on the inside front surface of the tube. Electron beams hitting the phosphors caused them to glow, and the brightness of the glow depended on the strength of the beam. Powerful magnets guided the beam on its path from the top left of the tube surface to the lower right, and this path was retraced 60 times per second. (For more detailed information on this technology check out the Wikipedia page on CRT tubes.)

Early phosphors didn't glow for very long. Scanning the beam across the surface of the tube progressively, where every phosphor dot was hit in sequence, proved ineffective: the phosphors at the top of the screen dimmed noticeably before the beam had finished painting the full surface of the tube. This caused a massive roll bar where dimming phosphors chased newly lit ones. The solution, one that has haunted us to this day, was to illuminate the phosphor dots in alternating lines: the first pass of the beam illuminated rows 1, 3, 5, etc. and the second pass lit rows 2, 4, 6, etc. Viewers didn't notice that the interleaved odd lines dimmed as the even lines were lit, and the fact that the screen displayed 60 half-resolution images per second instead of 30 full-resolution images eliminated excess flicker. (50Hz PAL TV sets appear to flicker to those of us who grew up in NTSC countries. Movie theaters reduce flicker by projecting each image twice, as our brains see a lot of flicker at 24fps but not so much at 48fps.)
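
If it helps to see that in concrete terms, here's a tiny sketch in Python (purely illustrative; the 480-line frame is just an example, not anything Sony-specific) of how one full frame splits into two half-resolution fields:

def split_into_fields(frame_rows):
    # Broadcast convention numbers lines from 1, so the "odd" field holds
    # lines 1, 3, 5, ... and the "even" field holds lines 2, 4, 6, ...
    odd_field = frame_rows[0::2]
    even_field = frame_rows[1::2]
    return odd_field, even_field

# A 480-line frame becomes two 240-line fields, displayed one after the other:
# 60 half-resolution fields per second rather than 30 full-resolution frames.
frame = ["line %d" % n for n in range(1, 481)]
odd, even = split_into_fields(frame)
print(len(odd), len(even))  # 240 240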

The same technology carried over into color television sets. Monochrome phosphor dots were replaced by a cluster of red, green and blue phosphor dots that fooled the brain into seeing a single color that was, in fact, a mixture of all three. For example, if the red dot was dim but the blue and green dots were lit, the brain registered the combination as cyan. If the viewer got close enough to the screen they could see the individual phosphor dots, but at any distance beyond about 2' they blurred into a single color. (The earliest tubes used a triangular cluster of RGB phosphor dots. Sony Trinitron sets employed three closely-set vertical phosphor stripes.)

This system had some downsides. Early red phosphors weren't bright enough to match the intensity of the blue and green phosphors so a small amount of green phosphor was mixed into the red to give it a bit more kick. Also, the response of the red phosphor was slower than the others so the addition of some green phosphor made it respond a bit faster. This phosphor mix resulted in reds having a slight orange cast. Most viewers didn't notice.

Interlaced scanning reduced resolution considerably. Interlaced cameras capture images in two passes--odd rows first, then even rows--with a short but definite period of time between the two passes. Any object that moved significantly through the frame between those passes became blurred as the object appeared to be in one screen position for the odd field and another position for the even field. Even worse, noise in the system reduced resolution further as the odd field captured one pattern of noise and the even field captured another. When two fields with different noise patterns are combined the noise is amplified, reducing the resolution of the image.
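
Here's a quick way to visualize the motion problem, again just a sketch with made-up values (the bar, its speed and the field timing are all invented for illustration): capture a moving object twice, one field apart, then weave the fields back together and the object shows up in two places on alternating lines.

FIELD_PERIOD = 1.0 / 60  # roughly the time between the two field captures

def capture_field(scene_at, t, parity):
    # Sample every other row of the scene as it appears at time t.
    # parity 0 = lines 1, 3, 5, ...; parity 1 = lines 2, 4, 6, ...
    rows = scene_at(t)
    return {r: rows[r] for r in range(parity, len(rows), 2)}

def weave(odd_field, even_field, height):
    # Interleave the two fields into a single frame, as the display does.
    return [odd_field.get(r, even_field.get(r)) for r in range(height)]

def scene_at(t):
    # A bright bar ("##") that moves two positions to the right per field.
    offset = int(round(t / FIELD_PERIOD)) * 2
    return ["." * offset + "##" + "." * (8 - offset)] * 6

odd = capture_field(scene_at, 0.0, 0)
even = capture_field(scene_at, FIELD_PERIOD, 1)
for row in weave(odd, even, 6):
    print(row)  # the bar sits in two different positions on alternating lines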

Last but not least, black was created simply by turning the electron beams off. This meant that the blackest part of the image was no darker than the blank screen, which was not truly black. An unpowered TV tube surface appeared as dark gray, and while illuminating the phosphors around a dark area made that area appear darker by comparison, there was no way to create a true, rich, deep black, especially when ambient room light struck the front surface of the monitor and made it brighter. Instead of dark, rich blacks we had to settle for dark gray.

This most impacted color in shadows. As the picture tube's native shade was dark gray, any color that approached "black" was contaminated by that gray. For example:


Bright colors overpower the gray base of the tube surface, but darker colors are impacted severely and are desaturated. We'll see how OLED technology solves this problem shortly.


LCD displays tossed out the electron-beam-and-phosphor model and relied instead on some obscure properties of liquid crystal, which made them much more energy efficient than CRTs, but not as pretty. The secret of LCD displays is that the liquid crystal is not responsible for creating the colors we see on our LCD television sets; it is simply a light valve that allows light to pass through colored filters on the screen.

From back to front, here is a simplified description of how an LCD monitor is assembled:

(1) A light source (either a fluorescent tube, white phosphor LED or a set of red, green and blue LEDs) is placed at the rear of the display. This generates all the light that will pass through the display.

(2) In front of this backlight is a polarizing filter that polarizes the light.

(3) In front of that are two layers, one of circuitry (TFT) and another of liquid crystal. The circuitry layer divides the liquid crystal into a grid.

(4) In front of each grid "cell" is a filter that is red, green or blue. These replace the phosphor dots in a CRT monitor.

(5) In front of the RGB filter layer is another polarizing filter, but this filter is installed at a right angle to the first filter. Light can only pass through this polarizer if it is twisted into alignment by liquid crystal.



Liquid crystal acts as a "light valve" by twisting light between two polarizing filters set at right angles to each other.


Polarizing filters only pass light that is polarized in a specific direction. If two polarizers are stacked and one is rotated 90 degrees from the other, they'll cancel each other out and pass very little light. That's the key principle behind LCD displays: left on their own, the two polarizers prevent light from reaching the surface of the display. Unless something else happens, the display remains black, or as black as an LCD screen can get, which, as we will see on page 2, isn't very.

Applying an electrical charge to a cell of the liquid crystal grid causes that bit of crystal to "twist" the light that passes through it, polarizing it into alignment with the front polarizing filter. If a small charge is applied, the light is twisted only part of the way toward alignment with the front polarizing filter, so only a little light gets through. A full charge twists it completely, allowing all of the light to pass through. Liquid crystal is simply a light valve.
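
If you want a rough feel for the numbers, here's a simplified model in Python. It follows the description above (drive the cell harder, twist the light further into alignment, pass more of the backlight) and uses the standard cosine-squared falloff for misaligned polarizers; a real panel's response is more complicated, so treat this as a sketch only.

import math

def transmitted_fraction(drive_level):
    # drive_level 0.0 = no charge: the light stays 90 degrees off the front
    # polarizer and almost nothing gets through.
    # drive_level 1.0 = full charge: the light is twisted into alignment
    # with the front polarizer and passes through.
    misalignment_deg = 90.0 * (1.0 - drive_level)
    return math.cos(math.radians(misalignment_deg)) ** 2

for level in (0.0, 0.25, 0.5, 0.75, 1.0):
    print("drive %.2f -> %.2f of the backlight passes" % (level, transmitted_fraction(level)))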


The construction of a typical LCD monitor. "TFT substrate" refers to the grid array that is employed to cause elements of the liquid crystal layer to "twist" light.


If that grid element is behind a red filter then the light that passes through the front of the display appears as a red point. Repeat for every cluster of red, green and blue pixels on the screen.

While LCDs have proven very energy efficient, there are few that are pleasing to the eye and fewer yet that can be used for critical image evaluation. The first problem is that the two polarizers never quite cancel each other out completely, so some light always passes through to the surface of the screen. This creates the same issue that we saw with CRTs, where no part of the screen is truly black, which reduces overall contrast as well as color saturation in shadow areas. LCD screens are slightly worse in this regard than CRTs.
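
To put a number on that, here's a back-of-the-envelope sketch; the leakage figure is invented for illustration, not a measured spec of any particular monitor. The point is that the leaked light sets a floor under black, and that floor caps the contrast ratio no matter how bright the panel is.

def contrast_ratio(peak_white_nits, black_leakage_fraction):
    # The blackest the panel can show is the leaked portion of the backlight,
    # so contrast is peak white divided by that floor.
    black_level = peak_white_nits * black_leakage_fraction
    return peak_white_nits / black_level

# Illustrative numbers only: a panel that leaks 0.1% of its backlight
# can never exceed roughly 1000:1 contrast, however bright it gets.
print("%.0f:1" % contrast_ratio(100.0, 0.001))  # 1000:1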

Each kind of backlight has its own issues. Fluorescent tubes and white phosphor LEDs generally have a broad enough spectrum for reasonable color accuracy but not enough color energy in any one portion of the spectrum to reproduce strongly-saturated colors. RGB LED backlights tend to be very "peaky," as strongly-saturated dye LEDs emit very saturated light in very narrow bands of wavelengths, as opposed to phosphor LEDs, which tend to emit a broader spectrum of less pure light.

The interplay between these light sources, the dye filters over the pixels, and the two polarizers can cause some very interesting color issues. There's a popular line of "broadcast quality" LCD monitors that are, for example, not always the same color and rarely the proper color. I had the treat of viewing a white phosphor LCD panel next to a fluorescent-lit LCD panel on a recent shoot, and while the white phosphor LCD image looked pretty good the other monitor looked yellow-green by comparison. I don't know if this is the interaction of the fluorescent backlight with the dye filters or the two polarizers interacting with each other, but either way it means that I don't know what's being recorded. That can be a drag on long-term employment prospects. (I've gotten into the habit of looking at all the monitors on the set, from the on-camera focus monitor to the director's monitor, client's monitor and playback operator's monitor, and mentally averaging the results in order to understand what's REALLY being recorded.)

LCD displays don't "reset" the way CRTs do. A CRT phosphor will dim quickly if not re-illuminated, but LCD cells don't reset automatically: they stay the same until they are overwritten. Once the liquid crystal behind a blue filter is partially depolarized, perhaps to display the color purple, the amount of light passing through that blue filter won't change until the next frame, when different instructions may be issued. This can produce a "smear" effect. (The Sony BVM-L231 monitor had a mode called "black frame insertion," where a black frame was inserted between video frames, shortening the amount of time that each video frame remained on screen. This increased the appearance of sharpness but also increased black levels.)
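
As a toy example of the black frame insertion idea (illustrative only; the real monitor's timing and processing are more involved than this):

def with_black_frame_insertion(frames, black="<black>"):
    # Show a black frame after every video frame so each image stays on
    # screen for roughly half as long, reducing the hold-type smear.
    sequence = []
    for frame in frames:
        sequence.append(frame)
        sequence.append(black)
    return sequence

print(with_black_frame_insertion(["frame 1", "frame 2", "frame 3"]))
# ['frame 1', '<black>', 'frame 2', '<black>', 'frame 3', '<black>']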

Last but not least, highlights have a habit of glowing. There are many additional layers in between the filters and your eye (I've not touched on all of them here) that scatter light as it passes through the display. Bright areas can bleed into dark areas, resulting in a halo that reduces the apparent sharpness of bright objects.

That's enough history. Now for the good stuff...

Page 1 of 2
