
Closed Loop Colour for Virtual Production

Reverse side of an LED video wall showing interconnections and power cables for each foot-square panel.
At some point in the chain, most of these video wall displays are operating in something like DCI P3.

Almost everyone reading this will have had the experience of pointing a camera at a real-world scene and then observing the resulting image on a monitor, and will know that it’s never really been expected that the two should match with any accuracy. Given the eyelid-twitching preoccupation we often have with calibration and accuracy, that might come off as carelessness, though it’s also an inevitable concomitant of displays that don’t have infinite dynamic range or colour gamut. In digital cinematography in general, that’s created a none-too-wonderful situation in which every camera has a couple of different ways of representing colour and brightness. It’d be great if that didn’t happen in virtual production, too.

It’s not entirely a new idea. Manufacturers don’t like the idea that all cameras should look the same, but normalisation of cameras has actually been implemented in software like Cinematch, based on precise analysis of a lot of cameras and their behaviour. Matching cameras to each other and to the real world is clearly possible, but what we might call a universal opto-electronic transfer function has never been normal.
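To make the jargon concrete, an opto-electronic transfer function is just the curve a camera (or a standard) uses to turn linear scene light into signal values. Here’s a minimal Python sketch of the Rec. 709 OETF as published in ITU-R BT.709; every manufacturer’s log format is, in effect, a different answer to this same small function.

```python
def rec709_oetf(scene_linear: float) -> float:
    """Rec. 709 opto-electronic transfer function (ITU-R BT.709).

    Maps normalised scene-linear light (0..1) to a non-linear
    signal value (0..1): linear below 0.018, a 0.45 power above.
    """
    if scene_linear < 0.018:
        return 4.500 * scene_linear
    return 1.099 * (scene_linear ** 0.45) - 0.099
```

Swap the constants and the power curve for a log curve and you have, roughly, every proprietary encoding on the market; the maths is trivial, and the lack of agreement is the problem.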

Perhaps it should be normal

One specific world in which that has begun to matter is virtual production, where we might shoot material which will be displayed as part of a scene and rephotographed on other cameras. It’s a less flexible but potentially cheaper approach than realtime 3D rendering. Camera-fresh material will often need quite significant compositing work before it’s ready for the display, which has a cost implication, but either way, colorimetry and brightness must be right.

At the time of writing, “right” is a moving target that’s mostly approached with nothing more sophisticated than tweaking until it looks right. That can usually only be done on the day, and it’s painfully slow for people shooting film. That’s generally true both for realtime computer-rendered backgrounds and for scenes which have been photographed in reality, although there’s a lot more flexibility in the computer world. The surprise is that many video walls used for virtual production are operated roughly as Rec. 709 or DCI P3 displays, even though the emitters would probably allow for a wider colour gamut. Brightness can differ from the written standards, but the colour is often operated quite conservatively.
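When content does have to move between those conservative working spaces, the conversion itself is well-trodden ground. As a sketch (the pixel values here are made up, and it assumes the open-source colour-science Python library, installed as colour-science), re-expressing scene-linear Rec. 709 values in DCI-P3 primaries is a one-liner, with the library handling the matrix maths and white point adaptation:

```python
import numpy as np
import colour  # pip install colour-science

# A hypothetical scene-linear Rec. 709 pixel, normalised 0..1.
rgb_709 = np.array([0.25, 0.50, 0.75])

# Re-express it in DCI-P3 primaries for a P3-operated wall.
rgb_p3 = colour.RGB_to_RGB(
    rgb_709,
    colour.RGB_COLOURSPACES["ITU-R BT.709"],
    colour.RGB_COLOURSPACES["DCI-P3"],
)
print(rgb_p3)
```

The easy part, in other words, is the arithmetic; the hard part is knowing what every device in the chain is actually doing to the numbers.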

Either way, there remains a need for an experienced hand to adjust until things fall into place. If this seems to you like a slightly unsatisfactory state of affairs, you’re not alone, and virtual production people are increasingly expressing the desire to regularise things, creating a closed loop for colour control in virtual production. This is difficult given the huge quantity and variety of equipment involved: virtual production is a world of very many moving parts, from cameras, rendering and playback servers, through graphics cards, video wall processors and the tiles themselves, to another camera and its associated monitoring.

Whether this is ever going to get any simpler remains to be seen.

Virtual production colour calibration

We could speculate about solutions, but we would be speculating. It is, for instance, quite valid to point a monitor calibration probe at an LED video wall in exactly the same way we’d point it at a projection screen, making it theoretically possible to close that loop. Doing that might require forward-combining quite a lot of lookup tables, in ways that would likely demand some external tools. The problem is that while this might work, any such solution would very likely only remain valid for one very specific configuration of equipment, so it’s not that clear how much it’d help.
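“Forward-combining lookup tables” sounds exotic but is mechanically simple. The sketch below is a toy (the LUTs are hypothetical, and real pipelines would use 3D LUTs and proper LUT tooling rather than one-dimensional curves): it composes a display curve with a probe-measured correction so the whole chain can be applied as a single table.

```python
import numpy as np

def compose_1d_luts(lut_a: np.ndarray, lut_b: np.ndarray) -> np.ndarray:
    """Compose two 1D LUTs: result(x) = lut_b(lut_a(x)).

    Both LUTs are arrays of output values sampled at evenly spaced
    inputs on 0..1; the composition is a LUT of lut_a's size.
    """
    inputs_b = np.linspace(0.0, 1.0, len(lut_b))
    return np.interp(lut_a, inputs_b, lut_b)

# Hypothetical example: a gamma-2.2 display curve followed by a
# probe-measured correction (identity here, standing in for real data).
display_lut = np.linspace(0.0, 1.0, 1024) ** (1 / 2.2)
correction_lut = np.linspace(0.0, 1.0, 1024)
combined = compose_1d_luts(display_lut, correction_lut)
```

The fragility is visible even in the toy version: change any one stage of the chain and the combined table is immediately stale.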

So the practicality of this is uncertain. To some extent, this is a standardisation problem, where “flexibility” and “standardisation” are often concepts in competition with one another. As so often, the vicious circle is that high-end, rarefied and expensive technologies such as virtual production stages lack standardisation because they need the flexibility to solve problems caused by the lack of standardisation. In theory, that’s a very easy problem to solve by simply dictating to everyone how they’re required to do things. In reality, that sort of enthusiastic cat-herding is likely to run into significant resistance from people who want to indulge their creative desires, invent new things, and show off their capabilities.

Two technicians at NAB 2022 making it look… well, not all that easy, actually.

It’s a complex problem, but it would be nice to think that we could avoid the parlous situation that’s been allowed to emerge in monitoring. Every camera and display manufacturer has at least a couple of different ideas about how brightness encoding should work, as if the phrase “base two logarithm” was somehow ambiguous. Virtual production colour control is a more complicated problem, but it’s also a smaller field, and it’d be nice to imagine a future that’s a little less reliant on leaning on the controls until it looks right. Or perhaps that’s inevitable.
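To underline how unambiguous the underlying maths is: here are two hypothetical log encodings (the parameters are invented for illustration, and are not any manufacturer’s published values) built around exactly the same base-two logarithm, disagreeing about where mid grey should sit.

```python
import math

def log_encode(scene_linear: float, a: float, b: float, c: float) -> float:
    """A generic camera-style log curve: c + b * log2(scene_linear + a).

    The shape is common to many camera log formats; a, b and c are
    where each manufacturer quietly diverges.
    """
    return c + b * math.log2(scene_linear + a)

MID_GREY = 0.18  # scene-linear mid grey

# Two invented parameter sets standing in for two vendors.
print(log_encode(MID_GREY, a=0.01, b=0.08, c=0.50))  # ~0.31
print(log_encode(MID_GREY, a=0.05, b=0.10, c=0.45))  # ~0.24
```

Same scene, same logarithm, two different code values: that, in miniature, is the whole monitoring mess the paragraph above describes.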
