There’s a piece of digital cinematography equipment that has infinite dynamic range, imposes zero noise, supports stereo 3D, high frame rate, HDR and any resolution you like, cares nothing for colour gamuts, is completely codec agnostic, works with any camera and will continue to do so, almost certainly, for the rest of all our lives. It even has zero power consumption.
So it’s difficult to work out why people so often scorn filters. They have much the same potential as lenses to create real-world, organic effects, yet anyone mentioning them risks being told, with a scoff, that anything a filter can do can just as easily be simulated in post production. What should immediately be clear is that nothing we do in post has all of those desirable characteristics.
Polarisers are probably the most popular example of things that can’t be done in post. We could imagine various ways to build an imaging sensor that would record some of the polarisation characteristics of light, but not without significant compromises elsewhere. So, if we want to remove a reflection in post, the only tool we have is the likes of After Effects, and removing a moving reflection from a window through which we’re witnessing a plot point is exactly the sort of job that brings most VFX people out in a cold sweat.
The march of technology
If we wanted to argue that technology is making at least some filters less necessary, we could reasonably say that the ever-increasing dynamic range of cameras has made ND grads less essential than they once were. The grads still have infinite dynamic range, though, which cameras don’t, so there remains some distance between “less necessary” and “unnecessary”.
Beyond what we might call technical filters, accurate simulation becomes even trickier. Consider the halation – glow – caused by a diffusion filter. What we typically see is light diffused from a light source into adjacent areas of the frame. A flashlight might be bright enough to saturate the sensors of most cameras. Once that happens, the light source may get brighter, but assuming a notionally perfect lens, the recorded image doesn’t change. A diffusion filter might render a larger, brighter glow as the light source adds more photons to the equation. A post production effect can’t.
Similarly, contrast-reduction filters diffuse a measured amount of all the light in the frame onto every other point in it. A post production simulation of a contrast-reduction filter will fail to duplicate reality because the real thing can lift shadow detail above the threshold of visibility before the image is ever recorded. A post effect can make the blacks grey, but it can never reveal detail that was too dark to be recorded in the first place. Clipping works both ways.
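To put rough numbers on that argument, here’s a minimal sketch in Python with NumPy, using entirely made-up values rather than anything from a real camera: once highlights clip at the sensor’s full scale, or shadows fall below its recording threshold, different scene values collapse into the same pixel value, and no post effect can tell them apart again.

```python
import numpy as np

# Toy illustration of the clipping argument; not a real camera model.
# Scene luminance in arbitrary linear units; the "sensor" records 0.0-1.0.
scene = np.array([0.004, 0.02, 0.5, 3.0, 30.0])  # deep shadow ... very hot highlight
noise_floor = 0.01                                # below this, nothing usable is recorded

recorded = np.clip(scene, 0.0, 1.0)               # highlights clamp at full scale
recorded[scene < noise_floor] = 0.0               # shadow detail is lost outright

# The 3.0 and 30.0 highlights both record as 1.0, so a post glow built from the
# recorded image must treat them identically; a physical diffusion filter sees
# the extra light and would not.
print(recorded)   # [0.   0.02 0.5  1.   1.  ]

# Lifting the blacks in post only rescales what was recorded; the 0.004 sample
# stays empty, whereas a real contrast-reduction filter would have added light
# to it before the exposure was made.
lifted = recorded * 0.9 + 0.1
print(lifted)     # [0.1   0.118 0.55  1.    1.   ]
```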
And that’s the comparatively everyday stuff. Reach for more esoteric effects, and things become even more interesting. Star and streak filters, including the likes of Schneider’s True-Streak, can see the very core of a hot spot – a glowing filament or a reflection of the sun – in a way the camera may not. Depending on the conditions, they can produce a fine streak based only on the very brightest core of the light source, a core the camera itself can’t differentiate from its immediate surroundings.
Off-the-wall filters
Really outlandish designs, such as IBE’s dandelion seeds and loops of thread, exploit all the same behaviour – and the fact that a filter can see light from outside the frame. Fire a light sideways into a diffusion filter and it will fog the entire frame, an effect that’s been used in every guise from Arri’s VariCon to simply aiming lights at conventional filters. The results are exactly the sort of real-world optical effects people seem to like, with the advantage that they react effortlessly to motion in the scene.
That’s not to say that digital filter simulation has no value. Tiffen itself experimented with simulations of its most famous filter designs in the DFX plugin, and while it’s an idea the company has since abandoned, it’s certainly reasonable to think that VFX people – as well as editors – might enjoy a shortcut to simulating whatever the main unit was doing.
And filters, while cheaper than great lenses, are spendier than basic ones. At £500 apiece for common sizes, with a set perhaps comprising the three lightest grades, it’s possible to buy a very comprehensive collection of characterful ex-Soviet stills lenses for less money than a set of filters. The filters won’t simulate every glint and aberration those lenses give us, though they’re probably more predictable and consistent.
Regardless of anyone’s interest in filters, though, let’s just bear in mind that not everything they do can be simulated afterwards – in fact not much can, with any plausibility. After all, we wouldn’t tell someone to sell their original Speed Panchros and duplicate the results in Fusion, would we?