Well, I guess I should rephrase that to my war against the blatant misuse of the term “DI”. As overused acronyms go, this one is a chart-topper in our industry, causing great confusion among many while benefiting very few. Just this week I had to spend half an hour talking a producer off a ledge because she was overwhelmed with her project. Before you can understand my personal crusade in this matter, a little background is needed.
Film: Looks Great, PITA to Work With!
Film is still the best-looking format, to my eyes at least. Unfortunately, it is also expensive to use, difficult to work with, and environmentally unsound. In the traditional film post process, a feature was edited using “dailies” printed from the original negative (expensive). Any effects shots (even a simple dissolve) had to be sent out ahead of time and made on an optical printer (expensive). These were then sent to the edit room to join the rest of the tiny pieces of film (trims) that were a joy to keep track of.
Once picture was locked, the negative cutting step began. This was done by anal-retentive individuals in clean rooms wearing white gloves; any mistake on their part would destroy the master (difficult). The negative had to be cut into “A” and “B” rolls, with shots alternating between the two reels and black slug in between each set of shots to keep splice lines from showing up in the finished film.
These two reels were taken to the lab, where they were color timed (deciding how much red, green, and blue light to print through each shot) and then, using those settings, an interpositive roll was created. From this, multiple internegatives were made, and from those the release prints were struck. These intermediate steps were slow, tedious, expensive, and limited in the range of fixes that could be made, with the end result of wasting a lot of film and chemicals (environmentally unsound).
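For the curious, here is a quick back-of-the-envelope sketch of what those color timing numbers actually mean. Labs typically express timing as RGB “printer lights” on a 0-50 scale, with roughly 0.025 log exposure (about 1/12 of a stop) per point; treat the exact scale and the neutral point of 25 as assumptions here, since they vary from lab to lab and stock to stock.

```python
# Rough sketch: convert lab "printer light" points into exposure changes.
# The 0-50 scale, the neutral point of 25, and 0.025 logE per point are
# common conventions, assumed here rather than quoted from any one lab.

NEUTRAL_POINT = 25        # nominal mid-scale setting
LOG_E_PER_POINT = 0.025   # roughly 1/12 of a stop of printing exposure

def printer_points_to_exposure(points):
    """Return (stops, linear multiplier) of printing light relative to neutral."""
    log_e = (points - NEUTRAL_POINT) * LOG_E_PER_POINT
    stops = log_e / 0.30  # one stop is about 0.30 log exposure
    return stops, 10 ** log_e

# Hypothetical timing change: two points more red, two points less blue.
for channel, pts in (("R", 27), ("G", 25), ("B", 23)):
    stops, mult = printer_points_to_exposure(pts)
    print(f"{channel}: {pts} pts -> {stops:+.2f} stops, x{mult:.3f} light")
```

Whether that nudge reads as warmer or cooler on screen depends on the negative-to-print chain, which is exactly the kind of detail the timer and the lab had to carry in their heads.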
Can’t We Make This Easier?
This went on for around a century until Kodak created the Cineon system, which could scan film at its full resolution into digital files and record those files back out to film. Of course, this was very expensive and very slow, so at first it was only used to fix certain shots and for special effects work. Back in the early 90s I worked on a “behind the scenes” piece for Disney’s re-release of “Snow White and the Seven Dwarfs.” That was the first feature to be scanned in, digitally restored, and recorded back out to film. It was a very slow process: many minutes per frame, tons of very expensive storage, and a lot of dedicated processing power to handle a single frame at a time.
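To give a flavor of the math hiding inside those Cineon files: they store 10-bit log code values that have to be converted back to linear light before you can do much with them. The little sketch below uses the commonly published defaults (reference black 95, reference white 685, 0.002 density per code value, 0.6 negative gamma); treat those numbers as assumptions rather than gospel, since real pipelines tweak them.

```python
# Minimal sketch of a Cineon-style 10-bit log to linear conversion.
# Constants are the commonly published defaults, assumed for illustration.

REF_BLACK = 95            # code value mapped to linear 0.0
REF_WHITE = 685           # code value mapped to linear 1.0
DENSITY_PER_CODE = 0.002  # printing density per code value
NEG_GAMMA = 0.6           # assumed negative gamma

def cineon_log_to_linear(code_value):
    """Map a 10-bit Cineon log code value (0-1023) to linear light."""
    offset = 10 ** ((REF_BLACK - REF_WHITE) * DENSITY_PER_CODE / NEG_GAMMA)
    gain = 1.0 / (1.0 - offset)
    return gain * (10 ** ((code_value - REF_WHITE) * DENSITY_PER_CODE / NEG_GAMMA) - offset)

for cv in (95, 445, 685, 1023):
    print(cv, round(cineon_log_to_linear(cv), 4))
```

Note that code value 1023 comes out well above 1.0; that headroom over reference white is a big part of why film scans were (and are) so appealing to work with.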
And Bring The Price Down?
Flash forward some years: storage gets cheaper, and CPUs get monumentally faster while also dropping in price. Eventually some enterprising individuals realized they could scan an entire feature and treat the whole thing in the digital realm, where they gained previously unheard-of control over the image. “O Brother, Where Art Thou?” became the first feature to fully go that route. This changed the process from going through the intermediate steps on film to doing them digitally. And a new phrase was born… the DI, or Digital Intermediate.
Back to My Gripe!
So what’s my beef? Well, it’s twofold. First, to be a “Digital Intermediate” you have to start on film and end on film. If you start digitally (RED, Viper, F900, etc.), then you aren’t doing an intermediate step, or DI. But folks are calling it that, which is confusing the heck out of other folks who are just trying to make sense of the process. The producer I mentioned earlier was working on a post budget for a RED Camera project with a possible film release and thought she had to do an entire “DI” workflow. This is exactly why we need a new term for these scenarios. How about Digitally Captured for Film Output, or DCFO for short?
So You Think You’re a DI House?
The second part of my beef is with the plethora of folks claiming they can handle the DI workflow. (Since I don’t do DIs, this isn’t self-serving.) Because the look of film is not adjustable the way video monitors and digital files are, a true DI house needs to handle all of its digital monitoring devices in such a way that every step of the process is calibrated to what the final film print will look like. There are places in town where a technician comes in first thing in the morning and calibrates the projectors, then comes back when the clients break for lunch and calibrates them again. All of this is done using Look Up Tables, or LUTs, which keep the image on every device looking the same by compensating for the differences between those display devices.
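If you have never dealt with one, a LUT is conceptually simple: a table that remaps the signal going to each device so that its measured response lands on the agreed target. Here is a toy one-dimensional sketch in Python; the three-entry table is purely hypothetical, and a real DI suite would use measured 3D LUTs built for each individual projector or monitor.

```python
# Toy 1D LUT lookup with linear interpolation between evenly spaced entries.
# Real display calibration uses large, measured 3D LUTs; this only shows the idea.

def apply_1d_lut(value, lut):
    """Remap `value` (0.0-1.0) through an evenly spaced 1D LUT."""
    steps = len(lut) - 1
    pos = min(max(value, 0.0), 1.0) * steps
    lo = int(pos)
    hi = min(lo + 1, steps)
    frac = pos - lo
    return lut[lo] * (1.0 - frac) + lut[hi] * frac

# Hypothetical LUT that pulls midtones down to match a dimmer film print.
film_emulation_lut = [0.0, 0.45, 1.0]
for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(v, round(apply_1d_lut(v, film_emulation_lut), 3))
```

Now imagine maintaining tables like that, per device, twice a day, and you can see why the technician keeps coming back.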
Even the developing process for film is a moving target, as a half-degree change in the temperature of the chemicals can change the look of the final product. So a true DI house should have a strong relationship with a lab, if not a lab of its own. One could write a book about the complexities of getting this right, so it’s no surprise to me that very few places have actually mastered it.
What’s Next?
Once theaters switch to digital projection, none of this will matter. Kodak doesn’t make its money off of the film purchased to shoot a movie; it makes money off of the tons of release prints that are made to go all over the world. Once the majority of theaters go digital, that market will collapse. Then it will be “DCDP,” Digital Capture for Digital Projection, or “RBTV,” Really Big TeleVision. And that is my last acronym for the day.