You don’t often see articles like these anymore, where a famous filmmaker or cinematographer explains their decision to move away from film/celluloid in favor of video/digital. A few years ago this sort of thing was a big deal, but now using film is the exception rather than the rule. Sure, there are plenty of A-list directors who are still using and championing film, but the numbers don’t lie. Movie film stock sales have dropped by 96% since 2006. Fujifilm has stopped production of the majority of their Motion Picture Film products because the revenue numbers weren’t adding up, leaving Kodak as the only major company still producing film.
Authors like Art Adams have laid out why they’re going to miss film, and there’s been a passionate response to this transition. You can read arguments from directors like Spielberg and Scorsese about the grain and quality that can only be achieved with celluloid, but tools specifically designed to control digital texture are available now. As more of those tools are created and refined, the claims about not being able to replicate the “look” of film with video will get quieter and quieter.
The response to this reality is often to talk about which medium is “better”, but in light of the fact that even film’s biggest fans know where this road ends, that doesn’t seem like a worthwhile discussion. Instead, the question should be about what this means to and for content creators. Do young filmmakers make a film vs video comparison at all? Do more experienced filmmakers feel the limitations of video? What skills are being lost in this transition? What benefits besides the obvious ones are being gained? What will this mean for future generations who will have to look back through digital archives rather than film canisters? How does this change impact other areas of production, including post and distribution?
Art Adams
Stunning Good Looks
I feel the limitations of video all the time. I’ve described film as similar to making a delicate image with a watercolor stamp on rice paper using only the gentlest force, while shooting HD is akin to chiseling an image out of stone with a mallet. That used to be the case with early cameras like the F900, and especially in SD, but more modern cameras like the Alexa remind me much more of film than video. I feel that I can be subtle again and not have to backlight everything to separate objects from small-color-gamut background murk.
Having said that, video is clearly becoming better and better all the time, and I think we’ll start to see it surpass what film could do very shortly. Film was always better at highlight handling than shadow detail while HD was the opposite, but HD cameras are getting better at handling highlights every day.
The biggest benefit is that budgets for image capture are dropping, and that offers a lot of people freedom to jump into the industry. You don’t have to spend $1000 per 1000′ roll of 35mm film anymore, with the additional back-end costs of processing and printing or digitizing.
The negative side of that is the loss of the apprenticeship system, where you’d have to work for years with experienced crew before anyone would allow you to look through a camera. That meant by the time you did start looking through cameras you knew what you were doing and you probably wouldn’t screw up too badly. Now that we can see an image on a monitor, everyone thinks they are qualified to both make it and comment critically on it, so the DP is respected less than when they were the only person who knew what the image was really going to look like. Cheap cameras mean that no one is working their way up and learning from professionals anymore: they are graduating from film school (where you learn a lot of theory but generally learn zero about the realities of working on a film set), buying a camera and going to work. The problem is they are going to work reinventing all the stuff us old-timers learned long ago when we worked our way up the camera crew ladder and picked up a hundred years of cinematic knowledge along the way.
Also, digital is insanely more complex than film. If you knew your film stocks and lenses, then you knew how the image was going to turn out. In digital, the image on the monitor is still not necessarily what you’re recording: you’re not seeing the effect of compression and reduced bit depth on the image, for example. I just did a shoot where the VTR operator had his system set up wrong and added 7.5% of NTSC setup to an HD signal. Raising the black level makes everything look brighter, and viewing the image on an LCD means you can’t necessarily see that the blacks are lifted, because LCD screens are a bit milky anyway. It was only when I noticed that playback looked darker than the live image, and looked at a waveform on the monitor itself, that I figured out what was wrong.
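To make the arithmetic of that failure concrete, here is a minimal Python sketch of what adding setup does to signal levels. The numbers are illustrative; no real VTR exposes a function like this.

```python
# A minimal sketch of how 7.5 IRE of NTSC-style setup lifts an HD signal.
# Illustrative values only, not taken from any specific piece of hardware.

def add_setup(ire: float, setup: float = 7.5) -> float:
    """Remap a 0-100 IRE signal so black sits at `setup` IRE instead of 0."""
    return setup + ire * (100.0 - setup) / 100.0

for label, level in [("black", 0.0), ("mid gray", 45.0), ("white", 100.0)]:
    print(f"{label:>8}: {level:5.1f} IRE -> {add_setup(level):5.1f} IRE")

#    black:   0.0 IRE ->   7.5 IRE  (lifted: everything looks brighter)
# mid gray:  45.0 IRE ->  49.1 IRE
#    white: 100.0 IRE -> 100.0 IRE
# On a slightly milky LCD the lift is easy to miss; a waveform monitor
# shows the raised black floor immediately.
```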
LUTs are another minefield. Is your LUT capturing legal range, extended range or “raw” (effectively no range–nothing is trimmed off the top and bottom of the signal)?
What color gamut are you capturing? Are you exposing your log curve correctly? How noisy is the camera and what can you do to fix that without compromising the image? Can post unwind all that you’re doing in the camera without pulling up an image, not understanding what’s going on, panicking the producer and getting you fired?
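The range question alone is worth making concrete. Here is a minimal sketch using the standard SMPTE 10-bit levels; the mismatch scenario itself is illustrative:

```python
# Legal (video) range vs. full/extended range for 10-bit signals.
# Standard SMPTE code values; the mismatch below is an illustrative scenario.

LEGAL_BLACK, LEGAL_WHITE = 64, 940    # 10-bit legal/video range
FULL_BLACK, FULL_WHITE = 0, 1023      # 10-bit full/extended range

def normalize(code: int, black: int, white: int) -> float:
    """Map a 10-bit code value to 0.0-1.0 under an assumed signal range."""
    return (code - black) / (white - black)

# A legal-range black (code 64) interpreted two ways:
print(normalize(64, LEGAL_BLACK, LEGAL_WHITE))  # 0.0    correct assumption
print(normalize(64, FULL_BLACK, FULL_WHITE))    # ~0.063 wrong assumption:
# blacks come out milky, and a LUT built for the wrong range bakes the
# error into everything downstream.
```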
How do you tell a production company that the cheap camera they bought can only be used with certain lenses, has a number of flaws that professional cameras don’t, and shoots unusable green screen? They don’t want to hear that… but it’s all true.
I’m okay that film is going away. It was fun but it seems like old news now. Digital is becoming more exciting every day. But imaging is now more complicated than ever, and I don’t think a lot of people get that. And beneath that we’re losing the core knowledge that the film industry has perfected over the last century that’s made it one of the most efficient industries in the world because no one wants to spend time in the trenches anymore–and we need that knowledge now more than ever.
Arthur Vincie
Chaotic Sequence
Great summarization of the issues, Art! I think you hit the nail on the head – what’s gained is the access to the tools; what’s lost is the expertise and the journeyman’s pathway.
There are a couple of more sinister aspects to this that are worth pointing out:
- I’m not entirely unhappy that labs and negative cutters have (mostly) gone away. I’ve had too many bad experiences with them, and hated how they sometimes lorded it over us (especially on indie shoots).
- BUT, those labs employed some great people, and gave them decent salaries. They often fought hard and long to salvage troublesome footage, and really understood the mechanics of the process.
- Film crews have gotten smaller, as cameras have gotten smaller and more light sensitive (and budgets have shrunk, both as cause and effect). I’m not unhappy about this either, since large crews can sometimes “wag the dog,” and were harder to manage and get to move quickly.
- There’s an industry-wide trend (especially at the high end) to make decisions in post rather than in production. This effectively neuters the director and DP, and puts the stick in the hands of the money people, who too often are motivated by expediency and fear. Even on indie shoots, I hear the “yes, it’s great, with 4K you can just punch in” argument all the time, even though it’s frankly a stupid reason to shoot 4K. Punching in should be a Hail Mary, not a substitute for making up your mind about shot composition on set. I hear the same argument about shooting RAW (just shoot oatmeal now, and play with it infinitely in post. Ugh).
- Not coincidentally, these producers lean heavily on CGI personnel and post staff to “direct the movie” in post, many of whom are not unionized and/or are overseas (that’s how you end up with Rhythm & Hues going bankrupt while winning an award for “Life of Pi.”) This is part and parcel of the larger macro trend of capital extracting profits by squishing labor as much as possible.
- Film has always been more resistant to this kind of post – just look at the amount of time and work that the SFX team had to put in on “2001” (years). You were taught to expose and compose close to the mark, so as to reduce post costs. If you’ve ever tried to line up an optical printer, you know just how easily things could get screwed up.
- Digital cinema helps facilitate this trend. It’s easier to manipulate in post (easier being a relative term, of course). It’s potentially less expensive to composite with. It’s less expensive to print (even the most expensive DCP is still far cheaper than a single release print).
- Film is a great archival medium. Digital is not. The film/tv industry has a lousy record of keeping its film stored properly; this bodes even less well for the future, as storage standards and interfaces change.
- Digital cinema vendors have no reason to stop “innovating,” even when that innovation is really just ramming things down customers’ and professionals’ throats. Why stop with a 4K TV when you can get an 8K TV? Why shoot 4K when you can shoot 8K? Today’s cameras and gear depreciate very quickly; there’s no way to build value on your gear. Film camera makers weren’t innocent in this regard, but by and large you could pick up a camera built in the ’70s and still shoot with it 20 years later (as long as it was in good shape).
- Related to this (and the economic issues above), I see a lot of DPs, editors, colorists, etc. on a constant “Red Queen” treadmill, running faster to buy gear they can barely pay off before it’s time to get the “next generation” of gear. Often, the next gen isn’t much better, but the producers don’t know that, they just want “the latest.”
- Directors, producers, and everyone else become obsessed with “K” counts, color fidelity, sensor size, sensor types, etc. (see above). This is largely beside the point. Audiences will watch almost anything, shot on almost anything, as long as the story is good and the dialog is audible. I have lost hours and days of my life discussing/researching/arguing with producers over the choice of format and camera that would have been better spent finding better locations, hiring better crew members, or letting the director rewrite the script again.
While I don’t miss film (labs, expenses, larger film crew sizes, negative cutters, etc.), I feel like we have to push against some of the less-than-utopic trends I’ve outlined above.
Steve Hullfish
CUT.N.COLOR
Film vs. Video is a bit of a misnomer.
One of the reasons the whole “versus” discussion has become a bit invalid is that “video,” as we knew it when this discussion first came up, is almost gone – replaced by digital. Maybe that’s a fine distinction, but “video” was originally an analog recording of an electronic signal from imaging tubes, then chips. What we have now is quite different.
Almost always, when we’re really comparing film versus “video” what we’re talking about isn’t really video at all. Is a RED camera or an Alexa really a video camera? I don’t think so.
When it was really a discussion of video versus film there were a number of things that defined why film was better than video and there were a number of things that people did to try to make their video look more like film. Oddly, most of those things actually involved degrading the image in very real ways.
One of the most obvious ways that video was distinguishable was frame rate. For Americans, this meant 30 fps versus 24 fps. Now, with digital cameras, that difference is largely gone. Time still has an effect, though. A film camera exposes the entire frame at once, but most digital sensors still require a small amount of time between reading the first pixel at the top of the image and the last pixel at the bottom. Global shutters deal with most of this.
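The skew that line-by-line readout produces is easy to estimate. A back-of-envelope sketch, with a made-up example readout time rather than any specific camera’s spec:

```python
# Back-of-envelope rolling-shutter skew. The 15 ms readout time is a
# made-up example figure, not a measured spec for any camera.

def rolling_shutter_skew(speed_px_per_s: float, readout_s: float) -> float:
    """Horizontal lean, in pixels, between the first and last scan lines."""
    return speed_px_per_s * readout_s

# An object crossing a 1920-pixel frame in one second, on a sensor that
# takes 15 ms to read top to bottom, leans about 29 pixels. A global
# shutter reads every photosite at once, so this term drops to zero.
print(rolling_shutter_skew(1920.0, 0.015))  # 28.8
```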
Grain is certainly another element. Oddly, though grain seems to be prized now, it’s really an artifact of the recording medium that limits resolution and clarity. The ability to add organic grain has certainly improved, but the texture grain provided, and the life it added to even blown-out highlights, were definitely among the things that made film more desirable than video. Conversely, the ugly “grain” of video – noise, basically – was something that made video less desirable.
The contrast curve of film was always seen as more desirable than the linear recording of the exposure in video. This was one of the easy fixes to make your video look more like film. However, the problem was that because the exposure on video was already limited to a fairly small tonal range, adding a nice “film” S-curve to it basically just served to crush the blacks and blow out the highlights even more. Nowadays, digital camera manufacturers almost all have their own “secret sauce” to deliver a filmic tonal curve.
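To see why, consider a minimal sketch: a generic sigmoid standing in for a “film” S-curve (not any manufacturer’s actual tone curve), applied to video levels that have already been limited to 0–1:

```python
# A generic S-curve applied to 0-1 video levels. The sigmoid is purely
# illustrative; it is not any camera maker's secret-sauce curve.
import math

def s_curve(x: float, contrast: float = 8.0) -> float:
    """Sigmoid tone curve mapping 0-1 input to 0-1 output."""
    return 1.0 / (1.0 + math.exp(-contrast * (x - 0.5)))

for x in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"in {x:.2f} -> out {s_curve(x):.3f}")
# in 0.00 -> out 0.018   shadows crushed toward black...
# in 0.10 -> out 0.039
# in 0.50 -> out 0.500
# in 0.90 -> out 0.961   ...highlights pushed toward clip.
# in 1.00 -> out 0.982
# Film applied this shape across a far wider captured range; video had
# already thrown the extremes away before the curve was applied.
```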
Color rendition is another film-versus-video issue, tied to video’s limited ability to record certain colors accurately and to the “personalities” of specific film stocks. Digital camera manufacturers have the same personalities or biases; often these are corrected out in later iterations of a chip. D.P.s used to choose film stocks specifically to take advantage of these personalities: the way one records warm skin tones, or the way another renders lush greens. Depending on the story and the desired look, the film stock helped craft the image as much as the light or the lens. Now we are essentially “personality-less.” Art Adams points this out. Often there’s no vision. And as some have pointed out, sometimes the studio or client doesn’t WANT a vision. They want to be able to play and change in post. Would a studio shoot “Raging Bull” in black and white now? Or would they protect themselves by shooting color and decide later what the look would be? To some extent, this ability to play with the look later on has “always” been here. That’s what negative color timers did. Want it more cyan or magenta? No problem. And even with telecine transfers, I worked on several film-shot projects that were heavily manipulated in telecine, so even when the D.P. was using film, that was still no guarantee that a “vision” was being created on set and followed through in post.
I did a bit of research on this subject of film versus video when I was writing my color correction books. One of the really interesting things that I discovered when talking to some plug-in manufacturers was that another aspect of why film has a unique look is that film not only records using chemicals – as opposed to electronics – but that those chemicals are recording the image on three separate layers – or four, actually – and that each of those layers records the image at a slightly different focus! The lens focuses the light onto the negative, but the negative is actually a three-dimensional object with different layers of emulsions and plastics and silver. A digital sensor is a single layer of receptors. Practically speaking, this doesn’t make much of a visual difference, but when we’re talking about the ephemera of why film images are somehow “magical” every little nuance helps.
Like others who have discussed this topic, I believe that the loss of film has resulted in a huge loss of discipline and craft and vision. As an editor, looking at hundreds of hours of “film” for a movie or documentary or even some corporate project is enough to make you want to strangle the “cinematographer” who endlessly rolls, seemingly unwilling to hit the stop button after hitting it initially at the first call of action in the morning. Almost all “single camera” productions nowadays are really two cameras at least; this is only financially possible in the digital age. I was on a recent film shoot where the camera was rolled, on a regular basis, almost 90 seconds before action was called. That kind of waste and lack of discipline was unheard of 30 years ago.
When I first started as a TV news photographer, I learned the craft from a senior photographer who instilled in me the necessity of shooting my stories with a minimum of footage. We were shooting U-Matic and Betacam at the time, and taking more than a single 20-minute tape to shoot a two-minute package for the evening news was severely frowned upon. Since then, I’ve regularly had to piece together two-minute stories from HOURS of footage.
D.P.s and film sets have certainly changed with the move from film to digital. Obviously, “video villages” have existed since the invention of the video-tap, but there was a wonderful discipline for the director and D.P. to understand the way film recorded the image and know how to expose and IMAGINE the way the camera would capture the work without being glued to a TV monitor. You used a light meter. You used your eyes on the performance of the actors in front of you. Now, film-making looks so much more complicated.
I never went to film school. I never shot on motion picture film (still film, yes). I never edited film on a KEM or Moviola. There are times when I wish I could say that I did, but mostly, I’m glad for the ease of being able to create with modern tools. I remember writing school papers with a typewriter. I’m much happier writing this on a computer.
Adam Wilt
Camera Log
Instead, the question should be about what this means to and for content creators. Do young filmmakers make a film vs video comparison at all?
Sometimes they do, as in this story about a short shot on the last batch of Fuji film. The short itself is now password-protected, but the story is a delight.
Do more experienced filmmakers feel the limitations of video?
I grew up on film and transitioned to video. Yes, there are limitations, just as there are advantages. If film is like oils, video (or “digital”) is like acrylics: both are paints, and you can paint an equally great picture with either one, but they require different working practices and have different looks, and each can do things the other can’t.
Film’s greatest advantage, in my opinion, is its naturalistic handling of highlights. Film’s shoulder (what would be the “knee” in traditional video parlance) has a seemingly infinite capacity to roll overexposed highlights off in a smooth S-curve. Only recently have digital cameras been able to approach this capability without unnatural hue distortions or saturation errors, never mind the harsh clipping that was common a decade ago and still plagues many digital cameras today.
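The difference is easy to sketch. A minimal comparison, with an illustrative rolloff function rather than a densitometric model of any real stock:

```python
# Hard digital clipping vs. a film-like highlight shoulder. The shoulder
# function is illustrative, not a model of any real emulsion.
import math

def hard_clip(exposure: float) -> float:
    """Early-digital behavior: linear response, then a flat wall at 1.0."""
    return min(exposure, 1.0)

def shoulder(exposure: float) -> float:
    """Smooth rolloff that approaches 1.0 asymptotically."""
    return 1.0 - math.exp(-exposure)

for stops_over in (0, 1, 2, 3):
    e = 2.0 ** stops_over  # exposure relative to nominal white
    print(f"+{stops_over} stops: clip={hard_clip(e):.2f} "
          f"shoulder={shoulder(e):.3f}")
# The clipped channel reads 1.00 at +1, +2 and +3 stops (all detail gone),
# while the shoulder keeps compressing highlight detail into the headroom.
```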
Another thing: film is surprisingly resolution-agnostic. I’ve seen 4K (or 8K; I don’t recall) scans of Super8mm film – a medium that, by conventional lp/mm metrics, isn’t much better than SD video – resolving fine detail in the grain structure to yield an image that’s perceptually sharper than Super8mm “should” be, clearly better than an HD scan. While I’m not saying we should all go rescan our Super8 (what most high-res Super8 transfers really show is how soft and lousy many of the low-cost Super8 zooms of that era were), the stochastic nature of photochemical image capture – the film grain position changes from frame to frame – captures fine detail, over the course of many frames, with a greater finesse and subtlety than a raster-imaging system can. With an HD camera, you capture 1920×1080 now and forever more. With a film camera, you’re not locked and limited by the resolution du jour; while the acutance of the stock and the resolving power and microcontrast of your lenses do put limits on what can be pulled out of your images, those limits are often well beyond the oft-quoted nominal resolutions of various film formats.
What skills are being lost in this transition?
Art mentions the loss of the apprenticeship system, but I attribute that to the increased affordability and availability of the tools more than any fundamental difference in film vs video. The same dynamic has played out in typesetting, graphic design, and publishing, too, never mind the music industry. Even “traditional,” TV-style video has seen a similar cultural transition. Back when a decent 3-tube camera ran $20K – $80K or more and a broadcast-quality VTR wasn’t much less expensive (and in 1970s or 1980s dollars at that), you didn’t get your hands on the kit without working your laborious way up through the ranks. And don’t get me started on the inaccessibility of online A/B/C roll editing with graphics and ADO!
For me the biggest loss is the requirement that the photographer be able to visualize the scene without immediate feedback: with video, you see your pictures live; with film, there is the delay of at least several hours before what you shoot can be seen (and nowadays, unless you’re working on a big-budget feature, episodic, or TVC, you may wait two weeks for a lab to accumulate enough film to run a batch through the soup). Film forces you to look at light and color with an active eye: you have to understand at least the rudiments of exposure, color, and contrast, because you can’t see what you’ve captured until it’s too late to do anything about it. That understanding lets you walk through a location or a set and light it, by eye or by meter, without reference to the camera. It lets you light a scene by informed intention, not by trial and error; even when you have the crutch of camera and monitor available, you can still work faster and with more certainty.
Of almost equal importance: with film, magazines have limited capacities, and when you hold the run trigger down you can feel the dollars running through the gate. Film teaches discipline and discrimination: you roll on the important stuff, cut when you can, and conserve your stock. You don’t just leave it rolling between takes, and your shooting ratios are lower. You think before you squeeze that trigger. The result is a much lower workload in post, without sorting through endless hours of pointlessly-captured footage.
What benefits besides the obvious ones are being gained?
The flipside of film’s lack of playback: video’s immediacy lets you see the results of your lighting and composition decisions directly, and with the level of abstraction that an LCD or OLED screen in a frame provides (a film camera’s optical finder, often with lookaround area, is still a periscope version of reality itself, not a faithful representation of what you’re actually capturing). The learning curve is immensely accelerated, as you can make a change and see its effect with only a frame’s delay: the feedback loop is shortened from hours or days to a fraction of a second. It allows me to see mistakes faster, fix them before any damage is done, and it augments the learning experience by the immediacy of the feedback.
This is critical when you’re given a new camera to work with: in just a few minutes of looking at pix on a screen (and on the waveform monitor and vectorscope), you can get a feel for how that camera handles extremes of color and light: how it responds to overexposure, underexposure, color, and motion.
What will this mean for future generations who will have to look back through digital archives rather than film canisters?
You can pull century-old film off a reel by hand and see what it is. If you want to retrieve a digital file from the vaults, and it’s not in a current format, I really, really hope you’ll have access to a compatible media drive, the right hardware interface to connect it to, and the driver and OS support to retrieve the file… never mind a playback tool that understands the disk format, file format, and codec. Does it sound too extreme to suggest mothballing entire computer systems (and spare parts) every couple of years, just so we’ll be able to play back that Media Composer 8.0 project in 30 years? Or open a clip using the Apple Intermediate codec? Or decode an FCPX Camera Archive of H.264 footage? Videotape archivists do that already with survivalist-style caches of quadruplex Ampexes and other obsolete VTRs.
How does this change impact other areas of production, including post and distribution?
This change has been playing out over the past thirty years or so; you could write a whole book about it (and some people already have). Anomalies like Mr. Spielberg aside, that revolution has long since swept through post.
Most content delivery to TV networks has gone digital to the extent that they don’t even want to see tapes (or any other physical medium) any more: it’s all file delivery over IP. Distribution of film prints has collapsed as DCPs have taken over, to the point where, but for the intervention of Spielberg and friends, Kodak would already have left the film market entirely.
This revolution won’t be televised, because it’s already old news!
Scott Simmons
The Editblog
I can’t add much to the great discussion generated thus far, so I’ll just relay a story about film. I used to work at a post-production house that offered film transfer as part of our services. We had a DaVinci 2K Plus with a Spirit telecine. It was a beautiful system that had transferred 35mm film for (among many other things) some of the most iconic country music videos ever made. As film began its slow and steady slide into obscurity (especially in the secondary markets beyond the Nolans and Tarantinos), this once state-of-the-art film transfer system had been reduced to transferring grainy, old and often damaged film libraries, usually from 16mm.
I don’t remember what year it was exactly, but it was deep enough into the RED revolution that the facility and colorist had graded many beautiful projects of 4K origination. While we weren’t finishing at 4K, we were working at full 10-bit 1920×1080 on high-quality monitors, so we were seeing the best of the best of 4K acquisition at the time. But we had seen enough of it that there wasn’t much coming through that really made you stop and watch just to admire a quality image that was shot well and graded beautifully.
One day, among the usual hustle of a busy post-production week, we walked into the machine room and saw absolutely stunning images going by on one of the monitors. It was a beach somewhere in the Caribbean, and while that is an image anyone would like to see, it was unlike other beach footage we had seen go through in previous weeks. Something about this footage was just different. Come to find out, it was 35mm film being transferred; I’m guessing it was some of the last purposefully shot 35mm film to ever hit that Spirit. There is just something different about well-done 35mm film that the digital packages still have a hard time touching. Yes, RED has made some fine films (and music videos), and the ARRI Alexa is up on many screens at the multiplex, but remembering the post-production folks crowding around that monitor, watching that 35mm footage being transferred, reminds me that film still has that special something that digital can barely touch. Articles like this will attempt to articulate what that special something really is, but until you’ve worked with film, taking it all the way through the process, it’s something you may have never seen … never felt. Sadly, some (a lot? most?) of the “film” schools around the country don’t shoot film anymore, so new students never get the chance. Of course, who can blame them when film manufacturers don’t make the stock anymore.
Mark Christiansen
Production Values
I always loved working with film for VFX compositing. There was no way to do it without understanding how to decode 10-bit Cineon, and I had the privilege to work at The Orphanage which wrote a set of tools for HDR handling that more or less forced the After Effects team to respond by giving the application a 32-bit float linear compositing mode.
There are a lot of people working with digital images who don’t understand how light works, and that, more than the demise of film, is the hurdle to creating cinematic images with a digital camera. Add to that the fact that our monitors don’t (yet, for the most part) display more than 8 bits per channel, and that the internal color mode of the system will continue to use that format for the foreseeable future, and we are all working with a broken model. It’s still much faster and more efficient for images to remain at a lower bit depth, but knowledge of how to handle overbrights—whites brighter than what your monitor considers absolute 255, 255, 255 white—is pretty scant.
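For the curious, the Cineon decode mentioned above is well documented. Here is a minimal sketch using Kodak’s published reference values; the point is that a float pipeline must leave the result unclamped, or the overbrights are gone:

```python
# Classic Kodak Cineon 10-bit log-to-linear decode, using the published
# reference values (black 95, white 685, 0.002 density per code value,
# 0.6 display gamma). A minimal sketch, not production color management.

REF_BLACK, REF_WHITE = 95, 685  # 10-bit printing-density reference points

def cineon_to_linear(code: int) -> float:
    """Decode a 10-bit Cineon code value to scene linear, white = 1.0."""
    gain = 10.0 ** ((code - REF_WHITE) * 0.002 / 0.6)
    offset = 10.0 ** ((REF_BLACK - REF_WHITE) * 0.002 / 0.6)
    return (gain - offset) / (1.0 - offset)

print(cineon_to_linear(95))    # 0.0   reference black
print(cineon_to_linear(685))   # 1.0   reference (diffuse) white
print(cineon_to_linear(1023))  # ~13.5 the overbrights film kept above white;
# clamp this to 1.0 in an 8-bit pipeline and the highlight detail is lost.
```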
You can point even the cheapest film camera directly at a light fixture, or even at the sun, and you’ll get back an image with room for detail in the highlights. We’re all still somewhere between wary and terrified of what happens if we do this with a digital camera, and so the choices that are made in ignorance are too often a bit bland.
Worst of all, those choices are still mostly made on the Canon 5D and equivalent cameras. What everyone knows about these cameras is that they can produce an image that looks filmic, but that’s because of the glass in front of the camera, not the image sensor’s meager ability to handle dynamic range, which is not even a lot better than the phone in your pocket. If the comparison is between a film camera and an Alexa in the hands of a camera crew that knows what they’re doing, there’s not much competition.
Woody Woodhall
Sound for Picture
My colleagues are skilled and experienced artists who’ve spent time in both worlds of film and video. I too started with celluloid, mag tape, synchronizers and the dreaded squawk box. There was a personal investment with film, a tactile investment, with grease pencils, splicers, tape and glue, that is completely gone in the digital world. My first job in Hollywood was as an assistant film editor; I spent my days digging in bins of work prints, looking for numbers next to the sprocket holes, finding specific clips and hanging them up by scene for my editor. The origins and DNA of film strand through every aspect of the process. I’ve heard people complain about the legacy use of words like “bins,” wondering why they aren’t just called folders. I often see very specific film effects, like film melting against the projector lamp, or poorly aligned film running jittery through a projector, or an edit with a tape mark across it, and many other visual effects that echo the “analog” of film projection. A lot of today’s audiences have no real-life frame of reference for these popular visual ideas.
Although I’ve cut both film and video, my main stock in trade in editing is audio, so I’ll offer a slightly different take on the seemingly eternal film vs. video debate. Obviously, the original films were silent; it took a while for sound to become part of the process technologically. Even today, it sometimes feels that sound is an annoying, however necessary, part of the production process. By nature, with a few exceptions, film was primarily produced with double-system sound. There would be the full camera equipment and camera, grip and lighting teams, but a dedicated audio recording system and a team for the location sound as well. Cameras, lenses and mounts for the pictures; recorders, microphones and booms for the audio. Picture and audio: double system. This arrangement served many a classic film, for almost a hundred years of film sound.
The advent of video camera technology created a number of amazing abilities in a single device. Lightweight cameras with crisp zoom lenses and low light capabilities came with built-in microphones and audio recording. Amazing to be sure, but of course, also introducing a whole other set of issues and problems. The imagers and lenses created challenges for the pictures of course, but the audio technology and recordings were often compromised as well.
The lure of this single device, with a zoom and the ability to shoot without additional lighting, was seductive and fallacious. Of course, many talented video cinematographers today possess the same broad skill sets as their experienced forebears, and create wonderful sound and pictures. But the power and ease brought by the new technologies have also crippled many a decent production, because the skill set was beyond the experience of the user. Creating content, on film or digitally, is a difficult process that requires skill and experience to execute well. Many of the issues seen in digital production could and should have been stopped before post-production, be it blown-out backgrounds on the image side or blown-out recordings on the audio side. There can be a “casualness” to the video content creation process that, perhaps due to the expenses back then, was not really a part of the film production process. Generally, in film production, each shot was seriously considered, had purpose and meaning, and was only rehearsed, shot and “printed” if it had value.
Audio took a big hit when the first wave of digital video cameras started really gaining momentum with the DVCAM and MiniDV formats. Serious filmmakers would use a separate mic, but it was typically (and often still is) fed back into the video recorder. A much superior option to using the camera-mounted mic, to be sure, but a far cry from using a dedicated audio recording device. Many people really don’t understand that the microphones and audio recorders used in production have unique qualities and characteristics, with very individual strengths and weaknesses. It is quite similar to the different quality of images produced, depending on the particular camera or lenses that are used. Filmmakers will typically know their production’s camera make and model, the digital recording format (raw, ProRes, H.264, etc.) it is shot with, and the various lenses employed. In fact, they may expound on the outstanding quality of the RED, or the Alexa, or the Sonys, but they will be at a complete loss to identify the different microphones that were used, or the device that made the audio recordings.
Video, and specifically digital video, also brought additional advances due to the relative ease of intense manipulation in computers. Without expensive labs and processing you could re-frame shots; change, add or delete colors; and alter many aspects of the digital image. The same sort of digital magic was expected with the audio. Except that with audio, rather than being used for artistic expression as with the pictures, it was expected to fix problems and mistakes. Distorted, off-mic and poorly executed location recordings were brought to audio post with the misconception that since it was “digital”, it could be easily “fixed” with a few mouse clicks. “A little bit of EQ and noise reduction…” Hard lessons were learned: it is just as essential to get the audio right on the set, interview or location as it is the footage. There is a lot of digital computer magic to be had for both image and sound; however, it too has limitations.
DSLRs have in recent years created a resurgence in double-system sound. In fact, since audio is now recorded in the camera onto the digital file and simultaneously on a dedicated audio recorder, the practice has become known as dual-system sound. Since a DSLR that shoots stills and video was not designed like a dedicated digital video camera, its audio limitations were apparent right out of the box. Fortunately, some very smart filmmakers were talking about the importance of great audio recordings, and about how these cameras were insufficient for audio. At the same time, there was a boom in really high-quality handheld recorders, available at very affordable prices. Filmmakers had a higher motivation to spend a few extra dollars on their soundtracks, since the cost of entry to decent sound had gotten considerably lower.
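Syncing the two recordings is the part software made painless. A minimal sketch of the waveform-alignment idea behind such sync tools, using synthetic signals and illustrative numbers:

```python
# How dual-system sync tools align a camera's scratch track with the
# dedicated recorder's track: cross-correlation. Synthetic data only.
import numpy as np

def find_offset(scratch: np.ndarray, recorder: np.ndarray) -> int:
    """Sample offset of the scratch track within the recorder track."""
    corr = np.correlate(recorder, scratch, mode="full")
    return int(np.argmax(corr)) - (len(scratch) - 1)

rng = np.random.default_rng(0)
location = rng.standard_normal(8_000)               # 1 s of audio at 8 kHz
scratch = location[1_000:6_000].copy()              # camera starts late...
scratch += 0.5 * rng.standard_normal(scratch.size)  # ...and is noisier

print(find_offset(scratch, location))  # 1000 samples, i.e. 125 ms late
```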
Film vs. video: both amazing mediums. The celluloid process was the origin of what has grown into the very sophisticated art form of entertainment that we call filmmaking. Most of us no longer use celluloid in our process, but we will never be free from the origins of this amazing form of storytelling and communication. We will “roll”, we will “cut” and we will “screen”. Whether you create your content with a film camera or a video camera these days, it’s a choice.
However, most people don’t call themselves “videomakers” or “digital content creators”; instead, they refer to themselves as filmmakers. That seems right, no matter what the medium.
Terence Curren
AlphaDogs
I agree with all the great points made so far. I was recently asked if I could pick just one change in our industry that was the most impactful and I said it had to be the invention of digitizing video. That led to NLEs, digital cameras, computer based VFX handling, and eventually, the death of film. It also led to the commoditization of media creation which is having such a negative effect on salaries and employment in our industry.
When I started in this industry back in the early ’80s, there was a distinct divide between video people and film people. I loved film, and I tended to dismiss those who said film would eventually be replaced by video. When the RED camera came out, I knew film was in trouble. Digital projection was the nail in the coffin, as the real money was made on release prints, not in the acquisition stage.
Besides the unfavorable points made by others here, the actual experience of viewing the image is what really suffers. Digital just can’t handle the transition from blown-out highlight to image the way the organic process of film does. The halation effect in film adds a pleasing softness to those transitions that the electronic approach of digital makes too crisp. Certainly a good DP can avoid this with careful lighting and shooting. But let’s be honest: very little of the content being created now is being shot by good DPs.
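That organic rolloff can be approximated after the fact. A minimal sketch of one common approach, bleeding blurred highlight energy back into a linear-light frame; all parameters are illustrative:

```python
# Faking halation on a linear-light image: take the energy above "white,"
# scatter it with a blur, and add it back. All parameters are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def add_halation(img, threshold=1.0, radius=12.0, strength=0.15):
    """img: 2-D linear-light luminance array. Returns a softened copy."""
    highlights = np.clip(img - threshold, 0.0, None)  # energy above white
    glow = gaussian_filter(highlights, sigma=radius)  # scatter it outward
    return img + strength * glow                      # soften the transition

# A hard blown-out square on a dark field gains a gentle falloff around its
# edges instead of digital's crisp cliff from clipped white into the image.
frame = np.zeros((256, 256))
frame[96:160, 96:160] = 4.0
print(add_halation(frame)[80, 128] > 0.0)  # True: glow spilled past the edge
```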
There are many legacy elements of film that still impact our workflows today, starting with 24-frame production. That’s a look many of us prefer, but going forward it will be phased out as younger audiences get used to the hyperrealism of 60 or 120 fps releases in 4K HDR. The linear tracks in NLEs are another example, one that FCPX has started to break. Even lighting is changing, as digital cameras can dig so much more detail out of the low end, and that affects storytelling as we get more neorealistic productions. In the end it is just another change in our toolsets. No matter what happens, story will still be king, regardless of what it is shot on and how it is distributed.
Jose Antunes
DSLR Moving Pictures
There are people here, like Art Adams, whose comments say it all. But he also admits that “Digital is becoming more exciting every day” and it is from there I want to take it. I’ve no experience with film beyond a few Super 8 films I did at a young age. I also did video (VHS-C, if I remember correctly, mostly to try new camcorders) but photography is my main area, and it’s from that experience I want to continue this conversation.
Photographers, like cinematographers, have had to change the medium they use to tell stories. Coming from an Ilford HP5 and FP4 background, through transparencies like the early Ektachromes, then Fujifilm Velvia, Ektachrome Panther and Lumiere, to having sensors as my film, made for some drastic changes. I’ve adapted rather well, and while I do remember the darkroom days and would not mind doing some guided visits for learning purposes, I do not want to go back, and I am happy using computers as the developing trays to “develop” my images.
We’ve come a long way from the initial sensors, and I do not think there has to be a fight over which is best. We should be glad that there is still choice, although, when it comes to film, the choices grow fewer each year. Fujifilm abandoned the area, for example. I mention them because they are, I believe, a good example of a company that, having deserted film for commercial reasons, has not given up on creating the “look” of film.
The fact that Fujifilm has, for a long time, emulated some of their emulsions on their digital cameras and sensors is a sign of hope. They are a company that wants digital to have a, let’s call it, “soul”. Fujifilm has not only recreated their classic transparency emulsions (Velvia, Provia and Astia), along with colour negative and some black & white stocks, in what they call Film Simulation modes; they’re also exploring new concepts, creating “films” that answer requests from users.
The recent Classic Chrome is one such example. Introduced in some of the new cameras, this mode does not aim to recreate any known emulsion; it aims to create a feeling, a mode reminiscent of the images each individual carries in their mind and of the physical prints of such images. Fujifilm asked professional photographers what they wanted their photos to look like and worked from there to design a mode, adjusted in camera, that gives a specific result. But Classic Chrome is not a canned solution: users can adjust the camera’s control features (including shadow and highlight tones) to match the scene and emotion, achieving a broader range that better reflects their expectations in each moment.
Although we’re in the realm of photography, nothing stops users, apparently, from selecting the same option for video captured with a Fujifilm camera. I explore the theme in an article published here at ProVideo Coalition, looking deeper into aspects of Fujifilm’s pioneering approach in some areas of film, and their move to digital. So, while things may be different, maybe we’ve come to a time when companies have to start looking at ways for digital to get the “soul” of classic emulsions. Or a new “soul” altogether. Fujifilm seems to be leading the way there, right in-camera.
Art Adams
Stunning Good Looks
Film teaches discipline and discrimination: you roll on the important stuff, cut when you can, and conserve your stock. You don’t just leave it rolling between takes, and your shooting ratios are lower. You think before you squeeze that trigger. The result is a much lower workload in post, without sorting through endless hours of pointlessly-captured footage.
This. There’s no sense that someone has to deal with all this extra footage down the road, and deal with it quickly. There’s also no realization that your camerapeople are still handholding often-heavy cameras, and not cutting for a half hour to an hour at a time puts strain on them that will result in poor shots later.
There’s a disease called “directoritis,” and it’s when directors become so self-focused that they ignore the fact that they are working with human beings and push them beyond reasonable bounds. (My sense is that it’s related to insecurity, as experienced and confident directors tend not to do this.) I saw this once when I was shooting second unit on a commercial. The talent would shoot with first unit, and then come do funny improv stuff with us while first unit was lighting the next setup.
The second unit director had directoritis. He was constantly dreaming up shots that were so complex that by the time we were ready to shoot first unit would call the talent back. 18 hours in, as first unit lit their last setup, we got the talent for ten minutes or so. “Okay, here’s what I want you to do,” said the director. “Run around the store really quickly and try on as many items of clothing as you can.”
Our utterly exhausted talent said, “What?”
She still did it, but she was completely whacked. Our director never noticed.
As Adam said, film teaches discipline. It also teaches awareness: directors have to learn when they have the shot and when they don’t, and they have to stop periodically during reloads to think about it. Modern directors can roll for hours without reloading, and sometimes they do. That doesn’t make the performances any better; it just exhausts the crew and the talent and makes more work for editorial. On modern budgets it’s more important than ever to learn to work efficiently, because the cost savings of video are being eaten up: there’s a need for more media than ever before, but the total dollar amount budgeted to produce all that media hasn’t increased much over the last few years.
Film made us think consciously about what we were shooting, and whether we had it or not. Video is easier in this regard, but it also makes us lazy. It allows some directors to shoot, shoot, shoot without regard to how much workload they are adding to post.
This is critical when you’re given a new camera to work with: in just a few minutes of looking at pix on a screen (and on the waveform monitor and vectorscope), you can get a feel for how that camera handles extremes of color and light: how it responds to overexposure, underexposure, color, and motion.
This is critical. While I use my light meter constantly to try to force myself into some semblance of my former film habits, it really is critical to view a monitor of some sort when shooting high-quality video. Not only are camera models highly variable, and new cameras a complete mystery until used for a while, but they all have idiosyncrasies that only show up under certain conditions. Sometimes they can’t be seen well in a viewfinder.
If you want to retrieve a digital file from the vaults, and it’s not in a current format, I really, really hope you’ll have access to a compatible media drive, the right hardware interface to connect it to, and the driver and OS support to retrieve the file… never mind a playback tool that understands the disk format, file format, and codec.
I have a bunch of material archived using the Microcosm codec, which was the standard archiving format for a local and very famous visual effects house that went out of business years ago. It’s a great codec but it’s also ancient history as the company that made it hasn’t supported it in years.
My original copy of the codec stopped functioning after an OS upgrade a few years ago, and it was pure luck that I found the last version available online for free, distributed as a curiosity. (It was invented in 2002.)
Film was the primary origination, distribution and archival medium for motion imaging for about a century. These days the most common “high quality” origination medium for anything that’s not a feature film is ProRes, which has only been around since 2007. What will we be using in another eight years, and will we be hunting around the web looking for free copies of ProRes to read our old files?
The ProVideo Coalition experts have weighed in, but now it’s your turn. Continue the discussion in the comments section below and/or on Facebook and Twitter.