
CAMERAS: The Zacuto Fallout Continues

Tests themselves are less important than how the results are presented. In this case, one innocent quote from a non-DP has spread like wildfire.

By Art Adams | August 18, 2012

What's more important than technical accuracy? Making sure your test speaks for itself. When others speak for you--especially if they are non-experts--the messaging can quickly spin out of control.

I believe Zacuto, and all the people who participated in their 2012 camera shootout, had only the best of intentions. Unfortunately, they don't know much about doing comparative testing. The part they didn't think through was the political part: how does this test affect manufacturers? Cinematographers? Producers and directors?

That's why most tests let the results speak for themselves. You present your methodology, you show the footage, and you ask if anyone has any further questions. If you present conclusions then anything you present should be backed up by visual evidence that everyone can see. And, most importantly, if the audience only ever sees the test results, and never sees the supporting documentation, they should still walk away with an accurate perception of what transpired. All the important information must be presented at once, because you only have one chance to make a first impression that will linger with the audience for a long, long time.

This test contained a number of fatal flaws, many of which I detailed here. There are two that I want to stress again before I show how the test messaging spun out of control:

(1) Allowing DPs to relight the same set, with the same actors and the same action and camera movement, to make prosumer cameras look similar to high-end cameras, was a huge mistake. Zacuto says that this was really meant to be about the separate DPs and not the cameras, but that's clearly not the impression given when these DPs changed some of the lighting and nothing else. The camera move remained the same, the composition remained the same, the broad strokes of the lighting remained the same.

String all of these clips, shot by different cameras, together and you have a camera shootout. It's not a "documentary" and it's not a DP shootout--the emphasis is clearly on the cameras. If you want to do a DP shootout, then let each DP light the set however they want from scratch and give them a single camera to work with. When the only variable changed is the DP we can now examine how that one variable affects everything else.

It's incredibly silly to say that these tests are about both the DPs and the cameras when it's impossible to know where one leaves off and the other begins. Zacuto created a situation where a camera like the Panasonic GH2 looks really good not because it's a phenomenal camera but because its crew knew how to work to its strengths and compensate for its weaknesses--but there's no way to quantify what part was the camera and what part was the crew. All the casual viewer knows is that the GH2 looked damned good. Oh, and they know one other thing...


(2) Francis Ford Coppola really liked the GH2. Or at least that's the message that's traveling around the Internet right now.

Test results should be presented with as little comment as possible, or the comments should be backed up during the screening by drawing attention to different aspects of the frame. A celebrity audience offering commentary on what they saw can add significantly to the viewing experience, but only if those viewers are celebrities because they are visual experts. Francis Ford Coppola is a magnificent director, but he does not have as good an eye as most of the people in that room, who have done nothing but create commercially successful and spectacular images for decades--yet he is, by far, the most famous.

Allowing a famous celebrity, who is not a DP, to participate in this presentation resulted in a quote that has spread across the Internet like wildfire--and the quote, taken out of context, doesn't mean what many think it means. It does, however, feed into a bias--created originally by RED--that low-end tools are now good enough that anyone can execute Hollywood-level cinematography with inexpensive tools. This is true, but--sadly--the emphasis falls on the tools first, when what really matters is the craftsperson using the tools.


As of Saturday, August 18, 2012, at 1pm, here's what a Google search turns up based on the keywords "Zacuto test Coppola":

43rumors.com:

"The audience had no way to know which movie had been made with a certain camera. At the end they had to pickup their favorite choice. The most voted camera was the Panasonic GH2!"

How much of what they liked was the DP's choice in lighting versus the camera itself? We'll never know.

EOSHD:

"Francis Ford Coppola gives his answer in the new Zacuto Shootout, choosing in order of preference the Panasonic GH2 (lit by Colt Seaman), Alexa and Epic"

"That tells you that in a controlled studio environment like this you can adjust the lighting to match the dynamic range you have available in the camera, and that 15+ stops of dynamic range is god damn overrated."

"'The thing I was most impressed with is that some guys or gals with something to prove did better at lighting than the established cinematographers with a good camera' - Bruce Lundeen (33 min 13 seconds). This is how you make a great film, a great shot, a great scene - Passion, hunger, creativity and a $700 camera."

Does anyone think that the takeaway is "Get a great DP!" as opposed to "Buy a $700 camera!"?

Gizmodo:

"The contenders included a wide range of cameras, ranging from the $65,000 Sony F65, right down to the iPhone 4. Audiences of filmmakers around the world were shown each camera's results, the names of each camera remaining a mystery. The most favored machine, to the shock of many, turned out to be the $700 Panasonic GH2 micro four-thirds camera."

The article also says: "It's impressive that a consumer camera could stand up to professional cinema rigs, but there is a great degree of subjectivity at play here. The skill and decisions of each cinematographer definitely played a key role, as did the personal preferences of those voting." But we have no way of knowing how the skills and decisions of the cinematographer affected the outcome...

Luminous Landscape (forum comment):

"But what I find amazing is that all the guys involved were experienced and also could lit according to each cam specs. The fact that many experienced eyes like Coppola put the hacked GH2 in first place when viewed on theater is still amazing me, because all the other cams had also the optimized lightning etc..."

It's a valid point, unfortunately...

Imaging-Resource.com:

"The results were then screened to an audience of filmmakers, including Oscar-winner Francis Ford Coppola, allowing them to select their favorites without identifying which camera was used for each clip. At the end, the cameras' identities were revealed, and surprisingly even the iPhone 4S video wasn't instantly recognizable to all present. And which were the favorites for Mr. Coppola? The Panasonic GH2 fared best, triumphing over the vastly more expensive Red Epic and Arri Alexa."

Personal-View.com:

"For digital theatre screenings in 2k or 1080p, extremely high end cameras simply don't look that much different to the lower end ones. It is therefore the lighting and the skill behind the camera that makes a larger percentage of the difference. As the level of skill involved here behind the camera was all very high, each scene was similar but the GH2 was still able to overcome the F65 or Epic, mainly because - A) It was handled better B) The lighting was most creative C) The technology inside the GH2 was advanced enough to deliver on this shoot relative to a $70,000 F65" Yep, as I always try to remind you it's your creative skills that do matter. The camera you use "almost" doesn't matter anymore!"

My emphasis. And it's not true. The camera matters tremendously, and must be matched to your budget and project needs.

Ryan E. Walters, cinematographer (who participated in the test), at ryanewalters.com:

"As you watch the episode, listen to how many people say that the letter C (Epic) is in their top picks. Even Francis Ford Coppola had it in his list, and it was above the Alexa. So everyone on Reduser should be breathing a sigh of relief. After all, there were people who placed C ahead of F (Alexa), and there were people who didn't even have F on their list. I would think this would be good news to the ears of the Reduser community. Unfortunately, this little fact doesn't seem to matter much, as all of the news and talk is about Camera B, the GH2. Should this really be surprising? Of course not- I expect a camera like the Epic to do well. So when it does well, I think, yep it did what it should have done- that is not news. But when a camera like the GH2, which costs almost nothing, does well and better then I expected, that is news."

He goes on to say that the biggest differences between the low-end cameras and the high-end cameras could be easily seen on a big screen. I had to watch the Vimeo feeds, as I suspect most people do. The differences aren't as obvious there, so the audience commentary makes a HUGE difference.

pmanewsline.com:

Headline: Video test: $700 Panasonic camera beats $65,000 competition

PhillipBloom.net:

"A different DP could easily have made the GH2 the worst of the bunch or the best. It's harder to screw up a camera with a terrific DR but very easy to screw up cameras with the limited DR. You really need to nail it!"

He makes a good point. You could love a cheap, prosumer camera simply because that DP did a subjectively better job of lighting the scene than did the DP tasked with making a higher-end camera look good. The GH2 looking so good may have nothing to do with the camera itself, but the test's presentation makes it impossible to know for sure.

MarketWire:

"The biggest surprise by far was the overwhelming response to footage shot with the $700 Panasonic GH2 by DoP Colt Seman and Johnny Zeller. Another surprise was the quality of the iPhone footage shot by Michael Koerbel. The audience choices were varied, as expected, proving that the test is subjective and everyone will choose based on their personal tastes."

Are they choosing cameras because they like the way the camera looks or how the DP lights? Who knows? There's no way to tell.

This is a press release put out by Zacuto itself. It does attempt to give credit to the cinematographers for their exceptional work, but does anyone doubt that the takeaway is that a $700 camera "won"? When Zacuto itself emphasizes that the GH2 received an "overwhelming response," do they realize what kind of message they are putting out there?

This was also picked up by Reuters.


News.Doddleme.com:

"The Revenge of the Great Zacuto Shootout has concluded and the winner is … going to shock you. With such heavy hitters as the Sony F3 & FS100, the RED Epic, the Arri Alexa and the Canon C300, you'd expect these three to slug it out for the top honors amongst some of Hollywood's cinema elite. But while these professional grade (and professionally priced) camera rigs were busy squaring off against each other, a sub $1,000 camera snuck in and impressed a great many tasked with evaluating each rig. For those who use it, that wasn't a surprise. But the big surprise was who it actually impressed."

Want to guess who that person was? I'll give you a hint: he wasn't the most visually sophisticated person in the room, but he was--by far--the best director...

DPReview Forums:

"The results are in from Zacuto's Revenge Of The Great Camera Shootout 2012. The majority of those at the cinema screenings - including Francis Ford Coppola preferred Colt Seaman's lighting and the capturing of it by the Panasonic GH2."

Were they voting for the GH2 or for Colt Seaman? Most likely Mr. Seaman, but how do we know objectively?

Techcitement:

"But the option that appears to have stolen the show in votes, both online and at the actual events was choice B. This was even the camera picked by the Godfather himself, Francis Ford Coppola, who attended the screening at Skywalker Ranch. Clearly B was the $30,000 RED EPIC or the $60,000 Arri Alexa, right? Or at least the $16,000 Sony F3 or Canon C300?"

No, camera "B" was the GH2. It's hard to know what voters were responding to when watching this footage in a small video window on a computer screen. Most probably they were impressed by the lighting, but what does that have to do with the camera?

Now, what I didn't know when I started this article was that the website NoFilmSchool conducted an online vote, and the GH2 won. This meant that voters were looking at a compressed Vimeo feed, which probably affected their perceptions as much as the objective look of the camera and the subjective lighting executed by the DP. This kind of thing can't be avoided when posting video to the web, because most people won't be able to see this in a theater. The problem, though, is that the lack of side-by-side objective comparisons of the cameras makes it incredibly hard to see the differences in a small video window. I suspect it would be quite surprising, however, to see what differences CAN be seen under those circumstances, assuming that the cameras are looking at the same set with the same lighting. Once the lighting changes, we really have no idea what we are comparing.

Nearly every article above emphasizes that the tests are subjective and a lot of what people are responding to is most likely Colt Seaman's excellent work. That's the problem, though: how do we know, when we view the results from each camera, whether we are responding to the camera's look or the DP's touch? We have no idea.

The other problem is that every article first mentions how surprised everyone was with the GH2's overwhelmingly positive response, and then credits Colt Seaman's contribution second. While nearly all of these articles do the right thing and try to put credit where credit is due, they all--consciously or subconsciously--mention the camera first. The DP reference is always second. This is the way people think at this moment in time. It can't be avoided.

Camera tests are hard. Presenting their results is even harder. It helps a lot when experts show you what they see in each image, and you can see what they're talking about because you can see test images side by side and note where something happens in one frame that doesn't happen in another. When, though, some of the contributors are experts and others are simply famous you're opening yourself up for a world of political trouble if the famous people disagree with the experts. In the media, famous people always win.

I suspect that the inclusion of Mr. Coppola had more to do with marketing than with critical visual analysis. He's an incredible artist, but he's not looking at those images in the same way that an ASC member is.

In the end it could be pure luck that the GH2 got so much attention in these tests. If Mr. Seaman had instead lit for the Sony F3, would that camera be getting all the attention now? What if he lit the iPhone segment instead? We'll never know. What we do know is that if he'd lit the scene once, and never changed the lighting, we'd have a much better idea of what the cameras can do. At this point we can only know that every DP does work that some find subjectively better than others, and some cameras appear subjectively better than others, and without isolating one or the other we can't know which is making the difference.

It's because we can't know which camera is objectively better, or which DP is subjectively better, or see results from multiple cameras in the same frame at the same time, that these tests are largely a failure. It is possible to learn something from them, but this could have been a tremendous opportunity to see how each camera--each digital filmstock, if you will--stacked up against another.

Q: Which film manufacturers would line up a bunch of cameras loaded with different film stocks in front of one set, let one DP per filmstock light the set to their liking, and then screen the results as a film comparison shootout?

A: None.

Art Adams is a DP--and educator--who endeavors to shoot tests whose results cannot be interpreted incorrectly. His website is at www.artadamsdp.com.


"ProVideo Coalition shut down comment registration a few months ago due to spam attacks. Commenting will be back and fully functional when the site relaunches which is happening soon. In the meantime, you can engage with the community via our Facebook or Twitter pages or send an email to info(at)provideocoalition.com to get a message posted in the comment section of this article."


Just so you know, I monitor the PVC Facebook page as well as the Cinematography Mailing Lists (CML), so if you can't post comments here try posting something in either of those places. I'll most likely see your post and respond.

I understand there's a discussion happening on DVXUser but I doubt I'll get around to joining that this weekend. If I don't walk away from this now I'm going to end up sleeping on the couch.

-Art

Comments

IEBA | August 18, 2012

Well said.
Really, there’s little more that can be added.

In an effort to make sure each camera looked as good as possible, they (the people who executed the test) created so many variables that it's impossible to see the footage and understand what makes it look good: the lighting, the DP, the grading, the camera settings or maybe, lastly, the camera.

Art Adams | August 18, 2012

Great point. It’s pretty clear that a lot of work had to be spent making several of the cameras look “normal.” The grading didn’t do the FS100 any favors… if I recall correctly the highlights went an interesting shade of cyan.

You’re right, it’s really hard to know what to think about a test like this that had so many variables that interact that one doesn’t quite know what happened where. The hardest thing to do in a test like this is define what it is you’re really testing for, and how to keep things fair, and that inevitably means that some cameras are going to look better than others but maybe not as good as they could. At least, though, you learn what their strengths and weaknesses are.

In this case the goal seemed to be to make them look the best possible by manipulating both the lighting and the grade, at which point how can you learn anything about them at all?

With adequate crew, lots of lights, 90 minutes of lighting time and 90 minutes of grading time -per shot- I suspect we can make any camera look good. Let me know as soon as that starts happening on a regular basis. I want some of that action.
