Vashi Nedomansky trained a team of Avid editors and assistants for their switch to Premiere Pro for “Deadpool” and delivered a workflow that put the big-time box office hit in theaters everywhere.
Vashi Nedomansky has a growing list of editorial credits on films like “An American Carol” and “Sharknado II,” but he’s also called upon as both a colorist and an editorial consultant. Much of this consulting work is due to his long-standing use of Premiere – most recently training a team of Avid editors and assistants for their switch to Premiere Pro on “Deadpool.” The movie opened Valentine’s Day weekend and brought in $60M on Thursday and Friday nights alone, setting it up to break multiple records.
HULLFISH: Tell me about your workflow and why you were brought on board for “Deadpool.”
NEDOMANSKY: I was brought on board by post supervisor Joan Bierman, but before that I sat down with both Tim Miller and Jennifer Miller at Blur for a meeting before they even started shooting – probably a month before. I was recommended by Adobe; I had previous experience training and working on “Gone Girl,” and before that I had cut for about eight years on Premiere Pro. I started with Bandito Brothers, the production company that made “Need for Speed” and “Act of Valor.” I was their lead editor for four years, and they were an entirely Adobe house from 2005 or 2006. So I had a lot of experience, and obviously Premiere has come a long way since then – sometimes it was crashing back then. When I sat down with the “Deadpool” team, they asked, “So why would we want Premiere Pro? We don’t know it. We haven’t heard of it.” But the thing is that Tim Miller is good friends with David Fincher, and he wanted to see what Fincher’s team did to successfully get “Gone Girl” through post and whether that could be translated to “Deadpool.” Since I had worked on “Gone Girl,” they wanted me to come in, explain it to them and run them through what they should expect. It was mostly a technical meeting: “What kind of computers should we have? What storage?” At the end of the day, every NLE is pretty much the same in terms of functionality – marking in, marking out, trimming, all that kind of stuff – but they wanted an NLE that was on the cutting edge of technology, always staying ahead of the curve while embracing everything that’s there currently. At the end of the meeting they were pretty confident they wanted to build out, because they were actually building five edit suites at their studio in Culver City. That was a major investment on their part – to build it out and commit to cutting with Premiere. So I think it was a combination of me having experience with Premiere, Fincher having had a good experience, and the desire to be able to handle effects and everything moving forward.
HULLFISH: What would you say has changed in Premiere to make it better as a feature film editing tool even since “Gone Girl?”
NEDOMANSKY: As a feature film editor myself, I’ve cut nine feature films – the last four of them on Premiere – and the reason I did those last four on Premiere is its flexibility, its power and its stability. Like other editors, I’ve cut on every other platform. For me, Premiere was the easiest to interact with and the easiest to customize. Once the Mercury Playback Engine arrived under the hood in about 2010, it opened up the ability to cut natively in any format and any codec, which was crazy. Not having to transcode on any project – big or small – not only saves time and storage space but lets you cut right away. There are lots of films I’ve worked on where you’re on set and they literally pull the cards out, you dump them in, and you’re cutting and showing the director immediately. That wasn’t possible in FCP7 or Media Composer at the time, especially with films shooting so many different formats. I mean, every film that comes out that was shot on RED or ALEXA always has a 5D shot… always has a Blackmagic Cinema Camera shot… always has a GoPro shot… always has a Sony FS7 shot. Even if it’s one shot, if you don’t have to transcode, you get closer to the finish line creatively, and it gives you more time to play with the cut and finesse it. Between when post began on “Gone Girl” and when it began on “Deadpool,” Premiere took an exponential step up.
HULLFISH: What specifically has changed that has made your life on a feature easier?
NEDOMANSKY: There are a lot of changes. There are about 200 new features, bug fixes and requests that were raised by Fincher’s team over the stretch of “Gone Girl” that were eventually applied to the version we were using. Sometimes it takes someone from outside your house to call stuff out and say, “This is what we need” – things you would never have thought of. For example, one of the biggest things since “Gone Girl” is Render and Replace, which is now standard in every release of Premiere Pro. Fincher on “Gone Girl” and Miller on “Deadpool” had so many Dynamic Links in their timelines, and there’s a limit when you’re shooting 4K, 5K, 6K, whatever it may be: once you have a couple of Dynamic Links active and both After Effects and Premiere open, it bogs down the system, especially with those high levels of information. So with Render and Replace, being able to swap a link out for a QuickTime file that makes no hit on the processor is huge. We were doing split screens and effects on “Gone Girl” that were being swapped out on the fly, and once you have a timeline with 50 or 60 shots like that in a reel, it’s so nice to have a QuickTime file from the Render and Replace function instead of dealing with an active, open Dynamic Link. That’s a huge takeaway that saved our day so many times.
HULLFISH: Render and Replace takes place in the background?
NEDOMANSKY: No. You choose to do it. You have a shot in your timeline, and let’s say you open up After Effects, you pull a key, you do something crazy… and when you come back to Premiere, all those effects from After Effects are live in your timeline. You right-click on the effect and you select Render and Replace, and it literally makes a QuickTime file and removes the After Effects composition that’s open – the link – so it’s now just an asset in your project. And what’s even better is that if you decide, “I need to change something,” you can right-click and go back a step – back into that After Effects project inside the link – and the render file will disappear; when you render and replace again, it swaps in the more current version.
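What Vashi is describing reduces to a reversible swap: bake the live composition to a flat file, but keep a pointer back to the comp so the bake can be undone. Here is a minimal sketch of that idea in Python – a conceptual model only, not Adobe’s API; the clip names, comp path and render callback are all hypothetical.

```python
# Illustrative sketch of the Render and Replace idea -- not Adobe's API.
# A clip keeps a back-pointer to its live comp so the swap is reversible.

from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class TimelineClip:
    name: str
    source: str                          # file the timeline currently plays
    live_comp: Optional[str] = None      # hypothetical After Effects comp
    _restore: Optional[Tuple[str, str]] = None

    def render_and_replace(self, render: Callable[[str], str]) -> None:
        """Bake the live comp to a flat movie and play that instead."""
        if self.live_comp is None:
            return                       # nothing to bake
        self._restore = (self.live_comp, self.source)
        self.source = render(self.live_comp)   # e.g. returns "sc42_baked.mov"
        self.live_comp = None            # the link is no longer active

    def restore_original(self) -> None:
        """Undo the bake: reattach the live comp, drop the render."""
        if self._restore is not None:
            self.live_comp, self.source = self._restore
            self._restore = None

# Usage: bake a linked comp, then step back to the live link later.
clip = TimelineClip("sc42", source="sc42_plate.mov", live_comp="splits.aep")
clip.render_and_replace(lambda comp: "sc42_baked.mov")
clip.restore_original()                  # back to the open Dynamic Link
```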
HULLFISH: One of the things I was shocked by in my interview with Kirk Baxter, who edited “Gone Girl,” was that Premiere needed to have five or six separate projects – one for each reel of the movie – instead of the entire movie existing in a single project. Is that getting improved? That’s a crazy way to have to edit a movie.
NEDOMANSKY: Over the last two versions, one of the things they’ve been improving is how long it takes to open a project. That’s an issue on any project. During post on “Gone Girl” they improved it by a factor of 10, so a reel that took 10 minutes to open only took one minute. And then during “Deadpool” they improved it by another factor of 10, so that same project that initially took 10 minutes was now only taking 10 seconds. When the director wants to see a cut or a shot, you can’t wait 10 minutes to open a project or a reel, so that was addressed. But most importantly, on most productions – going back to film days – it’s nice to be able to break the film down into components, reels of about 20 minutes each, only because when you’re handing off to sound or to VFX, it’s easier to hand off a reel than the entire project. So it’s not a question of “you can’t do it”; it’s a question of the functionality of the workflow for the entire team in post-production. I know there were six reels on “Gone Girl,” six reels on “Deadpool” and six reels on “Hail, Caesar!” But I cut “Sharknado II” on one 90-minute timeline in a single project. I cut a documentary last year on PTSD using nine different codecs on a single 90-minute timeline – no reels – but I wasn’t cutting in 4K and I just wanted to move forward. As an editor I like to have a bird’s-eye view of my timeline: I can see the length of shots, and I can better judge visually what I’m looking at, how far along I am and what the pacing is. So you can do it on one timeline or you can do it in several reels.
HULLFISH: The film I’m cutting in Premiere right now is over two hours long and it’s all in one timeline. I do think it takes WAY too long to load up that project, but you only have to do that once, as long as you don’t crash.
NEDOMANSKY: There’s a convenience to that if you’re willing to wait that one time – ’cause after it’s loaded once, hypothetically you don’t have to open it again.
HULLFISH: On “Deadpool” there was no transcoding of these giant files? You were actually using raw camera footage?
NEDOMANSKY: We cut at 2K. We had the original ARRIRAW files, which were 2.8K, but they were the full raster from the camera, so we used an extraction pulled off of that. Because the film was 2.35, 2K QuickTime ProRes files were created and delivered – a conversion of the ARRIRAW – and then we had a matte to make our 2.35. We had to have a couple of versions, because the actual extraction was top-heavy – it sat higher in the frame – so you couldn’t just put a matte on the original image: that would have been centered, and our image was raised up, so we had to have a custom matte made. Also, when we were cutting we wanted to see just the image and not the timecode and everything running by, but the studio wanted to see the take and timecode, so there were two versions being passed around. It was a 2K ProRes timeline, which is the same thing “Gone Girl” did.
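To make the matte problem concrete, here are worked numbers for a hypothetical 2048×1152 dailies frame – the interview doesn’t give the exact raster or offset, so the figures are illustrative only – showing why a centered 2.35:1 matte fails when the extraction sits high in the frame.

```python
# Hypothetical numbers -- the exact dailies raster isn't given above.
# Shows why a centered 2.35:1 matte fails when the framing is raised.

def matte_bands(frame_w: int, frame_h: int, aspect: float, raise_px: int = 0):
    """Return (top, bottom) matte heights for an `aspect` extraction,
    with the active picture shifted up from center by `raise_px`."""
    active_h = round(frame_w / aspect)   # picture height in pixels
    spare = frame_h - active_h           # total pixels to matte off
    top = spare // 2 - raise_px          # less black above the picture...
    bottom = spare - top                 # ...more below it
    return top, bottom

print(matte_bands(2048, 1152, 2.35))               # (140, 141): centered
print(matte_bands(2048, 1152, 2.35, raise_px=60))  # (80, 201): raised 60 px
```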
HULLFISH: I don’t understand, because on the feature I’m cutting, we’ve got a slightly odd aspect ratio – essentially the same aspect as a full-frame DCP – and to display that, we just created our sequences with that exact aspect ratio, so there’s no matte needed at all.
NEDOMANSKY: The extraction on the QuickTimes was strange because of using the upper part of the frame in the camera, so they added black so our extraction would sit in the middle. It was a weird process. We tried doing that offset in Premiere, but if you pass off a clip with an offset, someone in VFX would get it and it would be off from what they expect, so we couldn’t do it that way. We just had it done for the deliverables – the dailies. It’s not as confusing as it sounds.
HULLFISH: How did they organize the project?
NEDOMANSKY: That’s why I was brought in. The entire team…
HULLFISH: You’ve cut on FCP7 and Avid?
NEDOMANSKY: I’ve cut features on FCP7, Avid and Premiere Pro. I’m pretty fluent in all three, only because as an editor I want to be hired and I want to know what all the technologies are. If you just say, “I’m a Final Cut Pro 7 editor,” you’re going to lose a lot of jobs. I think it’s our duty and our job as editors to become familiar with all the platforms, because we’re being hired for our storytelling and our communication skills, not what platform we’re editing on.
HULLFISH: The reason I asked is: there’s pretty common wisdom – and I’m interested in your perspective as one of the few people I know who – like me – have cut features in all three NLEs – that if you’re switching from Avid to Final Cut Pro, or Avid to Premiere, or Premiere to Avid, you don’t want to simply transpose the way you cut in the LAST NLE when you move to the NEXT NLE. You don’t want to try to force Premiere, for example, to work like you did in Avid. So if you make the transition from Avid to Premiere, what SHOULDN’T you do like Avid and what CAN you do like Avid?
NEDOMANSKY: The first step to making them comfortable was to show them features of Premiere they weren’t even aware of. Because I’m an editor, I can relate to them editor-to-editor, not Adobe-to-editor. I’m not pushing anything on them; I’m only trying to help them. So I’m showing them stuff like, “You should be aware of this” – and the tilde key blew them away. Whenever you hit the tilde key, whatever panel is active goes full screen.
HULLFISH: I love that feature… I miss that when I move back to Avid.
HULLFISH: Describe this stacked-timeline idea – the “pancake timeline” – some more… I’m not getting it.
NEDOMANSKY: In your normal window you have a source monitor, a program monitor and a timeline. What you do is open a second timeline and attach it underneath the first, so you have two open timelines, one on top of the other. Whichever timeline is active is what’s in your program monitor, and you can grab a clip from one timeline and drag it straight into the other without changing tabs – without, like Avid, loading one timeline into the source and then swapping timeline views. With Premiere you can just click and drag between the two timelines. That allows you to put a 10-hour selects reel in one timeline, go through it, grab whatever you want, put it into an empty timeline underneath, and quickly build out an assembly or a cut or a finer selects reel. Without that, you’d be clicking on a shot, loading it in the source and then dragging it into an empty timeline. This lets you go through large amounts of footage much more efficiently with two active timelines.
HULLFISH: A lot of the editors I’ve interviewed in the last year or two make selects reels, so that would be a great way to take those selects reels and build them into a real scene sequence.
NEDOMANSKY: Absolutely. Correct. This is something Angus and Kirk did on all their films. That’s the way they edited in Final Cut 7, and I’d seen it done in some other places, so I wanted to show them that. It’s just very efficient. I’ve done a couple of posts about it, so you can see more detail in those:
http://vashivisuals.com/adobe-cs6-5-editing-tips-for-music-videos/ (Tip #3)
http://vashivisuals.com/the-pancake-timeline-maximum-limit-is-24-hours/
HULLFISH: Talk to me about organization. So you’ve got your project pane and in the project pane is … what?
HULLFISH: One of the complaints Kirk had with Premiere was that when the director wanted to see reel 3 and they were working on reel 2, he’d have to shut down the reel 2 project and launch the reel 3 project and it would take 10 minutes. So, to avoid that, you could have another project that held the sequences of every single reel?
NEDOMANSKY: Correct. Totally. That’s how we did screenings. That’s what we did when we had to make a DCP: we’d just open a project and import the six most current reel sequences, and that project contained ONLY those six sequences. Also, the tenfold speed improvement has been implemented since Fincher and Baxter were working on “Gone Girl,” so it’s less of a factor now. But that’s contingent on the size of the project and the resolution of the media.
HULLFISH: Those 2K files you said you used? Some dailies company probably produced those for you, right?
NEDOMANSKY: Correct. We had the ARRIRAW because we were doing the VFX in-house, so we had to have the original source material, but the 2K material was being generated on set.
HULLFISH: On Avid, I know how the metadata transfer takes place to get you linked from the 2K QuickTime back to the original raw file, but how does that work in your workflow? So in Premiere, if you have a shot that’s ready to be handed off to VFX, how do you link back to the original raw camera file?
NEDOMANSKY: We did have the metadata in the ProRes, but since everything was named the same and everything was in-house, I think they just used a search for the filename. The editing team all had Premiere on their systems, and they were on After Effects as well, so we were just handing over the latest project. It was that simple. We were also using FileMaker and Intelligent Assistance – some third-party software was being implemented as well. As you said, there’s a certain way every studio wants stuff handed off, like with sound, so we used XML exports out of Premiere Pro to deliver thumbnails, per their request, to share internally. We used Intelligent Assistance’s Change List software to help match the AAF outputs of an Avid workflow for sharing information among post-production divisions.
I know that Phil Hodgetts definitely made some changes for us specifically, which was really cool, because it was functionality we needed: these Avid editors were comfortable with their pre-existing handoff workflows, and he made the changes to allow that.
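For readers curious what those XML handoffs carry: Premiere Pro can export a sequence as Final Cut Pro XML, which downstream tools like Change List read. A minimal sketch of inspecting one, assuming the standard xmeml tag names; the file name here is invented for illustration.

```python
# Minimal sketch of inspecting a sequence XML exported from Premiere Pro
# (File > Export > Final Cut Pro XML). Tag names follow the xmeml format;
# the file name is hypothetical.

import xml.etree.ElementTree as ET

def clips_in_sequence(xml_path: str):
    """Yield (clip_name, source_in, source_out) for each clip item."""
    root = ET.parse(xml_path).getroot()
    for item in root.iter("clipitem"):
        name = item.findtext("name", default="(unnamed)")
        src_in = int(item.findtext("in", default="-1"))
        src_out = int(item.findtext("out", default="-1"))
        yield name, src_in, src_out

for name, src_in, src_out in clips_in_sequence("reel3_v12.xml"):
    print(f"{name}: {src_in}-{src_out}")
```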
HULLFISH: So why switch a bunch of Avid editors over to Premiere? What were the things that they were able to exploit in Premiere and the Adobe universe that couldn’t have been done in Avid to make “Deadpool” efficient to edit in?
NEDOMANSKY: I think because of the heavy VFX workflow, the opportunity to test out Dynamic Linking, the ability to have everything come out of Premiere as a hub, and the history with Fincher and his experiences. Initially there were discussions of trying to cut at 3K or even 4K, and that right away ruled out any other NLE, be it Avid or whatever else.
HULLFISH: I thought FCP-X could do 4K native.
NEDOMANSKY: Well, I ran a huge number of tests to see how many 4K and 5K files we could play and how much effects work we could handle. We really pushed the limit to see what was possible, and that was part of the process. Given the technology and the tools, we wanted to see how far we could push it, and we felt we should go at 2K… I don’t know if Avid could even do 2K when we started. I know it was locked in HD for a while. You’d know better than me what Avid could do.
HULLFISH: I think that 2K and 4K were introduced for Avid somewhere in the fall of 2014. We had to cut “War Room” in 1080p because we started that in June of 2014.
HULLFISH: I agree that is a major consideration… just earlier this week when we recorded this, Avid announced 64 tracks of audio… finally. But it definitely would have been a consideration a year ago.
NEDOMANSKY: I made the editors a lot of audio preset effects, like telephone voice, and we made some preset visual effects in addition to the pre-existing ones already built into Premiere, which are incredibly powerful. I built a bunch of custom visual effects, and I also had a friend of mine, Jarle Leirpoll, who also does work on Premiere, contribute some real-time handheld camera-shake effects of various kinds, all based on keyframes he pulled from real cameras. So it wasn’t a wiggle function, it wasn’t a fake digital shake; it was actually mapped from real camera moves. They were using those on shots that were locked off but that they wanted to feel handheld, or to give a little of that organic motion to, and it was all real-time – you didn’t have to go someplace else to do it. We did it all inside Premiere.
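The distinction Vashi draws – replayed real motion versus procedural noise – is easy to see in code. This is a sketch of the idea only: the motion samples below are invented, whereas the actual presets carry keyframes traced from real handheld takes.

```python
# Sketch of replaying recorded camera motion instead of generating noise.
# The (frame, dx, dy) samples are invented; real presets use keyframes
# traced from actual handheld takes.

import math

RECORDED = [(0, 0.0, 0.0), (6, 3.5, -1.2), (12, -2.1, 2.8), (18, 1.0, -0.5)]

def recorded_offset(frame: float):
    """Linearly interpolate the recorded path at an arbitrary frame."""
    for (f0, x0, y0), (f1, x1, y1) in zip(RECORDED, RECORDED[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return x0 + t * (x1 - x0), y0 + t * (y1 - y0)
    return RECORDED[-1][1:]              # hold the last position

def wiggle_offset(frame: float, amp: float = 3.0):
    """A fake digital shake, by contrast: uncorrelated pseudo-noise."""
    return (amp * math.sin(frame * 12.9898) * math.cos(frame * 4.1414),
            amp * math.sin(frame * 78.233))

for f in range(0, 19, 3):
    print(f, recorded_offset(f), wiggle_offset(f))
```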
HULLFISH: Let’s talk about the Dynamic Linking. Was there somebody on the team… were there VFX houses that were not local to the editing team?
HULLFISH: Did they do any Dynamic Linking to Adobe Audition?
NEDOMANSKY: I don’t think there was a need to. They were able to do all the editing and mix inside Premiere. Julian and Matt were both amazing with audio and there was no need to go to Audition.
HULLFISH: There’s a lot of stuff in Audition that I love, so I find myself going back and forth to Audition…
NEDOMANSKY: I send stuff to Audition all day long because I know Audition so well.
HULLFISH: But nobody else on the team was as comfortable in Audition as you.
NEDOMANSKY: Yeah. I think it was enough to be learning Premiere. I don’t think they wanted to be taking on another app, but more importantly, everything they really needed was inside of Premiere anyway.
HULLFISH: As I move back and forth between apps, there’s always something you miss that’s in the OTHER application. What were some of those things where you were so glad you were in Premiere?
NEDOMANSKY: The editors wanted to do multi-cam, but they wanted it set up so that when you hit match frame it would go back to the original camera with all six original production audio channels. The way that happens in Avid is the reverse of what happens in Premiere Pro, so they were saying, “This is what we want, but in reverse.” Then Matt, who had never used Premiere Pro, was just messing around and he hit some button that switched it to work exactly the way they wanted – and we didn’t even know Premiere could do that. He solved the problem as a new user before I could even be called in to address it.
HULLFISH: Just because it was somewhat intuitive?
HULLFISH: Tell me about how multi-cam was used and how many cameras were going on and how were they organizing the multi-cam in the bins.
NEDOMANSKY: Matt was getting them in and syncing them – just grab two shots, right-click, make a multi-cam clip. You get to choose which audio channel is going to play initially, so you know you’re usually just getting the stereo mix. If you wanted to assign it to a lav, it would create the multi-clip with just that, and obviously you could change that later. He could also do it in bulk: grab an entire folder for a scene, select everything, and you can quickly make multi-cam clips of it all and make your isolations visible. Most of the time it was two cameras. On the heavier action scenes and fight scenes there were definitely more cameras – I think Tim said that at some point they were using seven or eight on some of the bigger gags. But most of it was two cameras.
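Under the hood, syncing angles by timecode is just arithmetic. A minimal sketch, assuming 24 fps non-drop timecode jam-synced across cameras; the timecode values are invented for illustration.

```python
# Syncing two angles by timecode, reduced to arithmetic. Assumes 24 fps
# non-drop timecode jam-synced across cameras; the values are invented.

FPS = 24

def tc_to_frames(tc: str) -> int:
    """'HH:MM:SS:FF' -> absolute frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

a_start = tc_to_frames("14:21:03:10")   # A-cam clip starts rolling first
b_start = tc_to_frames("14:21:05:02")   # B-cam starts 40 frames later

offset = b_start - a_start
print(offset)  # 40 -> slip B-cam 40 frames earlier and the angles line up
```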
HULLFISH: Talk to me about the project window and what was in it.
NEDOMANSKY: Each project was different. Julian’s just had the current stuff that was going on. It was honestly kept very simple: Current Cut, Sound Effects, VFX, Stills, Slugs… stuff everyone’s seen. There was nothing crazy about it. We tried to keep things from floating around, and Julian was a trooper about it – cutting like a maniac for nine or ten months – and it was the assistant’s job to clean up his project window at the end of the day, because he was pulling in stuff from everywhere. He was doing what he could to keep it organized. There was nothing magical or any different from most movies. Matt, the assistant – his project was a lot more complex, because he had to archive all of Julian’s previous cuts, organized for Julian and Tim, so Matt’s projects were immensely more complicated and deeper.
HULLFISH: How did that collaborative process work in Premiere? One of the things that’s typically attributed to Avid as its strong suit is the collaborative workflow. Obviously for “Deadpool” you guys needed to collaborate as well.
NEDOMANSKY: Twofold: we had six edit bays once we were all up and running, and everyone was pulling off the same Open Drive system that Jeff Brue created specifically for this project – an entirely solid-state, hugely immense storage array. He’s the same guy who set up “Gone Girl,” so this was an extension of that with more drives. Talking to Tim, I think he said he shot more footage than Fincher did on “Gone Girl,” and that was 550 hours, so if his math is right, “Deadpool” was over 550 hours. That all lived on the Open Drive on one system, six people pulling off of it, and collaboration was just communication. Each editor in each bay had their own folder with their name on it – as I said before – and their most current work would be in there, with other folders for their previous work. That way you don’t have to call someone and say, “Hey, where’s your latest cut?” If you went to their folder, you knew that whatever was in there was the most current. And we never had any issues – no showstoppers in terms of the shared storage or anything like that. So you can share and collaborate in Premiere very well. Six isn’t a large number, but about six months before that I had worked with Time Warner Sports in El Segundo. They had just transferred from FCP7 to Premiere Pro, and I trained 65 editors over two weeks. They were running 30 edit bays off the same storage, with five or six broadcast deadlines every day – you can’t have any screw-ups – and they were fine.
HULLFISH: But there’s a difference between sharing media and sharing a project.
HULLFISH: Love it. That’s awesome. Who were those other people hanging off of the shared storage other than Julian and his assistant, Matt?
NEDOMANSKY: The VFX editor who was constantly updating the latest shots coming back from internal VFX and external vendors.
HULLFISH: When you are replacing effects like that with Dynamic Linking or you’re doing the keying right in Premiere, and you’re switching from what you personally created as the editor and what the VFX house has created, are you maintaining the original edit choices so you can go back to your own edit? So let’s say I add a green-screen performance to a plate… that gets sent off to the VFX guys and comes back… in MANY workflows, that VFX shot gets added in a layer ABOVE the work I actually did, that way, if the performance needs to change or the timing needs to change, I can dump the VFX shot on the top track and go right back to my own rough comp so I can change the performance or timing or positioning or whatever while I wait for the VFX house to create the revised final effect.
NEDOMANSKY: We were doing the exact same thing in Premiere – just stacking the timeline, sometimes seven different iterations going vertically. It’s much easier to do it that way. That was something Matt was doing, and it was always available. For Julian’s timeline we were trying to keep things as lean as possible, but he always had access to the original.
HULLFISH: And were they doing the same thing with sound? Were there sound handoffs as well?
NEDOMANSKY: Until we finally sent it in for the final mix, everything was done inside Premiere. That’s how good these guys are, Julian and Matt. They had the sound built out so deep that until we sent off the reels for the final mix, we didn’t have anything coming or going like that. We were doing our own mixes along the way.
HULLFISH: Were those temp mixes stereo or surround?
NEDOMANSKY: Surround. The rooms were all built for surround. They’re all 5.1. This is the kind of movie where you want the sound to wrap around you for sure.
HULLFISH: This has been extremely enlightening. Thank you so much for sharing your expertise. I really enjoyed talking with you about this.
NEDOMANSKY: Thank you Steve. It’s been my sincere pleasure to chat with another editor about how we spend most of our days! Feel free to include my contact info for your readers:
Vashi Nedomansky
Twitter: @vashikoo
Facebook: VashiVisuals
Email: Vashi@me.com
Coming shortly: my interview with “Deadpool” editor Julian Clarke, who also edited “Elysium,” “District 9,” “Chappie” and the 2011 version of “The Thing.” Read past Art of the Cut interviews HERE and follow me on Twitter @stevehullfish to hear more about upcoming interviews with “Spectre’s” Lee Smith, “Risen’s” Steve Mirkovich, “Whiskey Tango Foxtrot’s” Jan Kovoc (cut on FCP-X), and “Mad Max: Fury Road’s” Margaret Sixel.