
HPA Tech Retreat Day 1: Super-sized Session: More, Bigger, but Better?

Looking at HFR, HDR, high-res, immersive sound, and more...

By Adam Wilt | February 19, 2013

The Tech Retreat is an annual four-day conference (plus Monday bonus session) for HD / Video / cinema geeks, sponsored by the Hollywood Post Alliance. The program for this year's show is online at http://www.hpaonline.com/2013-program and the opening "Super-sized Session" was on "More, Bigger, but Better?"

Herewith, my transcribed-as-it-happened notes from today's session; please excuse the typos and the abbreviated explanations.

(During the rest of the week, you can follow the Tech Retreat on Twitter with hashtag #hpatech13, thanks to various Tweeters in the audience. I will of course post my notes at each day's end.)


Intro:

By the end of the year, the prediction is that we may not deliver any films on film in the USA.

3 new technologies, because "it's all about the story":

1) HDR: we've been at 16 foot-lamberts in the theater for a long time, limited by flicker. Digital doesn't flicker (the same way); it's time for brighter pictures.

2) HFR: variable frame rates to tell the story. 48fps has been underutilized; not for everything, but useful. 48fps for 2D for greater immersiveness.

3) Immersive sound: $50-$100K per theater, so not cheap, but big, loud 7.1 sound (and beyond: object-based sound) really helps.

4K/8K? Not so much. Fine for production, display, not as important for distro. Hard to tell the difference from 2K on the screen [I'm not so sure, myself, but they didn't ask me! -AJW]


Leon Silverman:

We look longingly at bigger and better — yes, Virginia, size does count! How many Ks are OK? More res, more color, more dynamic range, more sound (5.1, 7.1, IMAX, Barco Auro, Dolby Atmos all mixed for the new Disney Oz film). UltraHD is quad HD, 3840x2160, but true DCI 4K is 4096x2160. Bigger is better, but costlier: more storage, and 4K adds 7-10% to the cost of VFX over 2K. Current HD / DPX tools need extension for higher res, wider gamuts, OpenEXR, ACES; move past proprietary color sciences and the "snowflake workflows" (HPA 2011; provideocoalition.com/awilt/story/hpa_tech_retreat_2011_day_1/) they engender. How to properly screen / evaluate these new media? How do we store it: "What is a Master? How does that relate to Archive? With so much stuff, where do you put the stuff?" 300 TB at the end of "Tron". The new Oz film was 300 TB of compressed RED raw. At $0.01 per Gig per month (Amazon S3), 5 TB (a day's shoot) is $51.20/month; 300 TB is $3072/month... compared to storing a can of film: 12.5 cents a month!
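
For the curious, here's a minimal sketch of that storage arithmetic, assuming the quoted $0.01/GB/month S3 rate and binary terabytes (1 TB = 1024 GB); the figures are the speaker's, not a current price quote:

```python
# Cloud-storage cost arithmetic from the talk (assumed rate: $0.01/GB/month,
# with 1 TB = 1024 GB).
PRICE_PER_GB_MONTH = 0.01  # USD

def monthly_storage_cost(terabytes):
    """Monthly cost in USD to keep `terabytes` of material in object storage."""
    return terabytes * 1024 * PRICE_PER_GB_MONTH

print(monthly_storage_cost(5))    # one 5 TB shoot day    -> 51.2
print(monthly_storage_cost(300))  # a 300 TB show library -> 3072.0
```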


4K for TV

Phil Squyres, SVP Tech Ops, Sony Pictures Television (SPT)

A case study of Sony's experiences in producing episodic TV in 4K. Is 4K the new 3D? Big topic at CES, seems more viable than 3D; already in cinema; easier to extend HD tools to 4K than to 3D.

SPT has shot five productions in 4K: "Made in Jersey" (CBS, DP Daryn Okada, F65, finished in HD); "Save Me" (NBC, DP Lloyd Ahern, F65, 4K finish at ColorWorks); "Masters of Sex" (Showtime, DP Michael Weaver, F65, 4K finish at ColorWorks); "Michael J Fox" (NBC, DP Michael Grady, F65, 4K finish at ColorWorks); plus "Justified" seasons 3 and 4 (RED Epic 4K, DP Francis Kenney, delivered to FX in 4K). "Justified" is a candidate for 4K remastering in the future. "Breaking Bad", shot 35mm 3-perf, is being remastered in 4K at ColorWorks (season 2 in process). SPT will shoot 3-5+ pilots in 4K this year, either all F55 or combos of F55/F65 cameras. (All these examples are roughly balanced between comedies and dramas.)

Why 4K? Future-proof library for UltraHD. Creative reason: 4K raw capture gives incredible latitude, color space, freedom for blow-ups and repositions. Raw is closer to film; can treat more like film (latitude, color). F65 especially handles mixed color environments very well; don't need to gel windows or change practical lighting so much.

Bill Baggelaar, SVP Technologies at ColorWorks, says: shoot in 4K for the future. For the sustainability of the post houses, transitioning to 4K is the way to go. ColorWorks was designed from the get-go to support 4K for cinema, TV, and remastering. Today's priority deliverable is HD, and they can't jeopardize it by focusing on 4K, so they focused on tech that allowed 4K work with 4K and HD each being "just another render". Centralized storage at SPT, the Production Backbone, allows conform and assembly of 4K assets in Baselight, seamless combination of 2K and 4K in Baselight, and transparent upres/downres in postproduction, allowing an HD workflow with a 4K render at the end of the day. Working resolution-independent: you don't HAVE to have a backbone, but you do need to be able to handle 4K data. ColorWorks partnered with FilmLight (Baselight's maker), but other tools work too. It's early days yet; things will get fleshed out. VFX for TV: some things are better at 2K, some at 4K, depending on budget. Capturing at 4K gives better downsampling/upsampling for HD VFX (4K's oversampling makes a better HD picture).
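
As an aside, here's a toy illustration of why 4K capture oversamples an HD deliverable. This is not ColorWorks' pipeline, just a crude 2x box-filter downscale in which every HD pixel is the average of four captured pixels; real finishing tools use better filters, but the oversampling benefit is the same:

```python
import numpy as np

def box_downscale_2x(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one: a crude UHD (3840x2160) ->
    HD (1920x1080) downscale. Four captured samples contribute to every
    delivered pixel, which tends to reduce visible noise and aliasing."""
    h, w = frame.shape[:2]
    trimmed = frame[:h - h % 2, :w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

# e.g. a synthetic UHD frame (2160 x 3840, RGB) becomes 1080 x 1920
uhd = np.random.rand(2160, 3840, 3).astype(np.float32)
hd = box_downscale_2x(uhd)
print(hd.shape)  # (1080, 1920, 3)
```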

Camera packages should NOT cost more. Cost of media can be moderated by investment (cheaper media such as SxS cards; SPT bought 150 SxS cards and reuses 'em; SPT buying F55 media the same way). Moving large files can be minimized by workflow design. New options reduce costs of grading and finishing 4K (e.g., DaVinci Resolve).

On set: F65 smaller, lighter than F35 or Genesis. Bulkier than Alexa but shorter. Epic and F55 are smaller and lighter than other large-sensor digital or 35mm film cameras. 4K is easier to shoot than 3D! SPT looking at combinations of F55s and F65s.

Media: for local shows, camera media is sent straight to the lab, no on-set backup. A typical F65 show shoots four 1 TB cards per day (not necessarily filling 'em up; a card holds about 100 minutes): two cards in each of two cameras, morning and night. No on-set wrangling saves time and money, and there's little chance of loss in transit. Consider buying cards, not renting.

Out of town: experimenting with two optional workflows for local backups (local backup was NEVER done with film or videotape, by the way, and we rarely suffered for it). 1: parallel backup with simultaneous HD recording. 2: parallel HD recording (in DNxHD 115, as the "proxy") with 4K local backup: near-set backup to a 4K RAID, where it's kept until the shuttle-drive media are received at the lab; no need to overnight, ship once or twice a week instead, much cheaper!

Dailies: processing 4K dailies is problematic! Opted early on to make a proxy copy for HD editorial dailies. It's a challenge for on-set systems; best done at the lab.

Editing happens with DNxHD 36 on Avid, VFX pulls uprezzed as needed.

Why finish in 4K now? Why not edit in HD and hold the 4K for remastering later if needed? After all, no one is watching 4K TV today...

...but then you do the work twice, and the second time is more difficult (they're learning this on the "Breaking Bad" remastering project). There are always last-minute fixups and changes, not well documented (or not documented at all); there can be several hundred "undocumented fixes" in 100 episodes, and rediscovering the "recipes" for making those fixes and redoing them is very expensive and troublesome. QC takes longer than real time, adding 1-2 hours per episode for these fixes (just to find 'em, not to fix 'em!).
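
(Back-of-envelope: an extra 1-2 QC hours across a 100-episode library is 100-200 hours spent just locating old fixes, before anyone starts redoing them.)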

Thus the preference for today's productions becomes: Finish in 4K - IF YOU CAN.

Conforming: Avid Media Composer to read and organize EDLs and bins; Smoke to conform the Avid cuts and insert FX and fixes; Baselight for the final conform (using the original raw files) and grading. Traditional stages of finishing, but file-based.
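
To make the file-based conform concrete, here's a toy parse of a CMX3600-style EDL event line. The field layout (event number, reel, track, transition, source in/out, record in/out) is standard, but this little parser is mine for illustration and is nothing like what Smoke or Baselight actually run:

```python
import re

# One CMX3600-style EDL event; real conform tools also handle dissolves,
# comments, drop-frame timecode, and many format quirks.
EVENT = re.compile(
    r"^(\d+)\s+(\S+)\s+(\S+)\s+(\S+)\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def parse_event(line: str):
    m = EVENT.match(line.strip())
    if not m:
        return None
    num, reel, track, transition, src_in, src_out, rec_in, rec_out = m.groups()
    return {"event": int(num), "reel": reel, "track": track,
            "transition": transition, "src": (src_in, src_out),
            "rec": (rec_in, rec_out)}

print(parse_event("001  A001C003  V  C  01:00:10:00 01:00:15:00 00:59:58:00 01:00:03:00"))
```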

Archive: two LTO sets are made at the dailies stage, of the 4K masters plus the HD proxies and editorial elements. Once the 4K is pushed to the backbone, LTOs are made of all 4K footage and mastering elements.

At this point, 4K doesn't take any more time than HD does. The only difference is the final assembly on Baselight and opticals on Smoke.

Q&A: What happens when 8K comes along? "Breaking Bad" may not be "for the ages," but it made sense for a 4K remaster, especially as we're set up for 4K digital work. What about the color issues? How do you convey looks? One way: the DP has a DIT creating LUTs, and the LUTs go to the lab; LUTs / CDLs are embedded in ALEs. Another way: the DP created 5 looks beforehand, chose a look on set and lit to it, then those CDLs are sent downstream. Third way: design looks with the colorist, share with the DP. In all cases CDLs are carried in ALEs and used throughout editing. Do you ever look at the 4K in dailies, or is all work done in HD? Proxies are made from the 4K masters; if there's a problem in the 4K it'll show up in HD. No one edits in 4K today. True, you may not see some fine details in DNxHD 36, but it's not a huge problem. Eventually we'll edit in 4K and make HD a final deliverable, but today it's an HD-oriented workflow.
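
Since CDLs come up repeatedly here: an ASC CDL is just slope/offset/power per channel plus a saturation value, so applying one takes only a few lines. A minimal sketch (assuming Rec. 709 luma weights for the saturation step; the example numbers are made up):

```python
import numpy as np

def apply_asc_cdl(rgb, slope, offset, power, sat=1.0):
    """Apply ASC CDL slope/offset/power per channel, then saturation.
    `rgb` is a float array of shape (..., 3), values nominally in 0-1."""
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])     # Rec. 709 weights
    return luma[..., None] + sat * (out - luma[..., None])

# e.g. a mild warm look chosen on set, carried downstream as a CDL
graded = apply_asc_cdl(np.random.rand(4, 3),
                       slope=np.array([1.05, 1.0, 0.95]),
                       offset=np.array([0.0, 0.0, 0.01]),
                       power=np.array([1.0, 1.0, 1.0]),
                       sat=0.9)
print(graded.shape)  # (4, 3)
```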


Immersive Sound / "Elevating the Art of Cinema"

Stuart Bowling, Dolby

What's needed: better audio experience (richer palette, more artistic control); easy and intuitive (new authoring tools, fit current workflows), simplified distribution (single deliverable format for all different room types), scalable playback (for very different theaters). Dolby Atmos: arrays of speakers along walls and overhead; full-range surround speakers. Rows of speakers along walls, dual rows running down ceiling. "Channels" for complex audio textures, "objects" for sharp dynamic sounds; channels + objects = Atmos.

Authoring tools sit within Pro Tools 10, HDX, and HD-MADI, sending 128 channels to the Dolby RMU renderer, which feeds the speakers. The renderer has a map of the theater's speakers and maps the sound precisely to match that room's configuration. Working on console integration: AMS-Neve DFC in beta, System 5 EUCON, JL Cooper.
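
To make the channels-plus-objects idea concrete, here's a purely conceptual sketch, emphatically not Dolby's RMU algorithm: bed channels map straight onto their speaker arrays, while an object carries a position and the renderer computes per-speaker gains from the room's speaker map. The map and falloff law below are invented for illustration:

```python
import numpy as np

# Hypothetical speaker map: normalized room coordinates (x, y, z).
SPEAKER_MAP = {
    "L":     (-1.0,  1.0, 0.0),
    "R":     ( 1.0,  1.0, 0.0),
    "C":     ( 0.0,  1.0, 0.0),
    "Lss":   (-1.0,  0.0, 0.0),   # left side surround
    "Rss":   ( 1.0,  0.0, 0.0),   # right side surround
    "Top_L": (-0.5,  0.0, 1.0),   # overhead arrays
    "Top_R": ( 0.5,  0.0, 1.0),
}

def object_gains(position, falloff=2.0):
    """Distance-based gains for one audio object, constant-power normalized."""
    pos = np.asarray(position, dtype=float)
    names = list(SPEAKER_MAP)
    dists = np.array([np.linalg.norm(pos - np.asarray(SPEAKER_MAP[n])) for n in names])
    gains = 1.0 / (1.0 + dists) ** falloff
    gains /= np.sqrt(np.sum(gains ** 2))
    return dict(zip(names, gains.round(3)))

# A sound object panned up and toward the left rear of the room:
print(object_gains((-0.7, -0.2, 0.8)))
```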

 


Panel: Object Based Sound Design and Mixing: How It Went and Where It’s Going

Moderator: Stuart Bowling, Dolby
Craig Henighan, "Chasing Mavericks" (Sound Mixer)
Gilbert Lake, Park Road Post, "The Hobbit" (Sound Mixer)
Doug Hemphill, "Life of Pi"
Eugene Gearty, "Life of Pi"
Ron Bartlett, sound designer & mixer, "Life of Pi"

Craig, one of the first to mix in Atmos, on "Chasing Mavericks": an eye-opening experience. Did a 7.1 mix and was happy with it, but the Zanuck theater was being refitted with Atmos, so they remixed for Atmos. Put the 7.1 mix in the bed and wound up cutting new FX; the overheads really gave a better sense of being on the beach or in the waves, much more immersive. The real trick is subtlety: nice atmospheres, being able to put a seagull off to the left and have an echo off the back wall; wind and water sounds, a wave splash; 5.1 and 7.1 quite frankly can't do this.

Doug, "Life of Pi" mixed in Atmos: also happy with our 7.1 mix. "Embrace the audience" with this speaker system. "Why no speakers on the floor?" "Americans spill their Cokes on the floor!" Ideas led to ideas... requires someone facile with the tech.

Eugene: The studios wanted more music; Ang Lee wanted to let the actual sound tell the story. Simplicity. Impressionistic. It's not about hearing every detail, it's about making an impression. It'd be better to start sound design with Atmos rather than reworking from a 7.1 basis. 9.1 channels, but within each channel an object can be placed very precisely. We thought it was just going to be about sound effects, but there was a lot of work moving the music around the theater, putting instruments in different places.

Gilbert, used Atmos on "The Hobbit": wanted to match what 48fps did for the picture with sound. Started with simple panning, developed into a "creative wonderland" with so much possibility. New tech, wanted it to work with existing workflow; Dolby responded with EUCON integration with System5. Two rooms at Park Road, a 7.1 room and an Atmos room; simultaneous mixes as the schedule was so tight. EUCON automation meant 7.1 automation could be called up in Atmos room and reworked.

Ron: "Life of Pi" had a really great score. Went through the whole film, experimented; the composer liked the Atmos mix but wanted it pushed farther (putting choir in back, for example), everything we could do to envelop the audience ("give a hug" with the sound). For "Die Hard", putting the sound IN the theater, not just along the walls and front. Made dialog clearer despite a very dense sound palette.

As the sound field increases in intensity, placement and localization suffer; Atmos helps preserve them.

Imagine the opening shot in the original "Star Wars" with that ship coming over; now imagine being able to do that same move with the sound. Great "bang for the buck" with Atmos; very cost-effective way to add production value.

With "Hobbit" there are giant FX sequences; you know you can do things with that. But it's the Gollum riddle sequence, story-driven, bouncing echoes off the roof of the cave, splashes and water drops, added more sound here than anywhere else. Nice when it's low-level. Also Bilbo and the dwarves in the cave, sleeping, putting each snoring dwarf into his own single speaker, for pure entertainment.

Exhibition is under a lot of pressure from competing media, Atmos helps bring the "special event" experience back to the theater in a way you can't replicate at home [...yet! -AJW].

Q&A: Do you want to build a dramatic soundscape, or represent reality? Both, really. Can you bring in 3D object data from an animation system and use that as a basis for sound object placement? We start with the story; we let the movie tell us where to put the sound. Very much performance-based, audience-based, following our instincts.

 

Next page: what does the next generation want; premium large format theater mastering; 24p, and The Hobbit
