
HPA Technology Retreat 2009

Three solid days of "Tech Treat".

By Adam Wilt | February 23, 2009


How is the Tech Retreat like (and unlike) a location shoot? Why does New York's Metropolitan Opera put ten minutes of solid white at the tail end of each live HD cinemacast?

The Hollywood Post Alliance 2009 Tech Retreat commenced at 2pm on 17 February 2009, the USA's analog shutoff day—whoops, its former analog shutoff day. It then ran for about 72 hours of conference proceedings, technology demos, roundtables, and (in a break with tradition) bowling instead of softball, with only minor time-outs for sleeping.

The Tech Retreat is very much like a location shoot in that one travels to a distant city (Palm Springs), has early call times (7:30am), and endures exciting but exhausting 12- to 14-hour days.

It is unlike most shoots in that the craft services are over the top (internal catering by the Westin resort where the Tech Retreat was held; with all due respect to excellent craft services folks everywhere, where else does the dessert menu feature raspberry chocolate tamales?), and those long days are spent mostly sitting and taking notes. The result: participants stagger home not only with full brains but with full bellies, too.

It's not possible for me to properly describe everything that went on, or to adequately convey the animated discussions between technologists, editors, broadcasters, and industry executives. Instead, I'll just cover a few highlights.


Some of the Trends at TR2009




  • 3D is alive and well: there was a panel on 3D production and distribution, and the demo room was full of 3D displays, using polarizing or shuttering glasses, lenticular screens viewed without glasses, and even white-light holograms containing up to 1280 frames of moving video (seen by walking past the hologram). JVC demoed an LCD TV that synthesizes a 3D image from a plain-vanilla 2D feed in real time. It wasn't perfect (snow on distant mountains came forward too much, for example), but it was impressive. Still, the compromises involved in 3D presentation (the bother of wearing glasses, resolution loss in lenticular displays, etc.) leave it an open question whether 3D in any of its present incarnations will survive as a mainstream phenomenon rather than a momentary novelty or fad.

  • The DTV conversion proceeds apace: Active Format Descriptors for automatic aspect-ratio conversion are now widely available in various bits of the processing chain, and most broadcasters are using them (see the sketch after this list). In 2008, more HD sets were sold than SD sets. But the analog cutoff was delayed (again), and all press pool systems are still NTSC composite 4x3 with mono audio, despite component video being in use since 1981, stereo audio since 1984, and 16x9 appearing in 1985.

  • Metadata matters: as we move ever farther into file-based production, more intelligence can be moved into the files and the systems that handle them. "The inflection point between manual and metadata-driven operations has been crossed," especially in operations like broadcast automation and traffic management, format conversion and repurposing, and archiving.

  • Post houses are about community, not equipment: the days of heavy iron for realtime processing may be fading, but the value of the post house is in being a nexus for creatives, clients, and technological infrastructure. Indeed, the post house of the future may be a massively parallel-processing server farm, capable of storing, archiving, and batch-processing the editing, grading, and finishing tasks dispatched from the MacBook Pros of the editors sitting with clients in the front rooms—which, if present societal trends hold, will come to resemble nothing more or less than really comfortable coffee bars. (OK, I extrapolated that last bit, but I dare any other Tech Retreat attendees to disagree with me on the fundamental vision!)

  • CPUs and GPUs are getting "fast enough" to do really interesting things: all manner of tasks requiring heavy lifting now run entirely on general-purpose platforms. Realtime 3D synthesis, near-real-time HD frame-rate conversion, and 2K and 4K color grading all take advantage of multicore CPUs and programmable GPUs, aided by fast storage pools. The expensive, drop-dead-gorgeous control surfaces are still there, but behind them stand off-the-shelf Linux, Windows, and OS X boxes with commodity AMD and nVidia GPUs inside.
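
About those Active Format Descriptors: the AFD is just a small code carried alongside the picture that tells downstream gear which part of the coded frame holds the active image. The sketch below is an illustration only, using a few of the commonly cited SMPTE 2016-1 code values (8 = active image fills the coded frame, 9 = 4:3 centered, 10 = 16:9 centered, 11 = 14:9 centered); it's not any particular vendor's logic, and the values should be checked against the standard before use.

    # Illustrative sketch: how a converter feeding a 16:9 display might act on an AFD.
    AFD_ACTIVE_AREA = {
        8:  "full_frame",      # active image is the whole coded frame
        9:  "4:3_centered",    # 4:3 image centered in the frame
        10: "16:9_centered",   # 16:9 image centered in the frame
        11: "14:9_centered",   # 14:9 image centered in the frame
    }

    def treatment_for_16x9_display(coded_frame: str, afd: int) -> str:
        """Pick how a 16:9 display should present the active picture."""
        active = AFD_ACTIVE_AREA.get(afd, "full_frame")   # unknown AFD: assume full frame
        if coded_frame == "4:3" and active == "16:9_centered":
            # 16:9 picture letterboxed inside a 4:3 frame: blow it up to fill the screen
            return "expand letterboxed area"
        if coded_frame == "4:3":
            return "pillarbox"                            # preserve the 4:3 image
        if coded_frame == "16:9" and active == "4:3_centered":
            return "show as-is (pillarbox bars already baked into the frame)"
        return "show full frame"

    print(treatment_for_16x9_display("4:3", 10))   # -> "expand letterboxed area"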



Selected Scenes & Random Revelations





HPA organizer Mark Schubin holds forth in his analog shutoff T-shirt.


Television engineer Mark Schubin organizes the Tech Retreat. If you read his articles in Videography, you'll get a feel for the breadth and depth of his knowledge and interests, which he applies in his day job at the Metropolitan Opera in NYC. It's his encyclopedic eclecticism that drives the Tech Retreat program, making it the highest signal-to-noise event around for video technologists.



Thomson's 23-stop image, lit only by the 500-watt light in the frame (inset: detail from that same frame).


What if your camera could dynamically control each pixel of its sensor, shutting off integration the moment it fills? Thomson put firmware in an Infinity camcorder and got 23 stops of dynamic range out of it for their efforts. The sole illumination in the sample frame is the 500-watt lamp aimed at the camera: as the inset shows, the filament in the lamp is resolvable, yet there's still separation between the darkest two bars in the DSC Labs CDM chart.
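
Thomson didn't spell out the reconstruction math, but the underlying per-pixel exposure idea can be sketched in a few lines (a conceptual illustration of the general technique, not Thomson's firmware): a pixel that stops integrating early must be seeing a correspondingly brighter scene, so dividing its charge by the time it actually integrated recovers a linear value far above the well capacity.

    import numpy as np

    # Conceptual sketch of per-pixel exposure control: each pixel integrates until
    # it is full, then stops, and the stop time is recorded. Bright pixels fill
    # early, dark pixels never fill, and radiance is recovered as charge / time.
    FULL_WELL = 1.0          # normalized well capacity
    FRAME_TIME = 1.0 / 24    # assumed integration window, seconds

    def reconstruct_radiance(charge, stop_time):
        """charge: final pixel charge (0..FULL_WELL); stop_time: seconds integrated."""
        charge = np.asarray(charge, dtype=np.float64)
        stop_time = np.asarray(stop_time, dtype=np.float64)
        return charge / np.maximum(stop_time, 1e-9)   # charge per second ~ scene radiance

    # A pixel that fills its well in 1/10,000 of the frame reads ~10,000x brighter
    # than one that barely fills it over the whole frame:
    bright = reconstruct_radiance(FULL_WELL, FRAME_TIME / 10_000)
    normal = reconstruct_radiance(FULL_WELL, FRAME_TIME)
    print(np.log2(bright / normal))   # ~13.3 stops of headroom beyond the native range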



TDVision's 3D compression method: L or R image, plus difference data to synthesize the other view.


This may be the cleverest, most why-didn't-I-think-of-that? thing at the Retreat: TDVision's method for encoding a 2D-compatible 3D image. TDVision puts a 2D image into an MPEG stream in the usual way, and stores a channel-difference signal as a separate, interleaved stream (think mid-side recording in audio). Any normal 2D MPEG decoder sees only the normal 2D stream, and displays it. Any TDVcodec-enabled MPEG decoder (many existing consumer electronics decoders can be TDVcodec-enabled via a firmware update) will be able to reconstruct both views of the 3D image and output it in the form required for the display being used: line-interleaved, checkerboarded, anaglyphic, etc. It's entirely display-agnostic at the distribution end (Blu-ray, OTA, etc.), even to the point of not caring whether the viewer has 3D capability at all. Brilliant.
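
Here's a minimal sketch of the base-plus-difference idea (my own illustration of the mid-side analogy, not TDVision's actual bitstream or API): the left view travels as an ordinary 2D picture that any decoder can show, and a 3D-aware decoder adds the difference data back to rebuild the right view before formatting it for whatever display is attached.

    import numpy as np

    def encode(left, right):
        base = left                                               # plain 2D stream: legacy decoders stop here
        diff = right.astype(np.int16) - left.astype(np.int16)     # auxiliary difference stream
        return base, diff

    def decode_3d(base, diff, layout="line_interleaved"):
        left = base
        right = np.clip(base.astype(np.int16) + diff, 0, 255).astype(np.uint8)
        if layout == "line_interleaved":     # e.g. for passive polarized displays
            out = left.copy()
            out[1::2] = right[1::2]          # alternate rows from each eye
            return out
        if layout == "side_by_side":
            return np.hstack([left, right])
        return left                          # no 3D display: fall back to the 2D view

    # Tiny demo with random 8-bit "frames"
    L = np.random.randint(0, 256, (4, 8), dtype=np.uint8)
    R = np.random.randint(0, 256, (4, 8), dtype=np.uint8)
    base, diff = encode(L, R)
    print(decode_3d(base, diff, "side_by_side").shape)   # (4, 16)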



JVC's GY-HM700 1/3" 3-chip QuickTime camcorder with 20x Canon zoom and optional SxS recorder.




SDHC card slots on the JVC GY-HM700



The JVC GY-HM700 is the big brother to the GY-HM100 shown at the FCPUG SuperMeet at Macworld Expo 2009. It's a 3-chip camcorder recording long-GOP MPEG-2 at 25 or 35 Mbit/sec to SDHC cards in a QuickTime wrapper, so the clips are directly editable in Final Cut Pro without rewrapping or re-encoding. With the optional SxS adapter, it can record the same material to SxS cards, too. The 700 is supposed to ship next month with a brand-new Canon 20x zoom (the one shown here is the only one in the USA at the moment, or was; it was supposed to go back to Japan as soon as the Retreat was over), and I've asked JVC for one to review.


Next: Codec concatenation, a webserver in a Varicam, dynamic-range measurements, Cinnafilm, S.two, a digital test pattern, and why the Opera broadcasts white...

