Since its announcement last year, immersive-minded filmmakers have been waiting for the arrival of the Blackmagic URSA Cine Immersive. While it’s available for preorder (now US$32,995/UK£25,049/AU$49,005), it’s not quite ready for sale just yet (UPDATE: it’s expected to arrive in mid-summer). However, the camera and its workflow have been shown off at NAB, and some really interesting details have come to light.
Many thanks to those who’ve shared their news from NAB, including our esteemed editor Scott Simmons, independent immersive filmmaker Hugh Huo, Reuben Evans from CineD, and the anonymous person on Reddit who shared their recording of the entire Blackmagic presentation. Please follow the links to check out all their coverage for more details, but I’ll summarize here.

Camera details from Blackmagic
Interestingly, the URSA Cine Immersive doesn’t use the same sensor as the monster URSA Cine 17K 65. Instead, it uses two of the sensors found in the URSA Cine 12K LF, which lets it hit a 90fps frame rate at a huge 8160×7200 resolution per eye.
Despite the huge numbers, the camera is easier to use than most cinema cameras, because a lot of the settings are locked down: resolution, frame rate, focus, aperture and zoom are all fixed and cannot be changed. However, there is an internal ND filter (2/4/6/8 stops) to control exposure, should the 16-stop dynamic range not be enough. The image is reportedly in focus from about 2m, and slightly soft at 1m. (UPDATE: the aperture is approximately f/5 to f/5.6.)
The included 8TB media module can record approximately 1 hour and 38 minutes of footage at the recommended 12:1 BRAW compression setting, though a 16TB media module will also be available.
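Those figures line up with a quick back-of-the-envelope calculation. Here’s a minimal sketch, assuming 12-bit raw samples and treating 12:1 as a flat size reduction (an approximation, not BRAW’s actual internals):

```python
# Back-of-the-envelope record time: assumes 12-bit raw samples and a flat
# 12:1 size reduction -- an approximation, not BRAW's actual internals.
width, height, eyes, fps = 8160, 7200, 2, 90
bits_per_pixel = 12                      # assumed sensor bit depth
compression = 12                         # the recommended 12:1 setting

raw_bytes_per_sec = width * height * eyes * fps * bits_per_pixel / 8
data_rate = raw_bytes_per_sec / compression          # bytes/s to media

module_bytes = 8e12                      # 8 TB module, decimal terabytes
minutes = module_bytes / data_rate / 60
print(f"~{data_rate / 1e9:.2f} GB/s, ~{minutes:.0f} min")
# ~1.32 GB/s, ~101 min
```

That lands within a few minutes of the quoted 1 hour and 38 minutes, which suggests these assumptions aren’t far off.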
A stereo mic is built in and there are two XLR inputs, but if you want to record ambisonic audio, you’ll need to use something else. (While technically you could pop an iPhone on top and hit record, if you’ve got the budget for this camera, you can probably find budget for a pro-level ambisonic mic.)
On the side, the viewscreen shows one of the two eyes at a time, though it’s easy to switch between left and right to check for mismatches like glare. Though the view is a camera-native spherical fisheye, it’s possible to overlay an outline giving a rough indication of the field of view a viewer will see in an Apple Vision Pro, and also to punch in to check detail.
This will certainly help with framing, but anyone coming from 2D filmmaking has a big mental shift to make. This is not a camera intended for handheld work; it’s designed to be stable and level at all times. Though Apple have shown that immersive cameras can move, through their Submerged and Metallica immersive productions, those movements still have to be smooth, and probably (if you want to follow Immersive Company’s guidelines) along a single axis at a time.
The dual lens is built in and factory calibrated, with a 64mm lens separation to match the average human interpupillary distance (IPD). Those calibration details are recorded in the files as metadata, and that’ll be important later.
Workflow details too
If you connect to the camera’s 10G Ethernet port, you can start by editing directly from its media module, if you wish. Native 8160×7200 at 90fps, in stereo, means you’re pushing around 24x the data of a single regular 4K stream, so you’ll need a beefy Mac to work with the footage. Blackmagic’s baseline recommendation of a Mac Studio with M3 Ultra wasn’t a hard minimum spec, but MV-HEVC encoding speed on that configuration sat around 20-30fps. Since the original footage is at 90fps, that means roughly 3-4.5x program runtime to export, so a slower Mac could be too slow for comfort. I still look forward to throwing some test clips at my M3 Max to see what happens.
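The export math is simple enough to sketch, assuming the encode speeds reported from the demo hold up:

```python
# Export time as a multiple of program runtime, given MV-HEVC encode speeds
# of 20-30fps (as reported from the NAB demo) and a 90fps source.
source_fps = 90
for encode_fps in (20, 25, 30):
    print(f"{encode_fps}fps encode -> {source_fps / encode_fps:.1f}x runtime")
# 20fps -> 4.5x, 25fps -> 3.6x, 30fps -> 3.0x
```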
(Note: at NAB, an unreleased private beta version of Resolve 20 was used to demo immersive workflows, but these new features aren’t yet in the public beta. When that arrives, you’ll need to choose the Apple Pro Immersive Workflow when setting up your project.)
One fascinating detail is that the camera-native clips remain in the same format all the way through the pipeline until they hit the Vision Pro. There’s no reprojection from the original spherical fisheye into an intermediate map format during editing, and the metadata from the camera describing the lens characteristics survives all the way through. Instead, the Apple Vision Pro itself is responsible for using that metadata to reproject the original camera file into your eyes.
Historically, most immersive editing workflows would translate camera-native footage into something like “equirectangular” (used for 360° footage) or “half equirectangular” (used for 180° footage), but if you avoid ever making that conversion, you keep quality as high as possible and also (presumably) save quite a bit of processing effort.
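To make that saving concrete, here’s a minimal sketch of the kind of per-pixel resampling a traditional conversion performs on every frame. It assumes an idealized equidistant fisheye (image radius proportional to the angle off the optical axis), which is my simplification; a real pipeline would use the camera’s calibration metadata instead:

```python
# A sketch of fisheye -> half-equirectangular resampling: the step this
# pipeline skips. Assumes an idealized equidistant fisheye; real lenses
# need the calibration metadata the camera records.
import math

def equirect_to_fisheye(u, v, cx, cy, radius):
    """Map a half-equirect pixel (u, v in 0..1) to source fisheye coords."""
    lon = (u - 0.5) * math.pi          # -90deg..+90deg, left to right
    lat = (0.5 - v) * math.pi          # -90deg..+90deg, top to bottom
    # Direction vector, with +z along the optical axis
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    theta = math.acos(max(-1.0, min(1.0, z)))   # angle off the optical axis
    phi = math.atan2(y, x)                      # azimuth around the axis
    r = radius * theta / (math.pi / 2)          # equidistant model, 180deg lens
    return cx + r * math.cos(phi), cy - r * math.sin(phi)

# Centre of an 8160x7200 fisheye image: the optical axis maps to centre
print(equirect_to_fisheye(0.5, 0.5, cx=4080, cy=3600, radius=3600))
# (4080.0, 3600.0)
```

Skipping that resampling for every pixel of every frame, for both eyes, is a meaningful saving, and it avoids a generation of softening too.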
This does have implications for transitions. While cross dissolves and dips to color are supported, they’re effectively applied at viewing time. Because more than one camera, each with unique lens characteristics, could be in use in the same timeline, a transition might need to blend between two different image projections, across the left and right eyes of each stream.
Another valuable tidbit of information is that though Blackmagic masters in P3 D65 HDR with the ST 2084 (PQ) gamma at 1000 nits, the maximum brightness that the Apple Vision Pro will display is just 108 nits. Whether this is due to a particular cinema standard (Dolby Cinema 3D is set to 108 nits, which matches up) or due to hardware limitations, it’s good to be aware of.
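To see where those levels land on the signal curve, here’s a quick sketch using the standard SMPTE ST 2084 constants (how the Vision Pro actually tone-maps beyond that is not something that was shown):

```python
# Where 108 and 1000 nits land on the ST 2084 (PQ) curve,
# using the standard SMPTE constants.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (108, 1000, 10000):
    print(f"{nits:>5} nits -> PQ code value {pq_encode(nits):.3f}")
# 108 nits sits around 0.52 on the 0..1 PQ scale; 1000 nits around 0.75
```

In other words, a 1000-nit grade uses a good chunk of code range that the headset simply won’t reproduce, which is worth knowing before you push your highlights.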
Distribution news from Apple
While the ultimate distribution pipeline remains unclear, Apple has stepped up with a new free tool, available on the App Store now for Mac and Vision Pro: the Apple Immersive Video Utility. With this app, it’s possible to create streamable media from local files and to maintain a library of these prepared streams on your Mac, organized into playlists. It’s also possible to link to one or more Apple Vision Pro devices running the same app, and then to stream a video from your Mac to all those devices at once, even controlling playback from the Mac if you wish.

This is fantastic news for anyone preparing immersive content in post production, and also ideal for anyone running installations or live events that involve immersive content. The recent updates to visionOS 2.4 make it easier to explore 3D spatial content, and we now have an option for sharing our own immersive content too.

You may be wondering how to make your own immersive content, given that the 16K cameras from Immersive Company and this one from Blackmagic aren’t yet available for purchase. Well, there are a few ways to go, including a 180° VR mod for the Kandao QooCam 3 Ultra, and the Canon R5 with its dual fisheye 180° lens. While neither option approaches the 16K 90fps of the big guns, you can still create 3D footage that works with the Apple Immersive Video Utility and then loads onto the Apple Vision Pro.

According to the official user guide, creating an AIVU file means importing an MV-HEVC file containing the image data, plus an Apple Immersive Media Embedded (AIME) metadata file that’s generated by the camera on set during the shoot. (Presumably, this is the file which Blackmagic’s camera creates and which travels through the pipeline, but I’m waiting for confirmation.)
Good news, though: if you don’t have an AIME file because you’ve shot footage with another camera, there’s a button marked “HEQU” which makes it work. Although this isn’t clarified in the user guide, it would match up with “half equirectangular”. (This term is documented by Mike Swanson in his detailed exploration of the Spatial format.)

Pushing the boundaries without a fancy camera
For a variety of tests, I’ve been using my own Final Cut Pro plug-in, Spatial Kit, for alignment and correction of footage from stereo lenses on a single camera. (Full disclosure: yes, I do profit from sales of my plug-in!) I’ve also manually aligned separate left and right clips shot on separate cameras, without any plug-ins.
Bear in mind that you shouldn’t simply treat regular Spatial clips as Immersive — the focal length is all wrong, and the image will appear far too large if you fill the frame. Instead, to use Spatial content in an Immersive frame, shrink it down so it’s in the center of the frame. Some VR-focused cameras and lenses do actually produce footage with a narrower field of view than 180°, so it’s not unprecedented — in fact, Apple themselves have used this approach in their Immersive production Man vs Beast.
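How much should you shrink it? In a half-equirectangular frame, horizontal position maps roughly linearly to angle, so a clip’s width should match its capture field of view. A tiny sketch, using an illustrative 65° lens (my placeholder figure, not a measured value for any particular camera):

```python
# Rough scale for placing a narrower-FOV clip inside a 180deg frame,
# assuming the half-equirect mapping is roughly linear in angle.
frame_fov = 180.0           # degrees covered by the full Immersive frame
clip_fov = 65.0             # assumed horizontal FOV of the source lens
scale = clip_fov / frame_fov
print(f"Scale the clip to ~{scale:.0%} of frame width")   # ~36%
```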

As part of some more extreme tests, I’ve taken footage shot on a GH6 and a GH7 in a synced side-by-side configuration, with a relatively massive interpupillary distance (IPD) of about 150mm between two 14mm lenses. There are many reasons why this isn’t a good idea, but if you keep your subject at least 4m away, the large IPD shouldn’t induce a headache.
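That 4m figure roughly follows the stereographer’s classic “1/30 rule”: keep the lens separation below about 1/30 of the distance to your nearest subject. It’s a rule of thumb, not a law:

```python
# The "1/30 rule" of stereography: the nearest subject should be at least
# ~30x the lens separation away for comfortable viewing.
separation_mm = 150                          # my side-by-side GH6/GH7 rig
min_distance_m = separation_mm * 30 / 1000
print(f"Keep subjects beyond ~{min_distance_m:.1f} m")   # ~4.5 m
```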
Shooting on both cameras at 5760×4320 in open gate mode, I then lined the two clips up in a stereo timeline, shrunk them down to the center of the frame to minimize distortion and keep the apparent size roughly correct, and exported an 8K×4K side-by-side HEVC file direct from Final Cut Pro.
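For anyone working outside Final Cut Pro, the same side-by-side packing can be approximated with ffmpeg’s hstack filter. A hypothetical sketch (file names are placeholders, and you’d tune the encoder settings to your needs):

```python
# Pack separately rendered left/right eyes into one side-by-side HEVC file
# using ffmpeg's hstack filter. File names below are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "left_eye.mov", "-i", "right_eye.mov",
    "-filter_complex", "[0:v][1:v]hstack=inputs=2[v]",
    "-map", "[v]",
    "-c:v", "libx265", "-crf", "18",     # quality-based HEVC encode
    "-tag:v", "hvc1",                    # the tag Apple players expect
    "side_by_side.mov",
], check=True)
```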
The Apple Immersive Video Utility expects stereo input, but doesn’t mind if it’s side-by-side or Spatial, and using the HEQU option produces a streamable file quite quickly. This streams to the Apple Vision Pro in the matching app, and looks… actually pretty good?
It’s not technically correct in terms of its projection, but it’s not wildly distorted and the 3D effect is strong (thanks to the large IPD). From a workflow point of view, this app will drastically reduce the time it takes to move from output to on-device full-quality review by one or more people. Also, while this app is entirely focused on Immersive and not Spatial content, if you’re interested in bridging the two standards, or you’re producing Immersive content at any level, it gives you a quick pipeline.
But… how high can you go? Well, if you’re willing to wait for the encoding, you can use Compressor to export a 7200×7200-per-eye Spatial file.
Despite complaints that the frame width exceeded the maximum allowed, my 3.5 minute test clip took about 20 minutes to encode on my M3 Max MacBook Pro, creating a 3GB file that QuickTime Player was happy to play back. While Apple Immersive Video Utility took less than a minute to create its streamable version, sadly the Vision Pro refused to co-operate.
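For what it’s worth, that works out to a surprisingly modest bitrate (rounded figures from my test, decimal gigabytes):

```python
# Implied bitrate of the test export: ~3 GB over 3.5 minutes.
size_bytes, duration_s = 3e9, 3.5 * 60
print(f"~{size_bytes * 8 / duration_s / 1e6:.0f} Mb/s")   # ~114 Mb/s
```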

Tests with a 6000×6000-per-eye clip also sadly failed. Clearly there are some real limits to such massive files, and the devil remains in the details as to what’s going to eventually be possible.

While final distribution options are still limited, it’s possible to distribute through third-party apps like Prima Immersive and Theater today. Since Apple have a distribution pipeline of their own, I’d expect to see third-party Immersive content in the TV app (AKA the iTunes Store) eventually too. And very interestingly, the Apple Immersive Video Utility shows this error before it connects to the same app on the Mac:

So… what’s an Apple Immersive enabled application? Will developers easily be able to create an app on the Mac that ties into an Immersive view here? Fingers crossed.
Conclusion
Immersive content has gotten a huge boost in the last few days. Blackmagic’s camera has moved a lot closer to release, Apple’s helped workflows along, and many more video professionals at NAB have been able to view immersive 3D content for themselves in the Apple Vision Pro. As Blackmagic said on stage, it’s something you really have to experience for yourself; talking about it is like trying to describe the taste of an orange.

In the future, when more people have the ability to view 3D Spatial and Immersive content, the audience will absolutely want more; it’s really compelling. And while I’m very happy to explore Spatial content myself today, I can appreciate the benefits of the no-compromise 8K-per-eye, 90fps Immersive approach that Blackmagic has on show. It’s as close to reality as we’ve ever seen, and that’s really something new.
