The “COMINATCHA Tour,” which kicked off last November to coincide with the release of Japanese band WANIMA’s new album, was proceeding as planned until the national stay-at-home order forced the cancellation of many of its shows. While the uncertainty made large scale concerts impossible, WANIMA decided to keep in touch with fans and opted for a streamed concert at ZOZO Marine Stadium, where the tour’s final show had originally been scheduled.
Nothing can beat the experience of a live concert, both for the performers and the audience, but throughout this year we have learned to see things differently, and if one good thing has come out of it, it is that live streaming opens shows to potentially wider audiences. It also offers a unique experience, unlike anything a live show allows.
One good example is Nick Cave’s ‘Idiot Prayer’ livestream, which, the cinematographer behind the project promised, “captures the purest form of his music”. Interviewed by NME, the award-winning cinematographer Robbie Ryan (The Favourite, Marriage Story, American Honey) revealed what it was like to film the live stream and how it relates to live events.
Show things a live performance cannot
WANIMA’s live stream also had to go beyond the capture of a regular live event, so a Japanese production company, Wamhouse, was hired. Kazuaki Nakamura, the producer at Wamhouse who supervised the streamed concert, said: “WANIMA’s management wanted to let the band have their tour finale, as they had to cancel many of the shows originally planned. And if we were going to stream their show, we wanted to show something new which the audience could not see at an actual live concert.”
“We wanted to show a collaboration of technology and entertainment – he added – by taking advantage of not having an audience at the venue, which let us set up cameras wherever we wanted. And we wanted to program the switching of the show so we could show things a live switched performance could not.”
To realize his visual idea for the streamed concert, Nakamura brought on Yoshimichi Suemune of CREATIVE ORCA and Shuji Iyama of Fubright Communications. Suemune and his company are known for robotic motion development and content production, and Iyama is an experienced engineer with many advanced system developments to his name. The “no audience” streamed concert was captured and streamed live to more than 100,000 fans using a new live production system named “soundiv”.
The new “soundiv.” live production system
Wamhouse used the empty seats at ZOZO Marine Stadium, which can accommodate more than 30,000 people for live shows, placing 25,000 glow sticks to light the venue. A camera crane and a drone were brought in and used around a special center stage as well as the main stage. Then the new “soundiv.” live production system was built, which included 24 Blackmagic Micro Studio Camera 4Ks, an ATEM Constellation 8K switcher, a Smart Videohub 40 x 40 router, DeckLink 8K Pro capture and playback cards and a number of Video Assist 4K monitors/recorders.
The 24 Micro Studio Camera 4Ks surrounded the center stage, with camera outputs sent to the ATEM Constellation 8K, which was programmed beforehand to switch camera views much faster than normally possible, giving the effect of a single, continuously moving camera. The ATEM Constellation 8K was also used to switch feeds coming in from a 360-degree camera installed at center stage.
“Blackmagic’s open SDKs let us be creative, and that was one of the reasons we chose Blackmagic,” said Iyama, who took part in developing the soundiv. system. “We could build the soundiv. system to switch the ATEM switcher’s inputs at high speed, change the angle of the 360-degree camera’s output and control the timing of the switching for each song. The output from the 360-degree camera was also captured via the DeckLink 8K Pro, and the images were processed in Unreal Engine and then sent to the ATEM switcher.”
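Iyama’s description of programmed, per-song switching can be sketched in outline. The snippet below is a minimal illustration, not the actual soundiv. code: the real system drives the ATEM through Blackmagic’s Switcher SDK, so `StubSwitcher`, `build_cut_list` and `run_schedule` are hypothetical names used only to show the idea of pre-computing cuts faster than a human operator could perform them.

```python
import itertools

class StubSwitcher:
    """Hypothetical stand-in for an ATEM control interface; the real
    Blackmagic Switcher SDK is a C++ API and is not reproduced here."""
    def __init__(self):
        self.program_input = None
        self.log = []

    def set_program_input(self, source, at_time):
        # Record each programmed cut: (time in seconds, camera number).
        self.program_input = source
        self.log.append((round(at_time, 3), source))

def build_cut_list(bpm, cameras, cuts_per_beat=4, beats=8):
    """Pre-program cuts tied to a song's tempo: several camera changes
    per beat, cycling through the ring of cameras around the stage."""
    seconds_per_cut = 60.0 / bpm / cuts_per_beat
    ring = itertools.cycle(cameras)
    return [(i * seconds_per_cut, next(ring))
            for i in range(beats * cuts_per_beat)]

def run_schedule(switcher, cut_list):
    # In a real system each cut would fire at its timestamp; here we
    # just replay the schedule in order.
    for t, cam in cut_list:
        switcher.set_program_input(cam, at_time=t)

# At 120 BPM with 4 cuts per beat, a cut lands every 0.125 seconds --
# far faster than manual switching.
cuts = build_cut_list(bpm=120, cameras=list(range(1, 25)))
sw = StubSwitcher()
run_schedule(sw, cuts)
```

The point of the sketch is only the separation it shows: the cut list is computed offline per song, then replayed deterministically during the show.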
Suemune, who designed the system’s operation workflow and customized video production, said: “Even though we used SDKs to develop the system, Blackmagic products already had many features that allowed us to do what we needed without customization. For example, it was very time consuming and not realistic to set all 24 cameras one by one, but we could control camera settings such as color and shutter speed all at once on the switcher side using ATEM Camera Control.”
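Suemune’s point about adjusting all 24 cameras at once rather than one by one amounts to fanning a single settings change out to every camera from the switcher side. A minimal sketch, assuming a hypothetical `CameraControlBus` and illustrative setting names (the real ATEM Camera Control protocol travels in the SDI return feed and is not modeled here):

```python
from dataclasses import dataclass

@dataclass
class CameraState:
    # Illustrative per-camera settings of the kind ATEM Camera Control
    # can adjust remotely; the field names are assumptions, not the
    # protocol's actual parameter names.
    shutter: str = "1/50"
    white_balance_k: int = 5600
    gain_db: int = 0

class CameraControlBus:
    """Sketch of switcher-side control: one command fans out to every
    connected camera instead of touching 24 bodies individually."""
    def __init__(self, count):
        self.cameras = {i: CameraState() for i in range(1, count + 1)}

    def broadcast(self, **settings):
        # Apply the same settings change to every camera at once.
        for state in self.cameras.values():
            for key, value in settings.items():
                setattr(state, key, value)

bus = CameraControlBus(24)
bus.broadcast(shutter="1/100", white_balance_k=4500)
```

The design choice this mirrors is the one Suemune highlights: with a shared control path, matching 24 cameras is one operation, not 24.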
Flexible ATEM switcher
Sending ancillary data from the ATEM Constellation 8K to a Smart Videohub router, and then distributing it to each Micro Studio Camera 4K, allowed the technical crew to control the camera settings via ATEM Camera Control. For focus control, Suemune selected each camera from the switcher and adjusted focus using the focus assist feature on the Video Assist 4K.
“Being able to control camera settings from the switcher side helped us a lot, as it was such a hectic schedule; it was a must-have feature for this project,” he continued.
Suemune also appreciated the ATEM Constellation 8K’s many inputs, outputs and M/E layers. “It was hard to find a switcher which could show a 24 camera Multiview. It also allowed us to use M/E layers for more creative purposes beyond simple switching. We set clockwise switching, counterclockwise switching and a special pattern called split on different M/E layers. We then switched between these and the feeds from the 360-degree camera, mixing different programmed switching patterns to create new visual expressions.”
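The clockwise, counterclockwise and split patterns Suemune describes are, in essence, different cut orderings over the same ring of cameras. A rough sketch of how such orderings could be generated, with `ring_pattern` and `split_pattern` as hypothetical helpers (the article does not describe the actual implementation):

```python
def ring_pattern(cameras, direction="clockwise", start=1):
    """Order a ring of camera inputs for an orbiting cut sequence,
    running around the stage in either direction."""
    idx = cameras.index(start)
    ordered = cameras[idx:] + cameras[:idx]
    if direction == "clockwise":
        return ordered
    # Counterclockwise: same start camera, then the ring reversed.
    return ordered[:1] + ordered[1:][::-1]

def split_pattern(cameras):
    """Hypothetical 'split' pattern: cuts advance from opposite sides
    of the ring and meet in the middle."""
    out = []
    left, right = 0, len(cameras) - 1
    while left <= right:
        out.append(cameras[left])
        if left != right:
            out.append(cameras[right])
        left += 1
        right -= 1
    return out

cams = list(range(1, 25))
cw = ring_pattern(cams, "clockwise")
ccw = ring_pattern(cams, "counterclockwise")
split = split_pattern(cams)
```

Keeping each pattern on its own M/E layer, as described in the quote, then makes “mixing patterns” a matter of choosing which layer feeds the program output at any moment.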
“The programmed switching – he continued – was very stable, and working with another switcher for streaming went well. We also needed many switcher outputs: one for each M/E layer feed and the 360-degree camera feed, to send to the streaming switcher and to a recorder, as well as an output for camera setting and calibration. The ATEM switcher had enough outputs, which we could assign flexibly to meet our needs.”
Nakamura concluded: “Live streaming cannot be a true alternative to the live concert experience. However, we can still offer new visual entertainment that audiences cannot see at a regular live concert.”