Rapper Azealia Banks’ latest single “Wallace” is as inventive as her music and her entire persona. To showcase this innovation, the brand agency COLLINS created an immersive app that lets viewers become part of the video and control Azealia’s movements. Custom software designed by COLLINS tracks the viewer’s facial movements through their web camera, allowing Azealia’s audience to physically interact with the web-based video. About halfway through, the viewer appears in the video behind Azealia. Fans are embracing the experience in droves and posting hundreds of “selfies” of their co-appearances with Banks on social media.
Here, we talk with Director Nick Ace and Director of Experience Design Brett Renfer, both of COLLINS, about their roles in creating the unique app and music video experience, with help from Adobe Creative Cloud for teams.
Adobe: Tell us more about your backgrounds.
Renfer: I studied at the College for Creative Studies in Detroit and then went on to work at Rockwell Group, an architecture and design firm. As the Director of Experience Design at COLLINS, I have done everything from responsive architecture to interactive music videos. My job here is to create anything that one or more persons can experience in an immersive way. I love the convergence of physical and digital worlds and learning how people interact when the two come together.
Ace: I got my first copy of Adobe Photoshop at 16 and taught myself how to use it. At the time I was playing drums for several bands while creating flyers for their performances in Photoshop. Without my knowing, my sweet mother took all my flyers to Sage College of Albany. She landed me a scholarship. That’s how my career in art got started. After five years of working at record labels, and obtaining an MFA at SVA here in New York, I took a job as creative director for a hip-hop and street culture magazine in Los Angeles. Now I’m directing films, creating animated pieces, and designing other experiential events here at COLLINS. And, of course, thank you Mom.
Adobe: How did you two start collaborating?
Ace: Brett and I started working together about a year ago. We were having lunch and began to generate wild ideas. I’ve always made documentaries, music videos, and graphic design. And Brett and I saw that many of us here at COLLINS were already operating in this intersection of design, film, music, and digital, so we wanted to explore ideas that brought all of these disciplines together. My most frequent collaborator, Rob Soucy, and I had just begun working on films with Azealia when all of the pieces fell into place.
Adobe: How does the app work?
Renfer: The app switches between six different parallel video tracks as you move your head around in front of your webcam. Nick and I had already been doing some research into face tracking software when we started the project. Then Nick had a eureka moment in the shower when he saw his face moving in his shaving mirror.
We built our idea around this simple ritual, and designed a sort of “liquid” effect that lets us show several videos at once, melt them together, and have them feel psychedelic and surreal. It wasn’t easy.
We had to figure out how to mash together webcam tracking technology, multiple pieces of film, and our technical and design toolkits while centering everything on a narrative. We also had to combine close-up takes and kaleidoscopic background edits. Luckily, I was collaborating with my friend Lars Berg, who brought his WebGL muscle to the project. There was a lot of trial and error, a lot of bizarre glitches, and some ugly technical surprises. But then there was an amazing moment when everything came together; it was really organic and fluid.
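The track-switching idea above can be sketched as a simple mapping. This is an illustrative TypeScript sketch, not COLLINS’ actual code: it assumes the face tracker reports a normalized horizontal position and that the six parallel video tracks are laid out left to right, with a blend weight driving the “liquid” crossfade between neighboring tracks (the names `TrackSelection` and `selectTrack` are invented for this example).

```typescript
// Hypothetical sketch: map a tracked face position to one of six
// parallel video tracks, plus a blend weight for a liquid crossfade.

interface TrackSelection {
  track: number;    // index of the dominant video track (0-5)
  neighbor: number; // adjacent track to blend toward
  blend: number;    // 0..1, how far into the crossfade we are
}

// x is the face's horizontal position, normalized to 0..1 by the
// face tracker; trackCount parallel videos span that range.
function selectTrack(x: number, trackCount = 6): TrackSelection {
  const clamped = Math.min(Math.max(x, 0), 1);
  const pos = clamped * (trackCount - 1); // continuous position, 0..5
  const track = Math.floor(pos);
  const neighbor = Math.min(track + 1, trackCount - 1);
  return { track, neighbor, blend: pos - track };
}
```

In a renderer, `blend` would feed a shader uniform so the dominant track melts into its neighbor as the viewer’s head moves, rather than cutting abruptly.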
Adobe: What tools did you use to complete the project?
Ace: We have licenses of Adobe Creative Cloud for teams, so we have full access to all of Adobe’s creative tools. For Azealia, we first laid out the idea in a composite created in Photoshop CC and a storyboard created in Adobe InDesign CC. We used human elements, like Azealia’s lovely face, to build out the foreground and background. Then we storyboarded how the experience would be controlled by the viewer’s face. Azealia was all in once she saw everything, so we were shooting three days later. We had engineered a solid working app about six weeks after that.
Adobe: How did you handle the video and effects?
Renfer: Once we had the footage, we ran into a hard limit on the web: we could only play six videos from one site at one time. We had tons of cuts and had to smash them together, and they were shot on green screen to boot. So we had to max out the amount of video we could feasibly stream and process the green screen footage.
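Keying out the green screen in the browser boils down to a per-pixel decision that, in an app like this, would typically run in a WebGL fragment shader. Here is a minimal sketch of that decision as a pure TypeScript function; the function name and threshold values are invented placeholders, not values from the actual project.

```typescript
// Illustrative green-screen keying: compute an alpha value for one
// RGB pixel (channels 0..255) based on how strongly green dominates.
// 0 = fully keyed out (transparent), 1 = fully opaque.
function chromaKeyAlpha(r: number, g: number, b: number): number {
  // "Greenness": how much green exceeds the stronger of red and blue.
  const greenness = g - Math.max(r, b);
  const minKey = 20; // below this, keep the pixel opaque
  const maxKey = 90; // above this, key the pixel out entirely
  if (greenness <= minKey) return 1;
  if (greenness >= maxKey) return 0;
  // Linear falloff between the thresholds gives soft edges.
  return 1 - (greenness - minKey) / (maxKey - minKey);
}
```

The soft falloff between the two thresholds is what keeps hair and motion-blurred edges from looking cut out with scissors.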
We set up a huge number of comps in Adobe After Effects CC. We had some experiments that didn’t work, like encoding multiple videos into different RGB channels, but that still left too much data to process. In the end, we set up a render queue in After Effects with various tiers of quality and ran the footage through Adobe Media Encoder CC to compress everything for the web and pull the video content into the app. The video delivery was backed by the Google Cloud Platform, specifically Google App Engine and Google Cloud Storage.
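Serving several quality tiers usually means the app picks the best encode the viewer’s connection can handle. The interview doesn’t say how COLLINS did this, so the following is only a hedged sketch of the general pattern, with invented tier labels and bitrates.

```typescript
// Hypothetical tier selection: pick the highest-bitrate encode that
// fits a rough bandwidth estimate. All values here are illustrative.
interface Tier { label: string; kbps: number }

const tiers: Tier[] = [
  { label: "low", kbps: 800 },
  { label: "medium", kbps: 2000 },
  { label: "high", kbps: 5000 },
]; // sorted ascending by bitrate

// measuredKbps: throughput estimate, e.g. from timing a test download.
function pickTier(measuredKbps: number): Tier {
  // Highest tier whose bitrate fits the measured bandwidth,
  // falling back to the lowest tier on very slow connections.
  const fitting = tiers.filter(t => t.kbps <= measuredKbps);
  return fitting.length ? fitting[fitting.length - 1] : tiers[0];
}
```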
Adobe: What about the video elements and the overall interface of the app?
Ace: We were working with several editors using a mix of tools, but the key one was Adobe Premiere Pro CC, which we used to cut the final video. We switched to Premiere Pro CC from Final Cut Pro, and we love how it integrates with Adobe’s other tools.
Adobe: Were there any other parts of the process where Creative Cloud for teams came into play?
Ace: We did all the title work and designed the interface and trailer elements in Adobe Illustrator CC. We even retouched our case study stills in Photoshop. Adobe could even be cited in the credits for this project, for sure.
Adobe: What’s been your experience with Creative Cloud for teams?
Renfer: We moved to it from Creative Suite 6 last year and were thrilled. I remember when I used to have to install six discs. I admittedly have a nostalgia for physical media, but there are limits… Now, I click a button and all that great software appears on my desktop and constantly updates itself. And the data recovery, especially in InDesign and Photoshop, is fantastic.
Adobe: What other types of projects have you been working on together?
Ace: We’re working with a range of clients, from Spotify to Target to museums and independent artists. We always ask, “What’s the most meaningful story that sits at the center of this problem, and how can we best express that?” Our point of view is that 2D, static design is increasingly dull, so we aim to create enveloping experiences instead: things that react, respond, and bring people together. We feel like we broke new ground on this with “Wallace.” A big movie studio, and even Forbes magazine, called to ask, “How the hell did you do THAT?”
We didn’t tell them.
Azealia Banks – “Wallace” – Case Study from COLLINS on Vimeo.
Learn more about Adobe Creative Cloud