How the music video for H.E.A.T was made in Virtual Production

I started my whole journey into Virtual Production by trying to make music videos in a cost-effective and fast way with amazing scenery. With this video I think I have really come full circle. Read along for some insights into the process of making a virtual production music video.
Richard Frantzen

What should we do?

As always, the creative process starts with lots of meetings with the band about exactly what we should do. This is always tricky, since a band consists of several people with different opinions. And it doesn’t make things easier that the possibilities in Virtual Production are endless.

We batted a few ideas around, and it was hard to nail down a recording date since both the band and the studio were very busy. On top of that, the choice of which song to record kept changing, which meant we changed ideas several times.

Keep in mind that we need time to prepare the 3D world in advance as well, so as the final shooting date got closer we decided on the broad concept of a H.E.A.T City: the band has taken over a whole city and made it their own, complete with fans and a Macy’s-Parade-style float.

Inspiration comes from the Beatles’ famous rooftop gig and the AC/DC video for “It’s a Long Way to the Top”.

Preparation

To fit this within budget it’s not reasonable to design everything from scratch, but since Unreal has an excellent marketplace with myriad assets, we were able to find a New York-esque city that could work as a foundation.

I of course knew this when I pitched the idea. You have to guide the client creatively in a direction where you know assets are already available.

From there it was a matter of set-dressing the city.

We found a great area for the band to perform on a roof. We added a wall of Marshall stacks, some pyro, some fire, and a few other assets.

The whole scene is filled with giant LED screens, which I took over and routed a pre-made video file to, in sync with the song, for added effect. A very easy way of creating uniqueness.

The other scene was the “Macy’s Parade”, where the band plays on the trailer of a semi with fans standing along the road. What we did here was modify an asset from the marketplace and make it move. Then we attached the camera tracking data to the same movement, and voilà: the band is standing on a moving semi.
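The core trick here is parenting the tracked camera under the vehicle's moving transform, so the physically static band appears to ride the truck. This is a conceptual sketch in Python with numpy, not the actual Aximmetry/Unreal setup; all names are illustrative:

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def camera_world_pose(tracked_pose, vehicle_pose):
    # Parent the tracked camera under the vehicle: the camera inherits
    # the truck's motion, so the band (at the vehicle origin) stays
    # framed exactly as the operator sees it in the studio.
    return vehicle_pose @ tracked_pose

tracked = translation(0.0, 1.7, -3.0)   # operator's camera, 3 m behind the band
vehicle = translation(12.0, 0.0, 0.0)   # the semi, 12 m down the road
world = camera_world_pose(tracked, vehicle)
print(world[:3, 3])  # camera ends up at (12, 1.7, -3) in world space
```

Because the composition happens per frame, the camera operator can keep moving freely while the whole rig travels down the virtual road.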

And for the scenes where the band are giants in the city, it was just a matter of scaling them up.

Shooting

Basically, when shooting a music video like this, you play the song several times and grab the performance with one or more cameras.

To make all takes match, we pre-programmed all events, like pyro and video cues, beforehand in Aximmetry’s Sequencer. As an added benefit, we also pre-programmed some physical lighting cues that lit up the band in the physical studio.
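The idea behind pre-programmed cues is simple: events are keyed to song time, so every take fires pyro, video, and lighting identically. A conceptual sketch (hypothetical cue names, not the Aximmetry API):

```python
# Cues keyed to seconds into the song; every playback of the track
# triggers the same events at the same moments, so takes stay in sync.
CUES = [
    (12.5, "pyro_roof_left"),
    (12.5, "pyro_roof_right"),
    (45.0, "led_wall_chorus_loop"),
    (45.0, "studio_light_strobe"),
]

def due_cues(prev_t, now_t, cues=CUES):
    """Return names of cues whose trigger time falls in (prev_t, now_t]."""
    return [name for t, name in cues if prev_t < t <= now_t]

# Each frame, the player checks which cues fire between the previous
# and current song position.
print(due_cues(12.0, 13.0))  # ['pyro_roof_left', 'pyro_roof_right']
```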

Then it was just a matter of grabbing as many takes as the band could possibly do.

This shoot was great because we had a handheld camera operator and a crane working simultaneously, which let us grab two takes every single time.

We did about five takes per scene: first five takes on the roof, then five on the semi.

After that we moved on to single takes for the footage of them as giants.

We ended the shoot by having the singer perform in various places in the world for added b-roll. This is where working in a virtual environment shines: as the VP tech, I could just move around the world, place the cameras in a cool area, and let the camera operators find cool shots. A very creative and fast way to work together.

Post-Production

Here’s the thing about virtual production: you skip the whole VFX stage and get right into editing.

All the takes were laid out into a multi-cam sequence, and the editor just cut it like a normal shoot, then sprinkled some b-roll on top and fine-tuned. Done.

After the edit, it was straight off to grading, which consisted of finding a look and adding some extra bloom to really accentuate the pyro and fire blasts.

Conclusion

I love this way of working because it keeps creativity free to roam on set: everyone can see what’s going on on the big screens, react, and come up with ideas. And since the camera operators see the final pixels live on their monitors, they can really work with the 3D environment.

What do you think?

Software used

Aximmetry for Virtual Production

Unreal Engine for 3D Rendering

Premiere Pro for Editing and Grading

Equipment

Blackmagic Ursa Mini Pro G1

Antilatency Tracking System
