Post Production

Deep Fake This – Fashion

The whole “Deep Fake” thing is something I have been interested in for some time. Each year it gets better and better, and as AI, machine learning, and technology advance, being able to discern what is real and what isn’t is getting harder. Where this will be in 10 years is hard to say, but the implications are pretty obvious. What does a person do when they can no longer tell whether a video of someone is the real thing or not?

OK, enough of what could lead down a long and disheartening rabbit hole of despair about the future and how technology wreaks havoc on humanity.

To showcase the 2022 Spring Balenciaga fashion collection, creative director Marcus Dryden and the talented crew at MPC combined AI/machine learning, a real-time game engine, and hands-on VFX work to create a deepfake of American artist Eliza Douglas wrapped in every look from Balenciaga’s Spring ’22 collection. There is some live-action footage blended with CG and some solid post work. They don’t say which game engine they used, but my bet is something like Unreal or Unity. Below the video is a statement from Dryden breaking down the production.

“In pre-production, we were able to plan the whole show. We used a games engine to previsualize which looks could be body doubles vs. which ones needed to be the real Eliza. Also, the pre-viz defined the scale of the set for the art department and allowed production to choose the best lenses, angles, and positions needed to run the multiple cameras in sync whilst on location.

During the shoot, MPC on-set supervisors Carsten Keller and Damien Canameras captured photogrammetry of Eliza’s face and oversaw a variety of in-situ plates to extract her face and transpose it onto the body doubles shot on the catwalk.

We also used a CG scan of Eliza’s head and an on-set photo reference to build a proxy Eliza head to help visualize the face replacements. This allowed our compositing team to study and analyze each shot, each face to define the best process to achieve the highest-quality clone.

The team then applied the best technique for each face replacement: planar tracking, roto animation, KeenTools (a 3D tracking and modeling toolset inside Nuke), and machine learning (AI/deepfake).

Once we began attaching Eliza’s faces, we matched light, textures, and motion artifacts using compositing. Using the references and the scan of the head, we made sure each clone’s face was as pixel-accurate to Eliza’s face as possible while still retaining the nuance of the specific Balenciaga design aesthetic.

The final film shows all the clones with Eliza’s photogrammetry-captured and CG-scanned face as they march down a minimalist runway to a sci-fi-inspired soundtrack composed by BFRND, which includes an AI voice narrating the lyrics of La Vie En Rose.”

Creatures of the Deep

When I first watched this video on Vimeo, I was drawn in by the fantastic cinematography and the atmosphere created in Alan Williams’ studio. The visuals hooked me, but as his story and discussion of process unfolded, I knew I was there for the full 8-minute duration. After watching it with the sound on, I muted the audio and watched it again, full screen, really looking at the way this was shot, edited, and composed. Ben Cox does a really nice job of framing his shots and using shallow depth of field to focus the viewer on specific elements within the frame. Lighting and color grading come together to enhance the story and create a mood that captures Alan Williams’ personality and the artwork he creates. This short has such a solid look, and such great story hooks, that it’s definitely going into the visual reference library for inspiration at a later date.

Pulse, 3 Weeks Into Winter and I’m Longing For Spring.

Here we are, 4 days into 2017 and officially 3 weeks into winter. Tomorrow we are supposed to get 4 inches of snow, and the high temps will maybe hit 20 degrees, which frankly has me longing for my favorite time of the year, mid-May through early July. All of this got me thinking about the powerful thunderstorms that roll through the Midwest, fueled by warm, moist air blowing up from the Gulf of Mexico and colliding with a cold front rolling in off the Northern Plains. That got me searching the internet for some video footage to warm my chilled bones and remind me there are just 84 more days until spring.

My discovery this afternoon was the video below by Mike Olbinski. Shot in 4K, graded to black and white, and timed out to just under 5 minutes, it’s absolutely breathtaking. The fact that he took this in a new direction with a black-and-white treatment in post just makes it. The soundtrack adds to the ominous power of the visuals and makes me long for the chance to sit on the sun porch and watch this happen live (not the tornado part, I like my house).

If you have the opportunity, watch this in 4K on a larger TV. The visuals will knock your socks off. For more info on how Olbinski made it, click through here.

The Kitchen.

Most people never realize just how much work goes into producing a TV commercial. For the most part, what we see, if we aren’t fast-forwarding over them, is the fifteen-second edit of the original sixty-second spot. They whiz by in a blip, sandwiched between other ads, blending into a seamless stream that no one pays attention to. But occasionally someone posts a video showing how things get done.

Have you ever wondered how they match 3D animation to live-action footage? Blend shots together? What the total production of a video looks like? The video below for Canal+ shows you. No, it doesn’t go into any lengthy, detailed VFX breakdown, but it does give you a pretty solid idea of what it took to produce the promotional spot titled “The Kitchen”.

The finished sixty-second spot.

How they made it.