Posts filed under Video Playback

Passengers

Several years ago, I created my own video playback software, which has been heavily used on many of the movies my colleagues and I have worked on. While that software is still quite capable, as someone who has always been interested in new technologies, I'm always looking for new ways to push the boundaries of what we can do. Although I've closely monitored the progression of the Unity game engine since version one, it wasn't until fairly recently that its feature set advanced to the point of being a viable platform for new playback software. Between the plugins available on the Unity Asset Store and custom ones I've built myself, I've been excited to bring the interactivity of on-set graphics to a whole new level.

[Photo: the cafeteria set]

When I was asked to do Sony's futuristic sci-fi movie, Passengers, it seemed like a great opportunity to put what I'd been doing in Unity through its paces. While great actors such as Chris Pratt can make nearly anything they do believable, even if all they're interacting with is a green screen, it's my opinion that the more realistic and interactive we can make the technology they're performing with on set, the easier it is for them to stay fully immersed in their character's world. That's why it was a big deal to me to overcome the challenges we were presented with on Passengers' cafeteria set. We had to seamlessly tie together the actions of a tablet floating on glass with the 4K TV directly behind it. Traditionally, a scene like this would have been triggered remotely, which requires choreographing the actor's hand movements and the order in which they press the buttons on screen. It wouldn't work well to have the actor pressing the left side of the screen while I'm triggering a button on the right.

[Photo: the food dispenser interface]

To eliminate the need for any graphics puppeteering or choreography, I decided to use OSC (Open Sound Control) to send network commands from the tablet to the computer playing the graphics on the TV screen. That way, when Chris interacted with the tablet on set, the graphics on the larger monitor reacted automatically, and instead of having to remember any choreography, he could focus on his character's predicament of having minimal access to the food dispenser.
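At heart, the wiring is simple: OSC messages are just small UDP packets. Here's a minimal sketch of that link in Python (the production code lived inside Unity; the `/dispenser/button` address, host, and port are illustrative assumptions, not the actual values we used):

```python
import socket
import struct

def osc_message(address: str, value: int) -> bytes:
    """Encode a minimal OSC 1.0 message with a single int32 argument."""
    def pad(b: bytes) -> bytes:
        b += b"\x00"                  # OSC strings are null-terminated...
        b += b"\x00" * (-len(b) % 4)  # ...and padded to 4-byte boundaries
        return b
    return pad(address.encode()) + pad(b",i") + struct.pack(">i", value)

def send_button_press(host: str, port: int, button_id: int) -> None:
    # OSC rides on plain UDP, so delivering a message is a single sendto().
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message("/dispenser/button", button_id), (host, port))
```

The tablet fires one of these per touch, and the playback machine driving the TV listens for the address and advances its graphic accordingly.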

In the hibernation bay, there were twelve pods with four tablets per pod, plus backups, which meant more than fifty tablets that needed to display vital signs for whichever character was in the pod.

Since the extras were constantly shifting between pods, we had to have a way to quickly select the right name and information for that passenger. This was done by building a database of passenger information that could be accessed via a drop-down list on each tablet which let us reconfigure the room in just a few minutes.
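In spirit, that lookup is just a keyed record behind the drop-down. A tiny sketch of the idea, with made-up pod numbers and vitals (the real roster was a proper database feeding the Unity UI):

```python
# Hypothetical roster data; the production database held a record per passenger.
PASSENGERS = {
    "Aurora Lane": {"pod": "H-07", "heart_rate": 52, "status": "HIBERNATING"},
    "Jim Preston": {"pod": "H-12", "heart_rate": 55, "status": "HIBERNATING"},
}

def vitals_for(name: str) -> dict:
    """Return the record a pod tablet displays once a name is picked."""
    return {"name": name, **PASSENGERS[name]}
```

Selecting a name from the drop-down simply swaps which record the tablet renders, which is what made reconfiguring the whole room a matter of minutes.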

Since Passengers takes place in a high-tech spaceship, tablets were embedded in the walls throughout the corridors and rooms. Everything from door panels to elevator buttons and each room’s environmental controls were displayed on touch screens.

Because of how the set was constructed, many of the tablets were inaccessible once mounted in place. We’d start by loading the required content before the tablets were mounted but we also needed to have a way to make modifications through the device itself if changes were necessary. 

The beauty of using a game engine is that it renders graphics in real time. Whenever color, text, sizing, positioning or speed needed to be changed, it could be done quickly and remotely either by using a game controller or interactively on the tablet itself. This kept us from causing the kind of delays in the shooting schedule that would have resulted if we’d had to rip devices out of the walls every time a change was made.

I learned a lot about Unity’s capabilities on this movie and I’m excited to continue exploring the boundaries of using game engines in ways they weren’t necessarily built for. This experience has allowed me to refine and expand upon what I thought was possible and I’m excited to use even more advanced versions of my playback software on future projects. That said, without amazing graphics to play back, the best software in the world still won’t get the job done. Chris Kieffer, Coplin LeBleu and Sal Palacios at Warner Bros. produce graphics that are second to none and I’m always grateful for the times I get to work alongside them. 

To read more, here is a link to my article in Local 695 Production Sound & Video Magazine.

Passengers photos courtesy of Columbia Pictures

Posted on January 7, 2017 and filed under Film, Video Playback.

Interstellar

When working on Interstellar, our team was brought in early, in pre-production, to work with the art department on designs for the screens in the various ships. We were tasked with creating a very utilitarian, functional, NASA-style design. After much research into actual space shuttle control screens, we were able to use the real-life examples as a starting point.

Graphic designers Chris Kieffer, Coplin LeBleu, and Sal Palacios did a great job finding a visual middle ground between what would be futuristic space travel to the audience yet antiquated to the crew of the Endurance.

Although I was also involved in the design process, wrangling all of the various content and transforming it into interactive media that could be controlled seamlessly whether it appeared on monitors, tablets, or laptop computers was my specialty.

This film called for me to develop special software to remotely control an iPad mini which was built into a prop featuring fake buttons that needed to respond to Anne Hathaway's interaction.

The TARS and CASE robots each featured two additional iPad minis that also needed to be controlled remotely via this same software. Although I have since created more flexible tools for controlling devices, this was the first time we were able to use other iOS products to remotely trigger the iPads across a wireless network.

The crew of Interstellar was among the most talented group of people I've ever had the pleasure of working with and I feel very fortunate to have been a part of making this film.

Posted on February 5, 2016 and filed under Film, Video Playback.

Man of Steel

The Daily Planet set for Man of Steel needed to look and feel like a lived-in, functional newspaper office. We needed an easy way for our guys on set in Chicago to build multiple desktop computer screens for all the monitors on set.

Because of clearance issues, we couldn't use any actual existing operating system, so Chris Kieffer designed a custom UI for our fake OS. I built a custom application for the film that allowed us to create multiple desktop layouts. Using sets of PNG files, we could simply select a background and icon set, add and position windows, and save a different layout on each on-set computer.
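Conceptually, each saved layout boils down to a small document: a background, an icon set, and a list of positioned window PNGs. A rough sketch of that data model in Python (the actual application was a custom build; the names and fields here are illustrative):

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Window:
    png: str   # window artwork file, e.g. "newsroom_email.png" (hypothetical)
    x: int
    y: int

@dataclass
class DesktopLayout:
    background: str
    icon_set: str
    windows: list = field(default_factory=list)

    def to_json(self) -> str:
        # One saved layout per on-set computer.
        return json.dumps(asdict(self), indent=2)
```

Writing each `to_json()` result to its own file per machine is all it takes for every computer on set to reload its own desktop at power-up.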

I was also on the production in the California unit as a playback operator for the green screen in the Tibetan tent scene. Since it was an old CRT tube TV, I needed to synchronize its refresh rate to the 24-frame film camera or there would be a rolling bar in the shot, making it harder for the VFX team to composite in post.
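That rolling bar is the beat between the tube's refresh and the camera's shutter. As a back-of-the-envelope check (a simplification for illustration; the real sync was done with dedicated hardware):

```python
def bar_drift_hz(refresh_hz: float, camera_fps: float) -> float:
    # The bar crawls at the beat frequency between the display's refresh rate
    # and the nearest integer multiple of the camera frame rate; at zero beat
    # the bar stands still and effectively disappears from the shot.
    multiple = round(refresh_hz / camera_fps) * camera_fps
    return abs(refresh_hz - multiple)
```

A tube locked to 48 Hz (2 x 24) photographs clean, while a stock TV refresh beats against the 24 fps shutter and sends the bar rolling through frame.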

Because of the enormous amount of post production VFX that needed to happen on Man of Steel, I was asked to help out on the monitor replacement in the Daily Planet scenes.  

[Before & after comparison]

I used Nuke for the tracking and composites, and color matched in DaVinci Resolve for editorial screenings.

[Before & after comparison]

Posted on October 1, 2015 and filed under Visual FX, Video Playback, Film.

Alex Cross

In the film Alex Cross, most of the playback on set was green screen to be replaced in post production. I designed and animated the graphics for the interactive mobile device Alex Cross uses to disable the police department's security system.

I was also brought on to do the VFX replacement composites of the green screen computer monitors in post production.

Using tracking markers in the green screen playback files on set made tracking the shots easier.

[Before & after comparison]

I moved to Nuke for compositing the screens on this film because of its speed and powerful compositing tools, and used DaVinci Resolve for color matching the outputs for editorial screenings.

[Before & after comparison]

Posted on February 27, 2015 and filed under Film, Video Playback, Visual FX.

The Avengers

Because of the scale of The Avengers, Rick Whitfield, Jim Sevin, Tim Gregoire, and I were brought on to engineer the hundreds of playback screens. Cantina Creative created the animated computer graphics that we needed to play back on the various sets throughout the film.

The bridge set alone had 130 monitors. We needed to develop a way to route any of the 30 computer feeds out to the monitors on set.

Using four Blackmagic Design 40x40 3G Videohubs, we could organize and control what was on any given monitor in the shot.

Our video playback booth built under the carrier bridge set

In order to speed up our ability to manipulate the layout of the graphics on set, and to quickly put green screens on any monitor for post VFX, it was clear I needed to write custom software to control the Videohub routers.

Since the routers accept Telnet commands over a wired network, I developed a router-control application in Xojo (formerly Real Studio) with which we could set up layouts for all the monitors on set at once and save them as presets. This allowed us to switch to a saved preset for a given scene with a single button click.
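The router side of this is a plain-text protocol over TCP, so a preset is really just a mapping of outputs to inputs serialized into one routing block. A sketch of that idea in Python rather than Xojo (the block format and zero-based indices follow my reading of the Videohub Ethernet protocol, and the host address is invented; treat it as illustrative):

```python
import socket

def routing_block(preset: dict) -> str:
    # One "VIDEO OUTPUT ROUTING:" block sets every listed output at once;
    # entries are "output input" pairs and the block ends with a blank line.
    lines = [f"{out} {inp}" for out, inp in sorted(preset.items())]
    return "VIDEO OUTPUT ROUTING:\n" + "\n".join(lines) + "\n\n"

def apply_preset(host: str, preset: dict, port: int = 9990) -> None:
    """Push a saved preset (output -> input) to the router in one shot."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(routing_block(preset).encode())

# e.g. send outputs 0-2 the feed on input 5 (say, a green-screen source):
# apply_preset("10.0.0.20", {0: 5, 1: 5, 2: 5})
```

Because the whole layout travels as one block, switching an entire scene's worth of monitors really is a single button click.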

As well as routing the computer screens, we also needed to control the timing of the playback graphics for some scenes. For example, in the scene where the mind-controlled Hawkeye shoots the computer-virus arrow into the bridge computer, I built a timed delay into our QuickTime playback software to create a computer-outage ripple effect.
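The effect amounts to staggering each monitor's blackout by its distance from the arrow's impact point. A sketch of that timing math (the positions, names, and wave speed here are invented for illustration):

```python
import math

def ripple_delays(monitors: dict, origin: tuple, wave_speed: float = 2.0) -> dict:
    """Seconds before each monitor blacks out, radiating outward from origin.

    monitors maps a monitor name to its (x, y) position on the set;
    wave_speed is how fast the outage front travels, in the same units/sec.
    """
    return {name: math.dist(pos, origin) / wave_speed
            for name, pos in monitors.items()}
```

Each playback machine then just waits out its own delay before cutting its feed, and the outage appears to wash across the bridge from the point of impact.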

Since the helmsman and map controls at the front of the bridge set were shapes that could not be actual live playback screens, it was decided that instead of using green on those surfaces, they should be designed as static backlit displays. I was approached by the art department about designing the graphics for these practical on-set pieces. My goal was to make them blend in with the graphics designed by Cantina Creative that we would be playing back on set.

On the helm display, I used Adobe After Effects for the final print image, which gave me better control over the shape and curvature. It also made it easier for the animators in post to create the final replaced shot in the film from the composition I created for print.

We also helped out at the Comic-Con Avengers booth.

Setting up a sample of the bridge control screens from the set on the Avengers stage.

Posted on November 12, 2014 and filed under Film, Video Playback, Development.