Real-Time in Film Production

 


When I started using visual effects applications, I spent most of my time waiting for things to render just so I could see what I had made. Deep in my video archive there’s a clip where I mimed pulling curtains of video effects across the frame. Though the video is only a few seconds of me standing in one spot, it took a whole day to render! Most of that day was spent finding ways to fill the wait, hoping the render turned out well (which it rarely did, since I was just beginning). At the end of the day I had a video, but at what cost! I kept up this routine, trying to cut the render times down and planning ahead what I would do while rendering, but it always left a sour taste in my mouth. Then I stumbled upon the wonders of real-time.


Dark Helmet: “What the hell am I looking at? When does this happen in the movie?”

Colonel Sandurz: “Now, you're looking at now, sir. Everything that happens now is happening now.”

Spaceballs (1987)

But what is Real-time?

Real-time is the time-saving term for seeing your final product as you create it. That frees up quality time for other things like creative engagement, feedback, and even sleep! It can be applied to mediums such as video, photography, 3D modelling, and even 2D/3D animation. For 3D animation, instead of initiating a render process just to view the movie, you see the final product every time you tweak a parameter or change an object in the scene. It’s not a new invention but rather an optimized strategy for approaching workflows, made possible by modern technology. These concepts have changed not only the way I work but also the way I look at the laborious render pipelines of large-scale productions, especially in post-production, where an enormous amount of time is sunk into rendering.
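
To make the idea concrete, here is a toy Python sketch (not taken from any particular engine) of the loop a real-time renderer runs: every frame has to be produced within the frame budget, so any parameter you change shows up a fraction of a second later instead of after an offline render. The render_frame function and the params dictionary are stand-ins for a real scene.

```python
import time

# Stand-in scene parameters -- in a real tool you would tweak these from a UI
# while the loop runs, and the very next frame reflects the change.
params = {"brightness": 1.0, "blur": 0.0}

def render_frame(t, params):
    # Placeholder for the actual renderer. In a real-time engine this call
    # must finish inside the frame budget (roughly 16.7 ms at 60 fps).
    return f"frame at {t:.2f}s, brightness={params['brightness']}"

FRAME_BUDGET = 1 / 60                      # target: 60 frames per second
start = time.perf_counter()
for _ in range(120):                       # two seconds of "playback"
    frame_start = time.perf_counter()
    frame = render_frame(frame_start - start, params)
    print(frame, end="\r")                 # the result is visible immediately
    elapsed = time.perf_counter() - frame_start
    time.sleep(max(0.0, FRAME_BUDGET - elapsed))   # hold the frame rate
```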


Rogue One: A Star Wars Story

“The advantage of the TouchDesigner user interface explained Leo was it allowed the lighting to be triggered as needed. For example, he explained how the DOP would stand beside someone with a TouchDesigner ipad who would trigger the lighting changes to the LED when the DOP indicated the moment they wanted the ship jump to hyperspace.” - The Third Floor and ILM

The Benefits

Generally, the best thing about real-time rendering is the time you save. Instead of waiting hours for a render to finish, you can focus on making sure the lighting, textures, camera placement, the speed of the shot, or the general mood is exactly how you like it. Every time you change something, the final product updates to reflect it a second later. After establishing a real-time render pipeline, I found I could cut the render time of a 30-second 4K clip from 30 minutes to 40 seconds. This has allowed me to toss out the preciousness of each render and focus on quickly reviewing the content with my team or clients. And if something needs to change?

“Give me a minute... here you go” or “Let’s take a look together... live!”

With render times and heavy technical setup out of the way, we can focus on the creative work and on refining the experience with all stakeholders. If you’re savvy like Psycho Jelly, you can even run this all on a laptop for small projects, or scale it up within large industries such as film, design, rapid prototyping, data visualization, and architecture. Once you've experienced the benefits of a real-time workflow, going back to anything else can feel like using dial-up internet!

Real-time in Film

So how can this be applied to something like the filmmaking process? Well, let’s say a film crew has set up a scene: the director shouts action, the actors go through the scene, the director says cut, and they review the shot. Prior to the digital age, you had to wait for the film to be developed to even know whether the take was any good, and from there it was left to post-production to work through. You couldn’t even know if the exposure and coloring were correct! Now, with the beauty of digital, you can watch what was just shot right away and know that all the time and money spent setting up the shot was not wasted. Because technology shortened the delay to seeing actual footage, reviewing the content can benefit the creative process during the film’s actual creation, not just in post.

Typical Production Pipeline


This is a summary of a typical production pipeline, moving from pre-production on the left to release on the right. If things need to change late in the pipeline, it can be a burden: you may have to start back at the beginning and work through the pipeline again before seeing your final product.

Real-time Production Pipeline


This is an example of a real-time pipeline, modified from the typical pipeline shown previously. Since much of the conception and design runs parallel to production, if something needs to change anywhere in the project, you can quickly see how it affects the final (or nearly final) product without as much cost in time or other resources.

Now, we can go a step further: we can review the shot with color grading applied, preview different edits of the scene using the shot, drop in some mood audio, or even compare multiple shots simultaneously, not just to get the best shot but also to make sure everything is cohesive. You can even record depth data to place computer-generated graphics in the shot, so the actors know what they are reacting to and the cinematographer knows how everything will sit in the frame to get the best composition and lighting. Ten years ago this would have taken a whole team of post-production operators and their clunky rigs a lot of time and energy. Now it can be done in real-time, seconds after the shot was taken, on a laptop, with custom software like the tools Psycho Jelly develops.
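
As a rough illustration of the depth idea (not production code, and not how any particular studio does it), here is a minimal NumPy sketch of depth-based compositing: given a live-action plate, a real-time CG element, and per-pixel depth for each, you keep whichever layer is closer to the camera at every pixel. The arrays below are random placeholders standing in for real footage and renders.

```python
import numpy as np

h, w = 1080, 1920                               # hypothetical HD frame size
plate       = np.random.rand(h, w, 3)           # live-action plate (RGB)
plate_depth = np.full((h, w), 5.0)              # depth from a sensor, in metres
cg          = np.random.rand(h, w, 3)           # real-time CG element (RGB)
cg_depth    = np.full((h, w), 3.0)              # CG sits closer to camera here

# Depth compositing: at each pixel keep whichever layer is nearer the camera,
# so the CG appears correctly in front of or behind the live action.
nearer = (cg_depth < plate_depth)[..., None]    # boolean mask, broadcast to RGB
composite = np.where(nearer, cg, plate)
print(composite.shape)                          # (1080, 1920, 3)
```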


Real-time interface used in the film production of Prometheus (the Alien prequel) by Soma CG, using TouchDesigner


Real-time video in film production by Lux Machina for the warp-speed effects in Solo: A Star Wars Story, using Notch and TouchDesigner

Want to know more?

If you want to implement a process like this and are looking for a good place to start, Elburz does a fantastic job of introducing the term real-time for applications and installations in his blog post on the facets of real-time installations. I’ve also found some great videos on the subject at cutsceneartist and on the GDC (Game Developers Conference) YouTube channel. For those wanting to dive head-first into real-time rendering software, TouchDesigner and Notch are great places to start, as both are used in the photo examples above. If you’re a developer, using APIs for real-time data and SDKs for the hardware you are working with can also give you more flexibility to connect to the applications you may already have. Game engines like Unity (this video in particular) and Unreal have made huge moves towards real-time video creation, since games already rely on real-time rendering for gameplay. 2D/3D rendering software like Blender, Cinema 4D, and the Adobe applications are also starting to integrate these techniques, but hardware can still be a barrier to entry, and the end product is often a rendered image rather than something that runs in real time.
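
As a small example of what hooking into such APIs can look like, the sketch below uses the python-osc library to send an OSC message that a TouchDesigner or Notch setup could listen for, for instance to trigger a lighting cue like the hyperspace jump mentioned earlier. The address /lighting/hyperspace and port 7000 are made up for illustration; you would match them to whatever your receiving network expects.

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical host and port of a machine running TouchDesigner (or another
# real-time tool) that is listening for OSC messages.
client = SimpleUDPClient("127.0.0.1", 7000)

# Send a cue: the address and value are arbitrary here; the receiving patch
# decides what "/lighting/hyperspace" actually drives (e.g. LED wall brightness).
client.send_message("/lighting/hyperspace", 1.0)
```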

Conclusion

From cutting renders from 30 minutes down to 40 seconds to enabling instant pre-visualization, the benefits of real-time are hard to overstate. But adopting real-time workflows doesn’t mean you have to completely replace your old ones. With a simple review of your workflow and a few small adjustments, you can quickly start leveraging these techniques and spend more time refining the creative and crucial parts of your projects. These examples also don’t just apply to Hollywood-scale productions or huge agencies and design firms with giant render servers; they can be used by a company of any size with some good thought put into the workflow.






 
Psycho Jelly