Unreal Engine 4.21 Now Supports DeckLink

Virtual Production Technology News

We’ve been using DeckLink I/O video cards in our virtual production server builds, and now Epic Games and Blackmagic Design have announced that the latest version of Unreal Engine, 4.21, will natively support these Blackmagic video input and output cards.

Unreal Engine 4.21 supports Blackmagic 8K Capture and Playback Cards.

DeckLink SDK binaries and source code from Epic are also now available as a free download on the company’s Unreal Engine Marketplace.

Epic Games’ Unreal Engine 4.21 now supports DeckLink 8K Pro, DeckLink Duo 2 and DeckLink 4K Extreme 12G capture and playback cards, Blackmagic Design announced.

Unreal Engine is the most widely used real-time engine, providing the highest-quality solution for creating virtual reality (VR) and augmented reality (AR) experiences, as well as virtual sets for film and television. UE is a complete suite of creation tools designed to meet the most ambitious artistic visions while being flexible enough to ensure success for individual developers or the largest creative teams.

Unreal Engine, which added support for HDMI and SDI capture and playback at up to 1080p60 in version 4.20, now includes a number of new features in 4.21 to meet its growing adoption in live broadcast and virtual production workflows. A key part of these new features is support for Blackmagic Design DeckLink products and the DeckLink SDK.

Real-Time Ray Tracing in Unreal Engine 4


Epic Games, in collaboration with NVIDIA and ILMxLAB, today gave the first public demonstration of real-time ray tracing in Unreal Engine. Real-time ray tracing is considered to be a holy grail for those creating high-end cinematic imagery, and one that signifies a leap forward in the convergence of film and games.

Epic Games Demonstrates Real-Time Ray Tracing in Unreal Engine 4 with ILMxLAB and NVIDIA

During yesterday’s “State of Unreal” opening session at the Game Developers Conference (GDC), the three companies presented an experimental cinematic demo using Star Wars characters from The Force Awakens and The Last Jedi built with Unreal Engine 4. The demonstration is powered by NVIDIA’s RTX technology for Volta GPUs, available via Microsoft’s DirectX Ray Tracing API (DXR). An iPad running ARKit is used as a virtual camera to draw focus to fine details in up-close views.


Epic built the computer-generated (CG) scene using assets from Lucasfilm’s Star Wars: The Last Jedi, featuring Captain Phasma, clad in her distinctive armor of salvaged chromium, and two stormtroopers who run into her on an elevator on the First Order ship. In the tech demo, lighting is moved around the scene interactively as the ray-traced effects, including shadows and photorealistic reflections, render in real time. Highly reflective surfaces and soft shadows have never before been achieved at such a high level of image fidelity in Unreal Engine.

Next-generation rendering features shown in today’s demo include:

Textured area lights
Ray-traced area light shadows
Ray-traced reflections
Ray-traced ambient occlusion
Cinematic depth of field (DOF)
NVIDIA GameWorks ray tracing denoising

“Ray tracing is a rendering process typically only associated with high-end offline renderers and hours and hours of computer processing time,” said Epic Games Founder and CEO Tim Sweeney. “Film-quality ray tracing in real time is an Unreal Engine first. This is an exciting new development for the media and entertainment linear content worlds—and any markets that require photorealistic visualization.”

“At ILMxLAB, our mission is to create real-time rendered immersive experiences that let audiences step into our stories and connect with cinematic universes that look and feel as real as the ones on the movie screen. With the real-time ray-tracing technology that Epic and NVIDIA are pioneering, we are a pivotal step closer to that goal,” said Mohen Leo, ILMxLAB Director of Content and Platform Strategy.

Epic Games worked closely with NVIDIA to support the NVIDIA RTX technology available through the DXR API. Running on an NVIDIA DGX Station, the demo was brought to life via a collaboration between Epic’s dedicated graphics and engine team, NVIDIA’s world-class ray tracing experts and the technical ingenuity and creative artistry of ILMxLAB.

Realtime VFX in Unreal Engine 4 Tutorial

Unreal Engine’s real-time volumetric rendering tools can be used to create seamless special effects that integrate perfectly with the action.

Realtime VFX using Unreal Engine (UE4) is pushing the boundaries of on-set realtime production. Here at On-set Facilities we build machines, sets and production solutions optimised for realtime VFX in UE4. Tutorial video after the jump:


Unreal Engine: Building & Optimizing Worlds for Real-Time

We work with many vendors on projects, so we thought it a good idea to share a bit of learning to make everyone’s life easier. This video focuses on creating content assets in your favourite DCC software, exporting them as FBX files, and considerations when designing content in DCCs for VR engines. You won’t be surprised: small is beautiful, fast is best. File this under the art of code.