Epic recently updated the official Unreal Engine roadmap with a host of new features spanning the entire engine. In this article we'll focus on the latest and greatest virtual production features anticipated to arrive in Unreal Engine 5.4.
SMPTE 2110 for In-Camera VFX
Although Epic began implementing end-to-end support for SMPTE 2110 in Unreal Engine 5.3, the goal for upcoming versions is full, production-ready support, including:
Automatic detection and recovery when the system loses framelock, ensuring synchronization is continuously preserved.
Ability to use PTP (Precision Time Protocol) as a Timecode Provider.
Ability to timecode-align 2110 playback.
OCIO support for 2110 media sources.
SDP (Session Description Protocol) import/export support (see the example after this list).
General stability improvements to Unreal Engine's core SMPTE 2110 support.
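To give a sense of what the SDP item covers, below is a minimal session description for a single ST 2110-20 video essence, the kind of file such import/export would handle. The addresses, ports, and PTP grandmaster ID are placeholders, and the exact attributes vary by device:

    v=0
    o=- 1443716955 1443716955 IN IP4 192.168.1.50
    s=Example ST 2110-20 video stream
    c=IN IP4 239.100.9.10/64
    t=0 0
    m=video 50000 RTP/AVP 96
    a=rtpmap:96 raw/90000
    a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; exactframerate=30000/1001; depth=10; TCS=SDR; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017; TP=2110TPN
    a=ts-refclk:ptp=IEEE1588-2008:39-A7-94-FF-FE-07-CB-D0:127
    a=mediaclk:direct=0

Importing a description like this would let a media source pick up the multicast address, payload format, and PTP reference clock in one step instead of entering each parameter by hand.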
Multi-Process Inner Frustum Rendering for ICVFX
The enhanced support for SMPTE 2110 makes it possible to dedicate an entire nDisplay render node exclusively to rendering the inner frustum. This lets stages allocate maximum hardware resources to what is visible in camera (such as dedicating multiple GPUs specifically to the frustum).
This is a crucial advancement for workflows, especially as cameras capture at ever-higher resolutions and LED volumes continue to grow in size; the systems powering them must scale in tandem to meet the ever-evolving requirements of productions.
Depth of Field Estimation
As many of us know, depth of field plays a crucial role in any production, and achieving authentic depth of field on an In-Camera VFX shoot is one of the most difficult creative challenges.
Depth of field estimation addresses this challenge by using both the real-world and virtual metrics available within the nDisplay system. This ensures a precise depth of field (DOF) effect, where virtual objects exhibit varying levels of sharpness based on their position. Depth of field is calculated from a combination of metrics from both the real and virtual environments, including (a simplified sketch of the calculation follows this list):
The placement and orientation of the stage volume in the Unreal scene.
The camera's location, along with its aperture, lens focal length, and focal distance, all of which are captured through LiveLink.
The relative distances among all objects present within the Unreal scene.
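To make this concrete, the classic thin-lens circle-of-confusion formula combines exactly these inputs. The sketch below is illustrative only, not Epic's actual implementation; all names are hypothetical, and distances are assumed to be in millimetres:

    #include <cmath>

    // Diameter of the circle of confusion on the sensor for an object at
    // ObjectDist when the lens is focused at FocusDist. Thin-lens model;
    // all distances in millimetres. Illustrative only, not engine API.
    double CircleOfConfusion(double FocalLength, // e.g. 35.0
                             double FNumber,     // e.g. 2.8
                             double FocusDist,   // focal distance from LiveLink
                             double ObjectDist)  // distance to the virtual object
    {
        const double Aperture = FocalLength / FNumber; // entrance pupil diameter
        return Aperture * FocalLength * std::fabs(ObjectDist - FocusDist)
             / (ObjectDist * (FocusDist - FocalLength));
    }

Objects whose circle of confusion exceeds the camera's acceptable value read as out of focus, so the further a virtual object sits from the tracked focal distance, the more blur it receives.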
LiveLink Hub App
LiveLink Hub is designed to address challenges in production setups that use multiple instances of Unreal Engine (UE) in a Multi-User Editing environment.
In these settings, each UE editor session might need to be configured differently, especially regarding how they connect to LiveLink sources. Currently, UE uses a default preset for connecting to LiveLink sources when a session starts, but this one-size-fits-all approach isn't always sufficient for complex productions.
To address this, the plan is to develop an external application, serving a similar function to the Multi-User Slate server that coordinates multi-user editing sessions in UE. The LiveLink Hub App would (see the sketch after this list):
Monitor Running Editor Sessions
Connect to LiveLink Sources
Visualize and Rebroadcast LiveLink Data
Customize Data for Each Session
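For context on the mechanism the Hub would improve upon, here is roughly how a preset can be applied within a single editor session today using the existing ULiveLinkPreset API. This is a sketch only; the asset path is a placeholder and error handling is omitted:

    #include "LiveLinkPreset.h"

    // Sketch: load a LiveLink preset asset and apply it to this session,
    // replacing the current sources and subjects with the preset's.
    void ApplyStageLiveLinkPreset()
    {
        // Placeholder asset path for illustration only.
        const TCHAR* PresetPath = TEXT("/Game/Presets/StageA_LiveLink.StageA_LiveLink");
        if (ULiveLinkPreset* Preset = LoadObject<ULiveLinkPreset>(nullptr, PresetPath))
        {
            Preset->ApplyToClient();
        }
    }

Today each session either applies the project's default preset or requires custom setup like this; the Hub app would centralize that per-session configuration instead.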
Upcoming Techviz Tools
Epic Games is responding to the growing trend of artists and studios using Unreal Engine for efficient, high-quality previsualization by enhancing its technical visualization offerings. The aim is to give artists advanced capabilities for creating detailed Techvis renders, diagrams, and simulations directly within Unreal Engine.
Ideally, artists will never need to export their work out of the engine, enabling a more seamless, integrated workflow across every stage of visualization.
Android & Mac Support for Virtual Camera
Live Link VCam, the companion application for the Virtual Camera system in Unreal Engine, is set to launch on Android, letting users with ARCore-compatible devices make full use of the Virtual Camera features in their projects.
Complete Virtual Camera support is also coming to Mac, with software-based Pixel Streaming on both x86 and ARM architectures and hardware acceleration on ARM-based M-series Macs.
This article highlights just a few of the most significant and exciting virtual production features on the horizon. For a comprehensive overview of all the latest advancements coming across the whole of Unreal Engine, be sure to check out the official Unreal Engine Roadmap.