Real-Time Ray Tracing in Unreal Engine 4


Epic Games, in collaboration with NVIDIA and ILMxLAB, today gave the first public demonstration of real-time ray tracing in Unreal Engine. Real-time ray tracing is considered a holy grail for those creating high-end cinematic imagery, and it signifies a leap forward in the convergence of film and games.

Epic Games Demonstrates Real-Time Ray Tracing in Unreal Engine 4 with ILMxLAB and NVIDIA

During yesterday’s “State of Unreal” opening session at the Game Developers Conference (GDC), the three companies presented an experimental cinematic demo using Star Wars characters from The Force Awakens and The Last Jedi built with Unreal Engine 4. The demonstration is powered by NVIDIA’s RTX technology for Volta GPUs, available via Microsoft’s DirectX Ray Tracing API (DXR). An iPad running ARKit is used as a virtual camera to draw focus to fine details in up-close views.


Epic built the computer-generated (CG) scene using assets from Lucasfilm’s Star Wars: The Last Jedi, featuring Captain Phasma, clad in her distinctive armor of salvaged chromium, and two stormtroopers who run into her on an elevator on the First Order ship. In the tech demo, lighting is moved around the scene interactively as the ray-traced effects, including shadows and photorealistic reflections, render in real time. This level of image fidelity for highly reflective surfaces and soft shadows has never before been achieved in Unreal Engine.

Next-generation rendering features shown in today’s demo include:

Textured area lights
Ray-traced area light shadows
Ray-traced reflections
Ray-traced ambient occlusion
Cinematic depth of field (DOF)
NVIDIA GameWorks ray tracing denoising
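
To make the idea behind the shadow features concrete: a ray-traced shadow is computed by firing a ray from the shaded point toward the light and checking whether any geometry blocks it. The toy Python sketch below is not Unreal's or DXR's implementation, just the underlying occlusion test, with spheres standing in for scene geometry:

```python
import math

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_hits_sphere(origin, direction, center, radius, max_t):
    """Return True if the ray hits the sphere before travelling max_t."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is normalized, so a == 1
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 0.001 < t < max_t        # small epsilon avoids self-shadowing

def in_shadow(point, light_pos, occluders):
    """Cast a shadow ray from the shaded point toward the light."""
    to_light = sub(light_pos, point)
    dist = math.sqrt(dot(to_light, to_light))
    direction = tuple(c / dist for c in to_light)
    return any(ray_hits_sphere(point, direction, c, r, dist)
               for c, r in occluders)

# A sphere sitting between the point and the light blocks the shadow ray.
print(in_shadow((0, 0, 0), (0, 10, 0), [((0, 5, 0), 1.0)]))  # True
```

Area lights extend this by firing many shadow rays at sampled points across the light's surface and averaging the results, which is what produces soft shadows; the denoising pass listed above then cleans up the grain that comes from using only a few samples per pixel.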

“Ray tracing is a rendering process typically only associated with high-end offline renderers and hours and hours of computer processing time,” said Epic Games Founder and CEO Tim Sweeney. “Film-quality ray tracing in real time is an Unreal Engine first. This is an exciting new development for the media and entertainment linear content worlds—and any markets that require photorealistic visualization.”

“At ILMxLAB, our mission is to create real-time rendered immersive experiences that let audiences step into our stories and connect with cinematic universes that look and feel as real as the ones on the movie screen. With the real-time ray-tracing technology that Epic and NVIDIA are pioneering, we are a pivotal step closer to that goal,” said Mohen Leo, ILMxLAB Director of Content and Platform Strategy.

Epic Games worked closely with NVIDIA to support the NVIDIA RTX technology available through the DXR API. Running on an NVIDIA DGX Station, the demo was brought to life via a collaboration between Epic’s dedicated graphics and engine team, NVIDIA’s world-class ray tracing experts and the technical ingenuity and creative artistry of ILMxLAB.

Open Source VR Engines 2018

Godot Engine

Some open-source VR engine options to start with: ApertusVR, OSVR, Godot, and Annwvyn. Which one should you go for?

In our view, from looking at what’s on offer, Godot looks good, but also take a look at Xenko. We came across Xenko and got a little excited over its flashy graphics and showreel; take a look at the trailer video:

Open-source game engines.

Godot has released its third major version. Impatient users can put an end to 18 months of waiting by jumping directly to the Download page and playing with Godot 3.0!

How to choose an open source VR engine / game engine.

Our best advice is to follow the links from this post, then take a look at the social metrics for each option. Do they have a big, supportive community? Do they offer the API or source-code access and the publishing options you need?

Also check whether each project has a good support forum and whether the developers actually respond there. These are the questions you’ll need to ask to evaluate whether an open-source VR engine is right for your project plans.
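
One rough way to compare community size is to pull each project’s public repository statistics. Here is a small Python sketch against the GitHub API; the repository paths are our best guesses at the time of writing and may have moved, so treat them as placeholders:

```python
import requests

# Hypothetical repository paths -- verify against each project's website.
REPOS = {
    "Godot": "godotengine/godot",
    "ApertusVR": "MTASZTAKI/ApertusVR",
    "OSVR": "OSVR/OSVR-Core",
    "Annwvyn": "Ybalrid/Annwvyn",
}

for name, repo in REPOS.items():
    resp = requests.get(f"https://api.github.com/repos/{repo}", timeout=10)
    if resp.status_code != 200:
        print(f"{name}: repository not found ({repo})")
        continue
    data = resp.json()
    print(f"{name}: {data['stargazers_count']} stars, "
          f"{data['forks_count']} forks, "
          f"{data['open_issues_count']} open issues")
```

Stars and forks are blunt instruments, but a project with an active issue tracker and steady contributions is far less likely to leave you stranded mid-production.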


Welcoming AI & Machine Learning to Colour Grading

File this under how machines learn, and why the auto-match in every colour-correction software package you’ve ever used never works like you wish it did. “Welcoming AI & Machine Learning to Colour Grading” is a talk from BSC Expo 2018, presented by Dado Valentic of Colourlab.co, speaking here for Colour Intelligence.

It is especially interesting for anyone who wants to understand the basics of machine learning and how humans teach an AI, via passive learning, to recognise when a colour grade is good or bad.
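
As a toy illustration of that idea (ours, not Colourlab’s actual method): collect accept/reject judgements from a colourist, reduce each graded frame to simple colour statistics, and fit a classifier that predicts whether a new grade would pass. The Python sketch below uses random stand-in data and hypothetical features purely to show the shape of the pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def grade_features(frame):
    """Toy features: per-channel mean and std of an RGB frame (H, W, 3)."""
    return np.concatenate([frame.mean(axis=(0, 1)), frame.std(axis=(0, 1))])

# Hypothetical training data: frames a colourist kept (1) or re-graded (0).
rng = np.random.default_rng(0)
frames = rng.random((200, 64, 64, 3))    # stand-in for real footage
labels = rng.integers(0, 2, size=200)    # stand-in for human judgements

X = np.stack([grade_features(f) for f in frames])
model = LogisticRegression(max_iter=1000).fit(X, labels)

# Score a new grade: probability the colourist would accept it as-is.
print(model.predict_proba(grade_features(frames[0])[None, :])[0, 1])
```

A real system would use far richer features and far more data, but the passive-learning loop is the same: every grade a human accepts or corrects becomes another labelled example.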

Unreal Engine Building & Optimizing Worlds for Real-Time.

We work with many vendors on projects, so we thought it a good idea to share a bit of learning to make everyone’s life easier. This video focuses on creating content assets in your favourite DCC software, exporting them as FBX files, and the considerations when designing content in DCCs for VR engines. You won’t be surprised: small is beautiful, fast is best. File this under the art of code.
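
As an example of scripting that export step, here is a minimal Python sketch using Blender as the DCC. The flag names are from Blender’s bundled FBX exporter and may differ between versions, so check yours:

```python
# Runs inside Blender's Python console or Scripting tab.
import bpy

def export_selection_for_ue4(filepath):
    """Export the selected objects as FBX with settings suited to real-time use."""
    bpy.ops.export_scene.fbx(
        filepath=filepath,
        use_selection=True,        # only export what you have picked
        use_mesh_modifiers=True,   # apply modifiers so the engine gets final geometry
        mesh_smooth_type='FACE',   # export smoothing so UE4 doesn't complain
        path_mode='COPY',          # copy textures next to the FBX
        embed_textures=False,      # keep textures as separate files
    )

export_selection_for_ue4("/tmp/prop_for_ue4.fbx")
```

Batch-exporting through a script like this also keeps settings consistent across vendors, which matters when dozens of assets from different DCCs have to land in the same engine scene.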

Octane Render in Cinema4D: It’s All About the Render.

Octane Render example in Cinema 4D

Credits:

Design and Production: Taehoon Park
Character Design and Modeling: Hyunsup Ahan
Synopsis and Edit: Jihoon Roh
Music and SFX: Echoic

Software:

Cinema 4D, Octane, After Effects, Photoshop

Artist’s Synopsis:

In the future, mankind has dramatically extended its lifespan through highly advanced technology, and the only way left to meet death is a form of euthanasia. ‘DREAVELER’, a death-management company that has researched a diversity of euthanasia methods, has invented a brand-new system that lets people travel through their dreams and memories during REM sleep, before the final sleep. The system, also named ‘DREAVELER’ (a compound of Dream & Travel), is focused on soothing people who have grown weary of a desolate future and are filled with nostalgia for the nature of a vanished past.

On-set Facilities Real-time VFX Compositing Set-ups.

We have had an amazing response to our team’s work in real-time compositing and live on-set VFX.

In this post, I will try to tell you more about how we build our real-time VFX set-ups. If you have any questions, please use our contact form and one of the team will get right back to you. [UPDATE: We have just set up our Facebook Page; like it to keep up with new developments.]

This is a great previz tool: it lets us record the foreground (FG), the background (BG), and the matte, all in real time.
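
At its core, every composited frame is the classic “over” operation: foreground where the matte is solid, background everywhere else. Here is a minimal NumPy sketch of that step, with stand-in frames rather than our actual pipeline code:

```python
import numpy as np

def comp_over(fg, bg, matte):
    """Classic 'over' composite: FG held where the matte is solid, BG elsewhere.

    fg, bg: float32 RGB frames in [0, 1], shape (H, W, 3)
    matte:  float32 alpha in [0, 1], shape (H, W, 1)
    """
    return fg * matte + bg * (1.0 - matte)

# Stand-in frames: in the live rig, FG and matte come from the camera feed
# and the keyer, and BG comes from the game-engine render, every frame.
h, w = 1080, 1920
fg = np.ones((h, w, 3), dtype=np.float32) * 0.8
bg = np.zeros((h, w, 3), dtype=np.float32)
matte = np.zeros((h, w, 1), dtype=np.float32)
matte[270:810, 480:1440] = 1.0   # hypothetical keyed region

frame = comp_over(fg, bg, matte)
print(frame.shape, frame.max())
```

The hard part on set is not this arithmetic but doing it at frame rate with synchronised camera tracking, which is where the hardware compositing tools come in.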

On-set Facilities’ real-time VFX systems are built by OSF teams in the UK and Madrid, using virtual 360º 3D environments, camera tracking, hardware compositing tools, colour-correction software, and game engines.

Realtime VFX Compositing 2nd Test

Various manufacturers provide the vital technology (listed below), and OSF is now working closely with developers and manufacturers to push the boundaries between CGI, VFX, 3D, mixed reality, film, and games.

“So far we have created a great tool for pre-visualisation and real-time VFX. It allows us to record, in real time, live-action foregrounds (with or without live-action sets, using real props and actors) and digitally bake them within HD, 2K, 360º, and 3D virtual worlds.”

The camera-tracking information is captured, along with zoom and focus data, in FBX; it can then be passed directly to NUKE for on-set compositing, or saved to be manipulated later.
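
For example, inside NUKE the tracked camera can be loaded with a few lines of Python. The knob names below are from the standard Camera2 node and may vary between NUKE versions, so verify in your release:

```python
# Runs inside NUKE's Script Editor.
import nuke

def camera_from_fbx(fbx_path):
    """Create a Camera node driven by a tracked camera stored in an FBX file."""
    cam = nuke.createNode('Camera2')
    cam['read_from_file'].setValue(True)   # drive the camera from the file
    cam['file'].setValue(fbx_path)         # the on-set FBX track (incl. zoom/focus)
    return cam

camera_from_fbx('/shots/sc01/tracking/camera_track.fbx')
```

Because the same FBX drives both the engine and NUKE, the comp camera matches the on-set virtual camera without any manual matchmoving.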

Here is a list of the technologies you’ll need to explore if you want to build your own real-time compositing set-up and add live VFX to your next production.

We bake in UE4, we comp in NUKE, and we create 3D environments, animated characters, and elements in 3D Studio Max. Some of the hardware is OSF’s own, as are the configuration tools and methods. You can always talk to us if you need us.

Live VFX Octane Render

This is another test with a scene made by Sungwoo Lee. It uses Octane Render in Unity, with the result then baked into UE4. We are all now waiting for Octane Render for UE4 and expect a release in the near future. Thanks to Sungwoo for this test.