Boost Realtime VFX in Unreal Engine (UE4) with multiple GPUs using SLI

GPU Render Performance

Scalable Link Interface (SLI) is a multi-GPU configuration that offers increased rendering performance by dividing the workload across multiple GPUs.

Since UE4.15, Unreal Engine has been able to take advantage of machines and servers with multiple GPUs, as long as the GPUs and system are compatible with SLI.

Realtime Ray Tracing Demo
Realtime VFX ray tracing demo at GDC 2018 by Unreal Engine, ILMxLAB and NVIDIA.

To take advantage of SLI, the system must use an SLI-certified motherboard. Such motherboards have multiple PCI-Express x16 slots and are specifically engineered for SLI configurations.

Building Multiple GPU machines

To create a multi-GPU SLI configuration, NVIDIA GPUs must be attached to at least two of these slots, and the GPUs must then be linked using external SLI bridge connectors.

Once the hardware is configured for SLI, and the driver is properly installed for all the GPUs, SLI rendering must be enabled in the NVIDIA control panel. At this point, the driver can treat both GPUs as one logical device and divide the rendering workload automatically, depending on the selected mode.
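
As a quick sanity check that SLI is actually active, here is a minimal C++ sketch (ours, not Epic's or NVIDIA's) that uses the standard Windows DXGI API to list the adapters the OS exposes. With SLI enabled, the linked GPUs typically show up as a single logical adapter; if you still see two separate NVIDIA adapters, the bridge or driver setup is worth re-checking.

    // Minimal DXGI sketch (Windows): enumerate the GPU adapters the OS
    // exposes. With SLI enabled in the NVIDIA control panel, linked GPUs
    // typically appear as one logical adapter. Link with dxgi.lib.
    #include <dxgi.h>
    #include <wrl/client.h>
    #include <cstdio>

    int main()
    {
        Microsoft::WRL::ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        Microsoft::WRL::ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
             ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                    i, desc.Description,
                    desc.DedicatedVideoMemory / (1024 * 1024));
        }
        return 0;
    }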

There are five SLI rendering modes available:

  • Alternate Frame Rendering (AFR)
  • Split Frame Rendering (SFR)
  • Boost Performance Hybrid SLI
  • SLIAA
  • Compatibility mode

If you are building a multi-GPU system with GPUs of different capabilities, say a Titan X alongside a couple of Quadros, you can utilise SLI Compatibility mode. This mode lets UE4 push rendering tasks to the most suitable GPU in your setup: demanding tasks go to the most powerful card, while the less powerful GPUs in your rig handle lighter, more appropriate workloads. If you are interested in understanding more about SLI, take a look at the following page on the NVIDIA website.

UPDATE 26/03/2018: after sharing this post to the Octane Render group on Facebook, a few interesting comments came up that we thought we'd add to this post.

James Hibbert said, “Just for clarification, this article is talking a lot about SLI and using SLI bridges; you do not need any of that for rendering with Octane using multiple GPUs.” But he then added, “If you are using UE4, then yes, you will probably want SLI, if in the context of your project it actually gives you some benefit. That is not always a given with raster rendering. However, with Octane, your speed scales 1:1 with the number of GPUs you have.”

Another commenter replied: “James Hibbert, just an aside: every PC tech guru seems to agree on one thing. For games, at least the vast majority of them, a gamer is better off getting the fastest single GPU they can afford, rather than getting two slower/cheaper cards and running them in SLI/Crossfire. For Octane and Redshift, you simply need as many GPUs as you can afford.

Just remember, multi-GPU and SLI are not the same thing. SLI is a specific technology from NVIDIA. Octane does not use SLI; Octane uses multi-GPU (not sure exactly which flavor there is, but your motherboard does it on its own with the help of the OS).

There is a difference.

Now there is another form of multi-GPU from NVIDIA called NVLink. NVLink is similar to SLI, but allows you to do things like stack GPU memory: if you have four GPUs with 11 GB of VRAM each, you will have a total of 44 GB of VRAM, whereas all other forms would still leave you with the original 11 GB. Keep in mind that NVLink is not available on consumer GPUs; you need to use Quadro or Tesla cards to use it.

Hopefully that will change with the next line of consumer GPUs from NVIDIA. SLI support from NVIDIA has dropped off quite a bit, to the point where they only officially support two-way SLI. I kinda suspect that they will either drop SLI altogether or migrate everything to NVLink in future products. Because of the ray-tracing UE4 demo, UE4 will feature support for NVLink in a future build, because they had to link multiple GPUs to get it to run in real time.”
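
To make the multi-GPU versus SLI distinction above concrete, here is a minimal C++ sketch of ours (not the commenter's, and not Octane code) using the CUDA runtime API, the kind of interface GPU renderers are built on. It counts the devices the OS exposes and checks peer-to-peer access between the first two; without NVLink-class peer access, four 11 GB cards are four separate 11 GB pools, not one 44 GB pool.

    // Minimal CUDA runtime sketch: how a multi-GPU renderer sees devices
    // without any SLI bridge. Each device owns its own VRAM; memory only
    // pools across cards where NVLink-class peer access exists.
    #include <cuda_runtime.h>
    #include <cstdio>

    int main()
    {
        int count = 0;
        cudaGetDeviceCount(&count);
        printf("CUDA devices: %d\n", count);

        for (int i = 0; i < count; ++i)
        {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            printf("GPU %d: %s, %.1f GB VRAM\n",
                   i, prop.name, prop.totalGlobalMem / 1e9);
        }

        // Can device 0 directly address device 1's memory? True over
        // NVLink on Quadro/Tesla parts, and sometimes over PCIe.
        if (count >= 2)
        {
            int canAccess = 0;
            cudaDeviceCanAccessPeer(&canAccess, 0, 1);
            printf("GPU 0 -> GPU 1 peer access: %s\n",
                   canAccess ? "yes" : "no");
        }
        return 0;
    }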

Real-Time Ray Tracing in Unreal Engine 4

Epic Games, in collaboration with NVIDIA and ILMxLAB, today gave the first public demonstration of real-time ray tracing in Unreal Engine. Real-time ray tracing is considered to be a holy grail for those creating high-end cinematic imagery, and one that signifies a leap forward in the convergence of film and games.

Epic Games Demonstrates Real-Time Ray Tracing in Unreal Engine 4 with ILMxLAB and NVIDIA

During yesterday’s “State of Unreal” opening session at the Game Developers Conference (GDC), the three companies presented an experimental cinematic demo using Star Wars characters from The Force Awakens and The Last Jedi built with Unreal Engine 4. The demonstration is powered by NVIDIA’s RTX technology for Volta GPUs, available via Microsoft’s DirectX Ray Tracing API (DXR). An iPad running ARKit is used as a virtual camera to draw focus to fine details in up-close views.
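
For readers wondering what “available via Microsoft's DirectX Ray Tracing API (DXR)” looks like from code, here is a minimal, hedged C++ sketch using the public D3D12 headers to ask a device whether it supports DXR at all. This is our own illustration, not code from the demo, and it needs a Windows 10 SDK recent enough to ship the DXR feature structs.

    // Minimal D3D12 sketch: query the DXR ray tracing tier on the default
    // adapter. This is the public API surface NVIDIA RTX plugs into; it
    // is not code from the Epic/ILMxLAB demo. Link with d3d12.lib.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    int main()
    {
        Microsoft::WRL::ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device))))
            return 1;

        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(
                D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5))))
        {
            // D3D12_RAYTRACING_TIER_NOT_SUPPORTED (0) means no DXR.
            printf("Ray tracing tier: %d\n",
                   static_cast<int>(options5.RaytracingTier));
        }
        return 0;
    }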

Epic built the computer-generated (CG) scene using assets from Lucasfilm’s Star Wars: The Last Jedi featuring Captain Phasma, clad in her distinctive armor of salvaged chromium, and two stormtroopers who run into her on an elevator on the First Order ship. In the tech demo, lighting is moved around the scene interactively, as the ray-traced effects including shadows and photorealistic reflections render in real time. The stunning image quality of highly reflective surfaces and soft shadows has never before been achieved at such a high level of image fidelity in Unreal Engine.

Next-generation rendering features shown in today’s demo include:

  • Textured area lights
  • Ray-traced area light shadows (see the sketch after this list)
  • Ray-traced reflections
  • Ray-traced ambient occlusion
  • Cinematic depth of field (DOF)
  • NVIDIA GameWorks ray tracing denoising
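
As a rough illustration of why ray-traced area light shadows look soft, here is a toy C++ sketch of our own (not demo code): it averages shadow-ray visibility over random sample points on a rectangular light, with a single sphere as the occluder. Fractional visibility values are exactly what render as penumbras; real renderers use only a handful of samples per pixel, which is why the GameWorks denoising in the list above matters so much.

    // Toy illustration (not demo code): soft shadows from an area light.
    // Shadow rays go from a surface point to random points on a 1x1
    // rectangular light; partially blocked points fall in the penumbra.
    #include <cstdio>
    #include <cmath>
    #include <random>

    struct Vec3 { double x, y, z; };
    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Does the segment from p to q hit the sphere before reaching q?
    static bool occluded(Vec3 p, Vec3 q, Vec3 center, double radius)
    {
        Vec3 d = sub(q, p);
        Vec3 oc = sub(p, center);
        double a = dot(d, d);
        double b = 2.0 * dot(oc, d);
        double c = dot(oc, oc) - radius * radius;
        double disc = b * b - 4.0 * a * c;
        if (disc < 0.0) return false;
        double t = (-b - std::sqrt(disc)) / (2.0 * a);
        return t > 1e-6 && t < 1.0; // hit strictly between p and q
    }

    int main()
    {
        const Vec3 surfacePoint{0.0, 0.0, 0.0};
        const Vec3 sphere{0.3, 0.5, 0.0};  // occluder between point and light
        const double radius = 0.25;
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> u(-0.5, 0.5);

        int visible = 0;
        const int samples = 1024;
        for (int i = 0; i < samples; ++i)
        {
            // Random sample on a 1x1 rectangular light at height y = 2.
            Vec3 lightPoint{u(rng), 2.0, u(rng)};
            if (!occluded(surfacePoint, lightPoint, sphere, radius))
                ++visible;
        }
        // A fraction strictly between 0 and 1 renders as a soft penumbra.
        printf("Light visibility: %.2f\n", double(visible) / samples);
        return 0;
    }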

“Ray tracing is a rendering process typically only associated with high-end offline renderers and hours and hours of computer processing time,” said Epic Games Founder and CEO Tim Sweeney. “Film-quality ray tracing in real time is an Unreal Engine first. This is an exciting new development for the media and entertainment linear content worlds—and any markets that require photorealistic visualization.”

“At ILMxLAB, our mission is to create real-time rendered immersive experiences that let audiences step into our stories and connect with cinematic universes that look and feel as real as the ones on the movie screen. With the real-time ray-tracing technology that Epic and NVIDIA are pioneering, we are a pivotal step closer to that goal,” says Mohen Leo, ILMxLAB Director of Content and Platform Strategy.

Epic Games worked closely with NVIDIA to support the NVIDIA RTX technology available through the DXR API. Running on an NVIDIA DGX Station, the demo was brought to life via a collaboration between Epic’s dedicated graphics and engine team, NVIDIA’s world-class ray tracing experts and the technical ingenuity and creative artistry of ILMxLAB.

Google buys Lytro for next to nothing.

Insiders at Google and Lytro are saying that the sale price of Lytro could be as little as $25m. Google is in talks with Lytro over its assets, which include some 59 patents in light field and optical technologies.

Google buys Lytro

Too much too soon for this light-field pioneer.

It's a case of too much too soon: the once-shiny Lytro took to the front of the light-field development scene, but then Lytro's consumer product flopped and expensive cracks started to appear. Not to say the company was wrong; it was more a case of too soon, for a market that was just not ready to support such ideas.

Typical: think back to the start of the web. Have you even heard of CompuServe? No? Well, history seems to be repeating itself as early market entrants get overtaken, sold off or, worse, simply fade out of business. [CompuServe was the first major commercial online service provider in the United States. It dominated the field during the 1980s and remained a major influence through the mid-1990s.] Some of the companies at the front of the light-field, volumetric, 360 and VR race seem to be running out of fuel, while others, in some cases not pushing the envelope so far, are seeing a rush of funding.

The world's not a fair place, and the tech world is downright savage. The sale of Lytro to Google is the start of a mopping-up wave that will see bigger companies harvesting the IP assets of these early pioneers. We feel quite sad, as Lytro was a true pioneer of what is today becoming the vital component of virtual experiences: the ability to see light and depth.

What will Google do with the Lytro technology? For the cost of the purchase, Google will probably just fold the IP into its own hardware developments; the learning alone is worth the cost, which could be as low as $25m. Even if the price of Lytro was double that, the hard work the Lytro team has done over the past few years amounts to a ready-made head start for the Google teams.

Realtime VFX in Unreal Engine 4 Tutorial

realtime VFX tutorial
Unreal Engine's real-time volumetric rendering tools create seamless special effects that integrate perfectly with the action.

Realtime VFX using Unreal Engine (UE4) is pushing the boundaries of on-set realtime production. Here at On-set Facilities we build realtime VFX machines, sets and production solutions optimised for realtime VFX in UE4. Tutorial video after the jump:

Continue reading “Realtime VFX in Unreal Engine 4 Tutorial”

A short Essay on Volumetric Filmmaking

May I share how I believe directing and shooting in a digital volume (360°, light and depth capture, VR, etc.) will begin to change how we make films and content, and ultimately tell visual stories.

I’d also like to invite other cinematographers, actors and directors to join me in embracing these new filmmaking techniques, today.

For I believe virtual production and volumetric filmmaking will give birth to a new age in artistic storytelling, but in the beginning, at least, it will borrow heavily from seasons past and take us back to another time in cinema history.

Firstly: if you agree that film is all about capturing a performance and managing how it is then retold, you'll enjoy the future. For I believe virtual storytelling and volumetric capture will focus storytellers on the craft of performance, direction and story. Here's my thinking:

Examples of modern editing style in film and American cinema.

Continue reading “A short Essay on Volumetric Filmmaking”

Open Source VR Engines 2018

Godot Engine

Some open-source VR engine options: Apertus VR / OSVR / Godot / Annwvyn. Take a look at those to start with. Which one to go for?

In our view, from looking at what's on offer, Godot looks good, but then take a look at Xenko. We came across Xenko and, thanks to its flashy graphics and showreel, got a little excited. Take a look at the trailer video:

Open-source game engines.

Godot has released a fresh third-version update. Impatient users can put an end to 18 months of waiting by jumping directly to the Download page and start playing with Godot 3.0!

How to choose an open source VR engine / game engine.

Our best advice is to follow the links from this post, then take a look at the social metrics for each option. Does the engine have a big, supportive community? Does it offer the API or source-code access and the publishing options you need?

Also check whether they respond in, or even have, a great support forum. These are the questions you'll need to ask to evaluate whether an open-source VR engine is right for your project plans.

Continue reading “Open Source VR Engines 2018”

Who owns the Virtual Metaverse and the ideals it’s being built on?

the metaverse

What are the common, open standards of the VR Metaverse, and will creation in the Metaverse become a human right?

The answer to the second question, at present, is no.

We are in a new time of technological revolution, one driven by mankind's inherent drive to travel to new places and build new things; the Metaverse is now actively being built.

But unlike at the birth of the internet, this time around capitalism and corporations have a firm hold over users, and they control the world's biggest digital networks.

Some open-source VR engine options: Apertus VR / OSVR / Godot / Annwvyn. More to come on these as they come to market.

There’s a lack of WWW humanist ideals in VR. 

Talking to many players in VR production, we find that all of them are trying to win a race to become the dominant VR platform.

But, without the open, shared and common standards that built the internet, how will we activate millions of developers and builders? Will the metaverse ever gain the open foundations for innovation that gave birth to the likes of Facebook, Apple and Google?

Continue reading “Who owns the Virtual Metaverse and the ideals it’s being built on?”

360° 6DOF Volumetric VR Video Technologies Reviewed 2018

HypeVR Tonaci Tran RED rig

We explored the web for 6DoF (volumetric) video technologies and companies. As a virtual production systems integrator, OSF brings 6DoF volumetric production technologies to locations, sets and stages worldwide. In this post we take a look at what's out there in the volumetric video, VR video and 6DoF space.

Let's explore 360° 6DoF volumetric video

We'll be updating this post as we discover new players in the space, but for now here's our rundown of companies and technologies in volumetric video production.

Continue reading “360° 6DOF Volumetric VR Video Technologies Reviewed 2018”