Boost Realtime VFX in Unreal Engine (UE4) with multiple GPUs using SLI

GPU Render Performance

Scalable Link Interface (SLI) is a multi-GPU configuration that offers increased rendering performance by dividing the workload across multiple GPUs.

Since UE4.15, Unreal Engine has been able to take advantage of machines and servers with multiple GPUs, as long as the GPUs and system are compatible with SLI functionality.

Realtime Ray Tracing Demo
Real-time ray-tracing VFX demo at GDC 2018 by Unreal Engine, ILMxLAB, and NVIDIA.

To take advantage of SLI, the system must use an SLI-certified motherboard. Such motherboards have multiple PCI-Express x16 slots and are specifically engineered for SLI configurations.

Building Multiple GPU machines

To create a multi-GPU SLI configuration, [NVIDIA] GPUs must be attached to at least two of these slots, and then these GPUs must be linked using external SLI bridge connectors.

Once the hardware is configured for SLI, and the driver is properly installed for all the GPUs, SLI rendering must be enabled in the NVIDIA control panel. At this point, the driver can treat both GPUs as one logical device, and divide rendering workload automatically depending on the selected mode.
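Before enabling SLI, it is worth confirming that the driver actually sees every card. On NVIDIA systems the `nvidia-smi` command-line tool can list the installed GPUs; below is a minimal Python sketch that parses its CSV output (the sample string is a hypothetical two-GPU rig, and the live query is left commented out since it requires NVIDIA hardware):

```python
import csv
import io

def parse_gpus(csv_text):
    """Parse the output of:
    nvidia-smi --query-gpu=name,memory.total --format=csv,noheader
    Returns a list of (name, total_memory) tuples, one per GPU."""
    reader = csv.reader(io.StringIO(csv_text))
    return [(row[0].strip(), row[1].strip()) for row in reader if row]

# Hypothetical output from a two-GPU SLI rig:
sample = "GeForce GTX 1080 Ti, 11264 MiB\nGeForce GTX 1080 Ti, 11264 MiB"
gpus = parse_gpus(sample)
print(f"Driver sees {len(gpus)} GPU(s): {gpus}")

# To query a live system instead, something like:
# import subprocess
# out = subprocess.check_output(
#     ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
#     text=True)
# gpus = parse_gpus(out)
```

If the driver only reports one device when two are installed, check the SLI bridge seating and the motherboard slot configuration before going any further.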

There are five SLI rendering modes available:

  • Alternate Frame Rendering (AFR)
  • Split Frame Rendering (SFR)
  • Boost Performance Hybrid SLI
  • SLIAA
  • Compatibility mode

If you are building a multi-GPU system with GPUs of different capabilities, say a Titan X alongside a couple of Quadros, you can use SLI Compatibility mode. This mode enables UE4 to push rendering tasks to the most suitable GPU in your setup: demanding tasks go to the more powerful card, while the less powerful GPUs in your rig handle lighter, more appropriate work. If you are interested in learning more about SLI, take a look at the relevant page on the NVIDIA website.

UPDATE 26/03/2018: after sharing this post with the Octane Render group on Facebook, a few interesting comments came up that we thought we'd add here.

James Hibbert said, “Just for clarification, this article is talking a lot about SLI, and using SLI bridges; you do not need any of that for rendering with Octane using multiple GPUs.” But then added, “IF you are using UE4, then yes, you will probably want SLI, if in the context of your project it actually gives you some benefit. That is not always a given with raster rendering. However, with Octane, your speed scales 1:1 with the number of GPUs you have.”

James Hibbert, just an aside: every PC tech guru seems to agree on one thing. For games, at least the vast majority of them, a gamer is better off getting the fastest single GPU they can afford, rather than getting two slower/cheaper cards and running them in SLI/Crossfire. For Octane and Redshift, you simply need as many GPUs as you can afford.

Just remember, multi-GPU and SLI are not the same thing. SLI is a specific technology from NVIDIA. Octane does not use SLI; Octane uses multi-GPU (not sure exactly which flavor there is, but your motherboard does it on its own with the help of the OS).

There is a difference.

Now there is another form of multi-GPU from NVIDIA called NVLink. NVLink is similar to SLI, but allows you to do things like stack GPU memory, so if you have 4 GPUs with 11 GB of VRAM each, you will have a total of 44 GB of VRAM, whereas all other forms would still leave you with the original 11 GB. Keep in mind that NVLink is not available on consumer GPUs; you need to use Quadro or Tesla cards to use it.

Hopefully that will change with the next line of consumer GPUs from NVIDIA. SLI support from NVIDIA has dropped off quite a bit, to the point where they only support 2-way SLI officially. I kind of suspect that they will either drop SLI altogether or migrate everything to NVLink in future products. Because of the ray-traced UE4 demo, UE4 will feature support for NVLink in a future build, since they had to link multiple GPUs to get it to run in real time.”
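The VRAM-pooling point in the comment above comes down to simple arithmetic: only NVLink lets the GPUs address each other's memory, while SLI and plain multi-GPU leave each card limited to its own VRAM. A minimal sketch (the function name is our own, for illustration):

```python
def pooled_vram_gb(num_gpus, vram_per_gpu_gb, nvlink=False):
    """Addressable VRAM for a render job.
    NVLink can pool memory across GPUs; SLI and plain multi-GPU
    cannot, so each GPU is still limited to its own VRAM."""
    if nvlink:
        return num_gpus * vram_per_gpu_gb
    return vram_per_gpu_gb

print(pooled_vram_gb(4, 11, nvlink=True))   # 44 GB addressable with NVLink
print(pooled_vram_gb(4, 11, nvlink=False))  # still 11 GB per GPU without pooling
```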

Netflix only accepts 4K footage for Original Productions

On Set Facilities On Set Colour Grading

A sign of our times: Netflix has firmly stated that it will only use 4K cameras for its original productions, which has been a source of frustration for some cinematographers, especially those who use ARRI gear and work on Netflix projects.

Post-production pipelines must also meet Netflix Original Productions' 4K delivery requirements.

Regarding camera hardware, Netflix says, “The ARRI Alexa and Amira are fantastic cameras, and we stream plenty of content that was captured with these cameras. However, since these cameras do not have true 4K sensors, we cannot accept them for our 4K original productions.”

Netflix explains, “For those who pay a premium for our UHD 4K service, we only deliver content that was shot and delivered at a true UHD 4K resolution.”

Ultra HD: 4K UHD (2160p) has a resolution of 3840 × 2160 pixels (8.3 megapixels, 16:9 aspect ratio) and is one of the two ultra-high-definition television resolutions targeted at consumers, the other being 8K UHD at 7680 × 4320 pixels (33.2 megapixels).

On-set Facilities Real-time VFX Compositing Set-ups.

We have had an amazing response to our team's work in real-time compositing and live on-set VFX.

In this post, I will try to tell you more about how we build our real-time VFX set-ups. If you have any questions, please use our contact form and one of the team will get right back to you. [UPDATE: We have just set up our Facebook Page; like it to keep up with new developments.]

This is a great previz tool: it lets us record, in real time, the foreground (FG), the background (BG), and the matte.

On-set Facilities real-time VFX systems are built by OSF teams in the UK and Madrid, using virtual 360º 3D environments, camera tracking, hardware compositing tools, colour-correction software, and game engines.

Realtime VFX Compositing 2nd Test

Various manufacturers provide the vital technology (listed below), and OSF is now working closely with developers and manufacturers to push the boundaries between CGI, VFX, 3D, mixed reality, film, and games.

“So far we have created a great tool for pre-visualisation and real-time VFX. It allows us to record, in real time, live-action foregrounds (with or without live-action sets, using real props and actors) and digitally bake them within HD, 2K, 360º, and 3D virtual worlds.”

The camera-tracking information is captured along with zoom and focus data in FBX; it can then be passed directly to NUKE for on-set compositing, or saved to be manipulated later.
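To make the data flow concrete, here is a minimal Python sketch of the kind of per-frame record a tracked take produces. The field names and layout are our own illustration, not the actual FBX schema (in the real pipeline each channel travels as an FBX animation curve):

```python
from dataclasses import dataclass

@dataclass
class CameraSample:
    """One frame of tracked-camera metadata (hypothetical layout)."""
    frame: int
    translation: tuple  # (x, y, z) position in scene units
    rotation: tuple     # (pan, tilt, roll) in degrees
    zoom_mm: float      # lens focal length
    focus_m: float      # focus distance in metres

# A two-frame excerpt from a hypothetical take:
take = [
    CameraSample(1, (0.00, 1.60, 3.00), (0.0, -2.5, 0.0), 35.0, 2.4),
    CameraSample(2, (0.05, 1.60, 2.98), (0.4, -2.5, 0.0), 35.0, 2.4),
]
print(f"Recorded {len(take)} samples; zoom starts at {take[0].zoom_mm} mm")
```

Because zoom and focus are recorded per frame alongside the transform, a compositor can rebuild the exact lens state for any frame of the take.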

Here is a list of the technologies you’ll need to explore if you want to build your own real-time compositing set-up and add live VFX to your next production.

We bake in UE4, we comp in NUKE, and we create 3D environments, animated characters, and elements in 3D Studio Max. Some of the hardware is OSF's own, as are the configuration tools and methods. You can always talk to us if you need us.

Live VFX Octane Render

This is another test, with a scene made by Sungwoo Lee; it uses Octane Render in Unity and is then baked into UE4. We are all now waiting for Octane Render for UE4 and expect a release in the near future. Thanks to Sungwoo for this test.