Unreal Engine Real Time Digital Character Animation

File under creating content for AI advertising platforms.

To meet the demands of AI-driven advertising platforms for monster.com, The Mill turned to real-time production methods. [AI advertising platforms use deep learning to optimise ad creative and placement, and they require a large number of creative options to get going.]

Real-time rendered characters

How to produce 35 ready-to-fly creative ads in one day.

The team used real-time production methods, using Unreal Engine to make content en masse, in this case around 35 creative assets. Because the project required real-time rendering and capture at broadcast quality, real-time production was the only answer. Adopting real-time production methods gave the editors a mass of rushes to take away and start cutting right away, all delivered back to the agency and on to those hungry AI-driven advertising platforms.

Best Practices for Live 360 Video Production

Imeve CEO Devon Copley recently gave this keynote presentation on Best Practices for Live 360 Video, as part of the IVRPA 2018 conference in Japan. In this video, Devon shares hard-won, real-world tips on how to plan and run a successful live virtual reality production. He covers issues like camera choice, camera position, client management, encoding options, and much more. Whether you’re new to live 360 or have dozens of productions under your belt, there’s something here among these best practices for you. Check out the video below:

Best Computers for VR Production

On Set Facilities

Most people don’t realise that VR games require around seven times the graphics power of normal 3D games. This is because the graphics card has to deliver two separate high-resolution images, one to each eye, at 90 frames per second.
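As a rough sanity check on that figure, you can compare raw pixel throughput. The numbers below are illustrative assumptions, not measurements: a 1080p game at 30 fps as the baseline, and a Rift-class headset with a 1080×1200 panel per eye, rendered with roughly 1.4× supersampling per axis at 90 fps.

```python
# Rough pixel-throughput comparison behind the "~7x" VR figure.
# All numbers are illustrative assumptions: 1080p/30fps desktop baseline,
# Rift-class headset (1080x1200 per eye), 1.4x supersampling per axis, 90 fps.

def pixels_per_second(width, height, fps, eyes=1, supersample=1.0):
    """Pixels the GPU must shade per second for a given render target."""
    return eyes * (width * supersample) * (height * supersample) * fps

desktop = pixels_per_second(1920, 1080, fps=30)   # ~62 million pixels/s
vr = pixels_per_second(1080, 1200, fps=90,
                       eyes=2, supersample=1.4)   # ~457 million pixels/s

print(vr / desktop)  # roughly 7x
```

Under these assumptions the ratio comes out at a little over seven, which is where the often-quoted figure comes from; against a 60 fps desktop baseline the gap is correspondingly smaller.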

Want to build the metaverse? In this post we take a look at the best specifications for VR development workstations and what you’ll need.

The Best GPU for VR Production:


Foundry Trends Interviews OSF Founder Asa Bailey

Originally posted on foundry.com

Asa Bailey is a screenwriter, producer, director and novelist – and somehow, in between all of that, he had time to found On-Set Facilities, a British technology company with ambitions of becoming the world’s leading system integrator of real-time and virtual production solutions. Foundry Trends caught up with Asa to discuss his career, his real-time philosophy and his views on the sector.

Foundry Trends (FT): You’ve had a varied career up to this point, Asa. How did you end up setting up a service production company?

Asa Bailey (AB): I started out in advertising and design, but began making viral videos. This was all before YouTube, but even so my work was achieving around 20 million views. Back then there was no such thing as pre- or post-production – we just didn’t have the budgets. We had to learn how to shortcut things.

At the start I was scrabbling round on budgets of $5,000, but at the height of that period I was directing the likes of Jackie Chan in Hong Kong, on multi-million-dollar projects. The decision to set up On-Set Facilities (OSF, originally called Mum & Dad) came after traveling around the world on these projects, and constantly wanting to find a way to optimise the pipeline.

FT: How did you move into real-time production work?

AB: We’d been doing the on-set side of things – editing, colour-grading, the IT side of things – from day one. Our work with real-time has come about slowly as the technology has become available, from GPUs to rendering – it wasn’t something we came across and thought: “oh, that’s clever!” It grew from the company philosophy of doing everything faster, while squeezing out everything we can for greater optimisation.

That philosophy has served us well so far. From setting up the company with my wife, we now have a studio in Mexico and we’re leading real-time production development at Pinewood Studios.

FT: What makes you stand out in your field?

AB: I don’t think anyone can put us in a box, and if they tried to they’d be wrong. We firmly believe that all good technology needs to have mass-adoption, so we’re totally against proprietary systems within real-time production. That’s not the way to go, especially as there are so many levels and usages within the space.

What we’ve been doing is pulling technologies from all sorts of fields – from broadcast to game development. I think our heritage is pretty unique, as we’re from a background of digital optimisation, and our outlook is maybe slightly more “Silicon Valley” than our peers.

FT: What excites you most about the real-time production space?

AB: “Real-time production” itself isn’t something that can be entirely put into one box and sold as “you must buy this”. In fact, that’s been one of the biggest hurdles to mass adoption.

This space is a mad, crazy place to be – with so many different solutions and technologies coming up all the time. The wave that’s in progress has completely overtaken the companies that have tried and failed to create a one-size-fits-all, black-box solution. People don’t want that. They want to have access to the technology, the pipeline or the people – so they can do it themselves.

FT: You’ve just hinted at a hurdle to mass adoption there. Is there anything else that’s holding real-time production back?

AB: I’d say the biggest hurdle is with the producers. The tech guys like myself can see the way the industry’s going, and so can the rest of the VFX industry. The problem is at the business end. The film industry has a long heritage of doing things just because they’ve been done for a long time – when producers and investors are used to a certain type of production, they’re more receptive to it.

Of course, real-time involves a lot of pre-production. That means diverting funds that would usually be spent in post. I think we need to educate some of the financiers in the film industry that the old model is changing, but that they should embrace it.

FT: What would that look like? Could it be as simple as consistently producing great work using real-time?

AB: Pretty much. The one thing is that people think there’s an immediate cost-benefit. There is, of course, but – more than that – it’s about creating more opportunities for filmmakers. It’s about giving them more options to be as creative as possible.

FT: Where does OSF go from here?

AB: Directing is a lifelong career, but I’m really lucky in that I got the majority of the creative monkeys off my back early on in my career. Now, with OSF, we can just help people. We want to continue to work with directors and help bring their vision to life through real-time production.

Real-time is for everyone – that’s the main message we want to spread by working hand-in-hand with others.

V-Ray Plug-in for Unreal Engine (UE4) in Beta

Chaos Group has just launched the beta of V-Ray for Unreal, the new version of its V-Ray renderer for Unreal Engine.

The product makes it possible both to import V-Ray scenes into Unreal Engine to use as the basis of immersive real-time experiences and to generate ray-traced renders within UE4.

Import V-Ray scenes created in 3ds Max, Maya and SketchUp into Unreal Engine

V-Ray for Unreal makes it possible to import scenes created in other editions of V-Ray – at the minute, 3ds Max, Maya and SketchUp – into Unreal Engine. The process automatically converts V-Ray lights and materials into their real-time equivalents.

(Epic Games’ own Unreal Studio – itself just released in beta – does something similar, but only for 3ds Max scenes with V-Ray materials.)

Adjust materials and lighting in UE4 in real time, then generate a ray-traced render

As well as using a V-Ray scene as the basis for a conventional Unreal Engine project, users can also use UE4 as an environment in which to make changes to materials and lighting in real time.

The changes are then propagated back to V-Ray; or you can generate a ray traced render within UE4 itself.

The system is intended to make it possible to create both offline and real-time content with a single unified workflow, particularly for visualisation projects.

Pricing and availability – V-Ray for Unreal is currently in closed beta. You can apply to join the beta program here; to do so, you will need to have registered for an account on Chaos Group’s website, which is free.

So far, Chaos Group hasn’t announced any details of the commercial release date or pricing.

Read more about V-Ray for Unreal on Chaos Group’s website

Boost Real-Time VFX in Unreal Engine (UE4) with Multiple GPUs Using SLI

GPU Render Performance

Scalable Link Interface (SLI) is a multi-GPU configuration that offers increased rendering performance by dividing the workload across multiple GPUs.

Since UE 4.15, Unreal Engine has been able to take advantage of machines and servers with multiple GPUs, so long as the GPUs and system are compatible with SLI.

Real-Time Ray Tracing Demo
Real-time VFX ray tracing demo at GDC 2018 by Unreal Engine, ILMxLAB and Nvidia.

To take advantage of SLI, the system must use an SLI-certified motherboard. Such motherboards have multiple PCI-Express x16 slots and are specifically engineered for SLI configurations.

Building Multiple GPU machines

To create a multi-GPU SLI configuration, [NVIDIA] GPUs must be attached to at least two of these slots, and then these GPUs must be linked using external SLI bridge connectors.

Once the hardware is configured for SLI, and the driver is properly installed for all the GPUs, SLI rendering must be enabled in the NVIDIA control panel. At this point, the driver can treat both GPUs as one logical device, and divide rendering workload automatically depending on the selected mode.

SLI rendering modes include:

  • Alternate Frame Rendering (AFR)
  • Split Frame Rendering (SFR)
  • Boost Performance Hybrid SLI
  • Compatibility mode

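As an illustrative sketch (not UE4 or driver code), the two main modes divide work in different ways: Alternate Frame Rendering hands out whole frames to GPUs round-robin, while Split Frame Rendering cuts each frame spatially across the GPUs.

```python
def afr_assignment(num_frames, num_gpus):
    """Alternate Frame Rendering: whole frames alternate between GPUs."""
    return [frame % num_gpus for frame in range(num_frames)]

def sfr_split(height, num_gpus):
    """Split Frame Rendering: each frame is cut into horizontal bands,
    one per GPU (real drivers rebalance band sizes dynamically)."""
    band = height // num_gpus
    return [(gpu * band, (gpu + 1) * band) for gpu in range(num_gpus)]

print(afr_assignment(6, 2))   # [0, 1, 0, 1, 0, 1]
print(sfr_split(1080, 2))     # [(0, 540), (540, 1080)]
```

In practice the driver performs this division transparently once SLI is enabled; applications see a single logical GPU.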
If you are building a multiple-GPU system with GPUs of different capabilities, say a Titan X and a couple of Quadros, you can utilise the SLI Compatibility mode. This mode enables UE4 to push rendering tasks to the most suitable GPU in your setup: heavier tasks go to the more powerful GPU, while the less powerful GPUs in your rig handle the lighter, more appropriate tasks. If you are interested in understanding more about SLI, take a look at the following page on the Nvidia website.

UPDATE 26/03/2018: after posting this article to the Octane Render group on Facebook, a few interesting comments came up that we thought we’d add to this post.

James Hibbert said: “Just for clarification, this article is talking a lot about SLI and using SLI bridges; you do not need any of that for rendering with Octane using multiple GPUs.” But then added: “IF you are using UE4, then yes, you will probably want SLI, if in the context of your project it actually gives you some benefit. That is not always a given with raster rendering. However, with Octane, your speed scales 1:1 with the number of GPUs you have.”

James Hibbert, just an aside: every PC tech guru seems to agree on one thing. For games, at least the vast majority of them, a gamer is better off getting the fastest single GPU they can afford, rather than getting two slower/cheaper cards and running them in SLI/Crossfire. For Octane and Redshift, you simply need as many GPUs as you can afford.

Just remember: multi-GPU and SLI are not the same thing. SLI is a specific technology from Nvidia. Octane does not use SLI; Octane uses multi-GPU (not sure exactly which flavour, but your motherboard does it on its own with the help of the OS).

There is a difference.

Now there is another form of multi-GPU from Nvidia called NVLink. NVLink is similar to SLI, but allows you to do things like stack GPU memory, so if you have 4 GPUs with 11 GB of VRAM each, you will have a total of 44 GB of VRAM, whereas all other forms would still leave you with the original 11 GB. Keep in mind that NVLink is not available on consumer GPUs; you need to use Quadro or Tesla cards to use it.

Hopefully that will change with the next line of consumer GPUs from Nvidia. SLI support from Nvidia has dropped off quite a bit, to the point where they only officially support 2-way SLI. I kinda suspect that they will either drop SLI altogether or migrate everything to NVLink in future products. Because of the ray-traced UE4 demo, UE4 will feature support for NVLink in a future build, since they needed to link multiple GPUs to get it to run in real time.”
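The VRAM arithmetic in the comment above can be sketched as follows; the 4 × 11 GB configuration is the commenter’s example, and the function names are purely illustrative.

```python
def usable_vram_gb(num_gpus, vram_per_gpu_gb, nvlink_pooling=False):
    """With NVLink memory pooling, VRAM can sum across GPUs; with
    SLI-style mirroring, every GPU holds a full copy of the scene,
    so usable VRAM stays at one card's worth."""
    return num_gpus * vram_per_gpu_gb if nvlink_pooling else vram_per_gpu_gb

print(usable_vram_gb(4, 11, nvlink_pooling=True))   # 44
print(usable_vram_gb(4, 11))                        # 11
```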