AI for VFX and what it means for you

AI VFX on-set

Artificial Intelligence is set to change the way VFX is approached and produced. Some of the biggest names in the industry, such as Digital Domain, have been discussing the various forms of AI and asking how they can be integrated with VFX.


Last year at SIGGRAPH there was a series of key talks and panels discussing deep learning, with examples of convolutional neural networks, generative adversarial networks and autoencoders. The panels discussed how deep learning and convolutional neural networks could benefit the VFX and 3D design industries, with applications ranging from fluid simulation and image denoising to character animation, facial animation and texture creation.
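As a rough illustration of the kind of network those talks described, here is a minimal sketch of a convolutional denoiser, assuming PyTorch; the layer sizes, names and training data are placeholder choices for illustration, not anything presented at SIGGRAPH.

import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    # Toy convolutional denoiser: noisy RGB frame in, cleaned RGB frame out.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, noisy):
        # Predict the residual noise and subtract it from the input frame.
        return noisy - self.net(noisy)

# One training step on a (noisy render, clean reference) pair of stand-in frames.
model = TinyDenoiser()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
noisy = torch.rand(1, 3, 128, 128)   # stand-in for a low-sample render
clean = torch.rand(1, 3, 128, 128)   # stand-in for a converged reference
optimiser.zero_grad()
loss = nn.L1Loss()(model(noisy), clean)
loss.backward()
optimiser.step()

In production, the same idea is trained on pairs of low-sample and fully converged renders, so the network learns to clean up noisy frames far faster than rendering them to convergence.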

The panel focused mainly on how these new tools will change VFX pipelines. Doug Roble described the technologies as “scary tools when seen for the first time”, but went on to say that “you can use these to do visual effects in a completely brand-new way”, pointing to the capabilities of the new machine learning techniques.

However, with these changes in technology there is no denying that some jobs will be displaced. But there will also be new openings, and a shift of jobs in the industry towards the visual effects and programming sector.

The use of machine learning within VFX raises the possibility of producing models without manual texturing, lighting and rendering, because the computer learns to handle these steps itself. This would fundamentally change the way the VFX pipeline works and massively decrease post-production times on films.

We are closer to having these opportunities because of the shift to a data-driven approach, rather than the earlier mathematical methods of programming and hand-tuned algorithms.

Motion control robot for previsualization

What does it mean for us?

With the abilities of AI and machine learning constantly developing, it won’t be long before AI changes the VFX pipeline entirely and possibly shifts it from post-production to pre-production, mirroring the shift towards virtual production methods and on-set graphics systems. If AI can add VFX to a scene as it is shot, the entire process becomes real-time, cutting out a large portion of post-production and reducing production times and costs.

There would still be a lot of VFX work, but it would be done before a shoot instead of afterwards, cutting turnaround times. Actors would benefit because they would no longer have to imagine the VFX; they could simply see them on a monitor and react appropriately. Producers would enjoy faster production times and lower costs. Everyone involved in the production would benefit.

 

Realtime VFX in Unreal Engine 4 Tutorial

realtime VFX tutorial
Unreal Engine's real-time volumetric rendering tools can be used to create seamless special effects that integrate perfectly with the action.

Realtime VFX using Unreal Engine (UE4) is pushing the boundaries of on-set realtime production. Here at On-set Facilities we build realtime VFX machines, sets and realtime production solutions optimised for UE4. Tutorial video after the jump:


Unreal Engine: Building & Optimizing Worlds for Real-Time

We work with many vendors on projects, so we thought it a good idea to share a bit of learning to make everyone’s life easier. This video focuses on creating content assets in your favourite DCC software, exporting them as FBX files, and the considerations when designing content in DCCs for VR engines (a sketch of a scripted export follows below). You won’t be surprised: small is beautiful, fast is best, file this under the art of code.
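As a hedged example of that export step, here is a minimal sketch using Blender’s Python API (bpy); the post itself works in 3D Studio Max, so treat this purely as an illustration, and the file path and settings are placeholders.

import bpy

# Export the currently selected objects as an FBX asset for a real-time engine.
bpy.ops.export_scene.fbx(
    filepath="/tmp/asset.fbx",            # placeholder output path
    use_selection=True,                   # export only the selected objects
    apply_scale_options='FBX_SCALE_ALL',  # bake scale so units match in-engine
    mesh_smooth_type='FACE',              # export face smoothing for clean normals
    bake_anim=False,                      # skip animation baking for static props
)

Keeping exports small and lean, as the video advises, usually matters more than any individual exporter setting.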

On-set Facilities Real-time VFX Compositing Set-ups.

We have had an amazing response to our team’s work in real-time compositing and live on-set VFX.

In this post, I will try to tell you more about how we build our real-time VFX set-ups. If you have any questions, please use our contact form and one of the team will get right back to you. [UPDATE: We have just set up our Facebook page; like it to keep up with new developments.]

This is a great previz tool, letting us record in real time the FG, the BG and the matte.

On-set Facilities’ real-time VFX systems are built by OSF teams in the UK and Madrid, using virtual 360º 3D environments, camera tracking, hardware compositing tools, color correction software, and game engines.

Realtime VFX Compositing 2nd Test

Various manufacturers provide the vital technology (listed below), and OSF is now working closely with developers and manufacturers to push the boundaries between CGI, VFX, 3D, Mixed Reality, film, and games.

“So far we have created a great tool for pre-visualisation and real-time VFX. It allows us to record, in real time, live-action foregrounds (with or without live-action sets, using real props and actors) and digitally bake them within HD, 2K, 360º, and 3D Virtual Worlds.”

The camera tracking information is captured in FBX along with zoom and focus data; it can then be passed directly to NUKE for on-set compositing, or saved to be manipulated later.
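As a sketch of that hand-off, assuming NUKE’s Python API and its FBX-aware Camera node, the tracked camera can be brought in like this; the file path and the camera name inside the FBX are placeholders.

import nuke

# Create a Camera node that reads its animation (position, rotation, focal length)
# from the tracked FBX file recorded on set.
cam = nuke.createNode('Camera2')
cam['read_from_file'].setValue(True)
cam['file'].setValue('/path/to/tracked_camera.fbx')   # placeholder path
cam['fbx_node_name'].setValue('TrackedCamera')        # placeholder node name in the FBX

From there the camera can drive a ScanlineRender of the CG elements so the comp lines up with the live-action plate.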

Here is a list of the technologies you’ll need to explore if you want to build your own real-time compositing set-up and add live VFX to your next production.

We bake in UE4, we comp in NUKE, and we create 3D environments, animated characters and elements in 3D Studio Max. Some of the hardware is OSF’s own, as are the configuration tools and methods. You can always talk to us if you need us.

Live VFX Octane Render

This is another test with a scene made by Sungwoo Lee. It uses Octane Render in Unity, then baked into UE4; we are all now waiting for Octane Render for UE4 and expect a release in the near future. Thanks to Sungwoo for this test.