We have had an amazing response to our team's work in real-time compositing and live on-set VFX.
In this post, I'll tell you more about how we build our real-time VFX set-ups. If you have any questions, please use our contact form and one of the team will get right back to you. [UPDATE: We have just set up our Facebook Page; like it to keep up with new developments.]
On-set Facilities' real-time VFX systems are built by OSF teams in the UK and Madrid using virtual 360º 3D environments, camera tracking, hardware compositing tools, color correction software, and game engines.
Various manufacturers provide the vital technology (listed below), and OSF is now working closely with developers and manufacturers to push the boundaries between CGI, VFX, 3D, Mixed Reality, film, and games.
“So far we have created a great tool for pre-visualisation and real-time VFX. It allows us to record live-action foregrounds in real time (with or without live-action sets, using real props and actors) and digitally bake them within HD, 2K, 360º, and 3D Virtual Worlds.”
The camera tracking information is captured along with zoom and focus data in FBX; it can then be passed directly to NUKE for on-set compositing, or saved to be manipulated later.
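The FBX file itself is written by the tracking system, but the kind of per-frame data it carries can be sketched in a few lines. The `CameraSample` structure below, with its field names and example values, is purely our illustration of the idea (it is not OSF's format or the FBX schema): each frame pairs the camera transform with the lens encoder's zoom and focus readings, so a take can be replayed in the compositor later.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class CameraSample:
    """One per-frame tracking sample: camera transform plus lens data."""
    frame: int
    translation: tuple  # (x, y, z) in scene units
    rotation: tuple     # (rx, ry, rz) Euler angles, in degrees
    zoom_mm: float      # focal length reported by the lens encoder
    focus_m: float      # focus distance reported by the lens encoder

def samples_to_json(samples):
    """Serialise a take so it can be inspected or re-imported later."""
    return json.dumps([asdict(s) for s in samples], indent=2)

# A two-frame example take (all values are illustrative only).
take = [
    CameraSample(1001, (0.0, 1.6, 5.0), (0.0, 180.0, 0.0), 35.0, 4.2),
    CameraSample(1002, (0.1, 1.6, 4.9), (0.0, 179.5, 0.0), 35.0, 4.1),
]
print(samples_to_json(take))
```

In a real pipeline these samples live on an animated camera node inside the FBX, which NUKE can read directly, but the principle is the same: transform and lens data, keyed per frame.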
Here is a list of the technologies you’ll need to explore if you want to build your own real-time compositing set-up and add live VFX to your next production.
We bake in UE4, we comp in NUKE, and we create 3D environments, animated characters, and elements in 3D Studio Max. Some hardware is OSF's own, as are the configuration tools and methods. You can always talk to us if you need us.
This is another test, with a scene made by Sungwoo Lee. It uses Octane Render in Unity and was then baked into UE4. We are all now waiting for Octane Render for UE4 and expect a release in the near future. Thanks to Sungwoo for this test.