Deep compositing without a green screen requires a volumetric, depth-based view of the scene. The Intel RealSense range of cameras is a consumer-level, mass-market product that lets creators combine VR sets with real-world images based on depth, so that elements such as background and foreground can be easily separated from the camera image, or more precisely, layered in the correct order so that the picture makes sense.
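The core idea of depth-based separation can be sketched in a few lines. This is a minimal illustration using a synthetic depth map and NumPy; a real rig would pull per-pixel depth from the RealSense camera via the librealsense SDK, and the threshold value here is purely an example.

```python
import numpy as np

# Synthetic stand-in for a depth frame, in metres. In production this
# array would come from a RealSense depth stream (librealsense SDK).
depth = np.array([
    [0.8, 0.9, 3.0],
    [0.7, 2.5, 3.1],
    [2.8, 2.9, 3.2],
])

# Anything nearer than the threshold is treated as foreground;
# the rest is background that can be replaced with a virtual set.
THRESHOLD_M = 1.0
foreground_mask = depth < THRESHOLD_M

print(foreground_mask.astype(int))
```

The resulting boolean mask is what the compositor uses to layer the live-action plate over (or behind) the virtual environment in the correct depth order.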
We explored the web for 6DoF (volumetric) video technologies and companies. As a virtual production system integrator, OSF brings 6DoF volumetric production technologies to locations, sets, and stages worldwide. In this post we take a look at what’s out there in the volumetric video, VR video, and 6DoF space.
Let’s explore 360° 6DOF VOLUMETRIC VIDEO
We’ll be updating this post as we discover new players in the space, but for now here’s our rundown of companies and technologies in volumetric video production.
We have had an amazing response to our team’s work in real-time compositing and live on-set VFX.
In this post, I will try to tell you more about how we build our real-time VFX set-ups. If you have any questions, please use our contact form and one of the team will get right back to you. [UPDATE: We have just set up our Facebook Page; like it to keep up with new developments.]
On-Set Facilities’ real-time VFX systems are built by OSF teams in the UK and Madrid, using virtual 360° 3D environments, camera tracking, hardware compositing tools, color-correction software, and game engines.
Various manufacturers provide the vital technology (listed below), and OSF is now working closely with developers and manufacturers to push the boundaries between CGI, VFX, 3D, mixed reality, film, and games.
“So far we have created a great tool for pre-visualisation and real-time VFX. It allows us to record live-action foregrounds in real time (with or without live-action sets, using real props and actors) and digitally bake them within HD, 2K, 360º, and 3D virtual worlds.”
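At the pixel level, layering a recorded live-action foreground into a virtual world comes down to the standard “over” operation. This is a minimal sketch using NumPy on a tiny single-channel image; the array values and sizes are illustrative only, not taken from the OSF pipeline.

```python
import numpy as np

def over(fg, alpha, bg):
    """Standard 'over' composite: foreground onto background using alpha."""
    return fg * alpha + bg * (1.0 - alpha)

# 2x2 single-channel example: white live-action foreground,
# black virtual background, with a per-pixel alpha matte.
fg = np.ones((2, 2))
bg = np.zeros((2, 2))
alpha = np.array([[1.0, 0.5],
                  [0.0, 1.0]])

comp = over(fg, alpha, bg)
print(comp)
```

In a depth-keyed set-up, the alpha matte itself would come from the depth mask rather than a green-screen key, but the layering math is the same.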
The camera tracking information is captured along with zoom and focus data in FBX; it can then be passed directly to NUKE for on-set compositing or saved to be manipulated later.
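Conceptually, a tracked camera is just a per-frame record of transform plus lens data. The sketch below shows one plausible in-memory representation; the field names are illustrative assumptions, not the actual FBX schema or the OSF tool’s data model.

```python
from dataclasses import dataclass

@dataclass
class CameraSample:
    """One tracked camera sample per frame (illustrative fields only)."""
    frame: int
    translation: tuple   # (x, y, z) in scene units
    rotation: tuple      # (rx, ry, rz) Euler angles, degrees
    focal_length_mm: float
    focus_distance_m: float

# A two-frame track: the camera moves slightly and racks focus.
track = [
    CameraSample(1, (0.0, 1.6, 0.0), (0.0, 0.0, 0.0), 35.0, 2.0),
    CameraSample(2, (0.1, 1.6, 0.0), (0.0, 1.0, 0.0), 35.0, 2.1),
]

def sample_at(track, frame):
    # Nearest-frame lookup; a real pipeline would interpolate sub-frame.
    return min(track, key=lambda s: abs(s.frame - frame))

print(sample_at(track, 2).focus_distance_m)
```

Serialised to FBX, data in this shape is what a compositor like NUKE reads back to line up CG elements with the recorded camera move.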
Here is a list of the technologies you’ll need to explore if you want to build your own real-time compositing set-up and add live VFX to your next production.
We bake in UE4, we comp in NUKE, and we create 3D environments, animated characters, and elements in 3D Studio Max. Some of the hardware is OSF’s own, as are the configuration tools and methods. You can always talk to us if you need us.
This is another test with a scene made by Sungwoo Lee; it was rendered with Octane Render in Unity and then baked into UE4. We are all now waiting for Octane Render for UE4 and expect a release in the near future. Thanks to Sungwoo for this test.