On-set Facilities Real-time VFX Compositing Set-ups
We have had an amazing response to our work in real-time compositing and live on-set VFX with our RT3 virtual production workstations.
This post was updated on 2 November 2019. On-set Facilities Real-time VFX systems are built in the UK, integrating advanced computers optimised for Unreal Engine with virtual camera tracking, real-time compositing, motion capture, multi-layer recording and data-asset management systems. Combined, these technologies create a system capable of producing real-time film, animation and immersive content, as well as recording on-set performance data for post-production pipelines.
OSF build turn-key virtual production systems, with a worldwide studio set-up service and support.
“So far we have created a great tool for pre-visualisation and real-time VFX. It allows us to record live-action foregrounds in real time (with or without live-action sets, using real props and actors) and digitally bake them within HD, 2K, 360º, and 3D Virtual Worlds.”
Virtual Camera Tracking for Virtual Production
Connecting your on-set camera to the virtual camera in the game engine is a key factor in real-time compositing and virtual mixed-reality production workflows. To do this we incorporate various camera-tracking technologies in our systems; which one depends on the use case, but most often we supply either Ncam or Mo-Sys StarTracker. We have found these to be the best camera-tracking technologies for our clients' on-location and in-studio requirements. The two systems are chalk and cheese, using very different patented technologies: the essential difference is that Ncam tracks visually via stereoscopic witness cameras, while the Mo-Sys uses an upward-looking sensor to track a fixed constellation of retro-reflective stickers, or 'stars', on the ceiling.
With either system the camera motion data is captured and recorded in two places: once onboard the tracking-system hardware, and again inside Unreal Engine as a camera track in a Sequencer project. Using the FBX standard, camera motion animations can then be passed down the post-production pipeline to, say, NUKE, or back into Maya, where virtual/CGI backgrounds can be recreated using traditional render methods.
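As a rough illustration of how a recorded camera track travels down the pipeline, here is a minimal Python sketch that writes per-frame camera transforms to a Nuke-readable `.chan` file (a simple text alternative to FBX: one line per frame with frame number, translation, rotation and vertical FOV). The keyframe values below are invented for illustration; in practice they would come from the tracking hardware or a Sequencer export.

```python
# Sketch: dump a recorded virtual-camera track to a Nuke-readable .chan
# file. Each line is: frame tx ty tz rx ry rz vfov.
# NOTE: the keyframe data here is hypothetical example data.

def write_chan(path, keyframes):
    """keyframes: list of (frame, tx, ty, tz, rx, ry, rz, vfov) tuples."""
    with open(path, "w") as f:
        for kf in keyframes:
            # frame number as an int, everything else to 6 decimal places
            f.write(" ".join(f"{v:.6f}" if i else str(v)
                             for i, v in enumerate(kf)) + "\n")

keys = [
    (1, 0.0, 1.6, 5.0, 0.0, 0.0, 0.0, 36.0),   # frame 1
    (2, 0.1, 1.6, 4.9, 0.0, 1.5, 0.0, 36.0),   # frame 2: small dolly + pan
]
write_chan("camera_track.chan", keys)
```

In NUKE this file can then be loaded onto a Camera node's transform curves, recreating the on-set move.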
Realtime Compositing Hardware
Real-time chromakey using hardware and software.
In our systems, real-time compositing can take place in one of two places. The first option is hardware such as a Blackmagic Ultimatte 12, which will give you fast 12G-SDI, 10-bit 4:2:2, Rec. 2020 masks in real time. To get the chromakey mask from the hardware and into our virtual production workstations, we equip our on-set machines with I/O video cards from AJA and/or Blackmagic.
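To make the keying step concrete, here is a minimal software sketch of the same idea in Python/NumPy: a simple "green minus max(red, blue)" keyer that turns a green-screen frame into a matte. This is not how the Ultimatte works internally, just an illustrative matte extraction on a tiny two-pixel frame, with an arbitrarily chosen gain.

```python
import numpy as np

def green_key_matte(rgb, gain=2.0):
    """Simple 'green minus max(red, blue)' keyer.

    Returns a float matte in [0, 1]: 1.0 = keep foreground,
    0.0 = green screen (transparent). gain is an assumed tuning value.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    spill = g - np.maximum(r, b)            # how 'green' each pixel is
    return np.clip(1.0 - gain * spill, 0.0, 1.0)

# Hypothetical 1x2-pixel frame: one green-screen pixel, one skin-tone pixel
frame = np.array([[[0.1, 0.9, 0.1],
                   [0.8, 0.7, 0.6]]])
matte = green_key_matte(frame)              # → [[0.0, 1.0]]
```

Dedicated keyer hardware does far more (edge blending, spill suppression, shadow retention), but the core signal it delivers is this kind of per-pixel matte.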
Compositing in Unreal Engine
Alternatively, you can build a pipeline around real-time software compositing, but this method is challenging and requires an advanced understanding of specialist software applications and plug-ins. When we install a system we provide training on various proprietary virtual production plug-ins for Unreal Engine; again, which plug-ins we use depends on the use case of the system.
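Whichever route you take, the core operation an engine-side composite performs is a standard "over" blend: foreground where the matte is opaque, virtual background where it is transparent, a linear mix in between. A minimal NumPy sketch with invented one-by-two-pixel layers:

```python
import numpy as np

def comp_over(fg, bg, matte):
    """Standard 'over' blend: fg where matte=1, bg where matte=0,
    and a linear mix in between."""
    a = matte[..., None]                    # broadcast matte across RGB
    return fg * a + bg * (1.0 - a)

# Hypothetical flat-colour layers for illustration
fg = np.full((1, 2, 3), 0.8)                # live-action foreground layer
bg = np.full((1, 2, 3), 0.2)                # CG background layer
m  = np.array([[1.0, 0.0]])                 # hard matte: keep pixel 0 only
out = comp_over(fg, bg, m)
```

Compositing inside the engine means this blend happens after the virtual scene is lit and rendered, which is why engine lighting can visibly influence the keyed subject.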
Back at our workshop in North Wales we test many virtual production technologies, both hardware and software. The video below, shot on an iPhone in the workshop, shows one of our team standing in a virtual set while we test how bright highlights created by the set lights interact with and affect the composited actor layer. As you can see, when you composite in the engine you get much more 'interaction' between the real subject and the virtual light generated in the virtual scene.
Learn about real-time virtual production systems and on-set VFX for content production, virtual sets, mixed reality, VR/AR, real-time animation and motion capture. If you have any questions please use our contact form. [UPDATE: We have just set up our Facebook page; like it to keep up with new developments.]