Controlling Robots from Inside Unreal Engine
A world-first look at OSF director of virtual production Asa Bailey using the Mo-Sys Engineering team's new code for controlling the Mo-Sys Lambda remote head with roll axis (Gravity, Life of Pi) from inside Unreal Engine 4.22.
Usually in virtual production, when we think about integrating technologies with UE4, we think about sending data from the real world into the virtual world of the game engine, like rigging up a motion capture suit to UE to capture real-time performance data. But in this example of integration we see UE4 sending data back out to a robot that's very much in the real world.
What the team at Mo-Sys Engineering has done is take in-engine user inputs, usually used for interaction within the engine, and turn them on their head, so that the user's interactions now feed out to the robot. The robot can, in effect, be absolutely anywhere with a data connection. It could be in any country: the data not only ports out and drives the robot, but because it travels over low-latency IP networking, one UE4 user could in theory control any number of robots, in synchronised movements, anywhere.
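To make that one-to-many idea concrete, here is a minimal sketch of the pattern in Python. The wire format, field layout, and port numbers are my own invention for illustration; the actual Mo-Sys protocol is not public here. The point is simply that a connectionless transport like UDP lets a single controller fan the same pose command out to any number of heads per engine tick.

```python
import socket
import struct

# Hypothetical wire format, for illustration only: pan, tilt and roll
# packed as three network-byte-order floats (radians).
COMMAND_FORMAT = "!3f"

def pack_command(pan: float, tilt: float, roll: float) -> bytes:
    """Serialise one head pose command into a fixed-size UDP payload."""
    return struct.pack(COMMAND_FORMAT, pan, tilt, roll)

def broadcast_pose(pose: bytes, robots: list) -> None:
    """Send the same pose packet to every robot endpoint in the list.

    UDP is connectionless, so one engine tick can push an identical
    command to many heads with minimal added latency.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for addr in robots:
            sock.sendto(pose, addr)
    finally:
        sock.close()

# Example: one controller driving two (imaginary) heads in sync.
packet = pack_command(0.25, -0.10, 0.0)
broadcast_pose(packet, [("127.0.0.1", 9000), ("127.0.0.1", 9001)])
```

In a real deployment the receiving side would of course validate, smooth, and safety-limit the incoming poses before moving hardware.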
Florian Gallier, mechatronics engineer (specialising in embedded systems) at Mo-Sys, said: "A solution that allows you to control the Mo-Sys Lambda head remotely from any location inside Unreal Engine, synchronising virtual cameras in Unreal Engine with motion controlled cameras on set with zero latency. When you start to think about all the possibilities you realise that it's a game-changer for filmmakers who need real-time results."
So now let's put this development into an on-set use case for virtual production. For starters, you could take any virtual camera in UE4, add movement (robot) parameters to it, use it to pre-vis your shots, then simply export the data to the robotic head on set.
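The pre-vis-to-set handoff described above might be sketched like this (the sample structure and CSV layout are my own illustrative choices, not Mo-Sys's actual export format): each tick of the pre-vis session appends a timestamped head pose, and the finished move is written out for the robotic head to replay on set.

```python
import csv
import io

def record_sample(samples, t, pan, tilt, roll):
    """Append one timestamped head pose captured from the virtual camera."""
    samples.append({"t": t, "pan": pan, "tilt": tilt, "roll": roll})

def export_move(samples):
    """Serialise a recorded camera move as CSV for on-set playback."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["t", "pan", "tilt", "roll"])
    writer.writeheader()
    writer.writerows(samples)
    return out.getvalue()

# Pre-vis a simple pan sampled at 24 fps.
move = []
for frame in range(3):
    record_sample(move, t=frame / 24.0, pan=frame * 0.05, tilt=0.0, roll=0.0)
print(export_move(move))
```

Because the move is just timestamped data, the same file can be replayed as many times, and on as many heads, as the shoot requires.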
James Uren, technical director of virtual production at Mo-Sys, stated: "In addition to Mo-Sys Lambda being a robust, heavy duty remote head, it doubles up as a virtual camera thanks to its precise tracking data. Along with our Mo-Sys tracking plugin on the Unreal Marketplace, this solution is great for VFX filmmakers and real-time workflows."
But as an '80s gamer boy, I want to control a robot camera man using a game pad/controller in UE4, and so this will be our next trick. When I got hold of this tech, I asked the team at Mo-Sys if they could hook a game controller up to my favourite virtual production rig setup.
My go-to VP rig consists of a semi-circular track, a Panther Dolly and my Alexa Mini. But after seeing this spinning robot remote head (it's the same one they used throughout the making of the film Gravity), I've now asked them to replace my Panther Dolly with a robotic rig on my track that I'll be able to drive with a game pad from behind my virtual production deck.
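As a sketch of what that gamepad control involves, here is the kind of axis-to-rate mapping such a setup would need. This is my own illustrative Python, not Mo-Sys code; in a real UE4 project this logic would sit in the engine's input handling. The deadzone and response-curve values are assumptions.

```python
def stick_to_rate(axis: float, max_rate_deg: float = 60.0,
                  deadzone: float = 0.1) -> float:
    """Map a gamepad stick axis (-1.0..1.0) to a head rate in deg/s.

    A small deadzone keeps the head still when the stick is at rest,
    and squaring the input gives finer control near centre.
    """
    axis = max(-1.0, min(1.0, axis))  # clamp noisy input
    if abs(axis) < deadzone:
        return 0.0
    sign = 1.0 if axis > 0 else -1.0
    # Rescale the remaining travel so full deflection still hits max rate.
    scaled = (abs(axis) - deadzone) / (1.0 - deadzone)
    return sign * (scaled ** 2) * max_rate_deg

# Full deflection drives a full-speed pan; a resting stick holds still.
print(stick_to_rate(1.0))   # 60.0
print(stick_to_rate(0.05))  # 0.0
```

The squared response curve is a common operator-comfort choice: slow, precise drifts near centre stick, fast whips at the extremes.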
Robotic camera operators don't get tired. I can record and replay any camera move over and over again in perfect repetition. I can synchronise multiple cameras to pull the same move. I can even control cameras remotely in any part of the studio, in another studio, or on sets in different studios, even different countries.
But let’s just get back to the point of going from Unreal to Real.
I can also see this technology in other areas of life. What about science, medicine and manufacturing? Think of simulated or augmented applications, like a visual application in UE of an oil rig, rendered in real time with live data telling the user where the faults are. The user could sit somewhere safe, working on the virtual simulation, while the robot worked out in the field in harmony with them, fixing issues in what could be a very unfriendly, dangerous environment.
After writing and reading the above, my arty-farty use of these technologies seems a bit, well, less important than saving human lives. But heck, I'm a robot-obsessed filmmaker, so that's what I'll do: with the help of Mo-Sys, I'll get back to hacking these new developments into my virtual production workflows.