What Does Virtual Production Mean For Filmmakers?

Green Screen Virtual Sets
Unreal Engine’s look at what virtual production is

Virtual Production (VP) is changing the way we make films, television and digital content. Epic Games has registered uevirtualproduction.com and redirected it to a new portal that looks at how Unreal Engine is being used in real-time virtual production.

Virtual Stage Sets For Film & TV
Virtual production, mixing real props with virtual sets. Image courtesy of OSF Madrid.

Take a look at OSF’s work in virtual production

Virtual production falls into two camps. First, it is an advanced previs tool: a way for directors and crew to previsualise digital assets, including virtual sets, character animation and effects, on set with real actors and props.

The developing area of virtual production is real-time production, where live-action plates are composited with Unreal Engine-generated sets, effects and animation to create finished shots.
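To make the idea concrete, here is a minimal sketch of the core operation: keying a green-screen live-action plate and layering it over an engine-rendered background frame. This is an illustration only, not OSF’s or Unreal Engine’s actual pipeline; it assumes OpenCV and NumPy, and the file names are placeholders.

```python
# Minimal compositing sketch (illustration only, not OSF's actual pipeline).
# Keys a green-screen live-action plate and layers it over a background
# frame rendered by a game engine. File names below are placeholders.
import cv2
import numpy as np

def composite_frame(plate_bgr, background_bgr):
    """Chroma-key the green-screen plate over the CG background."""
    hsv = cv2.cvtColor(plate_bgr, cv2.COLOR_BGR2HSV)
    # Rough green range; a production keyer also handles spill and soft edges.
    key = cv2.inRange(hsv, np.array([35, 60, 60]), np.array([85, 255, 255]))
    alpha = (255 - key).astype(np.float32) / 255.0          # 1.0 = keep foreground
    alpha = cv2.GaussianBlur(alpha, (5, 5), 0)[..., None]   # soften the matte edge
    comp = plate_bgr * alpha + background_bgr * (1.0 - alpha)
    return comp.astype(np.uint8)

plate = cv2.imread("live_action_plate.png")    # green-screen frame from camera
background = cv2.imread("engine_render.png")   # frame rendered by the engine
cv2.imwrite("composite.png", composite_frame(plate, background))
```

In a real-time pipeline the same keying and layering happens per frame on the GPU, inside the engine or a dedicated keyer, rather than on still images.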

Virtual production demonstration for film

By adding on-set post-production and employing real-time virtual production methods, film and content production times, and therefore costs, can be reduced. Above are two examples produced in OSF studios. Find out more about OSF Virtual Production, Real-Time VFX and On-Set Post Production Facilities: click here.

Unreal Engine Real Time Digital Character Animation

File under: creating content for AI advertising platforms.

To meet the demands of AI-driven advertising platforms, monster.com and The Mill turned to real-time production methods. [AI advertising platforms use deep learning to optimise ad creative placement, and they require a large number of creative options to get going.]

Real-time rendered characters

How to produce 35 ready-to-fly creative ads in one day.

The team used real-time production methods, turning to Unreal Engine to make content en masse, in this case around 35 creative assets. With real-time renders and captures needed at broadcast quality, real-time production was the only answer. Adopting real-time production methods gave the editors a mass of rushes to take away and start cutting right away, ready to deliver back to the agency and on to those hungry AI-driven advertising platforms.

Best Practice for Live 360 Video Production

Imeve CEO Devon Copley recently gave this keynote presentation on Best Practices for Live 360 Video, as part of the IVRPA 2018 conference in Japan. In this video, Devon shares hard-won, real-world tips on how to plan and run a successful live virtual reality production. He covers issues like camera choice, camera position, client management, encoding options, and much more. Whether you’re new to live 360 or have dozens of productions under your belt, there’s something here among these best practices for you. Check out the video below:

Best Computers for VR Production

On Set Facilities

Most people don’t realise that VR games require around seven times the graphics power of standard 3D games. This is because the graphics card has to deliver two separate high-resolution images, one to each eye, at 90 frames per second; a rough calculation is sketched below.

Want to build the metaverse? In this post we take a look at the best specifications for VR development workstations and what you’ll need.
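As a rough back-of-envelope illustration of where a figure like “seven times” can come from, the sketch below compares pixel throughput for a 1080p game at 30 fps against a first-generation headset panel (2160×1200 across both eyes) at 90 fps with roughly 1.4× per-axis render-target supersampling. The baseline frame rate, panel resolution and supersampling factor are assumptions for illustration, not measurements.

```python
# Back-of-envelope pixel throughput comparison (assumed figures, not benchmarks).
# Baseline: a 1080p game at 30 fps.
# VR: a 2160x1200 combined panel at 90 fps with ~1.4x per-axis supersampling
# to offset lens distortion.

baseline_pixels_per_sec = 1920 * 1080 * 30                    # ~62 million
vr_render_width = int(2160 * 1.4)                             # both eyes combined
vr_render_height = int(1200 * 1.4)
vr_pixels_per_sec = vr_render_width * vr_render_height * 90   # ~457 million

print(f"Baseline: {baseline_pixels_per_sec / 1e6:.0f} Mpixels/s")
print(f"VR:       {vr_pixels_per_sec / 1e6:.0f} Mpixels/s")
print(f"Ratio:    {vr_pixels_per_sec / baseline_pixels_per_sec:.1f}x")
```

With these assumptions the ratio works out at roughly 7x, which is where headline figures like this usually come from; newer, higher-resolution headsets push the requirement higher still.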

The Best GPU for VR Production:

 

Foundry Trends Interviews OSF Founder Asa Bailey

Originally posted on foundry.com

Asa Bailey is a screenwriter, producer, director and novelist – and somehow, in between all of that, he had time to found On-Set Facilities, a British technology company with ambitions of becoming the world’s leading system integrator of real-time and virtual production solutions. Foundry Trends caught up with Asa to discuss his career, his real-time philosophy and his views on the sector.

Foundry Trends (FT): You’ve had a varied career up to this point, Asa. How did you end up setting up a service production company?

Asa Bailey (AB): I started out in advertising and design, but began making viral videos. This was all before YouTube, but even so my work was achieving around 20 million views. Back then there was no such thing as pre- or post-production – we just didn’t have the budgets. We had to learn how to shortcut things.

At the start I was scrabbling round on budgets of $5,000, but at the height of that period I was directing the likes of Jackie Chan in Hong Kong, on multi-million-dollar projects. The decision to set up On-Set Facilities (OSF, originally called Mum & Dad) came after traveling around the world on these projects, and constantly wanting to find a way to optimise the pipeline.

FT: How did you move into real-time production work?

AB: We’d been doing the on-set side of things – editing, colour-grading, the IT side of things – from day one. Our work with real-time has come about slowly as the technology has become available, from GPUs to rendering – it wasn’t something we came across and thought: “oh, that’s clever!” It grew from the company philosophy of doing everything faster, while squeezing out everything we can for greater optimisation.

That philosophy has served us well so far. From setting up the company with my wife, we now have a studio in Mexico and we’re leading the development of real-time production at Pinewood Studios.

FT: What makes you stand out in your field?

AB: I don’t think anyone can put us in a box, and if they tried to they’d be wrong. We firmly believe that all good technology needs to have mass-adoption, so we’re totally against proprietary systems within real-time production. That’s not the way to go, especially as there are so many levels and usages within the space.

What we’ve been doing is pulling technologies from all sorts of fields – from broadcast to game development. I think our heritage is pretty unique, as we’re from a background of digital optimisation, and our outlook is maybe slightly more “Silicon Valley” than our peers.

FT: What’s the thing that excites you most about the real-time production space?

AB: “Real-time production” itself isn’t something that can be entirely put into one box and sold as “you must buy this”. In fact, that’s been one of the biggest hurdles to mass adoption.

This space is a mad, crazy place to be – with so many different solutions and technologies coming up all the time. The wave that’s in progress has completely overtaken the companies that have tried and failed to create a one-size-fits-all, black-box solution. People don’t want that. They want to have access to the technology, the pipeline or the people – so they can do it themselves.

FT: You’ve just hinted at a hurdle to mass adoption there. Is there anything else that’s holding real-time production back?

AB: I’d say the biggest hurdle is with the producers. The tech guys like myself can see the way the industry’s going, and so can the rest of the VFX industry. The problem is at the business end. The film industry has a long heritage of doing things just because they’ve been done for a long time – when producers and investors are used to a certain type of production, they’re more receptive to it.

Of course, real-time involves a lot of pre-production. That means diverting funds that would usually be spent in post. I think we need to educate some of the financiers in the film industry that the old model is changing, but that they should embrace it.

FT: What would that look like? Could it be as simple as consistently producing great work using real-time?

AB: Pretty much. The one thing is that people think there’s an immediate cost-benefit. There is, of course, but – more than that – it’s about creating more opportunities for filmmakers. It’s about giving them more options to be as creative as possible.

FT: Where does OSF go from here?

AB: Directing is a lifelong career, but I’m really lucky in that I got the majority of the creative monkeys off my back early on in my career. Now, with OSF, we can just help people. We want to continue to work with directors and help bring their vision to life through real-time production.

Real-time is for everyone – that’s the main message we want to spread by working hand-in-hand with others.

V-Ray Plugin for Unreal Engine UE4 in Beta

Chaos Group has just launched the beta of V-Ray for Unreal, the new version of its V-Ray renderer for Unreal Engine.

The product makes it possible both to import V-Ray scenes into Unreal Engine to use as the basis of an immersive real-time experience, and to generate ray-traced renders within UE4.

Import V-Ray scenes created in 3ds Max, Maya and SketchUp into Unreal Engine

V-Ray for Unreal makes it possible to import scenes created in other editions of V-Ray – currently 3ds Max, Maya and SketchUp – into Unreal Engine. The process automatically converts V-Ray lights and materials into their real-time equivalents.

(Epic Games’ own Unreal Studio – itself just released in beta – does something similar, but only for 3ds Max scenes with V-Ray materials.)
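To give a feel for what that conversion involves, here is an illustrative sketch, not Chaos Group’s actual code: a simplified mapping from a few common V-Ray material parameters to Unreal-style physically based inputs. The parameter names and the dictionary format are assumptions for illustration; the key idea is that V-Ray’s reflection glossiness is roughly the inverse of Unreal’s roughness.

```python
# Illustrative sketch only: not Chaos Group's actual conversion code.
# Shows the kind of parameter mapping involved when a simplified V-Ray
# material is approximated by Unreal's physically based material inputs.

def vray_to_unreal_material(vray):
    """Map a simplified V-Ray material dict to Unreal-style PBR inputs."""
    return {
        "base_color": vray.get("diffuse", (0.5, 0.5, 0.5)),
        # Unreal uses roughness, V-Ray uses glossiness: roughly inverses.
        "roughness": 1.0 - vray.get("reflection_glossiness", 1.0),
        "metallic": vray.get("metalness", 0.0),
        "emissive_color": vray.get("self_illumination", (0.0, 0.0, 0.0)),
    }

print(vray_to_unreal_material({"diffuse": (0.8, 0.1, 0.1),
                               "reflection_glossiness": 0.65,
                               "metalness": 0.0}))
```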

Adjust materials and lighting in UE4 in real time, then generate a ray-traced render

As well as using a V-Ray scene as the basis for a conventional Unreal Engine project, users can also use UE4 as an environment in which to make changes to materials and lighting in real time.

The changes are then propagated back to V-Ray, or you can generate a ray-traced render within UE4 itself.

The system is intended to make it possible to create both offline and real-time content with a single unified workflow, particularly for visualisation projects.

Pricing and availability

V-Ray for Unreal is currently in closed beta. You can apply to join the beta program here; to do so, you will need to have registered for a free account on Chaos Group’s website.

So far, Chaos Group hasn’t announced any details of the commercial release date or pricing.

Read more about V-Ray for Unreal on Chaos Group’s website