The Problem With Microsoft Mixed Reality Capture Studio

Microsoft Mixed Reality Studio Review

You won't be seeing armies of volumetric Orcs or legions of stormtroopers coming out of Microsoft's Mixed Reality Capture Studio, not yet anyway.

Microsoft is on a push with its Mixed Reality platform and its Mixed Reality Capture Studio solutions, but there's a funky problem, call it a pain-point if you like. The Microsoft system relies heavily on infrared sensors, and those sensors are picky about the materials they see. There's a workaround, but it's an expensive one.

What is Volumetric Video Capture?

Volumetric capture promises to record bodies, faces and heads in detail with depth data, as well as overlaid optical information showing colour and texture like that shot from a traditional camera. Don't get us wrong, we all love this idea, but Microsoft has rushed to sell what we'd call out as vapourware.

Shot in a volume, volumetric video promises 3D animation-ready VR/AR characters in the geometry-based .MP4 format used for games and interactive experiences. The system overlays and merges optical, depth and animation data so that you can import 3D animated characters directly into your Unity or Unreal Engine game dev. But in the industry we do all this already with mo-cap and plenty of hard work. The benefit of volumetric video capture should be speed, but that's not the case so far.


With the Microsoft Mixed Reality Studio system, 190 cameras are used in a 360-degree studio. Each camera is equipped with one optical sensor and, in Microsoft's case, one infrared sensor, which detects depth and movement and records them as .FBX data. That's all fine and dandy, but it's the IR sensor part that seems to be the cause of so many problems.

The problem with volumetric IR depth sensors.

When shooting with IR, as in Microsoft's Mixed Reality Capture Studio, it's essential to take the production's wardrobe into consideration, because it could make or break your volumetric video production budget. We are not saying this makes the system a total waste of time, but in the interest of not seeing disappointed faces on set, you need to know this.

Volumetric costume design.

Wardrobe and Production Design are Key.

Wardrobe and design are so important because infrared cameras struggle with many different materials, making the uninitiated stylist's job a potential hell.

Infrared cameras have trouble with:

  • Leathers  
  • Glass  
  • Jewellery
  • Dark colours (especially BLACK!) 
  • Shiny metals
  • Plastics
  • Stitching
  • Patterns

But then, having said all that, in tests one pair of black jeans vanished while another brand of black jeans worked perfectly fine. So what's going on?

These restrictions (or maybe not restrictions, depending on the unique properties of each item) may not seem like a massive problem, but these are just some of the materials that have caused problems when shooting volumetric video.

You have to take into consideration variables like density, chemical make-up, weave pattern and fabric texture, even the physical qualities of the makeup applied and its colour. These restrictions may limit a lot of creative decisions for Directors, stylists and production designers.

The reason the infrared doesn't pick up or detect these materials is that (here comes the science) when infrared waves (light) are sent towards an object, they do one of three things, depending on the properties of the object and the surface they hit.


IR light waves are either reflected, absorbed or transmitted. When shooting volumetric video with Microsoft's Mixed Reality Capture Studio, the beams have to be reflected back to the camera to relay the needed data.

How can this be helped? 

Certain aspects of the problem can be reduced. For example, metals can be sprayed with dulling spray to bring down the amount of reflection and give more control over the IR waves. There are other exceptions, but you need to think like a chemist and a physicist to know why.

As we said, some clothes, for example black jeans, shouldn't work, but when tested some worked fine due to their different weave, dye and chemical make-up. Turning trousers inside out and shooting the opposite side of the fabric solved one shoot. Some stitching scans but then stands out above the fabric; it seems to come down to the actual physical and chemical properties as well as colour and reflectivity.

So what do you do?

The only real way to know if a costume is going to work is to go to a studio and do physical tests with the system, because, as reported, there are sometimes exceptions, and unless you test you will never know, you'll just be guessing.

Is this just a problem with Microsoft's volumetric capture solution? We'd say it's a problem for any IR-based depth and motion sensors. Testing costumes and props before shooting means more time in the volumetric volume, and that means more cost.

With the cost of hiring a volumetric stage running the Microsoft Mixed Reality Capture Studio software currently set at around $20,000 a day, at least half of the day is spent tweaking and testing costumes and props to get a good result from the system's IR sensors.


Some volume operators are also charging additional facility fees for each costume change, as the system needs to be repeatedly tested and costumes possibly altered on set for every actor.

Imagine having to scan 20 different types of game characters and test all the armour, weapons and kit, only to find that they don't 'read' in the volumetric volume. VFX Supervisors need to budget for testing, re-testing, shooting, cleaning up the data, and then, in some cases, shooting again.

Volumetric video capture solutions… next!

Unreal Engine Plugin Market Opens Up

Take a look at what's possible: Theia Interactive built Optim as a plug-in for Datasmith, thanks to how Epic Games' Unreal Engine enables Python developers to get in under the engine's hood and build custom plugins for all kinds of functionality. This is another example of how Epic Games is inventing new markets within its own ecosystem.

Optim 

With Epic recently making the editor 100% scriptable with Python, developers have total freedom to write simple code that automates everyday actions like import processes or deleting and generating UVs. Simple tasks made automatic save developers time and effort and speed up production.
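As a rough illustration of that kind of editor automation, here is a minimal Python sketch using Unreal's built-in editor scripting module to run an unattended FBX import; the file path and destination folder are made up purely for the example.

    import unreal

    # Hypothetical source file and destination folder, for illustration only.
    task = unreal.AssetImportTask()
    task.filename = "C:/incoming/hero_prop.fbx"
    task.destination_path = "/Game/Props"
    task.automated = True   # suppress import dialogs so the task can run unattended
    task.save = True        # save the imported assets straight away

    # Run the import through the editor's asset tools.
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

Run from the editor's Python console or a startup script, this does in one line what would otherwise be a repetitive drag-and-drop job.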

Optim uses Datasmith's 20+ format support to get high-fidelity data out of those core applications. Epic has focused its technical skills on accurately bringing over metadata, converting materials to Unreal Engine, and dealing with a wide range of data-prep issues. Optim builds on that base to simplify the optimization process.

Some examples of things that you could automate with Optim are listed below (a rough editor-scripting sketch of the first one follows the list):

  • Skip the import of any meshes with names containing a certain string 
  • Create LODs for any mesh larger than a given number of triangles 
  • Instance any mesh with a particular name 
  • Merge everything with a particular property under a single group 
  • Replace all materials by name with existing materials from the Content Browser 
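Optim drives these rules through its own interface, but as a hedged sketch of the first item above, here is roughly what the equivalent looks like in plain Unreal editor Python: deleting already-imported static meshes whose names contain a chosen string. The folder path and the "proxy_" tag are made-up examples, not part of Optim's actual API.

    import unreal

    SKIP_TAG = "proxy_"                    # made-up naming convention for unwanted meshes
    SEARCH_ROOT = "/Game/ImportedScene"    # hypothetical folder of imported assets

    for path in unreal.EditorAssetLibrary.list_assets(SEARCH_ROOT, recursive=True):
        data = unreal.EditorAssetLibrary.find_asset_data(path)
        # Keep only static meshes whose names do not carry the skip tag.
        if str(data.asset_class) == "StaticMesh" and SKIP_TAG in str(data.asset_name):
            unreal.EditorAssetLibrary.delete_asset(path)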

Python Scripts 

Unlike Blueprints, the Python environment is only available in the Unreal Editor, not when your Project is running in the Unreal Engine. That means you can use Python freely for scripting and automating the Editor or building asset production pipelines, but you cannot currently use it as a gameplay scripting language. If you need a gameplay scripting language, Blueprints are the best option for now.

Python coding in Unreal Engine

New Markets? 

Optim shows the possibilities newly opened up by Python being integrated into Unreal alongside Datasmith: it lets developers make the Unreal Engine theirs and customize every aspect of it. Optim will also be sold as a subscription package in early 2019. Could this be the start of a new market? Epic has developed a design engine that is now stable enough to possibly become the home of a totally new market, one where third-party teams like Theia Interactive can build services and content on top of the Unreal Engine with few barriers to entry and profit to be made.

AI for VFX and what it means for you.

AI VFX on-set

Artificial Intelligence is set to change the way VFX is approached and produced. Some of the biggest names in the industry, like Digital Domain, have been discussing the applications and various forms of AI in VFX, asking how the two can be integrated.


Last year at SIGGRAPH there was a series of key talks and panels discussing deep learning, with examples of convolutional neural networks, generative adversarial networks and autoencoders. The panels discussed how deep learning and convolutional neural networks will benefit the VFX and 3D design industries, with everything from face and fluid simulation to image denoising, character animation, facial animation and texture creation.

The panel focused mainly on how these new tools will change VFX pipelines. Doug Roble stated that the technologies are “scary tools when seen for the first time”, although he went on to claim “you can use these to do visual effects in a completely brand-new way”, pointing to the abilities of the new machine learning technologies.

However, with these changes in the technologies there's no denying that jobs will be displaced. But there will also be openings, and a shift of jobs in the industry towards the visual effects and programming sector.

The use of machine learning within VFX opens up the possibility of making models without the need for manual texturing, lighting and rendering, because the computer knows how to do these things itself. This would fundamentally change the way the VFX pipeline works and massively decrease post-production times on films.

We're closer to having these opportunities because of the newly found data-driven approach, rather than the previous mathematical methods of programming and hand-tuned algorithms.

Motion control robot for previsualization

What’s it mean for us? 

With the abilities of AI and machine learning constantly developing, it won't be long before AI totally changes the VFX pipeline and possibly shifts it from post to pre-production, mirroring the shift towards virtual production methods and on-set graphics systems. If the AI can edit a scenario by adding VFX, we will be able to make the entire process real-time and cut out a large portion of post-production, cutting production times and costs.

There'd still be a lot of work within VFX, but it'd be done before a shoot instead of afterwards, therefore cutting times. It would benefit actors, since instead of having to imagine the VFX they could simply see them on a monitor and react appropriately; Producers would enjoy faster production times and lower costs. Everyone involved in the production would benefit.

 

Octane Render Plugin for Unreal Engine News 2019

“It's happening, just waiting on 2019.1 features in core for wider beta release” was the response of OTOY CEO Jules Urbach in a comment in the OctaneRender Facebook Group, when OSF pushed for release-date news of the new OctaneRender UE4 plugin and integration.

Earlier this year OTOY and Epic Games announced that OctaneRender and Unreal Engine will be integrated in the first half of 2019. The two have been designed to work together to help improve AI-accelerated GPU path tracing and light-field baking for Unreal-powered games, films, VFX, arch-vis, mixed reality and virtual production applications.

The new software abilities and features will be included in OctaneRender's $20/month baseline subscription package. This subscription gives you 20 GPUs, network rendering and over 20 DCC integrations, all at your fingertips.


Earlier this year at SIGGRAPH, an Unreal Engine integration demo showed the engine's ability to:

  • Rapidly and automatically convert Unreal Engine scenes and materials into OctaneRender. 
  • The Brigade Engine was demonstrated during the display: an OctaneRender tool, powered by the Unreal Engine, used to path-trace games and interactive content in realtime. 
  • The display showed AI Light, AI Scene, AI Spectral and Volumetric Denoising, plus Out-of-Core Geometry, UDIM support and Light Linking for production-ready final rendering. 
  • Volumetric Geometry and Lighting for infinite detail and granular lighting control. 
  • By using OTOY's ORBIX scene format you'll be able to support more than 20 of the industry's leading DCC tools, including Cinema 4D, Autodesk Maya and 3ds Max. This lets artists create scenes wherever they prefer and easily drop them into the Unreal Engine, with the content remaining fully responsive. 
  • OctaneRender is optimized to support NVIDIA RTX Ray Tracing, Vulkan, DXR, CUDA and Metal iOS/macOS backends built on OTOY's cross-platform RNDR framework. 

Above: OSF virtual sets rendered in Octane, baked in Unreal Engine.

This isn't the first collaboration between OTOY and Epic Games; it expands on the previous release of Paragon assets, which were given to all Unreal Engine developers and included OTOY's LightStage work, giving developers more facial-scanning technology to reach new levels of photorealism. The integration of OctaneRender 2019 into Unreal Engine will give developers access to the RNDR SDK, which will serve as the portal for OTOY's end-to-end holographic mixed reality.

 

Virtual Production Spectrum

The arrival of Realtime Virtual Production comes thanks to advances in realtime technologies, a shift to GPU-enhanced hardware and software, and the ever-increasing convergence of digital technologies. To better understand the potential scope for the development of Realtime Virtual Production methods, OSF created the Virtual Production Spectrum.

Virtual Production Spectrum

The Virtual Production Spectrum is a tool that shows the scope of virtual production as a way of creating final-image-quality 2D and 3D content in realtime. All shots in RVP fall somewhere on the Virtual Production Spectrum. – OSF

At one end of the spectrum you have 100% physical production, shooting 100% live action. In the middle you have Augmented or Mixed Reality, where virtual production methods combine live-action elements with virtually created elements (realtime VFX, virtual sets, realtime animation etc.). At the far right of the Virtual Production Spectrum is Virtual Reality, where the final image is created as a 100% digital image, in realtime.

Since the birth of film, filmmakers have sought out the most fantastic filming locations and used mechanical and practical effects to tell their stories. On a virtual sound stage, any real-world location can be pre-captured and then digitally recreated as a photorealistic virtual set.

By combining live action with mixed reality and virtual reality on a virtual sound stage, Virtual Production methods and realtime graphics systems are today capable of producing final-image-quality footage for a number of media and entertainment applications.

Realtime Virtual Production combines the best of live action, mixed reality and virtual reality to enable limitless storytelling opportunities.

Mixed Reality Tools: Intel RealSense Depth Cameras

Deep compositing with no green screen requires a volumetric, depth view of the scene. The Intel RealSense range of cameras is a consumer-level, mass-market product that allows creators to combine VR sets with real-world images based on depth, so that elements like background and foreground can be easily separated from the camera image, or more precisely layered in the correct order so that your picture makes sense.
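As a rough sketch of how that depth-based layering works in practice, here is a minimal example using Intel's pyrealsense2 Python SDK to grab an aligned colour and depth frame and keep only the pixels closer than a chosen distance; the 1.5 m threshold and stream settings are illustrative values, not a production setup.

    import numpy as np
    import pyrealsense2 as rs

    # Start streaming depth and colour from the RealSense camera.
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
    profile = pipeline.start(config)

    # Align the depth frame to the colour frame so pixels line up.
    align = rs.align(rs.stream.color)
    frames = align.process(pipeline.wait_for_frames())
    depth = np.asanyarray(frames.get_depth_frame().get_data())
    color = np.asanyarray(frames.get_color_frame().get_data())

    # Convert raw depth units to metres and keep only pixels closer than ~1.5 m.
    scale = profile.get_device().first_depth_sensor().get_depth_scale()
    foreground_mask = (depth > 0) & (depth * scale < 1.5)
    foreground = np.where(foreground_mask[..., None], color, 0)  # background zeroed out

    pipeline.stop()

The resulting mask is effectively a depth matte: anything behind the threshold falls away, which is the same idea a realtime compositor uses to slot live action between virtual foreground and background layers.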

Virtual Production in Unreal Engine

VFX breakdown of virtual production in Unreal Engine by On-set Facilities.

To round off our year, we set our team the challenge of producing a music video in just 12 hours using OSF Virtual Production Systems, from first shot to uploading to the OSF YouTube channel. With no post-production.

Created entirely using OSF virtual production systems in Unreal Engine, this test demonstrates high-end mixed reality, with real-time compositing of 3D virtual sets, real-time character, foreground and chroma-key layers, and real-time rendering and on-set colour grading.

The test video was all filmed on the OSF MR Factory Stage, a 500m virtual production green-screen stage at On-set Facilities in Madrid.

Real-time compositing of live-action actors in virtual sets powered by Unreal Engine. With realtime VFX and on-set colour correction, the only post-production required for this video was editing and mastering the 4K file in Adobe Premiere. Directed by Asa Bailey, Virtual Production Supervisor at On-set Facilities.

The Virtual Production

Shot by a small crew of just myself on the camera (Alexa Mini), 3 system operators and 2 willing actors, it took little more than 3 hours to shoot the test, plus a previous 5 hours in pre-production to find our props, texture, light and then bake the 3D set. We used C4D to model the set and rendered in Octane before baking into Unreal Engine (plugin on the way).

The idea of this test was to show VFX Supervisors, Directors and Producers that it is possible to produce quick, cost-effective, “ready to edit” footage with the OSF realtime virtual production system.

Virtual Production Spectrum

The rules of this test said no post allowed. But do you remember 8-bit and then 16-bit computer games? No? Well, once upon a time computer games looked well dodgy, and it's a bit like virtual production methods today (written Dec 2018). To really use the technology for high-end film productions you'll still want to open up the files and give them a polish in post. But the more effort you put into pre-production (creating 3D sets and realtime VFX), the better.

Remember 16-bit game graphics? Virtual production requires 32-bit and 64-bit.

But what we are proving is that, for content production today, virtual production methods can greatly reduce the post-production process, freeing up production budgets to spend more on set and not in post.

We had just 12 hours. It was a test of discipline for the crew and also an annual quality benchmark of the realtime footage from our systems. What you are seeing in the test are 4K video files from our systems, as recorded on set.

Virtual Production is a Tech Wave

Virtual production is an emerging space and it involves realtime visual effects, live audio (3D), realtime character animation and realtime motion capture. With developments moving fast, the amount of post-production required to deliver final shots in realtime is reducing rapidly, month to month.

In OSF film work, 60% of close and mid shots are good to go directly to the edit, with just 40% needing to be opened up in post and given a polish. Wide shots are more tricky, but we are making big improvements in colour matching optical and virtual layers and in developing AI shadows and reflections.

Virtual Production Workstations.
OSF Realtime Machines Custom Built Virtual Production Workstations.

 

This year we’ve seen a massive increase in quality, as OSF push towards 32bit and even 64bit colour and graphics in Unreal Engine. Realtime Virtual Production takes serious on-set power. In response OSF have started a line of OSF Realtime Machines, Intel and Nvidia based workstations that are optimised to power realtime virtual productions and AI driven realtime VFX.

The Technology Used in This Test

OSF realtime virtual production system integrations are based on OSF Realtime Machines, running Unreal Engine with realtime rendering and virtual camera cinematic capabilities.

Realtime camera tracking is taken care of by Mo-Sys Engineering on top of the optical camera (Alexa Mini). Blackmagic hardware also plays a large part in OSF solutions, incorporating 4K and 8K broadcast standards.

On-Set Facilities Madrid Studios
OSF Green Screen Stage and Realtime Virtual Production Systems.

But what about Post Production Options?

In this test case, the only post-production allowed was to pull the footage from our data recorders and onto a timeline in Adobe Premiere. We then cut the rushes to a library track from Audio Networks and that was it.

But we could have opened the recorded layers and data – Matte, Foreground, Background and RAW optical layers – in a long list of post-production software and cleaned and tweaked to our hearts' (and clients') content.

For anything more than a test, we’d still prefer to open up the files in post as all assets from the system are editable in post production applications.

For instance we’d have liked to clean up the chroma, improving the look and feel of the virtual sets. But that was not the idea in this test – it was strictly no post allowed, what you see is what you get, in realtime.

Have faith.
Asa Bailey
Director and Virtual Production Supervisor
linkedin.com/in/asabailey

UK Tax Breaks for Film Production

UK Film Tax Breaks

Here's a quick introduction and round-up of the tax breaks, or what's called 'soft money', on offer to Producers looking to produce film and television in the UK.

UK Film Tax Relief: up to a 25% cash rebate of qualifying expenditure. Must pass a cultural test or qualify as an official co-production (that is, made under the UK's co-production treaties). Project cap: 80% of qualifying expenditure. Minimum spend: 10% of qualifying production expenditure.
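To make those headline numbers concrete, here is a small worked example; the £10m budget is purely hypothetical and this is a rough illustration, not tax advice.

    # Hypothetical illustration of the UK Film Tax Relief figures quoted above.
    qualifying_expenditure = 10_000_000                   # £10m of UK qualifying spend
    capped_expenditure = qualifying_expenditure * 0.80    # relief capped at 80% of qualifying spend
    max_rebate = capped_expenditure * 0.25                # up to a 25% cash rebate
    print(f"Maximum rebate: £{max_rebate:,.0f}")          # Maximum rebate: £2,000,000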

National Funding: national funding incentives available to international producers include BBC Films, BFI Production & Development Funding, Film4 and Pinewood Pictures. Regional Funding: regional incentives include the following:

(1) the Yorkshire Content Fund (will invest the lower of up to 10% of the total production budget or up to £500,000);

(2) Ffilm Cymru Wales Production Funding (provides grants of up to £300,000 for writers, directors and producers who are either born or currently residing in Wales working in English or Welsh);

(3) Creative Scotland Screen Funding (provides single project development Funding of £3,000-£50,000, Slate Development Funding of £50,000-£100,000, production funding of £50,000-£500,000, and distribution and exhibition funding of £5,000 – £15,000);

(4) Northern Ireland Production Funding of a maximum of £800,000 for feature film and television production and £500,000 for interactive content production, up to a ceiling of 25% of the overall project budget. Funding is in the form of a recoupable loan with profit participation or in limited circumstances a grant. Available to productions with more than 65% of funding already in place.

Unreal Engine 4.21 Now Supports DeckLink

Virtual Production Technology News

We've been using DeckLink I/O video cards in our virtual production server builds, and now Epic Games and Blackmagic Design have announced that the latest version of Unreal Engine, 4.21, natively supports these Blackmagic video-in and video-out cards.

Unreal Engine 4.21 supports Blackmagic 8K Capture and Playback Cards.

DeckLink SDK binaries and source code from Epic will also now be available free to download on the company’s Unreal Engine Marketplace.

Epic Games' Unreal Engine 4.21 now supports DeckLink 8K Pro, DeckLink Duo 2 and DeckLink 4K Extreme 12G capture and playback cards, Blackmagic Design announced.

Unreal Engine is the most used real time engine providing the highest quality solution for creating virtual reality (VR) and augmented reality (AR) experiences, as well as virtual sets for film and television. UE is a complete suite of creation tools designed to meet the most ambitious artistic visions while being flexible enough to ensure success for individual developers or the largest creative teams.

Unreal Engine, which added support for up to 1080p60 HDMI and SDI capture and playback in version 4.20, now includes a number of new features in 4.21 to meet its growing adoption in live broadcast and virtual production workflows. A key part of these new features is support for Blackmagic Design DeckLink products and the DeckLink SDK.