Deep compositing without a green screen requires a volumetric, depth-aware view of the scene. Intel's RealSense range of cameras is a consumer-level, mass-market product that lets creators combine VR sets with real-world images based on depth, so that elements such as background and foreground can be easily separated from the camera image – or, more precisely, layered in the correct order so that the picture makes sense.
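The layering logic itself is simple once you have a per-pixel depth map. Below is a minimal, hypothetical sketch in Python/NumPy using a synthetic depth frame; with a real RealSense camera the depth array would come from the pyrealsense2 SDK rather than being built by hand:

```python
import numpy as np

def split_layers(color, depth, threshold_m=1.5):
    """Split a camera image into foreground/background layers by depth.

    color: (H, W, 3) uint8 image; depth: (H, W) float32, in metres.
    Returns (foreground, background) with the other layer zeroed out,
    ready to composite in front of / behind a virtual set.
    """
    near = depth < threshold_m                       # pixels closer than the cut
    fg = np.where(near[..., None], color, 0).astype(np.uint8)
    bg = np.where(near[..., None], 0, color).astype(np.uint8)
    return fg, bg

# Synthetic example: a 4x4 frame whose left half is "near" the camera.
color = np.full((4, 4, 3), 200, dtype=np.uint8)
depth = np.full((4, 4), 3.0, dtype=np.float32)
depth[:, :2] = 0.8                                   # left half is 0.8 m away
fg, bg = split_layers(color, depth)
```

A real pipeline would also align the depth frame to the colour frame and feather the mask edge, but the depth threshold above is the core of the foreground/background separation.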
File under creating content for AI advertising platforms.
To meet the demands of AI-driven advertising platforms for monster.com, The Mill turned to real-time production methods. [AI advertising platforms use deep learning to optimise ad creative placement, and they require a large number of creative options to get going.]
How to produce 35 ready-to-fly creative ads in one day.
The team used real-time production methods, using Unreal Engine to make content en masse – in this case around 35 creative assets. Requiring real-time rendering and capture at broadcast quality, real-time production was the only answer. Adopting real-time production methods gave the editors a mass of rushes to take away and start cutting right away, all to deliver back to the agency and on to those hungry AI-driven advertising platforms.
Most people don’t realise that VR games require around seven times the graphics power of normal 3D games. This is because the graphics card has to deliver two different high-resolution images – one to each eye – at 90 frames per second.
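A rough back-of-envelope calculation shows where a figure of that order comes from. The resolutions and frame rates below are assumptions (typical 2016-era headset panels plus ~1.4x supersampling per axis), not numbers taken from this article:

```python
# Back-of-envelope pixel-throughput comparison (assumed figures):
# a headset renders ~1680x1512 per eye (1080x1200 panel plus ~1.4x
# supersampling per axis) at 90 fps, versus traditional 1080p
# gaming at 30 fps.
vr_pixels_per_sec = 2 * 1680 * 1512 * 90      # two eyes, 90 Hz
flat_pixels_per_sec = 1920 * 1080 * 30        # one 1080p image, 30 Hz
ratio = vr_pixels_per_sec / flat_pixels_per_sec
print(round(ratio, 2))  # → 7.35
```

Against a 60 fps baseline the ratio roughly halves, which is why quoted multipliers vary; the point stands that VR demands several times the raw pixel throughput of flat-screen rendering.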
Want to build the metaverse? In this post we take a look at the best specifications for VR development workstations and what you’ll need.
The Best GPU for VR Production:
Chaos Group has just launched V-Ray for Unreal, the new version of its V-Ray renderer for Unreal Engine.
The product makes it possible both to import V-Ray scenes into Unreal Engine to use as the basis of immersive real-time experience and to generate ray traced renders within UE4.
Import V-Ray scenes created in 3ds Max, Maya and SketchUp into Unreal Engine
V-Ray for Unreal makes it possible to import scenes created in other editions of V-Ray – at the minute, 3ds Max, Maya and SketchUp – into Unreal Engine. The process automatically converts V-Ray lights and materials into their real-time equivalents.
(Epic Games’ own Unreal Studio – itself just released in beta – does something similar, but only for 3ds Max scenes with V-Ray materials.)
Adjust materials and lighting in UE4 in real time, then generate a ray traced render
As well as using a V-Ray scene as the basis for a conventional Unreal Engine project, users can also use UE4 as an environment in which to make changes to materials and lighting in real time.
The changes are then propagated back to V-Ray; or you can generate a ray traced render within UE4 itself.
The system is intended to make it possible to create both offline and real-time content with a single unified workflow, particularly for visualisation projects.
Pricing and availability – V-Ray for Unreal is currently in closed beta. You can apply to join the beta program here; to do so, you will need to have registered for an account on Chaos Group’s website, which is free.
So far, Chaos Group hasn’t announced any details of the commercial release date or pricing.
File under cute, but oddly compelling.
Realtime VFX using Unreal Engine (UE4) is pushing the boundaries of on-set realtime production. Here at On-set Facilities we build realtime VFX machines, sets and production solutions optimised for Unreal Engine (UE4). Tutorial video after the jump:
In our view, from looking at what’s on offer, Godot looks good – but then take a look at Xenko. When we came across Xenko, its flashy graphics and showreel got us a little excited. Take a look at the trailer video:
Godot has released a fresh third-version update. Impatient users can put an end to 18 months of waiting by jumping directly to the Download page and start playing with Godot 3.0!
How to choose an open source VR engine / game engine.
Our best advice is to follow the links from this post, then take a look at the social metrics for each option. Do they have a big, supportive community? Do they include the API or source-code access and publishing options you need?
Definitely check whether the developers respond in – or even have – a great support forum. These are the questions you’ll need to ask to evaluate whether an open-source VR engine is right for your project plans.