File under creating content for AI advertising platforms.
To meet the demands of AI-driven advertising platforms for monster.com, The Mill turned to real-time production methods. [AI advertising platforms use deep learning to optimise ad creative placement, and they require a large number of creative options to get going.]
How to produce 35 ready-to-fly creative ads in one day.
The team used real-time production methods, turning to Unreal Engine to make content en masse – in this case around 35 creative assets. With broadcast-quality real-time renders and captures required, real-time production was the only answer. Adopting real-time production methods gave the editors a mass of rushes to take away and start cutting right away, ready to deliver back to the agency and on to those hungry AI-driven advertising platforms.
Chaos Group has just launched V-Ray for Unreal, the new version of its V-Ray renderer for Unreal Engine.
The product makes it possible both to import V-Ray scenes into Unreal Engine to use as the basis of immersive real-time experiences and to generate ray traced renders within UE4.
Import V-Ray scenes created in 3ds Max, Maya and SketchUp into Unreal Engine.
V-Ray for Unreal makes it possible to import scenes created in other editions of V-Ray – at the minute, 3ds Max, Maya and SketchUp – into Unreal Engine. The process automatically converts V-Ray lights and materials into their real-time equivalents.
(Epic Games’ own Unreal Studio – itself just released in beta – does something similar, but only for 3ds Max scenes with V-Ray materials.)
Adjust materials and lighting in UE4 in real time, then generate a ray traced render.
As well as using a V-Ray scene as the basis for a conventional Unreal Engine project, users can also use UE4 as an environment in which to make changes to materials and lighting in real time.
The changes are then propagated back to V-Ray; or you can generate a ray traced render within UE4 itself.
The system is intended to make it possible to create both offline and real-time content with a single unified workflow, particularly for visualisation projects.
Pricing and availability – V-Ray for Unreal is currently in closed beta. You can apply to join the beta programme here; to do so, you will need to have registered for an account on Chaos Group’s website, which is free.
So far, Chaos Group hasn’t announced any details of the commercial release date or pricing.
Some open-source VR engine options: ApertusVR / OSVR / Godot / Annwvyn – take a look at those to start with. Which one to go for?
In our view, from looking at what’s on offer, Godot looks good – but then take a look at Xenko. We soon came across Xenko, and thanks to its flashy graphics and showreel we got a little excited; take a look at the trailer video:
Open-source game engines.
Godot has released a fresh third-version update. Impatient users can put an end to 18 months of waiting by jumping directly to the Download page and starting to play with Godot 3.0!
How to choose an open source VR engine / game engine.
Our best advice is to follow the links from this post, then take a look at the social metrics for each option. Do they have a big, supportive community? Do they include the API or source-code access and publishing options you need?
Definitely check whether the developers respond in – or even run – a great support forum. These are the questions you’ll need to ask to evaluate whether an open-source VR engine is right for your project plans.
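As a rough illustration, the checklist above can be turned into a simple weighted-scoring sketch. Everything in it – the metric names, the weights and the per-engine scores – is placeholder data for illustration only, not real measurements; plug in your own research before drawing conclusions.

```python
# Sketch: rank candidate open-source engines against the evaluation
# questions above. All figures are illustrative placeholders.

WEIGHTS = {
    "community_size": 0.4,   # big, supportive community?
    "source_access": 0.3,    # the API / source-code access you need?
    "support_forum": 0.3,    # active, responsive support forum?
}

# Hypothetical 0-10 scores per criterion (not real data).
candidates = {
    "Godot": {"community_size": 9, "source_access": 10, "support_forum": 8},
    "Xenko": {"community_size": 5, "source_access": 9, "support_forum": 6},
    "OSVR":  {"community_size": 4, "source_access": 8, "support_forum": 4},
}

def score(metrics):
    """Weighted sum of the evaluation criteria."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

# Highest-scoring engine first.
ranked = sorted(candidates, key=lambda name: score(candidates[name]),
                reverse=True)
for name in ranked:
    print(f"{name}: {score(candidates[name]):.1f}")
```

The point of the sketch is only that the evaluation is multi-criteria: adjust the weights to reflect what matters for your project and the ranking can change.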
Talking to many players in VR production, it’s clear that all of them are trying to win a race to become the dominant VR platform.
But, without the open, shared and common standards that built the internet, how will we activate millions of developers and builders? Will the metaverse ever gain the open foundations for innovation that gave birth to the likes of Facebook, Apple and Google?
Developed by OSF to educate Producers and Directors in new digital production techniques, the Immersive Production Spectrum illustrates the full spectrum of computerised content production.
It starts with live-action (digital) capture on the left, moves through augmented reality production (real-time VFX) in the middle, and ends with fully computerised virtual production methods on the far right.
The Immersive Production Spectrum is not intended to pit one technique against another. Instead, it acts as a loose roadmap to enable Producers and Directors to evaluate and employ new technologies within their content productions.