Take a look at what’s possible: Theia Interactive built Optim as a plug-in for Datasmith, thanks to the way Epic Games’ Unreal Engine lets Python developers get in under the engine’s hood and build custom plugins for all kinds of functionality. This is another example of Epic Games inventing new markets within its own ecosystem.
With Epic recently making the editor fully scriptable with Python, developers have total freedom to write simple code that automates everyday actions such as import processes or deleting and generating UVs. Making simple tasks automatic saves developers time and effort and speeds up production.
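As a sketch of what that kind of automation looks like, the snippet below plans a batch import in plain Python so it can run anywhere; inside the Unreal Editor the resulting task list would typically be handed to `unreal.AssetImportTask` and `unreal.AssetToolsHelpers`. The file names, destination path and supported-extension list are illustrative assumptions, not a real pipeline.

```python
import os

def plan_imports(source_files, content_dir="/Game/Imported"):
    """Map source files on disk to destination asset paths.

    This is the plain-Python planning step; in the Unreal Editor
    each entry would become an unreal.AssetImportTask fed to
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks().
    """
    tasks = []
    for path in source_files:
        name, ext = os.path.splitext(os.path.basename(path))
        if ext.lower() not in (".fbx", ".obj"):
            continue  # skip formats this hypothetical pipeline doesn't handle
        tasks.append({
            "filename": path,
            "destination": "{}/{}".format(content_dir, name),
        })
    return tasks

tasks = plan_imports(["chair.fbx", "readme.txt", "table.OBJ"])
```

A few lines like this replace the repetitive clicking of importing assets one by one, which is exactly the kind of everyday action Epic’s scripting support is aimed at.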
Optim uses Datasmith’s support for more than 20 formats to get high-fidelity data out of those core applications. Epic has focused its technical effort on accurately bringing over metadata, converting materials to Unreal Engine, and handling a wide range of data-prep issues. Optim builds on that base to simplify the optimization process.
Some examples of tasks you could automate with Optim:
Skip the import of any meshes with names containing a certain string
Create LODs for any mesh larger than a given number of triangles
Instance any mesh with a particular name
Merge everything with a particular property under a single group
Replace all materials by name with existing materials from the Content Browser
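The rules above boil down to simple name and size checks, which can be sketched in a few lines of Python. Everything here is an illustrative assumption — the mesh fields, thresholds and action names are invented for the example and are not Optim’s actual rule format or API.

```python
# Illustrative Optim-style import rules: skip meshes whose names match a
# string, and generate LODs for meshes over a triangle budget.

def apply_rules(meshes, skip_substring="proxy", lod_triangle_limit=50000):
    """Decide, per mesh record, what an import pipeline should do with it."""
    actions = {}
    for mesh in meshes:
        name = mesh["name"]
        if skip_substring in name:
            actions[name] = "skip"           # rule: skip by name substring
        elif mesh["triangles"] > lod_triangle_limit:
            actions[name] = "generate_lods"  # rule: LODs for heavy meshes
        else:
            actions[name] = "import"         # default: import as-is
    return actions

meshes = [
    {"name": "bolt_proxy", "triangles": 120},
    {"name": "engine_block", "triangles": 250000},
    {"name": "door_handle", "triangles": 900},
]
actions = apply_rules(meshes)
```

Instancing, merging by property and material replacement would slot into the same pattern as extra branches keyed on other mesh attributes.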
Unlike Blueprints, the Python environment is only available in the Unreal Editor, not when your project is running in the Unreal Engine. That means you can use Python freely for scripting and automating the Editor or building asset production pipelines, but you cannot currently use it as a gameplay scripting language. If you need a gameplay scripting language, Blueprints are the best option for now.
Python coding in Unreal Engine
Optim shows the possibilities opened up by Python’s integration with Datasmith in Unreal: it lets developers make the Unreal Engine their own and customise every aspect of it. Optim will be sold as a subscription package in early 2019. Could this be the start of a new market? Epic has developed a design engine that is now stable enough to be the home of an entirely new market: one where third-party teams like Theia Interactive can take advantage of the Unreal Engine, creating services and content with no barriers to entry and profit to be made.
File under: creating content for AI advertising platforms.
To meet the demands of AI-driven advertising platforms for monster.com, The Mill turned to real-time production methods. [AI advertising platforms use deep learning to optimise ad creative placement, and they require a large number of creative options to get going.]
How to produce 35 ready-to-fly creative ads in one day.
The team used real-time production methods, using Unreal Engine to make content en masse, in this case around 35 creative assets. With the brief requiring real-time rendering and capture at broadcast quality, real-time production was the only answer. Adopting it gave the editors a mass of rushes to take away and start cutting right away, ready to deliver back to the agency and on to those hungry AI-driven advertising platforms.
Chaos Group has just launched V-Ray for Unreal, the new version of its V-Ray renderer for Unreal Engine.
The product makes it possible both to import V-Ray scenes into Unreal Engine to use as the basis of immersive real-time experiences and to generate ray-traced renders within UE4.
Import V-Ray scenes created in 3ds Max, Maya and SketchUp into Unreal Engine
V-Ray for Unreal makes it possible to import scenes created in other editions of V-Ray – at the minute, 3ds Max, Maya and SketchUp – into Unreal Engine. The process automatically converts V-Ray lights and materials into their real-time equivalents.
(Epic Games’ own Unreal Studio – itself just released in beta – does something similar, but only for 3ds Max scenes with V-Ray materials.)
Adjust materials and lighting in UE4 in real time, then generate a ray-traced render
As well as using a V-Ray scene as the basis for a conventional Unreal Engine project, users can also use UE4 as an environment in which to make changes to materials and lighting in real time.
The changes are then propagated back to V-Ray; or you can generate a ray traced render within UE4 itself.
The system is intended to make it possible to create both offline and real-time content with a single unified workflow, particularly for visualisation projects.
Pricing and availability – V-Ray for Unreal is currently in closed beta. You can apply to join the beta program here; to do so, you will need to have registered for an account on Chaos Group’s website, which is free.
So far, Chaos Group hasn’t announced a commercial release date or pricing.
Some open-source VR engine options to start with: Apertus VR / OSVR / Godot / Annwvyn. Which one to go for?
In our view, from looking at what’s on offer, Godot looks good, but then we came across Xenko and, thanks to its flashy graphics and showreel, got a little excited. Take a look at the trailer video:
Open-source game engines.
Godot has released its third major version. Impatient users can put an end to 18 months of waiting by jumping directly to the Download page and playing with Godot 3.0!
How to choose an open source VR engine / game engine.
Our best advice is to follow the links in this post, then look at the social metrics for each option. Does it have a big, supportive community? Does it offer the API or source-code access and the publishing options you need?
Also check whether the developers respond in, or even run, a good support forum. These are the questions you’ll need to ask to evaluate whether an open-source VR engine is right for your project plans.
Talking to the many players in VR production, we find that all of them are trying to win a race to become the dominant VR platform.
But, without the open, shared and common standards that built the internet, how will we activate millions of developers and builders? Will the metaverse ever gain the open foundations for innovation that gave birth to the likes of Facebook, Apple and Google?
Developed by OSF to educate Producers and Directors in new digital production techniques, the Immersive Production Spectrum illustrates the full spectrum of computerised content production.
It starts with live-action (digital) capture on the left, moves through augmented reality production (real-time VFX) in the middle, and ends with fully computerised virtual production methods on the far right.
The Immersive Production Spectrum is not intended to pitch one technique over another. Instead, it acts as a loose roadmap to enable Producers and Directors to evaluate and employ new technologies within their content productions.