Unreal Engine Real Time Digital Character Animation

File under creating content for AI advertising platforms.

To meet the demands of AI-driven advertising platforms for monster.com, The Mill turned to real-time production methods. [AI advertising platforms use deep learning to optimise ad creative placement, and they require a large number of creative options to get going.]
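To make that bracketed note concrete, here is a minimal, hypothetical sketch in Python (not the actual system behind any of these platforms) of the kind of explore/exploit loop, an epsilon-greedy bandit, that such a platform might run over a batch of 35 creatives. All names and numbers here are illustrative assumptions.

```python
import random

def epsilon_greedy(n_creatives, clicks, shows, epsilon=0.1):
    """Pick which ad creative to serve next.

    clicks[i] / shows[i] estimates creative i's click-through rate;
    with probability epsilon we explore a random creative instead.
    """
    if random.random() < epsilon or not any(shows):
        return random.randrange(n_creatives)
    rates = [c / s if s else 0.0 for c, s in zip(clicks, shows)]
    return max(range(n_creatives), key=rates.__getitem__)

# Simulated campaign: 35 creatives with hidden click-through rates.
random.seed(1)
true_ctr = [random.uniform(0.01, 0.05) for _ in range(35)]
clicks, shows = [0] * 35, [0] * 35
for _ in range(10_000):
    i = epsilon_greedy(35, clicks, shows)
    shows[i] += 1
    clicks[i] += random.random() < true_ctr[i]
# Over time, impressions shift toward creatives with higher observed CTR.
```

The point of the sketch is the platform's appetite: with only a handful of creatives the loop has little to explore, which is why a batch of 35 assets in a day is such a useful input.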

Real-time rendered characters

How to produce 35 ready-to-fly creative ads in one day.

The team used real-time production methods, turning to Unreal Engine to create content en masse, in this case around 35 creative assets. With broadcast-quality real-time rendering and capture required, real-time production was the only answer. Adopting real-time production methods gave the editors a mass of rushes to take away and start cutting right away, all delivered back to the agency and on to those hungry AI-driven advertising platforms.

Best Practice for Live 360 Video Production

Imeve CEO Devon Copley recently gave this keynote presentation on Best Practices for Live 360 Video, as part of the IVRPA 2018 conference in Japan. In this video, Devon shares hard-won, real-world tips on how to plan and run a successful live virtual reality production. He covers issues like camera choice, camera position, client management, encoding options, and much more. Whether you’re new to live 360 or have dozens of productions under your belt, there’s something here among these best practices for you. Check out the video below:

Real-Time Ray Tracing in Unreal Engine 4


Epic Games, in collaboration with NVIDIA and ILMxLAB, today gave the first public demonstration of real-time ray tracing in Unreal Engine. Real-time ray tracing is considered to be a holy grail for those creating high-end cinematic imagery, and one that signifies a leap forward in the convergence of film and games.

Epic Games Demonstrates Real-Time Ray Tracing in Unreal Engine 4 with ILMxLAB and NVIDIA

During yesterday’s “State of Unreal” opening session at the Game Developers Conference (GDC), the three companies presented an experimental cinematic demo using Star Wars characters from The Force Awakens and The Last Jedi built with Unreal Engine 4. The demonstration is powered by NVIDIA’s RTX technology for Volta GPUs, available via Microsoft’s DirectX Ray Tracing API (DXR). An iPad running ARKit is used as a virtual camera to draw focus to fine details in up-close views.


Epic built the computer-generated (CG) scene using assets from Lucasfilm’s Star Wars: The Last Jedi featuring Captain Phasma, clad in her distinctive armor of salvaged chromium, and two stormtroopers who run into her on an elevator on the First Order ship. In the tech demo, lighting is moved around the scene interactively, as the ray-traced effects including shadows and photorealistic reflections render in real time. The stunning image quality of highly reflective surfaces and soft shadows has never before been achieved at such a high level of image fidelity in Unreal Engine.

Next-generation rendering features shown in today’s demo include:

Textured area lights
Ray-traced area light shadows
Ray-traced reflections
Ray-traced ambient occlusion
Cinematic depth of field (DOF)
NVIDIA GameWorks ray tracing denoising
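As a rough illustration of what "ray-traced shadows" in the list above means at the algorithmic level, here is a toy Python sketch (bearing no relation to Unreal's actual DXR-based implementation) of the core query: cast a shadow ray from a surface point toward the light and check whether any geometry blocks it.

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t, or None.

    direction is assumed to be unit length, so the quadratic's
    leading coefficient is 1.
    """
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def in_shadow(point, light, spheres):
    """Ray-traced hard shadow: trace a ray from point toward the light
    and report whether any sphere intersects it before the light."""
    d = sub(light, point)
    dist = math.sqrt(dot(d, d))
    d = tuple(x / dist for x in d)
    return any(
        (t := hit_sphere(point, d, c, r)) is not None and t < dist
        for c, r in spheres
    )

# One sphere sits between the first point and the light, blocking it.
spheres = [((0.0, 0.0, 5.0), 1.0)]
print(in_shadow((0.0, 0.0, 10.0), (0.0, 0.0, -10.0), spheres))  # True
print(in_shadow((5.0, 0.0, 10.0), (5.0, 0.0, -10.0), spheres))  # False
```

Because the occlusion test follows actual light paths rather than a shadow-map lookup, the result is exact per ray; soft shadows and reflections come from firing many such rays per pixel, which is what makes real-time performance so hard.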

“Ray tracing is a rendering process typically only associated with high-end offline renderers and hours and hours of computer processing time,” said Epic Games Founder and CEO Tim Sweeney. “Film-quality ray tracing in real time is an Unreal Engine first. This is an exciting new development for the media and entertainment linear content worlds—and any markets that require photorealistic visualization.”

“At ILMxLAB, our mission is to create real-time rendered immersive experiences that let audiences step into our stories and connect with cinematic universes that look and feel as real as the ones on the movie screen. With the real-time ray-tracing technology that Epic and NVIDIA are pioneering, we are a pivotal step closer to that goal,” says Mohen Leo, ILMxLAB Director of Content and Platform Strategy.

Epic Games worked closely with NVIDIA to support the NVIDIA RTX technology available through the DXR API. Running on an NVIDIA DGX Station, the demo was brought to life via a collaboration between Epic’s dedicated graphics and engine team, NVIDIA’s world-class ray tracing experts and the technical ingenuity and creative artistry of ILMxLAB.

Google buys Lytro for next to nothing.

Insiders at Google and Lytro are saying that the sale price of Lytro could be as little as $25m. Google is in talks with Lytro over its assets, which include some 59 patents in light-field and optical technologies.


Too much too soon for this light-field pioneer.

It's a case of too much too soon: the once-shiny Lytro took to the front of the light-field development scene, but then Lytro's consumer product flopped and expensive cracks started to appear. Not to say the company was wrong; more a case of too soon, for a market that was simply not ready to support such ideas.

Typical. Think back to the start of the web: have you even heard of CompuServe? No? Well, history seems to be repeating itself as early market entrants get overtaken, sold off, or worse, simply fade out of business. [CompuServe was the first major commercial online service provider in the United States. It dominated the field during the 1980s and remained a major influence through the mid-1990s.] Some of the companies at the front of the light-field, volumetric, 360, and VR race seem to be running out of fuel, while others, in some cases not pushing the envelope as far, are seeing a rush of funding.

The world's not a fair place, and the tech world is downright savage. The sale of Lytro to Google is the start of a mopping-up wave that will see bigger companies harvesting the IP assets of these early pioneers. We feel quite sad, as Lytro was a true pioneer of what is today becoming the vital component of virtual experiences: the ability to see light and depth.

What will Google do with the Lytro technology? For the cost of the purchase, Google will probably just fold the IP into its own hardware developments; the learning alone is worth the price, which could be as low as $25M. Even if the price of Lytro were double that, the hard work the Lytro team has put in over the past few years amounts to a huge shortcut for the Google teams.

Realtime VFX in Unreal Engine 4 Tutorial

Unreal Engine's real-time volumetric rendering tools can create seamless special effects that integrate perfectly with the action.

Realtime VFX using Unreal Engine (UE4) is pushing the boundaries of on-set realtime production. Here at On-set Facilities we build realtime VFX machines, sets, and realtime production solutions optimised for Unreal Engine. Tutorial video after the jump:

Continue reading “Realtime VFX in Unreal Engine 4 Tutorial”