We work with many vendors on projects, so we thought it a good idea to share some of our learning to make everyone's life easier. This video focuses on creating content assets in your favourite DCC software, exporting them as FBX files, and considerations when designing content in DCCs for VR engines. You won't be surprised: small is beautiful, fast is best. File this under the art of code.
Design and Produce : Taehoon Park
Character design and Modeling : Hyunsup Ahan
Synopsis and Edit : Jihoon Roh
Music and SFX : Echoic
Cinema4d, Octane, Aftereffects, Photoshop
In the future, mankind has vastly extended its lifespan through highly advanced technology, and the only way left to meet death is a form of euthanasia. 'DREAVELER', a death-management company that has researched a variety of euthanasia methods, has invented a brand-new system that lets people travel through their dreams and memories during REM sleep, before the final sleep. The system, also named 'DREAVELER' (a compound of Dream and Travel), focuses on soothing people who have grown weary of a desolate future and are filled with nostalgia for the vanished nature of the past.
Facebook Unveils Two New Volumetric Video ‘Surround360’ Cameras, Coming Later this Year. I’m going off to figure out why they qualify as ‘volumetric’ – I’ll be right back.
OK, I got the answer, they are volumetric, here’s why.
Facebook today announced two new Surround360 cameras. These hardware initiatives are poised to make Facebook 360 videos more immersive. And they qualify as volumetric because they can see depth!
Unveiled at the company's yearly developer conference, F8, the so-called x24 and x6 cameras are said to capture 360 video with added "depth information", giving captured video six degrees of freedom (6DoF).
This means you can not only rotate your view (pitch, yaw, and roll) as before, but also move your vantage point up/down, left/right, and forwards/backward while inside a 360 video.
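To make the distinction concrete, here is a minimal sketch (the class and function names are ours, purely for illustration, not anything from Facebook's SDK): a conventional 360 video only responds to the three rotational degrees of freedom, while a volumetric capture also responds to the three translational ones.

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    pitch: float  # look up/down (degrees)
    yaw: float    # look left/right
    roll: float   # tilt head

@dataclass
class Pose6DoF(Pose3DoF):
    x: float = 0.0  # step left/right (metres)
    y: float = 0.0  # move up/down
    z: float = 0.0  # step forwards/backward

def usable_dof(pose) -> int:
    # A 3DoF player simply ignores head translation;
    # a 6DoF player can re-render the scene from the new vantage point.
    return 6 if isinstance(pose, Pose6DoF) else 3

print(usable_dof(Pose3DoF(0, 90, 0)))         # 3
print(usable_dof(Pose6DoF(0, 90, 0, z=0.2)))  # 6
```

The depth information the x24 and x6 record is what lets a player honour those extra three translation axes instead of discarding them.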
Even the best stereoscopic 360 video cannot let you move backward and forward, so the idea of a small, robust camera that can record volumetric data is exciting, especially inside the positionally tracked immersive worlds of the Oculus Rift, HTC Vive, or PSVR.
We have had an amazing response to our team's work in real-time compositing and live on-set VFX.
In this post, I will try to tell you more about how we build our real-time VFX set-ups. If you have any questions, please use our contact form and one of the team will get right back to you. [UPDATE: We have just set up our Facebook page; like it to keep up with new developments.]
On-set Facilities' real-time VFX systems are built by OSF teams in the UK and Madrid, using virtual 360 3D environments, camera tracking, hardware compositing tools, color correction software, and game engines.
Various manufacturers provide the vital technology (listed below), and OSF is now working closely with developers and manufacturers to push the boundaries between CGI, VFX, 3D, mixed reality, film, and games.
“So far we have created a great tool for pre-visualisation and real-time VFX. It allows us to record live-action foregrounds in real time (with or without live-action sets, using real props and actors) and digitally bake them within HD, 2K, 360º, and 3D virtual worlds.”
The camera tracking information is captured along with zoom and focus data in FBX; it can then be passed directly to NUKE for on-set compositing or saved to be manipulated later.
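As a rough sketch of what handing tracked camera data to NUKE can look like (this is our illustration, not OSF's actual tooling), one simple route that avoids the FBX SDK entirely is to write the per-frame camera samples as a .chan file, a plain-text format NUKE's Camera node can import: one line per frame, with frame number, translation, and rotation columns. The sample data below is invented.

```python
def write_chan(path, samples):
    """Write camera samples as a NUKE-style .chan file.

    samples: list of dicts with keys frame, tx/ty/tz (translation),
    rx/ry/rz (rotation in degrees) and vfov (vertical field of view).
    """
    with open(path, "w") as f:
        for s in samples:
            f.write("{frame}\t{tx:.4f}\t{ty:.4f}\t{tz:.4f}\t"
                    "{rx:.4f}\t{ry:.4f}\t{rz:.4f}\t{vfov:.4f}\n".format(**s))

# Two invented frames of a tracked camera move:
samples = [
    dict(frame=1, tx=0.0, ty=1.7, tz=5.0, rx=0.0, ry=0.0,  rz=0.0, vfov=30.0),
    dict(frame=2, tx=0.1, ty=1.7, tz=4.9, rx=0.0, ry=-1.5, rz=0.0, vfov=30.0),
]
write_chan("tracked_camera.chan", samples)
```

Zoom and focus curves recorded alongside the track would be carried separately (or baked into the FOV column, as sketched here); for full lens metadata the FBX route the post describes is the richer option.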
Here is a list of the technologies you’ll need to explore if you want to build your own real-time compositing set-up and add live VFX to your next production.
We bake in UE4, we comp in NUKE, and we create 3D environments, animated characters, and elements in 3D Studio Max. Some of the hardware is OSF's own, as are the configuration tools and methods. You can always talk to us if you need us.
This is another test with a scene made by Sungwoo Lee. It was rendered with Octane in Unity and then baked into UE4. We are all now waiting for Octane Render for UE4 and expect a release in the near future. Thanks to Sungwoo for this test.