Importing mocap data into Marvelous Designer causes the character model to lose its usual T-pose, which makes designing garments very difficult. Here's a workaround for transitioning from the T-pose into your motion-capture animation.
Thanks to Travis Davids for making this tutorial and to fvckrender for spreading the knowledge in the first place.
File this under how machines learn, and why the auto-match in every colour-correction package you've ever used never works quite the way you wish it did. 'Welcoming AI & Machine Learning to Colour Grading' is a talk presented at BSC Expo 2018 by Dado Valentic of Colourlab.co, speaking here for Colour Intelligence.
It is especially interesting for anyone who wants to understand the basics of machine learning, and how we humans teach AI through passive learning, training the bots to recognise when a colour grade is good or bad.
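To make the idea concrete, here is a minimal, purely illustrative sketch of learning a "good grade / bad grade" preference from human ratings. Everything here is an assumption for the example: the two features (saturation and contrast, normalised 0..1), the toy ratings, and the use of plain logistic regression. It is not Colourlab's method, just the general shape of learning from passive human feedback.

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def train(samples, epochs=2000, lr=0.5):
    """Fit a tiny logistic-regression model to (features, label) pairs,
    where label 1 means a human rated the grade 'good', 0 means 'bad'."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = label - p  # gradient of the log-loss for this sample
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Hypothetical ratings: each grade reduced to (saturation, contrast),
# labelled good (1) or bad (0) by a human rater.
ratings = [((0.9, 0.8), 1), ((0.7, 0.6), 1),
           ((0.2, 0.1), 0), ((0.3, 0.4), 0)]
w, b = train(ratings)

# Score a new, unseen grade; values near 1 mean "good".
score = sigmoid(w[0] * 0.8 + w[1] * 0.9 + b)
```

Every "auto-match" button is, at heart, some model like this with far more features; the talk's point is that the training signal comes passively from humans grading as they normally would.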
Back in the 1990s, creating channel strategies for the likes of IBM, Dell and Cisco, I learned the importance of having a good channel strategy. Everyone in IT knew that…
Without The Channel, the IT industry would not have been able to rapidly serve the demands of a global market with physical products and services.
I firmly believe it's time for companies in the VR space to think about their channel strategy. Ask yourself: are you a manufacturer of VR hardware, a distributor of VR products, a software company, a value-added reseller, or a systems integrator?
Why the VR industry needs to re-build The Channel.
Developed by OSF to educate Producers and Directors in new digital production techniques, the Immersive Production Spectrum illustrates the full spectrum of computerised content production.
It starts with live-action (digital) capture on the left, moves through augmented reality production (real-time VFX) in the middle, and ends with fully computerised virtual production methods on the far right.
The Immersive Production Spectrum is not intended to favour one technique over another. Instead, it acts as a loose roadmap that helps Producers and Directors evaluate and employ new technologies within their content productions.
A sign of our times: Netflix has firmly stated that it will only use 4K cameras for its original productions, which has been a source of frustration for some cinematographers, especially those who use ARRI gear and work on Netflix projects.
Post-production pipelines must also meet Netflix Original Productions' 4K delivery requirements.
Regarding camera hardware, Netflix says: “The ARRI Alexa and Amira are fantastic cameras, and we stream plenty of content that was captured with these cameras. However, since these cameras do not have true 4K sensors, we cannot accept them for our 4K original productions.”
Netflix explains, “For those who pay a premium for our UHD 4K service, we only deliver content that was shot and delivered at a true UHD 4K resolution.”
So what counts as Ultra HD? 4K UHD (2160p) has a resolution of 3840 pixels × 2160 lines (8.3 megapixels, aspect ratio 16:9) and is one of the two ultra-high-definition television resolutions targeted at consumers, the other being 8K UHD at 7680 pixels × 4320 lines (33.2 megapixels).
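The figures quoted above are easy to verify yourself. This short snippet recomputes the pixel counts, megapixel figures, and aspect ratios for both UHD resolutions:

```python
from math import gcd

# Consumer UHD resolutions as (width in pixels, height in lines)
resolutions = {"4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    g = gcd(w, h)  # reduce width:height to the simplest aspect ratio
    print(f"{name}: {w} x {h} = {pixels / 1e6:.1f} megapixels, "
          f"aspect {w // g}:{h // g}")

# 4K UHD: 3840 x 2160 = 8.3 megapixels, aspect 16:9
# 8K UHD: 7680 x 4320 = 33.2 megapixels, aspect 16:9
```

Note that 4K UHD (3840 wide) is slightly narrower than DCI 4K (4096 wide), which is part of why "true 4K sensor" is a point of contention between Netflix and camera manufacturers.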
On-set Facilities releases a second test from its real-time compositing team.
This time the OSF Real-time VFX team tackles a classic dance scene from ‘The Artist’, testing the DCC element pipeline and real-time matte methods, pushed along with some fancy footwork. Filmed at OSF studios in Madrid. More to come on this soon; subscribe to our YouTube channel to know when new videos are released.