Tons of studios are now using Unreal for final rendering, including Disney, and it has shown up in several blockbuster movies.
The fantastic thing about Unreal is that you can do real-time rendering on-set (e.g. for directorial choices/actor feedback) and then upscale it in post-production, with cost being the only ceiling. Unreal in the TV/movie industry is already huge and only getting bigger, year on year.
You've definitely seen a TV or Movie that used Unreal.
Which Disney films use Unreal for final render? Disney has two separate path tracing renderers that are in active development and aren’t in danger of being replaced by Unreal.
These renderers are comparable in use case & audience to MoonRay, which is why I don’t think you’re correct that MoonRay needs external contribution to survive.
“Used Unreal” for on-set rendering is hand-wavy and not what you claimed. Final render is the goalpost.
Hey that’s pretty cool! Thanks for the link, it’s helpful to see the shots in question. Am I understanding correctly that the K droid was rendered from behind using Unreal in those shots, and the front shots were rendered with the in-house renderer? If true, I’d love to hear what the reasons were for not being able to use it on all the shots in the sequence. Are there more recent examples? Is Unreal still being tested like this at ILM, or is the focus on the in-house real time renderer?
BTW I’m hugely in favor of pushing real-time rendering for film (and I work on high-performance rendering tech, aiming squarely at film + real-time!) I was only disputing the broad characterization by @Someone1234 that Unreal is widely used today for final render, and that film studio path tracers are in imminent danger of death by game engine.
So, it's been a while, but I'll try to add clarification to the best of my memory.
So, in this sequence, I think it is the case that K2-SO was only rendered from behind.
IIRC, the reasons for not using it on more shots, and specifically the front shots in the sequence, were two-fold. Primarily, we only had one TD/lighting artist trained in our pipeline using Unreal, which was still a little clunky to fit into our pipeline, so we were time limited. Second, K2-SO was not rendered from the front in close-ups due to the complexity of his eyes. (Some details from Naty later in the talk.) Specifically, K2's eyes require fully lit transparencies with the full lighting model. At the time, at least, Unreal only supported its full lighting feature set in the deferred renderer; the forward renderer, used for transparencies, had a vastly simplified lighting model which wasn't able to fully capture the effect of K2's eyes. We were building a capable forward renderer inside Unreal internally, but it wasn't finished in time for Rogue One.
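For anyone unfamiliar with why transparency is the sticking point: a deferred G-buffer stores exactly one surface's attributes per pixel, so a stack of transparent layers (like lit glass eyes) has to be shaded layer-by-layer in a forward pass instead. Here's a toy sketch of the difference (hypothetical code, nothing to do with Unreal's or ILM's actual pipeline):

```python
def deferred_shade(gbuffer_pixel, light):
    # Deferred: the geometry pass wrote ONE surface's data into the
    # G-buffer; the lighting pass can only ever see that one surface.
    albedo, n_dot_l = gbuffer_pixel
    return albedo * n_dot_l * light

def forward_shade_layers(layers, light):
    # Forward: each transparent layer is lit as it's drawn, then
    # alpha-blended back-to-front, so every layer gets lighting
    # (even if that lighting model is a simplified one).
    color = 0.0
    for albedo, n_dot_l, alpha in layers:  # back-to-front order
        lit = albedo * n_dot_l * light
        color = lit * alpha + color * (1.0 - alpha)
    return color

# Two stacked transparent layers over the same pixel: (albedo, N·L, alpha)
layers = [(0.8, 1.0, 0.5), (0.2, 0.5, 0.5)]

# Deferred can only keep one of them in its G-buffer...
front_only = deferred_shade((0.2, 0.5), light=1.0)
# ...while forward lighting shades and blends both.
both = forward_shade_layers(layers, light=1.0)
```

The point is structural, not about any particular shading math: once the G-buffer has collapsed the pixel to one surface, the information needed to light the layers behind it is gone.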
As an aside, we had a parallel internal renderer we were building for use on Rogue One, which even at the time had advantages, but Unreal was chosen for what I saw as political reasons.
I don't know of more recent examples, as I'm not involved in this project anymore. I know they used Unreal for Season 1 of The Mandalorian, but moved to their internal real-time renderer for Season 2. The internal renderer has a few advantages: not having to deal with the complexity of merging significant changes (the forward renderer, for example) into Epic's engine was one major one, but my understanding is that the major win is simply being able to build a renderer that integrates much better into their existing pipeline. Unreal's renderer is pretty strongly integrated into the rest of the engine, and the engine itself is very opinionated about how content is managed. And as you can imagine, ILM has opinions of its own, going back 30+ years.
I agree with your dispute of the broad characterization, but thought the counter-example would be illustrative.
BTW, I'm starting a new project investigating real-time rendering for film, and I'm always interested in new perspectives; hit me up if you want to chat real-time rendering sometime.
Yes, the example is very illustrative, thanks again for posting it, and thanks for the context here! This history is fun to read. I was partly curious whether texture sampling is still one of the reasons for avoiding real-time tech. Back when I was in film production at PDI two decades ago, texture sampling was near the top of the list. It still seems true today that games routinely tolerate (suffer from) texture aliasing and sizzling, while film sups will not tolerate it at all, ever. High-quality texture sampling was, at the time, one of the main reasons offline renders took so long. I remember being blown away by how sensitive the lighting sup was to the differences between texture filters (Lanczos, sinc, Blackman, Gauss, etc.), and how quickly he could see them. Today maybe it's more often about how many samples are used for path tracing.
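For the curious, the filters named above are all reconstruction kernels; Lanczos is the classic "windowed sinc". A toy sketch (illustrative only, not PDI's or any production implementation) of the kernel and a 1D resample with it:

```python
import math

def lanczos(x, a=3):
    # Lanczos kernel: sinc(x) windowed by sinc(x/a), zero for |x| >= a.
    # Kernels like this are what keep high-frequency texture detail from
    # aliasing ("sizzling") the way cheap box/nearest filtering does.
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_point(samples, center, a=3):
    # Reconstruct a value at a fractional texel position by weighting
    # nearby texels with the kernel, normalizing the weights.
    lo = max(0, int(math.floor(center)) - a + 1)
    hi = min(len(samples) - 1, int(math.floor(center)) + a)
    num = den = 0.0
    for i in range(lo, hi + 1):
        w = lanczos(center - i, a)
        num += w * samples[i]
        den += w
    return num / den

texels = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0]
# Sampling between the two bright texels overshoots slightly above 1.0:
# the negative kernel lobes produce the ringing Lanczos is known for.
value = resample_point(texels, 2.5)
```

The differences a lighting sup would squint at live in exactly those lobes: how wide the window is and how much negative weight it carries trades sharpness against ringing and aliasing.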
The K-2SO droid character in Rogue One (voiced by Alan Tudyk) was, in fact, rendered in real time using Unreal, then composited into shots afterwards.
John Knoll from ILM gave a talk at GDC 2017 about it.
The catch, though, is that ILM took the Unreal Engine source code and modified it extensively in order to render K-2SO as he appeared in the film. It's not like they just downloaded it from Epic and ran with it.
Yup. The state of the art for real-time rendering just isn't there yet for hero work. Even ILM's custom Helios renderer is only used for environments and environment-based lighting, as far as I've read. Assets, fx shots, and characters are still rendered offline.
Even with real-time rendering for environments, I'm sure there's plenty of post-processing "Nuke magic" to make it camera-ready. It's not like they're shooting UE straight to "film".
I have seen reports of Unreal Engine being used quite successfully for pre-viz, shot planning, animatics, etc., though.
Unreal is used in TV quite often, yes. But no major studios use it for theatrical releases, and I'm not aware of any that plan to. (Partner is in the industry.)
Why do you think this? Nobody in film or VFX is using Unreal for final rendering; Unreal is built for games, not offline path tracing.