There are some flaws in Lumen and Nanite based on videos I have seen. The first is that Lumen has a hard time consistently converging all the GI light rays in real time. There is a shifting blotchiness which you would notice should the scene contain smooth surfaces. I would imagine that to get the performance you need, the error threshold is set higher than you would use in a standard render engine. That is why all the demo scenes use rocks and rough surfaces, where the subtle shifting of brightness and color is not as noticeable. If smooth surfaces are used, they will probably not be as brightly lit. Another problem is with thin objects, especially leaves. If the camera is up close, all looks good, but at a distance the leaves just disappear. That one I cannot explain, nor could the demo operator in the video I was watching.
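Whatever Lumen actually does internally, the blotchiness is consistent with basic Monte Carlo behavior: a small per-frame sample budget means the GI estimate has high variance, and that variance shows up as shifting noise from frame to frame. A toy sketch (not Lumen code, just the general principle, with a made-up smooth function standing in for incoming light):

```python
import random
import statistics

def mc_estimate(n_samples, rng):
    # Toy "irradiance" integral: average of a smooth function over [0, 1].
    # Stand-in for integrating incoming light over a hemisphere.
    # True value of the integral of x^2 over [0, 1] is 1/3.
    return sum(rng.random() ** 2 for _ in range(n_samples)) / n_samples

rng = random.Random(42)

# Re-run the estimator many times at two sample budgets and compare spread.
low  = [mc_estimate(8,   rng) for _ in range(200)]   # real-time-ish budget
high = [mc_estimate(512, rng) for _ in range(200)]   # offline-ish budget

print(statistics.stdev(low))   # large spread -> visible frame-to-frame shift
print(statistics.stdev(high))  # small spread -> stable image
```

The spread at the low budget is roughly eight times that of the high budget (standard error shrinks with the square root of the sample count), which is why a real-time engine either tolerates visible noise or leans on rough surfaces and temporal accumulation to hide it.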
Nevertheless, Unreal 5 remains hugely impressive. It is just not yet the be-all and end-all of real-time, hyper-realistic rendering. But they are close. I also have to admire that all the demos are run on a PS5. I mean, that alone is impressive.
Now, I hear two things when it comes to real-time 3D being used in Stagecraft for The Mandalorian. I hear it is based on Unreal Engine, but that it also uses ILM's Helios render engine. I am pretty certain they are not running the 70-foot-diameter LED "volumes" of Stagecraft on a PS5, so the partnership between ILM and Unreal has probably developed the be-all and end-all of real-time rendering. It probably requires a render farm of 1000 NVIDIA RTX A8000s to make it all work - especially if they want to use it for more than TV and make large-format movies to be shown in IMAX theaters. If anyone has the specs on Stagecraft 2.0, please share.
Interesting fact: The partnership between ILM and Unreal was probably very easy to form, given that Epic's CTO is Kim Libreri, a former ILM VFX supervisor who left for Epic in 2015. I met Kim during Siggraph 2009 and we actually had a nice chat about fluid simulations. Quiet guy with a searing intelligence.
Dave