I am unsure about ILM (a quick Google search turned up nothing about ILM no longer making use of virtual sets), but it seems that at least other studios and productions have used the same kind of virtual sets: Netflix's The Midnight Sky and Jingle Jangle.
Various parties are interested in establishing a standard for virtual sets. The main issue right now is hardware: as anyone who has tried to purchase a new graphics card knows, that supply chain is severely bottlenecked.
As for traditional CPU rendering versus GPU rendering versus GPU realtime rendering: as far as I can see, the boundaries are slowly becoming fuzzier and fuzzier. I observe this in render engines like Eevee and Nvidia Omniverse, for example. It is a joy to render at a few seconds per frame, particularly for animations.
Traditional slow CPU-only render engines are becoming less and less relevant, in my opinion (for freelancers, architectural work, etc. - not so much yet in film CG). Cinema 4D's internal Physical render engine is lagging behind, and I expect the C4D devs will replace it with Redshift/RT in the next release or the one after. They must, to stay relevant in that area after ditching ProRender.