In broad terms, Omniverse is a suite of freemium applications (free for personal use, paid as an Enterprise version for companies). There are applications for developing your own apps and games based on OV technology, apps for viewing archviz scenes, and apps for building scenes. A launcher lets you install whatever you need.
Omniverse has its own custom real-time RTX ray tracer and a fast path tracer, as well as its own physics system for rigid and soft bodies. These are more or less the key features you'd use when you want to build and render scenes in OV.
Omniverse requires an NVIDIA RTX GPU to run. All apps are based completely on USD; Omniverse apps do not accept anything BUT USD. In the background, the apps use a database core called Nucleus. This is essentially a localhost server that you can set up just for your own machine, or you can set up a true server to share with teammates online. There's also an app called OV Drive for quickly creating a virtual hard drive in Windows Explorer that is linked to the Nucleus core.
Software companies can develop connectors (= plugins / live connections) for OV. With these plugins, you export your entire scene or individual assets from the scene as USD files to the Nucleus core, e.g. by saving them to the virtual Omniverse drive. Then you import those USD assets from the Nucleus folder structure into your OV scene.
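To give you a feel for what a connector actually writes out: USD has a human-readable text format (.usda), and an exported asset is just a layer like the one below. This is a hand-written toy example (prim names are made up), not real connector output:

```usda
#usda 1.0
(
    defaultPrim = "Asset"
    upAxis = "Y"
    metersPerUnit = 0.01
)

def Xform "Asset" (
    kind = "component"
)
{
    def Sphere "Geom"
    {
        double radius = 50
    }
}
```

Your OV scene then references this file from the Nucleus folder structure, so every re-export overwrites the layer in place and the reference picks up the new content.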
Once exported, you can continue editing the scene in your DCC application, and every change you make, be it modelling, lighting, materials, adding new objects (...you name it), triggers a re-export of the scene as USD (overwriting the existing USD asset) to Nucleus in the background. The Nucleus core keeps track of only the differences since the last export and streams them to the OV scene, so the assets update in near real-time in OV. If you are using a Nucleus server that is online, one team member working from home can edit an object in Maya, another team member somewhere else does something in C4D, and a third person textures something in Substance Painter. With the connectors from these apps, all team members can see what the others are doing live on their own screen through the scene in OV.
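Nucleus's actual delta protocol is proprietary, but the core idea of "stream only the differences since the last export" is easy to sketch. Here's a toy Python illustration, assuming we simply split a file into fixed-size chunks and compare chunk hashes (this is my own simplification, not how Nucleus really works):

```python
import hashlib

CHUNK = 64 * 1024  # compare files in 64 KiB chunks


def chunk_hashes(data: bytes) -> list[str]:
    """Split a byte blob into fixed-size chunks and hash each one."""
    return [
        hashlib.sha256(data[i:i + CHUNK]).hexdigest()
        for i in range(0, len(data), CHUNK)
    ]


def changed_chunks(old: bytes, new: bytes) -> list[int]:
    """Return the indices of chunks that differ between two exports."""
    old_h, new_h = chunk_hashes(old), chunk_hashes(new)
    # Compare up to the longer file; a chunk missing on one side counts as changed.
    n = max(len(old_h), len(new_h))
    return [
        i for i in range(n)
        if i >= len(old_h) or i >= len(new_h) or old_h[i] != new_h[i]
    ]


# Simulate two exports of the "same" USD layer where only the middle changed.
export_v1 = b"A" * (3 * CHUNK)
export_v2 = b"A" * CHUNK + b"B" * CHUNK + b"A" * CHUNK

print(changed_chunks(export_v1, export_v2))  # only chunk 1 differs -> [1]
```

Only chunk 1 would need to be sent over the wire, which is why edits show up in OV almost instantly even for large scene files.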
Because OV is based on USD, it is compatible with Hydra, USD's rendering architecture: any render engine that provides a Hydra render delegate (essentially a bridge between the scene data and the renderer) can be plugged into OV and used for rendering the scene instead of OV's default path tracer.
The main idea behind OV is thus collaboration across multiple apps in real time, where OV can serve as both a bridge and an endpoint for rendering the final scene. Because everything is exchanged as baked, exported USD assets, there are no incompatibilities between apps. USD supports MaterialX, so I suppose we'll see more MaterialX support across render engines in the future, in addition to each engine's custom materials.
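If you've never seen MaterialX, it's an open XML-based format for describing material networks. A minimal hand-written file looks like this (node and input names follow the MaterialX standard library; the material names are made up for illustration):

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- A standard_surface shader node with a red base color -->
  <standard_surface name="red_srf" type="surfaceshader">
    <input name="base_color" type="color3" value="0.8, 0.1, 0.1" />
    <input name="specular_roughness" type="float" value="0.35" />
  </standard_surface>
  <!-- The material binds the shader to its surface slot -->
  <surfacematerial name="red_mtl" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="red_srf" />
  </surfacematerial>
</materialx>
```

Because the format is renderer-agnostic, any engine that understands these standard nodes can reproduce the same look, which is exactly the kind of portability the USD + OV pipeline is built around.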
Now read the initial post again and it should make more sense 🙂
Edit: German press release from Maxon if you want to know more about C4D / Redshift and OV: Team-basierte Zusammenarbeit für Maxon-Produkte jetzt mit NVIDIA… (Team-based collaboration for Maxon products now with NVIDIA…)