Everything posted by HappyPolygon

  1. Wow, I thought UI design was a PS thing... How long ago was that?
  2. I thought "offline" was the oposite of real-time. By "offline" I refer to any non-real-time render engine (buckets, scanlines etc...). The real-time engine doesn't have to produce the final render in real-time. The scene can be so heavy that it takes 1+ second per frame to render but the recording will be in any FPS set by the user. Like the Viewport Renderer in C4D. With this question I wanted to know if some C4D users were considering to use real-time render engines for their work or if they don't, why. That's why I narrowed the use cases to product advertisements and FUI/HUD graphics. I know that most C4D works (not all) don't actually require much realism (like TV idents or News and Sports animations, cartoon-ish animations and music videos)
  3. Due to recent advances in real-time graphics seen in showcases of Omniverse, MetaHumans, Ziva Dynamics and others, I wonder how much longer we should rely on countless hours of PB rendering. I was playing Spyro Reignited the other day, just look at this! Volumetric lighting and SSS! I didn't know SSS could even be faked! And the game is 3-4 years old. (I'm not a gaming person but I've always liked this game for the innovations it made back in 1998.) And the demo from U-Render is amazing (with very uplifting and inspiring music). So my question is this: Do we really need offline renderers as the default in motion graphics? C4D is mostly used for funky animations like product advertisements and FUI/HUD graphics... Do physically accurate refractions and shadows really matter in most of the artwork done with C4D? I don't think so. Unreal Engine is being used more and more in work other than game development for the fast results it provides. To my knowledge there isn't any material, light property or camera effect that can't be rendered in real time with a small sacrifice in accuracy. I wish U-Render were part of C4D. I'd like to hear your thoughts on this subject and how much of your work depends on very accurate light simulations. Of course someone could use the OpenGL viewport like this guy did, but it has limitations that AE and U-Render don't.
  4. I believe you have a good reason for using consecutive Insets and not just adjusting the Subdivisions parameter. This is my setup, and I believe it is what you asked for. As you can see, if you select the propagated Inset port on the left, you get a single Inset parameter on the right controlling all three Inset nodes. I renamed the port from the default Input to Inset via the Resource Editor, which you open by right-clicking the vertical gray band where the port sits (in case you didn't know how to do that). Your screenshot looks like it's from R23... Srek told me to refrain from messing with the Resource Editor in R23 as it was not complete, and it really was driving me mad when trying to arrange a lot of parameters in it. In this simple case you won't have any trouble, though. What I cannot explain is why I get a geometric-progression type of result and not a linear one like the Subdivisions parameter would give. Here's the scene in R25: insets.c4d
  5. You could make one node to promote the same value to multiple nodes. Then you can propagate the input of that promotion node and have it appear in the Resource Editor.
  6. Both. That's some heavy equipment you have, Cerbera. I guess it's not that uncommon for 3D artists to also have some music background.
  7. What music hardware do you use? I bought a cheap Alesis Q49 MK2 MIDI keyboard for 75€ in the Black Friday sales, just for the pressure keys and mod/pitch wheels. Before that I used a Yamaha PSR-36 (which goes for around 180€ from a quick glance at eBay), which I plugged into my laptop through a MIDI-to-USB cable I bought more than 12 years ago for about 15€. I also bought Magix Music Maker 2021 Ashampoo edition in the Black Friday sales for about 60€. It did hit a nerve that I had to pay an additional 10€ to enable VSTis in it, but it's still a bargain.
  8. 3D World magazine, issue 283, features a 2016 short animation called Planet Unknown from CG director Shawn Wang. Planet Unknown recently won Best Animated Short at the Burbank International Film Festival. Software: Cinema 4D, the TurbulenceFD plugin and Octane for C4D were heavily used for most of the tasks. Houdini was used to fracture things, then ZBrush for sculpting, Mari for texturing, After Effects for compositing, and Premiere for editing. Python and JavaScript were used for scripting in C4D, Mari and AE, which helped speed up the process a lot. It took around 4 months to build the 3D assets. Then Shawn moved on to animatic previews; each individual shot got clearer and the problems to solve became more specific. The rest of the time was mainly focused on building scenes, keyframe animation, FX simulation, rendering and compositing. Director biography: Shawn Wang is an animation director and 3D artist. Having a passion for CGI and storytelling, Shawn loves creating narrative visuals through all types of media. Credits: Written & Directed by: Shawn Wang Modelling, Texturing, Animation, Compositing & Editing: Shawn Wang Music & Sound Design by: Echoic Audio Composer: Sam Foster Sound Design by: Tom Gilbert, David Johnston Special Thanks to: Evolutions Re-recording Mixer: Will Norie Executive Producer: Xinyuan Huang Faculty Adviser: Yucheng Huang Special Thanks to System Advisers: Sicong Wang, Jiawei Cao, Horizon Bian Website http://planetunknownfilm.com Shawn Wang http://www.shawnwangvfx.com Echoic Audio http://www.echoicaudio.com
  9. Hmmm... that looks like a procedure the Subdivision node was not meant to do, and it has to do with the priority of execution in the node tree... I would suggest something involving planes to cut your object horizontally and vertically, but there is no geometry Bool node... If your case involves only the construction of a plane, then that is definitely possible at polygon level with the use of Distributors and some easy math.
  10. I think the eyes are correct this time; now the mouth is in the uncanny valley. I think lighting is the problem: it's so soft that the shadows from her lips shouldn't give her teeth so much depth. Shadows and a wrong SSS value can ruin the inside of the mouth. From the outside, the problem was always the folds between the cheeks and the corners of the lips; that gives an anti-aging, very plastic look to the cheeks.
  11. Read more here https://blog.unity.com/technology/welcome-ziva-dynamics
  12. Can't this be done with classic bone rigging?
  13. I have a simple solution. The key concept is "discrete". If your model moves in discrete distances, your shadows will also move in discrete distances. If you have a moving light source, then that light source should move in discrete distances too. We essentially convert the animation to a stop-motion format without dealing with frame rate. This leads to other problems though... I know there is a way to make a voxelated object that moves discretely in space using the Cloner (if you don't know how to do this, let me know). What I don't know is how to make the light source move discretely in space (see the Python sketch after this list for one possible way to quantize a position). If you do want that and don't know how to do it, we can summon an expert on this. One more problem you should solve is positioning your light source so the shadow falls exactly parallel to your wall without spreading (I think the Parallel Spot is what you are looking for) - you may want to switch to Hard Shadows. Maybe make the shadows more transparent. You can duplicate the light and position it a bit next to the first one (it is up to you to find the exact distance so it coincides with the voxel size) with a different shadow opacity to make a penumbra effect.
  14. If I understand correctly, your problem is not the pixelated shadow itself but the discrete way it is supposed to be projected on the wall. Am I right?
  15. Isn't it the same if you render at a low resolution and use video editing software to resize the video? (That could speed up the rendering process.)
  16. Animcraft 2.1 is available for 64-bit Windows 10+; Linux and macOS versions are coming "soon". The integration plugins support 3ds Max 2016+, Blender 2.81-2.83, Cinema 4D R20-R22, Maya 2016+, Unity 2017+ and Unreal Engine 4.23+. The software is rental-only and costs $199/year per node-locked licence or $259/year per floating licence. Read more here: CGchannel
  17. Lead graphic vendor on this highly anticipated fourth installment of The Matrix franchise, Studio C, the creative offering of Compuhire Ltd, created the visual language, graphic UI and content for on-set playback and post delivery. The Matrix Resurrections spanned two years, including a Covid-related shutdown, and was Studio C's biggest project to date. The main tools in the motion design pipeline included Cinema 4D, Redshift, X-Particles, Adobe After Effects and Creative Suite, plus Unity, Blender and Substance Painter. Rhino and SketchUp were also used for refining and converting CAD elements supplied by the Art Department. Read more here: https://cgsociety.org/news/article/5289/studio-c-reimagines-the-graphic-world-of-the-matrix-resurrections?fbclid=IwAR1KUEz5A5galbVrvF6shHbC2FKV1yB9rpwfD8jyrt4ESK-Qb25VfUJKc00
  18. Slowdown: adjust the F-curves in the Timeline window.
  19. Chaos and Enscape are to merge. The private equity-led deal unites the developers of three of the most popular renderers in the arch viz market: V-Ray, Corona and Enscape. All of both firms' existing products remain in development. Find more details here: https://cgsociety.org/news/article/5288/chaos-and-enscape-to-merge-backed-by-ta-associates-and-lea-partners?fbclid=IwAR2Y8pzevTWTqdflio3UJ4x6w_rrOVAkHbRyxNcgFAn0NxuFF-GCJCLVQ6g http://www.cgchannel.com/2022/01/chaos-and-enscape-to-merge/?fbclid=IwAR2ObBIYMPhxQZp7Y1BMj6xjF10DdQf7VoGy7bdU9JV7k2bh7Zeyc2fR7Io
  20. For a moment I thought the rocks were on top of each other, but they intersect 😒 I recently found this, but it's not done with nodes: https://www.facebook.com/marcio.lpinheiro.71/videos/1232222323938528/?__cft__[0]=AZUTAC2bsPLSwKZk76bAJ38XS1L1gEThbOovax-LgID3n5ULE6ddqofjnZd9f5wV_Uo_DzlsXUGCSrdydAxCqlxiAXFRS_HfO4VD53cEnqD0R7PAYVC9eDKlBHZnQKu7k9PpIs-XCNQh8q0iqLbLTF0k1O2YD64WImV2AlT4-0VH3Q&__tn__=%2CO-R
  21. https://support.maxon.net/hc/en-us/categories/4405723856402?fbclid=IwAR2EZkoCj79-qdIXqSFrcEdzELx6RHiZaaI8_E-bZXtLsxoOnd6FBYhTDZk
  22. I think this type of motion is relatively easy with XPresso.
  23. Had to ask, because concept art meant to serve as a 3D modelling blueprint is usually drawn in top view, side view and maybe even front view, for faster and more accurate modelling.
  24. The video explains what file format and material format Omniverse uses.
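A minimal sketch of the "discrete light" idea from post 13 above, assuming a Python Tag placed on the light object in Cinema 4D. The VOXEL_SIZE value is a hypothetical placeholder you would match to the voxel spacing of your Cloner setup; this is not a workflow confirmed in the thread, just one possible way to quantize a position.

```python
# Sketch for post 13: snap the light's position to a fixed grid so its shadow
# "steps" instead of sliding. Assumes this code lives in a Python Tag attached
# to the light; VOXEL_SIZE is an assumed grid spacing you'd tune to your scene.
import c4d

VOXEL_SIZE = 10.0  # assumed grid spacing in scene units


def snap(value, step):
    """Quantize a single coordinate to the nearest multiple of step."""
    return round(value / step) * step


def main():
    obj = op.GetObject()        # the light object the tag is attached to
    pos = obj.GetAbsPos()       # current (possibly keyframed) position
    obj.SetAbsPos(c4d.Vector(snap(pos.x, VOXEL_SIZE),
                             snap(pos.y, VOXEL_SIZE),
                             snap(pos.z, VOXEL_SIZE)))
```

The tag overrides the animated position on every evaluation, so the light (and therefore its shadow) only ever lands on multiples of VOXEL_SIZE, matching the stop-motion feel of the voxelated model.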