
Mash

Registered Member
  • Posts

    652
  • Joined

  • Last visited

  • Days Won

    47

Everything posted by Mash

  1. Regarding Redshift CPU/GPU: C4D standalone now includes Redshift on your CPU. Just select it as the render engine and the interface will switch to Redshift lights, cameras and materials. If you have Maxon One or a Redshift subscription, C4D will continue to use the GPU in your system, so you haven't lost anything.

     As far as speed goes, currently expect your GPU to be roughly 10 times faster than your CPU on a balanced system, e.g. 16-core Ryzen + 3090, 12 cores + 3080, 8 cores + 3070, 6 cores + 3060. If you have a GPU licence of Redshift, there is currently no reason to switch to, or enable, the CPU version; it will be significantly slower. Even enabling both CPU and GPU at the same time will be slower than GPU alone. It will get faster with time, but nobody knows by how much or how soon. The CPU mode is mostly there to guarantee that every system can render with the new render engine, and so that CPU render farms can take on these rendering tasks.

     If you have a vaguely modern computer and a Redshift licence, there is no reason to use CPU mode at all. For those who have commented along the lines of "my GPU isn't very good, I'll use the CPU version instead": even a cheap old GeForce 1060 will likely be faster than any consumer CPU. In short, use the GPU version if you can; the CPU engine is just there for potato computers.
  2. 1. The feature nobody expected, and nobody wants: Emoticon Generator. Move a series of sliders to control joy, anger, nervousness and sanity to create a vector spline emoticon.

     2. The feature they've sneakily removed without telling anyone: all right-click menus. This was done to prevent touch-screen users of C4D from having a different experience from mouse and tablet users.

     3. New feature, but it can't be used in production: a new render engine has been written from the ground up, but it only works on TI graphing calculators up to 3 MHz.

     4. New way to screw over perpetual licence owners: ZBrush has been integrated into C4D, but all brushes require a Maxon One subscription. Perpetual licence users are limited to a 1-pixel-wide brush.
  3. A real glass-half-full post here 😉 What about "New feature, but it can't be used in production" or "New way to screw over perpetual licence owners"?
  4. Nope, it is what it is. Your only hope would be to submit an idea on the Maxon website asking them to increase the z-buffer precision, especially for wireframes, to 32-bit. Currently it is either 16- or 24-bit, which isn't enough to draw them accurately.
  5. I think he's not asking why the lines along the joints are missing, but rather why the lines overshoot and cross over each other. The answer is the z-buffer. Everything in 3D space essentially has a depth assigned to it so that C4D knows which items to composite in front of, or behind, others. This z-buffer only has so much precision, so when surfaces overlap with each other there's always a little inaccuracy where they meet.

     This problem is worse for the wireframe, which gets composited over the top of the shaded 3D view, because it's better to err on the cautious side and make sure the wireframe can be seen, rather than risk it vanishing behind geometry where surfaces meet. The issue is exaggerated in scenes where models meet at very shallow angles, or in shots where the camera is far away but optically zoomed in.
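To put rough numbers on that precision limit, here is a minimal sketch (my own illustration, not C4D's actual implementation, and assuming a simple linear fixed-point depth buffer; real GPU depth buffers are non-linear, which makes far-away precision even worse):

```python
# Illustrative only: approximate world-space size of one z-buffer step for a
# fixed-point depth buffer, assuming a simple LINEAR mapping from near to far.
def depth_step(near, far, bits):
    steps = 2 ** bits            # number of distinct depth values available
    return (far - near) / steps  # world-space size of one quantisation step

# A scene spanning 10,000 units, e.g. a distant, optically zoomed-in shot:
for bits in (16, 24, 32):
    print(f"{bits}-bit buffer: one step ~ {depth_step(1.0, 10_001.0, bits):.6f} units")
```

At 16-bit, surfaces closer together than about 0.15 units across that range can land in the same depth step, which is why the wireframe bias has to be so generous.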
  6. Do you have "all frames" enabled in the playback menu?
  7. Where's the diamond hands emoji?... Strictly speaking there's been something like 10% inflation this year; I wouldn't be surprised to see that reflected in prices.
  8. There's quite a few things which jump out at me:

     The sun angle of the background image shows the sun is just a few degrees off the horizon, which means there should be long streaking shadows running all the way out of shot. Instead they're short and stumpy, as if it's almost midday and the sun is above us. The light angle is simply wrong for the shot you're compositing into.

     There's too much fill light on the left. That shadow should be almost black with direct sunlight; instead it is very well lit, which makes the unit not look planted onto the floor.

     It's leaning to the right. Consider rotating the camera to the right so the object is centred, then using film offset X in the camera to push it back.

     Your metal is just too clean. Sheet metal comes off a roller, so there would be some brushing or anisotropy effects on the surface. Or fingerprints, or corrosion. Galvanised steel for industrial outdoor use is never that perfect.

     The background image choice isn't great. The entire right-hand side of the image is brighter than the left, which just makes all the lighting issues worse, as it looks like there's a bright haze of fog/sunlight behind the product. Did a nuclear bomb go off behind the product? Maybe.

     Quick PS fix, but I can't fix the short shadows:
  9. Nothing wrong with eyeballing it. You'll get the corners in place within a few pixels in 20 seconds. On an 8k render this equates to being accurate to within a tenth of a mm or so. The real-life product will likely have a 1-2mm tolerance for label placement, so the chances of the label visually being identified as not perfect are zero. Really there's no point messing around going through AE or using dead 3D tools for something like this when you can get a 100% passable image in a few seconds in PS. Ideally what you want is a plugin to distort a layer to a UV pass from C4D, and I am gobsmacked that such a thing doesn't exist. Such a tool would perfectly map and distort your label onto your 2D render, but it just isn't there.
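For anyone curious what such a UV-pass tool would actually do under the hood, here is a minimal sketch (my own illustration with NumPy and assumed array shapes, not any existing plugin's API): for each render pixel, look up which label pixel the surface's UV coordinates point at.

```python
import numpy as np

def apply_label_via_uv(uv_pass, label):
    """Resample label artwork onto a render using a UV pass.

    uv_pass: float array (H, W, 2) holding (u, v) in [0, 1] per render pixel.
    label:   array (h, w, channels) of label artwork.
    Returns an (H, W, channels) array: the label warped onto the render.
    Nearest-neighbour lookup for brevity; a real tool would filter/interpolate,
    and this assumes v = 0 at the bottom of the artwork, as is common in 3D.
    """
    h, w = label.shape[:2]
    u = np.clip((uv_pass[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
    v = np.clip(((1.0 - uv_pass[..., 1]) * (h - 1)).round().astype(int), 0, h - 1)
    return label[v, u]
```

Composite the result over the base render with the label's mask and you have exactly the "drop in replacement labels" workflow, without pinning corners by hand.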
  10. What I do in these cases is render a version with a square plane on top of the tin. Load this into PS, add my label as a new layer on top, convert it to a smart object, then transform in distort mode and just pin the 4 corners to the plane. Then you can replace the render with the normal tin, render a mask for the label on top, and now you have a PS file where you can drop in replacement labels as needed.
  11. It isn't a question of making them appear as one GPU to the OS via SLI/NVLink. GPU engines are happy to see and use multiple cards, just as a CPU engine will split an image over 8 or 128 CPU cores.

      Octane can use different cards from different generations and will more or less use 95%+ of their power on average. I've tested this with a system containing a 3090, 2080, 2080S and 2080 Ti, and it works fine. We've also tested up to 10 Quadros in a single system, and at that stage you're still getting the majority of the power from the GPUs; by that point you're looking at about 90% efficiency.

      Redshift? Things aren't so shiny there, unfortunately. 2 GPUs is around 95% efficient, which is fine; 3 GPUs is down to around 90%; 4 GPUs and you're down to about 75% efficiency; anything after this is pretty much a waste, and once you go past 5 GPUs you will find your renders actually get slower. So, fine for workstations, but it makes building a RS render farm tricky, as a machine which should be able to be kitted out with a dozen GPUs is basically useless.
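Putting those rough Redshift percentages into effective throughput makes the diminishing returns obvious (illustrative arithmetic on the figures quoted above, not new benchmarks):

```python
# Effective throughput of n GPUs given an overall scaling efficiency.
# The efficiency figures are the rough Redshift percentages quoted in the post.
def effective_throughput(n_gpus, efficiency):
    return n_gpus * efficiency  # in multiples of a single GPU's speed

scaling = {1: 1.00, 2: 0.95, 3: 0.90, 4: 0.75}
prev = 0.0
for n, eff in scaling.items():
    t = effective_throughput(n, eff)
    print(f"{n} GPU(s): {t:.2f}x one GPU (marginal gain {t - prev:+.2f})")
    prev = t
```

On these numbers the fourth card only adds about 0.3 of a GPU's worth of speed, which is why a node stuffed with a dozen Redshift cards stops paying off.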
  12. Keep in mind, the menu highlighting will happen for the most mundane things, like renaming an object.
  13. 25% more electricity usage and 25% more cost, for 7% more render speed.
  14. I don't have any links to hand, but the gist of it is this: currently on ETH you're using huge amounts of electricity powering huge numbers of GPUs to crunch math. If your math processing happens to stumble upon a coin, you get to keep it, but because this is so rare, many people join a mining pool and are instead awarded a portion of that coin depending on how much effort their computers put into the task. This is known as PoW, proof of work.

      The new system that ETH is switching to is PoS, proof of stake, whereby instead of putting your GPU power on the line, you put up a stake of ETH. Super simplified, it's like putting money into a bank account and earning interest on it. These huge ETH stakes are essentially a guarantee that the transactions you claim happened are real; if you're caught trying to lie about transactions, you lose your stake, which at the moment has a starting price of £100,000 to join. It's a bit like if the government said, "we trust you to file your own tax returns, but if you ever lie even a tiny bit, we're taking your house away". You'll be scared into being honest, and there are a thousand other random people double-checking your tax returns who will get rewarded if they spot a cheat. This alone removes the vast majority of the power usage.

      The second element is what's known as a layer 2 zero-knowledge rollup. It's a stupid name, but all it means is that rather than going onto the blockchain and making a transaction every time you buy or sell something, a third party will instead bundle up 100, 1,000, a million transactions into a single bundle, then put those through the ETH system as one bit of data rather than a million, reducing the usage and transaction costs by several orders of magnitude.

      Add these together and you have a system that can handle more transactions per second than all the Visa, Mastercard and banking systems combined, and without melting the icecaps.
  15. On the eco footprint, that is all about to 99.999% vanish. The switch from proof of work (mining with a million GPUs) to proof of stake will eliminate 99% of the energy usage and will make GPU prices plummet. On top of that, the layer 2 system will allow thousands of times as many transactions to be performed before being bundled up into a single ETH transaction. So yeah, it's horrific right now, but by the end of the year we'll need a few fewer power stations around the world.

      As for NFTs, I 100% agree on what you showed. NFTs in their current state are almost entirely garbage, most often used to launder money or milk cash from idiots. However, NFTs do have, and will soon have, their main intended use flung into the limelight: proof of ownership of other items. Currently all you see them used for is "hey, I own this URL which points to a crap image that took somebody 30 seconds to make". Very soon you'll find your Rolex watch comes with an NFT so you can prove to future buyers that yours isn't a knockoff. The same will go for any high-end designer item where counterfeits are a problem.

      The whole NFTs-in-gaming thing is yet to play out. The idea that items you buy in game will genuinely be yours to keep and to trade? Sure, why not. It's effectively bringing the whole Magic: The Gathering card system to virtual items. Currently mobile gaming is funded 90% by whales who drop obscene amounts of money on in-game items, either to help them win or to give them a rare cosmetic look. I suspect a lot of this will be aimed at them, but it could also work out in everyone's favour. Joe Bloggs finds the super stylish and rare tie-dye beanie hat in the new GTA6 game? Some idiot with too much money wants it and will pay £10k for it? Sure, why not. 80% cut to the guy that sells it, 20% cut to the game publisher, and the rich idiot gets his pretend item.
  16. First thought: you have enabled the "create at view centre" option in the preferences. Second thought: you've right-clicked a coordinate and selected "Set as Default".
  17. It depends what the differences are going to be between your different renders. Takes are great when you need variations of the same thing: a room at night and the same room in the day, or a room kitted out as an office and then as a dining room, etc. You do need to be careful, though, because it is very easy to accidentally record take data you didn't want, much like leaving auto key turned on for animation. The take system is largely stable, though its weaknesses are that working with animation sucks, and that it is very difficult to manage take data across multiple projects as it is so tightly tied to the specific project file.
  18. Honestly, I would just place a circle/square on the label artwork, add a cylinder or cube in 3D, then adjust the texture width/height until the shape on my texture matches the 3D shape. If your label artwork is, say, 8000x2000 pixels, then even matching this up visually to within a couple of pixels will get you 99%+ accuracy at the centre of the texture. Beyond that, nobody is ever going to tell it isn't perfect, because the majority of the surface is going to have a 5-10% error at the top and bottom of the tapered shape anyway, so a 0.1% error in the middle is meaningless.
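As a back-of-envelope check on those numbers (my own arithmetic, using the example figures above):

```python
# A 2-pixel visual matching error on 8000-pixel-wide label artwork, expressed
# as a percentage of the label width (example figures from the post).
def pct_error(error_px, width_px):
    return error_px / width_px * 100

print(pct_error(2, 8000))  # roughly 0.025% of the label width
```

That 0.025% alignment error at the centre is two orders of magnitude smaller than the 5-10% distortion at the ends of the taper, so it disappears in practice.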
  19. A lot of you will be working from home; are you by any chance connected to work VPNs? Windows is notoriously naff when connected to a VPN. Drives take a minute to refresh/time out, local network drives and printers vanish, and Windows pretends it can't find a path because you tried to browse the folder before the VPN handshaking had finished, etc.
  20. Working on the same files back and forth, constantly switching between Blender and C4D, or a one-off transfer of a model from one app to another? You can move models, textures and some animation from one app to another using a format like FBX or Alembic, but there's absolutely no sensible way for you both to work on a full production project with 2 different apps. It's like asking for the best way for a native Chinese speaker and a native French speaker to write a book together using Esperanto.
  21. Because it's a question of budgets and what's needed. A Hollywood production house is going to be relatively unconcerned about the price of a render node. 500 nodes, 1000 nodes, 2000 nodes... meh, whatever. Just make sure it can do everything needed and price be damned. There's also data size to consider; a CPU engine is going to cope much better with 50 gigs of Alembic data on a 128-gig node than a 20GB GPU will.

      But step down from that to a much more commonly sized company, or department within a company, and price to performance is often going to dictate what can and cannot be done. Many jobs will not have GI enabled, or motion blur, because the engine will die a slow death when you turn those features on. Or you'll be stuck doing 1080p jobs because quadrupling render times for 4k just isn't viable with the render hardware the company has at hand.

      In archviz, from what I see, there is a mass exodus away from CPU engines over to Unreal, due to being able to pump out a full walkthrough in an hour or two as opposed to several days. I don't think CPU is dead; it just doesn't make much sense for the average user, other than the fact that their 3D app happens to ship with a CPU engine.
  22. Any examples? I've yet to find the graphics task where 10,000 simple GPU cores are slower than 64 CPU cores, by quite a margin. For the longest time, the biggest ace CPU renderers had up their sleeve was that GPUs simply couldn't do many tasks: SSS, motion blur, large particle and volume clouds, etc. But now? I'm really struggling to see what cards the CPU engines are holding. All that's really left is memory capacity, but out-of-core rendering works fast and is stable, and these new data streaming techs will allow GPUs to take a huge bite out of the "GPUs can't deal with large data sets" pie.

      GPUs still have their eternal driver problems, where one driver bug will ruin your day, but personally I could never see myself returning to a CPU render engine again. Stick half a dozen GPUs in a large case and you have enough render power to push out thousands of frames of animation overnight. The real kicker for me is that there's absolutely no more "hmm, can we afford to turn on GI on this job?" Or SSS, or AO, or DOF, or motion blur, or area lights, or glows, or blurry reflections, or decent antialiasing, or... Genuinely, every single feature for me is "eh, screw it, why not?"
  23. VRAM depends 100% on what you want to do. We do all our product animations in single-room environments comfortably within 10GB or so; rarely do we overflow into system RAM. If we get lazy and start throwing in 5-million-poly hero objects for everything in the room with all internals still in place, then we go over this, but so long as you spend 5-10 minutes simplifying the worst offenders, everything is fine. A typical project for us is 5 to 40 million polys, 20x 8k image maps, 20x 4k, and another 100 low-res images. We would set 12GB as a minimum spec for any new workstation GPU.
  24. Bonus good: The AM4 socket is very mature and stable; all the problems have been ironed out, and it is currently one of the fastest systems you can buy. It uses relatively cheap and stable DDR4 memory, whereas DDR5 will have early compatibility issues and high prices. GPU prices won't crash, they'll just drop down to RRP; there's still lots of pent-up buying pressure from people that put off upgrading.