
All Activity


  1. Past hour
  2. @b_ewers
     1. Change the Maxon Noise > Input > Source to UV/Vertex Attribute, so that the noise samples in 2D Texture Coordinate (UV) space rather than 3D space.
     2. Significantly reduce the overall scale from 100 to something like 1.2.
     3. Adjust the relative scale to something like 1, 20, 1 to get the vertical streaking (see the sketch below).
     4. Increase the contrast to 1.
     5. Change the noise type to Turbulence.
     uv-space-noise_v01.c4d
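     For intuition on step 3, here is a rough numpy sketch, entirely outside Cinema 4D (the value_noise helper and its lattice details are invented for illustration and are not the Maxon Noise algorithm), of why stretching the relative scale along one axis streaks the pattern: features become ~20x taller than they are wide, so the value barely changes as you move vertically.

         import numpy as np

         def value_noise(u, v, size_u=1.0, size_v=1.0, seed=0):
             """Bilinearly interpolated lattice noise sampled in UV space."""
             rng = np.random.default_rng(seed)
             lattice = rng.random((64, 64))        # coarse grid of random values
             x = (u / size_u * 8.0) % 62.0         # larger size -> lower frequency
             y = (v / size_v * 8.0) % 62.0
             x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
             fx, fy = x - x0, y - y0
             top = lattice[y0, x0] * (1 - fx) + lattice[y0, x0 + 1] * fx
             bot = lattice[y0 + 1, x0] * (1 - fx) + lattice[y0 + 1, x0 + 1] * fx
             return top * (1 - fy) + bot * fy

         u, v = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
         salt    = value_noise(u, v, 1.0, 1.0)    # roughly isotropic speckle
         streaks = value_noise(u, v, 1.0, 20.0)   # near-constant along v: vertical streaks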
  3. Today
  4. Hello C4Ds, I would like to use the Noise Node to create lines on a sphere, but it only gives me a perfect noise pattern on the surface. I tested this with noise generated in Photoshop, imported as an image, and it behaved exactly as I wanted. Does anyone know if there's a workaround or an additional node to modify the Noise Node's behavior to mimic that of an image? Thank you for your help! 😉
  5. Brilliant MASH, Just Brilliant! Cheers Jacobite
  6. ... and on that, is there a way to get this properly into Photoshop? PS has no 'interpret' options for me to discard the alpha to do the same trick I do in AE, short of rendering the image again with no alpha channel. Strictly speaking I CAN get the content into PS, and it does look good if I go via 32-bit EXR, but once in, I can't squash it down to 8 or 16 bit, as doing so re-obliterates the glass again.
  7. Very pleased to hear that Rocket Lasso is back in action after a very long time away, starting with Season 7 Ep 1 tonight at 8 pm (BST). CBR
  8. Ok, so my current method of working in AE, where I take a copy of the render interpreted with no alpha channel (so I get all the RGB data) and set it to add mode, then use a second copy of the render with the alpha enabled and throw it on top as a solid in normal mode, is pretty much correct then? I.e. it's as good as Adobe AE is going to get, given its shortcomings.
  9. Yesterday
  10. That is why it can be useful for the pedagogical side to provide citations, and demonstrations that align with meaningful ways to think about the problem. The latter seems to be the largest stumbling block, due to preconceived ways of thinking about the problem.

      You can say that again! This is the preconception problem. Someone poses a baseline model, and we orient to it, and then our thinking goes out the window and we are left with the fragments of a broken understanding.

      A good example is "white" and "black". Ask a hundred experienced authors what "white" and "black" are and you'll hear someone say "Code value zero" or "One hundred percent code value". This is almost too obvious to think about. Yet when we are told these things, and other ridiculous ideas such as "highlights" or "super whites" or such nonsense, we turn our brains off and instead get stuck in the confused orthodox soup. Have we actually tried to think about what "black" or "white" is? Yet another one of those "So simple it is silly" questions. But then we look around us... Wait a minute... how can "black" and "white" be at what appears to be every single code value??? And those... those are still "highlights", aren't they? How can those be at every code value too? How can "black" and "white" reside at every single luminance value of a pictorial depiction? And how are we even able to sample a region on the sphere-like depictions that corresponds loosely to what we think we are seeing? The moment we dedicate any actual thought to some of these taken-for-granted subjects, the perplexing nature of their mechanisms becomes far more apparent.

      No harm in downloading the free-as-beer version of Resolve, which includes Fusion, and testing the mechanisms to make sure one isn't losing their marbles. Resolve's Fusion should composite and display the EXR in the thread according to what has been discussed, plus or minus the massively significant picture formation algorithm.

      I hate to agree with you, but this is sadly where we currently find ourselves. Trust is a tricky thing to build, and if you read that horrible Adobe thread, you can get a sense of the reality-defying pushback. Also note the date... and the software may still not work properly. The best inoculation is an intuitive understanding that one can hopefully, and cautiously, pass along to those who may not quite see the complexity. Find names and minds you have skeptically come to rely on, and chase their threads. Oh... and test against other software!
  11. I am rendering a scene in R18 with a plane textured with an image with alpha applied in the texture settings. On render, the image creates a halo which is not in the Photoshop image. I remember seeing years ago in a tutorial that I had to set an additional step in my render settings to sort this issue, and I don't remember what it was, maybe a radio button. Thanks for reading Jacobite
  12. I hope you realize how valuable this is, thank you SO much for taking the time to explain this over and over! It's really problematic to research a topic like this if you're not sure what sources you can even trust. As you said many times, there's just so much conflicting information on the internet it's not even funny.

      Yes, I did type that out in a wrong way. Apologies 🙂 I have understood this part at this point, although it was really tough to wrap my head around this principle after almost 20 years of thinking that 100% transparent = no color information.

      I feared that this would be the answer... sadly I do not have the skill in these other packages, and neither do I have the time or budget to switch over right now or in the near future. This will be a "maybe in a year or two" thing. How AE is still so prevalent in the CGI space with all that bullshit going on is beyond me. I'm building my pipeline right now and I'm having NOTHING but issues with AE. Color management is an opaque and needlessly complex pain in the ass, alphas are handled incorrectly, and performance is bad. Man, I wish I had learned Nuke instead of this crap. At this point I don't even know what to trust anymore: my intuition and experience, or some piece of Adobe software that should work but isn't exactly likely to work correctly, considering all the crap happening around their software.
  13. Last week
  14. I was absolutely perplexed by something that seemed so simple at first. Later on, I acquired some mental trauma from tracking down a particularly nasty bug around alpha and PNGs. You'd probably not be surprised that, once one becomes more or less familiar with the nuances around alpha channel handling, nuanced bugs can crop up in even the most robust software. So... yes. 🤣 This is where I encourage folks to gain enough confidence in their own, hopefully well researched, understanding. Eventually, it enables folks to identify where the specific problem is emerging.

      The skull demonstration has zero-alpha regions within the "flame" portions. Following the One True Alpha formula, the operation should be adding the emission component to the unoccluded plate code values. There is no "scaling by proportion of occlusion" applied to the plate that is "under", as indicated by the alpha being zero. The author has some expanded commentary over on the issue tracker. The following diagram loosely shows how the incoming energy, the sum of the green and pink arrows, yields a direct "additive" component in the green arrow, and the remaining energy, indicated by the pink arrow, scaled by whatever is "removed" in that additive component, is then passed down the stack.

      If this seems peculiar, the next time you are looking out of a window, look at the reflections "on" the window. They are not occluding in any way! Similar things occur with many, many other phenomena, of course. For example, in the case of burning material from a candle, the particulate is effectively close enough to zero occlusion so as to be zero. Not quite identical to the reflection or gloss examples, but suitable enough for a reasonable demonstration.

      Sadly, Adobe has been a complete and utter failure on the subject of alpha for many, many years. If you crawl over the OpenImageIO mailing list and repository, you will find all sorts of references as to how Adobe is mishandling alpha. Adobe's alpha handling is likely a byproduct of tech debt at this point, culminating with a who's who of image folks over in the infamous Adobe Thread. Zap makes a reference to the debacle in yet-another-thread-about-Adobe-and-Alpha here. Gritz, in the prior post, makes reference to this problem:

      You can probably read between the lines of Alan's video at this point. So this is two problems, one of which I've been specifically chasing for three decades, and one that relates to alpha. As for the alpha problem, I cannot speak directly to After Effects as I am unfamiliar with it, but the folks I have spoken with said that the last time they attempted it, Adobe still did not properly handle alpha. I'd suggest testing it in some other compositing software just to verify the veracity of the claims folks like myself make, such as Nuke non-commercial, Fusion, or even Blender. All three of those should work as expected. My trust that Adobe will do anything "correct" at this point is close to zero.

      As for "colour management"... that is another rabbit hole well worth investigating, although it's probably easier to find a leprechaun than to pin down what "colour management" in relation to picture authorship means to some people or organizations. Keeping a well researched and reasoned skepticism in mind in all of these pursuits is key. 🤣
  15. I have a feeling you have a LOT of experience that comes with the confusion on this topic, lol. Because I am still confused. I tried downloading the test image you posted and setting it up in AE. Either AE's alpha handling is bad or I didn't set it up correctly. This is how it looks right now; on the right I opened the "CCSkull_06_d-sRGB_t-ACES-sRGB.jpg" from the repo as reference. Some of the colors look really close, some look totally off. Of course, the entire candle flame AND the glow is missing. But with color management "close" is not good enough, so what am I doing wrong?
  16. Faking it can be done with the tutorial above, using the multitude of tools you have at your disposal. I think it would be quite interesting to have a physically correct rig for this 🙂
  17. I haven't done this myself so far, but I suspect the best way to go here might involve cycloid splines, blend-mode cloners and Field Forces. Searching around those terms I found this tutorial from Insydium. Obviously, they are using their own particle system, but there is no reason to think the native one couldn't also do it... though I am probably not the best person to advise on the specifics of that; I don't have much experience in Cinema particles yet! CBR
  18. Hi all, I am desperately stuck on how to create a magnetic field such as this one. At the moment I don't have the slightest idea how to do it; if anyone has some tips I would be very thankful.
  19. Apologies... I can only post one post per day. Probably better that way... 🤣

      The issue, as the linked video in the previous post gets into, is sadly nuanced. The correct math within the tristimulus system of RGB emissions is remarkably simple; it is just that software and formats are less than optimal. The dependencies of the "simple compositing operation" are:

      1. Software employed. Many pieces of software are hugely problematic. See the infamous Adobe thread as a good example.
      2. File formats employed. Some file encodings, such as PNG, cannot support the proper One True Alpha, to borrow Mr. Gritz's turn of phrase.
      3. Data state within the software. Even if we apply the proper mathematical operation to the data, if the data state is incorrect, incorrect results will emerge.

      The short answer is that if we have a generic EXR, the RGB emission samples are likely normatively encoded as linear with respect to normalized wattages, and encoded as gained with respect to geometric occlusion. That is, in most cases, the EXR data state is ready for compositing. If your example had a reflection off of glass, a satin or glossy effect, a flare or glare or glow, a volumetric air material or emissive gas, etc., you'd be ready to simply composite using the One True Alpha formula, for the RGB resultant emissions only^1:

      A.RGB_Emission + ((100% - A.Alpha_Occlusion) * B.RGB_Emission)

      Your cube glow is no different to any of the other conventional energy transport phenomena outlined above, so it would "Just Work". If, however, the software or the encoding is broken in some way, then all bets are off. That's where the video mentions that the only way to work through these problems is by way of understanding.

      Remember that the geometry is implicitly defined in the sample. In terms of a "plate", the plate that the A is being composited "over" simply lists the RGB emission, which may be code value zero. As such, according to the above formula, your red cube RGB emission sample of gloss or glow or volumetric would simply be added to the "under" plate. The key takeaway is that all RGB emissions always carry an implicit spatial and geometric set of assumptions. This should never happen in well behaved encodings and software. If it does, there's an error in the chain! JKierbel created a nice little test EXR to see if your software is behaving poorly.

      Hope this helps to try and clear up a bit of the problem surface. See you in another 24 hours if required... 🤣

      --
      1. The example comes with a caveat that the "geometry" of the sample is "uncorrelated". For "correlated" geometry, like a puzzle piece perfectly aligned with another puzzle piece, as with a holdout matte, the formula shifts slightly. The formula employed is a variation of the generic "probability" formula, as Jeremy Selan explains in this linked post. If we expand the formula, we end up with the exact alpha-over additive formula above. It should be noted that the multiplicative component is actually a scaling of the stimuli, based on energy per unit area. A more "full fledged" version of the energy transport math was offered up by Yule and (often misspelled) Nielsen, which accounts for the nature of the energy transport in relation to the multiplicative mechanisms of absorption attenuation, as well as the more generic additive component of the energy.
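     A minimal numpy sketch of that formula (the sample values are invented for illustration; per the footnote, the alpha-out line assumes uncorrelated geometry):

         import numpy as np

         # One True Alpha "over" for associated (premultiplied) RGBA samples:
         #   out = A + (1 - A.alpha) * B
         # The same expression covers both the RGB emissions and, for the
         # uncorrelated case, the resulting alpha.
         def over(a, b):
             a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
             return a + (1.0 - a[..., 3:4]) * b

         glow  = [0.9, 0.1, 0.1, 0.0]   # pure emission, zero occlusion: the cube's glow
         cube  = [0.8, 0.0, 0.0, 1.0]   # fully occluding cube face
         plate = [0.2, 0.3, 0.4, 1.0]   # the "under" plate

         print(over(glow, plate))       # [1.1 0.4 0.5 1. ] -- plate passes through, glow adds
         print(over(cube, plate))       # [0.8 0.  0.  1. ] -- plate fully occluded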
  20. As far as I know, the best way to comp your glow from renders is: just don't. Getting the glow to correctly export from render engines is always a major pain in the butt for exactly this reason, and at this point I just do it in post via AE or some playing around in PS. As said above, Red Giant has some really good glow plugins for AE, and they're part of Maxon One now.
  21. Have you thought about applying it in post, whether AE, Fusion or Nuke? Some of the plugins like Sapphire and Red Giant have pretty flexible glows with different ways to tweak matte and background sources. Add a bit of light wrap too.
  22. Then pardon my daft question, but what would be the correct way to composite this in post? Let's say you have a red glowing cube you want to bring in on top of your composition. You bring in the cube renders (premultiplied, so the glow is maintained in the RGB data) and they sit on top, but the alpha outside of the cube in every 3D render engine I've used will be solid black, because there is no geometry there, so the glow of the red cube is completely cut off. Previously (and currently, frankly) I have been bringing in a second copy of the footage (which does have a glow visible in it, but the alpha obliterates it), placing it under the red cube layer, disabling the alpha entirely, then setting it to add or screen so the glow can show back up again and composite over my background. But this always needs two copies of the render in my comp; if I just use the add/screen layer, then the red cube would show the background through it.
  23. Hello all. Flattered to be mentioned here. I just wanted to point out that the statement is not quite correct; the result will indeed include an emission component in the RGB after the composite. With associated alpha (aka “premultiplied”) the emission is directly added to the result of the occlusion. This happens to be the only way alpha is generated in PBR-like rendering systems, and is more or less the closest normative case of a reasonable model of transparency and emission. It also happens to be the sole way to encode any additive component like a gloss, a flare, a glare, fires or emissive gases, glows, etc. I’ve spent quite a few years now trying to accumulate some quotations and such on the subject, from luminaries in the field. Hope this helps.
  24. Slightly off topic, but I believe the individual in that blog post is the guy who worked on the AgX colorspace that was all the rage on the Blender side of things. I think it replaced Filmic as the default, but idk...
  25. I'm even more confused now. Wikipedia says this: So Wikipedia also says that straight alpha is the one that has the emission independent from the alpha. Also, apparently premultiplied alpha IS multiplied with the alpha value. It would make sense, since a 50% "covered" 70% green is 35%, because half of the emission is absorbed by the coverage.

      I think I understand the issue now. The blog is simply about the BLENDING operation with premultiplied alpha values. It has nothing to do with the images themselves. It is simply easier to do calculations with premultiplied images, since straight alpha images need to be multiplied with their alpha by the renderer at runtime to get the premultiplied values needed for blending. Premultiplied simply comes with the multiplication integrated already. I could have thought of this sooner, since a couple of pages earlier it talks about how Photoshop internally uses straight alpha calculations and thus ends up with wrong colors. Photoshop only blends correctly in 32-bit mode, since it uses linear premultiplied maths then. At least that's how I understood it.

      tl;dr: Doesn't really matter, just tell your software what you're feeding it and it should be fine. Also use straight or premultiplied according to what you want to do, like @mash said here: https://www.core4d.com/ipb/forums/topic/119202-premultiplied-vs-straight-alpha-worst-rabbithole-i-have-ever-been-in/?do=findComment&comment=764541
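     A tiny numpy sketch of that runtime conversion, using the 50%-coverage, 70%-green arithmetic from the post above (the helper name is made up for illustration):

         import numpy as np

         # Straight (unassociated) RGBA -> associated ("premultiplied") RGBA:
         # the multiplication a renderer would otherwise do at blend time.
         def straight_to_associated(rgba):
             rgba = np.asarray(rgba, dtype=float)
             out = rgba.copy()
             out[..., :3] *= rgba[..., 3:4]   # scale emission by coverage
             return out

         print(straight_to_associated([0.0, 0.7, 0.0, 0.5]))  # -> [0.   0.35 0.   0.5 ]

         # Caveat from earlier in the thread: at alpha == 0 this forces RGB to 0,
         # which is why straight alpha cannot carry emission-only content like glows.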