
Think of a simple setup with three nested cubes - an outer, a middle, and an inner cube. They all get some simple material with the "Principled BSDF" shader. In the Cycles render engine you can decrease the Alpha value of the outer cube and the middle cube becomes visible. When you decrease the Alpha value of the middle cube, the inner cube becomes visible (provided that the outer cube is transparent). So this is quite intuitive and straightforward, as expected.

The situation with Eevee is strange, however. First of all, you need to change the "Blend Mode" in the additional material options to "Alpha Blend", and "Show Backface" needs to be disabled. However, I found that alpha blending only works up to the next surface whose material's "Blend Mode" is set to "Opaque". For instance, when the "Blend Mode" of the inner cube is set to "Opaque" (always visible) and those of the middle and outer cubes to "Alpha Blend", but the middle cube has an Alpha of 1 (so it should be visible when the outer cube is transparent), the middle cube is not visible when I fade the outer cube into transparency. When the "Blend Mode" of the inner cube is set to "Alpha Blend" as well, then both the middle and inner cubes are simply "not present" when I turn the outer cube transparent.

Is this behavior in Eevee normal, or did I forget something?

Btw., the application behind this is to fade smoothly between multiple objects that are slightly different in size, in order to cope with Z-fighting. At least with two objects this works nicely. Perhaps there is also a different way to do this apart from such a nesting/transparency approach. Ideas are welcome.

One workaround for this issue might be to scale the objects dynamically (depending on the current frame) from within a script and also manipulate the "Blend Mode" dynamically. I need to check that...

Thanks, Mario


1 Answer


Is this behavior in Eevee normal, or did I forget something?

This behavior is normal, assuming you're limiting yourself to talking about alpha blend, although there has been some evolution in how it's been treated.

Rasterizers work by rendering materials one after the other. When they render a sample, they write to a depth buffer. When they are then asked to render another sample, they first check that depth buffer. Is the sample we're about to write actually further away from the camera than what we've already put there? Then don't draw it. This works well for opaque meshes, and keeps performance high.
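That per-sample depth test can be sketched in a few lines. This is purely illustrative, not Eevee's actual implementation; the buffers and function name are made up for the example:

```python
def try_write_sample(depth_buffer, frame_buffer, x, color, depth):
    """Write a sample only if it is nearer to the camera than what is stored."""
    if depth < depth_buffer[x]:   # nearer than the current nearest sample?
        depth_buffer[x] = depth   # record the new nearest depth
        frame_buffer[x] = color   # and overwrite the color
        return True
    return False                  # occluded: discard the sample

# A tiny 4-pixel "screen": nothing drawn yet, so depth is infinite everywhere.
depth = [float("inf")] * 4
frame = [None] * 4

try_write_sample(depth, frame, 0, "red", 5.0)   # drawn
try_write_sample(depth, frame, 0, "blue", 9.0)  # behind red: discarded
```

After both calls, pixel 0 holds "red" at depth 5.0 regardless of draw order, which is exactly why this scheme works so well for opaque meshes.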

However, what if we don't write our alpha blend materials in the correct order? If an alpha blended surface writes to the depth buffer, it will prevent anything behind it from being drawn at all. If it doesn't write to the depth buffer, then anything rendered afterwards, even geometry that lies behind it, gets drawn on top, occluding the transparent surface inappropriately. And even if that weren't a problem, the rasterizer only keeps track of a single layer: once you blend alpha over some other material, you couldn't insert a different alpha surface between those two layers even if you could figure out not to occlude it-- the separate information about the alpha layer and the layer behind it is already lost, collapsed into a single value in the frame buffer.
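The order dependence is easy to see with the standard "over" compositing operator on a single color channel. This is a hypothetical single-channel example (values and alphas are made up), not anything specific to Blender:

```python
def over(src, src_alpha, dst):
    """Standard 'over' operator for one color channel."""
    return src * src_alpha + dst * (1.0 - src_alpha)

background = 0.0
# A bright layer (value 1.0, alpha 0.5) and a dark layer (value 0.2, alpha 0.5).
# Compositing back-to-front (correct) vs. front-to-back (wrong order):
bright_in_front = over(1.0, 0.5, over(0.2, 0.5, background))  # 0.55
dark_in_front   = over(0.2, 0.5, over(1.0, 0.5, background))  # 0.35
```

The two orderings give different results (0.55 vs. 0.35), so the rasterizer cannot just blend samples in whatever order they happen to arrive.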

It might seem that you could sort your alpha layers. In some cases, you can. In the general case, this rapidly becomes an immense problem. Meshes themselves can have multiple layers, meshes can be concave, multiple meshes can intersect, crossing back and forth multiple times, and those intersections don't occur at the locations of vertices, but in the middle of faces. Accurate depth sorting of alpha blended meshes is a really expensive nightmare.

Because of this, you should only ever use a single layer of alpha blend. I think there have been some changes to this over Blender generations. It used to be that Blender would just try its damnedest to sort, then inevitably screw up the sort, and maybe you'd only find out at render time when your eyes suddenly appeared to turn inside out. Now, it seems like it just renders the frontmost samples to a single alpha blend layer and composites it in. Having been in the position of discovering that the viewport sort was somehow different from the render sort, I personally prefer not even being tempted by something that will maybe only look right up until the point that I render.

However, there is an alternative, which is alpha hashed. In alpha hashed blending, each sample is either opaque or transparent-- if the object has an alpha of 0.25, then 25% of the samples will be transparent. Because each sample is either/or, there are no sorting problems. This looks awful when rendering with a single sample, but starts to look pretty good when you get up to 32 or 64 samples or so.

Unfortunately, there are also limits to hashed alpha. It is not exactly the same as alpha blend in its appearance. (It is worse IMO.) It cannot support colored transparency, because there is no "opaque in blue channel but transparent in red channel" possibility, there's just the one frame buffer.

My recommendation is to save alpha blend for the very, very important layers of transparency, the layers that just aren't going to look good with alpha hashed, and use alpha hashed for the rest.

  • Thanks for your detailed explanation @Nathan. I got the point and will try to see how best to deal with it.
    – Mario
    Commented Sep 14, 2022 at 17:54
