I have a small problem with particles. I think I understand the cause, but I haven't been able to find a way to solve it yet.
I'm using particles, and they look okay until I turn the camera so that new sprites end up closer to the camera than older ones. At that point the older particles are rendered first, their alpha=1 pixels cover the newer particles, and when the newer particles are rendered, any of their pixels that fall behind the older particles' pixels are not rendered.
From a different camera angle, the smoke becomes partially covered by older particles whose pixels are not actually rendered:
I've read several pages about depth testing, transparency, etc., but I still haven't found a way to solve the problem.
That sounds like a plausible theory. To test it, you could try switching to a non-order-dependent transparency mode (such as M_binary). It will look awful, but if this particular glitch goes away, you'll know that's the problem.
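In Panda3D that test might look something like this (a sketch; `smoke_np` is a stand-in name for whatever NodePath your particle effect is parented under):

```python
from panda3d.core import NodePath, TransparencyAttrib

smoke_np = NodePath("smoke")  # stand-in for your particle effect's NodePath

# M_binary thresholds alpha: every pixel is either fully opaque or fully
# transparent, so the draw order of the sprites no longer matters.
smoke_np.setTransparency(TransparencyAttrib.MBinary)
```

If the hard-edged result no longer shows the glitch, the problem really is order-dependent blending.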
Since you’re using black smoke in this instance, there may be a particular ColorBlendAttrib mode that you can use that would yield the desired effect without being order-dependent.
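One order-independent possibility for dark smoke (a sketch, not tested against this scene; `smoke_np` is again a stand-in name) is reverse-subtractive blending, which simply darkens whatever is already in the framebuffer:

```python
from panda3d.core import NodePath, ColorBlendAttrib

smoke_np = NodePath("smoke")  # stand-in for your particle effect's NodePath

# framebuffer = framebuffer - sprite_rgb * sprite_alpha.
# Subtraction accumulates the same way regardless of draw order, so the
# result is order-independent; the sprite texture's brightness controls
# how strongly the scene behind it is darkened.
smoke_np.setAttrib(ColorBlendAttrib.make(
    ColorBlendAttrib.MInvSubtract,
    ColorBlendAttrib.OIncomingAlpha,
    ColorBlendAttrib.OOne))
```

Note that this assumes the smoke texture encodes darkening amount in its RGB channels; a pure-black texture would subtract nothing.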
What happens if you put your particle-system NodePath into the "unsorted" bin, then disable depth-testing for it? The result, if I'm imagining it sufficiently accurately, might not be perfect, but it might nevertheless look acceptable, given the darkness of your particles.
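That suggestion would look roughly like this (again a sketch with `smoke_np` as a stand-in NodePath; "unsorted" is one of Panda3D's standard culling bins):

```python
from panda3d.core import NodePath

smoke_np = NodePath("smoke")  # stand-in for your particle effect's NodePath

# Draw the particles in the "unsorted" bin (no back-to-front sorting) and
# skip the depth test entirely, so the sprites never occlude one another.
smoke_np.setBin("unsorted", 0)
smoke_np.setDepthTest(False)
```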
That would work, but I think it would mean that other objects (e.g., a bridge that the train rides underneath) could no longer properly appear in front of the smoke particles.
That's true. The sprites no longer cover each other, but now they are drawn in front of all the other models: trees, light posts, etc.
I've also tried using the transparency modes, but so far it hasn't had any effect.
Oh, I think I got it! Here is what I did:
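The actual code of the fix is not shown in the post. Judging from the follow-up remark about "disabling the wrong end of the depth-culling process", it presumably amounted to disabling depth *writing* rather than depth *testing*, roughly like this (an assumption, with `smoke_np` as a stand-in NodePath):

```python
from panda3d.core import NodePath

smoke_np = NodePath("smoke")  # stand-in for your particle effect's NodePath

# Keep depth TESTING on, so solid geometry (the bridge, trees, light posts)
# still occludes the smoke, but turn depth WRITING off, so the sprites no
# longer occlude each other when drawn out of depth order.
smoke_np.setDepthWrite(False)
```

This keeps the particles correctly behind opaque scenery while removing the sprite-on-sprite occlusion.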
Thank you both, you’ve pushed me in the right direction!
Ah, right–I had you disabling the wrong end of the depth-culling process–good catch there!