I have previously reported on this behaviour here: Screen Space Local Reflections v2. Today I was faced with it again.
My shader won't work correctly on an Intel onboard card if I disable "show-buffers". There are no errors in the console. When show-buffers is enabled, the shader works fine.
That’s a bit of a vague description. What constitutes “doesn’t work correctly”?
For example, with my SSR shader I see only the colour result in the window, without reflections; as soon as I enable show-buffers, it works correctly again.
This is black magic. Something goes wrong in this part of the fragment shader:
for (int i = 1; i < 35; i++)
{
    samplePos = (startPosSS.xy + vectorSS.xy*i);
    currentDepth = linearizeDepth(startPosSS.z + vectorSS.z*i);
    sampleDepth = linearizeDepth( tex2D(depth, samplePos).z );
    deltaD = currentDepth - sampleDepth;
    if ( deltaD > 0 && deltaD < maxDelta)
    {
        color = tex2D(albedo, samplePos);
        color.a *= fade / i;
        break;
    }
}
If I comment out these lines:
for (int i = 1; i < 35; i++)
{
    samplePos = (startPosSS.xy + vectorSS.xy*i);
    currentDepth = linearizeDepth(startPosSS.z + vectorSS.z*i);
    sampleDepth = linearizeDepth( tex2D(depth, samplePos).z );
    deltaD = currentDepth - sampleDepth;
    //if ( deltaD > 0 && deltaD < maxDelta)
    //{
        color = tex2D(albedo, samplePos);
        color.a *= fade / i;
        //break;
    //}
}
then the result with show-buffers enabled is the same as the result without it.
Hmm. I found the cause of this behaviour. For some reason, when show-buffers is disabled, the depth value appears only in the red channel of the texture, while I was reading it from .z (the blue channel).
Glad that you found it. You should read only from the red component, or use f1tex2D instead of tex2D, as attaching a texture to the buffer can (apparently) change the internal storage format.
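For reference, a minimal sketch of the corrected depth fetch inside the loop above, assuming the same depth sampler and linearizeDepth helper already shown (only the component that is read changes):

// Read the depth from the red component, which is where it actually lives
// when the buffer is not being shown; f1tex2D returns a single float.
sampleDepth = linearizeDepth( f1tex2D(depth, samplePos) );
// Equivalent alternative: read .x explicitly instead of .z
// sampleDepth = linearizeDepth( tex2D(depth, samplePos).x );

That way the lookup no longer depends on how the internal storage format happens to spread the depth value across channels.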