makeTextureBuffer introduces texture coordinate problems?

Hi, it’s me again.

So I’ve been wanting to get rid of the FilterManager and handle multiple shader passes myself.

It’s really simple, the render-to-texture example helped.

from panda3d.core import Texture, Shader, NodePath, ColorBlendAttrib, loadPrcFileData

loadPrcFileData('', 'textures-power-2 none')
loadPrcFileData('', 'gl-coordinate-system default')

RenderNode = NodePath("RenderNode")
RenderTarget = self.win.makeTextureBuffer("RenderTarget", 1600, 1000)
RenderTexture = RenderTarget.getTexture()
RTCam = self.makeCamera(RenderTarget, sort=-100)


FSQ = loader.loadModel("plane.egg")
FSQ.setShader(Shader.load(Shader.SL_GLSL, vertex="blav.glsl", fragment="blaf.glsl"))

global Screen
Screen = loader.loadModel("plane.egg")
Screen.setShader(Shader.load(Shader.SL_GLSL, vertex="blablav.glsl", fragment="blablaf.glsl"))

This worked perfectly fine until I added the multipass. The plane model is one I found on the net
(I think rdb posted it) and changed to be centered around (0, 0, 0). It’s added at the bottom.

My issue is that, for some reason, it’s not only the texture coordinates that are wrong. Under certain
setups, when I moved the camera, the quad’s two triangles visibly split apart.

The issue, which didn’t happen before, can be seen here:

This is simply …

gl_FragColor = vec4(texcoord0.xy,1,1);

… which should actually look like …

This is the plane.egg:

    <CoordinateSystem> { Z-Up }

    <VertexPool> vpool {
      <Vertex> 0 {
        -0.5 0 -0.5
        <UV> {
          0 0
          <Tangent> { 1 0 0 }
          <Binormal> { 0 0 1 }
        }
        <Normal> { 0 -1 0 }
        <RGBA> { 1 1 1 1 }
      }
      <Vertex> 1 {
        -0.5 0 0.5
        <UV> {
          0 1
          <Tangent> { 1 0 0 }
          <Binormal> { 0 0 1 }
        }
        <Normal> { 0 -1 0 }
        <RGBA> { 1 1 1 1 }
      }
      <Vertex> 2 {
        0.5 0 0.5
        <UV> {
          1 0
          <Tangent> { 1 0 0 }
          <Binormal> { 0 0 1 }
        }
        <Normal> { 0 -1 0 }
        <RGBA> { 1 1 1 1 }
      }
      <Vertex> 3 {
        0.5 0 -0.5
        <UV> {
          1 1
          <Tangent> { 1 0 0 }
          <Binormal> { 0 0 1 }
        }
        <Normal> { 0 -1 0 }
        <RGBA> { 1 1 1 1 }
      }
    }
    <Polygon> {
      <VertexRef> { 0 3 2 <Ref> { vpool } }
    }
    <Polygon> {
      <VertexRef> { 2 1 0 <Ref> { vpool } }
    }

I don’t know what’s going on.

What’s wrong?

Could you post a full runnable example of this issue?

I’ve uploaded it here:

So I switched my plane with a card.

from panda3d.core import CardMaker

FirstPassBuffer = CardMaker("FirstPass")
FirstPass = NodePath(FirstPassBuffer.generate())
FirstPass.setShader(Shader.load(Shader.SL_GLSL, vertex="test_v.glsl", fragment="test_f.glsl"))

Same issue, different look.

I don’t get this. There’s of course a huge chance I’m doing something wrong,
but I’ve minimized the issue so much that I’m doubting it’s my fault. For once.

Plane or Card … both are wrong.


I must admit I have not seen an issue like this before. It does appear that OpenGL does not interpolate the texture coordinates entirely as expected, and the discrepancy seems to be amplified by the fact that you’re sampling the texture using those very texcoords.

The workaround seems to be to determine the texture coordinates based on the vertex positions, like so:

texcoord0 = p3d_Vertex.xz + vec2(0.5, 0.5);
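To see why deriving the texcoords from the positions changes anything, here is a small numeric sketch (plain Python, no Panda3D; the barycentric helper and the sample point are mine, purely illustrative). It interpolates the per-vertex UVs from the plane.egg above across triangle 0-3-2 and compares the result with the position-derived UVs from the workaround:

```python
def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    d = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / d
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / d
    return u, v, 1.0 - u - v

# Vertices of plane.egg projected to the XZ plane, with the UVs the file assigns.
pos = {0: (-0.5, -0.5), 1: (-0.5, 0.5), 2: (0.5, 0.5), 3: (0.5, -0.5)}
uv  = {0: (0.0, 0.0),   1: (0.0, 1.0),  2: (1.0, 0.0), 3: (1.0, 1.0)}

p = (0.4, -0.4)                      # a point inside triangle 0-3-2
w0, w3, w2 = barycentric(p, pos[0], pos[3], pos[2])
interp = (w0 * uv[0][0] + w3 * uv[3][0] + w2 * uv[2][0],
          w0 * uv[0][1] + w3 * uv[3][1] + w2 * uv[2][1])
derived = (p[0] + 0.5, p[1] + 0.5)   # the p3d_Vertex.xz + 0.5 workaround

print(interp)   # interpolated from the egg's per-vertex UVs
print(derived)  # derived from the position
```

Note the big disagreement in the second component (about 0.8 interpolated vs. 0.1 derived): the egg’s UVs for vertices 2 and 3 don’t follow the same position-to-UV mapping as vertices 0 and 1, so the two triangles interpolate differently, which might be exactly what shows up as a seam on screen.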

The other part of the problem might be the positioning of the card in front of the camera. You have a perspective camera and put the card exactly 1 unit in front of it. That’s not guaranteed to completely fill up the camera’s field of view with the card. You are better off using an orthographic lens.

Or, even better, you can write your shader to completely ignore the camera transformation, directly writing out the correct screen coordinates:

gl_Position = vec4(p3d_Vertex.xz * 2, 0, 1);

The multiplication by two is necessary because the plane’s vertex positions run from -0.5 to 0.5.
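As a quick sanity check on that mapping (plain Python, helper name is mine), scaling the plane’s XZ positions by two puts the corners exactly on the edges of clip space:

```python
def to_clip(x, z):
    # Emulates gl_Position = vec4(p3d_Vertex.xz * 2, 0, 1) for a
    # plane whose vertex positions run from -0.5 to 0.5.
    return (x * 2.0, z * 2.0, 0.0, 1.0)

corners = [(-0.5, -0.5), (-0.5, 0.5), (0.5, 0.5), (0.5, -0.5)]
print([to_clip(x, z) for x, z in corners])
# every corner lands on (+/-1, +/-1), i.e. the full NDC range
```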

Well, it’s a weird issue. It’s weirder with plane.egg than with the card,
though I have to test if that’s actually the case.

In some “reparenting” situations, when I moved the camera away from the plane.egg,
the plane actually split up into two triangles.

What if it’s not actually related to the texture coordinates,
but for some reason the triangles are not being recognized as connected?

Btw, I do scale the plane to fullscreen. I’m doing this for convenience, because it gives me a cheap
and easy way to get normals from the camera through world space without having to pass any additional
values to the shader. This way I just use the camera and a plane.

Still raymarching.

I’ll see what I can do to reproduce the splitting issue. I don’t exactly recall how that happened.
And I’m not sure how your advise will help me, but I’ll try and find out!

Rdb, the change of texture coordinates seemed to do the trick!

At least, so far, it looks like it!

Still, the issue is kind of weird. Other people don’t seem to have had this. Hm…

Well … uhm … with the .xy + 0.5 trick I can’t render at half resolution anymore.
Instead, it renders at one quarter of the full resolution.

I might be missing something, I lost track …

NEVERMIND, I forgot that I have to change width/height in the shaders manually. :)

Okay, this doesn’t really work as expected.
First it looked like it, but now I’m fiddling around with new issues.

Okay NOW I finally got it to work! Had to change a few things,
probably was completely unrelated to the workaround!

Anyhow, what’s the real problem behind this, and how can it be fixed?