GeomVertexFormat v3t2 not generating texcoords

I’ve copied and pasted this exact code in multiple places and had it work, and yet in one instance Panda3D keeps generating geoms with no texture coordinates.

PT(GeomVertexData) pData = new GeomVertexData(sName + " vData", GeomVertexFormat::get_v3t2(), Geom::UH_static);
if (!pData) throw std::runtime_error("Unable to instantiate GeomVertexData");
GeomVertexWriter vertex, texture;

vertex = GeomVertexWriter(pData, InternalName::get_vertex());
texture = GeomVertexWriter(pData, InternalName::get_texcoord());

vertex.add_data3f(-1, -1, 0);
vertex.add_data3f(1, -1, 0);
vertex.add_data3f(-1, 1, 0);
vertex.add_data3f(1, 1, 0);

texture.add_data2f(0, 1);
texture.add_data2f(0, 0);
texture.add_data2f(1, 1);
texture.add_data2f(1, 0);

PT(GeomTristrips) pPrimitives = new GeomTristrips(GeomEnums::UH_static);
if (!pPrimitives) throw std::runtime_error("Unable to instantiate GeomTristrips");
pPrimitives->add_vertices(0, 1, 2, 3);
pPrimitives->close_primitive();

PT(Geom) pGeom = new Geom(pData);
if (!pGeom) throw std::runtime_error("Unable to instantiate Geom");
pGeom->add_primitive(pPrimitives);

PT(GeomNode) pNode = new GeomNode(sName + " node");
if (!pNode) throw std::runtime_error("Unable to instantiate GeomNode");
pNode->add_geom(pGeom);

m_inputCard = pWindow->get_render().attach_new_node(pNode);

I’m using NVIDIA’s Nsight Graphics tool for debugging, and with this code I only see the vertex position in the captured vertex description — no texture coordinates. I’m not quite sure what I’m doing wrong; is there any sort of automatic handling where Panda would drop those texture coordinates for some reason, even though they’re present in the GeomVertexFormat?
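
For reference, a quick check along these lines (purely illustrative, not part of the code above) ought to confirm whether the texcoord column actually made it into the vertex data:

// Illustrative sanity check: does the data actually contain a texcoord column?
if (!pData->has_column(InternalName::get_texcoord())) {
  std::cerr << sName << ": no texcoord column in the vertex data" << std::endl;
}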

As a sanity check, I also put together a shader that mapped the texture coordinates to output colors, hoping to see a nice red horizontal gradient and a green vertical gradient, but instead I just got a black screen.
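
That debug fragment shader was just something along these lines:

#version 130

in vec2 texcoord;
out vec4 color;

void main() {
  // Write the interpolated texture coordinates straight into the color:
  // red should ramp horizontally, green vertically.
  color = vec4(texcoord, 0.0, 1.0);
}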

Has anyone run into something like this before? I’m hoping that someone can give me some insight into how Panda generates the OpenGL vertex description from the GeomVertexFormat, and why it’s just generating a position from this.

Thanks!

That looks like it should work. The only reason I can think of why they wouldn’t show up is that your shader doesn’t use them. What does your vertex shader look like?

Does the texture show up without any shader applied?
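
For instance, just temporarily dropping the shader and applying the texture directly to the quad, so it goes through the fixed-function path (pQuadTexture here is a stand-in for whatever texture you normally bind):

// Illustrative only: render the quad without any custom shader.
m_inputCard.clear_shader();
m_inputCard.set_texture(pQuadTexture);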

Previously I was just using texelFetch with gl_FragCoord, but I’m switching to something more resolution-independent and am passing in actual texcoords. When I use texelFetch, the texture shows up fine, but when I use texture() with the texcoord I just see whatever’s at (0, 0) stretched over the entire screen. Here are the actual vertex shader and the test fragment shader that I’m using.

Vertex shader:

#version 140

uniform mat4 p3d_ModelViewMatrix;

// Vertex inputs
in vec4 p3d_Vertex;
in vec2 p3d_MultiTexCoord0;

out vec2 texcoord;

void main() {
  gl_Position = vec4(p3d_Vertex.xy, 0.0, 1.0);
  texcoord = p3d_MultiTexCoord0;
}

Test fragment shader:

#version 130

precision lowp float;

uniform sampler2D p3d_Texture0;

in vec2 texcoord; 
out vec4 color;

void main() {
  color = texture(p3d_Texture0, texcoord, 0);
}
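
For comparison, the old resolution-dependent fragment shader that does work was roughly this:

#version 130

uniform sampler2D p3d_Texture0;

out vec4 color;

void main() {
  // Fetch the texel under the current fragment directly; no texture
  // coordinates needed, but tied to the buffer resolution.
  color = texelFetch(p3d_Texture0, ivec2(gl_FragCoord.xy), 0);
}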

And here’s what I see in Nsight: just the position and no texcoords.

Thanks for your help!

Out of curiosity, what happens if you call it texcoord instead of p3d_MultiTexCoord0? (It does mean you need to rename the other varying already called texcoord).

So I renamed p3d_MultiTexCoord0 to texcoord, and changed the original texcoord to pix_texcoord. When I run that, I get the exact same results.
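
In other words, the vertex shader now reads roughly like this:

#version 140

in vec4 p3d_Vertex;
in vec2 texcoord;        // renamed from p3d_MultiTexCoord0

out vec2 pix_texcoord;   // the varying that used to be called texcoord

void main() {
  gl_Position = vec4(p3d_Vertex.xy, 0.0, 1.0);
  pix_texcoord = texcoord;
}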

That’s very strange. And you’re using a recent version of Panda3D?

Is there some way for you to reproduce this in a simple Python script I can run for myself?

You could also try outputting the texture coordinates into the red and green channels of the output to really make sure that it’s the texture coordinates, and not the texture (somehow).

Yes, I first ran into this while using 1.10.6, and updated to 1.10.7 in the course of troubleshooting.

The really odd thing is that in most situations this code works properly. To give you some background, I’ve written a deferred renderer, and this function is just generating a textured fullscreen quad — that’s why the vertex shader doesn’t do much beyond passing coordinates through. The texture coordinates never show up in any of my post-process pixel shaders — be it the HBAO shader, a depth-aware blur, or anything like that. On the other hand, if I use this exact same geometry-generation code for geometry fed to the first-stage shader, where I generate all of the buffers for the deferred renderer, it works fine.

I may be able to duplicate this in Python - I’m not particularly proficient in the language so it may take me a bit.

This makes me wonder if there’s something about the graphics buffer or the camera that could be overriding my GeomVertexFormat and stripping out the texture coordinates. To your knowledge, is there any sort of state under the hood that could override the GeomVertexFormat when the OpenGL vertex description is created?

No, not to my knowledge. I suspect this is either some quite obscure state-tracking bug in our OpenGL renderer or some mix-up of nodes in your application.
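
One way to rule out the node mix-up, purely as a sketch (using the pWindow from your snippet), would be to walk the scene graph and report any Geom whose data lacks a texcoord column:

NodePathCollection geomNodes = pWindow->get_render().find_all_matches("**/+GeomNode");
for (int i = 0; i < geomNodes.get_num_paths(); ++i) {
  NodePath np = geomNodes.get_path(i);
  GeomNode *pGeomNode = DCAST(GeomNode, np.node());
  for (int j = 0; j < pGeomNode->get_num_geoms(); ++j) {
    // Flag any geometry that ended up without texture coordinates.
    if (!pGeomNode->get_geom(j)->get_vertex_data()->has_column(InternalName::get_texcoord())) {
      std::cerr << np << ": geom " << j << " has no texcoord column" << std::endl;
    }
  }
}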

I have it working now. I tried creating a test build outside of my game to see if I could replicate the problem in a simpler application, and to my annoyance, the test application didn’t duplicate the problem. I copied and pasted the code from the test application back into my game, and suddenly my game started working. The only difference is that the code in the test application didn’t reference the sName string; instead it had hard-coded names for its nodes. I wonder if I was overflowing some buffer somewhere under the hood with a name that was too long.

So I can’t say that I ever came up with a satisfying conclusion… these situations always make me worry that the problem will rear its ugly head again when I least expect it. Thank you very much for your help.