setRenderModePerspective not working for GeomPoints?

Hi everyone,

I would like to use GeomPoints to display point clouds. However, it does not look that great yet when zooming out with the camera: the point clouds get a bit overloaded when the GeomPoints do not scale according to their distance from the camera. I read that setRenderModePerspective should make them render smaller/bigger as the camera moves farther from/closer to the GeomPoints, but it does not work for me. Am I doing something wrong?

from panda3d.core import (Geom, GeomNode, GeomPoints, GeomVertexData,
                          GeomVertexFormat, GeomVertexWriter, NodePath, PandaNode)

# Build a single-point Geom with position and color columns.
vdata = GeomVertexData('name', GeomVertexFormat.get_v3c4(), Geom.UHDynamic)
vertex = GeomVertexWriter(vdata, 'vertex')
color = GeomVertexWriter(vdata, 'color')
vertex.addData3(x, y, z)  # x, y, z: the point's position
color.addData4(0, 1, 0, 1)
prim = GeomPoints(Geom.UHDynamic)
prim.addVertex(0)
geom = Geom(vdata)
geom.addPrimitive(prim)
node = GeomNode('point')
node.addGeom(geom)
nodepath = NodePath(PandaNode("nodepath"))
nodepath.attachNewNode(node)
nodepath.setRenderModeThickness(4, 100)  # thickness 4, priority 100
nodepath.setRenderModePerspective(True, 100)
nodepath.setLightOff()
nodepath.reparentTo(self.render)

The code works for me. However, perhaps you have a shader assigned? When testing your code I noticed the auto-shader doesn’t currently handle perspective points correctly.

I filed a bug report for that:


That’s it! I had the auto-shader active. It works for me when I just don’t assign the shader to the geom points. Thank you for looking into this and identifying the problem!

The issue has been fixed and the fix will be part of Panda3D 1.10.14.

You can also set hardware-point-sprites 0 in Config.prc to work around this issue, if you need to use the shader generator, but there will be a performance penalty.


I’m bumping this topic because I’m having the same problem.
First of all, from what I understand so far: if I want to use shaders (I actually use my own shaders, not the auto-shader), then unfortunately I have to forget about perspective, unless I set hardware-point-sprites 0.
But enabling hardware-point-sprites 0, which I do like this:

from panda3d.core import loadPrcFileData
loadPrcFileData("", "hardware-point-sprites 0")

Before:

base = ShowBase()

causes Panda3D to crash with:
Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)
I wanted to check what the “performance penalty” would be, but I can’t even check that.
Secondly, a broader question: even if we get it working somehow (via hardware-point-sprites 0 or by installing Panda3D 1.10.14; by the way, when can we expect this version?), is there any chance of making the rendered vertices (geoms) have a shape other than plain squares? For example, I found a YouTube channel where someone plays with point clouds (in Unity, though) and renders them, in close-ups, as disks:

If you’re making your own shaders, can you not implement the effect of perspective in those shaders?

At a guess, I might suggest nevertheless generating quads (i.e. squares), and then rendering circles on them (surrounded by transparency).


Which version of Panda are you using, precisely? 1.10.13?

You can use perspective points in a shader, but you need to calculate the point size yourself in the shader and assign it to gl_PointSize, and set a flag on the ShaderAttrib indicating that the shader specifies the point size. I can help you with the math if you need. This is more efficient than using software-emulated point sprites.
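
For example, a minimal sketch of the Python side (“points.vert” and “points.frag” stand in for your own shader files; the vertex shader is expected to write gl_PointSize):

from panda3d.core import Shader, ShaderAttrib

# Hypothetical file names for your own shader pair.
shader = Shader.load(Shader.SL_GLSL, vertex="points.vert", fragment="points.frag")
sattr = ShaderAttrib.make(shader)
# Tell Panda3D that the shader computes the point size itself.
sattr = sattr.setFlag(ShaderAttrib.F_shader_point_size, True)
nodepath.setAttrib(sattr)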

To get round points, enable the AntialiasAttrib.M_point mode. Or calculate the squared distance to the center in the shader from gl_PointCoord and discard fragments that exceed 1.
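
For the second approach, a minimal fragment-shader sketch:

// gl_PointCoord runs from 0 to 1 across the point sprite
vec2 pc = gl_PointCoord * 2.0 - 1.0;  // remap to [-1, 1], center at the origin
if (dot(pc, pc) > 1.0) {
    discard;  // squared distance from the center exceeds 1: outside the circle
}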

The crash is a bug. If you have a reproducible test case, please file a bug report.


Thank you for all suggestions!

I’m still a newbie to shaders, though, and I just didn’t know exactly how perspective worked in Panda3D point clouds, nor did I know there was such a thing as gl_PointSize. I’ve now managed to use it inside a vertex shader, and it actually works. Of course, I still have to adjust the point size and probably tie it to some perspective equations. If I could ask for a ready-made snippet, I would be grateful; it would save me a lot of time on deriving mathematical formulas and testing.

As for non-square shapes, I couldn’t get anything done using AntialiasAttrib.M_point (and yes, I enabled multisampling in the config, which only resulted in a significant drop in performance). However, I liked the idea of just rendering the shape inside a fragment shader, based on the distance to the center from gl_PointCoord. Not only did I manage to implement this, but I also see many more possibilities here, such as blurring points that are beyond the depth of field (although I still have performance concerns once point clouds reach millions of points).

As for the bug: (un)fortunately, when I tried to reproduce it, it stopped appearing. Weird…


The point size in perspective mode is calculated by taking some constant factor (dependent on the desired thickness and the size of a pixel on screen) and dividing it by the length of the vertex position transformed into view space.

Thank you but… Do you mean something like this?

	vec4 view = p3d_ViewMatrix * p3d_Vertex;
	gl_PointSize = 50 / length(view);

The effect is rather weird…

That’s a 4D length you’re doing. You’ll want length(view.xyz).


Thank you. I’ve checked, but something still seems to be wrong, and I have doubts.
Are you ABSOLUTELY SURE this should be p3d_ViewMatrix? Doesn’t p3d_ViewMatrix mean coordinates relative to the model coordinate system? Because for me the effect is that points in the middle of the model are thick, while those on its outskirts are tiny.
On the other hand, it looks quite good when I calculate the length after multiplying p3d_Vertex by p3d_ModelViewProjectionMatrix, p3d_ModelViewMatrix, or p3d_ModelMatrix.
And one more thing: I noticed an ugly effect with more distant points, which are already on the order of single pixels: there is a very clear border between, for example, points of size 1 and 2. So I came up with the idea of simply doing:

gl_PointSize = size / gl_Position.z;

Does it make sense?

Yes, it should be p3d_ModelViewMatrix. Sorry for overlooking the mistake in your previous post.
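
Putting the two fixes together:

	vec4 view = p3d_ModelViewMatrix * p3d_Vertex;
	gl_PointSize = 50.0 / length(view.xyz);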


Thank you! Works OK now!

Unfortunately, I’m back on this topic. It turned out that on (at least) one computer, there is a problem with gl_PointCoord. I tested on an Apple M1 (driver version: 4.1 Metal - 83.1) and on some other high-end Windows PCs, and everything is OK. On one laptop, however (also Windows; new and quite good, but not a gaming machine; vendor: ATI Technologies Inc, renderer: AMD Radeon Graphics, driver version: 4.6.0 Compatibility Profile Context 22.40.01.54.230214), gl_PointCoord in the shader always reports 0. What could it be?

Do you have point sprites enabled on the object? I think you need an appropriate TexGenAttrib to enable this.


Many thanks! Indeed, that did the trick! For future reference ;-), in case someone needs it: right after loading the model (which we’ll name model), use the following Python code:

from panda3d.core import TexGenAttrib, TextureStage
model.setTexGen(TextureStage.getDefault(), TexGenAttrib.MPointSprite)