I would like to use GeomPoints to display point clouds. However, it does not look great yet when zooming out with the camera: the point clouds become visually overloaded if the GeomPoints do not scale according to their distance from the camera. I read that setRenderModePerspective should make them render bigger/smaller as the camera moves closer to/farther from the GeomPoints, but it does not work for me. Am I doing something wrong?
I’m bumping up this topic because I’m having the same problem.
First of all, from what I understand so far, if I want to use shaders (I actually use my own shaders, not the autoshader), then unfortunately I have to forget about perspective points. Well, unless I set hardware-point-sprites 0 to fall back to the software-emulated path.
But now… Setting hardware-point-sprites 0, which I do like this:
loadPrcFileData("", "hardware-point-sprites 0")
base = ShowBase()
causes Panda3D to crash with: Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)
I wanted to check what the “performance penalty” would be, but I can’t even check that.
Secondly, a broader question: even if we get this working somehow (via hardware-point-sprites 0, or by installing Panda3D 1.10.14; by the way, when can we expect that version?), is there any chance of making the rendered vertices (geoms) a shape other than plain squares? For example, I found a YouTube channel where someone plays with point clouds (in Unity, though) and renders them, in close-ups, as disks:
Which version of Panda are you using, precisely? 1.10.13?
You can use perspective points in a shader, but you need to calculate the point size yourself in the shader and assign it to gl_PointSize, and set a flag on the ShaderAttrib indicating that the shader specifies the point size. I can help you with the math if you need. This is more efficient than using software-emulated point sprites.
To get round points, enable the AntialiasAttrib.M_point mode. Or, in the fragment shader, remap gl_PointCoord from [0, 1] to [-1, 1], calculate the squared distance to the center, and discard fragments for which it exceeds 1.
The crash is a bug. If you have a reproducible test case, please file a bug report.
I’m still a newbie to shaders, though, and I just didn’t know exactly how perspective worked for Panda3D point clouds, nor did I know there was such a thing as gl_PointSize. I’ve now managed to use it inside a vertex shader and it actually works. Of course I still have to adjust the point size and probably tie it to some perspective equations. If I could ask for a ready-made snippet, I would be grateful; it would save me a lot of time deriving and testing the math.
As for non-square shapes, I couldn’t get anywhere with AntialiasAttrib.M_point (and yes, I enabled multisampling in the config, which only caused a significant drop in performance). However, I liked the idea of just rendering the shape inside a fragment shader, relative to the center given by gl_PointCoord. Not only did I manage to implement this, I also see many more possibilities here, such as blurring points that are beyond the depth of field (although I still have performance concerns once point clouds reach millions of points).
As for the bug - (un)fortunately, when I tried to recreate it, it stopped appearing. Weird…
The point size in perspective mode is calculated by taking some constant factor (dependent on the desired thickness and the size of a pixel on screen) and dividing it by the length of the vertex position transformed into view space.
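In plain numbers, that formula can be sketched like this (a sketch assuming a symmetric frustum and a vertical field of view; the function name and parameters are mine):

```python
from math import radians, tan

def perspective_point_size(world_diameter, view_distance,
                           window_height_px, fov_deg):
    """Approximate on-screen size in pixels of a point with a given
    world-space diameter at a given view-space distance."""
    # Height of the view frustum at depth 1, in world units:
    frustum_height_at_1 = 2.0 * tan(radians(fov_deg) / 2.0)
    # The constant factor folds together the desired thickness
    # and the size of one pixel at unit depth:
    constant = world_diameter * window_height_px / frustum_height_at_1
    return constant / view_distance

size_near = perspective_point_size(0.05, 5.0, 1080, 60.0)
size_far = perspective_point_size(0.05, 10.0, 1080, 60.0)
# Doubling the distance halves the on-screen size.
```

The `constant / view_distance` part is exactly what the vertex shader would compute per point.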
Thank you. I’ve checked, but something seems to be still wrong and I have doubts.
Are you ABSOLUTELY SURE this should be p3d_ViewMatrix? Doesn’t p3d_ViewMatrix expect coordinates that are already out of the model coordinate system? Because for me the effect is that points in the middle of the model are thick, while on its outskirts they are tiny.
On the other hand, it looks quite good when I calculate the length after multiplying p3d_Vertex by p3d_ModelViewProjectionMatrix, p3d_ModelViewMatrix, or p3d_ModelMatrix.
And one more thing: I noticed an unsightly effect with distant points that are already on the order of single pixels, because then you can see a very clear border between (for example) points of size 1 and size 2. And I came up with the idea to simply do:
Unfortunately, I’m back on this topic. It turns out that on at least one computer there is a problem with gl_PointCoord. I tested on an Apple M1 (driver version: 4.1 Metal - 83.1) and on some high-end Windows PCs, and everything is OK. However, on one laptop (also Windows, new, very good, but not a gaming machine; vendor: ATI Technologies Inc, renderer: AMD Radeon Graphics, driver version: 4.6.0 Compatibility Profile Context 22.40.01.54.230214), gl_PointCoord always reports 0 in the shader. What could it be?