When I use MSAA + OpenGL, my graphics card also reports ‘16’. But in practice, the sample quality looks like 2x (not 16x.) You can check this against DX MSAA, which works correctly on my GTX 260 (ie., gives a controllable range between 2x-16x.) Unfortunately, the DX pipeline also automatically caps my frame rate to 60fps no matter what settings I use for sync-video, vsync…etc. I’m not sure if the DX pipeline has other problems with MSAA as well (ie., slow.)
As for OpenGL, I suspect our drivers only allow a 2x multisample, and either Panda or the driver is reporting the wrong maximum. OpenGL drivers are generally (in my opinion) pretty crappy compared to their DX counterparts.
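For anyone who wants to test this themselves: Panda3D lets you request multisampling through Config.prc. This is just a sketch of the relevant variables (the driver is free to grant fewer samples than you ask for, which may be exactly what’s happening here):

```
# Config.prc — request a multisampled framebuffer
framebuffer-multisample 1
multisamples 16
```

You can then compare what you asked for against what the driver actually created by checking the window’s framebuffer properties at runtime (e.g. `base.win.getFbProperties().getMultisamples()`), which should tell you whether Panda or the driver is the one mis-reporting.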
I’m sure I’m getting 16xQ. You can easily tell the difference between x2 and x16. The “jagged edge” look has been totally destroyed…you don’t get that much smoothing with x2. Plus, higher sampling usually forces you to drop your app’s resolution to keep the frame rate up.
Even when I run commercial games on my PC with x2 sampling, there is always some “jagged” look to the graphics…unless I go really high resolution.
I would say Panda has a bug somewhere…but then again, humans made Panda, and we all know humans aren’t perfect.
Would love to see Mr. PHD fix this later…can’t remember his name.
What resolution are you in? Smaller resolutions with x16 would look like x2. I run a 1280x768 res with a forced 16xQ and my edges are liquid smooth. The only way for me to see what a true x2 looks like is to let my graphics card take control and manually set x2.
Of course, when I do that, I can jump to higher resolutions. Even x2 would look much better at a 1600x1024 res.
I can only conclude Panda3D’s sampling settings aren’t based on the most modern graphics cards (drivers), since I’m using a modern-day Nvidia. Pretty much everything on store shelves now will be Nvidia or ATI, new gen.