Can Pure GL Be Used Here?

Hi,

I’ve noticed a particular behavior in Panda3D when it comes to antialiasing: no matter how many samples you request, Panda returns the maximum that your graphics card allows.

Example: if I ask for 2x samples, I always get the full 16x, since my graphics card’s maximum is 16.

This is a major waste of performance.
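
For reference, here’s roughly how I’m requesting the samples and then checking what the window actually granted (a minimal sketch of my setup; `framebuffer-multisample` and `multisamples` are the standard config variables):

```python
# Minimal sketch: request 2x MSAA before the window opens, then report
# how many samples the framebuffer was actually given.
from panda3d.core import loadPrcFileData, AntialiasAttrib
loadPrcFileData("", "framebuffer-multisample true")
loadPrcFileData("", "multisamples 2")

from direct.showbase.ShowBase import ShowBase

base = ShowBase()
base.render.setAntialias(AntialiasAttrib.MMultisample)

# On my machine this prints 16, no matter what was requested above.
print("granted multisamples:", base.win.getFbProperties().getMultisamples())

base.run()
```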

I was wondering… is there a way to import OpenGL and write your own GL code to get the correct number of samples?

If anyone has done this, please share the code. I’m on Windows.
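
Something like this is what I have in mind (a sketch only, assuming PyOpenGL is installed and Panda’s default single-threaded pipeline, so the GL context is current on the app thread):

```python
# Sketch: ask the GL driver directly what the framebuffer is using.
# Assumes PyOpenGL is installed and that Panda is running its default
# single-threaded pipeline, so the GL context is current on this thread.
from direct.showbase.ShowBase import ShowBase
from OpenGL.GL import glGetIntegerv, GL_SAMPLE_BUFFERS, GL_SAMPLES

base = ShowBase()
base.graphicsEngine.renderFrame()  # make sure the window and context exist

print("GL_SAMPLE_BUFFERS:", glGetIntegerv(GL_SAMPLE_BUFFERS))
print("GL_SAMPLES:", glGetIntegerv(GL_SAMPLES))
```

As far as I know this can only *query* the count, though: the sample count is fixed when the window’s pixel format is chosen, so raw GL calls can’t change it after the fact. But at least it would show whether the driver really granted 16 samples or whether Panda is just reporting the wrong number.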

I’d really like to have control over the number of samples requested. I hope the Panda3D developers fix this issue in the future.

When I use MSAA with OpenGL, my graphics card also reports ‘16’, but the actual sample quality is 2x (not 16x). You can check this against DX MSAA, which works correctly on my GTX 260 (i.e., gives a controllable range between 2x and 16x). Unfortunately, the DX pipeline also caps my frame rate at 60 fps no matter what settings I use for sync-video, vsync, etc. I’m not sure whether the DX pipeline has other problems with MSAA as well (e.g., being slow).
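
For what it’s worth, this is the standard way to ask Panda for vsync off before the window opens; in my case the DX frame cap stays at 60 fps regardless (and the driver’s control panel can also override it):

```python
# Standard Panda3D config variable: request vsync off before the
# window is created. The driver control panel can still force it on.
from panda3d.core import loadPrcFileData
loadPrcFileData("", "sync-video false")

from direct.showbase.ShowBase import ShowBase
base = ShowBase()  # ...rest of the app as usual
```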

As for OpenGL, I suspect our drivers only allow 2x multisampling, and either Panda or the driver is reporting the wrong maximum. OpenGL drivers are generally, in my opinion, pretty crappy compared to their DX counterparts.

I’m sure I’m getting 16xQ. You can easily tell the difference between 2x and 16x: the “jagged edge” look is completely gone, and you don’t get that much smoothing with 2x. Plus, higher sampling forces you to reduce your app’s resolution.

Even when I run commercial games on my PC with 2x sampling, there’s always some “jagged” look to the graphics… unless I go to a really high resolution.

I’d say Panda has a bug somewhere… but then again, humans made Panda, and we all know humans aren’t perfect.

Would love to see Mr. PhD fix this later… can’t remember his name.

What kind of video card do you have? I have a GTX 260 running on XP 32-bit, and my ‘16x’ GL MSAA definitely looks like 2x DX MSAA. I’ve uploaded a few pictures to ImageShack.

DX 2x:
img651.imageshack.us/i/dxmsaa2.png/

DX 16x:
img135.imageshack.us/i/dxmsaa16.png/

GL 16x:
img809.imageshack.us/i/openglmsaa16.png/

ImageShack rescaled my images a little, but you can still see that along the tail of the body, the 2x DX and 16x GL shots are still somewhat pixelated, while the 16x DX shot is perfectly smooth.

Does enabling DX MSAA automatically vsync the display for you? (It does for me!)

What resolution are you running at? At smaller resolutions, 16x can look like 2x. I run at 1280x768 with a forced 16xQ, and my edges are liquid smooth. The only way for me to see what a true 2x looks like is to let my graphics card take control and manually set 2x.

Of course, when I do that, I can jump to higher resolutions. Even 2x would look much better at 1600x1024.
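
If you want to compare apples to apples, the window resolution is just another config variable (a sketch; 1280x768 is the resolution I mentioned above):

```python
# Sketch: set the window size before opening it, to compare how the
# same sample count looks at different resolutions.
from panda3d.core import loadPrcFileData
loadPrcFileData("", "win-size 1280 768")
```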

I can only conclude that Panda3D’s sampling settings weren’t written with the most modern graphics cards (and drivers) in mind, since I’m using a modern NVIDIA card. Pretty much everything on store shelves now is a new-generation NVIDIA or ATI card.

Panda’s code may need a rewrite.