sync-video question.

I just started playing around with Panda3D a couple of days ago, and my general impression is that it’s fantastic. The only problem I have is that I get second-long stalls in every Panda program I run, unless I set

sync-video 0

With sync-video off, I get no stalls and a frame rate that’s usually over 600 FPS. I have a decently powerful Dell Inspiron 1420 with an NVIDIA GeForce 8 Series graphics card, running an admittedly outdated Ubuntu Gutsy Gibbon.

I’ve been wondering why sync-video isn’t working for me. With sync-video on, I tried setting my Panda3D program to the “realtime” round-robin scheduler with the highest priority possible, and I also niced the process to the highest priority possible. Neither made any difference, so I doubt it’s the scheduler.
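
(For anyone curious, the scheduler change I tried is roughly equivalent to the following C sketch. It’s just an illustration of the SCHED_RR and nice settings involved, not exactly what I ran.)

#include <sched.h>
#include <stdio.h>
#include <sys/resource.h>

int main(void) {
    /* Switch to the realtime round-robin scheduler at its highest
       priority; this needs root (or CAP_SYS_NICE) to succeed. */
    struct sched_param param;
    param.sched_priority = sched_get_priority_max(SCHED_RR);
    if (sched_setscheduler(0, SCHED_RR, &param) != 0)
        perror("sched_setscheduler");

    /* Also push the conventional nice level as high as it will go. */
    if (setpriority(PRIO_PROCESS, 0, -20) != 0)
        perror("setpriority");

    /* ... start or continue with the Panda3D program here ... */
    return 0;
}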

Then I started looking at the engine code to see what the sync-video setting actually does, so I could search for any NVIDIA driver information about it. As far as I can tell, sync-video only calls

glXSwapIntervalSGI(1)

which, according to the documentation I looked at, doesn’t actually try to sync your application with the screen refresh. It simply caps the application so it updates no more often than once per screen refresh.

It looks like the GLX calls that actually sync to the screen refresh are
glXGetVideoSyncSGI() and glXWaitVideoSyncSGI(), which I couldn’t find called anywhere. Granted, I haven’t dug too hard.
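
For concreteness, the usage pattern the GLX_SGI_video_sync documentation describes looks roughly like the sketch below. This is just an illustration of those two calls (assuming the extension is available and a GLX context is current), not anything taken from Panda’s code.

#include <GL/glx.h>
#include <GL/glxext.h>

/* Look up the extension entry points at runtime. */
static PFNGLXGETVIDEOSYNCSGIPROC pGetVideoSync;
static PFNGLXWAITVIDEOSYNCSGIPROC pWaitVideoSync;

void init_video_sync(void) {
    pGetVideoSync = (PFNGLXGETVIDEOSYNCSGIPROC)
        glXGetProcAddressARB((const GLubyte *)"glXGetVideoSyncSGI");
    pWaitVideoSync = (PFNGLXWAITVIDEOSYNCSGIPROC)
        glXGetProcAddressARB((const GLubyte *)"glXWaitVideoSyncSGI");
}

/* Block until the next vertical retrace, rather than merely capping
   the swap rate the way glXSwapIntervalSGI(1) does. */
void wait_for_retrace(void) {
    unsigned int count = 0;
    if (pGetVideoSync && pWaitVideoSync && pGetVideoSync(&count) == 0) {
        /* Wait until the retrace counter reaches count + 1. */
        pWaitVideoSync(2, (count + 1) % 2, &count);
    }
}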

So my question is: does Panda3D actually try to sync to the video refresh when it’s using OpenGL, or is it just capped at the refresh rate? It seems to me that glXWaitVideoSyncSGI() might be just the thing needed to make sure my Panda programs wake up and don’t keep stalling for a second or more.

Of course, if this is only happening to me because I’m using an older kernel and NVIDIA driver, there’s not much point in changing things.

So, do other Linux users need to run with sync-video off to avoid stutters and stalls?

All I know is that SwapInterval commonly indicates the interval at which the front buffer and back buffer are swapped. This is so that Panda can render into the back buffer while the front buffer is shown, and swap them when it has finished rendering.
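
Roughly speaking (an illustrative sketch, not Panda’s actual code), a double-buffered GLX loop with a swap interval of 1 behaves like this:

#include <GL/glx.h>

extern void draw_frame(void);   /* application rendering, not shown */

/* Render into the back buffer, then swap it with the front buffer.
   With a swap interval of 1, the driver holds each swap until the
   next vertical retrace, so the loop can't run faster than the
   monitor refreshes. */
void render_loop(Display *dpy, Window win) {
    for (;;) {
        draw_frame();
        glXSwapBuffers(dpy, win);
    }
}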

I’m running a Dell XPS M1530 with an 8-series GeForce as well, on Ubuntu Karmic Koala. I’m guessing it’s a driver issue. Which drivers are you using? I recommend upgrading to the official NVIDIA 185 drivers or so. (I’m not sure how that’s managed in Gutsy; you might need to install the tool “envy” to do the trick there.)

I’m currently using the latest supported nvidia-glx-new package for Gutsy, which is version 100.14.19. I suppose I should bite the bullet, back up, and upgrade to a release that people are still making new packages for.

Great to hear that sync-video works fine with Linux and NVIDIA cards for other people. It did seem kind of wrong to max out an entire CPU simply to render 10 times faster than my screen refreshes.

Wow. Version 100 is really old. I really recommend upgrading.
Just to point out, maxing out your CPU is nothing weird; that’s common behavior. Panda just takes full advantage of the available CPU power, since any cycles that aren’t used are wasted. :)

If you don’t want it to take 100% CPU, you could just add a very tiny, unnoticeable sleep to your game (for example, by setting the “client-sleep” configuration value to, say, 0.00005 seconds).
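
In your Config.prc, that would just be a line like:

client-sleep 0.00005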