Struggling to understand performance issue / framerate drop

'lo again.

I’ve been building a scene and just recently noticed a performance issue I’m having little success figuring out.

It started while I was happily testing the scene: I realised the framerate dipped from 60 to 40 FPS when looking at the floor. You read that right: looking directly at the complex scene (a forest) ran at ~60 FPS, but looking at the floor ran at ~40. I did some more experimenting and realised the same thing happens whenever I turn the camera away from the geometry in any way (e.g. if I put the camera outside a small test forest, looking towards it, everything is fine until I turn the camera away, whereupon the framerate drops, despite drawing nothing but a black screen).

According to pstats, the “draw” time pretty much doubles (e.g. from ~17 ms to ~33 ms) when looking away from the geometry. Nothing else (app/cull/etc.) seems to change noticeably, as far as I can tell.
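(In case the setup matters: the pstats server is already running, and I’m connecting to it from the app roughly like this.)

    from panda3d.core import PStatClient

    # Connect to the pstats server running on this machine so the app
    # starts reporting its per-frame timing data.
    PStatClient.connect()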

I’ve used oobeCull, and the objects do at least appear to be culled (i.e. they display as a red wireframe when outside of the view frustum).
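(For anyone following along, I’m toggling that mode like so:)

    # Out-of-body mode with cull visualisation: the view detaches from
    # the real camera, so you can fly around and see what that camera’s
    # frustum is (supposedly) culling.
    base.oobeCull()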

I currently have a test scene set up with nothing in front of the camera and a few hundred objects behind it, yet the framerate is 30 FPS and the draw time is correspondingly high.

In all test scenarios, the low FPS is also constant, whether or not the player/camera is moving around.

render.analyze() shows 512 geoms in total, and the “Geom” graph in pstats also says 512 at all times. If that’s the (average) number of geoms being drawn each frame - and not just how many are in the scene graph - I guess that would suggest all 512 geoms are being drawn, despite not being in view of the camera and supposedly being culled?
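(For reference, I’m getting that count by calling the method below; as I understand it, analyze() just walks the scene graph, so its numbers cover everything in the scene whether it’s in view or not.)

    # Print a breakdown (geoms, vertices, tristrips, ...) of everything
    # parented under render, regardless of what the camera can see.
    render.analyze()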

If you need any more information, feel free to ask. If I’m missing something obvious, feel free to slap me.

For what it’s worth: I’m running Windows. I know there have been Windows-specific bugs that have slipped through the cracks in the past.

Cheers.

Edit: Just to eliminate some confusion and rule out the high geom count, I also have a test scene with ~170 geoms which has the same problem: if I place ~100 objects in front of the camera, I get 60 FPS and can move around without issue, but if I then open my level editor, move those same objects behind the camera and re-run the test, I get 40-50 FPS and noticeable stutter when moving.

Edit #2: I just noticed that “primitive batches” increases from ~20k to ~80k when the geometry is out of view. Could that be the culprit? What might be causing it? (It seems to be the “triangle strips” entry, if that’s relevant.)

Some more info that might help:

I did some more testing and it’s starting to look like the problem is somehow linked to the tristrips being automatically created when Panda loads the model.

So in my case, with egg-mesh set to 1 (the default), Panda creates thousands of tristrips (each with 2 triangles) for the tree. In pstats, “primitive batches” is 2 when the tree is in view and spikes up to 1481 when I move the camera to look away from the tree. If I set egg-mesh to 0 (and re-save the model to bypass the cache), render.analyze() shows there are now 0 tristrips, and in pstats “primitive batches” is 2 regardless of where the tree/camera is.
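(For reference, I’m toggling that roughly like this; it has to take effect before the model is loaded, hence re-saving the model to dodge the model cache.)

    from panda3d.core import loadPrcFileData

    # Stop the egg loader from meshing loose triangles into tristrips;
    # must be set before the model is loaded (and the cache bypassed).
    loadPrcFileData("", "egg-mesh 0")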

So I guess the remaining questions are:

  1. What causes this spike to happen?

  2. Given that pstats reports primitive batches (anywhere from 2 to 1481+) even when the tree is out of view, does that mean it’s still being drawn / sent to the graphics card, even though it’s supposed to have been culled?

Thanks again.

Unfortunately, I do not know what is causing your performance issues, but I do agree that it sounds like bad culling and/or bad mesh optimization. However, I would like to offer a couple of tips/insights for profiling.

First off, you’re running into issues with vertical sync (vsync). With vsync enabled, your frame time will always be a multiple of your monitor’s refresh interval (~16.7 ms on a 60 Hz monitor), since the engine waits to present the rendered image in sync with the monitor. This avoids tearing, but can make your profiling confusing. For example, if your actual render time goes from 15 ms to 17 ms on a 60 Hz monitor, you will see your presented frame time jump from ~16.7 ms to ~33.3 ms. So, a 2 ms difference balloons to about ~17 ms.
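If you want to take vsync out of the equation while profiling, you can disable it before ShowBase starts. A minimal sketch (equivalently, put “sync-video false” in your Config.prc):

    from panda3d.core import loadPrcFileData

    # Disable vsync so measured frame times reflect the actual render
    # cost instead of snapping up to the next refresh interval.
    loadPrcFileData("", "sync-video false")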

Secondly, you should be using frame time for profiling instead of FPS, since frame time is linear and FPS is not. In other words, a 2 ms difference in frame time means the same thing regardless of the starting point (e.g., 16 ms to 14 ms versus 8 ms to 6 ms). The same cannot be said for a difference of 2 FPS. A drop of 2 FPS from 42 FPS (~23.8 ms) to 40 FPS (25 ms) is not the same as a 2 FPS drop from 12 FPS (~83.3 ms) to 10 FPS (100 ms). In the first case, the 2 FPS drop was not as bad, since it was “only” a 1.2 ms increase, but losing 2 FPS when you started at 12 FPS is a much more problematic ~16.7 ms increase in frame time.
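To make the non-linearity concrete, frame time in milliseconds is just 1000 / FPS:

    def frame_time_ms(fps):
        """Convert frames per second to milliseconds per frame."""
        return 1000.0 / fps

    # The same 2 FPS drop costs very different amounts of frame time:
    print(frame_time_ms(40) - frame_time_ms(42))  # ~1.19 ms
    print(frame_time_ms(10) - frame_time_ms(12))  # ~16.67 ms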

You say you are seeing red outlines when the objects aren’t in view; does this mean you have fake-view-frustum-cull enabled? If I’m not mistaken, with this setting off, objects outside the frustum should simply disappear in oobeCull mode rather than turn into a red outline. Could it be that you have this enabled, that the out-of-view objects are being replaced by the red wireframes Panda is generating, and that Panda uses an inefficient method for converting tristrips in particular into line segments?
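If so, you can check for it and turn it off with something like this (a sketch, untested; you would also want to remove the line from any Config.prc that sets it):

    from panda3d.core import ConfigVariableBool, loadPrcFileData

    # See whether the debug wireframes are enabled (the default is off):
    print(ConfigVariableBool("fake-view-frustum-cull").getValue())

    # Turn the setting off; do this early, before the renderer reads it.
    loadPrcFileData("", "fake-view-frustum-cull false")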

Well holy shit. Turns out I did have that enabled and I don’t even remember doing it.

Thanks all for the input.