prepare_scene() not working.

I have a billboard with a small 64x64 texture applied to it. It isn’t visible when I first enter the game. The first time I rotate the view around to look at the billboard, the game chugs once; after that I can rotate the view around just fine without any problems. Here is the code that creates my billboard:

NodePath coin = window->load_model(SuperGlobals::framework.get_models(),Filename("plane"));
Texture* coin_tex = TexturePool::load_texture(Filename("coin1.png"));
PT(TextureStage) tsc = new TextureStage("tsc");
coin.set_texture(tsc,coin_tex);

coin.reparent_to(render);
coin.set_billboard_point_world();
coin.set_transparency(TransparencyAttrib::M_alpha);
coin.set_light_off(dlnp);

I tried to fix the problem with the following methods, one at a time:

coin.prepare_scene(window->get_graphics_window()->get_gsg());
coin.prepare_scene(window->get_graphics_output()->get_gsg());

render.prepare_scene(window->get_graphics_output()->get_gsg());

I also tried sticking the first two methods in a task but it didn’t seem to fix the problem.

It’s hard to imagine a 64x64 texture causing a visible chug. It’s possible, but I suspect something else is responsible for the chug.

By chance are you using the auto-shader with something like render.set_shader_auto()? That would require a new shader to be compiled the first time a particular render state is encountered, which would indeed cause a visible chug.

Unfortunately, prepare_scene() doesn’t yet handle pre-compiling shaders. I agree it should, but it doesn’t.

David

No, I don’t think I’m using the auto-shader; I don’t think my computer even supports shaders at all.
I thought the lag might happen for anything with a texture, so I replaced the coin with the panda model:

NodePath pandaActor = window->load_model(SuperGlobals::framework.get_models(), "panda-model");
// pandaActor.set_scale(0.025);
// pandaActor.set_y(1000);
// pandaActor.reparent_to(window->get_render());

The panda model created no lag when I turned to face it, so I thought it might have to do with transparency, and I loaded up the environment model instead:

NodePath environ = window->load_model(SuperGlobals::framework.get_models(), "models/environment");

There was still lag when I turned to face the environment, so I tried fixing it with:

render.prepare_scene(SuperGlobals::framework.get_window(0)->get_graphics_window()->get_gsg());

But it didn’t seem to make a difference, so I switched back to using my coin and turned off the transparency by commenting out:

//coin.set_transparency(TransparencyAttrib::M_alpha);

But it still lagged, so I removed the texture by commenting out the following three lines:

Texture* coin_tex = TexturePool::load_texture(Filename("coin1.png"));
PT(TextureStage) tsc = new TextureStage("tsc");
coin.set_texture(tsc,coin_tex);

Now it doesn’t lag, but of course my coin is untextured. So I think the causes of the lag are transparency and dynamically loaded textures.
Is there a better way to load textures? I was also wondering whether there is a better way to call prepare_scene, and whether I need to add anything extra to make it work.

I can do some more experiments with prepare_scene, but I need to know the proper way to call it, and the proper way to get the GSG for the parameter.

Should I do it like this?

coin.prepare_scene(window->get_graphics_window()->get_gsg());

Or like this?

coin.prepare_scene(window->get_graphics_output()->get_gsg());

Where window is a pointer to my current window?

Either of those is correct. The only difference between get_graphics_window() and get_graphics_output() is that the former only returns a value if you are rendering onscreen, while the latter returns a value if you are rendering onscreen or offscreen.

David

Alright, thank you, but I still can’t get the function to have any noticeable effect. I am using Panda 1.7.2; would it make any difference to use the devel build?

I created a source code sample to isolate and test the problem.

#include "pandaFramework.h"
#include "pandaSystem.h"

#include "genericAsyncTask.h"
#include "asyncTaskManager.h"

PandaFramework framework;
PT(AsyncTaskManager) taskMgr = AsyncTaskManager::get_global_ptr(); 
PT(ClockObject) globalClock = ClockObject::get_global_clock();
NodePath camera;

bool bTurnRight = false;
bool bTurnLeft = false;


AsyncTask::DoneStatus spinCameraTask(GenericAsyncTask *task, void *data) {
  double time = globalClock->get_real_time();

  // We don't actually need this (the trackball handles the camera);
  // it's just here so the task isn't empty.
  if (bTurnRight) {
    camera.set_hpr(time * 10.0, 0, 0);
  }
  if (bTurnLeft) {
    camera.set_hpr(-time * 10.0, 0, 0);
  }

  return AsyncTask::DS_cont;
}

int main(int argc, char *argv[]) {
  framework.open_framework(argc, argv);
  framework.set_window_title("Hello World!");

  // Open it!
  WindowFramework *window = framework.open_window();

  // Check the window before using it.
  if (window != (WindowFramework *)NULL) {
    nout << "Opened the window successfully!\n";

    camera = window->get_camera_group();

    // Load the environment model.  It is placed outside the initial view so
    // we can test prepare_scene() before it first becomes visible.
    NodePath environ = window->load_model(framework.get_models(), "models/environment");
    environ.reparent_to(window->get_render());
    environ.set_scale(0.025, 0.025, 0.025);
    environ.set_pos(-100, 0, 0);

    // Enable keyboard detection.
    window->enable_keyboard();
    // Enable default camera movement.
    window->setup_trackball();

    // Load our panda, just to give us something to see at startup.
    NodePath pandaActor = window->load_model(framework.get_models(), "panda-model");
    pandaActor.set_scale(0.005);
    pandaActor.reparent_to(window->get_render());

    // Load the walk animation.
    window->load_model(pandaActor, "panda-walk4");
    window->loop_animations(0);

    // Add our task.
    taskMgr->add(new GenericAsyncTask("Spins the camera",
                                      &spinCameraTask, (void *)NULL));

    // Here we test prepare_scene().
    environ.prepare_scene(window->get_graphics_window()->get_gsg());

    framework.main_loop();
  } else {
    nout << "Could not open the window!\n";
  }

  framework.close_framework();
  return 0;
}

If this works on your computer, I’m guessing it has something to do with my integrated graphics.
Although once I have looked at the model, there is no lag when I turn away and then look at it again.

Thanks in advance.

I will investigate, thanks! I’m juggling several things at once right now, so please forgive me if it takes me a few days to get to it.

David

While I am waiting I am finally getting around to doing my taxes for this year :smiley:

My apologies for the long delay. I’ve just investigated, and discovered that while prepare_scene() was in fact correctly preloading the Texture objects, it was failing to preload the vertex buffers, which is no doubt what was causing the chug in your case.

I’ve just committed the necessary changes that should fix prepare_scene() to preload vertex buffers as well as textures. (It still won’t pre-compute shaders, that’s going to be a more involved fix that will have to come a little bit later.)

Please try it again with the latest buildbot build (once the buildbot server has built with these latest fixes) and let me know if it works better for you now. :slight_smile:

David

hey David

Sorry, I just found out the chug was all my fault. I’m a noob at Visual Studio (I studied my first C++ in Code::Blocks with MinGW), so when I ran the program I always just hit the green arrow to “Start Debugging”. I didn’t realize there was also a “Start Without Debugging” option :/. Now that I am using that, I am having trouble testing prepare_scene properly because I can’t seem to get any chugs whatsoever. When I run it with “Start Debugging” I still get lag, even when I use prepare_scene.
By the way, this is the first time I’ve used the devel build, but it seems to work well so far.