[Solved] Can’t use more than 8 texture units in DirectX

As I said in the topic title, I have a shader that uses more than 8 textures. It works fine in OpenGL, but in DirectX I can’t sample from any texture unit past the eighth: no errors, I just get black values.

I have tried with and without the “arbvp1 arbfp1” profiles, and with and without basic-shaders-only; it still doesn’t work.

My graphics card has 16 texture units, but that shouldn’t even matter, since it’s Shader Model 4.0 hardware.

Am I missing something?

EDIT: for the record, I also tried the vs_3_0/ps_3_0 and hlslv/hlslf profiles, to no avail.
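For anyone trying to reproduce this, the relevant Config.prc settings look like the following (this is just one of the combinations; I toggled both of them):

    load-display pandadx9
    basic-shaders-only #f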

don’t use DirectX?

treeform, that’s neither constructive nor original; I get the same answer whenever I ask on IRC about a DirectX problem. Funnily enough, when I ask about an OpenGL problem I don’t get “then don’t use OpenGL”, and when I ask about a Panda problem in general I don’t get “then don’t use Panda”. (The same thing happens whenever the problem somehow involves Windows.) In any case, I’ll answer it once here so that next time it comes up I can refer back to this post.

One of the reasons I use an engine like Panda rather than learning a single graphics API is that it gives me access to two APIs; refusing to use one of them rather defeats the point.

It’s silly to stop using something just because you run into a problem. Do you do that often in your daily life? I doubt it; if you did, you’d have a hard time getting anything done. My point is that if there’s a simple fix to this problem that costs a day or two of work, it’s worth it to me in exchange for keeping compatibility with both APIs.

Why would I want DirectX in the first place? I’ll give you three reasons, in descending order of importance.

Very important: a fall-back mode for troubleshooting. Say there’s a bug related to one of the APIs in my game, in Panda, in the graphics drivers, or in the OS; most unexpected bugs fall into this category. Here it’s very useful for the user to be able to choose an API, because most of the time simply switching will let them sidestep that particular bug, which eases support enormously. I can tell them: “I’ll look into it; in the meantime, start the game in DirectX/OpenGL.”

Important: Xbox. If it isn’t too much trouble, I’d like to keep DX support, because it would make it easier to port my work to the Xbox if I’m ever in a position to do so.

Less important: DirectX works better than OpenGL on Windows, especially on Vista and 7: better startup time (my game starts in 0.1 seconds under DirectX), marginally better framerates in some situations, and much better behavior on poor drivers. But this is minor…

If what you’re suggesting is that DirectX doesn’t work well with Panda and therefore I shouldn’t use it, well… then maybe we should stop advertising DirectX support as a feature on the Panda homepage, don’t you think?

For the record, reasons (1) and (3) are both considered very important at Disney as well, which is why Panda includes DirectX support in the first place. We would probably even rank (3) ahead of (1), because it’s crucial for us to avoid that initial support call altogether: at the razor-thin profit margins at which all online game companies operate, even one support call is likely to wipe out the profitability of that customer. So we need the game to work perfectly out of the box, even for a user who doesn’t know what a graphics driver is, and that means it has to run well under DirectX.

Of course, Disney doesn’t currently offer any titles that put such heavy demands on the graphics card (there are still a great many Intel 845s in the world), so we’ve never had a reason to explore the shader limits of either API. So, sorry, but I don’t know why you would be hitting this limit. I wonder if it’s related to the texture coordinate sets? DX8 required an explicit texture coordinate pair in the vertex buffer for every texture stage used, even when a stage shared the same texture coordinates as another stage. There was necessarily a limit on the number of texture coordinate pairs that could be passed this way, and I suspect that limit was set, fairly arbitrarily, at 8 (which was a lot back in the DX8 days).
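If memory serves, the FVF vertex format encoding still reflects that limit: the texture-coordinate-count flags only go up to D3DFVF_TEX8, so an FVF vertex simply can’t carry a ninth set. Roughly like this (my illustration, not Panda code; the struct and names are made up):

    #include <d3d9.h>

    // A DX8-style FVF vertex carrying a position and two texture
    // coordinate sets. The encoding defines D3DFVF_TEX0 through
    // D3DFVF_TEX8 and nothing beyond, matching the limit of eight.
    struct SplatVertex {
      float x, y, z;   // D3DFVF_XYZ
      float u0, v0;    // first texcoord set
      float u1, v1;    // second texcoord set
    };
    static const DWORD SPLAT_FVF = D3DFVF_XYZ | D3DFVF_TEX2;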

I don’t think DX9 has the same limitation, since they relaxed the requirements considerably to support more shader features, but I suspect that the DXGeomMunger9 code is still just a basic copy of the DXGeomMunger8 code, and wasn’t necessarily updated to match DX9’s more flexible requirements.
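If it helps narrow things down, a little diagnostic like this (my sketch, nothing Panda-specific) would show what your driver actually reports in the caps bits. Note that MaxSimultaneousTextures and MaxTextureBlendStages describe the fixed-function pipeline, while the shader model governs the sampler count (ps_2_0 and up expose 16 samplers).

    #include <d3d9.h>
    #include <cstdio>

    int main() {
      IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
      if (d3d == NULL) return 1;

      D3DCAPS9 caps;
      if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // Fixed-function limits, not shader sampler limits.
        std::printf("MaxSimultaneousTextures: %lu\n",
                    (unsigned long)caps.MaxSimultaneousTextures);
        std::printf("MaxTextureBlendStages:   %lu\n",
                    (unsigned long)caps.MaxTextureBlendStages);
        // The shader model is what governs the sampler count.
        std::printf("PixelShaderVersion:      %lu.%lu\n",
                    (unsigned long)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    (unsigned long)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
      }
      d3d->Release();
      return 0;
    }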

David

Well, thanks for the pointers, I’ll take a look there. For the record, the reason I hit the limit is a terrain-splatting shader: 4 diffuse maps, 4 normal maps, plus 3 alpha-map channels packed into 1 texture = 9 textures. (If there’s a better way to pass this data to the shader, I’d love to know.) I can easily arrange never to use that many passes on the same terrain chunk, of course, but I’d rather not bake such an arbitrary limit into the terrain engine if it can be avoided.
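For reference, the per-pixel blend boils down to something like this (plain C++ for illustration rather than the actual shader, and the names are made up): the three alpha-map channels weight the upper three diffuse layers, and the base layer takes whatever weight is left over. Presumably the same blend applies to the normal maps.

    struct RGB { float r, g, b; };

    // Four-layer splat: a1..a3 come from the packed alpha-map texture,
    // and the base layer (index 0) receives the remaining weight.
    RGB splat(const RGB layer[4], float a1, float a2, float a3) {
      float a0 = 1.0f - a1 - a2 - a3;
      RGB out;
      out.r = layer[0].r * a0 + layer[1].r * a1 + layer[2].r * a2 + layer[3].r * a3;
      out.g = layer[0].g * a0 + layer[1].g * a1 + layer[2].g * a2 + layer[3].g * a3;
      out.b = layer[0].b * a0 + layer[1].b * a1 + layer[2].b * a2 + layer[3].b * a3;
      return out;
    }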

Are you using the alpha channels of your diffuse and normal maps for anything? If not, you could pass your three alpha maps alongside those, and thus avoid the need for a ninth texture.

But, yeah, an arbitrary limit of eight textures rankles, especially if it’s not due to a hardware limit.

David

No, they’re unused, but I don’t want to constrain the alpha maps to the diffuse maps’ dimensions. Also, I’d like to reserve that channel for an extra effect.

I found this in dxgsg9base.h, btw:

#define D3D_MAXTEXTURESTAGES 8

Which sounded so promising… but increasing it had no effect. I’ll keep looking.

(Sorry for the double post.)

I found it, in dxGraphicsStateGuardian9.cxx:

_max_texture_stages = d3d_caps.MaxSimultaneousTextures;

That sets it to 8 by default for me. If I replace it with:

_max_texture_stages = 16;

Then I get all the texture stages I need and my shader works. You should take a look at that; it seems to me that MaxSimultaneousTextures is being misinterpreted there, but I don’t know exactly what it should be replaced with. If you have any ideas, I’ll be glad to help.

Ah, I bet that’s just intended to be used for the fixed-function pipeline, and shouldn’t be capped at all in the shader case. An easy fix.

David

For archival purposes, David fixed this in CVS for 1.7.0.
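Also for the archives, the shape of the fix is roughly the following (my own sketch, not the actual CVS diff; the helper name is made up): keep MaxSimultaneousTextures as the fixed-function limit, but let the shader model set the sampler count when shaders are in use.

    #include <d3d9.h>

    // Sketch only: D3D9 pixel shaders from ps_2_0 up expose samplers
    // s0..s15, so the fixed-function cap shouldn't apply to them.
    static DWORD choose_max_texture_stages(const D3DCAPS9 &caps,
                                           bool using_shaders) {
      if (using_shaders && caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
        return 16;  // shader sampler count, per the shader model
      }
      return caps.MaxSimultaneousTextures;  // fixed-function blend limit
    }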