Detecting unsupported shaders

How do you detect when a shader compiles correctly but is not supported by the graphics card? Obviously it spits out an error to the command line, but I would like to detect this scenario and switch to another technique when it happens, yet I can't figure out how to do so.

The problem is, a shader might fail for one particular GSG while it works with another. There's no way at all to detect whether a shader will load until a shader context is created, and that happens the first time a model with that shader comes into view.
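
You can at least force that first preparation to happen up front rather than mid-game. A rough sketch, assuming a running ShowBase and assuming prepareScene() covers shader contexts as well as textures (the model and shader paths are placeholders); any driver-side errors should then be printed at startup:

```python
from direct.showbase.ShowBase import ShowBase
from panda3d.core import Shader

base = ShowBase()

model = base.loader.loadModel("mymodel.egg")   # hypothetical model
model.reparentTo(base.render)
model.setShader(Shader.load("myshader.sha"))   # hypothetical shader

# Prepare all rendering state for this window's GSG, then pump one
# frame so any compile/bind errors show up now rather than later.
model.prepareScene(base.win.getGsg())
base.graphicsEngine.renderFrame()
```

That only gets the error printed earlier, though; it doesn't hand you a flag you can branch on.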

Note that there is getErrorFlag(), which will return True in many cases when something goes wrong.
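
For instance, checked right after loading (the path is a placeholder):

```python
from panda3d.core import Shader

# assumes a running ShowBase
shader = Shader.load("myshader.sha")
if shader is None or shader.getErrorFlag():
    print("shader failed to compile, falling back")
```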

Yeah, I saw that error flag, but it explicitly states that it doesn't catch the situation where a shader fails to work on the current graphics card.

Isn't there some way of testing it? I note there is a prepareNow() method on Shader, but it's unclear what happens if you try to prepare and it can't - would it return None? (I would test, but I am not sure where I would get the inputs to that method from. Well, there is a getDefault for the second one, but no idea for the first.)
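
If I had to guess at the call, it would look something like this - the prepared-objects collection presumably comes off the GSG itself, and the GSG either from the window or from getDefaultGsg(). Whether a failed prepare actually returns None is exactly the open question:

```python
from panda3d.core import Shader

# assumes a running ShowBase; the path is a placeholder
shader = Shader.load("myshader.sha")
gsg = base.win.getGsg()
context = shader.prepareNow(gsg.getPreparedObjects(), gsg)
print(context)  # unclear whether this is None when the shader fails
```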

There really has to be a way of doing this - I know it can potentially get complicated, but for typical computers all GSGs should have the same capabilities, and you really do have to be able to scale back elegantly depending on the user's hardware. In my particular case it's a fullscreen filter that fails on my laptop - consequently you can't see anything, which really is game over.
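
For the coarse cases I can at least gate the filter on the GSG's capability bits - a sketch, assuming a running ShowBase, with the two helpers as hypothetical stand-ins for my filter and fallback paths - but this doesn't catch a shader that compiles yet won't run:

```python
gsg = base.win.getGsg()
if gsg.getSupportsBasicShaders():
    apply_fullscreen_filter()    # hypothetical: enable the filter
else:
    use_plain_rendering()        # hypothetical: skip it entirely
```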

Well, it can go wrong in the GLShaderContext constructor (or in GLShaderContext::bind), and at that point you don't really return anything; the only thing I can do in there is print an error. Unless there's some kind of error flag on the GSG that I'm not aware of.

I recommend just specifying a profile in your shader and testing whether that particular profile is supported by the GSG. If so, you have a pretty good chance that it will succeed.
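
For example, pin the profiles at the top of the .sha file with a line like //Cg profile arbvp1 arbfp1, then check the GSG at startup. I don't know of a direct "is this Cg profile supported" query exposed to Python, so the sketch below leans on getSupportsBasicShaders() and getShaderModel(); treating shader model 2.0 as roughly the arbvp1/arbfp1 class of profiles is my assumption, and the shader paths are placeholders:

```python
from panda3d.core import GraphicsStateGuardian, Shader

# assumes a running ShowBase
gsg = base.win.getGsg()

# Guess: SM 2.0 hardware should handle arbvp1/arbfp1-level shaders.
if gsg.getSupportsBasicShaders() and \
        gsg.getShaderModel() >= GraphicsStateGuardian.SM20:
    shader = Shader.load("filter.sha")           # full-featured version
else:
    shader = Shader.load("filter_simple.sha")    # scaled-back version
```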