A while back I found out that one way of loading background images or GUI textures is to make texture cards with the egg-texture-cards command-line tool.
I have also been told that if I want to use non-power-of-two textures, I’ll need to either tell Panda that my hardware supports them or have them scaled to the next bigger power of two instead, with these commands in the Config.prc file respectively:
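The quoted commands didn’t survive in this post; judging from later posts in this thread (which mention “textures-power-2 none” and scaling up rather than down), they are presumably the two textures-power-2 variants, something like:

```
# Assumption reconstructed from later posts in this thread:
# either claim non-power-of-2 support from the hardware...
textures-power-2 none
# ...or, alternatively, scale up to the next larger power of 2
# instead of down (use one or the other)
textures-power-2 up
```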
The visual quality difference of the image is noticeable after these changes.
BUT I have always noticed that my GUI images loaded as texture cards are still blurry compared to the same image viewed in an image viewer. It’s not a big difference and not very noticeable, but when you put them side by side you can easily spot it.
I make texture ‘cards’ with the ‘egg-texture-cards’ Panda tool and load them like any 3D model, as the Manual suggests. Texture cards are also used by some Panda classes like DirectGUI: you can preset different textures for different button states, for example. So I really need to use it.
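For reference, a typical invocation looks something like this (a sketch; -o and -p are the flags mentioned in this thread, and the image file names are hypothetical):

```
egg-texture-cards -o buttons.egg -p 384,384 button_up.png button_down.png
```

The resulting buttons.egg can then be loaded like any other model and handed to DirectGUI as button geometry.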
I’ve heard about
but I’ve also heard it won’t work on hardware that doesn’t support non-power-of-2 textures.
I also know about
which scales your texture to the next bigger power of 2 rather than the next smaller one.
This does provide better visual quality, but the image is still a bit blurred.
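To make concrete what “scale up to the next bigger power of 2” means: for a 1024×768 image, the width is already a power of 2, but the height gets stretched to 1024, which resamples (and therefore blurs) the pixels. A small sketch of the math, illustrative only and not Panda3D code:

```python
def next_pow2(n):
    """Smallest power of two greater than or equal to n."""
    p = 1
    while p < n:
        p *= 2
    return p

# A 1024x768 texture under "textures-power-2 up":
print(next_pow2(1024))  # -> 1024 (already a power of 2, untouched)
print(next_pow2(768))   # -> 1024 (768 rows resampled to 1024)
```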
I have been told that in egg-texture-cards you can set your pixel-perfect resolution with the -p parameter, like
This will make the 3D model pixel-perfect at any resolution that is 768 pixels tall (2 × 384).
The -p parameter to egg-texture-cards has nothing to do with ensuring pixel-perfect rendering. It only tells egg-texture-cards to create a card of the appropriate size to render the texture at its original size onscreen. The texture image may still be scaled bigger or smaller if it is not a power of 2, according to your textures-power-2 setting.
Still, if you have set “textures-power-2 none”, and your window and your texture are both indeed 768 pixels tall, then “-p 384,384” should be the right parameter to make a pixel-perfect texture that fills the screen. Of course, if your goal is to exactly fill the screen, you could just create a -1 … 1 card as Bei suggests, and parent it to render2d instead of aspect2d.
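The arithmetic behind that “-p 384,384” suggestion can be sketched as follows (my own reading of this thread, not the tool’s documentation): the card’s height in screen units is the texture height divided by the -p y value, and render2d maps the 2-unit span from -1 to 1 onto the full window height.

```python
def card_units(tex_height_px, p_y):
    """Card height in render2d units for a given -p y value."""
    return tex_height_px / p_y

def onscreen_pixels(card_height_units, window_height_px):
    """Pixels the card covers when parented to render2d
    (render2d is 2 units tall: -1 .. 1)."""
    return card_height_units * window_height_px / 2.0

# 768-pixel-tall texture, -p 384,384, 768-pixel-tall window:
units = card_units(768, 384)          # 2.0 units, the full -1..1 span
pixels = onscreen_pixels(units, 768)  # 768.0 screen pixels
print(units, pixels)                  # one texel per screen pixel
```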
You might also try setting the minfilter and magfilter to FTNearest, either via Python code or via the egg-texture-cards options “-minf nearest -magf nearest”. This disables any automatic filtering that might be applied in case your scale is still a little bit off.
OK, you caught me in a slight misstatement. Strictly speaking, placing the texture on a card that is precisely 1-to-1 with pixels on the screen ought to result in a pixel-perfect rendering, if you have not downscaled your texture due to a non-power-2 rescale.
But it is not 100% guaranteed to be so. And certainly if you allow your texture to be downscaled it won’t be true. If you allow your texture to be upscaled, it’s more likely to work, but still it might be a little bit fuzzy due to the upscaling. And there still might be some filtering applied, which makes it fuzzy, unless you disable the filters as I suggest.
Achieving pixel-perfect accuracy on 3-D hardware is not straightforward. The hardware is not built for that sort of thing. It can be done, but you have to go out of your way.
I updated my post (I do that often), don’t know how much you read.
Sorry, why then should I use ‘egg-texture-cards’?
I mean, it’s the easiest way to make buttons, or to make images with the proper aspect ratio, but you say I can’t have true pixel-perfect rendering.
‘Pixel-perfect’ rendering of images is something I notice in any modern game I play. Actually, I noticed this ‘blurriness’ by loading a game logo from another game’s main screen in Panda and comparing the two.
I just don’t understand why I can’t have the same detail in Panda with the tools suggested by the Manual.
Also, what if I have a resolution lower than 1024x768? Will Panda load the cards blurry again? That doesn’t make any sense.
You can certainly have pixel-perfect rendering. It just isn’t straightforward. These other games you’re referring to no doubt had to deal with the same sorts of issues, and overcome them in similar ways.
Sorry, my mistake. I’d thought you were referring to a background texture. In the case that your texture is not intended to fill the entire screen, it can be whatever size you like.
There are lots of different things that can cause blurriness. It can be caused by an incorrect size card, by an automatic scale to a power-of-2 texture size, or by an inappropriate filter selection. You have to make sure all of these things are set correctly. We have already described the way you set each of them correctly. Are you still experiencing blurriness after setting each of these as we discussed?
“pixel-perfect” means that the pixels you paint exactly correspond to pixels onscreen. This means your card has to be the same size onscreen as your texture. If your button is rendered smaller than your texture, of course it will blur out the pixels, because you have to subsample the pixels. If you want it to be pixel-perfect even when you have made the window smaller, it means you will have to make your card bigger to compensate. One way is to have different cards, one for each window size. Another way is to forget about the -p parameter to egg-texture-cards, and parent your buttons to pixel2d instead of aspect2d. Then you will be dealing with pixel locations instead of abstract -1…1 units, but you will also have to deal with repositioning your gui elements for a smaller window.
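The trade-off between the two approaches above can be sketched with plain arithmetic (illustrative names, not Panda3D code; this assumes a unit-sized card that you scale yourself):

```python
def pixel2d_scale(tex_width_px, tex_height_px):
    """Under pixel2d one unit equals one window pixel, so scaling
    a unit card to the texture's pixel size keeps it 1-to-1 with
    the screen at any window size."""
    return (tex_width_px, tex_height_px)

def aspect2d_height(tex_height_px, window_height_px):
    """Equivalent card height in the -1..1 vertical span: this
    must be recomputed whenever the window height changes if the
    card is to stay pixel-perfect."""
    return 2.0 * tex_height_px / window_height_px

print(pixel2d_scale(128, 64))     # (128, 64) at any window size
print(aspect2d_height(64, 768))   # ~0.1667, changes with the window
```

This is the practical reason to parent GUI elements to pixel2d when pixel accuracy matters: the scale is fixed, and only positions need adjusting on a resize.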
Well, from your post I concluded that you can’t do it without power-of-2 textures, because that’s how 3D hardware works. My bad.
You mean the -p parameter. Set it to your y resolution / 2. Do I still have this wrong?
From when I modded other games, I don’t remember any that had power-of-2 textures for images and GUIs. So they probably did something like that. But they still seemed ‘pixel-perfect’.
When you say Panda scales them to the next bigger power of 2 during rendering, do you mean scaling the image or changing the canvas size? Is the latter possible? (I hope you get what I mean here.)
This could be it. Nobody had mentioned that to me until today.
Sorry, I think I put it in quotation marks: what I meant by ‘pixel-perfect’ here (and in a few other places) is something like scaling your images down in an image editor. Of course there is ‘subsampling’ between pixels in that case; I didn’t mean that as ‘blurriness’.
Subsampling will cause blurriness. Depending on the nature of the filter, the blurriness may be more or less noticeable. Image editing programs generally use a pretty high-quality filter, so the blurriness is not as noticeable. 3-D hardware uses a relatively low-quality filter by default to do subsampling, so it will be more noticeable. You can reduce the blurriness artifacts by increasing the anisotropic degree.
When I’m wishing for a feature it means I don’t know enough to do it myself, sadly.
In most of them you can disable the filter in the options.
Yeah, it probably means that if any blurriness is produced when scaling to the next bigger power of two, it’s not a problem in my case. I might not even notice it.
Uh, what were the other two possibilities?
I mean minfilter, magfilter. Set them both to “nearest” to disable filtering altogether, that’s what I suggested above.
But you should also experiment with anisotropic filtering. This means setting minfilter and magfilter to “linear”, and setting a value like 4 or higher as the anisotropic filter value. I assure you that this is commonly done in commercial games, and there’s no reason for a game to provide an option to turn it off, so there’s no way for you to know that they’ve done it.
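If you’d rather set this globally than per-texture or per-card, Config.prc has (to the best of my knowledge; the variable name is an assumption, so check your Panda version’s documentation) a setting for the default anisotropic degree:

```
# Assumed Config.prc variable name; verify against your
# Panda3D version's config documentation
texture-anisotropic-degree 4
```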