Well, it's kind of a shame that I can get our formats into Panda3D, yet the texture part of the font format is the only reason I need to keep a cache folder. Might there be an alternative, perhaps lower-level, way to create bitmap fonts for Panda? Or maybe I could ask for that feature to be added somehow? (I have no idea how that could work with the egg format.)
BTW, it could be an error in my code, but just to make sure, if I have
data = EggData()
and in the end when everything is set up
I should get an egg file? Because right now it’s empty.
Yes, as I explained before, it is the X coordinate of the point’s 3-D position that determines the width of the character. It is not the “thickness” of the point. Every point requires a 3-D position, which is specified with a vertex in the vertex pool.
Keep in mind that points are primarily used for, well, rendering points. It is only the egg font convention that makes a point be used instead for the purpose of determining the width of a character, and according to the egg font convention, it is the position of the point (and not any other property) that determines the width of the character.
So the problem is that you don’t want to write the texture out to disk at all? Well, there are alternatives, of course. There are many ways to achieve the same thing, it’s just a question of how deep you want to get into the low-level code. The egg library is a relatively high-level interface for constructing geometry, but it’s primarily used for loading files from disk, so it doesn’t have a mechanism built-in for textures that are already resident.
However, you can certainly achieve that. One easy thing to do is just create the fonts without any reference to textures at all, and then apply the texture afterwards, with root.setTexture(myTex), and then a root.flattenLight() to ensure it gets applied down to the individual quads. Another, different trick would be to go ahead and reference the texture in the egg file, with some arbitrary (but nonexistent) filename, and preload the texture in the TexturePool with tex.setFullpath(filename); TexturePool.addTexture(tex). This way when you load the egg file, it will see that the texture filename you named has already been “loaded” and won’t try to reload it. It doesn’t matter if it actually exists on disk or not.
Yes, you should get an egg file. Make sure you have called data.addChild() for each toplevel group you want to add to your egg file, otherwise it will be empty.
I’ll try the different approaches for loading textures when I get the rest working. Thanks.
PS. The pixel data of the glyphs is stored in the file after the character code, for each glyph. Right now I use PIL’s paste() method to copy each small glyph image into the bigger font image, giving the offset each time. I think we might not need PIL for much longer, and I was wondering how you would suggest doing this in Panda. I can’t just append the data to a string. I found PNMImage.copySubImage(), but I’m not sure if it’s the best way of doing it.
Font height as in the y “size” of the character quads, or the distance between two lines of text using this font?
This might be pointless, but can you create fonts by just using the low-level classes such as GeomVertexData, etc.? Because I’m generating a font file with around 9000 characters on each startup, and the game already does some file scans on startup, so I’m afraid generating an egg string might be a bit slow for this. Right now it seems to take around 4 seconds, though the files are not true working font eggs.
BTW, I don’t mind using the <Switch> and <Scalar> fps entries to make previewing the fonts easier. What are their corresponding Panda objects?
The distance between two lines of text. The y “size” is determined by the vertices of the quad for each character.
Certainly. You can create the equivalent structure using low-level GeomVertexData calls, building a node hierarchy that looks the same as the node hierarchy you would get from loadEggData(), and pass that root node to the StaticTextFont constructor.
But are you really creating a truly dynamic font every time, or is it always the same thing? You could simply create it once and then save it in an egg file that you load at runtime.
These correspond to eggGroup.setSwitchFlag(True) and eggGroup.setSwitchFps(2).
Well, we are creating an improved replacement engine for an existing mod-friendly game. There have been programs for creating fonts in that format for some time, and people have been free to replace the default one with their own, which they have been doing for some time. We don’t want to introduce new formats to the modders or force them to convert their existing files to be able to use them in our replacement engine. That would be against our goals: we want to add new stuff without breaking the old.
Anyway, I generated an egg font, but Panda failed to render most of the characters. The font file seems fine when previewing in Pview and character codes are set up correctly.
I’m uploading a test script and the generated egg font + png texture which it loads and tries to use for text. I hope you’ll find what is the problem, because everything in the egg seems correct. mediafire.com/?gbhd5argjwvi61b
BTW, the 100000-line egg file takes pretty long to load, so I’ll probably switch to generating low-level Panda objects directly…
Wow, that’s a big font. But the character codes you have used don’t appear to be Unicode character codes; I don’t know what codepage they belong to. What text are you supplying to Panda to render them? If you supply a string that includes the same character codes that appear in your font, it renders fine, like this:
from direct.directbase.DirectStart import *
from direct.gui.DirectGui import *
font = loader.loadFont('font.egg')
text = u'\u8140\u8141\u8142\u8143\u8144\u8145\u8146\n\u819b\u819c\u819d\u819e\n\u8277\u8278\u8279\n\ue94f\ue950\ue951\ue952'
OnscreenText(text = text, font = font, scale = 0.15)
But if you’re supplying a normal, Unicode-type string like “hello world”, then it’s true that none of the character numbers in that string appear in your font. Perhaps you meant to map whatever character numbers you receive into Unicode?
I don’t suppose the font format for this game is something that FreeType can read natively, is it? It would be awfully convenient if it were, since then you could just feed it to DynamicTextFont and call it done, and you wouldn’t have to deal with all of this preloading of so very many characters at once.
Incidentally, you appear to have placed the origin (0,0) of each character in the center; but the normal font convention is to place the origin at the lower-left corner of each character.
Incidentally, your original post showed a png file that had far fewer characters on it. Why are you generating a new png file with so many characters?
Does your source font consist of a number of smaller png files? You can just reference those png files directly in your new font; there’s no reason that all the characters of a font need to come from a single png file, and no need to synthesize a new png file if you already have working png files.
That was just an example. I didn’t want to post a huge png file.
The game uses Shift-JIS (JASCII) strings. I convert them to Unicode so that Panda can recognise them. I mentioned that in my second post in this thread. I may have misunderstood, thinking Shift-JIS character codes were the same as in Unicode, with the latter simply having more.
Oh, that explains it, though very few character codes seemed to match…
Anyway, while coding this won’t be hard, as you’ll just need a dictionary of character-code pairs, I don’t personally know Japanese, so I feel it will be daunting to actually generate a dictionary like this…
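A side note on the conversion itself: Shift-JIS code points are not the same as Unicode code points (only the single-byte ASCII range mostly overlaps), so a codec has to do the mapping rather than a straight copy of the numbers. For example, in Python (the byte string is just an illustration):

```python
# 'パンダ' ("panda") encoded as Shift-JIS bytes
raw = b'\x83p\x83\x93\x83_'

# The shift_jis codec maps each code to its Unicode equivalent; note
# that the numbers differ: Shift-JIS 0x8370 becomes U+30D1 for 'パ'.
text = raw.decode('shift_jis')
```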
I’m having some difficulties trying to port the code to generate a StaticTextFont directly.
When generating the glyph polygons, should I make two triangles by feeding 3 vertices to a GeomTriangles and calling closePrimitive() twice, then add all the GeomTriangles to a single Geom with addPrimitive()? How do you specify the character codes for the GeomTriangles? Or is that string for Geoms, and do you need a different Geom for each GeomTriangles?
And what about the width-offset vertices? How do I “assign” that GeomPoints to the Geom/GeomTriangles?