Hi All,
I’m trying to implement the jump flooding algorithm explained here:
(There are several shadertoys included in the article, but they’re not written in GLSL.)
I’m using render-to-texture with Panda3D and Python as part of a GPU port of a larger project, but I haven’t been able to piece together how to make basic render-to-texture work yet.
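For context, the per-pass neighborhood sampling I’m trying to move onto the GPU looks roughly like this on the CPU (a minimal sketch of my understanding of the algorithm, not the actual project code):

```python
# Minimal CPU sketch of jump flooding (for context only).
# grid[y][x] holds the (sx, sy) coordinates of the closest seed found so
# far for that cell, or None if no seed information has reached it yet.
def jump_flood(seeds, w, h):
    grid = [[None] * w for _ in range(h)]
    for sx, sy in seeds:
        grid[sy][sx] = (sx, sy)
    step = max(w, h) // 2
    while step >= 1:
        new = [row[:] for row in grid]
        for y in range(h):
            for x in range(w):
                best = new[y][x]
                # Sample the 9 cells at offsets of +/- step (the "jump").
                for dy in (-step, 0, step):
                    for dx in (-step, 0, step):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] is not None:
                            sx, sy = grid[ny][nx]
                            d = (x - sx) ** 2 + (y - sy) ** 2
                            if best is None or d < (x - best[0]) ** 2 + (y - best[1]) ** 2:
                                best = (sx, sy)
                new[y][x] = best
        grid = new
        step //= 2
    return grid
```

Each GPU pass would do the same 9-sample comparison in the fragment shader, with the step size halving between passes.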
Currently, I’m importing:
import panda3d.core as pc
from direct.showbase.ShowBase import ShowBase
Setting up a frame buffer to render to:
texW = 1024
texH = 512
base = ShowBase()
# Request 8 bits per RGBA channel and a 16-bit depth buffer.
fb_prop = pc.FrameBufferProperties()
fb_prop.setRgbColor(True)
fb_prop.setRgbaBits(8, 8, 8, 8)
fb_prop.setDepthBits(16)
# Create a WindowProperties object set to size.
win_prop = pc.WindowProperties(size=(texW, texH))
# Don't open a window - force it to be an offscreen buffer.
flags = pc.GraphicsPipe.BF_refuse_window
# Create a GraphicsBuffer to render to; we'll get the texture out of this.
tempTex = pc.Texture()
buffer = base.graphicsEngine.makeOutput(
    base.pipe, "Buffer", -100, fb_prop, win_prop, flags,
    base.win.getGsg(), base.win)
buffer.addRenderTexture(tex=tempTex,
                        mode=pc.GraphicsOutput.RTMBindOrCopy,
                        bitplane=pc.GraphicsOutput.RTPColor)
Creating a scene graph, card, and camera:
# Create a scene graph, a camera, and a card to render to
cm = pc.CardMaker('card')
canvas = pc.NodePath("Scene")
card = canvas.attachNewNode(cm.generate())
cam2D = base.make_camera(buffer)
lens = pc.OrthographicLens()
lens.setFilmSize(2, 2)
lens.setNearFar(-1000, 1000)
cam2D.node().setLens(lens)
cam2D.reparentTo(canvas)
Loading and attaching the shaders:
voronoiShader = pc.Shader.load(pc.Shader.SL_GLSL, vertex="quad.vert", fragment="jumpflood.frag")
card.set_shader(voronoiShader)
My vertex shader looks like:
#version 150
// Vertex inputs
in vec4 p3d_Vertex;
in vec2 p3d_MultiTexCoord0;
// Output to fragment shader
out vec2 texcoord;
void main() {
    gl_Position = p3d_Vertex;
    texcoord = p3d_MultiTexCoord0;
}
My fragment shader looks like:
#version 430
// Input from vertex shader
in vec2 texcoord;
// Output to the buffer
out vec4 p3d_FragColor;
void main()
{
    p3d_FragColor.rgba = vec4(1.0);
}
I’m rendering to the buffer:
base.graphicsEngine.renderFrame()
tex = buffer.getScreenshot()
And I’m writing the texture to disk:
tex.write("jumpflood.png")
This doesn’t throw any errors, but the following message is printed to the console:
Known pipe types:
wglGraphicsPipe
(all display modules loaded.)
A black window appears and promptly vanishes (it isn’t 1024x512; it looks more like 800x600), and jumpflood.png is written out as a 1024x512, completely transparent image. I suspect I’ve made quite a few mistakes, but I’ve been trawling through the documentation for a few days now with no luck fixing them, so I thought I’d ask here for help.
I also wasn’t sure how alpha was handled in the fragment shader, so I tried:
p3d_FragColor.rgba = vec4(1.0, 1.0, 1.0, 0.0);
as well, but the output was still transparent.
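Part of why I care about alpha is that I eventually plan to pack each pixel’s closest-seed coordinates into the four 8-bit color channels, which is my reading of how the article’s passes store seed data. A CPU sketch of the packing I have in mind (the function names are mine, not from the article):

```python
# Hypothetical packing: a 16-bit x coordinate split across (r, g) and a
# 16-bit y coordinate split across (b, a), 8 bits per channel.
def encode_seed(x, y):
    return (x >> 8 & 0xFF, x & 0xFF, y >> 8 & 0xFF, y & 0xFF)

def decode_seed(rgba):
    r, g, b, a = rgba
    return ((r << 8) | g, (b << 8) | a)
```

So a fully transparent output would clobber the y coordinate's low byte, which is why the alpha behavior matters to me.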
I’m also particularly confused by the fact that a window pops up every time I run the program, even though my buffer uses the flag:
pc.GraphicsPipe.BF_refuse_window
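One thing I haven’t tried yet: my reading of the manual is that this flag only applies to the buffer itself, and that Panda3D can be told to run fully offscreen with a PRC setting (in Config.prc, or via loadPrcFileData before ShowBase starts). I’m not sure whether this is the intended fix or just papers over my mistake:

window-type offscreen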
Apologies if this post is a bit long, but my mistake(s) could be anywhere in the code, so I wanted to cover all my bases by providing as much as possible up front.