Collaborative Sci-Fantasy Tech Demo for an Official Panda3D Showcase

Welcome, 3D artists and Panda3D developers! Let’s build a 10-minute interactive real-time demo together that shows off how cool Panda3D really is. Our goal is to have fun and build something truly impressive that gets more artists and developers using and developing Panda3D.

First Major Thematic Update
We’re moving from “hostile aliens” to non-sentient robots for the shooter segments, as discussed further down this thread.

Outline:
Section 1: Multiple Perspective Intro with Procedural Generation of a Starship
Section 2: Flying the Starship to a Space Station
Section 3: First Person Shooter (or “Sneaker”) Trek through a Space Station
Section 4: Portal into Space, Blow Up the Space Station (or don’t), and Fly Away

Section 1 Details:
Original Inspiration from @Epihaius :

So, we have a Large Futuristic Hangar with a Large Futuristic Starship. Our camera perspective could be in a similar vein to Signal Ops (“a game that will multiply your perspective”), in that we have multiple perspectives that we can switch between. It need not be exactly like Signal Ops; maybe we just press Arrow Right or Arrow Left to switch between “security camera views” as the starship is being constructed by robots.
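To make the camera-switching idea concrete, here is a minimal, self-contained sketch of cycling between a few fixed “security camera” viewpoints with the arrow keys. This is purely illustrative, not project code: the viewpoint positions are placeholders and the environment model is just stand-in scenery that ships with the Panda3D SDK.

from direct.showbase.ShowBase import ShowBase

class CameraSwitchDemo(ShowBase):

    def __init__(self):
        ShowBase.__init__(self)
        self.disableMouse()
        # Stand-in scenery, so there is something to look at.
        self.loader.loadModel("models/environment").reparentTo(self.render)
        # Placeholder "security camera" placements: (position, heading/pitch/roll).
        self.views = [((30, -40, 15), (35, -15, 0)),
                      ((-30, -40, 15), (-35, -15, 0)),
                      ((0, 60, 20), (180, -15, 0))]
        self.currentView = 0
        self.applyView()
        self.accept("arrow_right", self.switchView, [1])
        self.accept("arrow_left", self.switchView, [-1])

    def switchView(self, step):
        # Cycle through the list of viewpoints, wrapping around at either end.
        self.currentView = (self.currentView + step) % len(self.views)
        self.applyView()

    def applyView(self):
        pos, hpr = self.views[self.currentView]
        self.camera.setPos(*pos)
        self.camera.setHpr(*hpr)

CameraSwitchDemo().run()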

Section 1 Initial Requirements:

  • Very high quality starship 3D model
  • Somewhat sophisticated procedural generation code
  • Very high quality hangar 3D models

Section 2 Details:
We will probably want some obvious entry of the player character into the finished starship. I would not suggest going too deep on character development here, as it could quickly get out of scope. I suggest the player character model comes in through a door somewhere inside the hangar, and is directed in third-person perspective up a ramp into the starship cockpit. Once the character sits down, the camera switches to first-person perspective, and the player flies the starship out of the hangar and into space.
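To illustrate the intended camera flow, here is a small, hedged sketch of switching from a third-person chase view to a first-person cockpit view by reparenting the camera. The character and cockpit-seat nodes are empty stand-ins (in the demo they would come from the loaded models), and the Enter key standing in for the “sit down” moment is an assumption.

from direct.showbase.ShowBase import ShowBase

class PerspectiveDemo(ShowBase):

    def __init__(self):
        ShowBase.__init__(self)
        self.disableMouse()
        # Stand-ins for the player character and the cockpit seat locator.
        self.character = self.render.attachNewNode("character")
        self.cockpitSeat = self.render.attachNewNode("cockpit_seat")
        self.cockpitSeat.setPos(0, 50, 10)
        self.enterThirdPerson()
        # Pressing Enter stands in for sitting down in the cockpit.
        self.accept("enter", self.enterFirstPerson)

    def enterThirdPerson(self):
        # The camera trails behind and above the character.
        self.camera.reparentTo(self.character)
        self.camera.setPos(0, -10, 4)
        self.camera.lookAt(self.character)

    def enterFirstPerson(self):
        # The camera snaps to the cockpit seat, looking straight ahead.
        self.camera.reparentTo(self.cockpitSeat)
        self.camera.setPosHpr(0, 0, 0, 0, 0, 0)

PerspectiveDemo().run()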

Section 2 Initial Requirements:

  • Fully rendered starship interior
  • Outer space environment (probably just a skybox)
  • Mothership / port hangar to launch from

Section 3 Details:
The player docks the starship at the space station in first-person perspective. The player egresses from the vehicle through a docking tunnel and enters the space station. The player is immediately greeted by hostile robots (good thing we brought our pilot’s sidearm). *Alternative playthrough: sneak through with slightly increased difficulty.

Section 3 Initial Requirements:

  • Very high quality space station exterior 3D model
  • Very high quality space station interior 3D models
  • Hostile robot models, let’s say 3 NPC variants
  • Very high quality pistol 3D model, plus maybe one or two more high quality weapons found while working through the station

Section 3 Updates:

  • Our project goal for this section has been updated to include the following:

Section 4 Details:
The hostile forces become overwhelming. Luckily, the player discovers a way to portal out into space, if only they can find a spacesuit. The player finds the spacesuit and opens a portal directly into space, taking a self-destruct remote detonator with them. After portaling into open space, the player presses the detonator and the space station explodes in spectacular VFX fashion. *Alternative playthrough: don’t detonate the station, and fly away just the same. The starship, on autopilot, comes and picks up the player. The game is now in the win condition. Fade to black.

Section 4 Initial Requirements:

  • Very high quality space suit model
  • Very high quality detonator model
  • Portal programming magic (which has been demonstrated in Panda before); one possible approach is sketched just after this list
  • Space station explosion VFX
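Regarding the portal item above: as a minimal sketch of one possible approach (plain render-to-texture, which may or may not be how the portal demos shown further down this thread were done), the “view through the portal” could be a second camera rendering into an offscreen buffer, with the resulting texture applied to a card. All placements and the environment model here are placeholders.

from direct.showbase.ShowBase import ShowBase
from panda3d.core import CardMaker

class PortalSketch(ShowBase):

    def __init__(self):
        ShowBase.__init__(self)
        # Stand-in scenery so the portal camera has something to render.
        self.loader.loadModel("models/environment").reparentTo(self.render)
        # Render the scene from the portal's "far side" into an offscreen buffer.
        buffer = self.win.makeTextureBuffer("portal_view", 512, 512)
        portalCam = self.makeCamera(buffer)
        portalCam.reparentTo(self.render)
        portalCam.setPos(-40, 20, 5)  # placeholder "other side" viewpoint
        # A flat card in the main scene acts as the visible portal surface.
        cm = CardMaker("portal_surface")
        cm.setFrame(-2, 2, 0, 6)
        portal = self.render.attachNewNode(cm.generate())
        portal.setPos(0, 25, 0)
        portal.setTexture(buffer.getTexture())

PortalSketch().run()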

Audio Requirements

  • Ambient soundtrack (space station ambiance for instance)
  • Sci-fi/fantasy gun sounds
  • Hostile robot noises
  • Starship noises, flight sounds
  • Music!

Further Details
This is probably the most ambitious project ever conceived for the Panda3D development community. It will require the full commitment of at least several people with great skill in different areas. “Programmer Art” will not be sufficient; we need professional-quality 3D assets. Let’s discuss how this can be achieved. I would like to commit $200 of my own money as an act of good faith once we have very well-defined asset goals in place for the right 3D artist(s).

Progress so far:
Most recent updates:


In just a few weeks of development, we have made several core contributions:
Our GitHub page for collaborators:

To contribute code and 3D models, it is recommended to fork the P3D Space Tech Demo repository to your GitHub profile and then submit pull requests to us for review. We’re happy to receive contributions at this early stage.

Some early graphics:
The initial hangar:

Initial portal programming demo:

With a ramp:


Awesome idea! I’m also willing to donate some money toward media, given that artwork will be so critical to making this work! I’m new to Panda, but happy to help with scripting (or C++, shaders, etc.).

My guess is that this will push Panda’s capabilities a bit, and maybe necessitate implementing a feature or two to get it together, but I think that’s the point here: come up with something that motivates the community to rally around a common goal and build something impressive. And if it’s successful, we can add some really nice showcase screenshots and videos to the site.

Side note: I remember that the Blender organization teamed up with Crystal Space back in the day to build a similar demo: Yo Frankie! – the Apricot Open Game Project, on Crystal Space.

Maybe we could look into doing something similar (partnerships with other open-source communities)? Just ideas.

Looking forward to seeing if we can really get this off the ground!

Your enthusiasm is appreciated!

I’ll go ahead and start building some of these mechanics and getting the overall program structure in place. I’m hoping that the “10 minute scope” will aid significantly in reducing work overload. We’re not out to make an entire AAA game, just a compact 10-minute interactive slice of one. If the set design is good enough, a player may want to just hang out and look at the environment, extending the potential overall length of a playthrough. I would like to allow some wiggle room for this, not being too strict about scripted sequence lengths.

Additionally (and don’t laugh), I’d like the player to be able to either engage the hostiles or sneak past them (with slightly more difficulty). Not everybody likes to play a violent character. I don’t think it would be too difficult to add a separate path through the space station section for this. I think we could also script an ending which does not result in a spectacular explosion for the pacifist, without a big work overhead.

For what it’s worth, of late I’ve been slowly picking away at a character model (intended to be one of two along similar lines) that I intend to offer for people to use.

Specifically, it’s a robot shaped after a humanoid panda.

(The other version is intended to be a knight wearing panda-like armour–although I don’t know whether I’ll get to that one.)

I also have it in mind to offer a “spacecraft” model–however, I think that the style that I have in mind for that might not match the degree of intricacy that you likely want for the project that you describe.

Please no zombies, I’m getting a little sick of those. :frowning:

Might I suggest that we consider something a little different in terms of weapons?

Perhaps high-sci-fi with curious designs: bifurcated guns that look like punch-daggers, or weapons with spinning, folding machinery at the front, etc.

Or science-fantasy with fusions of magic and machinery; crystals driven by brass machinery traced over with circuit-designs, for example.

Of course! I am quite open to other design ideas. I don’t even like zombies! Just trying to get the ball rolling. The most I’m likely to do in the next week is getting the architecture of the demo in place, and doing some physics modeling.

Fair enough!

Maybe once we have things functional, we might have a group discussion about what artistic direction we want.

(Although I might make a few models as inspiration finds me in the meanwhile; if they’re not used here, they can just go into my own “sample models” repository, after all.)

[edit] Minor correction regarding the model that I mentioned making: inspired by the mention of science-fantasy above, I’ve decided to rework my “robot” model into something more science-fantasy-ish.

With regard to this idea for the player character to be a robotic Panda, I quite like it. I’m not sure exactly what you are proposing with this character design, but it would be cool to have a strong Panda reference in the demo. I’m open to the idea of just making the “aliens” robots, or alien robots maybe. I’m not too interested in programming gore, and robotic NPCs could be more fun to blow apart.

Honestly, it could be used as an NPC design, or an enemy design, or something else again, if that’s preferred. (Or not used at all, for that matter!)

Don’t get me wrong: I do like the idea of it being used for the player-character! I just don’t want it to seem that I’m pushing my own design on the project–especially as I haven’t even shown the thing yet! ^^;;

If I recall correctly, I think that my original thought was to have a “character” that devs could use as a stand-in during development, a bit like Unreal Engine’s test-crash-dummy/robot. And since this is Panda3D, I wanted to make something Panda-themed.

That’s very fair!

Indeed, come to think of it, gore might be a little counter to this being used as a general showcase: there are people who might be turned off by gore, or who might be a bit young for such things. Actual killing, too, might not be ideal. Perhaps preferable, then, to have any destruction be of non-sentient, non-gory foes.

I have edited the original post to reflect this, as I agree entirely.

Indeed, we can discuss aesthetics for as long as it takes to reach a good group decision. Though, if I didn’t say “hey these ideas are pretty cool!” we may not get very far. I would like to encourage these sorts of ideas, especially at the beginning of the project. The only thing I’d really try to enforce at this point is the overall architecture of the demo, the 10 minute scope, and the most general parts of the concept as a matter of practicality.


That is, I think, all good to read! :slight_smile:

Thanks for creating this topic :slight_smile: !

As far as the procedural generation of the starship is concerned, I would imagine going about this as follows:

  • in your favorite modeling program:

    1. create the entire model as you normally would;
    2. divide the finished model into separate pieces, with each piece being a reasonably “simple” shape (flat piece, cylinder-like, etc.), to facilitate point 4;
    3. create a linear color gradient with as many colors as possible (it might have to be generated procedurally in Panda using PNMImage);
    4. somehow project the color gradient onto each piece of the model and “bake” it into the vertices (the lowest color value(s) should be near the seam of the model piece where it is attached to the “parent” piece);
    5. export each model piece individually
  • using a Panda Python script, process the exported model files:

    1. create a new model to assemble the different pieces into;
    2. iterate over the vertex data rows of each piece and build up a sequence of vertex indices in an order based on ascending color values;
    3. parse the GeomTriangles primitive of each piece and define the new order of its triangles based on the order of their vertex indices within the aforementioned sequence;
    4. fill the new GeomTriangles primitive with the triangles as defined in the order previously established;
    5. save the result as a .bam file, which will be the actual model to be loaded into the Panda tech demo
  • in the Panda tech demo:

    1. load the .bam file;
    2. replace the original GeomTriangles primitive with a new, empty primitive;
    3. progressively fill the new primitive with the triangles from the original one, in the original order (a rough sketch of steps 1 to 3 follows this list);
    4. as each triangle is added, have the “builder bot” move and/or direct its “construction beam” toward the center of the triangle; a (perhaps pulsating) glow effect could be applied to (a temporary separate model of) the triangle to visualize the solidification of pure energy into matter.
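As a rough sketch of in-demo steps 1 to 3 (leaving out the builder bot and its construction beam), the gradual build-up could look something like the following. It assumes the piece has a single Geom with a single GeomTriangles primitive, the file name “ship.bam” stands in for the output of the processing script, and the 50 ms pacing is arbitrary.

from direct.showbase.ShowBase import ShowBase
from panda3d.core import Geom, GeomTriangles

class BuildUpSketch(ShowBase):

    def __init__(self):
        ShowBase.__init__(self)
        model = self.loader.loadModel("ship.bam")  # assumed output of the script
        model.reparentTo(self.render)
        self.geomNode = model.find("**/+GeomNode").node()
        geom = self.geomNode.modifyGeom(0)
        # Keep the original triangles around, then start with an empty primitive.
        self.origPrim = geom.getPrimitive(0).decompose()
        geom.setPrimitive(0, GeomTriangles(Geom.UHStatic))
        self.nextTri = 0
        # Add one triangle every 50 ms; the builder bot would be cued here too.
        self.taskMgr.doMethodLater(0.05, self.addTriangle, "add_triangle")

    def addTriangle(self, task):
        start = self.nextTri * 3
        if start >= self.origPrim.getNumVertices():
            return task.done
        # Copy the next triangle's three vertex indices into the new primitive.
        prim = self.geomNode.modifyGeom(0).modifyPrimitive(0)
        prim.addVertices(self.origPrim.getVertex(start),
                         self.origPrim.getVertex(start + 1),
                         self.origPrim.getVertex(start + 2))
        prim.closePrimitive()
        self.nextTri += 1
        return task.again

BuildUpSketch().run()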

The “ascending color values” would actually be “packed” integer values, computed in the script as follows:

r, g, b, a = vertex_color  # values in the [0, 255] range
color_value = r << 16 | g << 8 | b

So the color gradient could ideally consist of 16,777,216 (2^24) distinct colors, but in that case it should probably be generated procedurally.
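To connect that packing with step 2 of the external processing script, here is a hedged sketch of building the sorted index sequence with a GeomVertexReader. It assumes the colour column is named "color", that the colours survive export intact, and that vertexData would be obtained from the piece’s Geom via getVertexData().

from panda3d.core import GeomVertexReader

def sortedVertexIndices(vertexData):
    # Read each vertex colour, pack it into an integer as above, and return
    # the vertex indices ordered by ascending packed value.
    reader = GeomVertexReader(vertexData, "color")
    packed = []
    index = 0
    while not reader.isAtEnd():
        r, g, b, a = reader.getData4f()
        # Colours are read back as floats in [0, 1], so rescale before packing.
        value = int(round(r * 255)) << 16 | int(round(g * 255)) << 8 | int(round(b * 255))
        packed.append((value, index))
        index += 1
    return [i for value, i in sorted(packed)]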

As I don’t have much experience with Blender, I’m not sure if the idea of applying a color gradient to model vertices is feasible; what do you guys think? An alternative would be to manually apply vertex paint, but perhaps this would become too tedious?


It’s very feasible, I do believe! (At least in Blender versions before 2.8; later versions might be similar, but I don’t have experience with them, and so fear that I’m not in a position to speak to them.) I’ve used vertex-colours applied in Blender on a variety of occasions, both manually applied and procedurally.

Specifically, I wonder whether a Blender-script might not be a good way to apply the colours. If there’s a way of defining what colour a given vertex should be (location on the ship, nearness to a seam, etc.), then it should be possible to write a script that automatically colours the vertices.

To increase the number of distinct colours available, the script might apply multiple layers of vertex-colours; a given point’s value might then be something like the sum of the layer-values.
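As a starting point, here is a hedged sketch of such a Blender script, colouring each vertex by its distance from a single reference “seam” point (a simple grey ramp rather than the full packed-RGB gradient, and only one colour layer). It assumes Blender 2.8 or later, where loop colours are RGBA (in 2.79 the colour would be a 3-tuple instead); the seam point and normalisation distance are placeholders.

import bpy
from mathutils import Vector

# Placeholder seam reference point; in practice this might be derived from
# where the piece attaches to its parent piece.
seam_point = Vector((0.0, 0.0, 0.0))
max_dist = 10.0  # assumed normalisation distance

obj = bpy.context.active_object
mesh = obj.data
vcol = mesh.vertex_colors.new(name="BuildOrder")

for poly in mesh.polygons:
    for loop_index in poly.loop_indices:
        vert = mesh.vertices[mesh.loops[loop_index].vertex_index]
        # The normalised distance from the seam drives the stored colour value.
        t = min((vert.co - seam_point).length / max_dist, 1.0)
        vcol.data[loop_index].color = (t, t, t, 1.0)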

Ah, that’s good to hear :slight_smile: !

Sounds promising! Sadly I’m not familiar with Blender scripting, so I’m afraid I won’t be of much help in that area. But I will certainly try to write the external Panda script that processes the exported model pieces, as well as the in-demo code to gradually build up the GeomTriangles primitive of the model!


I might give it a shot.

What would be the criterion for determining the colour of a given vertex? Just distance from the seam? Maybe some sort of broad-strokes layering?

That could already be sufficient. Although this means that all vertices near a seam would have the same colour, the triangles they belong to could be added in a random fashion, which should not be too much of a problem if the seam is kept relatively small (the build process could always be temporarily paused until the builder bot gets to the new location in case the distance between two consecutively placed triangles is a bit large, I guess).

If that doesn’t require too much effort, then it could indeed refine the order in which to add triangles even further. It might be called for if a seam is quite large.

In any case, thanks in advance :slight_smile: !


Okay, I’ve put together a short first-draft.

Here’s a gif of it in action, using a simple shader to emulate the described effect:
(Sorry about the mouse-cursor. ^^; )

Test-model:
constructionTest.egg (1.6 MB)

Test-program:

Python code:

from direct.showbase.ShowBase import ShowBase
from panda3d.core import Shader

class Game(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)

        self.disableMouse()

        # A fixed camera placement, looking down at the test-model.
        base.camera.setPos(5, -5, 7)
        base.camera.setHpr(45, -45, 0)

        self.model = loader.loadModel("constructionTest")
        self.model.reparentTo(render)
        # The shader fades triangles in via alpha, so enable transparency.
        self.model.setTransparency(True)

        # Apply the construction-effect shader defined below.
        shader = Shader.load(Shader.SL_GLSL,
                             "constructionTestVertex.glsl",
                             "constructionTestFragment.glsl")
        self.model.setShader(shader)

game = Game()
game.run()

constructionTestVertex.glsl:

#version 130

in vec4 p3d_Vertex;
in vec4 p3d_Color;

uniform mat4 p3d_ModelViewProjectionMatrix;

out vec4 vertexColour;

void main()
{
    gl_Position = p3d_ModelViewProjectionMatrix*p3d_Vertex;
    vertexColour = p3d_Color;
}

constructionTestFragment.glsl:

#version 130

in vec4 vertexColour;

uniform float osg_FrameTime;

out vec4 color;

void main()
{
    // The vertex-colour alpha stores the coarse build order; the RGB sum
    // refines it within each "layer".
    float order = vertexColour.w;
    float value = vertexColour.x + vertexColour.y + vertexColour.z;

    float finalValue = (order*255 + value);

    // Fade each fragment in as the running time passes its build value.
    float alpha = smoothstep(finalValue - 0.1, finalValue, mod(osg_FrameTime, 25));
    if (alpha < 0.1)
    {
        discard;
    }
    color.xyz = vec3(0, order*255/5, 0);
    color.w = alpha;
}

Should I post the Blender-script and/or the Blender-file?

(I presume that further work will be called for, for one thing.)


Wow, a shader-based solution, very cool!
There are some things in that GLSL code that I’m not familiar with, like smoothstep and osg_FrameTime, so it’s very interesting for me to see how one can make effects that change over time :slight_smile: !

If this shader method could be used for the final model as well, that could certainly save a lot of work.

Yeah, that would probably be helpful to those who want to study how the model was made and to further experiment with it – if you are willing to share those files, of course!

Thanks for your work :slight_smile: !

Thank you! :slight_smile:

If you’re interested:

The “smoothstep” function, I believe, is essentially a smoother version of the “step” function, which produces 1 when the given value is over the threshold and 0 when it’s below it. However, instead of a single threshold, it employs two: when the given value is above the upper threshold, it produces 1; when the given value is below the lower threshold, it produces 0; and when the given value is between the thresholds, it produces a value smoothly interpolated (via Hermite interpolation) between 0 and 1.
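For reference, here is a small Python equivalent of what GLSL’s smoothstep does (clamping followed by Hermite smoothing between the two thresholds):

def smoothstep(edge0, edge1, x):
    # Clamp the normalised position to [0, 1], then apply Hermite smoothing.
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

print(smoothstep(0.0, 1.0, 0.25))  # 0.15625, rather than a linear 0.25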

The “osg_FrameTime” uniform is one that I gather is provided by Panda–as shown on this page. In short, it gives the current running time, and is thus useful for animation purposes.

We’d want something a bit more complex than this, I daresay! Something with lighting and normal-mapping and other things besides, I presume!

That said, I mainly didn’t suggest such an approach earlier because my impression was that this element of the showcase is intended to show off procedural generation, which is not what this shader does. ^^;

Of course, if we want to take this shader-based approach instead, then I don’t gainsay it, and am happy to have my shader above used to that end!

(To be clear, I can’t guarantee that I’ll take part in the full shader, although I might.)

Sure! I’m not in the relevant OS right now, so perhaps look for that later.

Bear in mind that there’s some hack-work present, as it was intended only as a first draft.

Oh, and one more caveat: I use a modified version of YABEE that exports vertex-colour alpha-values sourced from the second set of Blender vertex colours, as I recall. This is because Blender, alas, doesn’t support alpha in its vertex colours.

It’s not a major modification however, if I recall correctly. And in this demonstration, it only affects the order in which elements are made to appear; without it, they should all appear at once.

I am thrilled to see such promising work already being contributed here. Thanks @Epihaius for coming up with the idea in the first place, and for now being engaged in the discussion about how we’ll achieve Section 1 technically. I’ll keep working on the architecture, mechanics, things like camera placements in the hangar, DisplayRegion tricks, and physics modeling, so that we can get a skeleton together for the demo.

Here’s the rough modeling of a 90,000 m^2 hangar with some shadow-casting sunlight. I have designed a little heuristic that lets the player switch the camera view with Arrow Right. I’ve also added a little drop shadow to the TextNode.
Here’s the rough modeling of a 90,000 m^2 hangar with some shadow casting sunlight. I have designed a little heuristic that lets the player switch the camera view with Arrow Right. Also, added a little drop shadow to the Text Node.