Passing texture output from one CustomPipeline to the next?

I’d like to use the image/texture output of one pipeline as input to another.
Is it possible to get the output image/texture from a custom pipeline after it's executed?

I’m not at all an OpenGL expert, but I’m a bit skeptical whether this would be easily possible. Each Game Object has one Pipeline it uses to render itself. Binding a Pipeline to the Renderer involves changing the OpenGL program to the appropriate Pipeline’s program. Since the image will then be rendered to the screen, I don’t see any obvious way to access the result without, say, taking a screenshot of the canvas after the frame is rendered.
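
For reference, the screenshot route would be Phaser's snapshot API. Something along these lines (the exact signature may differ between versions) - and note that it grabs the entire canvas rather than a single object's output, which is why it's a poor fit for chaining pipelines:

// Grabs the whole composited canvas once the frame has rendered - not just
// one sprite's output. Exact signature may vary between Phaser versions.
game.renderer.snapshot(function (image)
{
    document.body.appendChild(image); // image is an HTMLImageElement
});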

The only solution I can think of with my limited OpenGL knowledge would be to render the sprite to a framebuffer, then pass that into the next Pipeline. I haven’t experimented with the rendering API enough to be able to provide a concrete example, but the Bitmap Mask Pipeline in Phaser actually does something quite similar. Maybe that could be a decent starting point?
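
Very roughly, and purely as an untested sketch, I imagine something along these lines at the Game Object level, using a RenderTexture in place of a raw framebuffer. The pipeline names and texture keys are placeholders, and I'm not certain the sprite's own pipeline is even respected when it's drawn into a RenderTexture:

// Untested sketch - pipeline names and texture keys are placeholders.
var source = this.add.sprite(0, 0, 'blah').setVisible(false);
source.setPipeline('Custom1'); // first shader pass

// Draw the sprite into an offscreen texture and save the result under a new key...
var rt = this.add.renderTexture(0, 0, 256, 256).setVisible(false);
rt.draw(source, 128, 128);
rt.saveTexture('firstPassResult');

// ...then feed that texture into a second object that uses the second pipeline.
this.add.image(10, 10, 'firstPassResult').setPipeline('Custom2');

Whether the first pipeline actually runs during rt.draw() is exactly the part I'd want someone more familiar with the renderer to confirm.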

Alternatively, if your Pipelines only bind different fragment shaders without doing anything else, you could look into combining the two shaders into one. Not sure if this is a good solution, but I think it should work.

Yeah, I've been thinking of perhaps combining multiple shaders into one and just passing in a flag to switch them on and off as needed. It's not as modular and flexible as I'd like, but it might work for most of what I want to do.
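
For instance, something like this is what I have in mind - the two effects below are just stand-ins for my real shaders, the uniform name is made up, and I'm assuming the Texture Tint Pipeline's default uMainSampler/outTexCoord names:

// Two placeholder effects merged into one fragment shader, toggled by a uniform.
var combinedFrag = [
    'precision mediump float;',
    'uniform sampler2D uMainSampler;',
    'uniform float uUseEffectB;', // made-up uniform: 0.0 = effect A, 1.0 = effect B
    'varying vec2 outTexCoord;',
    'void main (void)',
    '{',
    '    vec4 color = texture2D(uMainSampler, outTexCoord);',
    '    vec4 effectA = vec4(color.rgb * 0.5, color.a);', // stand-in for shader 1
    '    vec4 effectB = vec4(1.0 - color.rgb, color.a);', // stand-in for shader 2
    '    gl_FragColor = mix(effectA, effectB, uUseEffectB);',
    '}'
].join('\n');

// At runtime, flip the flag on the (hypothetical) combined pipeline before rendering:
myCombinedPipeline.setFloat1('uUseEffectB', 1.0);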

It is possible right now to apply two pipelines by setting one on the camera and the other on each sprite/image that needs it. From my testing this does allow you to get two shader effects on a single object, so it's possible in some sense. But it feels a little hacky and is limited to two effects. I'm also not sure whether setting a pipeline on hundreds of individual sprites means bad performance?

customPipeline1 = game.renderer.addPipeline('Custom1', new CustomPipeline1(game)); 
customPipeline2 = game.renderer.addPipeline('Custom2', new CustomPipeline2(game));

var mySprite = this.add.sprite(10, 10, 'blah');

mySprite.setPipeline('Custom1'); //apply a shader to an individual sprite
this.cameras.main.setRenderToTexture('Custom2'); //apply a shader to a camera (and thus any sprite that it doesn't ignore)

I've been walking through the renderer code and pipeline stuff in Phaser, but it's not the easiest to understand - I'm not well versed in any of this lower-level rendering stuff; I've just let Phaser handle that magic for me in the past.


For reference, linking in my post in the old forums on this subject: http://www.html5gamedevs.com/topic/37307-implementing-custom-shader-effects-extending-the-webglrenderer-existing-pipelines-where-did-effectlayer-go/?do=findComment&comment=238089

That definitely is a clever idea - if it gets the job done, good job!

Looking at the rendering code, the sprite batch is flushed (i.e. the vertices built up inside of it are rendered, which - as far as I'm aware - is one of the slowest parts of the renderer) only when two consecutive sprites use different Pipelines, not when one of them uses a non-default Pipeline. If your hundreds of individual sprites use the same Pipeline instance and come directly after each other in the display list, only the first one will actually switch the Pipeline - we can see this from the first if statement inside the setPipeline method of the WebGL Renderer.

If, however, they're interleaved with other sprites which use the Texture Tint Pipeline (the default Pipeline which almost everything uses), the sprite batch will constantly have to be flushed, which will be bad for performance. Of course, changing a texture may also require the batch to be flushed (some GPUs support multiple textures at once, but a cursory glance at the Texture Tint Pipeline makes me skeptical whether Phaser actually makes use of that - maybe someone can confirm this?), so Pipelines wouldn't be your only concern if your sprites use different textures.
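
Paraphrasing that check from memory, it's roughly the following - a simplification, not the exact Phaser source:

// Simplified paraphrase of WebGLRenderer.setPipeline - not the exact source.
setPipeline: function (pipelineInstance)
{
    if (this.currentPipeline !== pipelineInstance)
    {
        this.flush(); // the expensive part: draw everything batched so far
        this.currentPipeline = pipelineInstance;
        this.currentPipeline.bind(); // switches the GL program
    }

    this.currentPipeline.onBind();

    return this.currentPipeline;
}

So as long as consecutive objects resolve to the same pipeline instance, that branch is skipped and no extra flush happens.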

Thanks for the detailed response, Telinc1 - I'll do some more digging.