Hi, guys!
Hope you can help me.
I'm trying to use a render texture's glTexture as the second texture in a camera pipeline. Here's what I want to do:
1. Use one render texture + a first pipeline to generate a texture of pixel "offsets".
2. Draw it into another, bigger render texture at specific places.
3. Run a second pipeline on the camera, feed it the texture from step 2, and mix them in the fragment shader (using the offsets to shift the lookups into the camera's original sampler).
So I call setTexture2D(bigRendTex.glTexture, 1) and expect it to show up on iChannel1, but every iChannel shows the same original (camera) texture.
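For step 3, the mixing shader would look roughly like this (a sketch only: the iChannel names assume Shadertoy-style uniforms, and the offset encoding and 0.1 scale are my assumptions, not taken from the codepen):

```glsl
// Sketch: iChannel0/iChannel1 naming and offset encoding are assumptions.
precision mediump float;

uniform sampler2D iChannel0; // the camera's own texture (bound by Phaser on unit 0)
uniform sampler2D iChannel1; // the big render texture with the "offsets" (expected on unit 1)

varying vec2 outTexCoord;

void main() {
    // decode an offset from the second texture, then shift the camera lookup by it
    vec2 offset = texture2D(iChannel1, outTexCoord).rg - 0.5;
    gl_FragColor = texture2D(iChannel0, outTexCoord + offset * 0.1);
}
```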
This is my codepen
https://codepen.io/pavel-shirobok/pen/PoYjKZw
Can anyone help me understand what I'm doing wrong?
Thanks!
pyf · November 5, 2023, 4:22pm
Did you solve this problem?
Quoting this GitHub issue (opened 3 Sep 2018, closed 2 Oct 2019):
## Version
* Phaser Version: 3.12 beta-4
* Operating system: macOS Sierra 10.12.6
* Browser: Chrome 68
## Description
Hi, I have a development requirement to use a normal map with a sprite. Let me explain: I know about [Light2D](https://www.codeandweb.com/texturepacker/tutorials/how-to-create-light-effects-in-phaser3), but I don't need the light effect. What I actually want is a water-wave effect driven by a ripple normal map.
So I followed the approach in [ForwardDiffuseLightPipeline.js](https://github.com/photonstorm/phaser/blob/1d4b2ed01a157c3cd61cbd3852c1c46c4cad1443/src/renderer/webgl/pipelines/ForwardDiffuseLightPipeline.js) and wrote a fragment shader. My idea is to pass the normal map to the shader via the [setTexture2D](https://photonstorm.github.io/phaser3-docs/Phaser.Renderer.WebGL.Pipelines.TextureTintPipeline.html#setTexture2D__anchor) method, but I found this doesn't achieve the effect I want.
This is the fragment-shader ( `test.frag` ):
```glsl
precision mediump float;
uniform sampler2D uMainSampler;
uniform sampler2D uNormSampler;
varying vec2 outTexCoord;
void main() {
    gl_FragColor = texture2D(uNormSampler, outTexCoord);
}
```
This is the code in scene's `preload` method:
```js
// load.image needs a URL as well as a key; the paths here are placeholders
this.load.image('sprite', 'assets/sprite.png')
this.load.image('normal-mapping', 'assets/normal-mapping.png')
```
This is the code in scene's `create` method:
```js
var testPipeline = new Phaser.Renderer.WebGL.Pipelines.TextureTintPipeline({
    game: this.game,
    renderer: this.game.renderer,
    fragShader: this.cache.shader.get('test.frag')
})
this.renderer.addPipeline('test', testPipeline)

var sprite = this.add.image(500, 400, 'sprite')
sprite.setPipeline('test')

var glTexture = this.textures.get('normal-mapping').source[0].glTexture
this.renderer.getPipeline('test').setTexture2D(glTexture, 1)
```
The above is simple test code and it runs without errors. I'm a beginner with shaders and GLSL; as I understand it, this fragment shader should render the normal-map image. However, the actual result is the sprite image. It seems uNormSampler is the same as uMainSampler, both containing the sprite image.
I'm not sure where I went wrong. Please tell me what I did incorrectly, and how to pass sampler2D texture data to a fragment shader.
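For what it's worth, a workaround that comes up for this class of problem is to re-bind the extra texture every frame from the pipeline's `onBind` hook, because Phaser's batch flushing can rebind textures and leave the active texture unit changed, which silently overwrites a one-time `setTexture2D` call. The sketch below targets the pre-3.50 pipeline API; method names like `setInt1`, the class name, and the `normalGLTexture` property are my assumptions and vary between Phaser versions, so treat this as an outline, not verified code:

```js
// Sketch of a pipeline that keeps a second texture bound on unit 1.
class NormalMapPipeline extends Phaser.Renderer.WebGL.Pipelines.TextureTintPipeline {
    constructor (config)
    {
        super(config)

        // Assign this after preload, e.g.:
        // pipeline.normalGLTexture = this.textures.get('normal-mapping').source[0].glTexture
        this.normalGLTexture = null
    }

    onBind ()
    {
        super.onBind()

        if (this.normalGLTexture)
        {
            var gl = this.gl

            this.setInt1('uNormSampler', 1)   // point the sampler uniform at unit 1
            gl.activeTexture(gl.TEXTURE1)
            gl.bindTexture(gl.TEXTURE_2D, this.normalGLTexture)
            gl.activeTexture(gl.TEXTURE0)     // restore unit 0 for uMainSampler
        }

        return this
    }
}
```

The key detail is restoring `gl.TEXTURE0` at the end: if the active unit is left at 1, Phaser's next bind of the sprite's own texture lands on unit 1 too, and both samplers end up reading the same image, which matches the symptom described above.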