Get pixel of RenderTexture

I am making a scratch-card game and trying to detect the percentage of the scratched area. I am using a RenderTexture to create the scratch effect:

userCanvas = this.add.renderTexture(0, 0, 800, 600);

this.input.on('pointermove', function (pointer) {
    if (pointer.isDown) {
        userCanvas.draw('brush', pointer.x - 32, pointer.y - 32);

        var pixel = userCanvas.texture.getPixel(100, 100);
        document.getElementById('color').textContent = "rgb(" + pixel.r + "," + pixel.g + "," + pixel.b + ")";
    }
}, this);

But it seems it's not possible to get the color of individual pixels this way; it always returns zero.
I tried saveTexture with a key, then getting that texture and reading the color from it, but the result is the same.
What am I doing wrong?

Full example:

Texture has no getPixel method. Use it on the canvas.

Please provide code sample.

var rgb = userCanvas.context.getImageData(100, 100, 1, 1);

It works only for game type = Phaser.CANVAS.
If I set type = Phaser.AUTO, then it uses WebGL rendering and always returns zero.
How can I get a pixel in that case?
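For the Canvas renderer, the single-pixel read above generalizes to the whole texture: grab everything with one getImageData call and count the touched pixels. A minimal sketch — the helper name is mine, and I'm assuming the brush paints opaque pixels onto an initially transparent RenderTexture:

```javascript
// Count the fraction of pixels the brush has touched, given the raw RGBA
// byte array from getImageData. "Scratched" is assumed to mean alpha > 0.
function percentScratched(rgbaData) {
  var touched = 0;
  for (var i = 3; i < rgbaData.length; i += 4) {
    if (rgbaData[i] > 0) { touched++; }
  }
  return 100 * touched / (rgbaData.length / 4);
}

// Browser glue (Canvas renderer only, not exercised here):
//   var img = userCanvas.context.getImageData(0, 0, 800, 600);
//   var percent = percentScratched(img.data);
```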

Example updated:

Thanks for the answer. It can get one pixel inside a callback, but that's not enough to detect the scratched area.

There is another method that can grab a whole area, but it also returns its data inside a callback:

    userCanvas.renderer.snapshotArea(pointer.x, pointer.y, 128, 128, function (image) {
        if (textureManager.exists('area')) {
            textureManager.remove('area');  // drop the old snapshot first
        }
        textureManager.addImage('area', image);
    });

And again, my goal is to detect the scratched area of an existing RenderTexture object. The idea was to get all the pixels and check their colors. I don't understand how to do that with the following methods.

You are a moving target :slight_smile:

It returns the final image of the whole game, not only the RenderTexture. E.g. if the game has a background image, the snapshot will contain it.
So the main question is still open.


Tested. No, it returns the global game image. I think the renderer is a global object.

I think you have to use a CanvasTexture for this. Or else take a snapshot, copy the image to a canvas, and then get the pixels.
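If you go the CanvasTexture route, you don't even need to read every pixel: probing a coarse grid of sample points is usually accurate enough for a scratch game and much cheaper. A sketch, where getPixelAt is a stand-in for however you read a pixel (e.g. a bound CanvasTexture getPixel) and the other names are mine:

```javascript
// Estimate the scratched percentage by probing an evenly spaced grid of
// sample points. getPixelAt(x, y) should return an object with an alpha
// channel, e.g. function (x, y) { return tex.getPixel(x, y); }.
function sampleScratchedPercent(getPixelAt, width, height, step) {
  var samples = 0, hits = 0;
  for (var y = 0; y < height; y += step) {
    for (var x = 0; x < width; x += step) {
      samples++;
      if (getPixelAt(x, y).a > 0) { hits++; }  // assume an opaque brush
    }
  }
  return 100 * hits / samples;
}
```

Raising step trades accuracy for speed; for an 800×600 texture a step of 8–16 is typically plenty to decide "mostly scratched".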

userCanvas.renderer.snapshotArea works fine for me.

Without ‘renderer’ it doesn't work, which is a bug I think…

I tested RenderTexture#snapshot and RenderTexture#snapshotArea, both worked correctly.

The renderer methods are global, those are different.

snapshot snapshotArea

Strange. I tested on the OP’s CodePen. The global method works, the RenderTexture one doesn’t.
If I look at the Phaser code, it shouldn’t make any difference? It just passes through, based on Canvas or WebGL…

In that pen, userCanvas.snapshotArea(…) does snapshot correctly, but I think userCanvas.context.getImageData(…) will always get a blank pixel in WEBGL mode.


It captures correctly; it was my mistake.

So the full solution will be:

  1. for the Canvas renderer, just use getImageData() and check the pixels
  2. for the WebGL renderer:
  • get a snapshot of the area
  • save the image to a canvas
  • use getImageData() and check the pixels
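The WebGL branch above can be sketched like this (browser-only; imageToPixels is my name for the glue, and the snapshotArea wiring follows the earlier posts, so treat it as untested):

```javascript
// Turn the HTMLImageElement handed to the snapshot callback into raw
// RGBA bytes by painting it onto a throwaway 2D canvas.
function imageToPixels(image, width, height) {
  var canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  var ctx = canvas.getContext('2d');
  ctx.drawImage(image, 0, 0);
  return ctx.getImageData(0, 0, width, height).data;  // Uint8ClampedArray
}

// Usage sketch: snapshot the RenderTexture's area, then inspect pixels.
// userCanvas.renderer.snapshotArea(0, 0, 800, 600, function (image) {
//   var rgba = imageToPixels(image, 800, 600);
//   // ...count the non-transparent pixels here...
// });
```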

Thanks for the help. If somebody knows an easier solution, I will appreciate it.

  var gl = canvas.getContext('experimental-webgl');
  var pixels = new Uint8Array(1 * 1 * 4);
  gl.readPixels(100, 100, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

I found in the sources of the snapshot method how WebGL gets all the pixels. But it always returns zeros. What is wrong?

Demo updated:

I think your problem is described here:

You get zeroes because the drawing buffer got cleared.
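That is the classic preserveDrawingBuffer gotcha: by default the WebGL drawing buffer may be cleared after compositing, so a readPixels call made outside the frame that drew it sees zeros. The fix is to read back in the same frame, or to create the context with the buffer preserved. A sketch — the Phaser render flag is an assumption on my part, so check your version's config docs:

```javascript
// Raw WebGL: request a context whose buffer survives compositing so that
// gl.readPixels still sees the last frame (browser-only, not run here).
function getPersistentContext(canvas) {
  return canvas.getContext('webgl', { preserveDrawingBuffer: true });
}

// Phaser 3 game-config equivalent (assumed flag name):
var config = {
  // type: Phaser.WEBGL,
  render: { preserveDrawingBuffer: true }
};
```

Note that keeping the buffer around has a performance cost, so the same-frame readback is usually preferred when you control the render loop.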