worldView.contains problem with sprite height

Hi,

I’m having an issue with cameras.main.worldView.contains and detecting whether a sprite is in the viewport of the window. It seems to only detect the centre of the sprite, not accounting for its full height.

Example:

if (this.cameras.main.worldView.contains(panel1.x, panel1.y)) {
  text.setText('In Viewport: True');
  console.log("In Viewport");
} else {
  text.setText('In Viewport: False');
}

I’m using scrollY to fade the sprite in at a certain point, but I want to detect whether it’s in the window or not and fade it in based on that instead. For whatever reason, the height of the sprite is not taken into consideration, only the centre point.
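Roughly what I have at the moment (simplified sketch; fadeStartY here is just a placeholder threshold, not my real value):

update() {
  // current approach: fade the panel in once the camera has scrolled past a fixed Y value
  if (this.cameras.main.scrollY > fadeStartY) {
    this.tweens.add({
      targets: panel1,
      alpha: 1,
      ease: 'Power1',
      duration: 400
    });
  }
}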

Any help appreciated.

Thanks!

It seems using worldView.contains with Y will only detect the ‘origin point’ of a sprite, so if setOrigin is 0 it will detect it only at the top, 0.5 will be the middle, etc. As soon as that small point is off screen it becomes ‘False’.
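To illustrate (roughly), contains() only tests a single point, and that point moves with the origin:

// contains() checks one point; where that point sits on the sprite depends on the origin
panel1.setOrigin(0);    // (x, y) is the top-left corner
panel1.setOrigin(0.5);  // (x, y) is the centre

// flips to false as soon as that single point leaves the camera's world view
this.cameras.main.worldView.contains(panel1.x, panel1.y);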

So I ended up just writing a function to handle detection of the top, middle and bottom of the sprite.
I can also set an offset at the top and bottom if I want it to animate in earlier or later, etc.

Is there a better way than this? :man_shrugging:

detectViewport(sprite, offset1, offset2) {
  if (this.cameras.main.worldView.contains(sprite.body.x, sprite.body.y + offset1)) { // TOP POINT + OFFSET IF NEED BE
    text.setText('In Viewport: True');
    this.tweens.add({
      targets: sprite,
      alpha: 1,
      ease: 'Power1',
      duration: 400,
      scale: 1
    });
  } else if (this.cameras.main.worldView.contains(sprite.x, sprite.y)) { // MID POINT
    text.setText('In Viewport: True');
  } else if (this.cameras.main.worldView.contains(sprite.x, sprite.y + sprite.displayHeight / 2 + offset2)) { // BOTTOM POINT + OFFSET IF NEED BE
    text.setText('In Viewport: True');
    this.tweens.add({
      targets: sprite,
      alpha: 1,
      ease: 'Power1',
      duration: 400
    });
  } else {
    text.setText('In Viewport: False');
    graphics.clear();
    this.tweens.add({
      targets: sprite,
      alpha: 0,
      ease: 'Power1',
      duration: 400
    });
  }
}

Then in update:

this.detectViewport(panel1, 300, -350)

There is

const visibleObjs = this.cameras.main.cull([gameObjects]);

(corrected)

Hi Samme,

Thanks for the info. Forgive my ignorance, but I don’t understand how this detects whether the sprite is in the viewport or not. Console logging it returns a huge array and no obvious ‘visible true/false’.

The following just always returns true even if the sprite is not in the camera view:

visibleObjs = this.cameras.main.cull(panel1);

if (visibleObjs) {
  console.log("true")
} else {
  console.log("false")
}

Also is it possible to set an offset to detect sprites a bit later / earlier?

Thanks again

cull() is a filter, so for one game object you would use it like

if (this.cameras.main.cull([sprite]).includes(sprite)) {/*…*/}

(My mistake above: argument must be an array.)
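Applied to the panel from the first post, it would be something like (same panel1 and text objects as above):

const visible = this.cameras.main.cull([panel1]).includes(panel1);

text.setText(`In Viewport: ${visible}`);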

But if you want to adjust the view area then I would do something like

function update() {
  const view = Phaser.Geom.Rectangle.Clone(this.cameras.main.worldView);

  // Enlarge or shrink the view:
  Phaser.Geom.Rectangle.Inflate(view, w, h);

  const spriteBounds = sprite.getBounds();

  if (Phaser.Geom.Rectangle.Overlaps(view, spriteBounds)) {
    // …
  }
}
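For the fade-in case, a sketch of how that fits the earlier example (panel1 and text are the objects from the previous posts; 0 and -100 are just example inflate amounts):

function update() {
  const view = Phaser.Geom.Rectangle.Clone(this.cameras.main.worldView);

  // Negative values shrink the view, so the sprite counts as visible later;
  // positive values enlarge it, so it counts as visible earlier.
  Phaser.Geom.Rectangle.Inflate(view, 0, -100);

  const inView = Phaser.Geom.Rectangle.Overlaps(view, panel1.getBounds());

  text.setText(`In Viewport: ${inView}`);
  // trigger the alpha tween from the earlier snippets when inView changes, rather than every frame
}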