Setting a scroll factor on a tilemap layer makes it move faster on narrow screens than on wider ones. This seems like a bug or bad behavior. I’ll briefly summarize the process happening under Phaser’s hood (checked with 3.24 and 3.55.2).
Let’s use the diagram below and only consider the X axis for the sake of clarity:
- In green, the world boundaries based on the tilemap.
- In red, two devices with different form factors (20 px wide portrait vs 50 px wide landscape).
- In blue, the “scrollX” value of each device’s camera for the same midpoint (yellow, say 100px away from the origin).
The scrollX is different on both devices due to their different screen widths:
- the visible area’s left border is 90px away from the world origin on the narrow device.
- the visible area’s left border is 75px away from the world origin on the wide device.
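Those two scrollX values follow directly from how a camera’s scrollX relates to its midpoint; a quick sketch in plain JS (not actual Phaser API):

```javascript
// scrollX is the world-space x of the camera's left edge.
// For a camera centered on `midpoint`, that is midpoint - width / 2.
function scrollXFor(midpoint, width) {
  return midpoint - width / 2;
}

const midpoint = 100;                  // the yellow point in the diagram
console.log(scrollXFor(midpoint, 20)); // narrow device → 90
console.log(scrollXFor(midpoint, 50)); // wide device   → 75
```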
The world contains two layers:
- foreground: where the action happens, has the default scroll factor of 1.
- background: a parallax layer moving at 50% speed: setScrollFactor(0.5).
Phaser implements parallax the following way:
- Get the main camera’s scrollX (narrow: 90px, wide: 75px).
- Compute the parallax’s X offset by multiplying that scrollX value by the scroll factor (0.5 in this case).
- Render the background layer with that resulting offset:
- On the narrow device, the BG layer is offset by 45px when looking at the 100px midpoint.
- On the wide device, the BG layer is offset by 37.5px when looking at the same midpoint.
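The multiplication above, and why it matters, can be sketched like this (a simplified model of the behavior, not Phaser’s actual source):

```javascript
// Simplified model of the current behavior: the parallax layer is
// rendered as if its own scroll were scrollX * scrollFactor.
function layerOffset(scrollX, scrollFactor) {
  return scrollX * scrollFactor;
}

console.log(layerOffset(90, 0.5)); // narrow device → 45
console.log(layerOffset(75, 0.5)); // wide device   → 37.5

// Which world-space point of the BG layer ends up under the camera's
// midpoint? offset + width / 2 — and it differs per device:
console.log(layerOffset(90, 0.5) + 20 / 2); // narrow → 55
console.log(layerOffset(75, 0.5) + 50 / 2); // wide   → 62.5
```

So even though both cameras look at the same world midpoint (100px), each device shows a different slice of the background.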
To me, this is wrong: the parallax offset should be computed from the camera’s midpoint, not from the device-dependent scrollX as it is now.
What are your thoughts? Am I missing something?
NB. A practical reason why I call this a problem is that it makes it impossible to synchronize parallax layers in a predictable way. Consider the example below:
- the hero is on the front layer (scroll factor: 1)
- the window is on the midground layer (scroll factor: 0.75)
- the night sky is on the far end layer (scroll factor: 0.25)
Using Phaser’s current approach, it’s impossible to ensure that, whatever the device, the moon appears in the window just as the player walks in front of it.
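To make that concrete, here is a back-of-the-envelope check with made-up world positions for the window and the moon (the 200/150 values are purely illustrative):

```javascript
// Screen-space x of a point at world `x` on a layer with a given
// scroll factor, under the current scheme:
const screenX = (x, factor, scrollX) => x - scrollX * factor;

// Hypothetical world positions:
const xWindow = 200; // on the midground layer, factor 0.75
const xMoon   = 150; // on the far-end layer,   factor 0.25

// They line up on screen when xWindow - s * 0.75 === xMoon - s * 0.25,
// i.e. at scrollX = (xWindow - xMoon) / 0.5:
const s = (xWindow - xMoon) / 0.5; // → 100

// But the camera *midpoint* at which that happens depends on width,
// so the hero must stand at a different spot on each device:
console.log(s + 20 / 2); // narrow device: alignment at midpoint 110
console.log(s + 50 / 2); // wide device:   alignment at midpoint 125
```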
The default behavior should be: when the main camera’s midpoint targets the center of a non-parallax layer, all parallax layer centers are aligned on that same point.
Ideally, we should be able to define a “focal point” per parallax layer to have more control over this.
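A midpoint-based formula with an optional per-layer focal point could look like this (a sketch of the proposal, not existing Phaser API; `layerScrollX` and `focal` are names I made up):

```javascript
// Proposed: derive the layer's scroll from the camera midpoint and a
// per-layer focal point. When the camera midpoint sits on `focal`,
// every layer shows its own `focal` point at screen center,
// regardless of screen width.
function layerScrollX(midpoint, width, factor, focal = 0) {
  return factor * (midpoint - focal) + focal - width / 2;
}

// World point of the BG layer (factor 0.5) under the camera midpoint:
const underCenter = (m, w) => layerScrollX(m, w, 0.5) + w / 2;
console.log(underCenter(100, 20)); // narrow → 50
console.log(underCenter(100, 50)); // wide   → 50 (device-independent)
```

With `focal = 0` this reduces to plain midpoint-based parallax; a non-zero focal point lets you pin, say, the moon to the window at a specific world position on every device.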