In the Phaser 3 examples, they use assets/atlas/megaset-0.png as well as megaset-0.json. I’m wondering how these megasets work. I’m not understanding how they’re able to load separate images from the one png.
Thank you!
The JSON file those examples use is generated together with the big image file. It contains a list of objects, each with x/y/width/height properties that identify a specific rectangle on the big image.
Each of those rectangle objects also has a name associated with it. Phaser uses that name to identify the specific part of the image the rectangle points at.
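To make that concrete, here's a rough sketch of what a single entry in a file like megaset-0.json tends to look like (this is the common "JSON Hash" layout produced by tools like TexturePacker; the exact fields can vary, and the frame name shown is made up, but the x/y/w/h rectangle is the essential part):

```javascript
// Roughly what one frame entry in the atlas JSON looks like,
// written as a JS object literal for illustration:
const exampleAtlasData = {
  frames: {
    "someFrameName": {                        // the name you later pass to Phaser
      frame: { x: 2, y: 2, w: 32, h: 32 },    // rectangle inside megaset-0.png
      rotated: false,
      trimmed: false,
      sourceSize: { w: 32, h: 32 }
    }
  }
};
```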
After Phaser has identified all those smaller “sub-images”, it can easily draw any of them when you ask it to.
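In Phaser 3 that roughly boils down to the sketch below: load the png and json together as an atlas, then draw one named frame from it. The frame name 'logo' here is hypothetical, so swap in any name that actually appears in megaset-0.json:

```javascript
// Minimal Phaser 3 sketch: load an atlas, then draw one named frame from it.
class Demo extends Phaser.Scene {
  preload() {
    // One big image plus one JSON file describing the rectangles inside it
    this.load.atlas('megaset', 'assets/atlas/megaset-0.png', 'assets/atlas/megaset-0.json');
  }

  create() {
    // Phaser looks up the rectangle for 'logo' in the atlas data and
    // draws only that region of the big megaset texture.
    this.add.image(400, 300, 'megaset', 'logo');
  }
}

new Phaser.Game({ width: 800, height: 600, scene: Demo });
```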
Does that make sense? It’s pretty much the exact concept of a “texture atlas”. If you google that term, you’ll find a bunch more explanations that are much better than mine.
Thank you so much!