Further to my previous post on the Geom.Mesh class, I’ve been messing around with the GameObject.Mesh object. In particular, the class allows you to create simple 3D objects by giving each face a colour - see below for a rotating cube with different coloured faces.
The class also allows you to map a texture if you give it the right uv coordinates - see below for a rotatable Earth.
And WebGL is driving this in the background (if I understand it right), so it’s fast - surprisingly powerful for a 2D game engine.
However, I do have a few questions for my senpai programmers:
i) perspective projection matrix - the documentation states rendering takes place using an orthographic camera, and yet the object is equipped with a projectionMatrix, together with a setPerspective method. What are these for?
ii) viewMatrix - similar to the above, the class is equipped with a viewMatrix, but changing this matrix seems to have no effect.
iii) uv pairs - if the “length” of the vertices array is different from the length of the uvs array, an error is generated. This makes sense if vertices are defined in 2D, but seems odd when vertices are defined in 3D - I filled the uv arrays with surplus zeroes to avoid the error, but is this the right way?
iv) flipping Y - is there a way to flip the Y-axis of the texture being mapped?
v) lighting - the mesh object can take normals - how is this used in the rendering? is there perhaps some lighting function built in somewhere?
The documentation explains that this class wasn’t designed for displaying 3D objects…but the variety of methods would indicate it is actually way more powerful than it’s letting on?!
Mesh vertices can be calculated with either an orthographic or a perspective projection matrix. This matrix is used when calculating the final transform matrix (along with model rotation, position, scale, and the view matrix.) Which projection you’d use depends on the visual effect you want. What the documentation means is that, ultimately, Phaser takes all of your Faces and renders a triangle for each one, and that triangle is drawn with an orthographic projection; however, the position (i.e. projection) of each Face depends entirely on the Mesh’s projection matrix.
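To make the difference concrete, here is a minimal sketch of the kind of perspective matrix a method like setPerspective would build, and what it does to a vertex. This is plain WebGL-style matrix math, not Phaser API - the function names here are illustrative.

```javascript
// Standard WebGL-style perspective matrix (column-major 4x4),
// mirroring what a setPerspective-type method would construct.
function perspective(fovDeg, aspect, near, far) {
  const f = 1 / Math.tan((fovDeg * Math.PI / 180) / 2);
  return [
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (far + near) / (near - far), -1,
    0, 0, (2 * far * near) / (near - far), 0
  ];
}

// Transform a 3D point by the matrix and do the perspective divide.
function project(m, [x, y, z]) {
  const cx = m[0] * x + m[4] * y + m[8]  * z + m[12];
  const cy = m[1] * x + m[5] * y + m[9]  * z + m[13];
  const cw = m[3] * x + m[7] * y + m[11] * z + m[15];
  return [cx / cw, cy / cw];
}

const m = perspective(45, 16 / 9, 0.1, 100);
// The same x/y offset shrinks on screen as the vertex moves away:
const nearPoint = project(m, [1, 1, -2]);
const farPoint  = project(m, [1, 1, -10]);
// |nearPoint| > |farPoint| - that foreshortening is the perspective
// effect an orthographic projection would not produce.
```

The triangles themselves are still rasterised orthographically, as described above; it is this per-vertex projection step that creates the 3D look.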
viewMatrix is just a matrix that the viewPosition is set into (and inverted). It gets reset and updated if the viewPosition is dirty each frame, and is used as part of the final transform matrix calculation.
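The set-and-invert step can be sketched in plain matrix math (illustrative names, not Phaser API) - for a pure translation, inverting just negates it:

```javascript
// Build a view matrix from a camera position: write the position into a
// translation matrix, then invert it. The inverse of a pure translation
// is the negated translation, so moving the "camera" right shifts every
// vertex left. Column-major 4x4, as WebGL expects.
function viewMatrixFrom([x, y, z]) {
  return [
    1, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, 1, 0,
    -x, -y, -z, 1
  ];
}

function transformPoint(m, [x, y, z]) {
  return [
    m[0] * x + m[4] * y + m[8]  * z + m[12],
    m[1] * x + m[5] * y + m[9]  * z + m[13],
    m[2] * x + m[6] * y + m[10] * z + m[14]
  ];
}

const view = viewMatrixFrom([0, 0, 5]);     // camera sits at z = 5
const p = transformPoint(view, [0, 0, 0]);  // world origin...
// ...lands at z = -5 in view space: 5 units in front of the camera.
```

This also explains the observation in question (ii): writing to the matrix directly gets overwritten whenever the position is dirty, so you move the camera by setting the position, not the matrix.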
This is fixed in 3.60
Typically, you flip the UV coordinates.
It’s not used during rendering at all, but I retain it on the Vertex objects in case you want to access it (remember, you don’t have to use Phaser’s pipeline to render a Mesh, so you could easily pass it off to your own pipeline that did take advantage of this).
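Since the normals are carried on each Vertex but ignored by the renderer, any lighting is up to you. A minimal Lambert (diffuse) term you might compute in your own pipeline looks like this - plain math, with illustrative names:

```javascript
function normalize([x, y, z]) {
  const len = Math.hypot(x, y, z);
  return [x / len, y / len, z / len];
}

// Brightness = max(0, N . L): fully lit when the face points at the
// light, falling to zero as it turns away.
function lambert(normal, lightDir) {
  const [nx, ny, nz] = normalize(normal);
  const [lx, ly, lz] = normalize(lightDir);
  return Math.max(0, nx * lx + ny * ly + nz * lz);
}

const lightDir = [0, 0, 1];                     // light along +z
const facing   = lambert([0, 0, 1], lightDir);  // face toward the light
const sideOn   = lambert([1, 0, 0], lightDir);  // face at right angles
// facing is 1, sideOn is 0 - multiply this factor into the face
// colour to get simple per-face shading.
```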
Thank you so much for such comprehensive answers to my very basic questions. I have just started looking into custom pipelines and will post the results when I eventually manage to get it working (I can’t seem to find any examples with custom vertex shaders, or any that pass attributes - only frag shaders that push uniforms?)
Out of interest, I found the vertex and fragment shaders below, which are apparently geared towards 3D, including shading. Are these used anywhere?
As an aside, to add to Rich’s response to question (iv), Phaser.Geom.Mesh.ParseObj does have a flipUV option that does exactly that, but GenerateVerts does not.
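Since GenerateVerts has no such option, you can get the same result by remapping v to 1 - v in the flat uv array before handing it over. A small illustrative helper (not part of the Phaser API):

```javascript
// Flip a texture vertically by mirroring the v component of each uv
// pair. uvs is a flat [u0, v0, u1, v1, ...] array with values in 0..1.
function flipV(uvs) {
  const out = uvs.slice();
  for (let i = 1; i < out.length; i += 2) {
    out[i] = 1 - out[i];
  }
  return out;
}

const uvs = [0, 0, 1, 0, 1, 1];
const flipped = flipV(uvs);
// [0, 1, 1, 1, 1, 0] - u untouched, v mirrored about 0.5
```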
For a custom pipeline like that, it’s easier to take an existing one and modify it. I detail the layout of the vertex buffers quite carefully in most of them.
Those Mesh shaders are not used, no. I’ll check whether they’re in the final bundle; if they are, then I’ll have to remove them from the repo. If they’re not, I may leave them there in case anyone wants to use them!
Just checked and the Mesh shaders aren’t in the dist file, so I’m happy to leave them there.