TheExDeus
Reply #1 Posted on: December 27, 2014, 04:17:49 pm |
Joined: Apr 2008
Posts: 1860
This is cool. These two are actually very useful for things like a deferred shader, which I want to write for ENIGMA as an example. I also want to finish my island demo, which uses a specific depth buffer (basically the distance from the water plane to the vertex, not from the camera to the vertex) for shore lines.

ENIGMA does provide you with a normal matrix: "uniform mat3 normalMatrix;" in GLSL.

Also, you can see the two surfaces are upside down. I don't really have an idea what to do in this case. I guess we can just keep it like this and allow people to rotate the projection how they choose. In 2D they will be correct side up, because in recent changes I made the default projection in surface_set_target() flip the projection. We cannot do that when we want surfaces to be used with 3D.

The example is very well written. It has comments for every line and divides everything into chunks so people can learn more easily. I'm also impressed that you managed to do this even though we haven't documented some of the stuff, like the fact that surface_create() has a third optional parameter enabling a depth buffer for that FBO. In this case it's not that useful, but in a more complex scene you would probably have to set it to true or you would get visual artifacts. We could also make a way to assign buffers to surfaces like in GL, so you could add a depth texture and use that instead, without making a specific shader for it. I'm not sure how to do that efficiently right now though.
orange451
Reply #2 Posted on: December 27, 2014, 04:48:13 pm |
Joined: Mar 2013
Posts: 16
Quote from: TheExDeus
"ENIGMA does provide you with normal matrix. 'uniform mat3 normalMatrix;' in GLSL."

It does? Well, I spent the past 10 minutes arguing with Josh for no reason then.

Quote from: TheExDeus
"Also, you can see the two surfaces are upside down. I don't really have an idea what to do in this case. I guess we just can keep it like this and allow people to rotate the projection how they choose. In 2D they will be correct side up, because in recent changes I made the default projection in surface_set_target() to flip the projection. We cannot do that when we want surfaces to be used with 3D."
Because OpenGL flips the y axis, the surfaces come out flipped when drawn. An easy fix is to modify draw_surface(...). In my own Java OpenGL engine, when I draw a surface like I do in this example, I just draw it with a quad model that has flipped y texture coordinates. I think that is the most practical fix, as it doesn't disrupt any math down the line; it is solely for viewing purposes.

Quote from: TheExDeus
"The example is very well written. Has comments for every line, divides everything in chunks so people can learn more easily. I'm also impressed that you managed to do this even though we haven't documented some of the stuff, like the fact that surface_create() has a third optional parameter enabling depth buffer for that FBO."
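The flipped-texture-coordinate fix mentioned earlier in this post can be sketched in plain C++. This is a hypothetical vertex layout for illustration, not ENIGMA's actual draw_surface internals: the quad's positions stay the same and only the v coordinate of each texture coordinate is mirrored, so nothing downstream of the draw is affected.

```cpp
#include <array>

// One vertex of a textured quad: x, y position plus u, v texture coordinate.
struct Vertex { float x, y, u, v; };

// A unit quad with the usual top-left-origin texture coordinates
// (hypothetical layout; triangle-strip order).
static const std::array<Vertex, 4> quad = {{
    {0.f, 0.f, 0.f, 0.f},   // top-left
    {1.f, 0.f, 1.f, 0.f},   // top-right
    {0.f, 1.f, 0.f, 1.f},   // bottom-left
    {1.f, 1.f, 1.f, 1.f},   // bottom-right
}};

// Mirror only the v coordinate so a GL FBO texture (bottom-left origin)
// draws right side up; positions are untouched.
std::array<Vertex, 4> flipV(std::array<Vertex, 4> q) {
    for (Vertex& vtx : q) vtx.v = 1.0f - vtx.v;
    return q;
}
```

Because only the texture coordinates change, the surface still renders at the same screen position; the texture is simply sampled top-to-bottom instead of bottom-to-top.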
Thanks ^^ Robert helped me a bit in the IRC last night.

Quote from: TheExDeus
"We could also make a way to assign buffers to surfaces like GL, so you could add a depth texture and use that instead, without making a specific shader for that. I'm not sure how to do it efficiently now though."

To be completely honest, I don't know if there would be a major performance gain in doing this. Regardless, I couldn't figure out how to do it in my own Java engine :c
« Last Edit: December 27, 2014, 05:04:22 pm by orange451 »
orange451
Reply #3 Posted on: December 27, 2014, 04:57:08 pm |
Joined: Mar 2013
Posts: 16
Okay, I changed the normal shader:

// Vertex
in vec3 in_Position;
in vec3 in_Normal;

out vec3 v_Normal;

void main() {
    v_Normal = normalize(normalMatrix * in_Normal);
    gl_Position = modelViewProjectionMatrix * vec4(in_Position.xyz, 1.0);
}

// Fragment
in vec3 v_Normal;

out vec4 out_FragColor;

vec3 encodeNormal(vec3 normal) {
    normal = normalize(normal);
    vec3 shifted = (normal + vec3(1.0, 1.0, 1.0)) / 2.0;
    return shifted;
}

void main() {
    out_FragColor = vec4(encodeNormal(v_Normal), 1.0);
}

I noticed something odd. I think the normal matrix isn't calculated properly. What should happen is that every polygon face you stare at directly should be blue, faces that point straight up (floors) should be green, and faces facing right (walls on the left) should be red. However, this isn't the case.
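The encodeNormal step above just remaps each component from [-1, 1] into [0, 1] so a signed normal survives being stored in an unsigned color channel. A plain C++ sketch of the same mapping and its inverse (the helper names here are made up for illustration):

```cpp
#include <array>

using Vec3 = std::array<double, 3>;

// Map a unit normal's components from [-1, 1] to [0, 1] for color storage,
// mirroring the GLSL encodeNormal above (normalization omitted for brevity).
Vec3 encodeNormal(const Vec3& n) {
    return { (n[0] + 1.0) / 2.0, (n[1] + 1.0) / 2.0, (n[2] + 1.0) / 2.0 };
}

// Inverse mapping: recover the signed normal from the stored color.
Vec3 decodeNormal(const Vec3& c) {
    return { c[0] * 2.0 - 1.0, c[1] * 2.0 - 1.0, c[2] * 2.0 - 1.0 };
}
```

A straight-at-the-camera normal (0, 0, 1) encodes to the color (0.5, 0.5, 1.0), which is why such faces read as blue-ish in the buffer, while a raw negative component written out unencoded would simply clamp to black.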
I also tried this to calculate the normal matrix:

in vec3 in_Position;
in vec3 in_Normal;

out vec3 v_Normal;

void main() {
    mat3 extractedViewMatrix = mat3(viewMatrix);
    mat3 normalMatrix = transpose(inverse(extractedViewMatrix));
    v_Normal = normalize(normalMatrix * in_Normal);
    gl_Position = modelViewProjectionMatrix * vec4(in_Position.xyz, 1.0);
}

I believe that the view matrix isn't properly calculated. Either that, or in_Normal isn't correctly calculated when using the d3d_draw_ commands.

[EDIT] I believe I have confirmed that in_Normal is incorrect when using the d3d_draw commands!
// Vertex
in vec3 in_Position;
in vec3 in_Normal;

out vec3 v_Normal;

void main() {
    v_Normal = normalize(in_Normal);
    gl_Position = modelViewProjectionMatrix * vec4(in_Position.xyz, 1.0);
}

// Fragment
in vec3 v_Normal;

out vec4 out_FragColor;

void main() {
    out_FragColor = vec4(v_Normal, 1.0);
}

In my example, every face is black except for one wall. This should not be the case. Only two of the walls should be black (negative normals), two walls should have color (one green, one red), and the floor should be blue.
« Last Edit: December 27, 2014, 05:43:19 pm by orange451 »
TheExDeus
Reply #4 Posted on: December 27, 2014, 05:39:17 pm |
Joined: Apr 2008
Posts: 1860
Quote from: orange451
"I think that the normal matrix isn't calculated properly. ... I believe that the view matrix isn't properly calculated."

If that were the case, then the model_view matrix would be wrong, as well as the model_view_projection matrix. Essentially everything would be wrong then. I did have problems with the normal matrix calculation previously though. Right now it's used in the lighting engine, and it took me some time to make it look correct. This is the part that calculates the matrices:

void transformation_update(){
    if (enigma::transform_needs_update == true){
        //Recalculate matrices
        enigma::mv_matrix = enigma::view_matrix * enigma::model_matrix;
        enigma::mvp_matrix = enigma::projection_matrix * enigma::mv_matrix;

        //normal_matrix = invert(transpose(mv_submatrix)), where mv_submatrix is the modelview top-left 3x3 matrix
        enigma::Matrix4 tmpNorm = enigma::mv_matrix.Transpose().Inverse();
        enigma::normal_matrix = enigma::Matrix3(tmpNorm(0,0),tmpNorm(0,1),tmpNorm(0,2),
                                                tmpNorm(1,0),tmpNorm(1,1),tmpNorm(1,2),
                                                tmpNorm(2,0),tmpNorm(2,1),tmpNorm(2,2));
        enigma::transform_needs_update = false;
    }
}

It seems you try inverting and then transposing, while I think it should be the other way around. To be honest, I have seen both being used, so I never figured out which is correct. Here it shows the way *you* did it: http://stackoverflow.com/questions/21079623/how-to-calculate-the-normal-matrix

I will test some.

edit: It seems that normals are in world space when using this code. Also, I think when you use d3d_draw_wall() to create the object, the normals point in the same direction for opposing walls. That could be a bug in that function.
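On the invert-then-transpose vs transpose-then-invert question: for any invertible matrix the two orders give the same result, since (M^-1)^T = (M^T)^-1, so any visible difference likely comes from something else (for example, whether the translation part was cut off first). A quick numeric check with a throwaway 3x3 type (not ENIGMA's Matrix3) is enough to see the equivalence:

```cpp
#include <cmath>

// Minimal 3x3 matrix just for this check (not ENIGMA's Matrix3).
struct Mat3 { double m[3][3]; };

Mat3 transpose(const Mat3& a) {
    Mat3 r;
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) r.m[i][j] = a.m[j][i];
    return r;
}

// Inverse via the adjugate divided by the determinant.
Mat3 inverse(const Mat3& a) {
    const double (*m)[3] = a.m;
    double det = m[0][0]*(m[1][1]*m[2][2]-m[1][2]*m[2][1])
               - m[0][1]*(m[1][0]*m[2][2]-m[1][2]*m[2][0])
               + m[0][2]*(m[1][0]*m[2][1]-m[1][1]*m[2][0]);
    Mat3 r;
    r.m[0][0] = (m[1][1]*m[2][2]-m[1][2]*m[2][1])/det;
    r.m[0][1] = (m[0][2]*m[2][1]-m[0][1]*m[2][2])/det;
    r.m[0][2] = (m[0][1]*m[1][2]-m[0][2]*m[1][1])/det;
    r.m[1][0] = (m[1][2]*m[2][0]-m[1][0]*m[2][2])/det;
    r.m[1][1] = (m[0][0]*m[2][2]-m[0][2]*m[2][0])/det;
    r.m[1][2] = (m[0][2]*m[1][0]-m[0][0]*m[1][2])/det;
    r.m[2][0] = (m[1][0]*m[2][1]-m[1][1]*m[2][0])/det;
    r.m[2][1] = (m[0][1]*m[2][0]-m[0][0]*m[2][1])/det;
    r.m[2][2] = (m[0][0]*m[1][1]-m[0][1]*m[1][0])/det;
    return r;
}

// Largest absolute element-wise difference between two matrices.
double maxDiff(const Mat3& a, const Mat3& b) {
    double d = 0.0;
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            d = std::fmax(d, std::fabs(a.m[i][j] - b.m[i][j]));
    return d;
}
```

With a non-trivial matrix (shear plus non-uniform scale), transpose(inverse(M)) and inverse(transpose(M)) agree to machine precision. For a pure non-uniform scale diag(2, 1, 1), the normal matrix comes out as diag(0.5, 1, 1), which is exactly why passing raw in_Normal through is not enough once scaling is involved.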
« Last Edit: December 27, 2014, 06:30:36 pm by TheExDeus »
orange451
Reply #5 Posted on: December 27, 2014, 05:45:33 pm |
Joined: Mar 2013
Posts: 16
Quote from: TheExDeus
"[...] It seems you try inverting and then transposing, while I think it should be the other way around. To be honest, I have seen both being used, so I never figured out which is correct. Like here it shows the way I did it: http://stackoverflow.com/questions/21079623/how-to-calculate-the-normal-matrix
I will test some."
That external link says transpose(invert(viewMatrix)), which is viewMatrix.invert().transpose(); you're doing viewMatrix.transpose().invert(). Also, you're supposed to extract it to a mat3 before applying the operations, though I am not sure if this has an effect. In Java I would do it:

public static Matrix3f createNormalMatrix(Matrix4f matrix, Matrix3f normalMatrix){
    // Extract the 4x4 matrix to a 3x3 matrix
    extract(matrix, normalMatrix);
    // Invert the matrix
    normalMatrix.invert();
    // Transpose the matrix
    normalMatrix.transpose();
    return normalMatrix;
}

Regardless, look at my edited post about in_Normal.
« Last Edit: December 27, 2014, 06:13:02 pm by orange451 »
TheExDeus
Reply #6 Posted on: December 27, 2014, 06:26:38 pm |
Joined: Apr 2008
Posts: 1860
Quote from: orange451
"that external link says: transpose(invert(viewMatrix)); which is viewMatrix.invert().transpose(); you're doing viewMatrix.transpose().invert();"

Yes, you are right. I was referring to something else then. The problem was that I made the lighting shaders and then implemented a normal matrix that worked for them. So I ditched the 3x3 matrix thing, because transpose-then-inverse worked for me to get correct normals. If I did inverse and then transpose, the normals changed in relation to the camera position. For that not to happen I had to take the top-left 3x3 of the modelview matrix (basically to cut off the translation). I made the changes now and it seems that none of my previous examples have broken.

Also, yeah, I also think the normals are wrong for d3d_wall. For floors and cubes they seem correct. Check this: https://www.dropbox.com/s/r17cwch4962voi8/3D_Lights_test.egm?dl=0

Press Insert to switch to per-pixel lights and Delete to switch to regular per-vertex. The cube and floor are okay, but the two side walls are totally wrong. I also seem to have trouble changing the orientation of the normals by changing the coordinates: it doesn't matter how the x,y are changed, the floor is still pointing downwards (so if I enable backface culling the floor is not visible from the top). All of these things must be checked. I don't think I will be the one fixing this though.

Please test this branch: https://github.com/enigma-dev/enigma-dev/commits/GL3.3NormalMatrix

I did the matrix change there. See if it helps. Normals for d3d_walls still need to be fixed. This should also be slightly faster, because it's a lot faster to calculate a 3x3 inverse than a 4x4 inverse.
« Last Edit: December 27, 2014, 06:29:36 pm by TheExDeus »
orange451
Reply #7 Posted on: December 27, 2014, 11:47:59 pm |
Joined: Mar 2013
Posts: 16
I really don't know if d3d_draw_floor() is correct. Backface culling is enabled, and I am drawing a floor that correctly shows its face. Since the face points directly upward, its normal should be (0, 0, 1), so in_Normal in the shader should also be (0, 0, 1). In my shader I am outputting in_Normal as the out color. It's black, so the normal must be negative.

[EDIT] I turned off culling and changed the floor drawing to:

d3d_draw_floor(0, 0, 0, 512, 512, 0, background_get_texture(bkg_0), 16, 16);

Now the floor is technically a ceiling, but since culling is disabled I can see it as a floor. In the shader it is solid blue (0, 0, 1). So d3d_draw_floor is flipped! Perhaps in your lighting shader you were negating the view position when it was already negated, which is why it appeared to work.

[EDIT 2] I made a replica of the map using Model Creator. Here is what the normal buffer looks like: Here is what it looks like using the d3d_draw functions:
« Last Edit: December 28, 2014, 12:35:29 am by orange451 »
Josh @ Dreamland
Reply #9 Posted on: December 28, 2014, 12:14:28 pm |
Prince of all Goldfish
Location: Pittsburgh, PA, USA Joined: Feb 2008
Posts: 2950
I see. I thought you were asking me to add a function to fetch "the normal matrix," which just isn't a common entity (it only exists in our default shader; there's no guarantee that other shaders calculate one). Anyway, thanks for the patch! If you like, you can file it through GitHub, here.
"That is the single most cryptic piece of code I have ever seen." -Master PobbleWobble "I disapprove of what you say, but I will defend to the death your right to say it." -Evelyn Beatrice Hall, Friends of Voltaire
TheExDeus
Reply #10 Posted on: December 28, 2014, 01:44:17 pm |
Joined: Apr 2008
Posts: 1860
Quote from: orange451
"Perhaps in your lighting shader, you were negating the viewposition when it was already negated, which is why it appeared to work."

Both sides of the floor had the same lighting, which makes me believe it wouldn't matter if the normal is inverted; the falling angle would be the same. But after looking at the shader, it seems it shouldn't actually render anything on the other side, as the color is black (0) if the value is negative (float LdotN = max(dot(norm, L), 0.0);). I don't get a blue color with that d3d_floor code either. In the lights demo the floor is black with that code too. Same with the wall code. I guess more investigation is required.

I also flipped the surface drawing in GSsurface.cpp. We had numerous discussions about this and we didn't come to a real conclusion (as it seemed I was the only one interested in fixing it). Basically we had (and still have) three options:

1) Flip the coordinates in the draw_surface functions like you did. It would still break drawing to surfaces (like when using views) and all _texture_ functions (as the texture coordinates will have to be inverted compared to any other texture in ENIGMA). Could work in shaders, because this is how GL likes it.

2) Flip the projection (via d3d_set_projection_ortho(0, surf->height, surf->width, -surf->height, 0);) when binding the surface (surface_set_target()). This fixes 2D rendering, but any projection function will override it. So if you render anything in 3D, you will have to flip it manually in your own projection function. This allows _texture_ functions to be used like with any other texture in 2D. It doesn't break surface usage in shaders, as everything is rendered how GL likes it in 3D. In 2D it could break shaders, but I haven't tested that. This is how it is done now.

3) texture_matrix. Basically flip everything in the shader and keep everything flipped elsewhere. This would mean people writing shaders will have to keep it in mind and manually add it every time.
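The projection flip in option 2 falls out of the standard orthographic matrix: with glOrtho-style parameters, the y scale term is 2 / (top - bottom), so passing a negated height swaps bottom and top and the y scale changes sign. A small sketch of just that term (glOrtho-style math; the parameter order x, y, width, height is an assumption mirroring d3d_set_projection_ortho without its angle argument):

```cpp
#include <cmath>

// y-axis scale term of a glOrtho-style projection: 2 / (top - bottom).
double orthoScaleY(double bottom, double top) {
    return 2.0 / (top - bottom);
}

// Sketch of what binding a surface could do (assumed parameter order
// x, y, width, height): passing a negative height makes top < bottom
// and therefore flips the y axis.
double surfaceScaleY(double y, double height) {
    double bottom = y;          // e.g. surf->height
    double top = y + height;    // e.g. surf->height + (-surf->height) = 0
    return orthoScaleY(bottom, top);
}
```

With surfaceScaleY(surf_height, -surf_height) the bottom edge is surf_height and the top edge is 0, so the scale becomes -2/surf_height: the same geometry, mirrored vertically, which is why 2D rendering to the surface comes out right side up.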
Quote from: Josh @ Dreamland
"I see. I thought you were asking me to add a function to fetch 'the normal matrix,' which just isn't a common entity (it only exists in our default shader—there's no guarantee that other shaders calculate one)."

It actually exists for all shaders. Like all matrices, it's calculated on the CPU and then passed as a uniform if the shader uses it.
« Last Edit: December 28, 2014, 02:26:16 pm by TheExDeus »
Goombert
Reply #11 Posted on: December 28, 2014, 02:33:18 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
Quote from: TheExDeus
"We had numerous discussions about this and we didn't come to a real conclusion (as it seemed I was the only one interested on fixing it)."

I am very interested in this as well; I just hate the problem because it shouldn't exist. You can choose the order the pixels are read in when you blit a framebuffer, but not when you use it as a texture; in addition, the default main framebuffer isn't set up like this. It's such an annoying bug: you should be able to flip the thing left to right or vertically without any performance cost.

1) Gross.

2) We could just keep track of whether a surface is bound and check that in the projection functions, but this method is still inefficient.

3) texture_matrix is deprecated GL; we'd have to do it manually ourselves and know in the default GL3 shaders whether a surface is bound so we can invert the texture matrix, but this is only ostensibly a little more efficient than doing the same for the projection matrix.

It would be nice to know how ANGLE handles this.
I think it was Leonardo da Vinci who once said something along the lines of "If you build the robots, they will make games." or something to that effect.
TheExDeus
Reply #13 Posted on: December 28, 2014, 04:57:24 pm |
Joined: Apr 2008
Posts: 1860
Quote from: orange451
"No it seems correct. Compared it to Model Creator and had same color."

Weird. What shader and what normal matrix did you use? I tried both your code, which creates the normal matrix in the shader, and my modified code, which does it on the CPU. In both cases I cannot get the floor to be blue. This is what I get:

I also tested the surface fix; that is why the surface draws correct side up. But the floor and walls are still wrong. The floor is in the x/y plane, so the normal should be in z (blue).

Quote from: Goombert
"1) gross"

I don't like it either, but as time goes on I keep coming back to it. Pros: 1) It will draw correctly. 2) It should work in shaders. 3) No math is involved, so it's the fastest option. Cons: 1) People will have to flip texture coordinates when the surface is used manually for rendering, which means it won't be consistent. If we check the two most popular use cases for surfaces, they are either drawn with the draw_surface functions, which will work, or used as full-screen effects (i.e. in shaders), which should also work. I will have to do additional tests later. There is sadly no magic fix for this; we will have to sacrifice something to make it work.
orange451
Reply #14 Posted on: December 28, 2014, 05:30:01 pm |
Joined: Mar 2013
Posts: 16
Quote from: TheExDeus
"Weird. What shader and what normal matrix did you use? I tried both your code which creates the normal matrix in shader, as well as the modified my code which does it on the CPU. In both cases I cannot get the floor to be blue. [...] I also tested the surface fix, that is why the surface draws correct side up. But the floor and walls are still wrong. The floor is in x/y plane, so the normal should be in z (blue)."
No, your picture looks correct. When I was speaking about the floor being blue, I was speaking of outputting JUST the vertex normals, not combined with the normal matrix. When you combine it with the normal matrix, then:

- triangles whose normal faces in the direction of your camera are blue
- triangles that face upwards are green

Since you're multiplying the normal matrix by in_Normal, it should always be consistent no matter what angle your camera is facing, which it is in your picture. If I output JUST the normals, the scene looks like:
« Last Edit: December 28, 2014, 05:32:54 pm by orange451 »