This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.
301
Works in Progress / Re: [OGL3] Shader Test
« on: December 28, 2014, 01:44:17 pm »
Quote
Perhaps in your lighting shader, you were negating the viewposition when it was already negated, which is why it appeared to work.
Both sides of the floor had the same lighting, which makes me believe it wouldn't matter if the normal is inverted. The falling angle would be the same. But after looking at the shader, it seems it shouldn't actually render anything on the other side, as the color is black (0) when the value is negative (float LdotN = max( dot(norm, L), 0.0 ); ).
I don't get a blue color with that d3d_floor code either. In the lights demo the floor is black with that code too. Same with the wall code. I guess more investigation is required.
Quote
I also flipped the surface drawing in GSsurface.cpp:
We had numerous discussions about this and never came to a real conclusion (it seemed I was the only one interested in fixing it). Basically we had (and still have) three options:
1) Flip the coordinates in the draw_surface functions like you did. This would still break drawing to surfaces (like when using views) and all _texture_ functions (as the texture coordinates would have to be inverted compared to any other texture in ENIGMA). It could work in shaders, because this is how GL likes it.
2) Flip the projection (via d3d_set_projection_ortho(0, surf->height, surf->width, -surf->height, 0); ) when binding the surface (surface_set_target()). This fixes 2D rendering, but any projection function will override it, so if you render anything in 3D you have to flip it manually in your own projection call. This allows the _texture_ functions to be used like with any other texture in 2D. It doesn't break surface usage in shaders, as everything is rendered the way GL likes it in 3D. In 2D it could break shaders, but I haven't tested that. This is how it is done now.
3) texture_matrix. Basically flip everything in the shader and keep everything flipped elsewhere. This would mean people writing shaders have to keep this in mind and manually add it every time.
Quote
I see. I thought you were asking me to add a function to fetch "the normal matrix," which just isn't a common entity (it only exists in our default shader—there's no guarantee that other shaders calculate one).
It actually exists for all shaders. Like all matrices it's calculated on the CPU and then passed as a uniform if the shader uses it.
302
Works in Progress / Re: [OGL3] Shader Test
« on: December 27, 2014, 06:26:38 pm »
Quote
that external link says:
Yes, you are right. I was referring to something else then. The problem was that I made the lighting shaders and then implemented a normal matrix that worked for them. So I ditched the 3x3 matrix thing, because transpose and then inverse worked for me to get correct normals. If I did inverse and then transpose, the normals changed in relation to the camera position. For that not to happen I had to take the top-left 3x3 of mv (basically to cut off the translation). I made the changes now and it seems none of my previous examples have broken.
Also, yeah, I think the normals are wrong for d3d_wall too. For floors and cubes they seem correct. Check this: https://www.dropbox.com/s/r17cwch4962voi8/3D_Lights_test.egm?dl=0
Press Insert to switch to per-pixel lights and Delete to switch back to regular per-vertex. The cube and floor are okay, but the two side walls are totally wrong. I also seem to have trouble changing the orientation of the normals by changing the coordinates. It doesn't matter if the x,y order is changed, the floor still points downwards (so if I enable backface culling the floor is not visible from the top). All of these things must be checked. I don't think I will be the one fixing this though.
Please test this branch: https://github.com/enigma-dev/enigma-dev/commits/GL3.3NormalMatrix
I did the matrix change there. See if it helps. Normals for d3d_walls still need to be fixed.
This should also be slightly faster, because it's a lot faster to calculate 3x3 inverse than 4x4 inverse.
303
Works in Progress / Re: [OGL3] Shader Test
« on: December 27, 2014, 05:39:17 pm »
Quote
I think that the normal matrix isn't calculated properly.
If that were the case, the model_view matrix would be wrong as well, and so would the model_view_projection matrix. Essentially everything would be wrong then. I did have problems with the normal matrix calculation previously though. Right now it's used in the lighting engine, and it took me some time to make it look correct. This is the part that calculates the matrices:
...
I believe that the view matrix isn't properly calculated.
Code: (edl) [Select]
void transformation_update(){
    if (enigma::transform_needs_update == true){
        //Recalculate matrices
        enigma::mv_matrix = enigma::view_matrix * enigma::model_matrix;
        enigma::mvp_matrix = enigma::projection_matrix * enigma::mv_matrix;
        //normal_matrix = invert(transpose(mv_submatrix)), where mv_submatrix is the modelview top-left 3x3 matrix
        enigma::Matrix4 tmpNorm = enigma::mv_matrix.Transpose().Inverse();
        enigma::normal_matrix = enigma::Matrix3(tmpNorm(0,0),tmpNorm(0,1),tmpNorm(0,2),
                                                tmpNorm(1,0),tmpNorm(1,1),tmpNorm(1,2),
                                                tmpNorm(2,0),tmpNorm(2,1),tmpNorm(2,2));
        enigma::transform_needs_update = false;
    }
}
It seems you try inverting and then transposing, while I think it should be the other way around. To be honest, I have seen both being used, so I never figured out which is correct. Like here it shows the way *you* did it: http://stackoverflow.com/questions/21079623/how-to-calculate-the-normal-matrix
I will test some.
edit: It seems that normals are in world space when using this code. Also, I think that when you use d3d_draw_wall() to create the object, the normals point in the same direction for opposing walls. That could be a bug in that function.
304
Issues Help Desk / Re: Font character range behaviour
« on: December 27, 2014, 05:33:38 pm »
Also works for me.
305
General ENIGMA / Re: Who fixed arrays?
« on: December 27, 2014, 05:32:39 pm »
Thank you! This is so awesome! Does it also work with 2d arrays? This bug is the reason why I have often used ds_ instead of arrays, because regular arrays were often parsed wrong as well, like when used in scripts.
Also, do these arrays (together with EDL arrays) extend to more than 2 dimensions? I actually don't like that GM limitation of only having 2 dimensions. EDL arrays will probably be stuck there, but if the parser bugs are fixed and these are used as regular C++ arrays, then at least they should allow n-dimensions.
I will test all of this later. Thank you!
306
Issues Help Desk / Re: Font character range behaviour
« on: December 27, 2014, 04:21:22 pm »
I think the easiest way to do this is by adding a 0-255 range whenever you press "Save" on a font that has no ranges added. That way the user doesn't have to press "Preview" or "+" or anything else. It would work like it does right now, and it would just be clearer when they reopen the font editor: they would actually see the range (which was the confusing point here for the user).
What does GM:S do when you press Save on a font with empty ranges?
307
Works in Progress / Re: [OGL3] Shader Test
« on: December 27, 2014, 04:17:49 pm »
This is cool. These two are actually very useful for things like the deferred shader I want to write for ENIGMA as an example. I also want to finish my island demo, which uses a specific depth buffer (basically the distance from the water plane to the vertex, not from the camera to the vertex) for shore lines.
ENIGMA does provide you with the normal matrix: "uniform mat3 normalMatrix;" in GLSL.
Also, you can see the two surfaces are upside down. I don't really have an idea what to do in this case. I guess we can just keep it like this and allow people to rotate the projection how they choose. In 2D they will be the correct side up, because in recent changes I made the default projection in surface_set_target() flip the projection. We cannot do that when we want surfaces to be used in 3D.
The example is very well written. It has comments for every line and divides everything into chunks so people can learn more easily. I'm also impressed that you managed to do this even though we haven't documented some of the stuff, like the fact that surface_create() has a third optional parameter enabling a depth buffer for that FBO. In this case it's not that useful, but in a more complex scene you would probably have to set it to true or you would get visual artifacts. We could also make a way to assign buffers to surfaces like GL does, so you could add a depth texture and use that instead, without making a specific shader for it. I'm not sure how to do that efficiently right now though.
308
Programming Help / Re: Reading xml with Enigma?
« on: December 27, 2014, 05:12:43 am »
Quote
Please Speak English
Basically there is no way to read XML right now without doing it manually with string functions. There is ready-made C++ code that can do this, but someone needs to make it an extension to ENIGMA for people to be able to use it.
I'm sorry, but I'm not very experienced with Enigma development.
309
Issues Help Desk / Re: Font character range behaviour
« on: December 27, 2014, 05:10:53 am »
The box will be empty, but it will work. Try adding a font, then press Save without adding a range, and draw text with it:
Code: (edl) [Select]
draw_set_font(font_0);
draw_text(10,10,"This is something!");
It will use the font. I guess the default behavior for LGM is that if no range is added, then it adds the default 0-255 one. It's not written anywhere and is not implied, so that is a problem. I think LGM should either say that or automatically add the 0-255 range when you press "Save" on a font without any ranges.
310
Proposals / Re: Error reporting
« on: December 26, 2014, 09:57:04 am »
How would breakpoints work in Robert's idea? I get that we can return a stacktrace from C++, but how would you set a breakpoint? With GDB I had two ideas - one was the GDB/MI interface and the other was using regular files. GDB supports things like a gdbinit file, which basically lists all the commands to run when you start a gdb session. So LGM would write that file and run GDB with the game, which would load all the breakpoints from the file. The problem was still mapping the user source to the parsed source, as the breakpoints need to be in the parsed source.
Anyway, I'll see what Josh comes up with.
311
Issues Help Desk / Re: Font character range behaviour
« on: December 26, 2014, 09:48:07 am »
It should do it automatically anyway. Like when you load a font and close the dialog immediately, you will be able to use the font and draw the 0-255 (ASCII) range of characters. So you don't necessarily have to use the + button. I too find it confusing though, and when I need specific ranges I tend to feel like I'm opening the font dialog for the first time. I have to press every button and only then do I understand how it works (until the next time I use it). But I don't have anything specific to recommend.
312
Proposals / Re: Error reporting
« on: December 25, 2014, 08:31:15 pm »
Are you sure that works on anything other than Linux? Most of the stuff like "addr2line" is Linux-only as far as I can see. So your idea is to put all this error reporting inside ENIGMA's engine itself? So when it crashes, it returns the stacktrace even if no LGM is running? That would be awesome, but as I said, the example you posted seems to be Linux-only.
My idea was basically the same as yours, but mine used gdb (which comes with mingw) called from LGM. We still need a way to map user source to parsed source though. With your method the stacktrace would still show stuff like "C:\ProgramData\ENIGMA\Preprocessor_Environment_Editable\IDE_EDIT_objectfunctionality.h:1410" instead of "Obj_guy:create event:20" or something like that. I also found that you can call gdb from the crashing application itself (http://stackoverflow.com/questions/3151779/how-its-better-to-invoke-gdb-from-program-to-print-its-stacktrace), which is something between what you and I are talking about. But this requires the user to have GDB, and he won't be able to run the debug version otherwise. In your idea he wouldn't need gdb, and in my case you run it from LGM, so you always have it.
313
Proposals / Error reporting
« on: December 25, 2014, 07:09:12 pm »
I would like to start a discussion on how to improve ENIGMA's error reporting. The last changes in this respect were by Robert, who added scope tracking, so errors tell in which event an error occurred. What we need right now is to actually show the offending line number, because usually it's not enough to see the event (events call scripts, which can then be massively big). Bug fixing right now means using GDB with "break dialogs.cpp:56", which adds a breakpoint at "show_error". Then I can backtrace to see where the error originated from, but even then the information is in the _IDE_EDIT files. So my ideas are these:
1) Creating a separate debugger will probably be infeasible, so we will probably have to use GDB. This means we need to integrate it in LGM, to allow breakpoints to be set and handled properly. This is done via GDB's interpreter mode, which allows it to be driven through the GDB/MI interface (https://sourceware.org/gdb/onlinedocs/gdb/GDB_002fMI.html#GDB_002fMI). As far as I understand, it's like using the regular console, but the output is easier to parse. So you run GDB as a separate process and then communicate with it like a cmd program.
2) We need to map the _IDE_EDIT files to the original source, so we can track where in the original source something goes wrong, not in the parsed IDE_EDIT. This could be done via some macros that use parser information to generate something like GCC does with the __LINE__ and __FILE__ tags. This is actually needed for GDB as well (but in reverse), because we need to be able to set breakpoints in scripts, but GDB needs them translated to _IDE_EDIT locations.
Any ideas on how to do this? This seems to be purely LGM-side stuff (only the mapping has to be done somewhere in the parser), so I'm not sure how to go about it.
I know Robert did try adding the graphical part to LGM (the model dialog at the bottom) that will be useful here.
edit: Also, the current debugging described here doesn't actually work either. The line locations shown by GDB actually differ from the line locations in the files. I don't know why.
314
Issues Help Desk / Re: Build options appear grayed out on Windows XP [SOLVED]
« on: December 25, 2014, 06:17:37 pm »
It seems the problem was just that make-dir was saved in the EGM. That is not a project-specific thing, but an environment one. It's like saving the LGM theme in the EGM. I think we just need to make it work how it did before, but not save it inside the EGM; instead save it in an LGM/ENIGMA-specific config.
315
General ENIGMA / Re: Improving rooms editor
« on: December 25, 2014, 06:13:25 pm »
I would want a way to merge projects together, but in a controlled way. Like I need to take 10 sprites from one project and add them to another. This is especially useful now with things like GUIs, for which I constantly need to import 20 sprites in every project. There is a "Package" button, but neither Export nor Import works there. And it only allows exporting and importing whole groups of resources, not individual ones. I remember having this feature in GM, but can't remember how it worked. I think it just took two projects and made them one.