TheExDeus
Posted on: June 20, 2014, 07:58:26 am
Joined: Apr 2008
Posts: 1860
GL3 light fixes

So I implemented lights in GL3 some time ago, but I didn't merge it into the Master branch because AMD fails. It worked fine and dandy on Nvidia, but on AMD it didn't render models with lights for some reason. After days of testing I still haven't figured out what the problem is, but I did find a workaround: I send color data to the GPU even when it isn't used. This reduces FPS slightly (from 1170 to 1150 on Nvidia), but now it at least works on AMD. Additional investigation should be done into the matter, but at least we can have lights now.

So this is how it looks on AMD with GL1 and GL3: you can see that they are not pixel identical, but they are very close, and this is about the difference you might expect between card vendors when using the FFP (GL1). But at least AMD draws everything now.

Default shader override

The default shader (per-vertex lighting which emulates the FFP) is quite bad and basic. That is the idea for now, as we wanted both systems to be compatible. But if you want to make something that actually looks good, you might have to use your own shaders. This has of course always been possible, but until now you couldn't override what counts as the "default shader". So your code looked like this:

Code:
glsl_program_set(shr_my_cool_effect_shader); //Draw the cool effect
glsl_program_set(shr_my_default_shader); //Draw normally with a custom default shader

instead of this:

Code:
//Create
glsl_program_default_set(shr_my_default_shader);

//Draw
glsl_program_set(shr_my_cool_effect_shader); //Draw the cool effect
glsl_program_reset(); //Draw normally with a custom default shader

It is not a big change, but I point it out because we could now theoretically allow extensions which override this shader automatically. Imagine a "Per-pixel lighting" extension: you enable it and it automatically sets the shader. The extension doesn't even need to have functions. Sadly, extensions still can't modify or inject code into events, so right now it's not possible to do without a function, but it might be in the future. Here is an example where I switch between shaders with the keyboard:

Code:
if (keyboard_check_pressed(vk_insert)){
    glsl_program_default_set(shr_per_pixel);
    glsl_program_reset();
}else if (keyboard_check_pressed(vk_delete)){
    glsl_program_default_reset();
    glsl_program_reset();
}

The glsl_program_reset() call is required because the _default_ functions don't actually bind the program; they just set which one is the default.

Faster speed

I can't be sure about AMD, as it was broken before, but my Nvidia implementation works a lot faster than before. The current master is actually very broken: I get about 250FPS on it. After fixing lights I got it up to 1090FPS (which is what it was before, and is on par with GL1), but I also did additional caching and now get about 1180FPS, so there is an improvement. Others should probably see it too.

What I need from you

Please test these changes by downloading from this branch: https://github.com/enigma-dev/enigma-dev/commits/GL3-cleanup
Especially AMD users should test. Try with Project Mario as well as other examples and games. You can also test this: https://www.dropbox.com/s/r17cwch4962voi8/3D_Lights_test.egm - it's the lights demo where you can see the default FFP emulation and the sample per-pixel lighting. You can switch between them using Delete and Insert, look around with the mouse and move with WASD. It would also be useful to know the FPS in examples like Project Mario before (the Master branch) and after (this one). If everything is okay and nothing else breaks, then the branch can be merged.
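To illustrate the kind of workaround meant above, here is a minimal C++/OpenGL sketch. This is not the actual code in the branch, and the names (bind_color_attribute, mesh_has_colors, color_vbo) are made up; the idea is simply to feed the shader's color attribute something even when the model has no per-vertex colors:

Code:
#include <GL/glew.h>

// Bind the color vertex attribute for a draw call. When the mesh has no
// color data, still submit a constant white instead of leaving the
// attribute unbound; this is the kind of hack that makes lit models
// show up on drivers that misbehave when an attribute is unused.
void bind_color_attribute(GLuint color_attrib, bool mesh_has_colors, GLuint color_vbo)
{
    if (mesh_has_colors) {
        // Normal path: stream per-vertex colors from the VBO.
        glBindBuffer(GL_ARRAY_BUFFER, color_vbo);
        glEnableVertexAttribArray(color_attrib);
        glVertexAttribPointer(color_attrib, 4, GL_UNSIGNED_BYTE, GL_TRUE, 0, nullptr);
    } else {
        // Workaround path: no array, just a constant attribute value.
        glDisableVertexAttribArray(color_attrib);
        glVertexAttrib4f(color_attrib, 1.0f, 1.0f, 1.0f, 1.0f);
    }
}

Something along these lines costs a little extra state churn on every draw (the ~20FPS drop mentioned above), but it keeps the shader's color input defined everywhere.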
« Last Edit: June 20, 2014, 08:06:10 am by TheExDeus »
Goombert
Reply #1 Posted on: June 20, 2014, 02:38:07 pm
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
Excellent work Harri, lights are working well now in GL3 and it is working great altogether. I only notice a few differences:

* OGL3 is marginally slower
* OGL3 lights appear darker

But we will work out these kinks; for now I would like you to go ahead and merge your branch so that GL3 has working lights, and I will work with you on optimizing it in the future. These are wonderful developments: it should take relatively little effort to port this system to GLES and get a proper Android port working.
« Last Edit: June 20, 2014, 02:41:46 pm by Robert B Colton »
I think it was Leonardo da Vinci who once said something along the lines of "If you build the robots, they will make games." or something to that effect.
time-killer-games
Reply #2 Posted on: June 20, 2014, 02:53:18 pm
"Guest"
Excellent work Harri! This is a big help. I don't want to make any comparisons in saying this, but I can't thank you enough for your continual dedication to ENIGMA. The project is getting pretty close to stable on at least OGL 1.1 and 3.
TheExDeus
Reply #3 Posted on: June 20, 2014, 05:14:50 pm
Joined: Apr 2008
Posts: 1860
Quote: "* OGL3 is marginally slower"

Yes, for me it's also 1200FPS in GL1 vs 1140FPS in GL3. But some things to consider:

1) I had to add that stupid hack for AMD to work, so it now possibly sends useless data (I lost about 30FPS there on my Nvidia card).
2) I doubt it's easy to get a custom FFP emulation as optimized as the hardware and driver designers got GL1. If you want GL3 to do only what GL1 does, it will probably be slower. The idea is that it should do more, like per-pixel lights, shadows, deferred shading and so on: things that GL1 cannot do.
3) There are of course more optimizations to do, but I don't think they would increase the speed that much. One optimization I still have to finish is caching, as the only uniforms not yet cached are the matrices (see the sketch below this post). That would potentially remove some useless GPU calls, but as uniform updates are among the fastest things you can do on the GPU, I don't think the impact would be big.

Quote: "* OGL3 lights appear darker"

I used the same algorithms as defined by the GL1 spec, but sadly hardware developers implemented it differently; GL1.1 already looks different on Nvidia and AMD. The difference is negligible and unnoticeable when not comparing side by side. That is a good reason to write your own shaders, because now GL3 should look identical on both Nvidia and AMD.

If it works for at least one other person, then I will merge it.
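For the curious, the uniform caching mentioned in point 3 amounts to something like this minimal C++ sketch. It is illustrative only; the names and structure are made up, not ENIGMA's actual code:

Code:
#include <GL/glew.h>
#include <unordered_map>

// Remember the last value written to each uniform location and skip the
// glUniform* call when nothing has changed since the previous draw.
struct UniformCache {
    std::unordered_map<GLint, float> last_value;

    void set_uniform1f(GLint location, float value) {
        auto it = last_value.find(location);
        if (it != last_value.end() && it->second == value)
            return; // unchanged: no GL call needed
        glUniform1f(location, value);
        last_value[location] = value;
    }
};

Doing the same for the matrix uniforms just means comparing 16 floats before calling glUniformMatrix4fv.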
TheExDeus
Reply #6 Posted on: June 20, 2014, 07:55:25 pm
Joined: Apr 2008
Posts: 1860
Quote: "It looks fine on my machine (GeForce GTX 760, Linux, Open GL3). Unfortunately, I do not have a Radeon machine to test this on."

That is good news.

Quote: "Your lighting demo is just a black screen for me. I tried pressing del/insert, moving mouse, and using WASD (in case camera was off) but it's always pure black."

That is bad news.
1) Try commenting out the glsl_program_default_set() and glsl_program_reset() lines (the first 6 lines in the Draw event of the single object). Then run in GL1.1, then run in GL3, and see if you get the same lights.
2) Try downloading the egm again. I changed it so you start in the middle of the lights (previously the starting position had the player turned away, so you would only see a black screen initially when not using mouselook).
3) Try looking at the console output of the program; it prints information about the shaders at startup.

Quote: "The logo shown at the beginning of the game looks hideously ugly, but everything else looks fine."

What is ugly exactly? Does it look the same as in GL1.1? That is what we are trying to replicate, so GL3 won't look any better by itself. Although I do plan to make a topic about this: maybe it's time to ditch GM and its rendering and go for something better? It would break compatibility slightly (especially for 3D games), but would open a lot of doors for improvements.
daz
Reply #7 Posted on: June 20, 2014, 08:17:55 pm
Joined: Jul 2010
Posts: 167
Quote: "2) Try downloading the egm again. I changed it so you start in the middle of the lights (previously the starting position had the player turned away, so you would only see a black screen initially when not using mouselook)."

This works for both vertex and pixel shaders! So I guess the camera was just badly lost somewhere in the world (most likely I was underneath the platform by default and didn't know). The plane is quite visible in this though, which might be a difference between video cards, or maybe your screenshots just don't show the effect well: http://i.imgur.com/kBiGpqg.png

Quote: "3) Try looking at the console output of the program; it prints information about the shaders at startup."

I can't get at that output: I run LGM through Enigma.exe, which closes itself after starting LGM, and LGM's window does not contain this output.

Quote: "What is ugly exactly? Does it look the same as in GL1.1? That is what we are trying to replicate, so GL3 won't look any better by itself."

The lighting is awful. It's either a difference in video cards, or just a difference between the OGL1 and OGL3 drivers.
Mario OGL 1: http://i.imgur.com/DCDyEKY.png
Mario OGL 3: http://i.imgur.com/XZJy0Yx.png
This problem could be solved by either using OGL 1 for GM compatibility, or using OGL 3 with a pixel shader.
Goombert
Reply #8 Posted on: June 20, 2014, 09:09:21 pm
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
Quote: "I can't get at that output: I run LGM through Enigma.exe, which closes itself after starting LGM, and LGM's window does not contain this output."

Look in settings.ini; you can stop that window from closing. You can also just find the info in enigma-dev/output_log.txt.

Quote: "Although I do plan to make a topic about this: maybe it's time to ditch GM and its rendering and go for something better? It would break compatibility slightly (especially for 3D games), but would open a lot of doors for improvements."

That's what I keep trying to say, Harri. At any rate, I am not for dumping GM compatibility altogether, but for having an ENIGMA spin-off project.
« Last Edit: June 20, 2014, 09:11:15 pm by Robert B Colton »
TheExDeus
Reply #9 Posted on: June 21, 2014, 08:24:26 am
Joined: Apr 2008
Posts: 1860
Quote: "This problem could be solved by either using OGL 1 for GM compatibility, or using OGL 3 with a pixel shader."

But do you have the same problem in GL3 with pixel shaders? Because GL3 always uses pixel shaders. I am not sure why you get those awful-looking lights, though; on my AMD and Nvidia cards I don't see any real difference between GL1 and GL3.

Quote: "The lighting is awful. It's either a difference in video cards, or just a difference between the OGL1 and OGL3 drivers."

The drivers are the same for GL1 and GL3. And in GL3 there should be no difference between cards: that is the good thing about shaders. In theory they look exactly the same on all hardware (when coded properly), because the developer is the one who writes all the math. Anything else would be like "2+2" giving different results on different platforms. So this could be a problem in my code in GL3shader.cpp. But it could also be an issue with the normals: maybe they are loaded incorrectly? But you use Win8/Win8.1 like me, so we should see the same thing. Maybe the model is changed or corrupt?

Quote: "That's what I keep trying to say, Harri. At any rate, I am not for dumping GM compatibility altogether, but for having an ENIGMA spin-off project."

I don't want to make two projects. I just think we should stop worrying about GM compatibility and add our own functions. Most things would still be compatible; we would just have more functions which do different things. For example, we could make the default rendering deferred, and that shouldn't really break code compatibility: it would just look different on screen, but it looks slightly different now anyway. We could also make a shader/material editor. Even though GM would not be able to load them, we could save them in .egm and live with that.

edit: Alright, I will merge this, because it at least works better than before, even if it's not perfect.
« Last Edit: June 21, 2014, 10:14:45 am by TheExDeus »
TheExDeus
Reply #11 Posted on: June 21, 2014, 05:22:11 pm
Joined: Apr 2008
Posts: 1860
Quote: "Unless you mean that ENIGMA does some default shaders behind the scenes (that I'm not aware of)?"

OGL3 ditched the FFP (fixed-function pipeline), which means that if you want to render ANYTHING (even a single triangle) you need to write your own shader. Over the last 3 months or so I have slowly moved the OpenGL3 graphics system in ENIGMA to a custom shader like that. And that shader is what was the problem. I originally wrote it without support for lights, but now I have implemented them. They are not exactly the same as GL1, but I'd say close enough. There was a problem where AMD didn't render them correctly; I have now done a dirty hack to fix that, so it should render on both AMD and Nvidia, and they should render the same. You can find the shader here, in lines 68 to 217: https://github.com/enigma-dev/enigma-dev/blob/e16c591c36a437c5fad73674f7f750856eb91bb6/ENIGMAsystem/SHELL/Graphics_Systems/OpenGL3/GL3shader.cpp
It's a simple per-vertex lighting shader based on the Phong lighting model (a stripped-down sketch of the idea follows below).

So the problem is that you see a messed-up model at the title screen, and the rest renders correctly? Can you post screenshots of the Mario head and the game itself? Just to see if the shading points in the right direction. I mention normals because, apart from the positions of the lights, they are basically the only thing that affects the calculation of the color. And both of those should be exactly the same.

Quote: "If it did use shaders I would rather expect them to look the same between even OpenGL versions."

I would expect them NOT to look the same between OGL versions, because as we developers write the shader ourselves, we might not be doing the same calculations the GL1 implementation was doing. But I would expect GL3 to look the same across different hardware (to a degree), because all hardware should do the same calculations (the ones we ask it to).
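To give an idea of what such a default shader looks like, here is a heavily stripped-down sketch of a per-vertex (Gouraud-style) diffuse lighting shader, stored as a C++ string the way GL3shader.cpp stores its shaders. This is illustrative only, not the actual ENIGMA code: every name in it is made up, and the real shader also handles specular, attenuation, materials and multiple lights.

Code:
// Vertex shader computing one diffuse light per vertex; the interpolated
// v_Color would then be output directly by a trivial fragment shader.
const char* sketch_vertex_lighting = R"(
#version 140
in vec3 in_Position;
in vec3 in_Normal;

uniform mat4 uModelView;
uniform mat4 uProjection;
uniform mat3 uNormalMatrix;
uniform vec3 uLightPosition;  // in eye space
uniform vec3 uLightDiffuse;
uniform vec3 uAmbient;

out vec4 v_Color;

void main()
{
    vec4 eyePos = uModelView * vec4(in_Position, 1.0);
    vec3 N = normalize(uNormalMatrix * in_Normal);
    vec3 L = normalize(uLightPosition - eyePos.xyz);

    // Lambertian diffuse term, evaluated once per vertex and then
    // interpolated across the triangle. Cheap, but this is exactly why
    // per-vertex lighting looks worse than per-pixel lighting.
    float diff = max(dot(N, L), 0.0);
    v_Color = vec4(uAmbient + uLightDiffuse * diff, 1.0);

    gl_Position = uProjection * eyePos;
}
)";

This also shows why broken normals ruin the result: dot(N, L) is the only place the vertex data enters the color.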
« Last Edit: June 21, 2014, 05:26:35 pm by TheExDeus »
Goombert
Reply #12 Posted on: June 21, 2014, 06:57:10 pm
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
daz, it's the same way with GLES and DX10/11: you have to write pretty much everything yourself. The upside is that it usually comes out faster. But we can work out the kinks later; at least now we can target embedded systems.