Goombert
Reply #1 Posted on: July 28, 2014, 07:04:44 pm
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
Some of them have been added in other places, Harri, such as draw_set_alphatest_ref or whatever the hell it is called, because Studio also has these functions. I assumed the EGM file was there for toggling various graphics settings in the future, as one of Josh's unfinished plans. I side with removing them and renaming the important ones to match the other graphics functions; they should not have unique names, because of what Josh mentioned before about how easy the prefixes make it to look up functions in GML.
That said, I would like to see this made more consistent and generalized between the systems, with whatever we don't need thrown out; it would help if Josh would elaborate.
I think it was Leonardo da Vinci who once said something along the lines of "If you build the robots, they will make games." or something to that effect.
Josh @ Dreamland
Reply #2 Posted on: July 28, 2014, 09:41:44 pm
Prince of all Goldfish
Location: Pittsburgh, PA, USA Joined: Feb 2008
Posts: 2950
Oh my. EGMenable.h is from like 1904. Please delete it. It is good to offer functions similar to those; I like giving the user control over as much as possible. In GL3, that's done with shaders; in GL1, we should find a better way of offering these features. Perhaps offering them will be prettier if and when ENIGMA can mark functions as system-specific.

I've been doing some thinking, and perhaps a good way to handle this is to offer shader generation methods. I'll explore the possibility a bit more later on. Basically, a shader factory with the ability to set those parameters could just call glEnable in GL1, and actually generate appropriate shader code in GL3. Just a consideration.

I should also point out that a more concise way of writing that is [snip=cpp](enable ? glEnable : glDisable)(GL_WHATEVER);[/snip]. Just for what it's worth.
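To sketch the kind of factory I mean (none of these names exist in ENIGMA; this is purely illustrative, reusing that same ternary trick):

[snip=cpp]
#include <GL/gl.h>

// Hypothetical: records an option for the factory to compile into the
// next program it hands out.
void shader_factory_set_option(GLenum cap, bool enable);

void gs_enable_feature(GLenum cap, bool enable) {
#ifdef GRAPHICS_GL1 // hypothetical per-system define
    // GL1: the fixed-function pipeline supports the cap directly.
    // glEnable and glDisable have identical signatures, so the ternary
    // just selects which function pointer to call.
    (enable ? glEnable : glDisable)(cap);
#else
    // GL3: no fixed function; defer to the shader factory instead.
    shader_factory_set_option(cap, enable);
#endif
}
[/snip]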
« Last Edit: July 28, 2014, 09:43:48 pm by Josh @ Dreamland »
"That is the single most cryptic piece of code I have ever seen." -Master PobbleWobble "I disapprove of what you say, but I will defend to the death your right to say it." -Evelyn Beatrice Hall, Friends of Voltaire
TheExDeus
Reply #3 Posted on: July 29, 2014, 02:59:11 am
Joined: Apr 2008
Posts: 1860
Most of those functions don't work in GL3 anyway, because we already use a shader. I can implement things like alpha testing in the shader, but I don't really want to, as it's rarely used; if a person needs it, he can write a simple shader on his own. But if I do add it to the shader, the speed will probably not decrease, because all these options would be toggled with a uniform bool. That means the whole shader executes a single code path, and it probably means the driver just generates several shaders for this internally anyway.
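For what it's worth, the shader-side alpha test I'm describing amounts to only a couple of lines of fragment shader. A minimal sketch, with the GLSL embedded as a C++ string literal (the uniform names are made up, not what ENIGMA actually uses, and I'm assuming the reference value is pre-normalized to 0..1):

[snip=cpp]
// Fragment-shader excerpt implementing an alpha test behind a uniform bool.
// Since en_alphatest is constant for the whole draw call, every fragment
// takes the same branch, which is why the cost is close to zero.
const char* alpha_test_glsl = R"(
uniform bool  en_alphatest;   // what draw_set_alpha_test() would toggle
uniform float alphatest_ref;  // reference value, assumed normalized to 0..1

void apply_alpha_test(vec4 color) {
    if (en_alphatest && color.a <= alphatest_ref) discard;
}
)";
[/snip]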
And yes, there are two major ways to do complex rendering. The first is the mega-shader (one shader to rule them all): it has all the options and features, and activates them using if (some_uniform). Surprisingly, this method isn't that slow, because uniforms don't change between renderings, which makes branching on constant uniforms very fast. This is also the method used in a lot of games; most physically-based shaders are just one mega-shader. The second method involves a shader factory: we generate a new shader for each possible option, or permutation of options. This means there can be A LOT of shaders, and they have to be switched all the time (which carries some performance penalty). So this might even be slower, as it's probably what drivers are doing internally anyway. But if the whole game uses only a simple shader (like an old-school 2D platformer), then this would actually be faster. At that point, though, I'd suggest the user write a simple shader himself. I can provide examples of how to do that, as it's not that complicated.
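To make the shader-factory method concrete: it's basically a cache of compiled programs keyed by an option bitmask, built on first use. A rough sketch (all the names here are hypothetical):

[snip=cpp]
#include <unordered_map>
#include <GL/gl.h> // for GLuint

// Hypothetical helper that injects one #define per set bit into the
// shader source and compiles/links the resulting program.
GLuint compile_with_defines(unsigned options);

// One compiled program per option permutation.
static std::unordered_map<unsigned, GLuint> program_cache;

GLuint factory_get_program(unsigned options) {
    auto it = program_cache.find(options);
    if (it != program_cache.end()) return it->second; // cache hit
    GLuint prog = compile_with_defines(options);      // build on first use
    program_cache.emplace(options, prog);
    return prog;
}
[/snip]

With, say, eight toggleable options that is up to 256 programs, which is why it can mean A LOT of shaders and a lot of program-switching churn.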
I will delete some of the functions (the EGMenable.h ones), but still leave some in, because there are A LOT of deprecated functions still in ENIGMA's GL3 (though normally not used). So this will probably take some time that I don't want to spend now, but it is required if we want GLES to be even easier to implement.
Also, do we need _fog_ functions in GL3?
TheExDeus
Reply #4 Posted on: July 29, 2014, 11:41:30 am
Joined: Apr 2008
Posts: 1860
So I went ahead and started running GL3 in a debug context. This found quite a few bugs (like using wrong enums and functions) as well as deprecated usage (which is what I actually wanted to find). After removing some deprecated calls like glEnable(GL_TEXTURE_2D) and glEnable(GL_ALPHA_TEST), I found that the AMD bug was actually caused by them. When alpha testing is enabled together with a shader, Nvidia seems to mostly disregard it; on AMD it foobars the whole thing. So when I got rid of it and instead implemented it in the shader, everything seems to work fine now on both Nvidia and AMD, and without that ugly hack I found earlier (where I always enabled and sent color even though the buffer didn't have any). But then I found that Robert (I'm pretty sure it was him, as I even vaguely remember the commit) added glAlphaFunc(GL_NOTEQUAL, 0) to d3d_begin(). It is required if you want alpha to work properly in 3D, but for that GM:S and ENIGMA have a special function - draw_set_alpha_test(). Alpha testing defaults to false, because it's just faster that way. So in this regard I have two questions: 1) For Robert - did GM:S render Project Mario's trees correctly without using draw_set_alpha_test()? I think it didn't, which is why Robert added that line in ENIGMA. 2) Do we enable it by default in d3d_begin(), or do we ask the user to enable it manually via draw_set_alpha_test()? I think we should ask the user to do it.
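For reference, here is the deprecated line in question next to what it boils down to in a GL3 shader (the GLSL line is illustrative, not our exact shader code):

[snip=cpp]
#include <GL/gl.h>

void deprecated_alpha_test_on() {
    // The fixed-function path that d3d_begin() used; both calls are
    // invalid in a GL3.3 core profile:
    glAlphaFunc(GL_NOTEQUAL, 0); // pass only fragments with alpha != 0
    glEnable(GL_ALPHA_TEST);
}

// In a GL3 fragment shader, the equivalent policy is a single line:
//     if (en_alphatest && color.a == 0.0) discard;
[/snip]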
Right now Project Mario launches and runs fine with the GL3.3 core profile enabled on my old AMD laptop. That means it no longer uses ANY deprecated functions at any point, which in theory means it could also work in GLES from a graphics standpoint.
Performance-wise there is no difference: I still get 1120 FPS in Project Mario on a 660 Ti. When I set draw_set_alpha_test_ref_value(50) to get rid of the ugly tree alpha, I get up to 1160 FPS, because there is less to render. Surprisingly, with draw_set_alpha_test(false) I get the same 1120 FPS, while I expected to get more.
Goombert
Reply #5 Posted on: July 29, 2014, 04:29:13 pm
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
Quote from TheExDeus: "Also, do we need _fog_ functions in GL3?"

Yes, Studio does not deprecate them, so we clearly need them: http://docs.yoyogames.com/source/dadiospice/002_reference/drawing/drawing%203d/3d%20lighting%20and%20fog/d3d_set_fog.html

Quote from TheExDeus: "1) For Robert - did GM:S render Project Mario's trees correctly without using draw_set_alpha_test()?"

Alpha testing should be disabled by default, but this function did not exist in 8.1, and zwriteenable was used instead to control proper rendering of the trees. The function you are referring to is not used anywhere in Project Mario: http://docs.yoyogames.com/source/dadiospice/002_reference/drawing/color%20and%20blending/draw_set_alpha_test.html

It appears we are enabling it by default:
https://github.com/enigma-dev/enigma-dev/blob/master/ENIGMAsystem/SHELL/Graphics_Systems/OpenGL1/OPENGLStd.cpp#L68
https://github.com/enigma-dev/enigma-dev/blob/master/ENIGMAsystem/SHELL/Graphics_Systems/OpenGL1/GLstdraw.cpp#L81
https://github.com/enigma-dev/enigma-dev/blob/master/ENIGMAsystem/SHELL/Graphics_Systems/Direct3D9/DX9draw.cpp#L74
https://github.com/enigma-dev/enigma-dev/blob/master/ENIGMAsystem/SHELL/Graphics_Systems/Direct3D9/DX9screen.cpp#L407

Turning it off may have adverse effects, and I believe the manual is lying about it being disabled by default. If I recall correctly, disabling it for Direct3D will cause some serious rendering issues, though I may be confusing their implementation, or really they confused it with zwriteenable. I honestly don't see how or why this would or should be disabled by default.

Also, please remove the EGM files from all the graphics systems and commit that if you are going to do it, so that I can merge it with the pull request I currently have under development.
« Last Edit: July 29, 2014, 04:47:17 pm by Robert B Colton »
I think it was Leonardo da Vinci who once said something along the lines of "If you build the robots, they will make games." or something to that effect.
TheExDeus
Reply #6 Posted on: July 29, 2014, 06:11:16 pm
Joined: Apr 2008
Posts: 1860
Quote from Goombert: "Yes, Studio does not deprecate them, so we clearly need them."

That's not really an argument, but okay.

Quote from Goombert: "It appears we are enabling it by default."

I know. I have now removed all those deprecated functions, so we no longer enable it by default.

Quote from Goombert: "I honestly don't see how or why this would or should be disabled by default."

The only anomaly is the trees in Project Mario, which is what I expected to see. You can easily enable it yourself; the question is whether we should do it automatically. My personal opinion is that we shouldn't; yours is that we should. Does anyone else have an opinion?

Quote from Goombert: "Also, please remove the EGM files from all the graphics systems and commit that if you are going to do it."

I am working in a separate branch (https://github.com/enigma-dev/enigma-dev/commits/GL3.3Fixes), which I don't plan to merge for at least a day or two. Maybe more.