ENIGMA Forums

Contributing to ENIGMA => Developing ENIGMA => Topic started by: TheExDeus on October 03, 2014, 05:21:22 PM

Title: Massive GL3.3 changes.... again
Post by: TheExDeus on October 03, 2014, 05:21:22 PM
Some might remember a merge I did in mid-August. It involved massive GL3.3 changes. It sat as a merge request for a week so anyone could test it. Nobody did, so I merged it and everything went up in flames. This time I'm posting a topic so people actually know about these changes; previously maybe only Robert was aware of them. I will also explain how to test it using git.

These are some massive changes to the GL3.3 graphics system (it also touches other places). In short:
1) Better errors for GLSL together with more caching.
2) Surfaces now have optional depth buffers. This allows rendering 3D scenes to them, which is the basis of many graphics effects, like reflections, refractions and post-processing.
3) Added functions to use attributes and matrices, both of which are now cached.
4) Added a proper GL3.3 debug context together with an error function. This means that when you run GL3.3 in debug mode, it will error (segfault) whenever you use a deprecated function or a wrong enum, printing the function to the console and showing an error window. This is very useful when we are trying to get rid of deprecated functions, or when we have a hard-to-find bug (like a wrong enum in a function argument). Using this I removed many functions and fixed many others. In the end it fixed the AMD problems we were having, and I removed the "hack" that was used previously. It also means that ENIGMA users normally shouldn't see these errors (as they won't use GL directly), so this could become a separate graphics debug mode, so that we don't drop frames without reason (this GL debug mode really does drop FPS).
5) Fixed view_angle in GL1 and GL3.
6) Adds a global VAO, which is necessary for GL3.3 core. Making one VAO per mesh might be better, but I had some problems with rendering when I tried that. Worth investigating later.
7) Fixes GL1 models not updating. GL lists are now used for additional batching, but they were not being reset when model_clear() was called.
8) The GL1 GL list was never destroyed, creating a memory leak. Fixed that by destroying the list in the mesh destructor; the list is also created once, in the mesh constructor.
9) Fixes surfaces, which were broken in the recent viewport code changes.
10) Started on the GPU profiler. It would basically keep track of vertices, triangles, texture swaps, draw calls and so on per frame. This is of course done in debug mode for now. Many changes to come in this regard, as well as a forum post to explain this in more detail.
11) Updated GLEW. This wasn't strictly necessary, but it's always good to have newer stuff. The reason I did it, though, is that I needed to get rid of the glGetString(GL_EXTENSIONS) call in glew.c. That function with that argument is deprecated, so it always crashed at startup in a debug context. The newest version (1.10) still doesn't remove that call, but I found several code snippets on the net that replace it.
12) The color choice is more consistent with GM in GL3. It's hard to explain, but basically the bound color (draw_set_color) is the default and it won't blend when using vertex_color. This is the fix for the purple floor in the Minecraft example: in GL1 the floor was purple, in GL3 it was white. Now it is purple in GL3 as well.
13) Fixed shadows in Project Mario (can't remember what did it though).
14) Added alpha test functions for GL3.3. This can also improve performance.
15) Added draw_sprite_padded() which is useful for drawing menus, buttons and other things like that. Will be instrumental in GUI extension I'm working on.
16) Added a basic ring buffer. If the buffer type is STREAM (like the default render buffer), it uses a ring buffer. This means that if you render stuff with the same stride (say, 6 bytes), it will repeatedly call glBufferSubData on different parts of the buffer and not cause GPU/CPU synchronization. This is useful for things like particle systems. For now it only helps when you render everything in one batch with the same stride (like particles). In my test drawing 10k sprites I get 315 FPS with current master and 370 FPS with this change. The gain will not be noticeable in more regular cases; the Minecraft and Mario examples gain nothing from this change. In the short term I think the biggest gain can only come from a texture atlas or texture arrays. Another option is to use GL4 features, like persistent memory mapping. Learn more here: http://gdcvault.com/play/1020791/ and about ring buffers here: https://developer.nvidia.com/sites/default/files/akamai/gamedev/files/gdc12/Efficient_Buffer_Management_McDonald.pdf.
17) C++11 is now enabled. From now on we will start using C++11 features, including unordered_map, which is already used in the shader system.
18) Some OpenAL changes so calling sound_play(-1) doesn't crash Linux.
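
The ring-buffer idea in point 16 comes down to plain offset arithmetic. The sketch below is an illustrative stand-in, not ENIGMA's actual buffer code: `StreamRingBuffer` and `acquire` are hypothetical names, and the real GL piece is the glBufferSubData call that the returned offset would feed.

```cpp
#include <cstddef>

// Minimal sketch of a ring-buffer allocator for a streaming vertex buffer.
// Instead of overwriting a region the GPU may still be reading (which would
// force a CPU/GPU sync), each upload goes to the next free region; when the
// end is reached we wrap back to offset 0, where the driver has long since
// finished reading. Names here are illustrative, not ENIGMA's actual API.
class StreamRingBuffer {
public:
    explicit StreamRingBuffer(std::size_t capacity) : capacity_(capacity) {}

    // Returns the byte offset at which `bytes` should be uploaded,
    // e.g. via glBufferSubData(target, offset, bytes, data).
    std::size_t acquire(std::size_t bytes) {
        if (offset_ + bytes > capacity_)  // no room left: wrap around
            offset_ = 0;
        std::size_t where = offset_;
        offset_ += bytes;
        return where;
    }

private:
    std::size_t capacity_;
    std::size_t offset_ = 0;
};
```

This is also why the gain only shows up when one batch keeps the same stride: as soon as strides mix, the writes can't march through the buffer in uniform steps.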

There were many other changes as well, but I have forgotten most of them, as this was originally a mid-August merge.

I would like some other people to test it. I have tried it on an AMD laptop and an NVIDIA PC, and will do some additional tests later.

There are also performance improvements for GL3 stemming from these changes: Project Mario, for example, now runs at 1620 FPS (vs. 1430 FPS on master). But there can also be a decrease in some cases, because the caching can actually take more time than calling the GL function. For example, uniforms are very optimized and are meant to be changed frequently (on the order of 10 million times a second), so adding a caching layer can actually slow things down. The cache is still useful for debugging, as we then know the types of the uniforms and the data they hold (so people can query that data back without touching GPU memory). I'm still investigating whether leaving the cache in, but disabling the cache checks, is more useful and faster.
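
The caching trade-off can be seen in a minimal sketch. Everything here is illustrative rather than ENIGMA's real shader-system code: `set_uniform_f` stands in for a wrapper around glUniform1f, and `gl_calls` just counts the GL calls that would have been issued.

```cpp
#include <unordered_map>

// Sketch of a uniform cache: skip the GL call when the value is unchanged.
// gl_calls stands in for the real glUniform1f call; in the real system the
// cache also lets user code query uniform values back without touching GPU
// memory. The lookup itself costs time, which is why caching can lose on
// uniforms that change on almost every call.
struct UniformCache {
    std::unordered_map<int, float> values;  // location -> last value set
    int gl_calls = 0;                       // GL calls actually "issued"

    void set_uniform_f(int location, float value) {
        auto it = values.find(location);
        if (it != values.end() && it->second == value)
            return;                // cached: redundant call skipped
        values[location] = value;
        ++gl_calls;                // here the real code would call glUniform1f
    }
};
```

When a uniform is set to the same value every frame the call is skipped entirely; when it changes every time, the map lookup is pure overhead, which matches the slowdown described above.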

I recommend testing on:
1) Project Mario - http://enigma-dev.org/forums/index.php?topic=1161.0 (GL1 and GL3).
2) Minecraft example - http://enigma-dev.org/edc/games.php?game=65 (GL1 and GL3).
3) Simple shader example - https://www.dropbox.com/s/6fx3r0bg5puyo28/shader_example.egm (GL3).

I will fix up the water example and post a link as well.

This is how they should look after running:
(http://i.imgur.com/WOPB6xP.png)
(http://i.imgur.com/dzc6hZK.png)
(http://i.imgur.com/N8py1B5.png)


You can find the branch here: https://github.com/enigma-dev/enigma-dev/commits/GL3.3RealCleanUp
To test it you can do this via git:
1) Open console, and cd to enigma directory
2) Write "git checkout GL3.3RealCleanUp"
3) Then open LGM and test

Another way is to download this: https://github.com/enigma-dev/enigma-dev/archive/GL3.3RealCleanUp.zip
Then extract it, and copy LGM, the plugin directory, ENIGMA.exe, and ENIGMAsystem/Additional from your working version of ENIGMA into the extracted directory.

Please test, give feedback and bug reports. I would want this merged as soon as possible. :)

Known bugs:
Text in Project Mario is messed up. I can't remember whether this was fixed or not. It looks fine in Minecraft, so I'm not sure what is going on. Maybe Robert knows.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 03, 2014, 11:48:34 PM
Lots of great fixes! One minor bug: you need to add the following in GL3shader.cpp:

Code: [Select]
#include <cstring>
...because memcpy is actually defined there (for some reason). Otherwise, Project Mario won't compile on Linux.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 03, 2014, 11:54:12 PM
Also, the Minecraft example looks like this on Linux (+Nvidia), with both OpenGL1 and OpenGL3:

(http://i.imgur.com/eLJ0orB.png)

Finally, the shader example causes LateralGM to segfault, shortly after:

Code: [Select]
Running make from `make'
Full command line: make Game WORKDIR="%PROGRAMDATA%/ENIGMA/" GMODE=Run GRAPHICS=OpenGL3 AUDIO=OpenAL COLLISION=Precise WIDGETS=None NETWORKING=None PLATFORM=xlib CXX=g++ CC=gcc COMPILEPATH="Linux/Linux" EXTENSIONS=" Universal_System/Extensions/Alarms Universal_System/Extensions/Timelines Universal_System/Extensions/Paths Universal_System/Extensions/MotionPlanning Universal_System/Extensions/DateTime Universal_System/Extensions/ParticleSystems Universal_System/Extensions/DataStructures" OUTPUTNAME="/tmp/egm7620500460771948401.tmp" eTCpath=""

Title: Re: Massive GL3.3 changes.... again
Post by: Goombert on October 04, 2014, 12:06:09 AM
Sorlok, as far as the shader output, it would probably be better to post enigma-dev/output_log.txt because I believe we throw the errors in that file.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 04, 2014, 04:51:44 AM
Quote
...because memcpy is actually defined there (for some reason). Otherwise, Project Mario can't compile in Linux.
Alright. The fact that I cannot test on Linux and Mac is a good reason for this testing in the first place. Thank you!

Quote
Also, the Minecraft example looks like this on Linux (+Nvidia), with both OpenGL1 and OpenGL3:
That is weird. Can you run in Debug mode and see if it outputs anything to console?

Quote
Finally, the shader example will cause Lateral GM to segfault, shortly after:
Same, please run in Debug mode and see if anything errors out.

But Project Mario looks okay on Linux? If so, that's a good sign; it means we can fix the other problems as well. Otherwise it would just be a "driver issue" and we'd be out of luck.

edit: Also, I forgot to mention that one of the biggest reasons for all of this was to make porting to GLES a lot easier. For that we needed to use a GL core context, which is why I had to redo several places, even though that could actually cause a performance hit in the short term. But on my hardware the FPS in several examples is higher, and in the others it's as before. The performance gain comes mainly from the reduced number of GL function calls: in Project Mario they went from 5215 per frame to 3222, almost a 40% reduction. Sadly that number alone doesn't mean much, as some calls are very fast and others are not. Optimizing for the slow ones is the way to go from now on.
Title: Re: Massive GL3.3 changes.... again
Post by: Goombert on October 04, 2014, 08:53:17 AM
Quote from: Harri
That is weird. Can you run in Debug mode and see if it outputs anything to console?
I believe it might be because that game uses screen_refresh() to read the heightmap data from the background/sprite. Sorlok, are you testing on Compiz?

Quote from: Harri
Performance gain is basically because of reduced GL function calls, in Project Mario they went from 5215 per frame, to 3222. Almost 40% reduction. But sadly it doesn't mean much, as some calls are very fast, others are not. Optimizing for the slow ones is the way to go from now.

I can tell you, I think some of the slowdown is from the VBOs used for sprite/background batching. I believe vertex arrays are a lot faster for dynamic drawing. But this is in regard to regular consumer hardware; I don't have a graphics card as good as yours, Harri, so in some cases (like sprite batching) software vertex processing is just optimized a lot more, I guess.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 04, 2014, 09:12:26 AM
I doubt your card does any software vertex processing either; it's all on the GPU. But there are many things we still need to do, like batching textures. Then we could batch many models together, like you do with GL lists. That is actually how it's often done these days: you create a texture array (or 3D texture) and bind all the textures to it, put everything you want to draw in one VBO, and then issue many draws while switching only the texture index. This means no texture switching or VBO switching actually takes place. But it's hard to do all that automatically with the way GM and ENIGMA allow drawing things.

Also, I made some xlib changes. I haven't tested them, so I'm sure they broke something, but I would like sorlok to test and see. Basically, we weren't even creating a GL3 context before, so it couldn't give you any error information either. I doubt this will fix the problems you are seeing (maybe the segfault in the shader example), but it's still a good debug feature. Please test again and help fix any syntax errors I might have made.

Can you move around in the Minecraft example? As in, are only the visible blocks solid? If so, the problem really is in the generation. The generation happens in world_generator: it draws a sprite and then uses draw_getpixel. It doesn't actually call screen_refresh() as far as I can see, so I'm not sure how it works. I would just use a surface instead.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 04, 2014, 11:03:02 AM
First, some basic errors:

In Bridges/xlib-OpenGL3/graphics_bridge.cpp, you need:

Code: [Select]
#include "libEGMstd.h"
...because you use toString().

Also, on this line:
Code: [Select]
*fbc = glXChooseFBConfig(enigma::x11::disp, DefaultScreen(enigma::x11::disp), visual_attribs, &fbcount);
...you are storing a double-pointer in a single-pointer.

Similarly, here:

Code: [Select]
*vi = glXGetVisualFromFBConfig( enigma::x11::disp, fbc[0] );
...you store a single pointer in a value type. I tried changing these lines to:

Code: [Select]
*fbc = *glXChooseFBConfig(enigma::x11::disp, DefaultScreen(enigma::x11::disp), visual_attribs, &fbcount);
*vi = *glXGetVisualFromFBConfig( enigma::x11::disp, fbc[0] );

...but I got a segfault on the first line. I'll look into this more, but perhaps you changed how the *FBConfig() functions work? (GL1 doesn't use them, and Windows-GL3 does something different.)
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 04, 2014, 11:13:52 AM
Yes, Windows version is quite different, because GLX is different. I used this as an example: https://www.opengl.org/wiki/Tutorial:_OpenGL_3.0_Context_Creation_%28GLX%29 . Maybe you can use that to figure out in more detail. Sadly, even if I fix it, I cannot test it. I know Robert can test linux now as well, so he should maybe try it.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 04, 2014, 12:35:06 PM
I've done a little digging. Turns out this line:

Code: [Select]
fbc = glXChooseFBConfig(enigma::x11::disp, DefaultScreen(enigma::x11::disp), visual_attribs, &fbcount);
...will segfault unless glewInit() is called. However, glewInit() will fail if it is not in exactly the right place (causing later failures). The important thing here is that even if glewInit() fails, the glXChooseFBConfig() will at least not segfault.

I'll try to track down the "right" place to put the glewInit() call, but maybe Robert has a better idea? I'm not much of an OpenGL person.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 04, 2014, 12:54:26 PM
Your type errors are weird, though. You shouldn't dereference the values returned by those functions, so don't do this:
Code: (C) [Select]
*fbc = *glXChooseFBConfig(enigma::x11::disp, DefaultScreen(enigma::x11::disp), visual_attribs, &fbcount);
*vi = *glXGetVisualFromFBConfig( enigma::x11::disp, fbc[0] );
Also, we actually call glewInit() twice in GL3 on Windows: once in the bridge, and the other time in graphicssystem_initialize(). I didn't add glewInit() to the xlib bridge, though.

Also, can you test those examples on master? Mario should work, but without water. Minecraft should work, but mining won't. The shader example will render only the glass box.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 04, 2014, 01:18:28 PM
Sure, I'll have a look. Also, you are right that the first line should look like this:

Code: [Select]
fbc = glXChooseFBConfig(enigma::x11::disp, DefaultScreen(enigma::x11::disp), visual_attribs, &fbcount);
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 04, 2014, 01:57:09 PM
Also, test the changes I made in the branch. I thought I had said that, but then I read my post and noticed that I hadn't. :)
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 04, 2014, 03:15:33 PM
Running your latest changes allows it to compile, but it still crashes at:

Code: [Select]
fbc = glXChooseFBConfig(enigma::x11::disp, DefaultScreen(enigma::x11::disp), visual_attribs, &fbcount);

Running on master compiles and runs, but then the game is unplayable (just a single blue screen and the cursor; it's not clear whether I'm actually moving, since everything is blue).

Project Mario builds on master.

The shader example still crashes on master, but I figured out why. By default (because this is an EGM file!) the make directory is set to:
Code: [Select]
%PROGRAMDATA%/ENIGMA/
I really, really, really think we should not honor the user-specified make directory and instead pick one (internally) for each platform. I've run into this same bug about three times now.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 04, 2014, 03:33:50 PM
Turns out there is a glxewInit as well. I'm sure we never call it. Again, it only works once a context has been created, so it's the chicken-and-egg problem again: we need to create a simple context, call glxewInit(), and then create the GL3.3 context.

edit: Try now. That is actually why glXChooseFBConfig() segfaulted: glX functions are loaded by glxewInit, not glewInit, so the pointer was junk, and calling a junk function pointer ends in a segfault. So if you get the current version to compile, it should not crash in glXChooseFBConfig.
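
The junk-pointer failure mode is easy to picture: an extension entry point is just a function pointer the loader fills in, and if the loader never ran, calling through it segfaults. Below is a hedged sketch of a defensive first-use check; all names here (`p_glXChooseFBConfig`, `safe_choose_fbconfig`, the simplified signature) are hypothetical, and the real fix is of course calling the right init in the right order.

```cpp
#include <cstdio>

// Sketch: extension entry points are function pointers filled in by a loader
// (glewInit/glxewInit). If the loader was never run, the pointer stays null
// and calling through it segfaults. Guarding the first use turns a crash
// into a readable error. The signature is simplified for illustration.
using ChooseFBConfigFn = void* (*)(int screen);
static ChooseFBConfigFn p_glXChooseFBConfig = nullptr;  // not loaded yet

bool safe_choose_fbconfig(int screen, void** out) {
    if (!p_glXChooseFBConfig) {
        std::fprintf(stderr,
                     "glXChooseFBConfig not loaded - did glxewInit run?\n");
        return false;  // report instead of segfaulting
    }
    *out = p_glXChooseFBConfig(screen);
    return true;
}
```

A guard like this wouldn't have fixed the init-order bug, but it would have pointed straight at the missing glxewInit instead of producing a bare SIGSEGV.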

edit2: The fact that you don't see anything on master in the Minecraft example at least means the current fixes are an improvement. It shows a blue window in both GL1 and GL3? It seems we are more hopeless on Linux than we thought. I guess you should try Iji then.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 04, 2014, 05:41:45 PM
Now I'm getting a compile error:
Code: [Select]
Bridges/xlib-OpenGL3/graphics_bridge.cpp:132:21: error: ‘glxewInit’ was not declared in this scope
    err = glxewInit();

What's really weird about this is that you include glxew.h. So I tracked down within that file:
Code: [Select]
#ifdef GLEW_MX
#define glxewInit() glxewContextInit(glxewGetContext())
#endif

GLEW_MX is an optional macro, and defining it makes a lot of other things more annoying (like glewGetContext()). Not entirely sure what the right response here is.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 04, 2014, 06:24:11 PM
Yeah, I was mistaken. It turns out it's only used for multiple contexts or something, and it actually calls the same thing glewInit() calls. So I removed it - try now. It might work, because I left the temporary context in, which was still needed for glewInit().
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 04, 2014, 07:21:11 PM
Unfortunately, it still crashes at the same line:
Code: [Select]
fbc = glXChooseFBConfig(enigma::x11::disp, DefaultScreen(enigma::x11::disp), visual_attribs, &fbcount);
Fun times: I just figured out that glXChooseFBConfig is NULL on my machine. That is, the function pointer returned by glxew is null. This shouldn't be happening (modern graphics card, etc.), but I'll dig into it a bit more.

EDIT: Ok, so it seems the function pointers are only set to non-null if glxewInit() is called. But that can only work if GLEW_MX is defined... which requires us to track the current render context.

My question is... did glXChooseFBConfig() work before, or was it added just to this commit?
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 05, 2014, 07:24:34 AM
Actually, it's defined on line 303, where no GLEW_MX is required. glxewInit calls glxewContextInit(), which is in glew.c at line 13227. So the whole thing seems weird to me.

Quote
My question is... did glXChooseFBConfig() work before, or was it added just to this commit?
It was added. Previously only glXChooseVisual was called; for a proper GL3 context it seems we need glXChooseFBConfig as well. I'm honestly out of ideas. You can try using GLEW_MX, as glxewGetContext() shouldn't cause problems. The GLEW homepage says GLEW_MX is not included in the default release, so you might need a custom GLEW build to enable it (http://glew.sourceforge.net/advanced.html). But it does look like we already have everything we need. It would be 100x easier for me to fix this if I had Linux; I guess I will have to either dual-boot or use a VM.

Try compiling this simple example: https://www.opengl.org/wiki/Tutorial:_OpenGL_3.0_Context_Creation_%28GLX%29 . It doesn't use GLEW and does the pointer loading the old-school way. If nothing else, you can at least get the pointer for that one function that way and use GLEW for the rest.

Also, we might look into glbinding: https://github.com/hpicgs/glbinding . It's the newest and most actively developed binding framework; it basically replaces GLEW and many others. It uses C++11 features and is quite interesting. But changing to it will probably be a little painful, and we haven't decided whether to support only C++11 compilers, so I guess we should hold off on that for a while.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 05, 2014, 10:51:28 AM
Quote from: TheExDeus
Actually it's defined in line 303, where there is no GLEW_MX required. glxewInit calls glxewContextInit() which is in glew.c line 13227. So the whole thing seems weird to me.

Which file are we talking about? I'm referring to glxew.h, where glxewInit() is defined on line 1566, but only if GLEW_MX is defined. glXChooseFBConfig() will always be defined, but it might point to nothing if glxewInit() is never called.


Quote from: TheExDeus
Previously only glXChooseVisual was called. For proper GL3 context it seems we need glXChooseFBConfig as well. I'm honestly out of ideas. You can try using GLEW_MX, as glxewGetContext() shouldn't cause problems. GLEW homepage says that GLEW_MX is not actually included in default release. So you might need to make a custom glew build to enable that (http://glew.sourceforge.net/advanced.html). But it does look that we have everything we need already. It would be 100x easier for me to fix this if I had Linux. I guess I will have to either dual-boot or do a vm thing.

You might want to look at putting Ubuntu in a VM. I've tried enabling GLEW_MX and got a bit stuck, since it requires you to manually handle the render contexts yourself.


Quote from: TheExDeus
Try compiling this simple example: https://www.opengl.org/wiki/Tutorial:_OpenGL_3.0_Context_Creation_%28GLX%29 . It doesn't use glew, and uses old school pointer stuff. But if nothing else, then you can at least get the pointer for that function alone. And use glew for the rest.

The example works.


Quote from: TheExDeus
Also, we might look into glbinding, https://github.com/hpicgs/glbinding . It the newest and most recently worked on binding framework. It basically replaces glew and many others. It uses C++11 features and is quite interesting. But changing this will probably be a little painful, and we haven't decided if we go towards supporting C++11 compilers only, so I guess we should hold off of that for a while.

From my point of view, C++11 support would be great! I think the main concern is double-checking that MinGW on Windows has all the C++11 features we'll need. The other devs will probably have different opinions.

Actually, if this is just in the graphics Bridge, couldn't we selectively enable C++11 on Linux when GL3 is used?
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 06, 2014, 04:58:42 AM
Quote
The example works.
Then maybe you can try that method. Get the glXChooseFBConfig pointer and use that.

Quote
From my point of view, C++11 support would be great! I think the main concern is double-checking that MinGW on Windows has all the C++11 features we'll need. The other devs will probably have different opinions.
Newer MinGW (from the last few years) supports C++11 just fine. We also bundle our own MinGW with the ENIGMA installation, so we can control which version people use. But supporting C++11 is a much bigger decision, for which I would want Josh's input.

Quote
Running on master compiles and runs, and then the game is unplayable (just a single blue screen and the cursor, but it's not clear if I'm actually moving, since everything is blue).
But when you were running Minecraft on the branch (when you also saw the ground and some blocks), were you able to hit those blocks? Because if we cannot do this context fix properly, I might as well revert so Linux at least partly works. It might be that it rendered fine (your scaling and text were still messed up, but still) and it was the world generation that broke.

You also said that you fixed the shader example segfault by disabling the build path. Did it render correctly then? Can you post a picture?

Because it seems that Linux might have worked fine for the most part, even without a proper GL3.3 context and debugging. That can be added later.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 06, 2014, 08:24:27 AM
Yes, placing blocks and destroying them in Minecraft works on the branch. I think it's correct to say that the world generation broke.

The Shader example crashes at the same place as Minecraft (the glXChooseFBConfig() thing).

I'll try to integrate the sample code's context stuff and see if that helps.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 06, 2014, 08:48:19 AM
It's possible, then, that everything was already working except some draw_getpixel() somewhere. So if you are unsuccessful, I will just revert.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 06, 2014, 09:39:17 AM
If I force the pointer (using these changes (https://github.com/sorlok/enigma-dev/commit/32c4d550fc91d5c3902e457c3d9000e3bcea0d47)), then the program runs past initialization and crashes immediately on:

Code: [Select]
Program received signal SIGSEGV, Segmentation fault.
0x0000000000000000 in ?? ()
(gdb) bt
#0  0x0000000000000000 in ?? ()
#1  0x0000000000483ad8 in enigma::Shader::Shader (this=0x919620, type=0)
    at Graphics_Systems/OpenGL3/GLSLshader.h:40
#2  0x000000000047cd3d in enigma_user::glsl_shader_create (type=0)
    at Graphics_Systems/OpenGL3/GL3shader.cpp:490
#3  0x00000000004990de in enigma::graphicssystem_initialize ()
    at Graphics_Systems/OpenGL3/OPENGL3Std.cpp:101
#4  0x000000000051d598 in enigma::initialize_everything ()
    at Universal_System/loading.cpp:59
#5  0x0000000000455f1c in main (argc=1, argv=0x7fffffffe0b8)
    at Platforms/xlib/XLIBmain.cpp:306

Does this look like we're making progress, or did I just cause all sorts of nondeterminism by casting function pointers around? After all, the point of GLEW is to handle this kind of stuff automatically.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 06, 2014, 10:45:40 AM
You should try debug mode. It crashed in glCreateShader(), which means either you don't support shaders, glCreateShader is also NULL, or the core context is complaining that we're using something deprecated. You can also try uncommenting "//GLX_CONTEXT_PROFILE_MASK_ARB, GLX_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB," in the xlib bridge. That turns on the compatibility context, so it's more likely to run. Nvidia actually discourages disabling the compatibility context (because when even GL3.3 is eventually deprecated, your already-compiled programs won't run without it); I only disabled it to make the code compatible with GLES. After all these changes, I will probably turn that flag back on.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 06, 2014, 10:14:12 PM
glCreateShader() was also NULL. Enabling the COMPATIBILITY_PROFILE does not help.

I think this is almost certainly because I'm fiddling with raw pointers. I think the correct approach is to use glxew.h with GLEW_MX, or maybe that new approach you mentioned.

Anyway, I'm always free to help debug this; I'd like very much to see GLES integration.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 06, 2014, 10:58:28 PM
Ok, I tried GLEW_MX, and glXChooseFBConfig is still a null pointer. There is clearly something weird going on there.

Edit: Just for kicks, I tried updating the glew.c/glew.h packaged with ENIGMA. Still no dice.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on October 09, 2014, 12:10:23 PM
Another thing we didn't try (or at least I think we didn't) is calling glXMakeCurrent after creating the temporary context but before the GLEW inits. It's actually required (on Windows too), so it was my bad that I forgot to add it. Try the last commit.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 09, 2014, 07:51:00 PM
I'll give it a try and get back to you (stuck in a meeting).
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on October 10, 2014, 08:19:02 PM
Got an error compiling:

Code: [Select]
In file included from Bridges/xlib-OpenGL3/../General/glxew.h:102:0,
                 from Bridges/xlib-OpenGL3/graphics_bridge.cpp:21:
Bridges/xlib-OpenGL3/graphics_bridge.cpp: In function ‘void enigma::EnableDrawing()’:
Bridges/xlib-OpenGL3/../General/../../Graphics_Systems/General/glew.h:18011:51: error: ‘glewGetContext’ was not declared in this scope
 #define glewInit() glewContextInit(glewGetContext())
                                                   ^
Bridges/xlib-OpenGL3/graphics_bridge.cpp:128:18: note: in expansion of macro ‘glewInit’
     GLenum err = glewInit();

Turns out that when you define GLEW_MX, you need to define your own glewGetContext() function (since there can be multiple contexts now).
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on November 04, 2014, 02:41:40 PM
Alright. I just reverted the xlib changes; now please try again. I have installed Ubuntu in a virtual machine, but sadly that doesn't let me use new graphics functionality, as the GPU is virtual as well, so I ended up with GL 2.1 support at most. But I could compile without errors and run GL1.1 until surfaces were used; then it breaks, as the virtual driver doesn't support framebuffers.

So on Linux, try all the examples, but ignore the Minecraft bug, as the problem is in draw_getpixel, not the drawing itself. I will try to get Ubuntu going somehow (dual boot, I guess) and test again.

Can everyone please test this hopefully final version? It works for me on Nvidia and AMD.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on November 09, 2014, 10:56:21 AM
Testing results on Linux Mint:

Minecraft: Shows consistent behavior on GL1 and GL3. The world is still empty, but block placement and movement works.

Shaders test: Crashes ENIGMA. I'm looking into this.

Project Mario: Shows the opening title screen, but hangs, spinning forever, on a black screen when trying to load the world. I'm looking into this too.


Edit: Shaders test works, but with a lot of tearing. Note that this tearing is not present in screenshots:
http://i.imgur.com/PGxJqyQ.png
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on November 09, 2014, 01:55:55 PM
Compile in debug mode, try "gdb compiled_game.tmp", and see where the segfault happens. For me it happened in surface_reset_target(), because the driver I was using inside VMware didn't support much. Another thing is loading and using resources: Project Mario tries to play some audio resources it doesn't load, so without "== -1" checks it crashes at that screen. I did add those checks to the audio system, however. Also, you must run Mario from the directory where all the Mario files are.

Is the glass cube in the shader example also visible? I'm not sure what would cause the tearing. Maybe flickering on the glass surface, but not tearing.
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on November 09, 2014, 04:29:25 PM
Ah, thanks for the hint. Turns out I was using an old PM file. It works fine now. (The window is "squished", but this was fixed on master, so you'll get it with the merge.)

What do you mean by the "glass cube"? Can you provide a screenshot of how it should look?
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on November 09, 2014, 04:44:54 PM
Like the screenshot in the original topic:
(http://i.imgur.com/N8py1B5.png)
Title: Re: Massive GL3.3 changes.... again
Post by: sorlok_reaves on November 09, 2014, 08:34:01 PM
Oh, I certainly don't see it. Is there a way to change the zoom level? The default view starts out very zoomed-in.
Title: Re: Massive GL3.3 changes.... again
Post by: Goombert on November 09, 2014, 11:30:21 PM
I would like to add, Harri, that I am planning on wrapping up LateralGM 1.8.7 soon, after some additional fixes and integration of other new features. I would like to have the OpenGL fixes in so that I can also fix the Direct3D surfaces. But we should decide whether we want to add a draw_set_viewport function that does not scale the viewport and a screen_set_viewport that scales the viewport to screen scaling coordinates, or perhaps just add a boolean scaleToWindow to the existing function. Anyway, I would like the GL3 fixes to be finalized so they can also be included in the new Portable ZIP alongside sorlok's array length functions and the new LGM.
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on November 10, 2014, 11:54:01 AM
Quote
Oh, I certainly don't see it. Is there a way to change the zoom level? The default view starts out very zoomed-in.
Zoomed in? You should be able to move with WASD and look around with the mouse. The FOV doesn't really seem wrong in your screenshot, but it's hard to tell.

Quote
I would like to add, Harri, that I am planning on wrapping up LateralGM 1.8.7 soon, after some additional fixes and integration of other new features. I would like to have the OpenGL fixes in so that I can also fix the Direct3D surfaces. But we should decide whether we want to add a draw_set_viewport function that does not scale the viewport and a screen_set_viewport that scales the viewport to screen scaling coordinates, or perhaps just add a boolean scaleToWindow to the existing function. Anyway, I would like the GL3 fixes to be finalized so they can also be included in the new Portable ZIP alongside sorlok's array length functions and the new LGM.
I'm not sure either. I think either way is fine, as long as it works correctly by default. You can then let users call those functions as well, though I'm not sure what use they would really have. In my mind, when you bind a surface it should become the viewport, so you can draw on it like you would on the screen, as if drawing on a monitor of arbitrary size. If a person wants a different viewport or projection, then he should set that AFTER binding the surface. Just like GM.
Title: Re: Massive GL3.3 changes.... again
Post by: Goombert on November 10, 2014, 03:34:38 PM
Quote from: TheExDeus
If a person wants a different viewport or projection, then he should set that AFTER binding the surface. Just like GM.
Actually, that's just it, in GM you can't set the viewport on a surface because the only way to set the view in GM is the built in variables which do not affect surfaces. This is exactly why I added screen_set_viewport to ENIGMA.

So we either need screen_set_viewport(boolean scale), where scale controls whether the scaling options are taken into account (scaling or maintaining aspect ratio); when it is false, the viewport the user provides is passed through unchanged, which would be the case with surfaces. Or we could break it into two functions: screen_set_viewport, which scales the provided viewport to the screen, and draw_set_viewport, which just sets a generic viewport without scaling, for use with surfaces. It's up to you, but this is how I wish you would fix it. Additionally, as I suggested on GitHub, try using a negative viewport to flip the rendering upside down on the surface; if we can do that to flip the rendering on surfaces, then I suggest we also add the function surface_set_viewport.
https://github.com/enigma-dev/enigma-dev/issues/861#issuecomment-62361596
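The aspect-ratio-preserving scaling that screen_set_viewport would do can be sketched as plain math (a minimal sketch, not ENIGMA's actual implementation — the struct and function names here are made up for illustration):

```cpp
#include <algorithm>
#include <cassert>

struct Viewport { int x, y, w, h; };

// Fit a view of size (vw, vh) into a window of size (ww, wh) while
// keeping the aspect ratio: scale by the smaller of the two ratios,
// then center, which produces letterbox/pillarbox bars as needed.
Viewport scale_viewport(int vw, int vh, int ww, int wh) {
    double scale = std::min(double(ww) / vw, double(wh) / vh);
    int w = int(vw * scale);
    int h = int(vh * scale);
    return { (ww - w) / 2, (wh - h) / 2, w, h };
}
```

The unscaled variant (draw_set_viewport, or scale == false) would simply pass the user's rectangle through untouched, which is what surfaces want.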
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on November 17, 2014, 07:37:45 PM
I don't think it can be fixed with glViewport and scissor functions. It requires the matrix to be changed, and that is the problem. I can change the d3d_set_projection_ortho() in surface_set_target() from "d3d_set_projection_ortho(0, 0, surf->width, surf->height, 0);" to "d3d_set_projection_ortho(0, surf->height, surf->width, -surf->height, 0);" and it will work fine. But then whenever the user does anything with the matrices that requires an identity (like d3d_set_projection_scale()), he will have to flip it manually again. So this fixes many cases, but not all. Not sure what to do. I think this is basically the modification I wanted before, but back then we used GL matrices, so we didn't want this solution (glScale(1,-1,1) looked ugly).
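The Y-flip from swapping the top/bottom arguments can be seen in the standard glOrtho-style projection entries (this is the textbook glOrtho math for the Y row, not ENIGMA's exact matrix code):

```cpp
#include <cassert>

// The Y row of a glOrtho-style projection matrix:
//   scale  = 2 / (top - bottom)
//   offset = -(top + bottom) / (top - bottom)
struct OrthoY { double scale, offset; };

OrthoY ortho_y(double bottom, double top) {
    return { 2.0 / (top - bottom), -(top + bottom) / (top - bottom) };
}
```

Passing (0, 0, w, h) gives top = 0, bottom = h, while the flipped call (0, h, w, -h) gives top = h, bottom = 0: the Y scale changes sign, which is exactly the glScale(1,-1,1) effect folded into the projection.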

edit: The good news is that the shadow example here (http://enigma-dev.org/edc/games.php?game=62) actually runs faster now. Previously it was 2285 FPS in GL1; now it's about 2350 FPS in GL1.1 and 2810 FPS in GL3.3 (with all the lights placed in the same positions). So we have improved.
Title: Re: Massive GL3.3 changes.... again
Post by: Goombert on November 24, 2014, 03:24:02 PM
You're right, I tried flipping the views and it doesn't seem to work. Another possible solution is to just transform the texture coordinate matrix when a surface texture is bound; this would make surfaces work as textures in 3D as well as 2D.
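The texture-matrix idea amounts to flipping the V coordinate whenever a surface (FBO) texture is bound, since GL renders framebuffer textures upside down relative to normal images. A minimal sketch of that transform (hypothetical helper, not an existing ENIGMA function):

```cpp
#include <cassert>

// Flip the V texture coordinate: v' = 1 - v. Expressed as a 3x3
// texture-coordinate matrix applied to (u, v, 1), this is
//   | 1  0  0 |
//   | 0 -1  1 |
//   | 0  0  1 |
// which a shader could multiply UVs by whenever a surface texture is bound.
void flip_uv(double& u, double& v) {
    (void)u;       // U is unchanged by the flip
    v = 1.0 - v;
}
```

Applied at bind time, this would make surface textures sample right side up in both 2D and 3D without touching the projection matrix.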
Title: Re: Massive GL3.3 changes.... again
Post by: TheExDeus on November 24, 2014, 05:28:43 PM
In GL3 there is no such thing as a texture matrix right now, and as it is seldom used, I wouldn't really want to add it just for this.