So SDL hasn't created a core profile since #1685, at least on Windows. It's actually a bug in SDL, because there is no way to create a core context in SDL without specifying a version number.
We can see that Win32 simply requests a core profile with the default version. This correctly gives us a core profile of the latest version (core 4.5 for me).
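Roughly, the Win32 path looks like this (a hedged sketch, not the exact code; it assumes `hdc` is a valid device context and that `wglCreateContextAttribsARB` was already fetched with `wglGetProcAddress` on a temporary legacy context, as WGL_ARB_create_context requires):

```c
/* Sketch: request only the core profile bit and leave the version
 * attributes at their WGL defaults. Assumes wglCreateContextAttribsARB
 * was obtained via wglGetProcAddress on a temporary context. */
const int attribs[] = {
    WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    0 /* attribute list terminator */
};
HGLRC rc = wglCreateContextAttribsARB(hdc, NULL, attribs);
```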
I made SDL attempt to do the same, but it does not work (a compatibility 4.5 context is returned for me).
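The SDL-side attempt looks something like this (a fragment for illustration; only the profile attribute is set, no version, mirroring the Win32 path above):

```c
/* Sketch of the SDL attempt: set only the profile mask and leave the
 * version attributes untouched. */
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
SDL_Window *win = SDL_CreateWindow("test", SDL_WINDOWPOS_CENTERED,
                                   SDL_WINDOWPOS_CENTERED, 640, 480,
                                   SDL_WINDOW_OPENGL);
SDL_GLContext ctx = SDL_GL_CreateContext(win);
/* On my machine this reports a 4.5 *compatibility* context. */
printf("%s\n", (const char *)glGetString(GL_VERSION));
```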
We cannot tell SDL the version, as it wants, for several reasons:
- You can't query GL_MAJOR_VERSION without a context.
- GL_MAJOR_VERSION wasn't added until GL 3.0.
- We can't ask GLEW for the version number without initializing GLEW (which itself requires a context).
Basically, I think this bug just needs to be reported to the SDL developers. I also think it came about because the SDL developers misinterpreted the Khronos documentation:
> If the core profile is requested, then the context returned cannot
> implement functionality defined only by the compatibility profile.
https://www.khronos.org/registry/OpenGL/extensions/ARB/WGL_ARB_create_context.txt

Just to reiterate, I only want to pass it the correct version number in GL3+.
Right, which is not really possible, because we can't get that version number easily. Now, I can tell you one workaround we could do, but it sucks: you create a dummy SDL window and GL context, ask it the version number of the compatibility context, destroy it, and then use that version number to create the real context with a core profile. This is somewhat acceptable, and a lot of people do this to create core profiles. It just annoys me, because that shouldn't be necessary simply to create a "latest version core" context, and it wouldn't be if SDL did not have the bug documented above.
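A minimal sketch of that workaround, assuming SDL2 with a system GL library linked in (opengl32 on Windows); the window titles, sizes, and omitted error checks are just for illustration:

```c
#include <SDL.h>
#include <SDL_opengl.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Step 1: throwaway window/context with default attributes, just to
     * ask the driver which version its compatibility context exposes. */
    SDL_Window *dummy = SDL_CreateWindow("dummy", 0, 0, 1, 1,
                                         SDL_WINDOW_OPENGL | SDL_WINDOW_HIDDEN);
    SDL_GLContext dummy_ctx = SDL_GL_CreateContext(dummy);

    /* glGetString comes straight from the GL library, so no loader (GLEW)
     * is needed yet; the version string begins with "major.minor". */
    int major = 0, minor = 0;
    sscanf((const char *)glGetString(GL_VERSION), "%d.%d", &major, &minor);

    SDL_GL_DeleteContext(dummy_ctx);
    SDL_DestroyWindow(dummy);

    /* Step 2: request a core profile of exactly that version. */
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, major);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, minor);

    SDL_Window *win = SDL_CreateWindow("real", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 480,
                                       SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    printf("Got: %s\n", (const char *)glGetString(GL_VERSION));

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```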
You keep avoiding this so strongly that I can only assume some poor behavior results when you pass GL_MINOR_VERSION to the utility, but you haven't told me what it is. You just keep bringing up irrelevant problems, like "GL_MAJOR_VERSION isn't declared before GL3" (I don't care, because we support GL3+ and GL1) and that we want to use 1.1 in GL1.
Now you want to tell it 4.5 (by default) instead of GL_MAJOR_VERSION, even though you've already told me that specifying the default has caused problems.
At best, we're telling it a number higher than what our headers support (GL_MAJOR_VERSION), then trimming it down to what the card supports. The card already knows what it supports. That's why the context has a compatibility mode. It's going to have the same compatibility mode regardless of what higher value you supply.
So the only question is whether GL_MAJOR_VERSION will ever be LESS than what we can actually support, and if so, whether we can actually get any benefit out of using a higher version (and I suspect not).
I can't figure out why you're fighting using those macros so hard, so I effectively give up.
There is no macro: GL_MAJOR_VERSION is a GLenum, and it's passed to glGetIntegerv, which isn't loaded until you have a context.
> data returns one value, the major version number of the OpenGL API supported by the current context.
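In other words, the version can only be queried at runtime, once a context is current. A minimal sketch, assuming a current GL 3.0+ context (these enums were only added in 3.0):

```c
/* Query the context's version at runtime; these are GL calls, not
 * compile-time macros, so a current 3.0+ context is required. */
GLint major = 0, minor = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);
glGetIntegerv(GL_MINOR_VERSION, &minor);
```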
In my MSYS2 GL/glew.h:

```c
#define GL_MAJOR_VERSION 0x821B
#define GL_MINOR_VERSION 0x821C
```
Also, 4.5 is not the default; the default in WGL (and what SDL should use) is major version 1, minor version 0.