Author Topic: OpenGL chosen by system (?)
fervi
Posted on: July 18, 2013, 02:08:06 PM
Hello!

(Sorry for my bad English. :P)

If you use ENIGMA, you can read about two versions of OpenGL: OGL1 for old computers and OGL3 for new computers. The description of OGL1 says it is not compatible with new computers, and vice versa. If that is true, maybe somebody could write autodetection so the game runs with whichever version the machine supports?

My algorithm:

check the OGL version
if (OGL_version > 2)
{
    run_ogl3
}
else
{
    run_ogl1
}

Fervi
Josh @ Dreamland
Reply #1 Posted on: July 18, 2013, 02:14:28 PM
This has already been added to the new parser, in the form of preprocessors.

Code: (EDL)
{{if ENIGMA_Graphics == "OpenGL1"}}
  opengl1_do_stuff();
{{else}}
  opengl3_do_stuff();
  opengl3_special_stuff();
{{endif}}
"That is the single most cryptic piece of code I have ever seen." -Master PobbleWobble
"I disapprove of what you say, but I will defend to the death your right to say it." -Evelyn Beatrice Hall, Friends of Voltaire
TheExDeus
Reply #2 Posted on: July 19, 2013, 01:13:37 AM
I don't think it's vice versa. I haven't seen a GL1 feature that doesn't work on newer cards. That's the thing with software design and standards: they almost always have to be backwards compatible (or as much as possible). So while most GL1 functions are obsolete, they are and will remain supported. Maybe that happens only at the driver level, though (the driver emulates the fixed-pipeline stuff with newer features like shaders).

Josh, isn't that just at compile time? I think fervi wants us to compile with both GL1 and GL3 and then switch at runtime. I don't believe we need to do that. Practically any PC (even with Intel cards, as sidegame has made us aware) supports GL3 in sufficient capacity. There has to be a point where compatibility is just too expensive in terms of the time and effort required. I think our GL3 doesn't work on maybe 1% of all cards (which, I guess, fervi and poly happen to own).
Josh @ Dreamland
Reply #3 Posted on: July 19, 2013, 08:02:06 AM
You'd be surprised. The graphics pipeline has changed; GL1 was designed around the fixed-function pipeline. DirectX changes so much from version to version that it's impossible to salvage old code. OpenGL has remained "compatible", but a lot of what you do is emulated software-side. For example, pr_linelist, pr_linestrip, pr_trianglelist, and pr_trianglestrip are all that's left hardware-side, and the strips are on their way out. It's getting to where your only choice is between lines and triangles.
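
As a rough sketch of that difference in C++ (not ENIGMA code; it assumes a GL loader such as GLEW, an existing context, and a VAO already filled with the rectangle's vertices): where GL1 accepted quads directly in immediate mode, a core GL3 context has no GL_QUADS at all, so the same rectangle has to be submitted as two triangles.

Code: (C++)
#include <GL/glew.h>  // any GL loader; a context is assumed to exist

// GL1 immediate mode: quads are a primitive the driver accepts directly.
void draw_rect_gl1(float x1, float y1, float x2, float y2)
{
    glBegin(GL_QUADS);
    glVertex2f(x1, y1); glVertex2f(x2, y1);
    glVertex2f(x2, y2); glVertex2f(x1, y2);
    glEnd();
}

// GL3 core profile: GL_QUADS is gone. The rectangle sits in a buffer
// (referenced by 'vao') as two triangles, and a shader program is mandatory.
void draw_rect_gl3(GLuint program, GLuint vao)
{
    glUseProgram(program);
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 6);  // 2 triangles = 6 vertices
}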

Anyway, reading over his request again, I think you're right. I don't have any intention of supporting that at present, as it removes the possibility of link-time optimization (specifically, inlining graphics calls). Compared to the actual work done processor-side, that is small, but it still aggravates me. It also bloats the game by having two of the same thing in it, although the amount of duplication has been improving lately. In the event I get around to actually making a game in ENIGMA, my inclination will instead be to release two versions. That is assuming I don't either (1) determine that the game doesn't need mind-blowing graphics and GL1 is fine, or (2) decide that if your shit doesn't support GL3, it has no business attempting to run my game.
TheExDeus
Reply #4 Posted on: July 19, 2013, 08:17:45 AM
Quote
DirectX changes so much as to be impossible to salvage old code, from version to version.
The code does change, and so do the SDKs and libraries needed to compile it. My thought, though, was that an already compiled game will probably keep working. That's why HL1 from 1998, with its D3D renderer using DirectX 6 or so, still runs on modern hardware even without patches.

Quote
OpenGL has remained "compatible", but a lot of what you do is emulated software-side.
But that means it will still run, so a GL1 game still runs on a newer PC. Maybe many functions are handled software-side, but that is true for GL3 anyway. For example, we don't have any transformation functions any more; we have to take care of the matrices ourselves. And while I believe the transformation functions were hardware-based at some point, now they are purely software. So the functionality that went from hardware to software (GL1 to GL3) can't simply be replaced with something purely in hardware. Now everything is done in shaders, even simple things like rotations or translations. If you want to move a box in 3D, you have to write a vertex shader that does it, and if you want to see the box, you need to write a pixel shader that does it. Nothing comes included.
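
A hedged C++ sketch of that difference (assuming a GL loader and an already compiled and linked shader 'program' whose vertex shader reads an 'mvp' uniform; the function names are illustrative, not existing ENIGMA code):

Code: (C++)
#include <GL/glew.h>  // any GL loader; a context is assumed to exist

// GL1: the fixed-function matrix stack does the transform math for you.
void place_box_gl1(float x, float y, float z, float angle)
{
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(x, y, z);
    glRotatef(angle, 0.0f, 0.0f, 1.0f);
    // ...draw the box here...
}

// GL3 core: no matrix stack. You build the 4x4 model-view-projection
// matrix on the CPU (e.g. with a math library) and upload it to a vertex
// shader that does something like: gl_Position = mvp * vec4(position, 1.0);
void place_box_gl3(GLuint program, const float *mvp /* 16 floats */)
{
    glUseProgram(program);
    GLint loc = glGetUniformLocation(program, "mvp");
    glUniformMatrix4fv(loc, 1, GL_FALSE, mvp);
    // ...bind the VAO and issue the draw call here...
}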
Josh @ Dreamland
Reply #5 Posted on: July 19, 2013, 08:19:23 AM
No, I promise that eventually, companies will cease to bundle compatibility drivers with their shit. I don't know when that will be, but when it happens, GL1 and DX9 applications will cease to function.

Regardless, offering the option to build for both is the best we can (or at the very least should) do.
TheExDeus
Reply #6 Posted on: July 19, 2013, 09:56:40 AM
When that happens, then of course it will not be practical to support it, because it really isn't practical to support GL1 even now. For a desktop PC you can buy a GL3-capable card for $100, or a used one for $50 or less; there really is no reason not to get one. On laptops, sadly, changing the GPU is usually a lot harder, or even impossible. But then again, a laptop (especially a crappy one) is in no way meant for gaming, and if it is at most 3-4 years old it should support GL3 just fine, even with crappy Intel cards. You can probably buy a 3-year-old laptop for $100 as well. So if a person like poly wants to maintain GL1 for himself, that's okay, but ENIGMA as a project shouldn't really focus on that kind of compatibility.
Goombert
Reply #7 Posted on: July 20, 2013, 08:52:55 AM
This will be possible when I do the OGRE-powered graphics system. OGRE provides an excellent level of abstraction; that's kind of its purpose. This is also one thing I love about ENIGMA, and one of its greatest features.
I think it was Leonardo da Vinci who once said something along the lines of "If you build the robots, they will make games." or something to that effect.
