lonewolff
Reply #18 Posted on: October 13, 2014, 08:58:24 pm |
"Guest"
> Also, if a company was focusing its release on console or Windows only, and not other platforms, then why would they use OGL?
If that was the only target platform, they wouldn't.
Goombert
Reply #20 Posted on: October 13, 2014, 10:13:57 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
You're all partially correct, in my opinion as someone who has written code for all of the graphics APIs and their various versions. Direct3D 9 is honestly the best of the Direct3D versions, and is far superior to the GL1 and GL2 specs. Direct3D has software vertex processing and the ability to emulate shader models on cards that do not support new features, which has always been a huge positive for developers; in OpenGL that is really not feasible without duplicating tons of code. GL3 starts going in a new direction and provides more options: you can have both sampler objects and per-texture sampling information, the choice is yours. Direct3D 10 also started going in the lower-level direction, but in the wrong way; DX10/11 require a substantial amount of code to get a basic rendering context. This is expected, but the code for Direct3D 10/11 is much more verbose than it is with OpenGL 3/4. OpenGL "Next Generation" is in the works as well; they're planning on deprecating all old OpenGL versions and starting anew in favor of an API more like Mantle.
I think it's overhyped; they already finished, it's called GLES.
> As to DX12/OGL, I'm not 100% familiar with all the technical aspects, but can you do everything DX12 can in OGL, or are there feature-specific things only supported in DX?
DirectX 12 will really not offer anything that the other APIs do not. Direct3D 9 used to have image loading and could read various formats for you without extensions, but it's going lower level now, so it doesn't have anything super special to it anymore. There are really only a few things that Direct3D 10/11 have that will soon be superseded by the other APIs: software emulation (which will be obsolete and no longer needed) and the illusion of threading (which Mantle has already addressed).
« Last Edit: October 13, 2014, 10:17:18 pm by Robert B Colton »
I think it was Leonardo da Vinci who once said something along the lines of "If you build the robots, they will make games." or something to that effect.
Goombert
Reply #22 Posted on: October 13, 2014, 10:20:59 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
I don't know if I did or not; I am just being objective. Look at the code to load a texture in Direct3D 11 and then realize why ENIGMA does not have a Direct3D 11 backend: http://msdn.microsoft.com/en-us/library/windows/desktop/ff476904%28v=vs.85%29.aspx
lonewolff
Reply #23 Posted on: October 13, 2014, 10:37:57 pm |
"Guest"
> I don't know if I did or not; I am just being objective. Look at the code to load a texture in Direct3D 11 and then realize why ENIGMA does not have a Direct3D 11 backend: http://msdn.microsoft.com/en-us/library/windows/desktop/ff476904%28v=vs.85%29.aspx
It is nowhere near as bad as the example in that link. I have a basic DX11 engine that I created about six months ago; it wasn't that bad to load a texture onto a quad.

[edit] Just dug up some old code. It turns out it is actually a one-liner:

    if (FAILED(D3DX11CreateShaderResourceViewFromFile(d3dDevice, "media\\room_title\\spr_title_0.png", NULL, NULL, &pTexture, NULL)))
Jeez, I didn't realize how far I actually got with writing my own DX11 engine. I was up to the stage of writing pixel-perfect collision detection before I lost interest. LOL, this is a classic: I had full gamepad support also. Man, the things you can create in a week or two when you set your mind to it.
« Last Edit: October 13, 2014, 11:50:26 pm by lonewolff »
TheExDeus
Reply #24 Posted on: October 14, 2014, 04:24:07 am |
Joined: Apr 2008
Posts: 1860
> As to DX12/OGL, I'm not 100% familiar with all the technical aspects, but can you do everything DX12 can in OGL, or are there feature-specific things only supported in DX?
I know of nothing DX can do that GL could not. They really are basically the same, and from the technical aspect they get more similar all the time; for example, GL4 was specifically modified so it would be easier to port from DX. So can you make Crysis 3 in GL? Yes. Assassin's Creed 4? Yes. BF4? Yes. You can create exactly the same thing. It has never really been about features, but programming preference. The DX API used to be very high-level compared to GL, as it had a lot of utility features built in (like loading a texture, loading a 3D model, rendering vectorized text, etc.), while in GL you had to write all of those yourself. That might have changed in DX12, as they are targeting lower level now.
> However, there is a HUGE reason to use DirectX on Windows: performance, support, and tooling.
Performance can be the same, or even faster, in GL. When Valve ported the Source engine to GL, it ran faster than on DX. Performance depends a lot more on the person writing the graphics code than on the API used. "Support" is usually a Stack Overflow thread, in which you can ask about both GL and DX. I don't have any problems with the available GL docs either, both the official Khronos ones and docs.gl. And tools are also getting a lot better for GL: things like apitrace, CodeXL, or Nsight are all awesome and help a great deal. I know that DX tools allow a few things more, but not much.
> DirectX drivers have way more work put into them to make them fast, fix bugs, and make them consistent with other drivers.
That might be true, but GL isn't far behind in this respect. Nvidia and AMD have put a lot of resources in the past few years into making drivers more consistent across different GPUs and different platforms (like Linux, which now has drivers that actually run GL4 just fine).
> It also gets new features more quickly and doesn't have a sluggish committee that can't seem to decide what it wants.
The GL committee was a problem in GL2 times; now both DX and GL usually come out with new versions at about the same time and with the same features. GL also allows extensions, which might be harder to use, but they expose features before DX even allows them. One of the new features in DX12 is programmable blend modes, which both Nvidia and Apple added as extensions to GLES a long time ago; on Nvidia Tegra you have had programmable blend modes since 2006.
> They already finished, it's called GLES.
No, he means the GL "NG" (next generation); it's not GLES. GLES is for embedded systems only (hence the ES), it's not actually meant for desktops. But they have basically merged together in GL4.
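For context on what "programmable blend modes" generalize: the fixed-function pipeline evaluates a fixed blend equation, classically "source over". Here is a minimal sketch of that equation in plain C++ with integer 0-255 channels (blendOver is an illustrative helper of mine, not a GL or DX API call):

    #include <cassert>

    // Fixed-function "source over" alpha blending, the operation that
    // programmable blending generalizes: out = src*a + dst*(1 - a).
    // Channels and alpha are in the 0..255 range.
    unsigned char blendOver(unsigned char src, unsigned char dst, unsigned char alpha) {
        return (unsigned char)((src * alpha + dst * (255 - alpha)) / 255);
    }

    int main() {
        assert(blendOver(200, 100, 255) == 200); // fully opaque: source wins
        assert(blendOver(200, 100, 0)   == 100); // fully transparent: destination wins
        assert(blendOver(200, 100, 128) == 150); // (200*128 + 100*127)/255
        return 0;
    }

With programmable blending, a shader can replace this fixed formula with arbitrary code reading the destination pixel; with fixed-function hardware you can only pick among preset factors and operations.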
« Last Edit: October 14, 2014, 04:25:57 am by TheExDeus »
Rusky
Reply #25 Posted on: October 14, 2014, 08:23:17 am |
Joined: Feb 2008
Posts: 954
Yes, OpenGL is better now than it has been historically. But it's lost out a lot because of that: http://programmers.stackexchange.com/a/88055
As has been said, DirectX is still to this day a lot more consistent than OpenGL, if not with features on your particular card, then on stuff just working everywhere. How many threads do we have here just to debug the GL backends on various cards? DirectX is also already better at multithreading, which admittedly isn't super important for ENIGMA. Also note that OpenGL performance was better when Valve ported Source because they worked with Nvidia to fix OpenGL bugs and optimize the driver. So like I said: support. If your game has a problem, DirectX likely solves it, but OpenGL requires you to be Valve and convince the hardware developers to fix it. Plus, Valve found DirectX's problem in the process. What do you think is the likelihood of Microsoft fixing that once, versus every GPU vendor fixing various problems with OpenGL drivers as they try to get them ready for something like SteamOS?

Edit bonus: the non-deprecated D3D texture loading code:

    ComPtr<ID3D11ShaderResourceView> pShaderResourceView;

    int width, height, bpp;
    unsigned char *data = /* load pixel data */;

    D3D11_TEXTURE2D_DESC td;
    ZeroMemory(&td, sizeof(td));
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    td.SampleDesc.Count = 1;
    td.SampleDesc.Quality = 0;
    td.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    D3D11_SUBRESOURCE_DATA sd;
    sd.pSysMem = data;
    sd.SysMemPitch = (width * 32 + 7) / 8;
    sd.SysMemSlicePitch = sd.SysMemPitch * height;

    ComPtr<ID3D11Texture2D> pTexture;
    dev->CreateTexture2D(&td, &sd, &pTexture);
    dev->CreateShaderResourceView(pTexture.Get(), NULL, &pShaderResourceView);
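A side note on the SysMemPitch arithmetic: (width * 32 + 7) / 8 is the generic bits-to-bytes rounding for one row of pixels, which for 32-bit RGBA reduces to width * 4. A minimal, self-contained sketch (the rowPitch helper name is mine, not part of the D3D11 API):

    #include <cassert>

    // Row pitch in bytes for a tightly packed image: bits per row,
    // rounded up to the next whole byte.
    unsigned rowPitch(unsigned width, unsigned bitsPerPixel) {
        return (width * bitsPerPixel + 7) / 8;
    }

    int main() {
        // 32-bit RGBA: pitch is exactly width * 4 bytes.
        assert(rowPitch(640, 32) == 640 * 4);
        // 24-bit RGB, width 5: 15 bytes, no rounding needed.
        assert(rowPitch(5, 24) == 15);
        // 1-bit mask, width 10: 10 bits round up to 2 bytes.
        assert(rowPitch(10, 1) == 2);
        // Slice pitch for a 2D texture is row pitch times height.
        assert(rowPitch(640, 32) * 480 == 640 * 4 * 480);
        return 0;
    }

The rounding matters for sub-byte or 3-byte formats; for the RGBA8 case in the snippet above it is just a conventional way to write width * 4.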
« Last Edit: October 14, 2014, 08:36:05 am by Rusky »
Darkstar2
Reply #26 Posted on: October 14, 2014, 12:06:45 pm |
Joined: Jan 2014
Posts: 1238
I read some rumours a long time ago that DX12 might be the actual last version of DirectX as we know it before a completely new architecture. What are your opinions on this? I mean, how far can we go? Are we going to end up with 32GB, 64GB, 128GB, 1TB textures? Are we destined for more DXs: DX13, DX14, DX15, etc.? Some say DX12 is what DX11 should have been, so assuming DX12 is the raving success it is claimed to be, this could very well be a version that will "stick" for many years.
There will come a time when your GPU has just as much RAM as your system, or more. Games are getting ridiculously bigger, with far shorter gameplay. It makes you wonder if one day air cooling will be obsolete and liquid cooling a minimum requirement. The new cards coming out have setups that look like huge cooling towers, LOL!
TheExDeus
Reply #27 Posted on: October 14, 2014, 01:14:43 pm |
Joined: Apr 2008
Posts: 1860
> Yes, OpenGL is better now than it has been historically. But it's lost out a lot because of that: http://programmers.stackexchange.com/a/88055
Good read, thanks.
> How many threads do we have here just to debug the GL backends on various cards?
We had the same with DX; it's just that we don't actually use DX, so there are fewer threads.
> Also note that OpenGL performance was better when Valve ported Source because they worked with Nvidia to fix OpenGL bugs and optimize the driver. So like I said: support. If your game has a problem, DirectX likely solves it, but OpenGL requires you to be Valve and convince the hardware developers to fix it.
This is also true for DX. If you have some D3D bug, then I can assure you Microsoft will not go out of their way to fix it for you or for ENIGMA. They will fix it for Valve, Blizzard, or Ubisoft, though. Every large game company has a guy from a GPU manufacturer working in it who helps with optimization, not only of game code but of driver code.
> I read some rumours a long time ago that DX12 might be the actual last version of DirectX as we know it before a completely new architecture. What are your opinions on this? I mean, how far can we go? Are we going to end up with 32GB, 64GB, 128GB, 1TB textures? Are we destined for more DXs: DX13, DX14, DX15, etc.?
That is possible, because as John Carmack said (at QuakeCon 2013), we can already render everything that an artist can imagine. It's only speed and memory that are the limit, not features.
Darkstar2
Reply #28 Posted on: October 14, 2014, 01:36:41 pm |
Joined: Jan 2014
Posts: 1238
So I guess future iterations of graphics cards will be about performance enhancements through optimisations, better chipsets, better memory bandwidth, etc.; basically, we can't expect any major "feature" release per se. That makes sense: probably even with existing last-generation cards you have more than enough to render anything, and it comes down to memory, GPU speed, etc. What do you think about the fact that one day air cooling might not be enough anymore? Have we reached the limit? Now they are cramming jet engines into their cards. I guess it's believable that one day high-end gaming cards will come with liquid cooling as a standard requirement, or they will have to do some miracles and make those MFs significantly smaller and more energy efficient. So I guess it's safe to say that Windows 11 won't have any new DX version.
TheExDeus
Reply #29 Posted on: October 14, 2014, 04:21:21 pm |
Joined: Apr 2008
Posts: 1860
> So I guess future iterations of graphics cards will be about performance enhancements through optimisations, better chipsets, better memory bandwidth, etc.; basically, we can't expect any major "feature" release per se. That makes sense: probably even with existing last-generation cards you have more than enough to render anything, and it comes down to memory, GPU speed, etc.
To reiterate: the only thing people needed was for the GPU to be programmable. That is it; that is the only "feature" that is required. Like CPUs, GPUs don't get any new "features", they just keep getting faster. If you gave a brush to an artist, would he be able to draw less than with Photoshop? Probably not; he would just draw more slowly. So does it require DX12 to render a 100% realistic human head? No, even DX9 can do it. Software renderers can do it, so even GM can do it.
> What do you think about the fact that one day air cooling might not be enough anymore? Have we reached the limit? Now they are cramming jet engines into their cards. I guess it's believable that one day high-end gaming cards will come with liquid cooling as a standard requirement, or they will have to do some miracles and make those MFs significantly smaller and more energy efficient.
Newer GPUs get more and more power efficient, so it's possible that current cooling will hold for some time; in the future you might not even need a fan. The smaller the chip process gets, the smaller the power consumption, and smaller power consumption means less heat.
« Last Edit: October 14, 2014, 04:23:21 pm by TheExDeus »