sorlok_reaves
Posted on: January 15, 2014, 07:10:34 pm |
Joined: Dec 2013
Posts: 260
Hey all, I've found a really nasty bug that corrupts any multi-sub-image texture when it's swapped out and reclaimed. Observe: the first pic shows what happens on master; the second pic shows the correct behavior.

The patch is here: https://github.com/sorlok/enigma-dev/compare/texturecache_fix
And my test game: https://drive.google.com/file/d/0B1P7NepPcOslbDlZanZNLUg2WDQ/edit?usp=sharing
Edit: Second test game, same glitch, no sub-images: https://drive.google.com/file/d/0B1P7NepPcOslQVNBYjE1SlNKV0k/edit?usp=sharing

However, I really need someone to test this on other backends. In particular:
- DirectX9 -- I cannot, for the life of me, get ENIGMA to work properly from SVN on Windows.
- OpenGL3 -- Everything compiles and runs, but I get no textures whatsoever (even on master).

(I've confirmed that my patch works fine on OpenGL1.) Once I get confirmed tests for these two platforms, I'll issue a pull request. I would really appreciate your help with this; platform-specific bugs are a real pain, and this one affects everyone. Thanks~
« Last Edit: January 17, 2014, 01:57:39 am by sorlok_reaves »
Goombert
Reply #1 Posted on: January 16, 2014, 05:45:35 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
Please see my comments on the GitHub issue tracker: https://github.com/enigma-dev/enigma-dev/issues/615

"DirectX9 -- I cannot, for the life of me, manage to get Enigma to work properly from SVN on Windows."

To work with DirectX as a developer you need to install the DirectX SDK. There are a few catches, though: Microsoft's code is only compatible with MSVC, so you need DirectX headers made specifically for MinGW. Those ship in our ENIGMA redistributable; the only other way to obtain them is with MinGW builds that include the WINE team's DirectX headers.

"OpenGL3 -- Everything compiles and runs, but I get no textures whatsoever (even on master)."

Interesting, are you on Linux? Linux is already known not to draw with OpenGL 3 for me and one other user, as a result of a change I made in my model class, most likely related to the IEEE float specification. Can you also please test some other very basic games to see if they draw for you with OGL3? These are the two things I need to know in order to get OGL3 working for everyone again.
I think it was Leonardo da Vinci who once said something along the lines of "If you build the robots, they will make games." or something to that effect.
sorlok_reaves
Reply #2 Posted on: January 16, 2014, 05:58:27 pm |
Joined: Dec 2013
Posts: 260
"Please see my comments on the GitHub issue tracker. https://github.com/enigma-dev/enigma-dev/issues/615"

Thanks, I've added a reply.

"To work with DirectX as a developer you need to install the DX SDK ... you need specific headers for DX that are made for MinGW which is in our ENIGMA redistributable ..."

Ah, that explains a lot! Now that I know where to look, this shouldn't be too difficult. Thanks for clearing that up.

"Interesting, are you on Linux? ... These are the two things I need to know in order to get OGL3 working for everyone again."

Yep, running under Linux. Several other "test" programs I wrote don't work with GL3, but I haven't tried to narrow down the problem (since I thought it was only affecting me). Since it's a known bug in ENIGMA, I'll track down more info on the GL3 bug, and I'll try some more well-known games.
Goombert
Reply #3 Posted on: January 16, 2014, 06:51:30 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
I am a tad busy at the moment, but...

"Yep, running under Linux. Several other "test" programs I wrote don't work with GL3, but I haven't tried to narrow down the problem (since I thought it was only affecting me). Since it's a known bug in ENIGMA, I'll track down more info on the GL3 bug, and I'll try some more well-known games."

Try a simple draw_text call. I believe the issue lies in my changes to this file: https://github.com/enigma-dev/enigma-dev/blob/master/ENIGMAsystem/SHELL/Graphics_Systems/OpenGL3/GL3model.cpp

It's a result of me switching to a union, because IEEE does not guarantee a float will always be 4 bytes; the purpose of the union is to allow storing a color as 4 bytes. If you could fix that bug, you would be my hero. I can only test from Windows right now, and I was unable to fix it in my Ubuntu VM, as was Greg on his actual distro.
sorlok_reaves
Reply #6 Posted on: January 17, 2014, 01:32:15 am |
Joined: Dec 2013
Posts: 260
Heh, either/or, actually. First, your hunch is a very good one; doing this fixes everything:

typedef gs_scalar VertexElement;

Isn't what you're trying to accomplish the whole point of GLfloat? From the specification: "A four-byte precision IEEE 754-1985 floating point variable." ...and it's supported on all versions of OpenGL (Linux just typedefs it to float). For Windows+DirectX, just use gs_scalar (or whatever), since you know the platform. Any concerns with just doing this and calling it a day?

typedef GLfloat VertexElement;
Goombert
Reply #8 Posted on: January 17, 2014, 11:20:51 am |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
"Any concerns with just doing this and calling it a day?"

We can't do that, because a color element always has to be exactly 4 bytes, and you can't safely pack an RGBA color into a float because of the IEEE specification. All that needs to change, I think, is the sizeof(gs_scalar) calls, but I am not exactly sure what to change them to.
sorlok_reaves
Reply #9 Posted on: January 17, 2014, 12:19:36 pm |
Joined: Dec 2013
Posts: 260
"We can't do that because, a color element always has to be 4 bytes. You can't pack an RGBA color into a float because of the IEEE specification. All that needs done I think is the sizeof(gs_scalar) calls changed, but I am not exactly sure what to change them to."

But a GLfloat is always guaranteed to be at least 4 bytes. I think the whole point of the GL types is to ensure you have enough bytes to store things like RGBA.
« Last Edit: January 17, 2014, 12:24:39 pm by sorlok_reaves »
Goombert
Reply #10 Posted on: January 17, 2014, 02:36:16 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
I just tested that, and the color components do not come out correctly on the cubes demo with GLfloat. Try copying and pasting this code into GL3ModelStruct.cpp and see if it works: http://pastebin.com/8BrQwT8z

It works for me; if it works for you, then that is the fix we need.
« Last Edit: January 17, 2014, 02:56:58 pm by Robert B Colton »
Goombert
Reply #12 Posted on: January 17, 2014, 03:52:25 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
If the earlier paste does not work (http://pastebin.com/53CMPb6f), please try this paste in the same file: http://pastebin.com/3sWsAYbK

I need to know whether each of them works or not. This is the single biggest thing stopping me from continuing my work on OpenGL3. The next step is to replace the FFP and do custom matrix math; then it should be mainly a copy-and-paste effort into GLES, and we could probably get Android working again. This is what I've been trying to do this whole time.
« Last Edit: January 17, 2014, 07:09:30 pm by Robert B Colton »
sorlok_reaves
Reply #13 Posted on: January 17, 2014, 07:22:47 pm |
Joined: Dec 2013
Posts: 260
"If this earlier paste does not work. http://pastebin.com/53CMPb6f Please try this paste in the same file. http://pastebin.com/3sWsAYbK I need to know whether each of them works or not."

Neither paste works. I've tried drawing text, sprites, and primitives, and all that shows is the background color. Is your cubes demo available somewhere? I might have more luck debugging the incorrect colors of GLfloat than the "can't see anything" that I'm getting now.
Goombert
Reply #14 Posted on: January 17, 2014, 07:29:33 pm |
Location: Cappuccino, CA Joined: Jan 2013
Posts: 2993
That's not actually my cubes demo, lol; it's the GM Studio one that I use to show off how much faster ENIGMA renders it: https://www.dropbox.com/s/ro00kob8723vlsb/CUBES.zip

Also, please try this file: http://pastebin.com/AeDrDsVF

I doubt you'll be able to pack an RGBA 4-byte color into a GLfloat (I already tried extensively), but be my guest. :\