This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.
631
Issues Help Desk / Re: Constants don't work?
« on: June 21, 2014, 01:54:18 pm »
When will you finish it and merge it? :|
632
Issues Help Desk / Re: Constants don't work?
« on: June 21, 2014, 08:59:30 am »
Quote
It should work. Don't ask me why, I didn't make the parser!
The code in "Definitions" is not parsed - it needs to be valid C++, and in valid C++ you need that semicolon at the end (it doesn't matter whether the definition is on one line or several). So it's a typo in the wiki and someone should fix it.
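For example, a minimal sketch of what a valid Definitions entry might look like (the constant names are made up for illustration):
Code: (C++)
// Definitions code goes straight to the C++ compiler, so every definition needs its terminating semicolon.
const int MY_MAX_HEALTH = 100;  // fine on one line
const double
    MY_GRAVITY = 0.5;           // also fine split across several lines, as long as the ';' is there
Since the parser never sees this block, a missing semicolon only shows up later as a C++ compile error.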
633
General ENIGMA / Re: GL3 lights fixes
« on: June 21, 2014, 08:24:26 am »
Quote
This problem could just be solved by either using OGL 1 for GM compatibility, or using OGL 3 with pixel shaders.
But do you get that second problem in GL3 with pixel shaders? Because GL3 always uses pixel shaders. I am not sure why you get those awful-looking lights, though. On my AMD and Nvidia cards I don't see any real difference between GL1 and GL3.
Quote
The lighting is awful. It's either a difference in video cards, or just a difference between the OGL1 and OGL3 drivers.
The drivers are the same for GL1 and GL3, and in GL3 there should be no difference between cards. That is the good thing about shaders - in theory they look exactly the same on all hardware (when coded properly), because the developer writes all the math; it would be like "2+2" giving different results on different platforms. So this could be a problem in my code in GL3shaders.cpp, but it could also be an issue with the normals - maybe they are loaded incorrectly? Then again, you use Win8/Win8.1 like me, so we should be seeing the same thing. Maybe the model is changed or corrupt?
Quote
That's what I keep trying to say Harri. At any rate, I am not for dumping GM compatibility altogether but having an ENIGMA spawn off project.
I don't want to make two projects. I just think we should stop worrying about GM compatibility and add our own functions. Most things will still be compatible; we will just have more functions which do different things. For example, we could make deferred rendering the default, and that shouldn't really break code compatibility - it will just look different on screen, but it looks slightly different now anyway. We could also make a proper shader/material editor, and even though GM would not be able to load those resources, we could save them in .egm and live with that.
edit: Alright, I will merge this, because it at least seems to work better than before, even if it's not perfect.
634
General ENIGMA / Re: GL3 lights fixes
« on: June 20, 2014, 07:55:25 pm »
Quote
It looks fine on my machine (GeForce GTX 760, Linux, OpenGL 3). Unfortunately, I do not have a Radeon machine to test this on.
That is good news.
Quote
Your lighting demo is just a black screen for me. I tried pressing del/insert, moving the mouse, and using WASD (in case the camera was off), but it's always pure black.
That is bad news. A few things to try:
1) Try commenting out the glsl_program_default_set() and glsl_program_reset() lines (the first six lines in the Draw event of the single object - see the sketch after this list). Then run in GL1.1, then run in GL3, and see if you get the same lights.
2) Try downloading the egm again. I changed it so you start in the middle of the lights (previously the starting position had the player turned away, so you only saw a black screen at first when not using mouselook).
3) Try looking at the console output of the program - it prints information about the shaders at startup.
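For step 1, a rough sketch of what I mean (the shader name is the one from the demo; adjust to whatever the object's Draw event actually contains):
Code: (EDL)
// Comment these out so GL3 falls back to its default FFP-emulation shader:
//glsl_program_default_set(shr_per_pixel);
//glsl_program_reset();
// ...leave the rest of the Draw event untouched.
If GL1.1 and GL3 then show the same lights, the problem is presumably in the custom per-pixel shader rather than in the default one.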
Quote
The logo shown at the beginning of the game looks hideously ugly, but everything else looks fine.
What exactly is ugly about it? Does it look the same as in GL1.1? That is what we are trying to replicate, so GL3 won't look any better by itself. Although I do plan to make a topic about this - maybe it's time to ditch GM and its rendering and go for something better? It would break compatibility slightly (especially for 3D games), but it would open a lot of doors for improvements.
635
General ENIGMA / Re: GL3 lights fixes
« on: June 20, 2014, 05:14:50 pm »
Quote
* OGL3 is marginally slower
Yes, for me it's also 1200FPS in GL1 vs 1140FPS in GL3. But some things to consider:
1) I had to add that stupid hack to make AMD work, so it now possibly sends useless data (I lost about 30FPS there on my Nvidia card).
2) I doubt it's easy to get a custom FFP emulation as optimized as what the hardware and driver designers achieved for GL1. If you only want GL3 to do the same as GL1, then it will probably be slower. The idea is that it should do more - per-pixel lights, shadows, deferred shading and so on - things that GL1 cannot do.
3) There are of course more optimizations to do, but I don't think they would increase the speed that much. One optimization I still have to finish is caching - the only uniforms not yet cached are the matrices. That would remove some useless GPU calls, but since uniform updates are among the cheapest things you can do on the GPU, I don't think the impact would be large.
Quote
* OGL3 lights appear darker
I used the same algorithms the GL1 spec defines, but sadly the hardware vendors implemented them differently - GL1.1 already looks different on Nvidia and AMD. The difference is negligible and unnoticeable when not comparing side by side. That is a good reason to write your own shaders: GL3 should now look identical on both Nvidia and AMD.
If it works for at least one other person, then I will merge it.
636
General ENIGMA / GL3 lights fixes
« on: June 20, 2014, 07:58:26 am »
GL3 light fixes
So I implemented lights in GL3 some time ago, but I didn't put it in the master branch because AMD fails. It worked fine and dandy on Nvidia, but on AMD it didn't render models with lights for some reason. After days of testing I still haven't figured out what the problem is, but I did find a workaround: I send the color data to the GPU even when it isn't used. This reduces FPS slightly (from 1170 to 1150 on Nvidia), but now it at least works on AMD. Additional investigation should be done into the matter, but at least we can have lights now.
This is how it looks on AMD with GL1 and GL3:
You can see that they are not pixel-identical, but they are very close - about the difference you might expect between different card vendors when using the FFP (GL1). But at least AMD draws everything now.
Default shader override
The default shader (per-vertex lighting which emulates the FFP) is quite bad and basic. That is the idea for now, as we want both systems to stay compatible. But if you want to make something that actually looks good, you will probably have to use your own shaders. This has of course always been possible, but until now you couldn't override what the "default shader" actually is. So your code looked like this:
Code: (EDL)
glsl_program_set(shr_my_cool_effect_shader);
//Draw the cool effect
glsl_program_set(shr_my_default_shader);
//Draw normally with a custom default shader
instead of this:
Code: (EDL)
//Create
glsl_program_default_set(shr_my_default_shader);
//Draw
glsl_program_set(shr_my_cool_effect_shader);
//Draw the cool effect
glsl_program_reset();
//Draw normally with a custom default shader
It is not a big change, but the reason I point this out is that we could now theoretically allow extensions which override this shader automatically. Imagine a "Per-pixel lighting" extension: you enable it and it automatically sets the shader - the extension doesn't even need to have any functions. Sadly, extensions still can't modify or inject code into events, so right now this isn't possible without a function call, but it might be in the future. Here is an example where I switch between shaders with the keyboard:
Code:
if (keyboard_check_pressed(vk_insert)){
    glsl_program_default_set(shr_per_pixel);
    glsl_program_reset();
}else if (keyboard_check_pressed(vk_delete)){
    glsl_program_default_reset();
    glsl_program_reset();
}
The glsl_program_reset() call is required because the _default_ functions don't actually bind the program - they just set which one is the default.
Faster speed
I can't be sure about AMD as it was broken before, but my Nvidia implementation works a lot faster than before. The current master is actually very broken - I get about 250FPS on it. After fixing the lights I got it up to 1090FPS (which is what it was before, and what GL1 gives), and with some additional caching I now get about 1180FPS, so there is an improvement. Others should see it too.
What I need from you
Please test these changes by downloading this branch: https://github.com/enigma-dev/enigma-dev/commits/GL3-cleanup
AMD users especially should test. Try Project Mario as well as other examples and games. You can also test this: https://www.dropbox.com/s/r17cwch4962voi8/3D_Lights_test.egm - it's the lights demo where you can see the default FFP emulation and the sample per-pixel lighting. You can switch between them using Delete and Insert, look around with the mouse and move with WASD.
It would also be useful to know FPS in examples like Project Mario before (the Master branch) and after (this one).
If everything is okay and nothing else breaks, then the branch could be merged.
637
Programming Help / Re: Project Extentions
« on: June 20, 2014, 03:26:45 am »
Quote
Actually it does convert images to png, whether you use jpg, bmp, gif, etc. That's what I noticed when I opened an EGM using WinRAR.
True. So yes, it changes the format, but it changes it to lossless PNG, so my point is still valid - you won't get any generational degradation.
638
Programming Help / Re: Project Extentions
« on: June 19, 2014, 05:26:33 pm »
Quote
When you say compressed does it compress everything? Meaning if I use uncompressed resources in my games, will saving to EGM cause recompression each time? That sucks major balls. Or is it just using non-lossy compression, or what? I know that EGM files can be opened with 7z, WinRAR, etc.
It compresses the files, not the format. So no, it's not lossy and doesn't degrade your images or sounds. It's a regular .zip.
I know it does compress images, but it is my understanding that it uses PNG. So does this mean each time I save it recompresses a generation on images, sounds, etc.?
639
Programming Help / Re: Project Extentions
« on: June 19, 2014, 12:18:55 pm »
I use gmk (GM version 8) and egm. Now I mostly use EGM as it has stabilized a little.
640
General ENIGMA / Re: Timelines won't update correctly after saving.
« on: June 17, 2014, 01:54:32 pm »
I remember the cool ones which allowed you to customize them, like putting blocks of rubber with letters and symbols on the bottom. Then there are the ones which let you select dates and numbers on the top, but those are less customizable. I haven't really seen a stamp in a few years though.
641
General ENIGMA / Re: Timelines won't update correctly after saving.
« on: June 17, 2014, 01:38:44 pm »
"Rubber" in "Rubber stamp" refers to the STAMP, i.e., the bottom that makes the image (http://en.wikipedia.org/wiki/File:MODOInkStamp.jpg). http://en.wikipedia.org/wiki/Rubber_stamp
642
Issues Help Desk / Re: Josh! Relieve my penis!!
« on: June 17, 2014, 09:16:28 am »
Quote
I shouldn't have to change the display resolution to make the game scale right.
You don't have to change the resolution. You have to set the port correctly.
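For example, a minimal sketch of what I mean (the numbers are just an illustration) - the view decides which part of the room is shown, the port decides how large it is drawn in the window, so you scale by making the port bigger than the view:
Code: (EDL)
view_enabled = true;
view_visible[0] = true;
view_wview[0] = 320; view_hview[0] = 240; // region of the room the view captures
view_wport[0] = 640; view_hport[0] = 480; // size it is stretched to in the window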
643
General ENIGMA / Re: Timelines won't update correctly after saving.
« on: June 17, 2014, 09:02:02 am »
Rly guy? It's a freaking rubber stamp (http://en.wikipedia.org/wiki/Rubber_stamp).
644
Works in Progress / Re: Project Mario
« on: June 16, 2014, 01:16:34 pm »
Yeah, I am fixing GL3 right now. I fixed lights in this branch (https://github.com/enigma-dev/enigma-dev/commits/GL3-cleanup), but AMD keeps crashing on it. So I need to fix that.
645
Issues Help Desk / Re: Josh! Relieve my penis!!
« on: June 15, 2014, 12:40:15 pm »
Why doesn't it return a "function not found" error instead of a syntax error?