This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.
616
Tips, Tutorials, Examples / Re: Multitextured Terrain Example
« on: July 04, 2014, 03:32:33 pm »
Quote
I wonder exactly how Studio handles backwards compatibility of shaders.
I don't think they have such a thing. They only added shaders in GM:S, when they added GLES, so all their shaders are just like ours in GL3. They never allowed GL1.1 shaders. So we do support all their shaders (when properly written), and I have tried some of them.
Added shading to terrain and per-pixel lights:
617
Programming Help / Re: OGL Texture interpolation and transparent textures
« on: July 04, 2014, 10:20:41 am »
Josh, that function was added in GM8.1.139. I don't remember having it in GM6. I also think there was a function added to actually change the z-buffer operator.
edit: draw_set_alpha()? If you create models by adding your own vertices (draw_vertex(...)), then just use the one with color.
618
Tips, Tutorials, Examples / Re: Multitextured Terrain Example
« on: July 04, 2014, 09:21:55 am »
Note: This is a reply to a message Robert sent me, but I post it here as it should be clarified for everyone.
Quote
I wanted to let you know that this terrain demo I made a while back works in OGL1 but still does not draw anything in OGL3 for some reason, it uses custom shaders because the terrain is multitextured. All it draws is the text.
The reason it doesn't render anything in GL3 is that GLSL is not compatible between GL1 and GL3, and it never will be. So if you make something that uses shaders, you must choose either the GL1 or the GL3 graphics system, as it won't work for both. Another option is to write two versions of each shader and load the right one at runtime. The specific reason this example doesn't render in GL3 is that the vertex shader looks like this:
Code: [Select]
varying float height;
void main() {
    gl_TexCoord[0] = gl_MultiTexCoord0;
    // Set the position of the current vertex
    gl_Position = ftransform();
    height = gl_Vertex.z;
}
The deprecated parts are these: "varying" is no longer used, "gl_TexCoord[0]" and "gl_MultiTexCoord0" are no longer defined, and "ftransform()" and "gl_Vertex" are gone. So basically 90% of that shader is deprecated. The reason we can't make it backwards compatible is that built-in attributes like "gl_Vertex" require a special function (glVertexPointer) on the C++ side to work, while in GL3 position is just a generic attribute. So we cannot use the GL3 attribute function to set gl_Vertex. The correct shader is something like this:
Code: [Select]
out float v_height;
in vec3 in_Position;
in vec2 in_TextureCoord;
out vec2 v_TextureCoord;
void main() {
    v_TextureCoord = in_TextureCoord;
    // Set the position of the current vertex
    gl_Position = modelViewProjectionMatrix * vec4(in_Position,1.0);
    v_height = in_Position.z;
}
Attributes like "in_Position" and "in_TextureCoord", as well as matrices like "modelViewProjectionMatrix", are predefined by ENIGMA, so it is a lot easier to write shader code (and it's compatible with GM).
TL;DR - It's very unlikely that your GL1 shader code will be compatible with GL3. The only ways to do it would be either "#define gl_Vertex vec4(in_Position,1.0)" or trying to change it with "glSetAttributePointer" - and both of these should be impossible, as gl_Vertex is a built-in variable that cannot be redefined. It might even be better to make a rudimentary shader convertor, as most of the conversion can be done via "Find and Replace".
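To illustrate the "rudimentary shader convertor" idea, here is a minimal sketch of GL1-to-GL3 conversion done purely with find-and-replace. The function name and the rule table are my own (hypothetical, not ENIGMA code); the replacement tokens follow the in_Position/modelViewProjectionMatrix convention described above, and only the vertex-shader case of "varying" is handled.

```cpp
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch: map deprecated GL1-era GLSL tokens to GL3/ENIGMA
// equivalents with plain textual find-and-replace.
std::string convert_gl1_shader(std::string src) {
    const std::vector<std::pair<std::string, std::string>> rules = {
        {"gl_MultiTexCoord0", "in_TextureCoord"},
        {"gl_TexCoord[0]",    "v_TextureCoord"},
        {"ftransform()",      "modelViewProjectionMatrix * vec4(in_Position,1.0)"},
        {"gl_Vertex",         "vec4(in_Position,1.0)"},
        {"varying",           "out"},  // correct for vertex shaders only
    };
    for (const auto& r : rules) {
        std::string::size_type pos = 0;
        while ((pos = src.find(r.first, pos)) != std::string::npos) {
            src.replace(pos, r.first.size(), r.second);
            pos += r.second.size();
        }
    }
    return src;
}
```

A real convertor would also need to inject the attribute declarations (in vec3 in_Position; etc.), which is why the post calls this approach "rudimentary".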
Here is actually a water example I made using this example. The water has real-time reflections and is transformed in vertex shader to have waves (very basic though). I plan to add per-pixel lights (the moving ball), water refraction, sun and some other stuff. Maybe animated trees or other props.
The FPS is quite bad on my old ATI laptop (yes, it's actually an ATI), but that is to be expected, especially when the water consists of 260k vertices that are transformed every frame. On my Nvidia it runs with no problem.
619
Programming Help / Re: using gamemaker tutorials / docs for enigma?
« on: July 03, 2014, 01:43:55 pm »
I'd say in 99% of cases GM tutorials will work with ENIGMA. For example, I looked over the Platform tutorial you posted and I don't see why it wouldn't work in ENIGMA. It seemed compatible, but note that the site has syntax rendering problems; for example, this:
Code: (gml) [Select]
if (vsp < 10) vsp += grav;
Should be this:
Code: (gml) [Select]
if (vsp < 10) vsp += grav;
The only big difference I know of that should still be an issue is that boolean FALSE in GM is anything less than or equal to zero (<=0), while in ENIGMA it's only equal to zero (==0), because we use the C++ standard. This matters for GM collision functions. GM has functions like collision_point(), which returns the ID of an instance that collides with a point, but returns a special value "noone" when no instance collided. "noone" is actually the value -4, so in GM this is valid:
Code: (gml) [Select]
if (collision_point(...)){ // GM: TRUE when an instance collides, FALSE otherwise; ENIGMA: always TRUE, because noone (-4) is nonzero
}
While ENIGMA requires this:
Code: (gml) [Select]
if (collision_point(...) != noone){ //Evaluates to TRUE when any instance is collided and false otherwise
}
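The truth-rule difference can be shown with a small sketch (plain C++, not ENIGMA's actual implementation - the predicate names are mine). GM's rule treats anything <= 0 as false, so noone (-4) is falsy there; the C++ rule treats only 0 as false, so noone is truthy:

```cpp
#include <cassert>

const int noone = -4; // GM's "no instance" sentinel

// GM's rule as described above: anything <= 0 is false
bool gm_true(double v)     { return v > 0;  }

// C++/ENIGMA's rule: only zero is false
bool enigma_true(double v) { return v != 0; }
```

So when collision_point() returns noone, a bare if() in GM takes the false branch, while in ENIGMA it takes the true branch - which is exactly why the explicit "!= noone" comparison is needed.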
620
Proposals / Re: ENIGMA + LGM = 1 Tracker
« on: June 29, 2014, 04:57:45 pm »
The reason they have different trackers is that they are actually different projects. LGM existed before ENIGMA did, and ENIGMA was stand-alone. With the new addition of the CLI, ENIGMA is stand-alone once again, so tying them together is not correct in the software sense. Way back, LGM and ENIGMA didn't even really share developers: LGM was made by Ism with little contribution to ENIGMA (as far as I know), and ENIGMA was made by Josh with little contribution to LGM. It's only now that both have a common developer - Robert. So I don't think we should tie them together.
Maybe just make a page that shows both trackers together, or at least links to them. Right now we have "tracker" in the top row, which doesn't mention what it is used for. And as people keep thinking of LGM as part of ENIGMA, the problem is getting worse. We should state this explicitly.
621
Programming Help / Re: How can I use large arrays more than 30 MB?
« on: June 29, 2014, 07:24:57 am »
Josh, is that valid EDL now? Previously most of that stuff didn't work. I'll check in master.
Darkstar2, as Josh mentioned, the problem is that you probably end up using more than 2GB of RAM. Look at Task Manager. Normally you would be able to do it like this:
Code: (edl) [Select]
size = 40000000;
local variant a[40000000];
for (int i = 0; i < size; ++i)
{
    a[i] = 65;
}
But sadly the parser doesn't support this yet.
edit: Yup, Josh, it doesn't work for the same reason my posted code doesn't work - "a[ i ] = 65" parses as "a(i) = 65", and so arrays like these don't work. That is why we are waiting for your parser sooooo patiently.
622
Issues Help Desk / Re: Font resource not displaying in game after lgm/enigma restart.
« on: June 29, 2014, 07:15:56 am »
If you use pre-GM:S formats then you won't be able to save fonts, because fonts now use ranges for Unicode support. I actually don't know if there is any backwards compatibility for that anymore. So Robert, if I load a GM6 file and save it as GMK, will I lose fonts?
623
Issues Help Desk / Re: script threading does not work
« on: June 29, 2014, 07:11:35 am »
Quote
In contrast with what Harri just said, the new compiler does support atomic types however.
How is that "in contrast" when I didn't even mention atomic types? I didn't mention anything tied to the compiler. I mentioned that in my test sprite_add indeed doesn't seem to work correctly in threads.
edit: Josh, what "legal reason" are you talking about? Something to do with Google? Like if you work on ENIGMA on your one day off, then they own it?
624
Off-Topic / Re: True Valhalla
« on: June 28, 2014, 07:16:21 am »
I have actually seen his "book" a few years back. Or at least a few pages of it (of course I didn't buy it... the language he used made me think he is very young and inexperienced), and I also remember that it was totally awful. It isn't worth $5, let alone $20. It didn't have ANY useful information in it (that couldn't be found on the internet for free) and it was very short and badly written. So I believe anyone who bought the book probably regretted it right away. Only a 12-year-old would think it's a good deal.
But I really don't care about the guy. There are a lot like him on the internet, trying to get people to pay big money for total junk. His "engine" also seems like something you could make in 200 lines of code, yet he sells it for $60. To some extent, he really could be called a scammer.
Quote
PS He must be stoned on some pretty hard liquor if he really thinks he can convince everyone that he really makes the kind of money he claims to get in the mail regularly. I'm telling you this guy's on drugs. A fucking pothead. XD
His numbers aren't really unreal. They are actually quite low considering he has been doing it for 2 years now. Earning $2k to $20k a month isn't that hard online. There are many who make crappy 5-day HTML5 games and get that amount of money. If 5k people buy a game for $1, you already have $3k (after subtracting what companies usually take), and that is a small part of the community, considering the millions upon millions of users on Android or iOS. So even crappy stuff often sells, but it does require a little promotion.
625
Developing ENIGMA / Re: Command Line Interface
« on: June 28, 2014, 06:56:37 am »
Quote
I don't like the idea of discontinuing the IDE and building everything from a command-line.
Quote
#1) I did read your post, what caught my attention is you mentioning that one could create an entire project without an IDE, which is was hard for me to digest.
Quote
So TKG making an entire game entirely from non IDE, I would be shocked. I'd love to know how he will do this, given his games are big is he going to place all those resources by mentally calculating their X Y coordinates ? How can you build a game entirely from CLI, you are not very convincing in selling this, since you are asking people's opinion.......
I love how Darkstar missed the point entirely - even after he was notified of this, he kept missing the point for several more posts. I have noticed that the level of misunderstanding on this forum is unbelievably massive. It's as if we didn't know English and used Google Translate to talk to each other, even though many here have English as their first language.
Quote
1) LGM does not have a guaranteed format, it lets you save and load from all formats.
But should we support all formats in the CLI? I think we should only support compiling from EGM. If a person wants to use the CLI, they should convert to EGM.
Quote
2) This would require that you create the project directory when you want to make a new game, like in Studio, you couldn't get away with adding resources and saving later because the IDE has to have it on disk.
Couldn't the IDE just save to a temp folder whenever you press "Compile", and save to the project file when pressing Save?
Quote
3) GMK is damn near impossible to optimize as a result of it being one giant ass file, you can't easily skip through parts of it and write a single resource.
I think we shouldn't be supporting it.
Quote
4) You probably would not be able to build without saving your changes. I believe Studio has this same restriction.
But what if you checked whether the project differs from what is loaded, and then saved to a temp folder with the changes if it is modified, or just compiled the project file directly if there are no changes? The easier way is to write out to a temp folder whenever you press "Compile", but that might be slightly slower (depending on how fast EGM can be written).
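The compile-without-saving idea boils down to a small decision: compile the saved file when nothing changed, otherwise serialize to a temp folder first. A sketch in C++ (all names here are hypothetical, not actual LGM/ENIGMA code):

```cpp
#include <cassert>
#include <string>

// Hypothetical project state: the saved file on disk plus a dirty flag.
struct Project {
    std::string path;      // the saved project file (e.g. an EGM)
    bool modified = false; // true when in-memory state differs from disk
};

// Decide what the CLI should be asked to compile.
std::string path_to_compile(const Project& p, const std::string& temp_dir) {
    if (p.modified) {
        // (real code would serialize the project to this path here)
        return temp_dir + "/current.egm";
    }
    return p.path; // unchanged: skip the extra write, compile directly
}
```

The always-write-to-temp variant mentioned above is the same sketch without the modified check, trading a redundant write for simpler logic.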
Quote
1) Treat formats like GMK as an import/export where you can import a GMK but it will automatically be converted to GMX or EGM whichever format you choose.
Exactly, use EGM internally.
626
Issues Help Desk / Re: script threading does not work
« on: June 27, 2014, 05:07:56 am »
Everything should work in threads. If something doesn't because of things like race conditions, then you will get wrong results (or crashes), but it would still work asynchronously. So your problem is somewhere different probably. I will test.
edit: I just tested, and I also have some problems with sprite loading. The data it loads is pure white (1,1,1,1) for every pixel when loading from a thread, but it loads fine otherwise. It could be a problem with data locality, as it might not be allocated correctly outside the thread. This should be investigated. But it seems to be a problem with the sprite functions; it works with the _bin function (as that returns data directly instead of in the background like resource functions do).
Darkstar2, remember that you cannot use local variables in threaded scripts. For example:
Code: (edl) [Select]
//Create of object
a = 0;
//Script
a = file_text_read_real(...); //Object.a is still 0, because this a is local for the thread
//Correct script
obj_someObject.a = file_text_read_real(...) //Now you specify the variable correctly
So you must either specify the object, make the variable global, or pass the object as an argument (you can use "argument0.a" and call it with "script_thread(script,self.id)").
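The same pitfall exists in plain C++ threads, which may make it clearer (this is an analogy sketch, not EDL): a script that writes to its own copy never reaches the object, while one given a pointer to the object (like obj_someObject.a or argument0.a) does.

```cpp
#include <cassert>
#include <thread>

struct Object { double a = 0; };

// Like assigning to a bare 'a' in a threaded script: the thread gets
// its own copy, and the write is lost when the thread ends.
void script_wrong(Object obj) { obj.a = 42; }

// Like "obj_someObject.a = ..." or "argument0.a = ...": the write goes
// through to the shared object.
void script_right(Object* obj) { obj->a = 42; }

double run_both() {
    Object obj;
    std::thread t1(script_wrong, obj);  t1.join(); // copy: no effect
    std::thread t2(script_right, &obj); t2.join(); // pointer: takes effect
    return obj.a; // only the second script reached the object
}
```

Joining each thread before reading obj.a also sidesteps any data race in this toy example; real threaded scripts would still need synchronization.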
627
Issues Help Desk / Re: So much for variable type declarations
« on: June 27, 2014, 05:04:24 am »
Yeah, you can use "var" just fine. It's just slower, because it's a jack-of-all-trades kind of type.
Also, the error is because "string" is a function in GM, and in turn in ENIGMA - draw_text(10,10,"Lives = "+string(lives)). So it probably never will work, because I don't know if you can use a function name as a type name, especially since using a type as a function, like int(5.4)==5, is already defined in the C++ spec. So it's possible you will never be able to use "string" as a type name. Or we will have to rename the function, because in most languages it is called something like "toString" instead of "string". But that would break GM compatibility once more.
628
General ENIGMA / Re: GL3 lights fixes
« on: June 24, 2014, 05:33:03 pm »
Some things might be broken. I haven't tested much of it, because I only test the examples I need to fix. In this case I worked on it until Project Mario worked. Then I made a simple light test for a custom shader (per-pixel lights), which also rendered the same in GL3 and GL1. Until I get more broken examples, I won't be able to do much.
So first try GL1. If it works in GL1, then GL3 needs to be fixed. If it's broken in GL1, then you probably have messed something up in the code.
629
General ENIGMA / Re: Compatibility flags.
« on: June 24, 2014, 10:54:48 am »
So fixing a game imported into ENIGMA would only require you to add one line of code? Adding a special compatibility flag for that doesn't sound like a good idea. We are already getting into deeper and deeper sh*t because of GM compatibility. Right now we only try to target GM:S, and even that should be taken with a grain of salt.
630
General ENIGMA / Re: GL3 lights fixes
« on: June 21, 2014, 05:22:11 pm »
Quote
Unless you mean that ENIGMA does some default shaders behind the scenes (that I'm not aware of)?
OGL3 ditched the FFP (fixed-function pipeline) - this means that if you want to render ANYTHING (even a single triangle), you need to write your own shader. Over the last 3 months or so I have slowly made that change in the OpenGL3 graphics system in ENIGMA, and that shader is what was the problem. I originally wrote it without support for lights, but now I have implemented them. They are not exactly the same as GL1, but I'd say close enough. There was a problem, though: AMD didn't render them correctly. I did a dirty hack to fix it, so now it should render on both AMD and Nvidia, and they should render the same. You can find the shader here: https://github.com/enigma-dev/enigma-dev/blob/e16c591c36a437c5fad73674f7f750856eb91bb6/ENIGMAsystem/SHELL/Graphics_Systems/OpenGL3/GL3shader.cpp in lines 68 to 217. So it's a simple vertex-lighting-based shader that uses Phong shading.
So the problem is that you see a messed-up model at the title screen, and the rest rendered correctly? Can you post screens of the Mario head and the game itself? Just to see if the shading is in the right direction. I mention normals because they are basically the only thing that affects the calculation of the color - either that or the position of the lights. But both of them should be exactly the same.
Quote
If it did use shaders I would rather expect them to look the same between even OpenGL versions.
I would expect them NOT to look the same between OGL versions, because since we the developers write the shader, we might not be doing the same calculations the GL1 implementation was doing. But I would expect GL3 to look the same across different hardware (to a degree), because all hardware should do the same calculations (the ones we ask it to).