ENIGMA Development Environment


Messages - Josh @ Dreamland

406
Issues Help Desk / Re: Windows 7 x64 and ENIGMA?
« on: March 27, 2014, 02:47:45 PM »
Is the file marked as read-only? It's conceivable (though doubtful) that Windows is somehow treating the executable flag as a read-only flag, but it seems the more obvious cause is that the file was marked read-only by GCC.

Executable files become read-only on Windows when they are in use. Was LGM running when you tried to compile the DLL, or something? I'm not sure what would be running arbitrary executables written to TEMP on his machine, so that probably isn't the case on his, but if you could find a way to reproduce that problem on your machine, we might have better luck.

407
Issues Help Desk / Re: Windows 7 x64 and ENIGMA?
« on: March 27, 2014, 09:46:03 AM »
I'm going to assume that by /temp you mean %TEMP%, or whatever. That should be in the user folder on Windows; if they don't have r/w permission to it, something is terribly wrong with their installation (aside from the obvious problem, being that it runs Windows). Both of these folders should be configurable, though; especially in the case of Linux, wherein users might want the objects path under ~/.cache/enigma or even /tmp instead of ~/.enigma.

Or did you actually mean /temp? This signifies, on new Windows distributions and ALL POSIX-compliant operating systems including BSD (Mac) and Linux, the root of the filesystem. So probably C:\ or D:\ on most Windows installations. For future reference to any interested party, ~ signifies the home directory of the current user, so for me, it translates directly to /home/josh/, and in general, ~you/ translates to /home/you/. The more you know ♪

Anyway, the ability to customize those directories is a must. If there isn't already a way, don't bother; I'll add one.

408
Programming Help / Re: libGME
« on: March 27, 2014, 09:36:39 AM »
You need to install libGME on your computer. If you're on Ubuntu, this is as simple as sudo apt-get install libgme-dev. If you are on Windows, you will need to find a distribution of it and add it to your MinGW installation manually.

Aside from that, the extension should work fine. We could add a function to ENIGMA (and so to the IDE) to give a list of functions, globals, and locals provided by a given extension. In fact, the compiler already does this internally; I can modularize that segment of it so that the IDE will have access to it in the future.

409
Works in Progress / Re: Attack of the Naked Blockheads 3D
« on: March 26, 2014, 08:27:54 PM »
The board only shrinks images; it doesn't generate thumbnails. :P

410
Programming Help / Re: Access instance variables stored in ds_list
« on: March 26, 2014, 09:31:28 AM »
Possibly a segfault—does it happen in debug mode? I'll test this out later to see what happens on my copy.

411
Issues Help Desk / Re: Windows 7 x64 and ENIGMA?
« on: March 26, 2014, 09:22:15 AM »
Why does ENIGMA still need admin rights on Windows? The entire reason I put up with cheeseboy's changes in the merge was because it fixed that problem on Linux, which has a much stricter permissions hierarchy than Windows ever did. It writes all files to %APPDATA% now, and its only requirement is to have MinGW installed, which happens on C:\, to which everyone has access. If that isn't the case on newer Windows distributions, it should work fine from your user folder, too. What about ENIGMA still requires elevated permissions?

412
Issues Help Desk / Re: Enigma to C++
« on: March 26, 2014, 09:03:10 AM »
We need an FAQ page, because someone literally asks this question monthly.

Anyway, yes, your code is translated to C++. You can find the C++ source of your game by looking in your ~/.enigma/ folder. At the moment, it's under ~/.enigma/Preprocessor_Environment_Editable/.

For Windows users: Your game source is written to %APPDATA% under the same name.

Presently, we discourage editing that source directly, as it is not pretty and has dependencies on various aspects of the ENIGMA engine.

413
...

Harri, I am talking about the DirectX sampler vs the OpenGL sampler.

And yes, I meant naming texture coordinates. And the point of the fix is to make sure that the user *never* notices that GL's coordinate system is flipped. WE would be modifying the shader, which, as I've said eleven times, is probably what the graphics driver is doing for GL programs (as opposed to DirectX programs!). Users can name the texture coordinates they send to the GPU anything they like. So we have to modify them only once the sampler is invoked. The only alternative is to track every vector that goes to the sampler and somehow modify it at its first assignment. Since users can create vec2s in their code, that's not practical.

414
Harri, you keep showing me that image, and it shows me two different sampler behaviors. Yes, the coordinate (0.5, 0.25) gives the same pixel in both systems, so long as the image is loaded upside-down. And I believe you're saying GL is taking care of that for us. That's all well and good, until FBOs happen. Then it becomes SURPASSINGLY APPARENT that the two samplers are behaving differently. One of them expects the texture to be upside-down, and for the textures whose data we have handed it to load, that expectation holds. Then the user starts rendering to a framebuffer, and suddenly, it's wrong, and the fact that the samplers ARE behaving differently begins to show. You have to recognize this.

And you can name your texture anything you like. There is no limit to how many texture coordinates you can have; you can sample any number. We can't know them all.

415
Quote from: Harri
You mean projection code?
No, I mean the code in screen_redraw was literally doing checks to see if we're inside a framebuffer. Doing a check at any point to tell if we're in a framebuffer is disgusting.

If you want my honest opinion, we ought to draw *everything* to a framebuffer and only at paint time render the buffer to the screen. That will make draw_line scale with the window like it's supposed to and make all our checks for "LOL ARE WE IN A FRAMEBUFFER?" true. Meaning we can remove that nonsense from the projection code entirely.

Related: I was wanting to move screen_redraw into universal. You may have noticed that I separated individual pieces of that function out into their own methods. This was for clarity, yes, but also because only those methods contain any intimacy with the rest of the graphics layer. The rest can be moved to universal, provided those functions are added to namespace ENIGMA in the mandatory header.

Quote from: Harri
And why wouldn't it check that? How exactly is that slow/bad? Because the truth is that FBO and main framebuffer are two different beasts in this regard.
In general, it is bad for a system to have logical dependencies on other systems. I know that we have isolated the GL graphics system as its own module, and so it seems that extreme intimacy inside that module is acceptable, but to a prospective contributor, this will appear extremely tacky, and if at any point a person forgets that a check for that nonsense needs to be done, we have an easter egg of a problem. And I don't mean to pull the slippery slope card, but from my experience, these hacks tend to compound. Again, screen_redraw was pretty bad.

Quote from: Harri
That is probably why GM:S now renders everything on a surface by default and only then draws the surface to screen.
As mentioned, proper primitive scaling is another reason.

Quote from: Harri
We maybe would need to consider the same, but compatibility with 10 years old PC's would be a problem.
Ten-year-old PCs will use GL1 binaries, which are designed for people who don't take gaming seriously. Their cards either always multiply by a matrix or just have shitty GL support. Essentially, the problem doesn't apply either way, for better or for worse.

Quote from: Harri
The custom matrix classes and functions were made just for GL, because >GL3.0 no longer has FFP and any matrices whatsoever.
So you are trying to maintain forward compatibility. So that GL1 games can run on modern machines and old machines, while GL3 games only work on modern machines. I suppose that's acceptable, and probably beneficial as (1) it gives us more control over multiplication order (which was a problem for compatibility in the past) and (2) it gives users access to those matrices for math, in addition to a possible (3) that when lots of matrix math is done, communication with the graphics card can be reduced substantially.

Quote from: Harri
The only difference between them is that one is the transpose of the other. And DX uses its own matrices with its own functions, so they don't interfere with one another.
Giving the users access to those matrices is to our advantage. It lets them query their own vector transformations to determine physical points on screen. That is an insanely powerful feature that was completely missing from GML. The matrices being a transpose of one another will just further confuse the user. "WHY IS THIS SYSTEM GIVING ME THE MATRIX ROTATED?" Bug report inbound.

Quote from: Harri
Driver DOESN'T do any math. The samplers in driver level ARE IDENTICAL. At least they should be, because the math works the same either way. Look at my previous post with the sampler picture. So of course while I cannot be certain they work the same, I see no reason why they wouldn't.
If there is no math being done at the driver level, then it is being done at the software level. We are seeing different behaviors; the GL and DX samplers are "upside-down." So if it's not being done at the driver level, that's fantastic! That means we can probably tell it we would like the DirectX sampler instead of the GL sampler, and then our problems are solved.

But outside of fairytale land, I believe it is up to the driver to adapt the software as required so that GL applications have an inverted sampler. How this is done would be a black box to us. It's possible the hardware could offer two samplers, but I am doubtful. It's possible the hardware has a "take 1-y" line. But I am doubtful. I'd guess that on old hardware, there was always a matrix multiply for the sampler, which is why it's part of the GL API. On new hardware, I believe that nine times in ten, the hardware sampler has only one function, and the driver augments GLSL shader scripts to take 1-y. If you can prove me wrong, that would be great. Especially if you do so by finding a way to make the sampler behave properly.

Quote from: Harri
I no longer believe this is a solution, because then we need to flip textures given by LGM. This means it will break DX unless we make an exception for it. So we end up having custom LGM code just for each graphics system.
LGM should be giving texture data to us in the format most immediate to itself, which is probably right-side up. If GL or DX has to flip this texture to load it correctly, so be it. But what you might be missing is that it is not an option to only flip the projection: as soon as you start texturing in a surface, you'll find those textures are upside-down in the end. Using the GL1 API, we can invert the y-values for both the projection and the sampler matrices, thus mimicking DirectX's behavior. I see no reason to not do this.

If it's of any consolation, I am planning on this new compiler supporting scripting. So graphics systems will, ideally, be able to supply code to invert texture data at compile time, if need be. The new JDI I am currently hooking up has very nice AST evaluation support, so I believe EDL scripting in the compiler is going to be an option.

Quote from: Harri
It would be a lot faster to do this in vertex shader (then the inversion is only done once per vertex and interpolated when passed to pixel shader).
Blazingly so. Too bad we don't know which values to invert.

Quote from: Harri
We need to flip textures in memory and that means LGM needs to write those textures flipped.
That never followed, period. LGM conveys the data. What we do with it does not concern LGM in the slightest, regardless of the availability of compiler scripts.

416
One of my biggest problems with this "let's just flip all the projections" shit is that the code had crept its way into the view code. The screen_redraw code had its fingers in every system. It wasn't just checking room variables, it was checking horrifying shit like "are we in an FBO?" or "is the moon currently waning or waxing?", and contained a copy of the entire function for each of these cases. It was then that I deliberately broke surfaces by removing the over-involved logic, in an attempt to get the original author of that segment to find a better way of dealing with the problem. Little did I know that this problem is a very artificial yet extremely well-defined difference between the two APIs, one that is probably impossible to change directly, though changing it directly would be preferable to working around it. I'd much rather find a way to use Direct3D's sampler instead of OpenGL's, so that no additional math has to be done, but unfortunately, it seems that this is impossible without improper intimacy with individual graphics drivers.

As such, it is my belief that our best bet is to maintain this probably-leaky abstraction in the projection functions (asking for an orthographic projection gives you two fucking different matrices when GL is the system vs DirectX, which will probably confuse some poor bastard to tears later on), and to do the following for the sampler:

In GL1, apply glScalef(1, -1, 1) to the texture matrix at the beginning of the game, as I pointed out long ago and you suggested earlier.

In GL3, create a macro for the sampler call; i.e., #define texture(sampler, coord) texture((sampler), invY(coord)), where invY takes a vec2 and returns it with y = 1 - y. We then pray that the optimizer gets it, because that almost certainly does more math to undo math already added by the driver.

417
Rendering is easily fixed with a projection, which makes sense; it isn't as though it screws with projection matrices you set yourself.

But I'm so ashamed right now I'm just going to stop talking.

Code: (C++)
    enigma::Matrix4 orhto;
    orhto.InitOtrhoProjTransform(x-0.5,x + width,y-0.5,y + height,32000,-32000);

That is the single saddest thing I've ever seen.

For future reference, O-R-T-H-O. And 32000 isn't a very special magic number.

I can't even find something to look at and say, "yes! everything about this is right!"

418
Damn it. That still isn't what I'm trying to convey. I know that the two systems use a different point as (0,0). But the hardware doesn't. The hardware has some representation of (0,0) that is independent of GL or DirectX. And yet, both APIs' methods are supported, by the driver. How is it doing this? How does a GL application say, "I am GL! Treat (0,0) as bottom-left!"? How does a DirectX application say, "I'm Direct3D! Treat (0,0) as top-left!"? And more importantly, can we do the same? Can we say, "I'm ENIGMA! (0,0) is top-left!"? Can we lie and pretend to be DirectX?

I don't know who owns the code that forms the difference between these two. That's what I am trying to figure out. The physical sampler probably isn't doing this translation live. It's probably done by the shader compiler or by the driver, and unfortunately, I'm leaning toward the latter. Either way, we need to figure it out before we go hacking in a fix. I'm curious as to how ANGLE deals with this.

419
General ENIGMA / Re: Please vote for ENIGMA's new license
« on: March 20, 2014, 11:31:38 PM »
Most people with GPL games simply accept donations. An alternative is to have a purchase page, and offer a free download for users "who have already paid." Of course, they're not legally obligated to be honest, but it prevents pirates from selling your game for you. You can also do some GPL trolling of your own to ensure that any third party distributing your game is following the license and crediting you in the distribution.

An evil ENIGMA contributor suing you is an unlikely scenario given the overhead of filing a court case, the unlikelihood of being represented by the FSF's lawyers given that the act would be contrary to the spirit of this project, and the statistical unlikelihood of financial payout. But if you are making heaps of money, that starts to go away, and the odds of someone evil wanting their cut is greatly increased. So users would fear their ENIGMA games growing big while they are violating the GPL.

Regarding FlightGear: That's what allowing you to relicense your game is for. But I would argue that in that case, those who paid SHOULD be upset—from what I'm hearing, they are successfully selling it only because their brand is more popular, and it isn't their work to be sold. In your case, however, you are the original author: people shouldn't feel cheated buying your game from you. I got Cave Story for free on the PC. Nicalis teamed up with Pixel and re-released it for the PC and Wii. I bought a copy, not because I really cared about the special features, but because it was an easy way to show my support. I bought Portal 2 for $50 knowing full well I could get it for $20 or less the next month. I'd do it again. I had the option of pirating all of those games illegally, of course, but even if I could have legally obtained them for free, I would have paid for them. People have a lot of success with pay-what-you-want digital property.

But anyway, I am not advocating that all of our users switch to GPL. The GPL is just my personal philosophy. I would encourage it where applicable, but getting it to where users can select their own license is still a priority. I am just trying to make sure that any further proprietary game development environments are not my fault.

420
General ENIGMA / Re: Please vote for ENIGMA's new license
« on: March 20, 2014, 10:31:52 PM »
No one can sue you except the authors of the code you used. So, only members of the ENIGMA team. But notably, any member of the ENIGMA team who can prove you are using his or her code. So the odds of someone becoming evil are increased by some 600%, and more so by the fact that some of us are only just becoming financially independent. It's a risk, for users.

But yes, you're at risk of litigation whether you use ENIGMA right now or use GM:S. If you're playing by the rules of the GPL, there's nothing that we could successfully sue you for. But that means payments for your games would be on the honor system, and you wouldn't have special and exclusive rights to them, and that's definitely something for us to be concerned with. Petty theft of simple but fun games is more common than you'd think.