Show Posts



Topics - forthevin

1
Hi everyone. I am happy to say that I have created an extension and library for doing automated integration and regression testing. The extension has been committed and is called IntegrationTesting, and the library has been pushed to a branch called integration_testing_develop: https://github.com/enigma-dev/enigma-dev/tree/integration_testing_develop. The test system has been designed so that tests can be run, and their results recorded, both by developers and automatically.

Regarding running it automatically, I think it would be a good idea if we also looked into using a Continuous Integration (CI) system. A CI system does a number of different things, but is basically about continuously building a project, running its tests, generating test reports, etc. automatically. For instance, you could set up a CI system that builds the project every time a commit is made to the master branch of the main repository, runs all tests against the newest version, generates test reports, and reports whether there have been any regressions since the previous commit. This makes it really easy to track down exactly when a bug covered by a test was introduced.
A number of CI systems already exist. One open-source CI system which seems to be widely used is Jenkins, which is used by Apache, Mozilla and others. I have added an output format for the generated test report that is accepted by Jenkins' default JUnit plugin for parsing test results. It's a bit hacky, since the testing system I have implemented is not based on JUnit, but I believe it should work well enough. If we need another test report output, I can write one for that as well.

I have uploaded the test report and an output version generated from the current tests, so that you can see a bit of what it looks like:
Example of the general test report: http://pastebin.com/2756rkG5
Example of an HTML output version: https://www.dropbox.com/s/ycqp8ck13na1e4g/testreport.html
The HTML version is fairly simple, but should give a good overview of things.

As for the testing system itself, I decided to go with integration testing instead of unit testing. Unit testing is basically testing each small part independently, while integration testing is about testing all or parts of the whole integrated system at once (such as running a game). Given that it is easy to create and run games, and that most of the APIs do not change a lot over time, I decided to make each test a game. In effect, this means that each added test is another game which we check compiles, runs, and obeys some properties (for instance, for a graphical shapes test, if you draw a filled red rectangle, then the color at the center of the rectangle should be red).
Since integration tests rely first and foremost on the interfaces and APIs rather than on the specific implementation, and since most of our interfaces and APIs do not change a lot over time, the majority of the tests we write now will not need any changes or maintenance in the future. It also means that most tests will be valid for different implementations of the same interface: if a test is written for the graphics system OpenGL1, it will also be valid for OpenGL3 and DirectX.
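
To make the shapes example a bit more concrete, the draw event of such a test game could do something along these lines. This is only a rough sketch using standard GM/ENIGMA-style drawing functions (it assumes a draw_getpixel-style function is available); the actual pass/fail reporting goes through the IntegrationTesting extension and is not shown here:

Code:
// Rough sketch of a shapes test; not taken from the actual test suite.
draw_set_color(c_red);
draw_rectangle(100, 100, 200, 200, false); // draw a filled red rectangle
int center = draw_getpixel(150, 150);      // sample the center of the rectangle
bool passed = (center == c_red);           // the property the test asserts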

The current extension and library should be ready for use, though I would like to get your general approval of and comments on the system first. This includes the current feature set, the dependencies, what features ought to be available, etc.

In regards to features, the best links for getting an impression of them are the following:
https://github.com/enigma-dev/enigma-dev/blob/integration_testing_develop/integration_tests/README.md
https://github.com/enigma-dev/enigma-dev/blob/integration_testing_develop/integration_tests/format/test_file_format
https://github.com/enigma-dev/enigma-dev/blob/integration_testing_develop/ENIGMAsystem/SHELL/Universal_System/Extensions/IntegrationTesting/integration_testing.h

In regards to dependencies, the testing system currently depends on Python 3.3. The main reason for choosing Python is partly that we already use Python (install.py), and partly that it is a cross-platform and expressive scripting language. The reason for using version 3.3 is a couple of features, including timeout functionality when running commands outside Python, such as when running a test. I am a bit in doubt whether it is a good idea to rely on Python 3.3, given that it is relatively new, but if relying on it turns out not to be a good idea, it shouldn't be too bothersome to rewrite the libraries to be compatible with an older version of Python.

In regards to the near future, assuming no major changes to the testing system, I will look at adding desired features to the testing system, fixing bugs in it, writing tests, and fixing issues. I will also spend time writing and updating documentation, including on the wiki.

A final note is that running the system now (using python3 integration_tests/run_and_record_tests.py on Ubuntu 13.04) will not give very interesting results, since the plugin needs to be built and uploaded to Dropbox first (the test system requires the plugin to have some more features, which I have committed to the master branch).

2
Off-Topic / Interview with Bjarne Stroustrup on C++11 and C++14
« on: June 05, 2013, 09:39:03 am »
I read an interview with Bjarne Stroustrup the other day: http://www.informit.com/articles/article.aspx?p=2080042. In the interview, they talk about a number of different topics, including C++11 and C++14. If you are interested in the new features of C++11 and some of the upcoming features of C++14, I think you will find the interview interesting.

3
Due to various issues regarding changes to the master branch of enigma-dev/enigma-dev, I believe it would be useful to have some guidelines on how changes to the master branch should be handled.

The following guidelines are not complete, but should be a good starting point. If the guidelines are accepted at some point, I will post them on the wiki.

Guidelines for changes to the master branch.

The idea behind these guidelines is to help developers and contributors, old and new alike, make good decisions regarding the master branch of the repository. They are based on common thought and experience among the developers, and are not meant as hard and fast rules, but to inform and guide developers. It should be noted that "developers" and "contributors" are used interchangeably in these guidelines.

Change notes:

2013-06-06: Added information on how to use git and GitHub to handle changes.
2013-06-04: Creation of this document.

Purpose of these guidelines:

  • Create a common perspective amongst developers on the purpose of the master branch and how to deal with changes to it.
  • Help make developers aware of some of the possible issues that changes can create.
  • Inform and guide developers on how to deal with changes.

The purpose of the master branch of enigma-dev/enigma-dev:

The master branch is the branch generally used by both developers and users. Having a common branch enables quick fixing and development, and means users will not have to wait long before getting the newest additions or fixes. Given that ENIGMA is still not fully mature or feature complete, it makes sense to have a semi-stable main branch, which rarely breaks games relying on core and established functionality, while still allowing quick development and experimentation on unstable and non-core parts. However, if the master branch breaks core or established functionality all the time, it becomes very unstable for users, and developers waste time finding and fixing issues that used to be non-issues. For this reason, it is important to keep the master branch at least semi-stable, rarely breaking core or established functionality.

Possible issues with changes:

  • Buggy changes that cause regressions, either silently or loudly, resulting in runtime crashes, very poor performance, compile-time errors, etc. These are the most common types of issues, and can often be prevented by testing the change with the relevant configurations.
  • Changes that handle an issue in one subsystem, but do not handle the same issue in other subsystems of the same kind. These changes can be bothersome, because they may cause confusion about whether an issue has really been handled, and it takes time to determine where it is handled. Frequently, a lot of time can be saved later by ensuring that the issue is handled in all relevant subsystems, or at least by creating an issue on the tracker that describes where it has and has not been handled.
  • Changes that depend on specific tools and technologies, which can take a considerable amount of work to undo, especially as time passes and more and more of the code comes to depend on them. Some of these changes are necessary and acceptable, for instance when making platform-specific changes in an isolated platform subsystem. The main ways to avoid or handle such issues are to follow standards when available, to isolate technology dependencies in specific systems (OpenGL{1,3}, DirectX, OpenAL, SFML Audio, etc.) and use them through generic interfaces, and to write code in a way that is fully or mostly independent of the technology or tool used (for example, using the common subset of features amongst the main Make build tools). Another way is to put off using bleeding-edge features until support is mature and widely available.
  • Architecture and interface changes affect how the different systems interact. Bad interfaces decrease modularity and increase coupling and dependencies amongst systems, and generally make it more difficult to debug, test and develop systems. It can be difficult to determine or predict whether an interface will be good or bad, so thinking about and discussing important systems, as well as designing for change, is generally advisable. Isolating dependent code between systems in "bridges" is one way to decrease coupling in certain cases.

Dealing with changes:

When you are in doubt about a change, and whether it should be committed to the master branch, there are several options for dealing with that:

  • The first is to test the change for the relevant scenarios. This is often quite effective, and it is therefore recommended that you always do at least light testing of your change, taking care to test with all the relevant configurations. For instance, if you change the interface for a subsystem like Collisions, it is a good idea to test that each collision subsystem still compiles and works with the new interface. That said, testing does not cover everything, and it can sometimes be difficult to test things, but testing for simple compilation and running is generally useful and easy. The main exception is testing on different platforms; in those cases, it is a very good idea to get others to test the changes.
  • Taking time to look at the changes, investigating them, looking through them, etc., can be laborious but quite effective. It can also help improve the changes themselves. This should always be done to some degree when working on established and core features, to avoid easily avoidable issues.
  • Experiment! Make the change in a fork or a branch other than master, and experiment and work with it. If the system is not yet established and is known not to be ready for use yet, you can also experiment directly in master. Just be careful not to let the experimentation spill outside of that specific system.
  • Implement the change in a fork or branch, and get others to look at it. This takes the time of others, but may well save time down the road. This is very appropriate for changes that could cause considerable issues if faulty, especially if you have already tested, experimented and/or investigated the changes thoroughly.
  • Low-severity changes are much less risky to commit than high-severity changes.

When determining whether a change should be committed to the main branch, it is useful to determine what issues it may have. However, it can be difficult to determine all the issues of a given change. To help with this, here are some general guidelines for estimating the potential severity of a change's possible issues.

  • If a change is fully isolated to a subsystem, getting the change wrong will only affect that subsystem and other systems that depend on that subsystem (directly or indirectly). The more systems that depend upon a given subsystem, the more things it can break.
  • Systems and features that are used by most users and developers directly or indirectly, such as most of the stuff in Universal_System, the graphics system OpenGL1, or the Precise collision system, will affect many. Conversely, systems that are only occasionally used (like the audio system, the None collision system, the datetime extension, or the particles extension), and which aren't depended upon by any other systems, will not affect nearly as many. As long as they still compile or are not on by default, breaking them is not as severe. That said, breaking such systems and letting them stay broken can cause problems down the line, and since they are used less, issues in them will generally be found and reported later.
  • Unstable parts that aren't used and aren't meant to be used will not affect users or developers other than the ones working on them.
  • Issues that prevent compiling or running games with ENIGMA at all are quite severe. It is generally easy to test for these issues, since if it compiles and runs, then it shouldn't be a problem. The main difficulty is if the changes have platform-specific parts. Testing on several different platforms can be bothersome or impossible, depending on hardware. Use a fork or a branch to share the changes and make testing easier, or to get others to confirm, on the platforms you do not have access to, that the changes really do not break things.
  • Platform-specific issues can be bothersome to find, debug and fix. Therefore, a platform-specific issue is generally more severe than a non-platform-specific one, all else being equal.
  • Changes that deal with interfaces and architecture can have long-term issues, and they can be difficult to test. Thinking about them and discussing them with others is generally recommended, to help ensure that the changes are good, as is seeking a flexible design that allows correcting some mistakes in the interface later. A good rule of thumb is that the fewer dependencies and the less coupling the changes introduce into the interface, the less that can go wrong.
  • If there are parts of the potential issues that you are not confident you know enough about, determining the severity can be difficult. Learning more about those parts and related topics is recommended, as is asking others for advice on the topics as well as on the changes in question.
  • One other factor to consider is the severity of the alternatives. If the change is definitely better than the current state of things, it may be better to commit it and describe the remaining uncertainties, so that they can be looked at later by yourself and others.

Using git to handle changes:

git offers several features which can be used to handle changes. One of these features is branches. See http://gitref.org/branching/ for information about what a branch is. In regards to changes, the master branch is the main branch used by users and developers alike. By creating and using a branch separate from the master branch, you can make any changes you want without bothering anyone else, let others look at them, possibly fix any issues, and then merge the branch into master.

For examples of how to use branches, see post 1 and post 2.

Using GitHub to handle changes:

Another feature is GitHub's fork, which you can find more information about here: https://help.github.com/articles/fork-a-repo. Basically, a fork creates a whole new repository based on the original one, where you can make any changes you want; once you would like others to look at them or get them merged into the main repository, you can make a pull request in which you describe the changes and other relevant details. If more work is needed, you can make more commits, and they will be reflected in the pull request.

4
General ENIGMA / Testing and logging fps on different systems
« on: May 25, 2013, 08:14:15 am »
As you may know, there are big issues with the fps on several different systems. In order for me to figure out the issues and fix them, it would be very helpful if I could get some data on the fps on different systems. I have already fixed the issues on the systems I have direct access to myself.

What I would like you to do is to download the following test, run it with ENIGMA, and copy the results from the resulting file "fps_loggin_data.txt", along with information about your OS, its version/distribution, and what CPU your computer has and how many cores it has. If you are using a VM, you should skip this test, since VMs tend not to be reliable when it comes to precise timing.

The link to the test: https://www.dropbox.com/s/9p7kszhy2sngns1/fps_logging.gm81

5
Tips, Tutorials, Examples / Modern OpenGL tutorial series
« on: May 21, 2013, 12:03:47 pm »
I found a new tutorial series about how to use modern OpenGL. There are currently 3 articles in the series:

Getting started: http://solarianprogrammer.com/2013/05/10/opengl-101-windows-osx-linux-getting-started/

Drawing primitives: http://solarianprogrammer.com/2013/05/13/opengl-101-drawing-primitives/

Textures: http://solarianprogrammer.com/2013/05/17/opengl-101-textures/

The articles explain things bit by bit, and the source code for each program can be found on GitHub.

6
Works in Progress / Rules
« on: February 21, 2013, 01:53:35 pm »
This forum is for discussing game projects you are working on and sharing your games with everybody. Useful feedback includes bug reports, suggestions, praise and criticism of the game, with the purpose of helping to make the game in question better. Please stay on topic here; we have other boards for posting bug reports, including a tracker that is integrated into the website, and an off-topic forum.

Works in progress that are posted here should be generally playable and usable, somewhat bug-free, and realistic in their scope. They do not have to be filled with content, but there should be some gameplay available, so that useful feedback can be given on the game. You should also read the universal board guidelines and rules, as they are in effect site-wide.

General Guidelines and Rules

* DO NOT post spam or post virus-infected downloads, or your account may be terminated.
* Prank games and programs are only allowed if very clearly labeled, with instructions for disabling the program.
* Keep feedback to constructive criticism: do not bash other members' games for unrelated personal matters.
* Try to limit the number of images you have in your topic; nobody likes taking 6 hours to load a page. There is no set limit; if moderators feel you have too many, they will inform you.
* Try to stay on topic and not hijack a member's topic by bragging about how many kills you got in Call of Duty last night. It is disrespectful to them, and we have an off-topic forum for such discussion.

Posting Criteria


* Please include screenshots and images in your post. Other users may like to see these before they download. Consider DropBox, ImageShack, Tinypic, PostImage, ImageUpload, etc.
* Information regarding the download size of your game is also helpful for other members.
* Try to be concise without cutting out important details in your description. It is also nice to include information regarding the genre, and other specifics.


Rules by forthevin and Robert B. Colton.

7
Proposals / Games category
« on: February 20, 2013, 06:53:18 am »
I would like to propose that a new forum category be created, called "Games", with two forums, "Completed Games" and "Works in Progress". The purpose of the "Completed Games" forum would be to show and discuss newly created games, and the purpose of the "Works in Progress" forum would be to show off works in progress and discuss them as they develop.

I believe these forums would be very useful for users, and that they would supplement the Games section on the website. Furthermore, ENIGMA has been mature enough to make complex games with for a long while, so I think the time is right for adding these forums.

8
General ENIGMA / The particle systems extension is complete
« on: February 09, 2013, 02:27:53 pm »
I am pleased to announce that the particle systems extension is complete. Turn it on under Enigma->Settings->Extensions->ParticleSystems in order to use it.

All particle and effect functions and actions have been implemented and tested. It includes attractors, destroyers, deflectors and changers.
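
For anyone who wants to try the extension, a minimal GM-style example looks roughly like the following (placed in an instance's create event; these are the standard particle functions, but treat the exact values as illustrative):

Code:
// Minimal illustrative example using the standard particle functions.
ps = part_system_create();                 // create a particle system
pt = part_type_create();                   // create a particle type
part_type_shape(pt, pt_shape_flare);
part_type_color1(pt, c_orange);
part_type_life(pt, 30, 60);                // lifetime in steps
part_type_speed(pt, 2, 4, 0, 0);
part_type_direction(pt, 0, 360, 0, 0);     // emit in all directions
part_particles_create(ps, x, y, pt, 50);   // burst of 50 particles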

The performance of the extension is near the level of GameMaker. It is able to update and draw 10,000 particles at 30 fps on my hardware, which has a quad-core processor and an Intel integrated graphics card. My latest profiling indicated that updating and drawing take about the same amount of time, with drawing taking slightly more, so optimizing either one makes sense. As a side note, the initial profiling and optimization made the drawing about 10x faster, since it turns out glPushAttrib can be pretty expensive inside a loop. Further profiling and optimization improved the performance slightly. The updating is fully single-threaded, though there are some parts of it that could be parallelized.
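
To illustrate the glPushAttrib point: the expensive pattern is saving and restoring all GL state once per particle, and one way to avoid that is to do it once around the whole loop. The sketch below is only an illustration of that pattern, not the actual drawing code of the extension, and draw_particle is a placeholder name:

Code:
// Illustration only; draw_particle is a placeholder, not the real function.
// Expensive: one state save/restore per particle.
for (size_t i = 0; i < particles.size(); i++) {
  glPushAttrib(GL_ALL_ATTRIB_BITS);
  draw_particle(particles[i]);
  glPopAttrib();
}

// Much cheaper: one state save/restore around the whole loop.
glPushAttrib(GL_ALL_ATTRIB_BITS);
for (size_t i = 0; i < particles.size(); i++) {
  draw_particle(particles[i]);
}
glPopAttrib();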

The particle systems extension is largely decoupled from the graphics system in use; most of the extension is fully separate from it. The graphics-dependent parts are handled by requiring the graphics system in use to implement a function that draws a vector of particle instances. The only coupling remaining in the particle systems extension is a few includes of OpenGL/GScolors.h, which only use the non-drawing functions of that header.
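
In rough terms, the bridge between the extension and the graphics systems looks something like the following. This is a sketch with illustrative names and fields, not the exact declarations in the repository:

Code:
// Sketch with illustrative names; not the exact declarations in the repository.
#include <vector>

namespace enigma {
  struct particle_instance {
    double x, y;   // illustrative fields; the real struct differs
    int color;
  };

  // Each graphics system (OpenGL1, OpenGL3, DirectX, ...) provides its own
  // implementation of this; the extension itself only calls the interface.
  void graphics_draw_particles(const std::vector<particle_instance>& instances);
}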

9
Proposals / Proposals for implementing ASTOperator for EDL_AST
« on: January 20, 2013, 10:53:21 am »
I have looked at the To Do item "Create ASTOperator class for EDL_AST" for the parser on Trello, and have come up with some ways it may be implemented. I don't see a straightforward way to do it, since the operator class and the AST nodes in JDI are tightly coupled - the operate method refers directly to ASTOperator/ConstASTOperator, and ASTOperator/ConstASTOperator have methods specific to each node. The most straightforward approach I can think of would be to make a new class, maybe EDL_ASTOperator, and let it inherit from ASTOperator. The problem with this is that the AST nodes in JDI can only call ASTOperator, not EDL_ASTOperator. I haven't found any solution which is generally good.

I have therefore come up with a couple of different suggestions on how the problem may be tackled:

1: Simply put more methods into ASTOperator and call it a day. The advantage of this method is that it is very easy and should work without any problems in the short-term. The disadvantage is that it makes JDI dependent on EDL, and it seems like JDI is meant to be an independent project separate from EDL.

2: Introduce constant values for each node type, and let operators on the nodes simply cast to the given node type based on the constant values. The advantage is that it would decouple the nodes and the operators, and shouldn't take much time to implement. The disadvantage is that it would make more boilerplate code necessary in the operators, and that it would throw type-safety away.

3: Use a template in the AST for which operator type to use. The advantage is that it would decouple the nodes and the operators like 2, and it would keep type-safety. The disadvantage is that it seems non-trivial, it would take considerably more time than the other options, and both the non-operator parts and the general use of ASTs would be affected.
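
To make option 2 a bit more concrete, here is a rough sketch of the kind of dispatch it implies, with hypothetical names rather than the actual JDI/EDL classes. The switch is the extra boilerplate, and the static_cast is where the type safety is lost:

Code:
// Hypothetical names; not the actual JDI/EDL classes.
enum AST_node_type { NT_BINARY_OP, NT_UNARY_OP, NT_DECLARATION /* ... */ };

struct AST_node {
  AST_node_type type;           // constant identifying the concrete node type
  virtual ~AST_node() {}
};

struct AST_node_binary_op: AST_node {
  AST_node *left, *right;
};

struct EDL_ASTOperator {
  void operate(AST_node *node) {
    switch (node->type) {       // dispatch on the type constant...
      case NT_BINARY_OP:
        visit_binary_op(static_cast<AST_node_binary_op*>(node)); // ...then cast
        break;
      default:
        break;                  // remaining node types omitted
    }
  }
  void visit_binary_op(AST_node_binary_op *node) { /* operate on the node */ }
};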

10
General ENIGMA / Gamasutra article on GameMaker's recent DRM problems
« on: December 06, 2012, 01:45:48 pm »
I found an article on gamasutra.com regarding GameMaker's recent DRM problems, where non-pirated versions of Studio permanently overwrote images and sprites with a skull-and-crossbones image. There are also some interesting comments, including from people who bought Studio and then use a pirated version because they find the pirated version more convenient:

http://www.gamasutra.com/view/news/182558/GameMaker_DRM_goes_berserk_defaces_dev_work.php#.UMDl49GYNok

11
Proposals / Automatic updating for particle systems
« on: December 06, 2012, 01:35:34 pm »
I have implemented a very basic version of particle systems, which can be seen in this fork: https://github.com/forthevin/enigma-dev. Recently, I have been working on implementing automatic updating of particle systems. In GameMaker, particle systems can either be updated automatically (the default setting) or updated by hand. I have determined that GameMaker updates the particle systems between the end-step events and the draw events.

The issue is that updates and changes in GameMaker generally happen as part of instances and their events, not outside them. This is reflected in ENIGMA's handling of events, since function calls (as far as I can tell) cannot be placed directly in between events. Since particle systems are independent of instances and their events, they are an exception to the rule. Therefore, updating particle systems between the end-step event and the draw event seems non-trivial, since only events can be placed between other events.

I have found two solutions to the problem. The first is to create a hacky event that does not go through instances, but just calls a single function. It could look like this in event.res:

Code:
particlesystemsupdate: 100000
Name: Particle Systems Update
Mode: None
Default: ;
Instead: enigma::update_particlesystems();

The above event would be placed in between the end-step and draw events. This solves the issue. The only thing of note is that it also generates a superfluous virtual function in the object superclass. This solution is currently implemented in the fork and works.

The second solution I have found would be to extend the format of event.res, such that the format allows the above solution but without generating a superfluous virtual function in the object superclass.

My suggestion is to use the first solution as it is, and postpone the second solution until further notice. Since the superfluous virtual function is never called, it shouldn't cause much (if any) overhead.

12
Proposals / Collision shape options
« on: October 06, 2012, 10:32:31 am »
Currently, the structure for sprites does not contain information about which kind of collision shape is wanted for a given sprite, such as bounding box, precise, ellipse, diamond or polygon mesh.

I propose a simple system, where a single integer field, for example "collisionshape", is used as an enum, and where each value indicates a given desired shape, for example:

0: bounding box
1: precise
2: ellipse
3: diamond
4: polygon mesh
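
In C++ terms, a minimal sketch of the idea could look like this (the field and constant names are only illustrative, not a final naming proposal):

Code:
// Illustrative sketch; actual field and constant names are up for discussion.
enum collision_shape {
  cs_bbox    = 0,  // bounding box
  cs_precise = 1,
  cs_ellipse = 2,
  cs_diamond = 3,
  cs_polygon = 4   // polygon mesh
};

struct sprite {
  // ... existing sprite fields ...
  int collisionshape;  // requested shape, one of the values above
};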

Different collision systems can interpret the wanted shape differently. The BBox collision system can, for example, always use bounding boxes regardless of the wanted shape, and a polygon collision system could model bounding boxes, ellipses and diamonds as polygons internally and ignore the precise shape. I think this is fine, because a collision system cannot handle a shape it does not support anyway, and it is very easy to switch between collision systems. Basically, game developers should pick shapes that fit the collision system they use for a given game.

13
General ENIGMA / Liberated Pixel Cup beta-entry
« on: July 18, 2012, 07:12:32 pm »
Hi everyone,

I have finished the first beta of my Liberated Pixel Cup entry. It is an action-shooter based on classic fantasy. It currently features one chapter, including 3 combat sections.

For those who are curious, I have prepared some screenshots, binaries for 32-bit and 64-bit Debian Stable and 64-bit Windows 7, and instructions for compiling on Debian Stable and Windows 7.

Screenshots can be found here: http://imgur.com/a/FNMmN#0. They showcase the menu and some of the spells and enemies in action.

The binaries and source can be found here: http://www.sendspace.com/filegroup/aOcpWEWHw12Vgs7avF8VPrsXDybIwZNEr27sgNJAoI4. The Debian binaries require some libraries to be installed, but should work on both Debian Stable and Ubuntu 12.04. The source can currently be compiled on Debian Stable and Windows 7.

EDIT: Instructions on running and compiling can now be found in the source. Added link to version 47.
EDIT2: See update post below.

14
General ENIGMA / Enigma and the Liberated Pixel Cup competition
« on: June 26, 2012, 10:16:29 am »
Hi everyone,

I have a couple of questions regarding Enigma and the Liberated Pixel Cup competition that is currently running. For those that don't know, the LPC is a competition for free-as-in-freedom games, with a month for creating free art (which is nearly done, as of this writing), and a month for programming free games using that art (which is going to begin in a couple of days). You can find more info at http://lpc.opengameart.org/. I would like to use Enigma in this competition, and in that regard, there are a couple of details I am uncertain about and would like your opinion on.

The first is whether Enigma can be used in the competition. All entries in the competition are required to be free software, and as far as I can see, Enigma fits the bill, although I would like to check with you before I enter the contest.

The second is the best way to employ Enigma and games created with it. I haven't had much luck getting Enigma running under Ubuntu 12.04 (something about a parsing error, I think), but Debian Stable seems to work just fine. Downloading Enigma and running the game source file on it seems a bit bothersome, but should be fine for the judges, and other users can simply download a binary if they can't be bothered to download Enigma and set it up. Are there any other options, or any advice regarding that?

Thanks for your time.
