Yves Posted October 18, 2013

Introduction

This article describes different aspects of the scripting integration in our game engine. It starts by explaining the basics, such as the difference between a scripting language and a native programming language, which should be comprehensible to everyone with a basic knowledge of computers. The last part, about the ongoing Spidermonkey upgrade, goes into more detail and is targeted more at programmers and scripters who are already a bit more familiar with the topic.

What is scripting?

Basically, scripting adds another layer between the operating system (Windows, Mac OS X, Linux) and the code being executed. This layer is a program that reads the script code and executes its commands; it is called an "interpreter", "script engine" or "script runtime". Scripts don't have to be compiled before they can be executed by an interpreter.

Scripts

Native code

Native code, on the other hand, needs to be compiled before it can be executed. Compiling generates binary machine code that is specific to an operating system and even to a certain type of processor. For example, you can't run the machine code generated for Linux or Mac OS X on a Windows system, but script code is the same on all three platforms and can run on all of them if an interpreter program is available. An internet browser like Firefox contains a Javascript interpreter, for example. You can display websites containing Javascript on all three operating systems because Firefox contains the interpreter for those scripts. If you copy the .exe file of Firefox from Windows to Linux, you won't be able to run it because it's binary machine code created specifically for Windows systems.

Why scripting?

With scripting languages you generally have less control over the low-level parts of the program, such as memory access. At first that might sound bad, but it also avoids a lot of potential errors the programmer could make. Bugs in code that accesses memory directly can cause crashes or buffer overflows, which are very severe problems and often security issues as well. Interpreted scripts have an additional level of security because the scripts can only do what the interpreter allows them to do. Usually, if a script contains errors, an error message is printed to the screen and something in the application might not work as expected. It is much less likely that an attacker can compromise the user's system by finding an error in a script than by finding an error in native code.

From a security point of view, it would be very bad, if not completely insane, to download and execute untrusted native code. In addition, it would also be quite difficult because the operating system and processor architecture need to match. Some scripting languages, on the other hand, are designed for exactly that. Javascript can be used for websites, and every time you access an unknown website containing Javascript, you execute untrusted script code in your browser.

So let's sum it up.

Advantages of scripting
- More or less "untrusted" code can be executed with few security concerns. In our case this can be used for automatic downloads of random map scripts, user-defined mission scripts, user-defined AIs or even GUI mods.
- Less low-level functionality makes scripts less vulnerable to security issues.
- Scripts generally allow faster iteration in the development process.
- Scripts are easier for modders because they only need a text editor instead of a lot of development tools.
- In our case we attracted some developers with Javascript knowledge who probably wouldn't have joined us if we only used C++. This is probably different for other projects and not a general advantage of scripting, though.

Disadvantages of scripting

- The additional layer added by the interpreter and other characteristics of scripts make them run slower than native code.
- Maintaining a script interface and including a script interpreter in an application adds additional complexity and takes time.

Where do we use scripting?

Basically, the idea is to use scripting wherever it's possible and makes sense performance-wise. It doesn't really matter if it takes a few nanoseconds more until the game decides which piece of music to put next into the playlist. It can even do that before the previous piece of music stops playing, so it won't make a difference at all. On the other hand, it makes a difference for the player whether pathfinding of hundreds of units on a map with a million terrain tiles (little squares of terrain) runs in native code or in a slower script. If you order a unit to move somewhere and it takes half a second until it reacts, that's bad. The difficulty here is to find the right balance and to design sensible interfaces between native code and scripts.

GUI (Graphical User Interface)

Basically, the whole GUI can be modified by changing XML files and the Javascript scripts that are linked to actions in the GUI. The GUI system itself is written in C++, but XML files and scripts define how graphics are arranged and what happens when you click on a button. The Javascript code behind buttons might call native C++ functions that are exposed to scripts, though.
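To make that concrete, here is a minimal sketch of what a script behind a GUI button might look like. The function names and the Engine.StartGame call are invented for the example; the real GUI API differs, but the idea of a script handler calling a native function exposed by the engine is the same.

```javascript
// Hypothetical GUI script: the XML layout would bind this handler to a
// button's press action. Names are invented for illustration only.
function onStartGameButtonPress()
{
    // Plain script-side logic: figure out what the player selected.
    var mapName = getSelectedMapName();

    // Hand over to a native C++ function that the engine exposes to scripts
    // (hypothetical name; the real exposed functions differ).
    Engine.StartGame(mapName);
}

function getSelectedMapName()
{
    // In a real GUI script this would query the map selection widget;
    // hard-coded here to keep the sketch self-contained.
    return "Oasis 04";
}
```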
Random map scripts

Random map scripts are written in Javascript, which makes them easy to share; they are meant to be shared and will probably even be downloaded automatically from other players in the future. Random map scripts aren't very time-critical. The player shouldn't have to wait five minutes until a game starts, but it doesn't matter if it takes a few seconds longer. There was an idea recently that changed our view about that a bit, though: it would be nice to generate previews of random maps before starting the game. This changes the situation because the player might want to generate multiple maps before he's happy with the result. Now there are ongoing discussions about providing some commonly used functionality through native functions exposed to scripts.

AI (Artificial intelligence)

The AI is where using scripts is currently most controversial. The AI does a lot of calculations and works with a lot of data, which runs too slowly with the current scripts. Some people want to switch it to native C++ code completely, but that alone won't solve the problem. The better approach is probably switching the most performance-critical AI functions to C++ and thinking about how they could be optimized at the same time. The less performance-critical code can stay in the scripts. Of course this description of the solution is simplified, and there are a lot of other aspects to consider which make it more difficult than it might sound at first.

Gameplay logic

Gameplay logic includes everything that defines how the game plays, for example that there are units and that these units have properties such as "health" or "walk speed". Other examples of gameplay logic are technology research, battle detection (for playing battle music), the implementation of trade, or how building works (using more builders makes building faster, only some units are builders etc.). We use a system called the "simulation" for implementing gameplay logic and associated functionality. The simulation consists of components ("Health" or "Builder" for example). Components can be written either in C++ or in Javascript, so in this area we are quite flexible and can switch between a scripted and a native code implementation.
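As a rough illustration, a scripted component might look something like the sketch below. The registration call and the property names are simplified placeholders, not necessarily the exact interface our simulation uses.

```javascript
// Rough sketch of a scripted simulation component. The names and the
// registration call are placeholders; the real component interface may differ.
function Health() {}

Health.prototype.Init = function()
{
    // Starting hitpoints; a real component would read this from the entity template.
    this.hitpoints = 100;
};

Health.prototype.TakeDamage = function(amount)
{
    this.hitpoints -= amount;
    if (this.hitpoints <= 0)
    {
        // A real component would notify other components or destroy the entity here.
        this.dead = true;
    }
};

// Placeholder registration: make the component known to the engine so it can be
// attached to entities and updated every simulation turn.
Engine.RegisterComponentType("Health", Health);
```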
Scripting integration

Scripting language

I've already mentioned that we have chosen Javascript as our scripting language. There were a few other candidates like Lua, and I can't really tell why we favoured Javascript. It's a very feature-rich and flexible scripting language that is broadly supported and known by many people. While flexibility is its strength, it's probably also its weakness: this flexibility makes it a bit slower and more difficult for the scripting engine to optimize. It also has some quirks that modders and programmers need to know about in order to avoid trouble. In practice it's hard to compare performance to other scripting languages because benchmarks usually differ a lot from the code that is actually used in real-world applications.

Scripting engine

There are multiple scripting engines available for Javascript. These are all designed for maximum performance, and there's an ongoing competition between companies like Apple, Google, Microsoft and Mozilla about who has the fastest web browser and the fastest Javascript engine. These engines go way beyond running Javascript in a script interpreter as described earlier in this article. They gather information about the script code and optimize it while it's running. They even create platform-specific assembly code, which is quite similar to what a C++ compiler generates. This is a very complex task, and all these companies put a lot of man-years and money into their scripting engines.

The engines we could use are mainly Google's V8 and Mozilla's Spidermonkey. Other engines are not compatible with our licenses, only work on a specific platform or aren't available as standalone libraries. We chose Spidermonkey years back, and it would be a lot of work to change to another library today if we wanted that for some reason. Here the issue with benchmarks is basically the same: there are benchmarks available, but we had to learn the hard way that those benchmarks don't really reflect the performance in our game, and we have to conclude that there's no easy way of telling whether V8 would be faster or slower than Spidermonkey in our case. More on that later.

Spidermonkey upgrade

Javascript is based on the ECMA standard, but the API (application programming interface) isn't standardized across different scripting engines, and not even from one version of a scripting engine to the next. This means that changing to another script engine or updating to a newer version is a very time-consuming task.

We are currently using Spidermonkey v1.8.5, which is the version used in Firefox 4. We have seen benchmarks showing major improvements since Firefox 4 and read a lot of technical articles describing new features designed solely to improve this performance even further. There was the addition of type inference in Firefox 9, the new JIT compiler called IonMonkey was added in Firefox 18, and a new baseline compiler was added in Firefox 23.

We somehow expected that these benchmarking results would be a bit exaggerated and that the results would be less spectacular in real-world applications like 0 A.D. Still, I was very disappointed after spending dozens of hours adapting to the new version just to notice that the performance in our case is even below the level of version 1.8.5. I've spent even more time and got help from a Mozilla developer, but even though we were able to fix some performance-related issues in Spidermonkey, we haven't yet managed to get back to the performance before the upgrade. It might be that the big changes made recently in Spidermonkey still need some more time until they are polished enough to improve the performance of normal applications and scripts, which are not as well tested by the Mozilla developers as the benchmarks. On the other hand, we have measured quite a lot of different code parts which more or less all showed the same result as v1.8.5, so I don't expect a huge speedup anyway.

The following graph shows the duration of simulation turns in milliseconds. A simulation turn happens every 200 ms by default, updates all the simulation components and does the AI calculations. Basically, the graph shows the performance as the game progresses, going from the left side to the right. Lower values are better. Check the detailed description below the graph for details!

Y-axis: duration in milliseconds
X-axis: turn number divided by 20

This was measured using a non-visual replay which only runs the simulation code and executes AI calculations. It's described in our wiki if you want to do such measurements yourself. Rendering overhead and network performance are not involved because no visuals are displayed and everything runs locally. The game here was a 2vs2 AI game with Aegis bots on the map Oasis 4. You can use this command to watch the AIs fight in the setup that was used for the measurements (but the real measurements were done using the non-visual replay):

./pyrogenesis -quickstart -autostart="Oasis 04" -autostart-ai=1:aegis -autostart-ai=2:aegis -autostart-ai=3:aegis -autostart-ai=4:aegis

Not everything on this graph runs in JS. The pathfinding and some simulation components are written in C++, but both measurements run on the same version of our code, and the only difference between the two graphs is the Spidermonkey version.

Measuring is a difficult topic, and this measurement too has a factor that makes it not 100% accurate. The new Spidermonkey changes the iteration order of some types of loops under some circumstances, which causes the AI to behave slightly differently than with v1.8.5. For example, if it looks for an idle worker, it might pick another one and send him/her to gather wood. There seems to be no easy way to change this back to the old behaviour for the measurements without affecting the performance negatively.

You notice four things when looking at that graph:

1. The longer the game runs, the slower it gets. This is mainly due to more units being produced, which means it takes longer to filter the units by specific criteria, more units move and need to find paths somewhere, more units are involved in range calculations, etc.
2. It's definitely too slow. Later on, the simulation update takes close to or even more than 200 ms, which means it will run in each frame. If something that takes more than 200 ms runs in each frame, you won't get more than 5 FPS (1000 ms / 200 ms = 5 FPS).
3. As I already said, the new Spidermonkey is slower.
4. Somewhere after 6000 turns (300 * 20) the performance drops drastically with the new Spidermonkey. I still have to figure out why this happens. Until now I've looked more at the peaks that happen between turn 0 and 6000.

One positive exception regarding performance is random map loading, which is now much faster with the new Spidermonkey for those configurations I tested! The exact command used for this measurement was (and I added some code to print the duration):

./pyrogenesis -autostart=alpine_lakes -autostart-random=123 -autostart-size=256

Conclusion regarding performance

The conclusion for 0 A.D. is that we have to redesign our current AI interface a bit. Long-running, performance-intensive tasks need to run in native C++ code, because only this way do we get the fine-grained control we need for optimization and the performance we need. The challenge we face will be how the commonly used performance-intensive tasks can be separated and moved to a C++ API without losing the flexibility of scripting in our AI.

Even though the performance aspect of the upgrade was a failure, it allowed us to see how far Javascript can get performance-wise, and we no longer have the expectation that it could suddenly run twice as fast with a new version of the script engine. We hope that the competition for better bench-marketing results between the major browser manufacturers will stop sooner rather than later. For us, a stable API would be much better than some new features aimed at performance improvements.

Why the update is still required:

- We know that Javascript performance most likely won't improve a lot and can design our code accordingly.
- The Javascript library needs to stay up to date for security reasons. This is not a real issue at the moment, but it will become more important once we release the first stable version of 0 A.D.
- Linux distributions don't like to bundle third-party libraries with applications, and they won't provide Spidermonkey v1.8.5 packages forever.
- It's important to know about several multi-threading related aspects that changed since v1.8.5 in order to design multi-threading sensibly for the parts that use scripting.
- The new version fixes bugs and adds new features.
- A cleanup of our scripting-related code was necessary. We used way too many different approaches, which makes adapting to API changes harder, results in duplicated code and makes bugs more likely to occur. These issues mostly grew historically.
- The new Spidermonkey is stricter about Javascript syntax and prints more warnings and errors that point developers or modders to potentially buggy code. We have already discovered some issues thanks to the new warnings.
- If we need any features or bug fixes in Spidermonkey, we need an updated version because Mozilla won't make any changes to v1.8.5.

The next steps

There's a work-in-progress patch for integrating the current Spidermonkey 27 development version into 0 A.D. This patch still needs a lot of work before it's ready, and I'm going to commit independent parts of it step by step, as I already did with some patches. I'm not yet sure exactly which version we will use.
We should aim for an "ESR" release, which is supported longer and for which a standalone library will be packaged by most Linux distributions. The problem is that the next ESR version will be released somewhere around July 2014, which is too late. On the other hand, it would be a waste to revert the API adaptations from ESR24 to v27 that are already included in the current patch.

Links / Resources

- Trac ticket for the Spidermonkey upgrade with a list of related tickets: #1886
- Forum thread: Spidermonkey upgrade
- Bugzilla meta-bug for performance fixes in Spidermonkey: Bug 897962
infyquest Posted October 19, 2013

Is Google V8 compatible with 0ad architecture?
Yves Posted October 19, 2013

> Is Google V8 compatible with 0ad architecture?

We are using some Spidermonkey-specific JS code, and the whole API is different, so it would be more work to migrate to V8 than to upgrade Spidermonkey. Also, as far as I know the V8 API isn't stable either, so that part of the problem would stay the same.

I think nobody would spend more than 100 hours on a switch to V8 without knowing if there would be any benefit at all. V8 might be better in a lot of benchmarks, but we have seen that benchmarks aren't directly related to the performance we get in the game. I wouldn't risk a guess as to whether it would perform better or worse for us.
infyquest Posted October 20, 2013

So WebKit can't be considered due to licensing, and re-engineering for V8 takes a lot of time and effort.
Yves Posted October 20, 2013

> So WebKit can't be considered due to licensing, and re-engineering for V8 takes a lot of time and effort.

I think WebKit/SquirrelFish should be compatible with our license, but it would also take a lot of time and effort to migrate. I don't know if its API is more stable. The most important reasons for switching to another engine would be the quality of the documentation (which is very bad for Spidermonkey) and the stability of the API. Anyway, I think it currently isn't an option because of the sheer amount of work required for the migration.
abral Posted October 20, 2013

Have you considered writing some of the AI modules in C++ and translating them to asm.js using Emscripten? This should give you a good speedup without creating another API.
infyquest Posted October 21, 2013

I think moving some parts of the AI code to C++ is under consideration. We might have to wait for beta 16 or 17 for it.
Yves Posted October 24, 2013

I'm mostly offline during this week and the next, so there might be some delay in answering questions.

I have thought about the asm.js suggestion, but I came to the conclusion that it doesn't make a lot of sense in our case.

First, there are practical reasons. Asm.js was designed to overcome some of the performance limitations of Javascript, such as the fact that more or less everything is a typeless object with an unknown number of properties that can theoretically change at any time. Asm.js is basically valid Javascript with more information added to make it easier for compilers to understand and optimize. At the same time, it makes the code harder for programmers to read and understand, it adds an additional step that is similar to compiling native code, and it requires more tools and more knowledge to write. The only advantage that remains compared to native code is the additional layer of security, but if the whole Javascript engine gets more complex, that also brings security risks and more code which can contain bugs. In addition to that, I have become very careful about benchmarks, and I don't know how much asm.js would improve the performance in our case.

Beyond these practical reasons, it's also conceptually strange to use asm.js in our case. Asm.js was only designed because it would be nearly impossible to introduce a new scripting language that is better designed for performance in a short time and get all browsers to support it. That's why they had to use something that's already there and already supported. If a browser doesn't support asm.js, it can still execute the JS code, just not as fast as it could with asm.js support. We aren't tied to web standards, so we could just switch to another scripting language (theoretically; it would be a lot of work!) or write the performance-intensive parts in C++.
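To give an idea of what that extra information looks like, here is a tiny hand-written asm.js-style sketch. Real asm.js code is normally generated by a compiler such as Emscripten rather than written by hand, and this fragment only illustrates the type annotations.

```javascript
// Tiny hand-written asm.js-style module, for illustration only.
function DistanceModule(stdlib)
{
    "use asm";                       // marks the module for asm.js validation

    var sqrt = stdlib.Math.sqrt;     // import from the standard library

    function distance(x1, y1, x2, y2)
    {
        // The "+x" coercions tell the compiler these are doubles
        // ("x|0" would mark 32-bit integers instead).
        x1 = +x1; y1 = +y1; x2 = +x2; y2 = +y2;
        var dx = 0.0, dy = 0.0;
        dx = x2 - x1;
        dy = y2 - y1;
        return +sqrt(dx * dx + dy * dy);
    }

    return { distance: distance };
}

// The module is still ordinary Javascript: engines without asm.js support
// simply run it without the extra optimizations.
var dist = DistanceModule(this).distance(0.0, 0.0, 3.0, 4.0);  // 5
```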
Jubalbarca Posted October 29, 2013

The question (which has been hounding the AI guys since I was doing it!) is what you give up in flexibility to allow better performance.

If I recall, the main issues tend to be things like distance calculations and movement stuff (in particular things like resources, where one was continually cycling through all the workers on the map, then for each of those calculating distances to every tree on the map to work out which was closest), but I don't know how integrated those are now anyway; I'm pretty out of the loop.

Out of interest, as a general ballpark idea, if one literally hardcoded the entire AI in C++, would all the above times become negligible/near instantaneous? I don't really know how big the JS penalty is and how much it's just that the AI is making too many calculations - because the alternative to pushing more into C++ is to work out a way to do less to start with.
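For readers who haven't looked at AI code, the pattern described above is essentially a nested loop like the invented sketch below (not the actual AI code): its cost grows with workers × trees, which is why it shows up so prominently regardless of whether it runs in JS or C++.

```javascript
// Invented illustration of the pattern described above, not the actual AI code:
// for every idle worker, scan every tree on the map to find the closest one.
function assignIdleWorkers(workers, trees)
{
    for (var i = 0; i < workers.length; i++)
    {
        var worker = workers[i];
        if (!worker.idle)
            continue;

        var best = null;
        var bestDist = Infinity;
        for (var j = 0; j < trees.length; j++)
        {
            var dx = trees[j].x - worker.x;
            var dy = trees[j].y - worker.y;
            var dist = dx * dx + dy * dy;   // squared distance is enough for comparisons
            if (dist < bestDist)
            {
                bestDist = dist;
                best = trees[j];
            }
        }
        if (best)
            worker.gatherTarget = best;     // hypothetical field; a real AI would issue a command
    }
}
```

With a few hundred workers and a few thousand trees, that is on the order of a million distance calculations per pass.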
Yves Posted November 1, 2013

> Out of interest, as a general ballpark idea, if one literally hardcoded the entire AI in C++, would all the above times become negligible/near instantaneous? I don't really know how big the JS penalty is and how much it's just that the AI is making too many calculations - because the alternative to pushing more into C++ is to work out a way to do less to start with.

Removing unnecessary calculations always has first priority, of course. I don't think it would become "instantaneous" simply by moving the code to C++. Optimizing algorithms and data structures is probably easier in C++ than in JS, though. JS performance depends very much on your code, how well some Spidermonkey optimizations apply to it and how well the JIT compiler understands it. Unfortunately, that's impossible to predict without deep knowledge of Spidermonkey internals, and even with that knowledge it's very hard. You see that if you read some of the articles from JS engine hackers who explain why a piece of code is fast or slow.

Another disadvantage of optimizing JS compared to optimizing C++ is that your performance gain is very "fragile": a little change in the script engine's internals could suddenly make your code half as fast.

There's not just a constant overhead for scripting. The bigger performance problem is caused by the untyped and dynamic nature of Javascript.
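A small invented example of what "untyped and dynamic" means for the engine:

```javascript
// The same function can be called with different types, so the JIT cannot
// simply assume numeric arithmetic and keep a single specialized version.
function add(a, b)
{
    return a + b;
}

add(1, 2);        // numbers: can be specialized to fast numeric code
add("1", "2");    // strings: the numeric specialization no longer applies

// Objects are just as dynamic: properties can appear or change type at runtime,
// so a property access cannot always be compiled down to a fixed memory offset.
var unit = { x: 10, y: 20 };
unit.z = 5;       // a new property changes the object's layout at runtime
unit.x = "ten";   // the type of an existing property changes too
```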
FeXoR Posted November 11, 2013

Concerning the RMS API change (includes discussions about this): http://trac.wildfiregames.com/ticket/1834

I don't like the idea of making RMS scripts, which are not time-critical with a fixed map preview (as it is now), time-critical just for the map preview. I don't object to optionally adding a generated map preview (with the ability to pick the seed), but I'd keep the static map preview by default. If players really want to handpick the seed, that would be possible (or just with the ability to pick a seed by choosing it in Atlas). Making RMS time-critical would reduce the possible complexity of RMS (or, if changed to C++, the simplicity of RMS modding). Both would result in less variety, which would be sad IMO.
wraitii Posted November 11, 2013

AI overhead mostly comes from inefficient data storage, as JS sucks at that. Filtering, pruning, iterating: all fairly advanced stuff that could be done well with (perhaps a lot of) work in C++ but basically can't be done in JS beyond a basic level.
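To illustrate the storage point with an invented example: AI data is typically kept as one dynamic object per entity, which carries per-object overhead and indirection, whereas the closest plain JS gets to the contiguous layout a C++ implementation would use is flat typed arrays, which are much less convenient to filter and iterate over.

```javascript
// Invented illustration of the storage trade-off, not actual AI code.

// One dynamic object per entity: convenient, but each object carries its own
// dynamic property storage and the array only holds references to them.
var entities = [];
for (var i = 0; i < 1000; i++)
    entities.push({ id: i, x: Math.random(), y: Math.random(), owner: 1 });

// Flat typed arrays ("struct of arrays"): compact and cache-friendly, but all
// filtering and pruning has to be written against raw indices by hand.
var count = 1000;
var xs = new Float64Array(count);
var ys = new Float64Array(count);
var owners = new Uint8Array(count);
for (var j = 0; j < count; j++)
{
    xs[j] = Math.random();
    ys[j] = Math.random();
    owners[j] = 1;
}
```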