[Discussion] SpiderMonkey upgrade


I prefer updating to ESR24 first and leper favoured that approach too. Any other opinions?

Another option is to stay with 1.8.5 :P Has the rationale for upgrading changed since this post: http://www.wildfiregames.com/forum/index.php?showtopic=17289&p=269566 ? At least it seems we can rule out a general performance improvement; I'm not sure about GC improvements or AI threading. Is random map generation still faster?

Another option is to stay with 1.8.5 :P

In my opinion it's completely unacceptable to have such a big part of the game as the scripting engine frozen and tied to one specific version.

Any kind of issue we face will require more or less ugly workarounds or forking SpiderMonkey 1.8.5. Developing a JavaScript engine is not what we want, and forking would be a painful, pointless dead end. We wouldn't be able to use new features of the JavaScript language, and we wouldn't get bug fixes for the cloning problem, the OOS problems, or other problems we currently solve with workarounds or will face in the future.

Especially now that we are so much closer to the upgrade (my WIP patch is now only about 4000 lines long, compared to more than 24 000 lines a few months ago), stopping the upgrade would be complete nonsense.

I think AI threading would work without the upgrade. GC could be improved in the future but we will only really know that when/if it happens. ;)

I just found this link. It has a short description of what GGC (Generational Garbage Collection) should improve.

Upgrading to v24 or v26 (or the upcoming v27) would both be welcome. Just some more pros and cons of going with v26, since the choice seems to be v24:

Pros:

  • no performance regressions;
  • it's a newer version and Mozilla developers could be more involved in helping and improving it;
  • more time to progressively test and find bugs, updating to v27, v28, ... up until v31 is finally released;
  • can be easily patched with test patches if needed;
  • when v31 is eventually used, it can be compared against a version that is as fast as v1.8.5; if we use v24 now and upgrade to v31 later, v31 will be faster than v24, but it will be hard to tell whether it's faster than v1.8.5.

Cons:

  • Linux distros would have to compile against the embedded version. It would be better to avoid that, but it's not really a problem; other games do the same, e.g. SuperTuxKart (see here);
  • it adds more code to SVN; then again, there are other things that could be removed anyway, e.g. the bundled ENet.

Anyway, the long-term target is v31, with the option of using the system library.

In the meantime... thanks for all the months of work done so far! :)

Upgrading to v24 or v26 (or the upcoming v27) would both be welcome. Just some more pros and cons of going with v26, since the choice seems to be v24:

You have some good points. I plan to work on an integration branch after v24 is committed and keep it updated to the current SpiderMonkey development version.

This should take care of these points and also help us stay informed about changes in the API:

  • it's a newer version and Mozilla developers could be more involved in helping and improving it;
  • more time to progressively test and find bugs, updating to v27, v28, ... up until v31 is finally released;
  • can be easily patched with test patches if needed;

The following issues remain, but I think that's acceptable.

We have seen in the profiling above that most of the performance issues are fixed with newer versions.

  • no performance regressions;
  • when v31 is eventually used, it can be compared against a version that is as fast as v1.8.5; if we use v24 now and upgrade to v31 later, v31 will be faster than v24, but it will be hard to tell whether it's faster than v1.8.5.

I think it's your decision; nobody wants you to feel like all this work was for nothing. (That's perhaps how the SpiderMonkey devs feel now, seeing those results... but you already made that distinction a few posts above.)

Is it a lot of work to keep maintaining it? Then I'm for the upgrade.

If it makes the next upgrade easier and the next upgrade is planned, then go ahead with this one if you like.

If you think we should throw it all away because the upgrade has no benefit, so we'd better stick with the old version, and you are not upset about it, then I'm for that. But imagine what will happen once v31/v32 brings real performance improvements: everyone will ask you to hammer out another upgrade again... so I don't know what is best.

In my opinion it's completely unacceptable to have such a big part of the game as the scripting engine frozen and tied to one specific version.

Any kind of issue we face will require more or less ugly workarounds or forking SpiderMonkey 1.8.5. Developing a JavaScript engine is not what we want, and forking would be a painful, pointless dead end. We wouldn't be able to use new features of the JavaScript language, and we wouldn't get bug fixes for the cloning problem, the OOS problems, or other problems we currently solve with workarounds or will face in the future.

I think that all depends on how well the "frozen" version works, relative to how well the upgrade works. We could very well encounter new, worse bugs than the ones we currently have to deal with. That's a risk with any library, but most of the others we use are far more stable, don't have major rewrites of their internals and API from one version to the next, and are more general purpose instead of having just one major "user" in mind. We've been testing 1.8.5 for years, but what serious issues do we actually need patches to resolve? There is the weird JIT OOS issue recently discovered, but we have a workaround; I'm sure the upgrade will introduce similarly weird JIT OOS issues that take years to discover and work around.

But I was thinking most of the issues with our scripting engine boiled down to how we're using it, rather than the engine itself, i.e. we shouldn't have code that requires hypothetical SM JIT/GC performance improvements to be feasible - that code should simply go away :D

Especially now that we are so much closer to the upgrade (my WIP patch is now only about 4000 lines long, compared to more than 24 000 lines a few months ago), stopping the upgrade would be complete nonsense.

That's a sunk cost and shouldn't have any impact on the decision moving forward, on deciding whether it's the best option or not. Doing nothing more should always be an option under consideration.

GC could be improved in the future but we will only really know that when/if it happens. ;)

Maybe it won't; everyone was certain that a newer version of SM would bring performance improvements, but it didn't... so I'll remain skeptical for now about the GC changing for the better :)

I feel like this is getting metaphysical, and the arguments could be made about anything old and new with very little rephrasing.

Regarding the JIT issue, most of the difficulty was discovering that it was a JIT issue in the first place, so it would likely be much easier to debug such issues this time (though new kinds of issues could, it's true, arise).

What we have already discovered is that the new SpiderMonkey is stricter about lax syntax and things like that, which are sources of weird, unpredictable, and hard-to-debug issues. We also know that it supports multi-threading to a greater extent.

And regarding performance, there are several things beyond just compilation magic that could help us, such as the new built-in objects (Maps and Sets), which could possibly speed up the AI a lot. On 1.8.5, I do not believe the AI will ever be efficient enough.

GC is quite likely to get faster, particularly if it can be split into smaller chunks (which ought to be possible with the newer SpiderMonkeys).

If anything, this should be a separate branch on Git: somewhere you can just try it easily at some point, to see how things react to changes. If we're going to get economic about it, the opportunity cost of not trying the upgrade is just too high.

That's a sunk cost and shouldn't have any impact on the decision moving forward, on deciding whether it's the best option or not. Doing nothing more should always be an option under consideration.

Being closer to the upgrade means there's less work left to do and the risk of unacceptable problems showing up is reduced.

I think that all depends on how well the "frozen" version works, relative to how well the upgrade works. We could very well encounter new, worse bugs than the ones we currently have to deal with. That's a risk with any library, but most of the others we use are far more stable, don't have major rewrites of their internals and API from one version to the next, and are more general purpose instead of having just one major "user" in mind. We've been testing 1.8.5 for years, but what serious issues do we actually need patches to resolve? There is the weird JIT OOS issue recently discovered, but we have a workaround; I'm sure the upgrade will introduce similarly weird JIT OOS issues that take years to discover and work around.

But I was thinking most of the issues with our scripting engine boiled down to how we're using it, rather than the engine itself, i.e. we shouldn't have code that requires hypothetical SM JIT/GC performance improvements to be feasible - that code should simply go away :D

Although staying on version 1.8.5 is probably not too bad at the moment, it will become a big show stopper if we later figure out there's no way around upgrading.

I think it's quite likely that we will be forced to upgrade sooner or later.

If we base fundamental design decisions on v1.8.5, the work required for upgrading will increase the longer we wait.

If we upgrade now, there's still the chance that SpiderMonkey will change in a way that requires big architecture changes. That's bad, but keeping track of what's happening there, and maybe having a chance to influence it, is still much better than continuing to use v1.8.5 without having any idea of what's changing in future versions. That could help us avoid some pitfalls.

Unfortunately it's not as easy as quickly reading some changelogs with each SpiderMonkey release. I'm quite sure we wouldn't know about the changed relation between compartments and global objects if I hadn't worked on the upgrade.
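To illustrate that change for readers who haven't touched the JSAPI: as I understand it, in the newer SpiderMonkey versions every global object lives in its own compartment, and the embedder has to enter that compartment before touching anything owned by the global. A minimal sketch (the surrounding function is hypothetical; only JSAutoCompartment is existing API, assuming the SM24-era signatures):

```cpp
#include "jsapi.h"

// Sketch only: cx is an existing JSContext*, global a global object created
// by the engine. With SpiderMonkey ~24 every global lives in its own
// compartment, so we must enter that compartment before using its objects.
void DoSomethingWithGlobal(JSContext* cx, JSObject* global)
{
    JSAutoCompartment ac(cx, global);   // enter the global's compartment
    // ... use objects, call functions or run scripts owned by `global` ...
}   // leaving the scope restores the previously entered compartment
```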

I've tried raising the SpiderMonkey developers' awareness of library embedders. It won't stop them from making big changes if they think it's necessary for Firefox, but I hope they will try to make it as easy as possible for us to prepare for them. I really do see some change there: for example, they have started to provide supported standalone SpiderMonkey releases on a regular schedule.

Also, they expect the API to become more stable soon. I'm not completely convinced about this, but I still thought the most acceptable solution is to stick with SpiderMonkey and try to keep our version as close to the development version as possible (by upgrading SVN to ESR releases and keeping a dev branch that is synced about once a week).

Maybe it won't; everyone was certain that a newer version of SM would bring performance improvements, but it didn't... so I'll remain skeptical for now about the GC changing for the better :)

I partially agree here. I don't believe in promises of theoretical performance improvements anymore. On the other hand I expect the GC performance to be much more universal and less dependent on the scripts than JIT compiler performance.

I think we currently have enough bigger problems to work on than GC performance. On the other hand, we won't be able to completely avoid GC hiccups in the main thread with v1.8.5.

Wraitii has some good points too, IMO. For me the main argument is really to avoid being stuck with an old, "frozen" version without preparing our engine for future changes.

At the moment I'm trying to figure out if what we see on this graph is normal or not.

I've reduced the number of GCs a bit because each GC call causes a more noticeable delay than with v1.8.5 (there are many flags for tweaking the GC, though, and I haven't tested them much yet). This graph compares the JS memory usage of v1.8.5 and v26 during a 2v2 AI game. v24 looks more or less like v26, so I didn't add that graph here.

The used memory gets much higher because I've reduced the number of GCs. More interesting than the peaks are the lowest parts of the graphs (right after a garbage collection).

I'd say that's the amount of memory actually in use, not counting memory that could be freed by a GC.

It looks like v26 uses about twice as much memory as v1.8.5.

It could be explained by the additional information stored by type inference and the JIT compiler, but I don't know if such a big difference is normal.

[Graph: JS memory usage of v1.8.5 vs. v26 during a 2v2 AI game]
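For anyone who wants to produce a similar graph: the numbers can be sampled from the embedding side without patching SpiderMonkey. A minimal sketch, assuming the SM24/26-era JSAPI; the per-turn hook and the CSV format are made up for illustration, while JS_GetGCParameter and the JSGC_BYTES/JSGC_MAX_BYTES keys are existing API:

```cpp
#include <cstdint>
#include <cstdio>
#include "jsapi.h"

// Sketch: log the GC heap size once per simulation turn as CSV so it can be
// plotted afterwards. JSGC_BYTES reports the bytes currently used by the GC
// heap; JSGC_MAX_BYTES is the configured upper limit.
void LogJSMemoryUsage(JSRuntime* rt, int turn)
{
    uint32_t usedBytes = JS_GetGCParameter(rt, JSGC_BYTES);
    uint32_t maxBytes  = JS_GetGCParameter(rt, JSGC_MAX_BYTES);
    std::printf("%d,%u,%u\n", turn, usedBytes, maxBytes);
}
```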

EDIT:

I've dumped the JS heap at turn 1500 and ran a few grep commands to compare where the additional data in memory comes from.

I don't know yet what it means...

[Image: grep-based comparison of the JS heap dumps at turn 1500]

Yves, IMO the game gets really irritating when there are lag spikes. So more frequent but shorter GCs are more welcome than a few time-consuming GCs, even if the total time spent on GC is longer when you distribute it. But that aside, the memory usage after the SpiderMonkey upgrade is indeed strange. Maybe they changed pointers to 64 bit or something?

If anything, this should be a separate branch on Git: somewhere you can just try it easily at some point, to see how things react to changes.

I was thinking the same thing the other day, and it would be my preference, but until we've migrated to Git we can't really get everyone on board with that; especially people who rely on the Windows autobuild are stuck on SVN.

we should run the GC every turn for testing purposes.

GC is something we really need to think more about. Running it every turn may not be acceptable depending on what's going on, but it should be run regularly enough that it's not too slow. It seems likely that we need to be directly in control of when GC occurs, since SM's own metrics won't really take the impact on our engine into account.
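To make "directly in control" concrete, a minimal sketch of what a per-turn policy could look like on the engine side; the turn hook and the interval value are hypothetical, while JS_MaybeGC and JS_GC are existing JSAPI calls in the versions discussed here:

```cpp
#include "jsapi.h"

// Sketch: called once at the end of every simulation turn. Either let
// SpiderMonkey decide whether a collection is worthwhile right now
// (JS_MaybeGC), or force a full collection every N turns so the cost
// shows up at predictable points.
void OnSimulationTurnEnd(JSContext* cx, JSRuntime* rt, int turn)
{
    const int forcedGCInterval = 50;    // made-up value, needs profiling

    if (turn % forcedGCInterval == 0)
        JS_GC(rt);                      // full, forced collection
    else
        JS_MaybeGC(cx);                 // cheap check; collects only if needed
}
```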

Just so I know: how is the lag-killing progress going? I'm trying to get my entire class into 0 A.D., and it would be nice if I could tell them about that, since we all know lag is the main problem here. If everything were fine, this thread wouldn't exist.

I assume this question is related to the SpiderMonkey upgrade.

The current status is that the upgrade will make performance a little bit worse with v24.

Newer versions (I've tested v26) bring performance closer to what we had with v1.8.5, and maybe beyond that at some point.

Some new features of SpiderMonkey 24 (like JavaScript Maps, for example) could probably help to improve performance.

The memory usage problem can be solved by configuring the garbage collection correctly. With the incremental GC it should also be possible to spread the time required for garbage collection more evenly and avoid big lag spikes. I spent the whole day on it yesterday, but I haven't yet completely figured out how the GC needs to be configured.

There are tons of flags to set and different functions to call, and the documentation is lacking.
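For anyone who wants to experiment, here is a hedged sketch of the kind of configuration involved, using SM24-era parameter names as far as I know them; the values are placeholders and not tested recommendations:

```cpp
#include "jsapi.h"

// Sketch: configure the garbage collector for incremental collection so the
// work is split into short slices instead of one long pause. All numbers
// here are made up and would need profiling in the actual game.
void ConfigureGC(JSRuntime* rt)
{
    // Upper bound for the GC heap before allocations start to fail.
    JS_SetGCParameter(rt, JSGC_MAX_BYTES, 0xffffffff);

    // Run the GC incrementally rather than all at once.
    JS_SetGCParameter(rt, JSGC_MODE, JSGC_MODE_INCREMENTAL);

    // Rough time budget (in milliseconds) for a single incremental slice.
    JS_SetGCParameter(rt, JSGC_SLICE_TIME_BUDGET, 10);
}
```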

Good Job =) Hold on =)
