
Ykkrosh

WFG Retired
  • Content Count

    4,928
  • Joined

  • Last visited

  • Days Won

    6

Ykkrosh last won the day on October 31, 2013

Ykkrosh had the most liked content!

Community Reputation

42 Excellent

2 Followers

About Ykkrosh

  • Rank
    Primus Pilus

Profile Information

  • Gender
    Male

Recent Profile Visitors

2,142 profile views
  1. Ah, looks like the ALPHA_PLAYER is the problem - that still needs to be 8-bit alpha. I think there are approximately no cases where we can use 1-bit alpha formats (since even if the original texture has only 1-bit alpha, its mipmaps will need higher alpha resolution). I think we currently use no-alpha DXT1 (4bpp) for any textures whose alpha channel is entirely 0xFF, and DXT5 (8bpp) for everything else.
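     Concretely, that format choice can be a single pass over the alpha channel - roughly like this sketch (ChooseFormat is a hypothetical helper, not our actual code):

        // Hypothetical format chooser: no-alpha DXT1 (4bpp) only if every
        // alpha byte is 0xFF; anything else falls back to DXT5 (8bpp).
        #include <cstddef>
        #include <cstdint>

        enum class TexFormat { DXT1_NoAlpha, DXT5 };

        TexFormat ChooseFormat(const uint8_t* rgba, size_t numPixels)
        {
            for (size_t i = 0; i < numPixels; ++i)
                if (rgba[i * 4 + 3] != 0xFF) // any non-opaque pixel
                    return TexFormat::DXT5;
            return TexFormat::DXT1_NoAlpha;
        }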
  2. Mmm... Having read the ASTC spec, I'm not sure there's much chance of a real-time encoder with decent quality - there's a huge number of variables that the encoder can pick, and if it tries to pick the optimal value for each variable then it's probably going to take a lot of time, and if it doesn't then it's just wasting bits. ETC2 might be more appropriate for that case, since it's much simpler and will presumably take much less encoding time to get close to optimal output. Might be interesting to do a test of quality vs encoding speed though, to see if that's true.
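     If anyone wants to try that test: the speed side is just timing the encoder, and the quality side only needs a metric like PSNR over the round-tripped image. A minimal sketch of the metric:

        #include <cmath>
        #include <cstddef>
        #include <cstdint>

        // PSNR in dB between the original RGBA8 image and the
        // compressed-then-decompressed result; higher is better,
        // and ~40 dB is usually close to visually lossless.
        double ComputePSNR(const uint8_t* orig, const uint8_t* dec, size_t numBytes)
        {
            double mse = 0.0;
            for (size_t i = 0; i < numBytes; ++i)
            {
                double d = double(orig[i]) - double(dec[i]);
                mse += d * d;
            }
            mse /= double(numBytes);
            if (mse == 0.0)
                return INFINITY; // identical images
            return 10.0 * std::log10(255.0 * 255.0 / mse);
        }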
  3. The Android Extension Pack requires ASTC. The AEP is not mandatory, but GLES 3+ isn't mandatory either - what matters is what the GPU vendors choose to support. (I think even GLES 2.0 wasn't mandatory until KitKat, but pretty much everyone supported it long before that.) ASTC does seem like it will be widely supported in the future - Mali-T622, PowerVR Series6XT, Adreno 420, Tegra K1. Support is not great right now, but any decent Android port of the game is going to be a long-term project, and if we're going to spend effort on new texture compression formats I think it may be better to spend it on ASTC.
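     (Detecting ASTC at runtime is just an extension-string query - a simplified sketch; a robust version would tokenise the string instead of using strstr:)

        #include <cstring>
        #include <GLES2/gl2.h>

        // True if the driver advertises LDR ASTC support.
        bool HasASTC()
        {
            const char* exts =
                reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
            return exts && std::strstr(exts, "GL_KHR_texture_compression_astc_ldr");
        }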
  4. That looks like a non-GPL-compatible (and non-OSI) licence, e.g. you're only allowed to distribute source in "unmodified form" and only in "software owned by Licensee", so it's not usable. That just calls into etcpack, so it has the same licensing problems. Licence looks okay, but it sounds like quality may be a problem (from the blog post: "As for the resulting image quality, my tool was never intended for production usage").
  5. Hmm, ETC2 sounds like it might be good - same bitrate as S3TC (4bpp for RGB, 8bpp for RGBA) and apparently better quality, and a reasonable level of device support. (Integrating it with the engine might be slightly non-trivial though - we use NVTT to do S3TC compression but also to do stuff like mipmap filtering and to generate the DDS files, so we'd need to find some suitably-licensed good-quality good-performance ETC2 encoder and find some other way to do the mipmaps and create DDS etc.)
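     If we did have to do the mipmaps ourselves, the bare-minimum replacement for NVTT's filtering is a 2x2 box filter per level - a sketch, assuming even (in practice power-of-two) dimensions; NVTT's Kaiser filter gives better results:

        #include <cstdint>
        #include <vector>

        // Downsample an RGBA8 image to half size with a 2x2 box filter.
        std::vector<uint8_t> HalveRGBA8(const std::vector<uint8_t>& src, int w, int h)
        {
            int hw = w / 2, hh = h / 2;
            std::vector<uint8_t> dst(size_t(hw) * hh * 4);
            for (int y = 0; y < hh; ++y)
                for (int x = 0; x < hw; ++x)
                    for (int c = 0; c < 4; ++c)
                    {
                        int sum = src[((2*y)   * w + 2*x)   * 4 + c]
                                + src[((2*y)   * w + 2*x+1) * 4 + c]
                                + src[((2*y+1) * w + 2*x)   * 4 + c]
                                + src[((2*y+1) * w + 2*x+1) * 4 + c];
                        dst[(y * hw + x) * 4 + c] = uint8_t((sum + 2) / 4);
                    }
            return dst;
        }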
  6. Oh, sure, but then it's just a problem of the game being fundamentally not designed to run on that kind of hardware, which is a totally different problem to a memory leak bug (though probably a harder one to fix). It's not available on any devices I've used (Adreno 320/330, Mali-400MP, VideoCore) - as far as I'm aware, only NVIDIA supports it on mobile. But I think the game will use it automatically whenever it's available. (public.zip always contains S3TC-compressed textures, and the game will detect whether the drivers support S3TC, and if not then it will just decompress the textures in software.)
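     (The software fallback is feasible because S3TC is simple to decode - a sketch of one DXT1 block, as a hypothetical helper; the real loop runs over every 4x4 block in the texture:)

        #include <cstdint>

        // Decode one 8-byte DXT1 block into a 4x4 RGBA8 output.
        void DecodeDXT1Block(const uint8_t* block, uint8_t out[4 * 4 * 4])
        {
            auto expand565 = [](uint16_t c, uint8_t rgb[3]) {
                rgb[0] = uint8_t(((c >> 11) & 0x1F) * 255 / 31);
                rgb[1] = uint8_t(((c >> 5)  & 0x3F) * 255 / 63);
                rgb[2] = uint8_t(( c        & 0x1F) * 255 / 31);
            };
            uint16_t c0 = uint16_t(block[0] | (block[1] << 8));
            uint16_t c1 = uint16_t(block[2] | (block[3] << 8));
            uint8_t palette[4][4]; // four RGBA palette entries
            expand565(c0, palette[0]);
            expand565(c1, palette[1]);
            palette[0][3] = palette[1][3] = palette[2][3] = palette[3][3] = 255;
            for (int c = 0; c < 3; ++c)
            {
                if (c0 > c1) // 4-colour mode: two interpolated entries
                {
                    palette[2][c] = uint8_t((2 * palette[0][c] + palette[1][c]) / 3);
                    palette[3][c] = uint8_t((palette[0][c] + 2 * palette[1][c]) / 3);
                }
                else // 3-colour mode: midpoint plus transparent black
                {
                    palette[2][c] = uint8_t((palette[0][c] + palette[1][c]) / 2);
                    palette[3][c] = 0;
                }
            }
            if (c0 <= c1)
                palette[3][3] = 0; // entry 3 is transparent in 3-colour mode
            for (int i = 0; i < 16; ++i)
            {
                // 2-bit palette index per pixel, LSB-first within each row byte
                int idx = (block[4 + i / 4] >> ((i % 4) * 2)) & 0x3;
                for (int c = 0; c < 4; ++c)
                    out[i * 4 + c] = palette[idx][c];
            }
        }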
  7. 1GB isn't that much, depending on the map and game settings - if I run on desktop Linux with nos3tc=true (to mirror the lack of working texture compression on Android), on "Greek Acropolis (4)" with default settings, I see resident memory usage ~1.3GB. With S3TC enabled it's ~0.6GB. I think that indicates there's a huge volume of textures, and we don't (yet) have any options to decrease the texture quality or variety, so you just need a load of RAM. (Texture compression is kind of critical for Android, but ETC1 is rubbish and doesn't support alpha channels, and nothing else is universally available.)
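     The arithmetic behind those numbers: RGBA8 is 32bpp versus DXT1's 4bpp, and a full mipmap chain multiplies the base size by 4/3 (1 + 1/4 + 1/16 + ...). A quick sanity-check sketch:

        #include <cstdio>

        // Approximate memory for a square texture with a full mipmap chain.
        double TextureBytes(int size, double bitsPerPixel)
        {
            double base = double(size) * size * bitsPerPixel / 8.0;
            return base * 4.0 / 3.0; // geometric series over mip levels
        }

        int main()
        {
            // A 2048x2048 texture: ~21.3 MB uncompressed RGBA8, ~2.7 MB as DXT1.
            std::printf("RGBA8: %.1f MB\n", TextureBytes(2048, 32.0) / (1024 * 1024));
            std::printf("DXT1:  %.1f MB\n", TextureBytes(2048, 4.0)  / (1024 * 1024));
        }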
  8. 12.04 is supported by Canonical for 5 years, i.e. until 2017. Over the past 6 months we had roughly the same number of players on 12.04, 13.10, and 14.04 (~3000 on each, who enabled the feedback feature). That time period overlaps the release of 14.04, so there would probably be different results if we waited another few months and checked the latest data again, but it suggests there's likely still a significant number of people on 12.04 today. So we shouldn't drop support for 12.04 without significant benefits (and I don't think there are any significant benefits here, since 12.04 / GCC 4.6 suffices).
  9. About 10% of our Windows players over the past 6 months were on WinXP (and that's about 2/3 as many players as for all versions of OS X), so I think it's important to continue supporting running the game on XP for now. I'm not too concerned about dropping it as a development platform though.
  10. To integrate properly and straightforwardly with distro package managers, we should only use the default compilers in the distros we care about, which I think currently means GCC 4.6 (for Ubuntu 12.04). On Windows, VS2010 should be sufficient for compatibility with SpiderMonkey (since I think that's what Mozilla still uses), but many of the more interesting C++11 features are only available in VS2012 or VS2013. One problem there is that VS2012 won't run on WinXP, so any developers currently using WinXP (I've got data that suggests there are still a few) will have to upgrade to Win7 or above. (Pl…
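     In practice that means guarding newer features by compiler version - e.g. a sketch for range-based for, which needs VS2012 (_MSC_VER >= 1700) or GCC 4.6+; the macro name is illustrative:

        // Hypothetical feature-test macro for C++11 range-based for.
        #if defined(_MSC_VER)
        # define HAS_CXX11_RANGE_FOR (_MSC_VER >= 1700)   // VS2012 and later
        #elif defined(__GNUC__)
        # define HAS_CXX11_RANGE_FOR \
            (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 6)) // GCC 4.6+
        #else
        # define HAS_CXX11_RANGE_FOR 0
        #endif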
  11. The alpha 16 package is apparently in wheezy-backports - can you install it from there (possibly with these instructions)? That should make sure you get all the dependencies too.
  12. There might be ways to use a GPU to implement certain kinds of pathfinding efficiently, but that article is not one - it looks like about the worst possible way you could use a GPU. If the maximum path length is N (with maybe N=1000 on our maps), it has to execute N iterations serially, and GPUs are very bad at doing things serially; and each iteration is a separate OpenGL draw call, and draw calls are very expensive. And it's only finding paths to a single destination at once, so you'd need to repeat the whole thing for every independent unit. It would probably be much quicker to just do the whole thing on the CPU.
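     For comparison, the CPU equivalent of those N serial GPU passes is a single breadth-first flood fill from the destination, which also serves every unit heading there at once - a sketch on a plain passability grid:

        #include <queue>
        #include <vector>

        // Breadth-first distance field: dist[i] is the number of steps from
        // cell i to the destination, or -1 if unreachable; every unit can then
        // descend the gradient. One O(cells) pass replaces N serial GPU passes.
        std::vector<int> DistanceField(const std::vector<bool>& passable,
                                       int w, int h, int destX, int destY)
        {
            std::vector<int> dist(size_t(w) * h, -1);
            std::queue<int> open;
            dist[destY * w + destX] = 0;
            open.push(destY * w + destX);
            while (!open.empty())
            {
                int i = open.front(); open.pop();
                int x = i % w, y = i / w;
                const int dx[4] = { 1, -1, 0, 0 }, dy[4] = { 0, 0, 1, -1 };
                for (int d = 0; d < 4; ++d)
                {
                    int nx = x + dx[d], ny = y + dy[d];
                    if (nx < 0 || nx >= w || ny < 0 || ny >= h)
                        continue;
                    int ni = ny * w + nx;
                    if (passable[ni] && dist[ni] == -1)
                    {
                        dist[ni] = dist[i] + 1;
                        open.push(ni);
                    }
                }
            }
            return dist;
        }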
  13. I believe I tried that when first implementing the algorithm. If I remember correctly, the problem was that sometimes a unit starts from slightly inside a shape or precisely on the edge, and the silhouette-point-detection thing will mean it doesn't see any of the points on that shape at all. That makes the unit pick a very bad path (typically walking to the corner of the next nearest building and then back again), which isn't acceptable behaviour.
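     A hybrid that might avoid that failure mode - cull silhouette points only for shapes the start point isn't touching - would hinge on a containment test like this (illustrative names, not the engine's code):

        // If the path's start point is inside or exactly on the edge of a
        // shape's expanded bounds, silhouette culling finds no silhouette
        // points for that shape, so keep all of its vertices instead.
        struct AABB { float x0, z0, x1, z1; };

        bool ContainsOrTouches(const AABB& b, float px, float pz)
        {
            return px >= b.x0 && px <= b.x1 && pz >= b.z0 && pz <= b.z1;
        }

        // for (const Shape& s : shapes)
        //     if (ContainsOrTouches(s.expandedBounds, start.x, start.z))
        //         addAllVertices(s);       // culling would miss this shape
        //     else
        //         addSilhouetteVertices(s, start);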
  14. The 4x4-navcell-per-terrain-tile thing and the clearance thing were part of the incomplete pathfinder redesign - #1756 has some versions of the original patch. None of it has been committed yet. (I think the redesigned long-range pathfinder mostly worked, but it needed integrating with the short-range pathfinder (which needed redesigning itself) and with various other systems, and that never got completed.)
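     For reference, the navcell idea is essentially a finer passability grid (4x4 navcells per terrain tile) with each obstruction expanded by the unit class's clearance before rasterisation, so the pathfinder can treat units as points - a sketch with illustrative names, not the patch's actual code:

        #include <algorithm>
        #include <cmath>
        #include <vector>

        const int NAVCELLS_PER_TILE = 4;

        // Mark the navcells covered by an axis-aligned square obstruction,
        // expanded by the querying unit class's clearance.
        void RasteriseSquare(std::vector<bool>& blocked, int gridW, int gridH,
                             float cx, float cz, float halfSize, float clearance,
                             float navcellWorldSize)
        {
            float r = halfSize + clearance; // expand by clearance
            int x0 = int(std::floor((cx - r) / navcellWorldSize));
            int x1 = int(std::ceil ((cx + r) / navcellWorldSize));
            int z0 = int(std::floor((cz - r) / navcellWorldSize));
            int z1 = int(std::ceil ((cz + r) / navcellWorldSize));
            for (int z = std::max(z0, 0); z < std::min(z1, gridH); ++z)
                for (int x = std::max(x0, 0); x < std::min(x1, gridW); ++x)
                    blocked[size_t(z) * gridW + x] = true;
        }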
  15. You probably need "./update-workspaces.sh ..." rather than just "update-workspaces.sh ..."