Android port


Guest afeder

Anyone with some Android expertise / interest who wants to stay in the loop on this, let me know. Of course I'll post here when I am successful. Defeat is not an option!

I'm definitely still interested in this. I'd like to give it another shot with the latest version of the code base. I don't have any OpenGL experience, and if nothing else, I'll at least learn something.

Link to comment
Share on other sites

If you mean the issue with an SVN or release version of the game, it may be fixed indirectly by moving to SDL 2.0 on Windows. But I was actually referring to a bug specific to using official SDL instead of our emulation in Windows - which nobody does, so it shouldn't apply to you :)

Ah, I was referring to this thread, which definitely falls under the former.

Link to comment
Share on other sites

  • 4 months later...

For all who are interested, Philip (Ykkrosh) did a little fiddling on the current code and got things (sort of) working again on his Nexus 7:

[Screenshot: android6.jpg]

I've also done a little (unrelated) work to get things compiling on normal Linux-on-ARM, and as a result, we now have Ubuntu and Debian packages of 0 A.D. for ARM.

  • Like 2
Link to comment
Share on other sites

This is only somewhat related to Android development, but just for laughs I tried to get 0 A.D. to run on a Raspberry Pi. It's really far too low-end a platform to be practical for actually playing 0 A.D., but it's still fun to see the game running on unusual platforms, especially one as cheap and low-power as a Pi. My Pi is a Model B with an ARM CPU, 512 MB shared RAM, and a 16 GB SD card, connected to a 1920x1080 monitor via HDMI and running a fully updated Raspbian OS. Pictures will come once (if) it's running stably.

In theory this is all possible: the Pi is an ARM device, Josh, Philip, and others have worked to get ARM support working nicely in 0 A.D. (originally for the Android port), and most of the game's dependencies are already available as ARM packages in Debian, on which Raspbian is based. The Pi also has an integrated GPU with OpenGL ES 2.0 support, which is partially working in 0 A.D. SVN.

I fixed up some of our rendering code that was broken on GLES, available on my GitHub repo here, and updated our SDL2 support here. Note that SDL2 is required for GLES support and has vastly improved support for mobile devices. The changes in these branches will eventually make their way to SVN, but for now they can be pulled by anyone wanting a somewhat working Android/GLES build (y)

My approach for getting the game to work on the Pi was to begin with a custom cross-compiling GCC toolchain, using distcc to distribute the bulk of the compilation onto an Ubuntu VM providing 4 cores, to avoid the Pi's incredibly slow CPU and disk I/O. crosstool-NG was used to build the actual cross-compiler toolchain, vaguely as described here. This proved to be quite a headache, not least getting my network configuration set up so the Pi could communicate with the Ubuntu distcc "slave".
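A rough sketch of the distcc side of this setup; the VM address, job counts, and toolchain tuple below are hypothetical examples, not the actual values used:

```shell
# Hypothetical addresses: the Ubuntu VM at 192.168.1.10 runs distccd and
# offers 4 jobs; "localhost/1" keeps one job on the Pi itself.
export DISTCC_HOSTS="192.168.1.10/4 localhost/1"
echo "$DISTCC_HOSTS"

# The game's build would then be pointed at the cross compiler, e.g.:
#   make -j5 CC="distcc arm-unknown-linux-gnueabi-gcc" \
#        CXX="distcc arm-unknown-linux-gnueabi-g++"
```

The `host/limit` syntax caps how many jobs are sent to each machine, which matters when the remote end is a small VM.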

Next I had to check whether all of the game's dependencies were available as ARM packages through Raspbian, or whether they could be built separately. Surprisingly, most of them were already available, though a few had different names than on Debian/Ubuntu. SDL2 was not available, so I had to download the source and compile it without X support. That's important because GLX, which SDL typically uses on *nix, is not supported by the Pi's proprietary GLES drivers, but SDL2 defaults to using X when available. Instead, rendering has to use the EGL backend on the Pi; luckily, SDL 2.0.1 has basic Raspberry Pi support that does all the Pi-specific initialization required. The only other missing dependency was NVTT; our bundled copy in SVN has patches to build on ARM.
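A build-configuration sketch of what "compile without X support" might look like with SDL2's autotools script; the exact option set used isn't stated above, and option names can vary between SDL2 versions, so treat this as an illustrative fragment rather than the actual recipe:

```shell
# Hypothetical SDL2 configure invocation for the Pi (not run here):
# disable X11 and desktop GL so SDL falls back to its EGL/GLES path.
./configure --disable-video-x11 \
            --disable-video-opengl \
            --enable-video-opengles
make && sudo make install
```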

Compiling software on the Pi is incredibly slow. With distcc and decently powerful slave machines it becomes bearable, but steps like preprocessing and linking will still be slow. If you can get a cross-compiling environment working on desktop Linux, that might be the way to go, and I might try that next. Once I found the right combination of options for SDL2 and premake, the process went smoothly, but there are a lot of problems to get that far. I would say it takes at least 45 minutes for a clean build of the game using distcc with my Ubuntu VM, compared to less than 3 minutes to build the game normally on the same VM. A clean SDL2 build takes maybe 5-10 minutes more, and the same for NVTT and FCollada (though neither is really needed, as will be seen below).

After getting the game to build on the Pi, the next challenge was making it run. Once I figured out how to build SDL2 with udev (required for keyboard and mouse) and EGL support, the next hurdle was that texture conversion was impossibly slow. I was staring at a mostly gray main menu screen with about 2 textures loaded for quite some time before I realized it wasn't frozen, just busy converting textures! It's also possible NVTT wasn't working 100% correctly as there were a lot of GL errors being logged. So I chose to create a public.zip archive like we use for releases, with all the textures, models, and XML precached, and copy it onto the Pi in place of the SVN mod data.

In the end, I got the game to load the main menu a few times (buggy as usual on low-end hardware, maybe due to the massive scrolling background textures with transparency?). Using autostart, I even got a game to load once before textures were working, but it typically fails pretty hard during loading. Either there are software bugs, it's running out of memory, or the hardware/power supply is being overstressed. It's hard to troubleshoot because whatever is going wrong kills the network connection (so I can't SSH into it) and locks up input, so I can't close the game window. It could even be an assertion failure in our code that is unable to minimize the window and release input like it normally would (Pi support in SDL2 only allows creating a fullscreen window, not changing mode to e.g. windowed).

I'm satisfied that the game can be built for, and run on, a Raspberry Pi, but it's not yet at the point where I can demonstrate it working reliably, if it ever will be. Hopefully it will work again and I can get some quality photos and screenshots, because I think it's a pretty cool demonstration of 0 A.D.'s versatility and of the capability of low-power ARM devices like the Pi :) Sadly, I don't have a smartphone with a nice CPU and GPU, which would be a much better test platform; but the nice thing is that I can develop, test, and debug on a $50 Pi much of the same code that would be reused by a $400 smartphone or tablet.

  • Like 1
Link to comment
Share on other sites

That's quite cool :)

It's also possible NVTT wasn't working 100% correctly as there were a lot of GL errors being logged.

NVTT doesn't touch OpenGL at all, so I think it's unlikely those errors are related. (I got GL errors running on Android too, I think because of invalid enums (but I didn't try to check exactly where).)

Either there are software bugs, it's running out of memory, or the hardware/power supply is being overstressed.

I think you can check if it's running out of graphics memory by doing something like "vcgencmd cache_flush; vcdbg reloc" which'll show all the GL textures and how much is free. Running out of graphics memory will probably make VideoCore go really really slow as it tries to shuffle things around in memory to free up some contiguous space, and then in theory it should fail and return an error but in practice it might just randomly corrupt memory since I don't think the RPi GL drivers have had much extensive testing. Either way, you probably want to avoid that situation.
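To show what reading that output might look like: the commands themselves only run on the Pi, and the summary line below is a hypothetical sample (the real `vcdbg reloc` format may differ), used just to illustrate pulling out the free total:

```shell
# On the Pi itself, the suggestion above would be run as:
#   vcgencmd cache_flush; vcdbg reloc
# vcdbg ends with a summary of the relocatable heap; this sample line is
# hypothetical, standing in for that output.
sample='100M free memory in 1 free block(s)'
free_mem=$(echo "$sample" | sed 's/M free memory.*//')
echo "${free_mem} MB of GPU memory free"   # -> 100 MB of GPU memory free
```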

It's hard to troubleshoot because whatever is going wrong kills the network connection (so I can't SSH into it) and locks up input, so I can't close the game window.

Maybe try using a serial console? That should work as long as the ARM hasn't totally locked up.

Sadly, I don't have a smartphone with a nice CPU and GPU, which would be a much better test platform; but the nice thing is that I can develop, test, and debug on a $50 Pi much of the same code that would be reused by a $400 smartphone or tablet.

The RPi's GPU is roughly equivalent in power to the Galaxy S II that I started the Android port on, which had nearly bearable performance :). But I guess you may be running at 1920x1080 while I was running at 800x480 (does SDL let you change the fullscreen resolution? (I think the hardware ought to be able to do arbitrary scaling for free)), and the CPU is massively slower though :(
Link to comment
Share on other sites

NVTT doesn't touch OpenGL at all, so I think it's unlikely those errors are related. (I got GL errors running on Android too, I think because of invalid enums (but I didn't try to check exactly where).)

You're right about that, they were invalid enum (0x0500) errors and unrelated to the other problems.

I think you can check if it's running out of graphics memory by doing something like "vcgencmd cache_flush; vcdbg reloc" which'll show all the GL textures and how much is free. Running out of graphics memory will probably make VideoCore go really really slow as it tries to shuffle things around in memory to free up some contiguous space, and then in theory it should fail and return an error but in practice it might just randomly corrupt memory since I don't think the RPi GL drivers have had much extensive testing. Either way, you probably want to avoid that situation.

Thanks, I was wondering which commands could tell me this. A few times malloc failed somewhere in the engine and it crashed; I've also seen GL out-of-memory errors (0x0505) before a crash. Another thing I suspect is disk access problems: I couldn't get the main menu to load at all until I switched from the public.zip to the SVN data, then I moved them back and it worked with no other changes. I tested the zip separately and it seems to be OK.

Last night I got a bit farther by decreasing the desktop resolution to 1024x768: the main menu background loaded and the game seemed more stable. The Pi platform code in SDL2 only creates a fullscreen window at the full desktop resolution, so I have to change the desktop resolution itself. I autostarted the blank default map and it loaded OK; the text rendering and LOS were correct. I have not yet been able to fully load a real game, but at least I can take some photos and screenshots now :) Performance varies; the menus are very slow to respond, of course, and game loading takes a long time, but the menu framerate hovers between 4 and 20 fps, so it's usable.
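Since SDL2's Pi backend inherits the desktop resolution, lowering it is done in the firmware config rather than in the game. A minimal sketch, using the standard Raspberry Pi `config.txt` framebuffer options (the 1024x768 values match the experiment above):

```shell
# Fragment for /boot/config.txt (standard Raspberry Pi firmware settings):
# force the desktop framebuffer to 1024x768 so the SDL2 fullscreen
# window is created at that size.
framebuffer_width=1024
framebuffer_height=768
```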

Next I would like to experiment with smaller, lower quality textures, perhaps by hacking the texture loading code as you mentioned in IRC. If I have time, I might look into improving Pi support in SDL2, so I can use arbitrary resolutions, toggle fullscreen mode, etc.

Link to comment
Share on other sites

If I have time, I might look into improving Pi support in SDL2, so I can use arbitrary resolutions, toggle fullscreen mode, etc.

How would non-fullscreen mode work? As far as I can see, SDL doesn't talk to any kind of window manager at all, it just uses dispmanx to set up a new EGL surface as being drawn on top of everything else (X11, console, etc). You could easily change that surface to be not fullscreen, but it would have no window decoration (since there's no window) and it'd be permanently on top of everything else (you couldn't switch to another window).

I suppose maybe you could make a Wayland backend for SDL work on RPi, so that it does run in a proper window - do you mean something like that? Sounds non-trivial but would be nice :)

Scaling a low-res EGL surface to desktop resolution would hopefully work with just changing src_rect/dst_rect in SDL_rpivideo.c, but I guess you'd need to do an inverse mapping of mouse coordinates, so maybe that'd get a bit messier than I thought :(
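The inverse mapping is just the scale factor applied backwards: a pointer position in desktop (dst_rect) space divided by the dst/src ratio gives the surface (src_rect) position. A minimal sketch with hypothetical resolutions:

```shell
# Hypothetical setup: an 800x480 EGL surface (src_rect) scaled up to a
# 1920x1080 desktop (dst_rect). A mouse event arrives in desktop
# coordinates and must be mapped back into surface coordinates.
src_w=800;  src_h=480
dst_w=1920; dst_h=1080
mouse_x=960; mouse_y=540            # pointer at the centre of the screen

game_x=$(( mouse_x * src_w / dst_w ))
game_y=$(( mouse_y * src_h / dst_h ))
echo "$game_x $game_y"              # -> 400 240
```

Integer division is fine here since the result only needs pixel precision, though rounding at the edges is where this tends to get messy.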

Link to comment
Share on other sites

It would be best to resize all the textures to 64x64 to use less VRAM, thus preserving RAM. (It would be nice to implement that in the game itself, e.g. resizing the textures before loading the map, instead of resizing them manually)

Also, I think we should simplify the menus (no background animations, just a wallpaper).

Would it be an idea to create a map using custom, ultra-light textures and without water, etc.? So, just a very basic map.

Link to comment
Share on other sites

(It would be nice to implement that in the game itself, e.g. resizing the textures before loading the map, instead of resizing them manually)

All non-GUI textures use mipmaps, so "resizing" just involves ignoring some of the high-res mipmap levels and using the low-res ones. The game already does that for textures larger than the GL implementation supports - see get_mipmaps in source/lib/res/graphics/ogl_tex.cpp. There's also an OGL_TEX_HALF_RES flag that can halve the resolution again, though that's not a great API - it might be better to add an ogl_tex_set_max_size(Handle, int), and then CTextureManager could be smart about what max size it picks (e.g. we might want separate controls for terrain texture resolution and unit texture resolution).
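The level-skipping described above comes down to halving the base dimension until it fits a cap; the 2048px texture and 256px cap below are hypothetical numbers, not values from the engine:

```shell
# For a square 2048px texture with a full mip chain (2048, 1024, 512, ...),
# count how many top levels to skip so the first uploaded level fits a
# hypothetical 256px cap (the kind of value ogl_tex_set_max_size might take).
size=2048
max_size=256
skip=0
while [ "$size" -gt "$max_size" ]; do
    size=$(( size / 2 ))
    skip=$(( skip + 1 ))
done
echo "skip $skip levels, upload from ${size}px"   # -> skip 3 levels, upload from 256px
```

Each skipped level cuts memory use by roughly 4x, which is why this is such a cheap lever on low-VRAM devices.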

Also I think we should simplify the menu's (without backgroundanimations, just a wallpaper.)

Yeah, the multiple huge overlapping alpha-blended textures on the menu screen are fairly terrible for very low-end GPUs - an option for a single lower-res non-blended background texture might be nice.

Another problem on Android/RPi is the lack of texture compression - ETC1 is the only widely-supported format with OpenGL ES 2.0 (and it's the only one on RPi), while our game only supports S3TC, so we have to decompress everything from 4/8 bits per pixel to 24/32bpp, which obviously uses a load more VRAM. In theory we could add support for ETC1, but it doesn't support alpha channels so we'd have to split RGBA textures into compressed RGB + uncompressed(?) A, and update all the shaders to do two texture loads, which is probably a pain. GLES 3.0 requires ETC2/EAC, which I think should be much less painful, and ASTC may become widespread in the future, so those are probably the more interesting long-term targets.
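To put rough numbers on the "4/8 bits per pixel to 24/32bpp" cost, here is the arithmetic for one hypothetical 1024x1024 RGBA texture (mipmaps excluded for simplicity):

```shell
# VRAM for a single 1024x1024 RGBA texture, no mipmaps:
w=1024; h=1024
dxt5_bytes=$(( w * h * 8 / 8 ))    # S3TC DXT5: 8 bits per pixel
rgba_bytes=$(( w * h * 32 / 8 ))   # decompressed RGBA8888: 32 bits per pixel
echo "$(( dxt5_bytes / 1024 )) KiB compressed vs $(( rgba_bytes / 1024 )) KiB raw"
# -> 1024 KiB compressed vs 4096 KiB raw
```

A full mip chain adds about a third on top of each figure, so the 4x ratio between the two columns holds either way.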

Link to comment
Share on other sites

Photos of 0 A.D. running on Raspberry Pi :D

[Photos: 0 A.D. running on the Raspberry Pi]

You can see it's quite a simple setup: just the Pi, an LCD monitor (a TV works too), and a USB nano receiver for keyboard and mouse.

How would non-fullscreen mode work? As far as I can see, SDL doesn't talk to any kind of window manager at all, it just uses dispmanx to set up a new EGL surface as being drawn on top of everything else (X11, console, etc). You could easily change that surface to be not fullscreen, but it would have no window decoration (since there's no window) and it'd be permanently on top of everything else (you couldn't switch to another window).

I'd be happy with controlling the fullscreen resolution through standard SDL API :) Windowed mode isn't critical but it's kinda nice for testing and debugging.
  • Like 6
Link to comment
Share on other sites

darn god-like work, historic_bruno! Do you think the performance with another distribution could be better? E.g. I am using Arch Linux instead of Raspbian. I think it has lower overhead and fewer daemons running in the background by default, as well as a newer kernel, libraries, and so on... What do you think?

Edited by Almin
Link to comment
Share on other sites

darn god-like work, historic_bruno! Do you think the performance with another distribution could be better? E.g. I am using Arch Linux instead of Raspbian. I think it has lower overhead and fewer daemons running in the background by default, as well as a newer kernel, libraries, and so on... What do you think?

Every MB is precious when you only have 512 for both CPU and GPU :( The way our game is now, I wouldn't try to run it with less than 128 MB of video memory either - ideally we would have better control over which textures we're loading and at what size. So yeah, any fat that can be trimmed from the OS would help a little; I've thought about other possible distros. I don't think it will help performance much, but it should let the game run longer before it runs out of memory and dies :)
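For reference, the CPU/GPU split mentioned above is controlled by the standard `gpu_mem` firmware option; a minimal `config.txt` fragment for the 128 MB figure:

```shell
# Fragment for /boot/config.txt (standard Raspberry Pi firmware setting):
# reserve 128 MB of the Pi's 512 MB for the GPU, leaving 384 MB for the
# ARM side.
gpu_mem=128
```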
Link to comment
Share on other sites

I added a bit to the F11 profiler's renderer stats to show texture memory usage, in case that helps identify out-of-VRAM problems. (It's only an approximation since it doesn't count sky textures (which don't use CTextureManager (but probably should)), and doesn't count e.g. padding added by the GPU for alignment, but it does handle mipmaps and compressed textures so it shouldn't be too wrong.)

Link to comment
Share on other sites

I think that's entirely OT since it's a GLX extension, and Android and RPi use EGL instead of GLX :). But I added it anyway - it adds some fields to hwdetect like

 "GLX_RENDERER_VENDOR_ID_MESA": 32902, "GLX_RENDERER_DEVICE_ID_MESA": 10818, "GLX_RENDERER_VERSION_MESA[0]": 10, "GLX_RENDERER_VERSION_MESA[1]": 0, "GLX_RENDERER_VERSION_MESA[2]": 0, "GLX_RENDERER_ACCELERATED_MESA": 1, "GLX_RENDERER_VIDEO_MEMORY_MESA": 1536, "GLX_RENDERER_UNIFIED_MEMORY_ARCHITECTURE_MESA": 1, "GLX_RENDERER_PREFERRED_PROFILE_MESA": 1, "GLX_RENDERER_OPENGL_CORE_PROFILE_VERSION_MESA[0]": 0, "GLX_RENDERER_OPENGL_CORE_PROFILE_VERSION_MESA[1]": 0, "GLX_RENDERER_OPENGL_COMPATIBILITY_PROFILE_VERSION_MESA[0]": 2, "GLX_RENDERER_OPENGL_COMPATIBILITY_PROFILE_VERSION_MESA[1]": 1, "GLX_RENDERER_OPENGL_ES_PROFILE_VERSION_MESA[0]": 1, "GLX_RENDERER_OPENGL_ES_PROFILE_VERSION_MESA[1]": 1, "GLX_RENDERER_OPENGL_ES2_PROFILE_VERSION_MESA[0]": 2, "GLX_RENDERER_OPENGL_ES2_PROFILE_VERSION_MESA[1]": 0, "GLX_RENDERER_VENDOR_ID_MESA.string": "Intel Open Source Technology Center", "GLX_RENDERER_DEVICE_ID_MESA.string": "Mesa DRI Mobile Intel® GM45 Express Chipset ",
(It sounds like Mesa 10.0 is going to ship with that extension in a couple of weeks, so presumably they're unlikely to change it incompatibly, and if they do then we'll still have time to fix our code before our next alpha release.)
Link to comment
Share on other sites

Most of the information in the wiki is wrong and should be ignored :)

The only thing the engine technically requires is OpenGL ES 2.0, with drivers that don't have bugs we can't work around. It worked okay on my Qualcomm-chipset Nexus 7, so at least some version of their drivers is okay. The bigger problems are performance (usually terrible), input (especially on small screens), and the unsuitability of the gameplay to that kind of device, but those aren't device-specific problems.

Link to comment
Share on other sites

But I added it anyway - it adds some fields to hwdetect like

(It sounds like Mesa 10.0 is going to ship with that extension in a couple of weeks, so presumably they're unlikely to change it incompatibly, and if they do then we'll still have time to fix our code before our next alpha release.)

Nice, I'll have a look if the extension gets updated before 10.0.

Link to comment
Share on other sites

  • 2 weeks later...
