Collecting data from users

Yes

Unfortunately that's all.

Same for me. I created an update script, the same one shown in the Unix build instructions on the wiki (http://trac.wildfiregames.com/wiki/BuildInstructions#Unix). It worked fine for several weeks, but now it doesn't work any more. I've saved the output of my shell and uploaded it.
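(For reference, the script just follows the steps from that wiki page; roughly the following, assuming the checkout lives in ~/0ad and adjusting paths and job counts to taste:)

#!/bin/sh
# Update-and-rebuild script following the wiki's Unix build instructions.
cd "$HOME/0ad" || exit 1      # assumed checkout location
svn update                    # fetch the latest source
cd build/workspaces
./update-workspaces.sh -j2    # regenerate the build files
cd gcc
make -j2                      # rebuild the game (including Atlas)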

Unfortunately I'm using the German translation packages on my Ubuntu, and one error message appears in German while building Atlas, near the end, after "UserReport.cpp". It says "Datei oder Verzeichnis nicht gefunden", which means "file or directory not found". Two and three lines later it says "Fehler", which means "error", and then "Warte auf noch nicht beendete Prozesse...", which means "waiting for unfinished processes...".

I hope this helps in some way.

bash-output.zip

The important part is:

In file included from ../../../source/ps/UserReport.cpp:23:
../../../source/lib/external_libraries/curl.h:50: fatal error: curl/curl.h: Datei oder Verzeichnis nicht gefunden

which means you need to install libcurl's development headers - I think the package is "libcurl-dev" on Ubuntu. (Also, I need to update the build instructions to mention this...)
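(On Ubuntu, something like the following should pull it in; the exact dev package name varies by release, with libcurl4-gnutls-dev and libcurl4-openssl-dev being common variants:)

sudo apt-get install libcurl4-gnutls-dev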

Great! The package "libcurl4-gnutls-dev" did it for me! It's working again! You're awesome! Thank you very much!

I think this system is largely complete for now. It collects some more hardware details (like this - does anyone know of other values that would be useful to collect?) and profiler data (5 seconds and 60 seconds after launching a map, with the same data as pressing shift+F11 and looking in logs/profile.txt). The server needs more work to analyse the data nicely, but that doesn't need to be done right now (it could be after the next release if necessary).

(2 weeks later...)

The game reports profiling measurements now, so I thought it'd be useful to plot them and make sure they vaguely make sense. The current result is like this - this shows just the time spent in the "render" function (i.e. excluding gameplay logic, buffer-swapping and everything else), and each cross is a reported timing (from either 5 or 60 seconds after the start of a match, from any user with the same graphics device, from any map, any screen size, any graphics settings, etc - this is hopelessly imprecise, and I should filter the data better once there's more of it). Red lines are medians, blue rectangles are upper/lower quartiles.

Ideally, everyone would be at 60fps or better, which is 16msec/frame (i.e. just above the '10^1' tick mark). Anything worse than about 20fps (50msec/frame) is probably no fun to play. The data's currently too random to deduce anything, but at least it seems to be putting older/integrated devices on the left and newer ones on the right, so it's not entirely random, and hopefully it'll work once there's more data :)
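(For the curious, the chart is easy to reproduce once you have per-device timing samples. A minimal Python/matplotlib sketch; the data layout here is invented for illustration, and the real reports live on the server:)

# Sketch: crosses per report, median lines and quartile boxes per device.
import numpy as np
import matplotlib.pyplot as plt

# {device name: [render times in msec/frame]} - stand-in for the real reports
samples = {"GMA 950": [55.0, 70.2, 48.9], "GeForce 8600": [12.1, 15.4, 9.8]}

devices = sorted(samples, key=lambda d: np.median(samples[d]), reverse=True)
for x, dev in enumerate(devices):
    times = samples[dev]
    q1, med, q3 = np.percentile(times, [25, 50, 75])
    plt.scatter([x] * len(times), times, marker='x')  # each cross: one report
    plt.bar(x, q3 - q1, bottom=q1, alpha=0.3)         # upper/lower quartile box
    plt.hlines(med, x - 0.4, x + 0.4, color='red')    # median
plt.yscale('log')                                     # log scale, as in the chart
plt.axhline(16, linestyle='--')                       # 60fps target (16msec/frame)
plt.axhline(50, linestyle=':')                        # ~20fps threshold (50msec/frame)
plt.xticks(range(len(devices)), devices, rotation=90)
plt.ylabel('render time (msec/frame)')
plt.show()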

Yeah, this is SVN users, so it'll include dirty data from people doing texture conversion etc. That probably won't affect the numbers hugely (the conversion isn't counted as part of the 'render' time used in this chart), but it may make some difference. Currently I only exclude people running Debug builds, but I can later restrict it to e.g. people playing the alpha 4 release on some particular map, to get more meaningfully comparable numbers.

Ah, sweet (y)

The code is here (minus some small changes I've not committed yet), using Python and Django (which I currently think is quite nice). The data is collected and processed on my own server. I think making the raw data public would raise too many privacy concerns, but it's mostly anonymous and not really secret so I'd be happy to share it somehow with WFG members or other particularly interested people.
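(For a rough idea of the shape of it, since the real code is in the repository linked above: the server side amounts to a Django model per uploaded report plus views that aggregate them. A minimal sketch with invented field names:)

# Sketch of a report model; illustrative only, not the actual schema.
from django.db import models

class UserReport(models.Model):
    user_id_hash = models.CharField(max_length=64)  # anonymised user identifier
    uploaded = models.DateTimeField(auto_now_add=True)
    data_type = models.CharField(max_length=16)     # e.g. 'hwdetect' or 'profile'
    data_version = models.IntegerField()            # payload format version
    data = models.TextField()                       # raw report payload from the game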

(2 weeks later...)

I'd prefer not to make the page too 0 A.D.-specific - most people reading it probably won't care about our game. But I'll try to set up a page on our wiki documenting what extensions we use (as required or optional), since currently that information is scattered throughout the code and it'd be nice to get a coherent view.

(Got 918 distinct device reports now, by the way, stretching back to GeForce2 MX. Not bad for a week's data collection :))

Looks like Philip now has enough data to make an informed decision on the graphics paths (requiring fragment_shader or _program is apparently OK).

http://www.wildfiregames.com/forum/index.php?showtopic=14419&pid=217338&st=0#entry217338

I've also noted and fixed some problems with the CPU and cache detection code based on "weird numbers" in these reports. All in all, a very useful tool! Big thanks to Philip for implementing it in such an awesome fashion :)

(2 weeks later...)

RAM graph.

(Fun facts: the lowest reported value is 223MB; the highest is 49150MB; the second highest is 16080MB.)

I would have expected much sharper steps in the graph. It looks like it's common for our measured value to slightly under-report the nominal RAM. On Linux the figure excludes 1.7% plus maybe the kernel image size (2048MB × 0.983 ≈ 2013MB, hence the step at ~2012MB just before the step at ~2048MB). I guess the rest of the variation comes from other figures that get subtracted (AGP aperture size, maybe?), or from people sticking random 256MB/512MB modules into their machines, or some other cause, or some combination of those.

Regardless of the reasons, this means trying to read off figures at exactly (e.g.) 1024MB isn't useful. I think the important figures are:

* 99.5% of users have roughly 512MB RAM or more.

* 95% of users have roughly 1GB RAM or more.

* 80% of users have roughly 2GB RAM or more.

* 50% of users have roughly 3GB RAM or more.

So I think we can safely assume 1GB as a minimum, but should worry if we assume much more than that.
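(These cumulative figures are just percentiles over the reported values; for instance with numpy, where reports stands in for the real per-user list of RAM figures in MB:)

# Sketch: read "X% of users have at least this much RAM" off the reports.
import numpy as np

reports = np.array([496, 1002, 2012, 2980, 4050])  # stand-in data, MB per user
for pct in (99.5, 95, 80, 50):
    # the (100 - pct)th percentile is the floor that pct% of users meet or exceed
    floor = np.percentile(reports, 100 - pct)
    print(f"{pct}% of users have at least {floor:.0f}MB")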

Interesting to look at, thanks for that! I'm surprised only 50% have 2 GB of memory (vs. > 80% of Macs on the Steam survey).

It's kind of silly that Linux is excluding paging overhead AND the kernel, but Windows also deducts non-paged pool.

I've been wanting to get my grubby hands on the SMBIOS tables anyway, which include exact memory module sizes.

There's a Windows 2003 SP1 and later API to retrieve them; amusingly enough, previous Windows versions can use Aken. We'd need something like http://cpan.uwinnipeg.ca/htdocs/DMI-Decode/DMI/Decode.pm.html on Linux (get the address via /proc/efi/systab and mmap). Would you be interested in writing/adapting that?
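(Roughly, the Linux side could look like this: an untested sketch in Python that parses the SMBIOS entry-point address out of /proc/efi/systab, then mmaps /dev/mem to read it. It needs root, assumes an EFI system that exposes that file, and omits error handling and the actual table parsing:)

# Sketch: locate the SMBIOS entry point via /proc/efi/systab and /dev/mem.
import mmap
import os

addr = None
with open("/proc/efi/systab") as f:
    for line in f:                        # lines look like "SMBIOS=0xf04c0"
        name, _, value = line.strip().partition("=")
        if name == "SMBIOS":
            addr = int(value, 16)

page = addr & ~0xFFF                      # mmap offsets must be page-aligned
fd = os.open("/dev/mem", os.O_RDONLY)
buf = mmap.mmap(fd, 0x2000, mmap.MAP_SHARED, mmap.PROT_READ, offset=page)
entry = buf[addr - page : addr - page + 0x1F]   # 31-byte "_SM_" entry-point struct
assert entry[:4] == b"_SM_"               # anchor string; table address/length follow
os.close(fd)

(On non-EFI systems the entry point would instead have to be found by scanning the 0xF0000-0xFFFFF region for the "_SM_" anchor.)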

> Interesting to look at, thanks for that! I'm surprised only 50% have 2 GB of memory (vs. > 80% of Macs on the Steam survey).

They probably round their numbers upwards - our data says ~80% of all users have nearly 2GB, which probably means they nominally have exactly 2GB before deductions. (Also, Macs are generally much higher-end than the typical Linux machines that make up most of our data.)

> Would you be interested in writing/adapting that?

I don't think the data would be useful enough to be worth any non-zero effort. The game itself shouldn't care about RAM, so it's only relevant for drawing this kind of graph and figuring out whether we should set minimum requirements to around 1GB or 2GB etc, for which the current data is sufficient.

(4 weeks later...)

Is the OpenGL report updated automatically, or is it run manually from time to time? I'm asking because I can't see my updated graphics data (Gallium RV530 should now have float textures).
