Almin Posted February 18, 2011 (edited)

Yes, unfortunately that's all. Same for me. I created an update script, the same one shown in the Unix build instructions on the wiki (http://trac.wildfiregames.com/wiki/BuildInstructions#Unix). It worked fine for several weeks, but now it doesn't work any more. I've saved the output of my shell and uploaded it. Unfortunately I'm using the German translation packages on my Ubuntu, and there is one error message in German while building Atlas, near the end, after "UserReport.cpp". It says "Datei oder Verzeichnis nicht gefunden", which means: file or directory not found / doesn't exist. Two and three lines later it says "Fehler", which means "error", and then it says "Warte auf noch nicht beendete Prozesse...", which means "waiting for unfinished processes". I hope I could help in any way.

bash-output.zip
Ykkrosh Posted February 18, 2011 (author)

The important part is:

In file included from ../../../source/ps/UserReport.cpp:23:
../../../source/lib/external_libraries/curl.h:50: fatal error: curl/curl.h: Datei oder Verzeichnis nicht gefunden ("file or directory not found")

which means you need to install libcurl - I think the package is "libcurl-dev" on Ubuntu. (Also I need to update the build instructions to mention this...)
Almin Posted February 18, 2011 (edited)

Great! The package "libcurl4-gnutls-dev" did it for me! It's working again! You're awesome! Thank you very much!
Ykkrosh Posted February 20, 2011 (author)

I think this system is largely complete for now. It collects some more hardware details (like this - does anyone know of other values that would be useful to collect?) and profiler data (5 seconds and 60 seconds after launching a map, with the same data as pressing Shift+F11 and looking in logs/profile.txt). The server needs more work to analyse the data nicely, but that doesn't need to be done right now (it could be after the next release if necessary).
liamdawe Posted February 20, 2011

I just loaded up and told it to submit my info with a little message - hope someone finds it useful.
Kimball Posted February 21, 2011

I'm getting an absurd amount of errors from this. Just rebuilt the game a few minutes ago.
Ykkrosh Posted February 21, 2011 (author)

That message means you're using an old version of the executable (from before SVN r8925) - maybe it had errors when you were compiling or something?
Ykkrosh Posted March 6, 2011 (author)

The game reports profiling measurements now, so I thought it'd be useful to plot them and make sure they vaguely make sense. The current result is like this - it shows just the time spent in the "render" function (i.e. excluding gameplay logic and buffer-swapping and everything else), and each cross is a reported timing (from either 5 or 60 seconds after the start of a match, from any user with the same graphics device, from any map and any screen size and any graphics settings, etc - this is hopelessly imprecise and I should filter the data better once there's more of it). Red lines are medians, blue rectangles are upper/lower quartiles.

Ideally, everyone would be at least at 60fps, which is 16msec/frame (i.e. almost at the tick mark just above '10^1'). Anything worse than about 20fps (50msec/frame) is probably no fun to play. The data's currently too random to deduce anything, but at least it seems to be putting older/integrated devices on the left and newer ones on the right, so it's not entirely random, and hopefully it'll work once there's more data.
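(The fps-to-frame-time arithmetic and the median/quartile summary described above can be sketched in Python. This is just an illustration of the statistics being plotted - the function names and sample data are invented, not taken from the actual reporting code:)

```python
import statistics

def fps_to_msec(fps):
    """Convert frames per second to milliseconds per frame.
    60 fps -> ~16.7 ms/frame; 20 fps -> 50 ms/frame."""
    return 1000.0 / fps

def summarise(render_times_ms):
    """Lower quartile, median, upper quartile of reported render
    times - the blue rectangles and red lines in the chart."""
    q1, median, q3 = statistics.quantiles(render_times_ms, n=4)
    return q1, median, q3
```

(`statistics.quantiles` needs Python 3.8+; with `n=4` it returns the three quartile cut points.)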
Pureon Posted March 6, 2011

So those stats are from people who have downloaded the SVN version? Nice. Some newer cards getting owned by older ones.
feneur Posted March 6, 2011

A thought: since this is SVN, and thus the textures need to get converted (or whatever exactly it is they do) if they haven't been accessed before, can that affect these times?
Ykkrosh Posted March 6, 2011 (author)

Yeah, this is SVN users and it'll include dirty data from people doing texture conversion etc. That probably won't affect the numbers hugely (the conversion isn't counted as part of the 'render' time which is used in this chart) but it may make some difference. Currently I only exclude people running Debug versions, but I can restrict it later to e.g. people playing the alpha 4 release on some particular map, to get more meaningfully comparable numbers.
feneur Posted March 6, 2011

Ah, sweet.
bstempi Posted March 6, 2011

Where is this data stored, and where is the web service located? I'm interested in taking a look at it and possibly contributing.
Ykkrosh Posted March 7, 2011 (author)

The code is here (minus some small changes I've not committed yet), using Python and Django (which I currently think is quite nice). The data is collected and processed on my own server. I think making the raw data public would raise too many privacy concerns, but it's mostly anonymous and not really secret, so I'd be happy to share it somehow with WFG members or other particularly interested people.
Ykkrosh Posted March 15, 2011 (author)

Got a lot more data now, so it became very slow, so I made some changes to get the GL report page adequately fast again. (There's a batch process that extracts the relevant data into a few SQL tables, instead of having to parse a hundred megabytes of JSON every time someone loads a page.)
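(A minimal sketch of that batch-extraction idea - parse each JSON report once, store the extracted fields in SQL, and let page loads query the table instead. The table and field names here are invented for illustration; the real schema lives in the Django code linked above:)

```python
import json
import sqlite3

def extract_reports(raw_reports, db_path=":memory:"):
    """One-off batch pass: decode each JSON report and write the
    fields the GL report page needs into a small SQL table."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS gl_report
                   (device TEXT, extension TEXT)""")
    for raw in raw_reports:
        report = json.loads(raw)
        for ext in report.get("extensions", []):
            con.execute("INSERT INTO gl_report VALUES (?, ?)",
                        (report["device"], ext))
    con.commit()
    return con
```

Page views then run a cheap `SELECT` against `gl_report` rather than re-parsing megabytes of JSON per request.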
fabio Posted March 16, 2011

The Phoronix effect... http://www.phoronix.com/scan.php?page=news_item&px=OTIwOA

Could you highlight/mark the extensions currently used by 0 A.D. in the report?
Ykkrosh Posted March 18, 2011 (author)

I'd prefer not to make the page too 0 A.D.-specific - most people reading it probably won't care about our game. But I'll try to set up a page on our wiki documenting what extensions we use (as required or optional), since currently that information is scattered throughout the code and it'd be nice to get a coherent view.

(Got 918 distinct device reports now, by the way, stretching back to the GeForce2 MX. Not bad for a week's data collection.)
Pureon Posted March 18, 2011

Any unexpected/worrying results in the data collected so far?

Edit: Oops, answered here.
janwas Posted March 19, 2011

Looks like Philip now has enough data to make an informed decision on the graphics paths (requiring fragment_shader or _program is apparently OK): http://www.wildfiregames.com/forum/index.php?showtopic=14419&pid=217338&st=0entry217338

I've also noted and fixed some problems with the CPU and cache detection code, based on "weird numbers" in these reports. All in all, a very useful tool! Big thanks to Philip for implementing it in such an awesome fashion.
Ykkrosh Posted March 19, 2011 (author)

(I've had to disable the CPU report for now, though, since parsing tens of megabytes of JSON on every page load doesn't make my server happy. I'll try to restore it some time soon.)
Ykkrosh Posted April 1, 2011 (author)

RAM graph. (Fun facts: the lowest reported value is 223MB; the highest is 49150MB; the second highest is 16080MB.)

I would have expected much stronger steps in the graph. Looks like it's common for our measured value to slightly under-report the nominal RAM. On Linux the figure excludes 1.7% plus maybe the kernel image size (hence the step at ~2012MB before the step at ~2048MB). I guess the rest of the variation is some other figures that get subtracted (AGP aperture size maybe?), or people sticking random 256MB/512MB pieces into their machines, or something else, or some combination of those things.

Regardless of the reasons, this means trying to read off figures at exactly (e.g.) 1024MB isn't useful. I think the important figures are:
* 99.5% of users have nearly 512MB RAM.
* 95% of users have nearly 1GB RAM.
* 80% of users have nearly 2GB RAM.
* 50% of users have nearly 3GB RAM.

So I think we can safely assume 1GB as a minimum, but should worry if we assume much more than that.
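(The "nearly X MB" readings above can be computed with a small tolerance below each nominal size, to absorb the under-reporting just described. This is an illustrative sketch - the 3% tolerance and the sample values are made up, not the real dataset or the real analysis code:)

```python
def share_with_nearly(ram_mb_values, nominal_mb, tolerance=0.97):
    """Fraction of users reporting at least ~nominal_mb of RAM.
    The threshold sits slightly below the nominal size, since the
    measured value commonly under-reports it (kernel image, paging
    overhead, aperture, etc.)."""
    threshold = nominal_mb * tolerance
    have = sum(1 for v in ram_mb_values if v >= threshold)
    return have / len(ram_mb_values)
```

So a machine reporting 1990MB still counts towards the "nearly 2GB" figure, while reading off at exactly 2048MB would miss it.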
Pureon Posted April 1, 2011

My Windows 98 computer had more than 223MB RAM. Thanks for the summary Ykkrosh, very interesting!
janwas Posted April 2, 2011

Interesting to look at, thanks for that! I'm surprised only 50% have 2 GB of memory (vs. >80% of Macs in the Steam survey).

It's kind of silly that Linux is excluding paging overhead AND the kernel, but Windows also deducts the non-paged pool.

I've been wanting to get my grubby hands on the SMBIOS tables anyway, which include exact memory module sizes. There's a Windows 2003 SP1 and later API to retrieve them; amusingly enough, previous Windows versions can use Aken. We'd need something like http://cpan.uwinnipeg.ca/htdocs/DMI-Decode/DMI/Decode.pm.html on Linux (get the address via /proc/efi/systab and mmap). Would you be interested in writing/adapting that?
Ykkrosh Posted April 2, 2011 (author)

They probably round their numbers upwards - our data says ~80% of all users have nearly 2GB, which probably means they nominally have exactly 2GB before deductions. (Also, Macs are much higher-end than typical Linux machines, which make up most of our data.)

As for writing/adapting an SMBIOS reader: I don't think the data would be useful enough to be worth any non-zero effort. The game itself shouldn't care about RAM, so it's only relevant for drawing this kind of graph and figuring out whether we should set minimum requirements to around 1GB or 2GB etc, for which the current data is sufficient.
fabio Posted April 28, 2011

Is the OpenGL report automatically updated, or is it run manually at times? I'm asking because I can't see my updated graphics data (Gallium RV530 should now have float textures).