
olsner

WFG Retired

Posts posted by olsner

  1. Attached is a patch that, along with my recently committed header guards and misc fixes, makes unity builds work as well as normal builds. The reason for not just committing it is that some things had to be hacked around a bit, so I'd like to get these non-obvious fixes reviewed before simply committing them... Oh, and this is untested in Visual Studio, btw.

    "Documentation"

    The patch adds an option to the premake script (pass --unity to update-workspaces to enable it). When enabled, it generates one source file per library/sub-project according to the existing structure: it simply takes the list of sources for each project and generates a foo_unity_unit.cpp file that #includes all the other files, then registers that single source file as the project's only source as far as the generated makefile/VS project is concerned. This means no changes in premake itself were required, only a few isolated changes in the Lua scripts. (Which is nice, since we plan to upgrade premake...)

    A couple of hacks were involved to make this work (beyond the actual code fixes):

    • LOG_CATEGORY is defined by many source files, so it is automatically #undef'd before each included source. I contemplated adding an #undef to each source file, but rejected that idea based on the number of source files involved :)
    • The lib code has a macro for generating "unique IDs" based on line numbers. This causes problems when many similar source files include roughly the same headers and then define a couple of error-code associations just after those headers. The generated unity unit therefore redefines the macro before each file so the IDs stay unique.
    • Errors.cpp is currently ill-suited for compilation alongside any code that includes headers declaring error types. A better patch should update the generation of Errors.cpp to work well in unity builds.
    • Oh, and only .cpp files are actually included in the unity unit; all other kinds of files (e.g. assembly files) are added to the source file list as usual.
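    For illustration, the generation step described above could be sketched like this in Python (the real implementation is a few lines in the premake Lua scripts; the function name here is just an assumption, the foo_unity_unit.cpp naming follows the post):

    ```python
    def generate_unity_unit(project_name, sources):
        """Split sources into .cpp files (inlined into one unity unit) and
        everything else (assembly etc.), which stays on the source list."""
        cpp_files = [s for s in sources if s.endswith(".cpp")]
        other_files = [s for s in sources if not s.endswith(".cpp")]

        lines = []
        for src in cpp_files:
            # Many sources define LOG_CATEGORY, so undefine it before each file.
            lines.append("#undef LOG_CATEGORY")
            lines.append('#include "%s"' % src)
        unity_source = "\n".join(lines)

        # The unity unit replaces the .cpp files; other files pass through.
        new_sources = ["%s_unity_unit.cpp" % project_name] + other_files
        return unity_source, new_sources
    ```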

    One problem is that the "engine" build seems to be a bit too big (gcc uses a lot of memory for it), so my parallel builds don't work as well on the unity build. I had expected to be able to run at least one gcc per core, but I quickly become I/O-limited rather than CPU-limited. I'm guessing it's either from linking the objects into .a's in parallel (which seems hard to teach make not to do - it doesn't categorize actions), or simply that the total memory use of the gcc instances causes swapping.

    I'm also getting mahaf link errors from acpi.cpp - but the code looks like it could never have worked on linux, so I'm thinking it can't be caused by my changes but is rather an actual error in the source - especially since I get the same error in both unity and non-unity builds.

    As for build-time results: with -j1 I get a build time of 1m10s, -j2 gives 37s and -j4 25s (4-core machine). On the same machine, a non-unity build with -j4 takes around 1m15s, so I guess that means a theoretical speedup of around 4x, although the actual speedup for me was only about 2x.

    unity.patch.txt

  2. SpiderMonkey on linux/unix/os x is now bundled with the sources (actually, the bundling has been done for some time; I just didn't activate it in the build system until now), so the procedure for building and installing SpiderMonkey has changed.

    The rationale is that getting the proper version compiled with the correct settings is messy for anyone not lucky enough to have their distribution provide a package for it, so it's easier to bundle our own build script with the proper configuration - and we'd prefer to use one specific JavaScript version everywhere to avoid accidentally producing incompatible scripts.

    When checking out a new tree (and of course for any existing trees), you'll now have to do this:


    cd trunk/libraries/spidermonkey/src
    ./build.sh

    to build and install spidermonkey within your tree.

    (http://trac.wildfiregames.com/wiki/BuildInstructions has also been updated for Linux and OS X)

    Actually, while writing this post I also made update-workspaces.sh compile/update the bundled external libraries on every run, so those new instructions are already out of date ;) Hopefully the usefulness of not having to build the external libraries manually outweighs the extra time spent checking that they are up-to-date and the extra spam from their build scripts.

  3. We need such a feature for various byte-oriented stuff like Zip archive headers, and supporting GCC's insane syntax and the widespread #pragma pack is definitely not desirable.

    That's why depending on struct layout is a bad idea :)
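    As an aside, here's a hedged sketch of the byte-oriented alternative: parsing a Zip local file header field by field with explicit offsets instead of overlaying a packed struct on the bytes (shown in Python with the struct module rather than the project's C++; the field layout follows the Zip format spec):

    ```python
    import struct

    # Fixed part of a Zip local file header: little-endian, 30 bytes.
    # signature, version, flags, method, mod time, mod date, crc-32,
    # compressed size, uncompressed size, name length, extra length.
    LOCAL_FILE_HEADER = struct.Struct("<IHHHHHIIIHH")

    def parse_local_header(data):
        (signature, version, flags, method, mod_time, mod_date,
         crc32, compressed_size, uncompressed_size,
         name_len, extra_len) = LOCAL_FILE_HEADER.unpack_from(data)
        assert signature == 0x04034B50, "not a local file header"
        return {"method": method, "compressed_size": compressed_size,
                "name_len": name_len}
    ```

    The point carries over to C++: reading the fields explicitly keeps the on-disk layout independent of whatever padding the compiler would otherwise insert.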

    Looking at it closer, I have OpenAL :

    - in Apple's /Developper/Library/... folder,

    - in 0ad's library folder,

    - in macport's /opt/local/... folder

    Given the error messages, you definitely seem to be getting Apple's installation of OpenAL. If the headers in /opt/local don't have the same problem, maybe you can change the compilation settings to use the macports OpenAL instead of the "framework".

    To try that out, change extern_libs.lua and just remove the line where we include the OpenAL framework (I think the line will look like osx_frameworks = {"OpenAL"}). If the macports-installed headers are in /opt/local/include/AL/alc.h rather than OpenAL/alc.h, you will also have to change the openal.h file in our sources to include AL/* instead of OpenAL/*.

    If that fixes it, we may want to explicitly use the macports openal instead of apple's preinstalled one.

    • cached resources (data/cache and the mod archives): since these depend only on the resources themselves, it'd be nice if the installation automatically generated them from the installed resources - unless you actually modify something yourself, you shouldn't have to generate them into the user-local directory. I think that when resources are modified, we only have to store user-local cached versions of the actually changed resources rather than all of them.
    • screenshots and logs: should be moved into the user-local directory, methinks. We should also have automatic (re-)creation of these directories (including copying the resources for the log HTML).
    • profiles: although these are kind of meant to be global (as in, every user on the computer creates their own profile), the right thing is probably to put the set of profiles entirely inside the user directory.
    • user-local mods: while we don't write these, the directory to put them in will depend on all the other stuff. I think if we just mount ~/.0ad as data/ in the VFS, user-local mods and overrides of individual files just go in ~/.0ad/mods/ and that's it. Having this VFS is pretty sweet :)

    As janwas mentions in the trac ticket, we should also differentiate between user-local cache and user-local config/data. On unix we could just let the cache and config directories be the same (~/.0ad or ~/.pyrogenesis or whatnot) since there's AFAIK no "application data" distinction there.

    If we implement those and set up the VFS to disallow write access to the non-user-local directories, all other things we try to write should be pretty obvious since the VFS should then just fail :D
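    A toy sketch of that mounting scheme (hypothetical Python, not the actual 0 A.D. VFS API): a read-only install mount plus a writable user-local mount, where any write without a writable mount simply fails:

    ```python
    class Vfs:
        def __init__(self):
            self.mounts = []  # (prefix, store, writable); later mounts win

        def mount(self, prefix, store, writable=False):
            self.mounts.append((prefix, store, writable))

        def read(self, path):
            # Later (user-local) mounts shadow earlier (install) ones.
            for prefix, store, _ in reversed(self.mounts):
                if path.startswith(prefix) and path in store:
                    return store[path]
            raise FileNotFoundError(path)

        def write(self, path, data):
            for prefix, store, writable in reversed(self.mounts):
                if path.startswith(prefix):
                    if not writable:
                        continue  # skip read-only mounts
                    store[path] = data
                    return
            raise PermissionError("no writable mount for " + path)
    ```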

  4. The way things are set up now, non-windows builds always use the system-installed DevIL library and never the one in the tree (and like you have noticed, the version in the tree doesn't work on linux).

    If the code can't find the system-installed DevIL headers, it will just fail to compile (which seems to be the original problem). To fix that, you might need to change the definition of devil in extern_libs.lua - if you have DevIL installed in a non-standard location, for instance. (We currently add no include paths for DevIL, but if they're required and DevIL comes with a config program, we could use that to set the proper cflags.)

  5. Since the link options are added in an arbitrary order, all the library link options are already wrapped in a --start-group/--end-group pair, which is supposed to fix that...

    I suppose we could change the order in which LDFLAGS is generated by premake (in gnu_cpp.c), but that feels kind of hackish, and it still doesn't guarantee any specific internal ordering between libraries (only between the chunk of package.links-libraries and the explicit link flags in linkoptions).

  6. I had to run "sudo ln libboost_signals-mt.so libboost_signals.so" in /usr/lib.

    The scripts have been changed to always include the -mt libraries (since we couldn't find an example of someone having only the non-mt variants installed), so this shouldn't be required any longer.

    It is worth mentioning that I got some warning similar as mentioned in the thread [...]

    Basically, all these warnings require some developer to go through them one by one and fix them - not terribly difficult but someone needs to take the time to do it. This should probably be reported to the bug tracker and handled there.

  7. I've been through the patch and merged everything that corresponded to compile errors I could reproduce.

    (strike-through means I've committed fixes for it - it's still in the list just to note that it was in the patch; the rest is todo)

    • premake config: collada and boost
    • fcollada: stringbuilder fixes
    • fcollada: pragmas (I didn't see any issues with this when compiling on my mac - was this only warnings?)
    • fcollada: use of snprintf (should be fixed, especially if long is more than 32 bits, but didn't cause any compile errors)
    • fcollada: ifdefs in filemanager (fixed by changes in the fcollada makefile - it shouldn't define Linux when on mac)
    • fcollada: makefile changes
    • openal: string type, should be fixed as trac issue #268
    • lib: timer type, debug_DumpStack, secure_crt functions
    • wxWidgets regexp stuff: I assume that passing the "advanced" argument enables some regexp extensions - but do these regexps really mean the same thing if we use default rather than advanced regexps? If so, we should change them to actually pass the default flag for everyone; otherwise we should probably rewrite the regexps to use only "default" features.
    • wxUSE_DEBUG_REPORT: looks trivial enough, should be merged I guess
    • atlas/ScenarioEditor: Apple-ifdeffed _UINTxx defines. I didn't see any related compile errors on my mac, but if it's a problem with another version of something (os x, for instance) it looks like we would like to find a cleaner solution for the problem anyway.

    Since the patch was a lot of small changes, I might have missed something, but what's in SVN now actually compiles (and runs!) on my macbook :)

    (Well, the collada integration still doesn't compile for me, due to changed APIs in libxml2 and general confusion between the four separate sets of libxml2 headers that are reachable - but I think that's a slightly larger problem and I'll leave it for tomorrow and/or someone else to sort out :D)

    In other news, valgrind is now optional and support must be explicitly enabled by passing --with-valgrind to update-workspaces.sh.

  8. About that assertion, the game seems to launch after just suppressing it. (But of course it should also be fixed)

    About the cxxtest error:


    In file included from ../../../libraries/cxxtest/include/cxxtest/StdValueTraits.h:10,
    from ../../../source/lib/self_test.h:189,
    from ../../../source/pch/test/precompiled.h:24,
    from ../../../source/pch/test/precompiled.cpp:25:
    ../../../libraries/cxxtest/include/cxxtest/ValueTraits.h:281: error: redefinition of ‘class CxxTest::ValueTraits<long unsigned int>’
    ../../../libraries/cxxtest/include/cxxtest/ValueTraits.h:266: error: previous definition of ‘class CxxTest::ValueTraits<long unsigned int>’

    I have a fix for this locally, but as usual I don't really know if it will break Windows :)


    Index: include/cxxtest/ValueTraits.h
    ===================================================================
    --- include/cxxtest/ValueTraits.h (revision 6954)
    +++ include/cxxtest/ValueTraits.h (working copy)
    @@ -276,10 +276,8 @@
    CXXTEST_COPY_TRAITS( const unsigned char, const unsigned long int );

    CXXTEST_COPY_CONST_TRAITS( signed int );
    - //CXXTEST_COPY_CONST_TRAITS( unsigned int );
    -#ifndef __APPLE__ // avoid redefinition errors on mac
    - CXXTEST_COPY_TRAITS( size_t, const unsigned int ); // avoid /Wp64 warnings in MSVC
    -#endif
    + CXXTEST_COPY_CONST_TRAITS( unsigned int );
    +
    CXXTEST_COPY_CONST_TRAITS( signed short int );
    CXXTEST_COPY_CONST_TRAITS( unsigned short int );
    CXXTEST_COPY_CONST_TRAITS( unsigned char );

  9. Apparently, HOSTTYPE is not exported by default (so premake sees the variable as undefined). I've added a hack in update-workspaces.sh that should forward the proper value to premake. I also added some code to send the proper elf64 format to nasm conditionally, so the assembly stuff should work out of the box on 64-bit linux now.

    About the fcollada compile errors, a work-around that worked for me (I'm also on 64-bit linux) was to just remove the #else of that #ifdef WIN32 - it seems Visual C++ thinks 'int' is a different type from both int32 and int64. I went ahead and checked that change in, so let's just hope it didn't break 32-bit linux and mac os :)

  10. The beginning of the source is URL-encoded (unescape('%xx%xx%xx') and so forth) - probably containing the source of the custom decryption function, with which the rest of the source is decoded.

    So it shouldn't be that hard to figure out how to get the real source, if you really wanted it - it's probably obfuscated and/or unreadable anyway (I don't think they've encrypted the comments in the code :))
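    For what it's worth, that first decoding step is just percent-decoding; a sketch in Python (note that JavaScript's unescape also supports %uXXXX sequences, which this simple version ignores):

    ```python
    from urllib.parse import unquote

    # The obfuscated page does roughly eval(unescape('%xx%xx...')); the %xx
    # escapes are plain percent-encoding, so URL-decoding recovers the
    # embedded decryption function's source.
    def decode_unescaped(encoded):
        return unquote(encoded)
    ```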

  11. EXPRESSION = perl -ane'print join " ", grep /\.(o|cpp)$/, @F'
    CXX = echo -e \\tg++ `echo $^ | $(EXPRESSION)` && g++
    CC = echo -e \\tgcc `echo $^ | $(EXPRESSION)` && gcc

    I think your problem is the $ sign at the end of EXPRESSION: ...(cpp)$/...

    Make doesn't really do much with quotes, so it'll try to expand $/ as a make variable - which probably has some really strange value (or no value at all).

    Try replacing $ with $$ ($$ is make's escape for a literal $ :))

  12. I've used UMLPad before - a really small, very basic UML drawing program. It does what it does and not much more, basically ;-)

    Link: http://web.tiscali.it/ggbhome/umlpad/umlpad.htm

    EDIT: There's a program for linux called Dia - might be what you're referring to, Klaas? Dia also exists in a win32 version if you like it better than UMLPad (but UMLPad's UML drawing is IMO better, and its download size/bloat is much, much smaller)

  13. Try mplayer - should play any- and everything :wine:

    But it isn't certain that it comes with a GUI, so you'd have to launch it from a terminal window

    mplayer dvd://<title number>

    should work (with trying a few title numbers to get the movie instead of some intro/extras)

    If you're lucky though, you'd get mplayer with a GUI and DVD menu support (but why bother when it comes with such a beautiful Command Line Interface :P)

  14. That's a pretty large question ;-) It can't really be answered up front without knowing exactly what your game's code looks like, but I can give you a few pointers nonetheless:

    Here's the forum FAQ for gamedev's network and multiplayer forums - contains some good links, might be a good place to get started:

    http://www.gamedev.net/community/forums/sh...asp?forum_id=15

    Another forum I've found very useful is the UNIX Socket FAQ - although it's a bit more advanced, and probably not very useful until you've actually written your first network code and tried out some different things. (By the way, even though it says UNIX, most of the info also applies to Windows networking, which is actually based on UNIX code [from sometime in a past decade].)

    http://www.developerweb.net/forum/

    Good luck! And I assure you: networking is the most fun part of programming! (Some people just haven't noticed yet :P)

  15. I think the easiest way is to simply use the system() function. (I assume you're on unix since you want to run grep - I'm not sure whether Windows' version works the same.)

    Excerpt from man system:

    int system(const char *command);

    DESCRIPTION
        system() executes a command specified in command by calling
        /bin/sh -c command, and returns after the command has been completed.

    If you need to do more advanced things - like running the command in the background, or reading its output through a pipe - you should have a look at fork and the exec family of functions.
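    The same two patterns, sketched in Python for comparison (subprocess wraps the fork/exec machinery mentioned above; the function names here are mine, not from the thread):

    ```python
    import subprocess

    def run_and_wait(command):
        # system()-style: run via the shell and wait, returning the exit status.
        return subprocess.run(command, shell=True).returncode

    def run_and_capture(argv):
        # fork/exec-plus-pipe style: no shell, read the command's output.
        result = subprocess.run(argv, capture_output=True, text=True)
        return result.stdout
    ```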

  16. I'm using nedit (and I'm the only one so far - whut!?).. nedit is IMO the best text/code editor ever for unix/X-windows - great syntax highlighting, normal keyboard shortcuts (i.e. not emacs or vi), and just plain nice and stable. If I have to do something in the console I'll use nano (which btw does have syntax highlighting in recent versions).

    When I started using unix (linux), the choice was basically between vi and emacs. Those I'd heard of - that unix users were split into two camps, vi-lovers and emacs-lovers (I can't understand either camp!)

    Vi: "I could start it, but I couldn't figure out how to make it edit files" ;)

    emacs: "hmm... it did edit text, but how the heck does one save the file? or exit? Ctrl-C doesn't work! I'll just reboot and I'll get back to the login prompt"

    hmm... what other editors did this CD get me?

    pico: "oh, Ctrl-X is exit - nice! and it actually edits text when I tell it to - nice! that's a keeper!"

    On the rare occasions I'm coding stuff in windows I'll be using VS2k3.

    Emacs might be a nice operating system, but the editor sucks.