
ubuntu 9.10 build & WinXP build



It helped; the game starts without a problem, but I get some disturbance in the textures.

Are there any options like -quickstart on Linux? I would like to turn off the sound because it doesn't play in good quality; I could probably record it if anyone needs to hear it.

Edited by jd823592

I see something strange with texture mipmaps on Linux when forcing S3TC - I assume you're getting the same problem. Haven't looked into what causes it - I guess we might just be lacking some mipmap levels and the drivers can't compress automatically, or something like that, so it can probably be fixed later.

-quickstart works on Linux too. The sound quality is a known problem with OpenAL and PulseAudio on Ubuntu.


Hm, double fail: the patch apparently didn't get attached (probably because the .patch extension isn't whitelisted by the forum?), and it didn't include a newly created file, so I can't commit. *sigh*

Will try again tomorrow evening.

As soon as someone runs the game on a filesystem with ISO-8859-1 names (which is quite common) containing non-ASCII characters, it's going to break.

Oh no :/ Come to think of it, I believe we have the same problem on Windows (NTFS not using UTF-8).

Is it feasible to determine the filesystem (and its encoding) on which our files are stored? (we'd have to do this for every mount point, but still)

i.e. handle pathnames internally as fs::path on Linux, fs::wpath on Windows

hm, sounds at least partially possible - however, we do need to do some string processing (not only appending blobs of characters, but also extracting the name of the mod / digging through pathnames to find out where to store the cached XMB). I have a bad feeling about this, at some point we'll need to understand the paths and their encodings.

ATM I am holding out hope for the fs-encoding-query option (we already do something similar in wfilesystem.cpp).

so it seems best for us to avoid encoding conversions as much as possible.

Right, at least this change moves us in that direction.


NTFS not using UTF-8
As far as I'm aware, that doesn't make sense. NTFS just deals with strings of arbitrary 16-bit values, and Win32's *W functions deal with strings of UCS-2/UTF-16 characters (depending on version of Windows), and there doesn't need to be any conversion in there, and there's no UTF-8 or other encodings involved. Am I misunderstanding how these things work, or what the problem is?
Is it feasible to determine the filesystem (and its encoding) on which our files are stored? (we'd have to do this for every mount point, but still)
Not on Linux - filesystems don't have encodings, they just handle paths as strings of bytes. Encoding is an application-level concept. Even if your system consistently uses UTF-8 everywhere, a rogue application could create a file whose name is not valid UTF-8. And even if filesystems did have consistently-used encodings that we could detect, any subdirectory could be symlinked to a different filesystem with different rules. These things aren't very likely in practice, but nor are they unimaginable (and filesystem encoding on Linux is a bit of a mess and easy to break), and I think we ought to implement something that's actually correct rather than something that's similarly complex but not quite right.

(Technically, some filesystem drivers (e.g. the NTFS ones) let you configure how characters on disk get encoded before they're returned via the byte string APIs, so encoding is sometimes also a filesystem concept. But that's just an implementation detail.)

(This article seems usefully informative about filesystems.)

(I'm not entirely sure how these concepts apply to OS X. HFS+ stores filenames as UTF-16 (in NFD) and probably converts to UTF-8 for the POSIX APIs; I've no idea what it does with non-ASCII on FAT filesystems. I guess the safest assumption is that if you get some bytes through the API and send them back out through a similar API, they'll be handled consistently but you can't rely on anything else.)
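That "paths are just byte strings" behaviour is easy to demonstrate. Here's a minimal sketch (the helper name and the /tmp location are mine, and it assumes a typical Linux filesystem; HFS+ on OS X may well reject the invalid byte):

```cpp
#include <cstdio>
#include <string>

// Create (and then delete) a file whose name contains a raw Latin-1
// byte (0xE9, "é"), which is NOT valid UTF-8. The kernel doesn't
// care: a path is just a NUL-terminated byte string with '/' as the
// separator. Encoding is left entirely to applications.
bool create_latin1_named_file(const std::string& dir)
{
    std::string path = dir + "/caf\xE9"; // invalid as UTF-8
    std::FILE* f = std::fopen(path.c_str(), "w");
    if (!f)
        return false;
    std::fclose(f);

    // Reopening with the exact same bytes finds the same file.
    f = std::fopen(path.c_str(), "r");
    bool found = (f != nullptr);
    if (f)
        std::fclose(f);
    std::remove(path.c_str());
    return found;
}
```

Any tool that then lists that directory and assumes UTF-8 names will choke on this file, which is exactly the rogue-application scenario described above.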

however, we do need to do some string processing (not only appending blobs of characters, but also extracting the name of the mod / digging through pathnames to find out where to store the cached XMB). I have a bad feeling about this, at some point we'll need to understand the paths and their encodings.
Yeah, that's a bit of a pain. We can probably require that none of our own data files use non-ASCII characters, but I'm not sure how much that simplifies things.

NTFS just deals with strings of arbitrary 16-bit values, and Win32's *W functions deal with strings of UCS-2/UTF-16 characters (depending on version of Windows), and there doesn't need to be any conversion in there, and there's no UTF-8 or other encodings involved. Am I misunderstanding how these things work, or what the problem is?

Right, but here's what I'm thinking: any codepoints outside the BMP cannot be stored in NTFS, so how are Asian filenames represented? (unless they fit within the BMP, but the Han characters alone are 70k in number).

Not on Linux - filesystems don't have encodings, they just handle paths as strings of bytes. Encoding is an application-level concept. Even if your system consistently uses UTF-8 everywhere, a rogue application could create a file whose name is not valid UTF-8. And even if filesystems did have consistently-used encodings that we could detect, any subdirectory could be symlinked to a different filesystem with different rules.

OK, so much for that idea.

Yeah, that's a bit of a pain. We can probably require that none of our own data files use non-ASCII characters, but I'm not sure how much that simplifies things.

Hm, that's a cop-out (I bet some people would be surprised that they can't specify savegame names with special characters).

However, even that doesn't yet solve the problem (see below).

What if you get some path from the system but then let the user specify the filename? Then you would need the conversion anyway, am I wrong?

Correct: the user enters characters in a known encoding (wchar_t, i.e. basically just the Unicode code point); we can't just paste those bytes onto the ones received from the system. If the system returned Latin-1 and we tack UTF-8 onto that, we have screwed the pooch. Maybe Linux is successful at sticking its head in the sand in this regard, but we don't only receive and pass along pathnames unchanged. User-specified paths are one example; there are also cases where we have to take an existing data path, strip off some parts, and add some characters (e.g. when determining the cache directory). If the encodings of any of these path sources happen to differ, we are definitely screwed.

I have a definite sense of unease about the distinct path vs. wpath types: not only would we need new string processing functions (more than just strcpy, see above), but moving towards less certainty (i.e. not knowing the encoding at all) seems a step in the wrong direction.

It looks like the only real source of trouble is Linux returning either Latin-1 or UTF-8. How about we add checks to wstring_from_utf8? If the bytes are actually Latin-1, we'd notice, because they're hopefully (proof needed) invalid UTF-8. In that case, we can assume Latin-1 (or maybe even the current locale) and copy that to wchar_t without too much trouble.
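A rough illustration of that idea, assuming we only distinguish "structurally valid UTF-8" from "fall back to Latin-1" (both helper names are hypothetical, not the engine's actual wstring_from_utf8):

```cpp
#include <cstdint>
#include <string>

// Structural UTF-8 validity check: correct lead bytes and the right
// number of continuation bytes. (It doesn't police overlong forms or
// surrogate code points; this is only a sketch of the detection idea.)
bool is_valid_utf8(const std::string& s)
{
    size_t i = 0;
    while (i < s.size())
    {
        unsigned char c = s[i];
        size_t len;
        if (c < 0x80) len = 1;
        else if ((c & 0xE0) == 0xC0) len = 2;
        else if ((c & 0xF0) == 0xE0) len = 3;
        else if ((c & 0xF8) == 0xF0) len = 4;
        else return false; // stray continuation or invalid lead byte
        if (i + len > s.size())
            return false;  // truncated sequence
        for (size_t j = 1; j < len; ++j)
            if ((s[i + j] & 0xC0) != 0x80)
                return false; // missing continuation byte
        i += len;
    }
    return true;
}

// Decode as UTF-8 if structurally valid; otherwise assume Latin-1,
// where each byte maps directly to the same code point (U+00..U+FF).
std::wstring wstring_from_bytes_with_fallback(const std::string& s)
{
    std::wstring out;
    if (!is_valid_utf8(s))
    {
        for (unsigned char c : s)
            out += static_cast<wchar_t>(c); // Latin-1 fallback
        return out;
    }
    size_t i = 0;
    while (i < s.size())
    {
        unsigned char c = s[i];
        uint32_t cp;
        size_t len;
        if (c < 0x80)              { cp = c;        len = 1; }
        else if ((c & 0xE0) == 0xC0) { cp = c & 0x1F; len = 2; }
        else if ((c & 0xF0) == 0xE0) { cp = c & 0x0F; len = 3; }
        else                        { cp = c & 0x07; len = 4; }
        for (size_t j = 1; j < len; ++j)
            cp = (cp << 6) | (s[i + j] & 0x3F);
        out += static_cast<wchar_t>(cp);
        i += len;
    }
    return out;
}
```

Note that this still doesn't solve the roundtripping problem raised below: the decoder can't remember which branch it took when the path is later re-encoded.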


any codepoints outside the BMP cannot be stored in NTFS, so how are Asian filenames represented? (unless they fit within the BMP, but the Han characters alone are 70k in number).
(Asian characters are generally in the BMP, via Han unification; the other planes are used for historic and obscure scripts and characters.)

If I copy-and-paste a U+10000 character into a filename in Explorer on Vista, then it acts like a single character in Explorer (for rendering and cursor movement etc), but FindFirstFileW returns it as two wchar_ts (0xD800 0xDC00), and it matches the wildcard "??" (not "?").

If I call CreateFileW with a name containing 0xD800 0xD800 (sic), it roundtrips fine through FindFirstFileW, and Explorer displays the name as containing two characters, and I can copy-and-paste them and they remain as 0xD800.

If I create 0xDC00 0xD800 then Explorer displays two characters, but the first character is impossible to select (the cursor skips straight over it and treats it as part of the previous character, unless it's the first character; if it's the first character and I paste a 0xD800 in front of it then the pair starts looking and acting like a single character).

So... NTFS really doesn't care. It just stores arbitrary 16-bit values. They get passed unchanged (and unvalidated) through the *W functions. Explorer's filename rendering handles properly-paired surrogate code units as if they were the appropriate non-BMP code point, and its filename editing UI seems to work by basically ignoring any high surrogate.

So on Windows/NTFS you can have filenames that contain unpaired surrogates and therefore cannot be losslessly converted to valid UTF-8. But if you use the normal UI for creating files, you'll end up with valid UTF-16 names (which always can be converted to UTF-8).
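The "valid UTF-16" condition can be stated mechanically: every high surrogate (0xD800..0xDBFF) must be immediately followed by a low surrogate (0xDC00..0xDFFF), and no low surrogate may stand alone. A sketch of such a check over a string of 16-bit code units (the function name is mine, not a Win32 or engine API):

```cpp
#include <string>

// Returns true iff the 16-bit string is valid UTF-16, i.e. contains
// no unpaired surrogate code units. Only such names can be losslessly
// converted to UTF-8 and back.
bool is_valid_utf16(const std::u16string& s)
{
    for (size_t i = 0; i < s.size(); ++i)
    {
        char16_t c = s[i];
        if (c >= 0xD800 && c <= 0xDBFF)      // high surrogate
        {
            if (i + 1 >= s.size())
                return false;                // truncated pair
            char16_t next = s[i + 1];
            if (next < 0xDC00 || next > 0xDFFF)
                return false;                // unpaired high surrogate
            ++i;                             // skip the low half
        }
        else if (c >= 0xDC00 && c <= 0xDFFF)
            return false;                    // stray low surrogate
    }
    return true;
}
```

On the observations above, names created through the normal Explorer UI would pass this check, while the hand-crafted 0xD800 0xD800 name would fail it.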

It looks like the only real source of trouble is Linux returning either Latin-1 or UTF-8.
It's not just those two choices, it can be any byte-based encoding. I expect some people use e.g. Shift-JIS in practice. (It can even be EBCDIC (which turns most of my filenames into question marks), though locale-gen warns "not ASCII compatible, locale not ISO C compliant". At least it can't be UTF-16/UTF-32.)
How about we add checks to wstring_from_utf8? If the bytes are actually Latin-1, we'd notice, because they're hopefully (proof needed) invalid UTF-8. In that case, we can assume Latin-1 (or maybe even the current locale) and copy that to wchar_t without too much trouble.
That doesn't sound like it solves the roundtripping problem - if we see we're in the path /home/andré/... and decode it as UTF-8 or Latin-1 (depending on whether é is one byte or two), and then we want to write the file /home/andré/.config/..., how do we know how to encode the path again?

So I still think the only 'proper', robust, theoretically mostly correct approach is:

* On Windows, treat paths as strings of arbitrary 16-bit values, as much as possible. (Characters like "/" still have special meaning; but 0xD800 is just an arbitrary number, it's not a Unicode anything.)

* On Linux, treat paths as strings of arbitrary 8-bit values, as much as possible. (Characters like "/" still have special meaning; 0xC0 is just a number.)

* In both cases, conversion to a Unicode string (in any encoding) may be lossy, so don't do that unless lossiness is acceptable.

* When filenames have to be exposed to the user (e.g. in log file output, or in saved game names (if we don't just give them numerical filenames)):

- * On Windows, encode/decode as UCS-2. (Our user input can't contain non-BMP characters, because we're restricted to the BMP internally, so we don't need to bother properly encoding as UTF-16.)

- * On Linux, encode/decode based on the current locale environment settings (defaulting to UTF-8 if unspecified). (The settings might be wrong, but this is the best we can do.)

- * In both cases, encode/decode as little of the path as possible (e.g. encode the saved game name before concatenating it onto the opaque data directory pathname, rather than decoding the pathname first then concatenating then encoding).

* And use numerical filenames for saved games, so we don't have to worry about users entering Unicode or slashes or quotes etc.

(We don't have to do things correctly - it should work in 99% of cases if we just hard-code it as UTF-8 or whatever. But I think there are non-zero cases where it would break, because the user has a weird environment, and it's possible for us to make it work more reliably, and I don't like leaving intentional bugs.)
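To illustrate the "encode as little of the path as possible" point: the sketch below encodes only the user-entered leaf name (here as UTF-8, standing in for "whatever the current locale says") and concatenates it onto the data directory, which stays an opaque byte string exactly as the OS handed it over and is never decoded. The function names, the BMP-only assumption, and the .sav extension are all illustrative, not actual engine code:

```cpp
#include <cstdint>
#include <string>

// Encode one code point as UTF-8. BMP-only input is assumed, matching
// the discussion above (wchar_t holds the code point directly).
static void append_utf8(std::string& out, uint32_t cp)
{
    if (cp < 0x80)
        out += static_cast<char>(cp);
    else if (cp < 0x800)
    {
        out += static_cast<char>(0xC0 | (cp >> 6));
        out += static_cast<char>(0x80 | (cp & 0x3F));
    }
    else
    {
        out += static_cast<char>(0xE0 | (cp >> 12));
        out += static_cast<char>(0x80 | ((cp >> 6) & 0x3F));
        out += static_cast<char>(0x80 | (cp & 0x3F));
    }
}

// Hypothetical helper: the data directory is treated as an opaque
// byte string (never decoded or re-encoded); only the user-supplied
// savegame name is encoded, then appended.
std::string make_savegame_path(const std::string& opaque_data_dir,
                               const std::wstring& user_name)
{
    std::string path = opaque_data_dir;
    path += '/';
    for (wchar_t c : user_name)
        append_utf8(path, static_cast<uint32_t>(c));
    path += ".sav"; // illustrative extension
    return path;
}
```

Because the directory bytes pass through untouched, the result roundtrips correctly even when the surrounding path isn't valid UTF-8.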


Hello,

I can't compile on Ubuntu 9.10 (32-bit):

$ make
==== Building mocks_real ====
make[1]: `../../../binaries/system/libmocks_real_dbg.a' is up to date.
==== Building network ====
make[1]: `../../../binaries/system/libnetwork_dbg.a' is up to date.
==== Building engine ====
make[1]: `../../../binaries/system/libengine_dbg.a' is up to date.
==== Building graphics ====
make[1]: `../../../binaries/system/libgraphics_dbg.a' is up to date.
==== Building i18n ====
make[1]: `../../../binaries/system/libi18n_dbg.a' is up to date.
==== Building atlas ====
make[1]: `../../../binaries/system/libatlas_dbg.a' is up to date.
==== Building gui ====
make[1]: `../../../binaries/system/libgui_dbg.a' is up to date.
==== Building lowlevel ====
make[1]: `../../../binaries/system/liblowlevel_dbg.a' is up to date.
==== Building pyrogenesis ====
make[1]: `../../../binaries/system/pyrogenesis_dbg' is up to date.
==== Building mocks_test ====
make[1]: `../../../binaries/system/libmocks_test_dbg.a' is up to date.
==== Building AtlasObject ====
make[1]: `../../../binaries/system/libAtlasObject_dbg.a' is up to date.
==== Building AtlasScript ====
make[1]: `../../../binaries/system/libAtlasScript_dbg.a' is up to date.
==== Building wxJS ====
radiobtn.cpp
cc1plus: error: obj/wxJS_Debug/precompiled.h: No such file or directory
cc1plus: error: one or more PCH files were found, but they were invalid
cc1plus: error: use -Winvalid-pch for more information
In file included from /usr/include/wx-2.8/wx/dcgraph.h:17,
from /usr/include/wx-2.8/wx/dc.h:892,
from /usr/include/wx-2.8/wx/wx.h:48,
from /usr/include/wx-2.8/wx/wxprec.h:68,
from ../../../source/tools/atlas/wxJS/precompiled.h:23,
from ../../../source/tools/atlas/wxJS/gui/control/radiobtn.cpp:1:
/usr/include/wx-2.8/wx/geometry.h:91: warning: redundant redeclaration of 'wxPoint2DInt operator*(wxInt32, const wxPoint2DInt&)' in same scope
/usr/include/wx-2.8/wx/geometry.h:90: warning: previous declaration of 'wxPoint2DInt operator*(wxInt32, const wxPoint2DInt&)'
/usr/include/wx-2.8/wx/geometry.h:93: warning: redundant redeclaration of 'wxPoint2DInt operator*(const wxPoint2DInt&, wxInt32)' in same scope
/usr/include/wx-2.8/wx/geometry.h:92: warning: previous declaration of 'wxPoint2DInt operator*(const wxPoint2DInt&, wxInt32)'
/usr/include/wx-2.8/wx/geometry.h:96: warning: redundant redeclaration of 'wxPoint2DInt operator/(const wxPoint2DInt&, wxInt32)' in same scope
/usr/include/wx-2.8/wx/geometry.h:95: warning: previous declaration of 'wxPoint2DInt operator/(const wxPoint2DInt&, wxInt32)'
make[1]: *** [obj/wxJS_Debug/radiobtn.o] Error 1
make: *** [wxJS] Error 2


OK, I compiled the source:

# apt-get source libxml2
cd libxml2-2.7.5.dfsg
libxml2-2.7.5.dfsg# ./configure --prefix=/usr --with-history --with-threads
libxml2-2.7.5.dfsg# make
libxml2-2.7.5.dfsg# make install

So now 0ad compiles.

binaries/system$ ./pyrogenesis_dbg
TIMER| InitVfs: 30.3825 ms
TIMER| InitScripting: 6.79384 ms
TIMER| CONFIG_Init: 19.8122 ms
TIMER| write_sys_info: 1.2212 ms
TIMER| ps_console: 10.0592 ms
TIMER| ps_lang_hotkeys: 33.2977 ms
TIMER| common/setup.xml: 3.77364 ms
TIMER| common/styles.xml: 25.2676 ms
TIMER| common/sprite1.xml: 141.386 ms
TIMER| common/init.xml: 102.074 ms
TIMER| pregame/mainmenu.xml: 427.651 ms
TIMER| common/global.xml: 3.00624 ms
TIMER| InitRenderer: 68.3667 ms
TIMER| SimulationInit: 26.9646 ms
TIMER| Init_miscgamesection: 134.731 ms
SND| alc_init: success, using ALSA Software
TIMER| common/setup.xml: 2.5718 ms
TIMER| common/styles.xml: 64.8447 ms
TIMER| common/sprite1.xml: 265.818 ms
TIMER| common/init.xml: 212.75 ms
TIMER| loading/loading.xml: 22.9264 ms
TIMER| common/global.xml: 1.76368 ms
TIMER| common/setup.xml: 2.36204 ms
TIMER| common/styles.xml: 32.4064 ms
TIMER| common/sprite1.xml: 189.239 ms
TIMER| common/init.xml: 149.481 ms
TIMER| session/session.xml: 225.339 ms
TIMER| session/manual.xml: 38.7122 ms
TIMER| common/global.xml: 27.1747 ms
GAME STARTED, ALL INIT COMPLETE

Thank you for your help :)


This is what I get after a clean & update & make:


/usr/bin/ld: ../../../binaries/system/libwxJS.a(treeevt.o): relocation R_X86_64_PC32 against undefined symbol `wxKeyEvent::Clone() const@@WXU_2.8' can not be used when making a shared object; recompile with -fPIC
/usr/bin/ld: final link failed: Bad value
collect2: ld returned 1 exit status
make[1]: *** [../../../binaries/system/libAtlasUI.so] Error 1
make: *** [AtlasUI] Error 2

I was updating with --with-spidermonkey-tip.

