LazyGreyMatter Posted April 7, 2021

Following the wiki article "Getting Started with Reinforcement Learning", I have been trying, so far unsuccessfully, to run the scenarios and train the model. Loading the maps from zero_ad_rl keeps failing: the maps zipped as "0ad_rl_maps.zip" give one of two errors:
- "Unable to load textures - too old". (Temporary fix: change the scenario version in the map XML from 6 to 7, which is the current FILE_READ_VERSION in the source code.)
- "Unable to load map - check application logs", which boils down to a .pmp File_OpenFailed error.
Since I am new to the modding side of this, can anyone please point me to how to fix these issues, especially the .pmp file not loading?
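For the first error, a minimal sketch of the kind of version bump described above is shown below. This is not the actual Phab:P232 script; the maps directory path and the assumption that each scenario XML root looks like <Scenario version="6"> are illustrative only, so verify them against your own files.

```python
# Hypothetical helper: bump the scenario version from 6 to 7 in map XML files.
# Assumes the maps sit in a "maps/scenarios" folder inside the mod and that the
# root element carries a version="6" attribute -- check your files first.
import re
from pathlib import Path

MAPS_DIR = Path("rl-scenarios/maps/scenarios")  # adjust to your mod layout

for xml_file in MAPS_DIR.glob("*.xml"):
    text = xml_file.read_text(encoding="utf-8")
    updated = re.sub(r'(<Scenario\s+version=)"6"', r'\1"7"', text, count=1)
    if updated != text:
        xml_file.write_text(updated, encoding="utf-8")
        print(f"Bumped {xml_file.name} to scenario version 7")
```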
Lion.Kanzen Posted April 7, 2021

Let me help you. @Stan`
LazyGreyMatter Posted April 7, 2021 (Author)

Sure, any help is greatly appreciated!
Lion.Kanzen Posted April 7, 2021

8 minutes ago, LazyGreyMatter said: "Sure, any help is greatly appreciated!"

While you wait for an official response, please have a look at the following link to see whether it is related to your topic.
LazyGreyMatter Posted April 7, 2021 (Author)

I appreciate you sending me the link! Unfortunately, that did not solve my problem. Even after running the script (without errors) to update the maps to version 7, this is the error I am getting: < pmp File_OpenFailed >
Lion.Kanzen Posted April 7, 2021

@Angen @Freagarach

By this time, our programmers must be sleeping. (Euro-zone time.)
Stan` Posted April 8, 2021

0ad_rl_maps.zip

Try this mod. Ping @irishninja
Langbart Posted April 8, 2021 (edited)

I followed the instructions from the wiki article wiki/GettingStartedReinforcementLearning, but I run into an issue after this command:

    picus@Picus zero_ad_rl-master % python3 -m zero_ad_rl.train --help
    Traceback (most recent call last):
      File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/runpy.py", line 193, in _run_module_as_main
        return _run_code(code, main_globals, None,
      File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/runpy.py", line 86, in _run_code
        exec(code, run_globals)
      File "/Users/picus/0ad/source/tools/rlclient/zero_ad_rl-master/zero_ad_rl/train.py", line 5, in <module>
        from ray.rllib.train import create_parser, run
      File "/Library/Python/3.8/site-packages/ray/__init__.py", line 63, in <module>
        import ray._raylet  # noqa: E402
      File "python/ray/_raylet.pyx", line 20, in init ray._raylet
    ImportError: dlopen(/Library/Python/3.8/site-packages/ray/thirdparty_files/setproctitle.cpython-38-darwin.so, 2): Symbol not found: _Py_GetArgcArgv
      Referenced from: /Library/Python/3.8/site-packages/ray/thirdparty_files/setproctitle.cpython-38-darwin.so
      Expected in: flat namespace in /Library/Python/3.8/site-packages/ray/thirdparty_files/setproctitle.cpython-38-darwin.so
    picus@Picus zero_ad_rl-master %

Procedure, macOS 10.15.7 (8/Apr/21):

Download & Install 0 A.D.
- Download and install 0 A.D. from source (just follow wiki/BuildInstructions).

Download & Install Python 3.8
- Install Homebrew and python@3.8.
- I had to install python@3.8 because python@3.9 did not work for me. Make sure you uninstall python@3.9 completely; otherwise I had problems installing ray. (See docs.ray.io/en/master/installation.html)

Install the 'zero_ad' Python client
- Navigate to this folder in the 0 A.D. source tree: ~/0ad/source/tools/rlclient/python
- Run: sudo -H python3 -m pip install .
  Confirmation message: Successfully installed zero-ad-0.0.1
- Run: sudo -H python3 -m pip install -r requirements-dev.txt
  Confirmation message: Successfully installed iniconfig-1.1.1 pluggy-0.13.1 py-1.10.0 pytest-6.2.3 toml-0.10.2
- Run the tests:
    0ad/binaries/system/pyrogenesis --rl-interface=127.0.0.1:6000 --autostart-nonvisual --mod=public
    python3 -m pytest
  Output:
    picus@Picus python % python3 -m pytest
    ============================== test session starts ===============================
    platform darwin -- Python 3.8.2, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
    rootdir: /Users/picus/0ad/source/tools/rlclient/python
    collected 10 items
    tests/test_actions.py ......                                             [ 60%]
    tests/test_evaluate.py ....                                              [100%]
    ============================== 10 passed in 22.50s ===============================
- We can check that our installation worked by running one of the example scripts:
    python3 ./simple-example.py
  Output:
    picus@Picus samples % python3 ./simple-example.py
    female citizen's max health is 34.99985
    {'id': 4753, 'template': 'units/spart/support_female_citizen', 'position': [612.9468231201172, 619.9051971435547], 'angle': 0.756927490234375, 'hitpoints': 34.99985, 'invulnerability': False, 'owner': 1, 'idle': False, 'stance': 'passive', 'unitAIState': 'INDIVIDUAL.REPAIR.APPROACHING', 'unitAIOrderData': [{'target': 4776, 'autocontinue': True, 'force': True}], 'resourceCarrying': [], 'garrisonHolderID': 0}

Download & Install 'zero_ad_rl'
- Download "zero_ad_rl" from here: github.com/brollb/zero_ad_rl
- First install "ray[rllib]" and then install "zero_ad_rl":
    sudo -H python3 -m pip install 'ray[rllib]'   (pypi.org/project/ray/)
    sudo -H python3 -m pip install -e .
  Confirmation message: Successfully installed zero-ad-rl
- In the "zero_ad_rl" folder you will find a zip file named '0ad_rl_maps.zip'. Unzip it, rename the folder from "0ad_rl_maps" to "rl-scenarios", and add it to your 0 A.D. mods folder. (The mod.json file names this mod "rl-scenarios"; if the mod folder is not renamed to match, the maps will not be displayed.)
- If you open the game with the mod enabled, you should see some new maps under Scenario, e.g. "Cavalry Vs Spearmen" or "Cavalry Vs Slingers".
- Some maps use the old version 6 in their XML files; you have to update them from version 6 to version 7 with Phab:P232, otherwise loading will fail.
- Even after I did that, I still got 'entity' and 'CCacheLoader' errors for maps like "CavalaryVSInfantry"; only "CavalryVsSlingers.xml" and "CavalryVsSpearmen.xml" worked for me. UPDATE: Stan fixed this problem by updating the maps.

Training our agent
    pyrogenesis --rl-interface=127.0.0.1:6000 --autostart-nonvisual --mod=rl-scenarios --mod=public
    python3 -m zero_ad_rl.train --help
  This fails with the error shown at the top of this post. UPDATE: As described here (github.com/ray-project/ray/issues/10428), it does work with Python 3.8.5 but not with 3.8.9 (the current python@3.8 Homebrew version).

Edited April 8, 2021 by Langbart (added Python 3.8.5 remark)
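A note on the example script mentioned above: below is a rough sketch of what such a connectivity check looks like with the zero_ad client. The class and method names (ZeroAD, reset, units) and the sample config path are written from memory and should be treated as assumptions; refer to simple-example.py in source/tools/rlclient/python/samples for the actual API.

```python
# Rough sketch of a zero_ad connectivity check (names assumed, see simple-example.py).
# Requires pyrogenesis to already be running with:
#   pyrogenesis --rl-interface=127.0.0.1:6000 --autostart-nonvisual --mod=public
import zero_ad

game = zero_ad.ZeroAD('http://127.0.0.1:6000')  # address used throughout this thread

# Load one of the sample scenario configs shipped with the client (path assumed).
with open('samples/arena.json') as f:
    config = f.read()

state = game.reset(config)
print(state.units(owner=1))  # inspect player 1's units, as in the output quoted above
```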
Stan` Posted April 8, 2021

You're missing the argparse module, I think.
Langbart Posted April 8, 2021

@LazyGreyMatter Is your problem fixed? I got it to work on my computer. I briefly had a problem like the one you describe, but the cause was that I had forgotten to include --mod=rl-scenarios when running the RL interface:

    pyrogenesis --rl-interface=127.0.0.1:6000 --autostart-nonvisual --mod=rl-scenarios --mod=public
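For anyone retyping those flags often, here is a small illustrative launcher sketch that starts the engine with exactly the flags from the command above. It assumes the pyrogenesis binary is on your PATH; otherwise substitute the full path (e.g. the 0ad/binaries/system location used earlier in this thread).

```python
# Illustrative sketch: launch pyrogenesis with the RL interface and both mods,
# using the flags from the post above. Assumes "pyrogenesis" is on PATH.
import subprocess

cmd = [
    "pyrogenesis",
    "--rl-interface=127.0.0.1:6000",
    "--autostart-nonvisual",
    "--mod=rl-scenarios",
    "--mod=public",
]
proc = subprocess.Popen(cmd)
print(f"Started pyrogenesis (pid {proc.pid}); the RL client can now connect to 127.0.0.1:6000")
```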