Everything posted by Itms

  1. Fantastic, thanks for your interest. Indeed we need to get the SM upgrade done, and I have the experience to make it happen, but I will have to focus on the git migration first, so I'll probably be entirely occupied by that for the month of August. So I can help you, but I cannot entirely take it off your hands for now - and it is a good opportunity to try your hand at it if you are interested.

The plan is to upgrade to a more recent ESR, yes. Sometimes it's better to avoid skipping an ESR (especially if the newest one is too new and has undetected issues with embedding), but sometimes there is no need to keep a previous ESR... I'll have to see when I look into this specific one.

It sounds great to have your help and input on the use of submodules! But, considering the work needed, the imminence of the migration, and the unresolved discussion about using them or not, I would like to perform the migration with the source-libs SVN repo, and maybe decide on using submodules in the near future, after the migration. You can certainly start working on that, but you would be designing an incremental improvement upon my current work, so expect to have to rebase a lot. Yes: even with submodules we'd need a wrapper script somewhere. I believe my get/build-libraries scripts are here to stay, and they could abstract away the use of svn export or git submodules without breaking the user workflow if we switch to submodules (a rough sketch of what I mean follows below).

Thanks for your input! I saw Gitea support being launched in the latest release and I didn't know whether we could be interested in that. I will look into SHA-256 support: if we're switching to git this late, we may as well use modern features and be future-proof.

Thank you so much for the work. I will try to get other developers to give their opinion, but now at least there is no technical blocker against either solution for translations.
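To illustrate the wrapper idea mentioned above: a minimal sketch of a library-fetching helper whose backend (svn export today, git submodules later) stays hidden from the caller. This is only a hypothetical illustration, not the actual get/build-libraries scripts; the URL, paths and function names are assumptions.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: fetch source-libs either via `svn export` or via a
git submodule, so callers never need to care which backend is in use."""
import subprocess
from pathlib import Path

# Assumed values, for illustration only.
SOURCE_LIBS_URL = "https://svn.wildfiregames.com/public/source-libs/trunk"
DEST = Path("libraries/source")

def fetch_with_svn_export(revision: str) -> None:
    # `svn export` gives a snapshot of the libraries without a working copy.
    subprocess.run(["svn", "export", "--force", "-r", revision,
                    SOURCE_LIBS_URL, str(DEST)], check=True)

def fetch_with_git_submodule() -> None:
    # With submodules the pinned commit lives in the superproject,
    # so a plain init/update is enough.
    subprocess.run(["git", "submodule", "update", "--init", "--", str(DEST)],
                   check=True)

def get_source_libs(revision: str = "HEAD") -> None:
    """Entry point used by the build scripts; the backend is an implementation detail."""
    if Path(".gitmodules").exists():
        fetch_with_git_submodule()
    else:
        fetch_with_svn_export(revision)

if __name__ == "__main__":
    get_source_libs()
```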
  2. I forgot that one. And I am almost certain there is a technical term for what I'm trying to describe, but I don't know it, which is very frustrating. What I mean is that translation commits are not a single unit of work. When we commit graphical assets, we don't commit a bunch of new graphics, work-in-progress models, and a few shader patches all together. We try to keep commits atomic: a single commit is the work of one artist, or a small group of artists, working on a specific asset or collection of assets. A translation update, by contrast, is just a large dump of the current state of our translators' work across the entire project. It's not as though translators coordinated to publish a consistent update for French, or a multilingual update specifically covering the tutorial. If it were like that, those commits would be crafted by humans, and I'd keep them in the history; on the contrary, I generally oppose generated commits (even if the generation is based on human work). A rule of thumb could be: can one write a meaningful commit message? If you can't be more specific than "Updated PO and POT files", the commit is not very meaningful. You wouldn't imagine pushing an "Updated PNG files" commit.

I'd like to explain myself further about the date-triggered commits. What makes the translation commits especially meaningless is that they would be different if the Jenkins job ran ten minutes later. They depend on the state of Transifex at the moment the script is run. I believe there is a technical term for that too, but I don't know it either, and I think it's bad practice to have such contingent commits in large quantity in the history (as I said, those translation commits would take up 80% of the size of the git repo if I left them in!). Obviously I'm proposing just such a date-dependent system when I propose nightly builds, but that's a typical Continuous Delivery process, where users receive a new version of the game for testing. Since we have no choice but to pull Transifex updates regularly, it makes sense to me to include them in that regular delivery. But I don't think automatic continuous delivery is supposed to flow the other way, back into the source repository.
  3. I agree! But I am still a bit concerned about timing, as my current full-time availability is going to expire. It's definitely about trade-offs and there is no ideal solution... Maybe we could find one over time, but I feel like I don't have that time. I would love to find some middle ground on the translations; I thought the separate repo could be the solution, but it doesn't look like it is.

Sorry, not clear again! I mean the code under i18n_helper, extractors, and also tests. It looks like this code is shared between the POT generation script and the other scripts, so I kept it duplicated; it would require more work to cleanly separate all that. It's probably not worth it.

Hum, me too, to be fair. But I'll post on the staff forums to flag my schedule and deadlines, and hope to receive more input.

Thank you so much for the kind words, and I understand. I had somewhat forgotten about the hurdles of written communication in such a project, and I really appreciate your acknowledgement of my work.
  4. @Dunedan Ah, dammit, our messages crossed. Written communication is so frustrating. I'm talking about the translations of non-0ad projects in that sentence.

Most of the complexity I encountered when setting up the separate repo would also exist with translations in the main git repo (many Jenkins jobs, much hacking around git+Jenkins peculiarities, checkDiff.py not compatible with git, ...), so I'm not sure that is correct.

I separated them without issue; the only problem is with our bundled modules, but I don't know enough about those Python scripts. Feel free to take a look.

Yeah, it is a big downside. And it adds to the complexity of the release process, as deciding which translations get in would have to be done much sooner in the cycle.

No, you can do that all the time, since you can patch the build scripts to download stuff from elsewhere. We do it routinely when working on SM upgrades. SpiderMonkey contains the tarball, and all source libs contain Windows prebuilt binaries.

That's a good point, but it doesn't work for web views. If it were only personal preference, well, I'd go with my own personal preferences, since I'm the one working on this. But I am indeed trying to come up with the best experience for contributors, so I'm listening to your feedback.

At this point this translations issue is getting out of hand for me: I've been dedicating far too much time and energy to it recently, while I dreamt of actually planning the migration. And since I've been pondering how to handle translations for a year and a half now, it's difficult for me to consider all the options fairly: I'm inclined to keep the system I designed. So I suppose I'll just give up on that issue and accept keeping the translations in the repo. But I have a condition: that you update the checkDiff.py script to work with git! I'm not going to spend extra days on this thing.

Whoops, I thought you had access; just fixed that.

Well... I was very proud to have reduced the size of that repository through careful design, and now I'm just throwing that part in the trash. To me it really feels like debasing my work, but I'll have to hope I'm wrong.

For translations I trust your judgment, but libraries are an entirely different problem, and keeping them in the repo is not possible. The idea of submodules is (probably) good, but it can be done in the future if it actually bothers contributors. Having been maintaining those libs for seven years now, I'm much more confident about that part.
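Regarding updating checkDiff.py to work with git: a hedged sketch of what a git-based version could look like, reverting PO/POT files whose only changes are noise. The real checkDiff.py has more rules than this; the "noise" header patterns below are assumptions for the example.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of a git-based checkDiff: discard .po/.pot changes that
consist only of regenerated header dates."""
import subprocess

# Diff content treated as noise; an assumption for this example.
NOISE_PREFIXES = ('"POT-Creation-Date', '"PO-Revision-Date',
                  "POT-Creation-Date:", "PO-Revision-Date:")

def changed_catalog_files():
    out = subprocess.run(["git", "diff", "--name-only", "--", "*.po", "*.pot"],
                         capture_output=True, text=True, check=True).stdout
    return [line for line in out.splitlines() if line]

def is_relevant(path: str) -> bool:
    diff = subprocess.run(["git", "diff", "-U0", "--", path],
                          capture_output=True, text=True, check=True).stdout
    for line in diff.splitlines():
        if line.startswith(("+++", "---", "@@", "diff ", "index ")):
            continue  # diff metadata, not file content
        if line.startswith(("+", "-")):
            content = line[1:].strip()
            if not any(content.startswith(p) for p in NOISE_PREFIXES):
                return True  # a real translation change
    return False

def main():
    for path in changed_catalog_files():
        if not is_relevant(path):
            # Drop the noise-only change from the working tree.
            subprocess.run(["git", "checkout", "--", path], check=True)

if __name__ == "__main__":
    main()
```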
  5. Thanks for giving a hand. In other news, I have added SPIR-V generation (still running) to the nightly build, and, through a sprint, I also set up a separate translations repository, to see how it would work in practice. I'd have to document it on the dev environment page if we go with it, but basically:

- upon each commit to the main repo, POT files are generated and pushed to the translations repo (for the future: ideally, after that POT update, we should push templates to Transifex instead of letting them autoupdate daily)
- the translations repo pulls updates from Transifex
- in the future, we should run the check script and also lint PO files, and follow translation issues in that repo; we should be able to eliminate issues such as #4250

This system still has some rough edges, but here are my comments after developing it:

- the concerns are nicely separated now: all interaction with Transifex happens in the translations repo, which is a really good thing. This separate repo will be the perfect place to collaborate with translators and fix long-standing issues such as #4250
- the base git repo only sends generated POT files, using messages.json and updateTemplates.py. I deleted all the other scripts and all the .tx/config files
- the translations repo holds the majority of the i18n Python scripts, which do not need to know about the structure of the data/ dir anymore; that should make their maintenance simpler
- the checkDiff.py script does not work with git anymore; it was developed specifically for SVN! So the translations git repo gets all PO/POT updates even when they are irrelevant. It is not a big deal for that auxiliary repo, but this is another dealbreaker against the idea of having translations in the main git repo
- this new repo adds some complexity to the migration, as those scripts need to receive changes for the new infrastructure
- the nightly repo (and the git repo) can fetch the translations without using svn export... but git does not have that "export" feature, so the new fetch script relies on creating a temporary shallow clone, which is a bit yucky (a sketch of that trick follows after this post)
- the Jenkins setup is much more complex now, with two extra jobs (maybe three if we add a push job to Transifex) on top of the nightly-build job. It is impractical to commit and push to git using Jenkins; this is definitely not a standard use case, and my pipelines may break in the future if Jenkins features evolve (this reinforces my prejudice against letting bots push to git)
- the Jenkins setup around the translation updates is not covered by Gitea features; it will be impractical to set up post-commit triggers for this

In a nutshell, I think this alternative way of handling translations, on the pros side: separates issues, addresses some of your concerns, and gives us much more flexibility in our interactions with Transifex, with room for improvement; on the cons side: it is more complex to maintain on the Jenkins side and would require extra development work from you and other Python devs around the migration.
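About the "temporary shallow clone" trick mentioned in the list above: roughly, emulate `svn export` by cloning only the branch tip into a temporary directory and copying the files out without the .git metadata. A hedged sketch only; the repository URL and destination path are made up for the example.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: emulate `svn export` for a git repository by making a
temporary shallow clone and copying out the files (no .git directory kept)."""
import shutil
import subprocess
import tempfile
from pathlib import Path

# Assumed values, for illustration only.
TRANSLATIONS_REPO = "https://gitea.example.org/0ad/translations.git"
DESTINATION = Path("binaries/data/l10n")

def export_translations(branch: str = "main") -> None:
    with tempfile.TemporaryDirectory() as tmp:
        # --depth 1 keeps the download small: only the tip of the branch.
        subprocess.run(["git", "clone", "--depth", "1", "--branch", branch,
                        TRANSLATIONS_REPO, tmp], check=True)
        shutil.rmtree(Path(tmp) / ".git")  # drop history, like `svn export`
        DESTINATION.mkdir(parents=True, exist_ok=True)
        shutil.copytree(tmp, DESTINATION, dirs_exist_ok=True)

if __name__ == "__main__":
    export_translations()
```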
  6. Proof was submitted by PM; I merged the user's temporary account into their main account, with restored access.
  7. Hello, what is your main account? I can check whether your IPs match. Otherwise you'll have to find another way to prove that you are behind the other account. If we have proof, we can merge your accounts without a problem, but otherwise we obviously won't do it.
  8. @Dunedan Thinking about it, if you want to give a hand, you could try taking a look at creating that translations repo. You should have access to create repositories on Gitea under the 0ad/ organization. Right now I'm focused on shaders and other Jenkins improvements, so I'm not going to look at translations until next week.
  9. If the exported translations are not committed, that feedback does not actually happen... Do you think it could create actual issues?

Yes, the first one isn't possible. The second one I hadn't considered, but maybe it would be nice. I am thinking this "translations" repository could even allow us to store the translations of official/semi-official mods. Those would not be pulled into the git repo and the nightly build of 0ad, but they could be synced with Transifex, allowing our translator community to translate them. The modders would then pull translations into their mods when they release them, and push POT files to the repository whenever they wish. For the third one, I answer further below.

Well, it wouldn't be possible to review changes to the libraries themselves, but changes to the 0 A.D. code would still be reviewed and even tested by CI. This is indeed problematic, but it is the same situation as today. Take for instance D5002: the contributor cannot publish the SM upgrade itself inside that diff; the diff only consists of the changes in the game. Right now that situation is too complicated for our CI pipelines to handle (arc patch doesn't work well with binaries), so the builds for those diffs always fail and make a lot of noise. With git, it would be possible for a contributor to send big binaries in their PR for initial review of the whole. Then we could commit the update to source-libs and have a final round of CI on the final PR, which would just update the SVN revision in the libraries/ scripts, before committing. So that addresses your concern.

windows-libs changes are currently unreviewable, as they are just me or Stan sending new binaries to SVN. At least with the new system we could send the new binaries, then create a PR that just updates the SVN revision in the libraries/ scripts, so that CI can run on it. So it's still a bit unreviewable, but it will actually go through CI before commit.

Ah, that's right! Well, it is arbitrary, or at least based on feelings (but I think it's important to consider those). One of my issues is that those commits would lack justification. The history of the repo would contain a lot of commits that do not address an issue; they would just be commits happening because "translations are updated every Monday and Friday". Moreover, those commits have the big downside of highlighting low development activity. It feels very bad to see that autobuild commits more than the team in times of reduced activity. This is awful for morale. My proposal allows us to have daily updated translations, which is better for testing, but if we keep translations in the git repo, daily updates would dwarf actual human contributions. Hence, if my current system really irks you, your second proposal above is the only one I would find acceptable.

Basically, I think the issue with SPIR-V and with translations is the same: we have files that are needed by the end user (including nightly testers) but which cannot be committed to the git repo (for translations, replace "cannot be" with "I would rather they were not"). I figured that if I generated shaders in the nightly build and provided a way to export them into the git repo if developers needed them, I could just have the translations follow suit, and fix two problems with the same consistent solution.

You can contribute PRs to the private dev-migration repo on Gitea. But I agree it's a difficult project to collaborate on. Your thoughtful feedback is much appreciated though.

By "large" I was thinking about translations (and "very" with shaders on top), but you're right that going from 100MiB to 700MiB would probably not bother users too much (I, on the other hand, would be sad to miss the opportunity to reduce the size by such a factor!), and indeed I hadn't realized that increasing the frequency of translation commits would not increase the size that much. Thanks again for your feedback. I'll try and think about the idea of a cleaner, separate repo for the translations.
  10. By the way, since I also read about that on IRC: it is very likely that my urgent work after the migration will be a (partial or full) SpiderMonkey upgrade. Having handled several of those now, I am thrilled (that's not an exaggeration) about performing that future upgrade under the new git setup I designed, mostly because CI will finally be able to handle it. I think I'm a good judge of whether the setup is well suited to performing library upgrades. If I stumble on rough edges, that will be the opportunity to address them, and if I don't, it will be a good example in the repo history of how to perform such an upgrade in the new setup. To reiterate, maybe it would be even better with submodules, but I don't know enough about them to design a submodule-based setup myself. I'm not against using them in the future.
  11. Thanks for the detailed point of view, I'll try to address it from my perspective. (I read on IRC about feeling sorry for giving me more work; don't be! I was aware of that possibility, since I worked on this migration on my own. Some parts of it are a bit opinionated: I tried to build my own ideal setup. That said, I am not against anything in principle, but I'm not going to overwork myself: whenever I am convinced by a proposal, I'll carry it out with pleasure, but if I'm not, I'll let someone else do it.)

My number one expectation is: ditch Phabricator (well, they ditched us first, as implodedok put it). Stan hinted at that above. We need to continue having a collaboration platform, and Phabricator was the only big one that provided support for SVN. But apart from that, I have nothing against SVN. It's still a very usable tool, with many upsides, and as long as we had a forge allowing us to collaborate, we could do without branches and PRs. That's not the case anymore, so we have to switch to git. You (and others) dream of stopping using SVN entirely, but I also had to convince people who don't want to use git at all. I tried to find a middle ground here, exploiting the main advantages of both tools, and I'm quite happy with the result.

I disagree with that part. You'd need to have Subversion installed on your system (just like now), but you wouldn't have to interact with it anymore. You'd just run the libraries/build-source-libs.sh script and be done with it.

No, I added a source/tools/i18n/get-nightly-translations.sh script to export the latest translations into your git clone (a rough sketch of the idea follows at the end of this post). It was a valid concern though; bb was the first to bring it to my attention.

That is true, but the majority of devs will not have to interact with those repositories. The upgrade of libraries is fundamentally non-collaborative: Stan and I are the only ones who may upgrade Windows libs these days; you can add wraitii and s0600204 to the list for SpiderMonkey upgrades, and that's all. Being one of the involved devs, I agree that the new setup is not a huge improvement over what we did on SVN, but it feels much more organized and streamlined to me.

I have never used git submodules, so basically: I wouldn't dare try to set that up. I have nothing against it in principle, but I had not thought of submodules before yesterday. If I'm not mistaken, this is a change that could be performed in the future without rewriting history, right? It could be an incremental improvement over the future setup, performed by a knowledgeable contributor. Would that be acceptable to you?

Same as above, but I believe the directory structure of the repository would make that much harder; it would create problems similar to the engine/mods split, I think, but I'm not sure.

That, on the other hand, I am against. The history of PO/POT files in the git repo, when I started my experiment, was HUGE. It was several times larger than the current size (LFS blobs excluded) that I got down to. (I can't remember how much exactly, but off the top of my head, 600MiB.) And even that is not satisfactory, because you only get new translations twice a week. With the nightly-build system, testers obtain everything "as in a release" daily, which is a big improvement! Of course, committing translations daily to the git repo would be possible, but the size would skyrocket. Generally speaking, I'm also a bit averse to letting a bot commit to the source repo. It makes more sense to me to have a git repo managed entirely by humans, and then a repo generated by the CD system, which provides a ready-to-play state to testers. Let me link once again to the glorious diagram I'm really proud of, in which it looks like concerns are cleanly separated.

But the main thing that convinced me to strip the translations from the git repo is the SPIR-V shaders for Vulkan. Right now we are stuck with Vulkan support that depends on a mod which cannot be committed because of its size, the generated nature of its contents, and the huge amount of time/computing needed to generate it (and I believe that's part of the reason A27 has stalled, as including the mod would complicate the release process). Vulkan cannot be tested out of the box by users without some knowledge. I want to demonstrate next week (unfortunately I can't show it yet to support my claim) that this can be fixed by my nightly-build approach. The git repo does not have to contain those large and ever-changing shader files. Instead, the nightly build can provide those files daily, with the CD system taking care of generating them, and we'd have them ready for releases in advance. I only have to make sure that the generation can be efficient (we mustn't spend 5 hours generating them every night, but only when needed), and that will be my task for next week.

---

In a nutshell:
1. I don't think SVN is so bad that we should avoid using it entirely.
2. I am open to replacing my source-libs and windows-libs SVN repos with git submodules, but that can be done in the future, and I'd be happy if someone else did it.
3. I believe that a nightly build for testers and for generating release bundles, generated from a lightweight git repo for the developers, is a superior approach to a 1:1 SVN mirror of a (very?) large git repo.
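For concreteness, here is the general shape of a get-nightly-translations helper: pull the latest translations from the nightly-build SVN into a local git checkout. A hedged sketch only; the real script is a shell script, and the nightly URL and l10n paths below are assumptions for the example.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of get-nightly-translations: copy the latest .po files
from the nightly-build SVN into a local git clone of the game."""
import subprocess
from pathlib import Path

# Assumed layout, for illustration only.
NIGHTLY_URL = "https://svn.wildfiregames.com/nightly-build/trunk"
# Directories whose translations are delivered by the nightly build (assumed).
L10N_DIRS = [
    "binaries/data/l10n",
    "binaries/data/mods/public/l10n",
    "binaries/data/mods/mod/l10n",
]

def get_nightly_translations(repo_root: Path = Path(".")) -> None:
    for rel in L10N_DIRS:
        dest = repo_root / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        # `svn export --force` overwrites local files with the nightly state,
        # without leaving a .svn working copy behind.
        subprocess.run(["svn", "export", "--force",
                        f"{NIGHTLY_URL}/{rel}", str(dest)], check=True)

if __name__ == "__main__":
    get_nightly_translations()
```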
  12. Since that is for the lobby of the 0ad game, it would go to 0ad/lobby-infrastructure which would be consistent with GitHub.
  13. Having a git forge would be a good opportunity to start versioning some of our staff information, so I went ahead and made the change: that way, repositories associated with the development of the game go to 0ad, while we have the wfg organization for actual organizational matters, for instance the infrastructure documentation for the server, as asked for by Dunedan. SPI-related stuff should also go under wfg/. We should also really look into improving our websites, and thus we should have, when that project starts, a wfg/website repo for wildfiregames.com and a 0ad/website repo for play0ad.com. Does that sound logical?
  14. Yes, that's one of the things I want to set up next week.
  15. While we are breaking paths: I'm not sure wfg/0ad is a good repo path on Gitea. It's inconsistent with GitHub and GitLab (where it's 0ad/0ad). I see that Blender made the same decision and called the organization "blender" on their own Gitea. So I'm pondering whether to rename the "wfg" organization to "0ad" (I'll probably keep "Wildfire Games" as the real name). Does that sound good? I don't like it much, but it may objectively be better.
  16. Thanks for the feedback! It's in the staff forums. I am ready to update those threads as soon as changes happen. It would indeed be nice to store them in a repo, for instance a private repo on Gitea after the migration.

I do like originality, but this is a very good point. I'll change trunk to main.

Indeed. I suppose they found a way to comply, though, or else the service wouldn't work anymore. I'll disable Gravatar then. I'm not going to perform extra work for that; users will have to fill in their profiles themselves on Gitea.

Yes, that's nice to have! However, I would like to avoid adding and maintaining too many devtools in the future. If Gitea could become our one and only centralized platform for development, that would be great.
  17. Thank you for your interest! I sent you a PM with login information and some pointers. Please go through the wiki and tickets, test and update stuff as you wish and let me know if you run into an issue.
  18. Hello folks! I have been hard at work on this, and things are starting to look usable. Here is a (probably incomplete) changelog:

- The CI works! It is still a bit raw and I deactivated it to save space on Jenkins, but I am now working on improving it. It is the last big chunk of work I have to perform, and then we will be ready to migrate.
- The nightly build now exists! Get it via SVN at https://svn.itms.ovh/nightly-build/trunk. It is generated from a month ago, but I'll update it through Jenkins in the upcoming days.
- All the documentation was updated and improved during the JDLL event in May, where I presented the project and received some feedback. I also improved the Privacy Policy based on feedback. Please head over to the wiki, especially BuildAndDeploymentEnvironment.
- I need help! The glorious FAQ has been defaced by the conversion to Gitea, which uses GitHub-flavored Markdown. Is anyone interested in manually restoring the appearance of the FAQ, adapting it to its new home on Gitea? Please let me know if you wish to help. Basically it would be necessary to 1) clean up the structure of the page, fixing the tables, the raw HTML and the hard links, and 2) improve the appearance just as was done on Trac, but using Markdown features instead of Trac features.
- The git repository now has a script for getting translations straight from the nightly build (the same will be done for Vulkan shaders).
- Links to changesets in the format [25001] on Trac are correctly converted to 04ec75ed7e on Gitea (and I fixed the commit text in that specific revision); a sketch of the conversion idea follows after this post.
- Important Trac keywords (regression, pathfinding, design, ...) now have a Gitea label, but I can't easily convert keywords to labels automatically, so I will add the labels manually after the migration.

Thanks in advance for your feedback!
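As an illustration of the changeset-link conversion mentioned in the list above: rewrite Trac-style references like [25001] into abbreviated git hashes using a revision-to-hash mapping. This is a hedged sketch only; in the real migration the mapping comes from the SVN-to-git conversion, whereas here it is a tiny made-up lookup table (the 25001 pair is the example given in the post).

```python
#!/usr/bin/env python3
"""Hypothetical sketch: rewrite Trac changeset references like [25001] into
abbreviated git hashes, given a revision -> hash mapping."""
import re

# In the real migration this mapping would come from the SVN-to-git
# conversion; only the 25001 entry is taken from the post above.
REV_TO_HASH = {
    25001: "04ec75ed7e",
}

CHANGESET_RE = re.compile(r"\[(\d{1,6})\]")

def convert_changeset_links(text: str) -> str:
    def repl(match: re.Match) -> str:
        rev = int(match.group(1))
        # Leave the reference untouched if we don't know the revision.
        return REV_TO_HASH.get(rev, match.group(0))
    return CHANGESET_RE.sub(repl, text)

if __name__ == "__main__":
    print(convert_changeset_links("Fixed in [25001], see also [99999]."))
    # -> "Fixed in 04ec75ed7e, see also [99999]."
```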
  19. Not implemented yet, but close. Sent. I'm almost ready to start the CI (hopefully) and I'll use the checkrefs script as my main test avenue. It will allow me to check that Git LFS interacts correctly with the CI. Once I have set things up, I will be very interested in having those changes in a pull request on Gitea. This is automatically covered by the Gitea Jenkins plugin: Jenkins uses the Jenkinsfile from the upstream branch when the pull request author is not a recognized contributor. If it works as advertised, this will remove the need to have access to the Jenkins instance.
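On checking that Git LFS behaves in CI: one cheap sanity check (a hypothetical sketch, not the checkrefs script itself, and the directory path is an assumption) is to verify that no tracked file is still an un-smudged LFS pointer, i.e. the real content was actually downloaded.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: fail if any file in the checkout is still a Git LFS
pointer (i.e. the actual content was not downloaded by the CI job)."""
import sys
from pathlib import Path

# Every Git LFS pointer file starts with this line.
LFS_POINTER_PREFIX = b"version https://git-lfs.github.com/spec/v1"

def find_unsmudged_pointers(root: Path):
    for path in root.rglob("*"):
        if not path.is_file() or ".git" in path.parts:
            continue
        with path.open("rb") as fh:
            if fh.read(len(LFS_POINTER_PREFIX)) == LFS_POINTER_PREFIX:
                yield path

if __name__ == "__main__":
    # Assumed checkout layout; adjust to whatever directory CI checks out.
    offenders = list(find_unsmudged_pointers(Path("binaries/data")))
    for path in offenders:
        print(f"Still an LFS pointer: {path}")
    sys.exit(1 if offenders else 0)
```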
  20. Hello @plusmid! I know about Forgejo. It's a great project and you're right to promote it. Basically we want to use and support open-source software, but we'd also like to use widely-used software, even if it's commercial, in order to save energy maintaining our tools. A lot of members of the community have been advocating for using GitHub, which everyone uses and which is "too big to fail". Obviously we're reluctant to use GitHub/Microsoft closed software, but maybe we'll end up there anyway if it proves too difficult to self-host a forge. GitLab is too heavy for us to self-host; GitLab.com is another possibility for the future. I think Gitea is a very good middle ground: open-source, self-hostable and GitHub-like. It has officially maintained support from the Jenkins developers (we use that for CI/CD). We're very happy to know that there is a non-profit alternative in case the Gitea company starts doing shady stuff, but for now, the only thing we want from upstream is stability and ease of use. Forgejo is simply not big enough yet, to put it bluntly.
  21. I regenerated the repo and associated data (that broke some links; I can't do much about that except re-wiping the databases and users, which I'll try to avoid doing too frequently). I'm going to stop changing things and fixing bugs in the PoC for now, thanks for all the reports and testing. I am now going to focus on the Unix build environment and setting up CI/CD. Changelog:

git repository:
- used short hashes in messages (I didn't have to choose the length, git automatically chose 10 chars)
- wrapped the content of commit messages to 72 chars, ignoring the first line
- grouped metadata lines in commit messages as much as possible (my regexp detects "[...] by:" and "Differential Revision:" but not, for instance, "Trac tickets:"; I can't cover and test all cases). A rough sketch of this rewriting step follows after this post.
- handled phab: links and [Pp]hab: links to non-commits
- added a Bugs herder team

Imported issues:
- updated label display (typography/color), removed activity labels
- disabled time tracking
- do not publish a Patch change if the new value is empty

wiki documentation:
- added notices to install Git LFS
- adapted contents to the label improvements

Cosmetics:
- added a link to docs.wildfiregames.com in the top menu

Future branch:
- merged Stan's update to .gitignore

In Gitea 1.22 (currently RC, will upgrade when it's released):
- command-line instructions to merge will be adapted to our merge strategies
- the username will be correctly used instead of the real name in commit feeds (including the RSS feed consumed by the IRC bot)

Didn't do:
- rename TracUser: I really think it's important to be explicit about the fact that the content comes from Trac
- keep the Trac registration date of users... because Trac doesn't record that at all (only the last activity date)
- I cannot reproduce @hyperion's problem with "issues created by me"
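To make the commit-message rewriting items above more concrete (the "wrapped to 72 chars" and "grouped metadata lines" entries): a hedged sketch of that cleanup step. The metadata regexp here is deliberately simplified compared to what the actual conversion used, and the example message is made up.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of the commit-message cleanup: keep the subject line
as-is, wrap the body to 72 columns, and group trailing metadata lines."""
import re
import textwrap

# Simplified patterns; the real conversion covered more cases.
METADATA_RE = re.compile(r"^([\w /-]+ by|Differential Revision):", re.IGNORECASE)

def rewrite_message(message: str) -> str:
    lines = message.splitlines()
    subject, body = lines[0], lines[1:]

    metadata = [l for l in body if METADATA_RE.match(l.strip())]
    prose = [l for l in body if l.strip() and not METADATA_RE.match(l.strip())]

    # Wrap the free-text body to 72 columns (the subject line is untouched).
    wrapped = textwrap.fill(" ".join(prose), width=72) if prose else ""

    parts = [subject]
    if wrapped:
        parts += ["", wrapped]
    if metadata:
        parts += ["", *metadata]  # metadata grouped into one trailing block
    return "\n".join(parts)

if __name__ == "__main__":
    print(rewrite_message(
        "Fix pathfinder crash\n"
        "Long explanation of the fix that used to sit on one very long line in SVN.\n"
        "Reviewed by: someone\n"
        "Differential Revision: https://code.wildfiregames.com/D0000"
    ))
```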
  22. Why not! But that would have to be tied to the migration. They can be moved now or later; in both cases I'll preserve them.

No need at all! Images can be retrieved from a remote URL, for instance on play0ad.com, or from the nightly-build SVN.

It's okay, it's still kinda maintained, and as long as it doesn't break, we don't need to fix it. But there are indeed more popular choices of C++ test frameworks nowadays. That's also unrelated to the migration.

My changes to the build environment will probably supersede those interesting diffs you linked.
  23. Oh, I misunderstood your previous message, sorry! I thought you were suggesting things for me to do, but you were questioning/proposing improvements on what I did. Please go ahead and make a PR.

The .bat files are used in the Windows installer, I think; no need to make .sh equivalents when we have a .desktop file.

Converting to Markdown is a must, but it would indeed be great to deduplicate things that go to the wiki. Instead, it would be awesome to have a readme.md which is actually appealing, with screenshots and a user-oriented presentation of the project. I have not worked in that direction, but I'd love to see it done.

I have already planned to remove the SVN-related binaries, but I have decided not to remove premake and cxxtestgen. Those are very lightweight binaries, and keeping them really simplifies the Windows build process in the design I am proposing.
  24. @Stan` Everything you listed in your latest message is either already covered or still WIP but planned in https://gitea.itms.ovh/Itms/0ad, in what I called the "future" branch. Please test and propose alternative coloring if you have a strong opinion on the matter.

Those were not at all the aims I had in mind; I see how my wording was confusing. Please avoid deeming others' work "pointless" and being rude for no reason.

I see that there is indeed overlap with other Gitea features. I wanted to make sure PRs without any reviewers would stand out. Instead of adding a "reviewer-needed" label, we could make a rule that reviewers assign themselves to PRs (which, I hadn't realized, is different from assigning someone to an issue). That way, orphan PRs could be found with the "No assignees" filter. "under-review" is then already covered by the presence of assignees, and "work-in-progress" is covered by the "changes requested" display.

This is a very good point; it would be great to be able to mark an issue as waiting on info without closing it. The "Due Date" feature would be useful here to keep track of when to close it. (I still haven't found a use for the Time Tracker, which is a different feature.)

Well, I could call that "closed". Order doesn't matter here, since closed tickets should just have a "closed/resolved" and a "theme" label.

I disagree; there is an actual difference between "nice to have" (this would provide an actual benefit to the users or the devs, but low priority) and "if time permits" (this would be nice, but it either wouldn't change anything from the user's point of view, or the cost of doing it would outweigh the small benefits). Then again, I just don't want to lose any information from Trac. Merging the labels can be done in the future if the team wishes.
  25. Will take a look. Another thing worth mentioning: I plan to delete all Trac accounts without content as part of the migration (using the Trac spam filter). I did not do that in my PoC because I haven't set up the spam filter (it's useless in a read-only instance). As a consequence, I will only create Gitea users for active Trac accounts in the actual migration.