Itms

WFG Programming Team
  • Posts

    2,388
  • Joined

  • Last visited

  • Days Won

    108

Itms last won the day on July 19

Itms had the most liked content!

About Itms

Previous Fields

  • First Name
    Nicolas
  • Last Name
    Auvray

Profile Information

  • Gender
    Male
  • Location
    France

Recent Profile Visitors

7,947 profile views

Itms's Achievements

Primus Pilus

Primus Pilus (7/14)

2.6k

Reputation

9

Community Answers

  1. Since we won't be able to actually LFS-mirror to GitHub and GitLab, it sounds like stopping mirroring altogether is the best solution. I would then just archive the historical repos without renaming them. Codeberg can allocate storage upon request, so I would still like to contact them after the migration to ask if we could have a mirror there, in order to show mutual support. Having a mirror on that Forgejo instance would also be a good way to keep in touch with the Forgejo community in case we ever want to replace Gitea or stop self-hosting.
  2. Yes, I'm currently testing a mirror-push to GitHub, and seeing how long it takes, it's definitely uploading the LFS contents. Oh... GitHub and GitLab both have a low LFS usage limit on the free tier (1 GiB and 5 GiB respectively). That is one more proof that we needed self-hosting, but it does create issues with mirroring, which is bad.
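For reference, the mirror-push being tested can be sketched as below. The repos here are throwaway local stand-ins so the sketch is self-contained and runs without git-lfs; for a real Gitea → GitHub mirror with LFS content you would additionally run `git lfs fetch --all` before the push and `git lfs push --all <remote>` after it (which is exactly where GitHub's 1 GiB free-tier quota bites).

```shell
#!/bin/sh
# Sketch of a full mirror-push between two repos. The LFS steps are left
# out (see the lead-in) so this runs without git-lfs installed.
set -e
origin=$(mktemp -d); target=$(mktemp -d); work=$(mktemp -d)
git init -q -b main "$origin"
( cd "$origin" && echo content > file \
  && git add file \
  && git -c user.email=ci@example.org -c user.name=ci commit -qm "initial" )
git init -q --bare -b main "$target"     # stand-in for the GitHub remote
# The actual mirror: a bare --mirror clone, then a --mirror push of all refs.
git clone -q --mirror "file://$origin" "$work/mirror.git"
git -C "$work/mirror.git" push -q --mirror "file://$target"
```

Note that `push --mirror` also deletes refs on the target that no longer exist on the source, which is what makes it suitable for a true mirror (and dangerous to point at anything else).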
  3. Ha, with the translations thing, last week became this week. No, I think it's better to rename the current repos to something like old_git-svn_mirror in order not to break forks, archive them, and put up a big bold ugly notice about the migration and the new repo. People will be able to migrate their repos to Gitea whenever they wish; I'm not doing it as part of the migration. By the way, I think the mirroring to GitHub/GitLab should wait for a couple of days after the migration. Yes, I'm currently testing a mirror-push to GitHub, and seeing how long it takes, it's definitely uploading the LFS contents. Great idea! I just reserved the 0ad name; I can add you as owner if you PM me your Codeberg username.
  4. Yes, I also liked Blue Ocean for displaying stage output, but unfortunately that project has ended. Blue Ocean will not receive updates and thus will never be able to display a lot of the modern Pipeline features I have started to put into place. The officially recommended replacement I used, Pipeline Graph View, is still pretty new and rough around the edges, but I'm sure it will improve in features and usability as time goes by.
  5. No, not at all. I have just been too lazy to set up anonymous access to all the jobs I created, deleted, and re-created... I will open them up in the upcoming days, and I will also be able to considerably simplify access to Jenkins after the actual migration.
  6. Fantastic, thanks for your interest. Indeed we need to get the SM upgrade done, and I have the experience to make it happen, but I will have to focus on the git migration first, so I'll probably be entirely occupied by that for the month of August. So I can help you, but I cannot entirely take it off your hands for now; it is a good opportunity to try your hand at it if you are interested. The plan is to upgrade to a more recent ESR, yes. Sometimes it's better to avoid skipping an ESR (especially if the newest one is too new and has undetected issues with embedding), but sometimes there is no need to keep a previous ESR... I'll have to see when I look into this specific one. It sounds great to have your help and input on the use of submodules! But, considering the work needed, the imminence of the migration, and the unresolved discussion about using them or not, I would like to perform the migration with the source-libs SVN repo, and maybe try and decide on submodules in the near future, after the migration. You can certainly start working on that, but you would be designing an incremental improvement upon my current work, so expect having to rebase a lot. Yes: even with submodules we'd need a wrapper script somewhere. I believe my get/build-libraries scripts are here to stay, and they could abstract the use of svn export or git submodules without breaking the user workflow if we switch to submodules. Thanks for your input! I saw Gitea support being launched in the latest release and I didn't know if we could be interested in that. I will look into SHA-256 support; if we're switching to git this late, we may as well use modern features and be future-proof. Thank you so much for the work! I will try and get other developers to give their opinion, but now at least there is no technical blocker against either solution for translations.
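The wrapper idea above can be illustrated with a tiny sketch. `LIB_SOURCE` and `fetch_libs` are invented names, not the real get/build-libraries interface, and the commands are only echoed rather than run:

```shell
#!/bin/sh
# Hypothetical sketch of how a get-libraries-style wrapper could hide the
# fetch mechanism behind one entry point, so switching from svn export to
# git submodules would not change the user workflow. All names invented.
LIB_SOURCE=${LIB_SOURCE:-svn}
fetch_libs() {
  case "$1" in
    svn)       echo "svn export <source-libs-url> libraries/source" ;;
    submodule) echo "git submodule update --init --depth 1 libraries/source" ;;
    *)         echo "unknown source: $1" >&2; return 1 ;;
  esac
}
fetch_libs "$LIB_SOURCE"
```

As for SHA-256: Git has supported `git init --object-format=sha256` since version 2.29, though it is still marked experimental and interoperability with SHA-1 repos and some hosting platforms remains limited, which is worth checking before a late switch.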
  7. I forgot that one. And I am almost certain there is a technical term for what I'm trying to describe, but I don't know it: very frustrating. What I mean is that translation commits are not a single unit of work. When we commit graphical assets, we don't commit a bunch of new graphics, work-in-progress models, and a few shader patches all together. We try to keep commits atomic: a single commit is the work of one artist, or a small group of artists, working on a specific asset or collection of assets. A translation update, on the other hand, is just a large dump of the current state of our translators' work across the entire project. It's not as if translators coordinated to publish a consistent update for French, or a multilingual update specifically covering the tutorial. If it were like this, those commits would be crafted by humans, and I'd keep them in the history; on the contrary, I generally oppose generated commits (even if the generation is based on human work). A rule of thumb could be: can one write a meaningful commit message? If you can't be more specific than "Updated PO and POT files", the commit is not very meaningful. You wouldn't imagine pushing an "Updated PNG files" commit. I'd like to explain myself further about the date-triggered commits. What makes the translation commits especially meaningless is that they would be different if the Jenkins job ran ten minutes later. They depend on the state of Transifex at the moment the script is run. I believe there is a technical term for that too, but I don't know it either, and I think it's bad practice to have such contingent commits in large quantity in the history (as I said, those translation commits would take up 80% of the size of the git repo if I left them in!). Obviously I'm proposing just such a date-dependent system when I propose nightly builds, but that's a typical Continuous Delivery process where users receive a new version of the game for testing.
Since we have no choice but to pull Transifex updates regularly, it makes sense to me to include them in that regular delivery. But I don't think automatic continuous delivery is supposed to happen the other way around, into the source repository.
  8. I agree! But I am still a bit concerned about timing, as my current full-time availability is going to expire. It's definitely about trade-offs and there is no ideal solution... maybe we could find one over time, but I feel like I don't have that time. I would love to find some middle ground on the translations; I thought the separate repo could be the solution, but it doesn't look like it is. Sorry, not clear again! I mean the code under i18n_helper, extractors, and also tests. It looks like this code is shared between the pot generation script and the other scripts, so I kept it duplicated; it would require more work to cleanly separate all that, and it's probably not worth it. Hm, me too, to be fair. But I'll post on the staff forums to flag my schedule and deadlines, and hope to receive more input. Thank you so much for the kind words, and I understand. I had somewhat forgotten about the hurdles of written communication in such a project, and I really appreciate your acknowledgement of my work.
  9. @Dunedan Ah dammit, our messages crossed. Written communication is so frustrating. I'm talking about the translations of non-0ad projects in that sentence. Most of the complexity I encountered when setting up the separate repo would also exist with translations in the main git repo (many Jenkins jobs, much hacking around git+Jenkins peculiarities, checkDiff.py not compatible with git, ...), so I'm not sure that is correct. I separated them without issue; the only problem is with our bundled modules, but I don't know enough about those Python scripts. Feel free to take a look. Yeah, it is a big downside... And it adds to the complexity of the release process, as deciding which translations get in would happen much sooner in the cycle. No, you can do that all the time, since you can patch the build scripts to download stuff from elsewhere. We do that all the time when working on SM upgrades. SpiderMonkey contains the tarball, and all source libs contain Windows prebuilt binaries. That's a good point, but it doesn't work for web views. If it's only personal preference, well, I'd go with my own personal preferences, since I'm the one working on this. But I am indeed trying to come up with the best experience for contributors, so I'm listening to your feedback. At this point this translations issue is getting out of hand for me: I've been dedicating far too much time and energy to it recently, while I dreamt of actually planning the migration. And since I've been pondering how to handle translations for a year and a half now, it's difficult for me to consider all the options fairly: I'm inclined to keep the system I designed. So I suppose I'll just give up on that issue and accept keeping the translations in the repo. But I have a condition: that you update the checkDiff.py script to work with git! I'm not going to spend extra days on this thing. Whoops, I thought you had access; just fixed that. Well...
I was very proud to have reduced the size of that repository through careful design, and now I'm just throwing that part in the trash. To me it really feels like debasing my work, but I'll have to hope I'm wrong. For translations I trust your judgment, but libraries are an entirely different problem, and keeping them in the repo is not possible. The idea of submodules is (probably) good, but it can be done in the future if it actually bothers contributors. Having been maintaining those libs for seven years now, I'm much more confident about that part.
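For the record, the core idea a git-based checkDiff.py would need can be sketched as below. This is not the real script: it treats `#` comment lines (such as `#: source reference` churn) as the only noise, while real PO headers live inside the header msgstr, so an actual port would need smarter parsing. The repo built here is a throwaway stand-in.

```shell
#!/bin/sh
# Hypothetical sketch of a git port of checkDiff.py's idea: revert .po
# files whose only changes are comment lines, keep real translation work.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main .
printf '# old ref\nmsgid "hi"\nmsgstr "salut"\n' > a.po
printf '# ref\nmsgid "bye"\nmsgstr ""\n' > b.po
git add . && git -c user.email=ci@example.org -c user.name=ci commit -qm base
printf '# new ref\nmsgid "hi"\nmsgstr "salut"\n' > a.po   # comment-only churn
printf '# ref\nmsgid "bye"\nmsgstr "au revoir"\n' > b.po  # real translation
for f in $(git diff --name-only -- '*.po'); do
  if git diff -U0 -- "$f" | grep -E '^[+-][^+-]' | grep -vqE '^[+-]#'; then
    :                        # a non-comment line changed: keep the file
  else
    git checkout -q -- "$f"  # comment-only churn: revert it
  fi
done
git diff --name-only         # prints: b.po
```

The `grep -E '^[+-][^+-]'` keeps only added/removed content lines (skipping the `+++`/`---` diff headers); the inverted quiet grep then asks whether any of them is not a comment.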
  10. Thanks for giving a hand! In other news, I have added SPIR-V generation (still running) to the nightly build, and, through a sprint, I also set up a separate translations repository, to see how it would work in practice. I'd have to document it on the dev environment page if we go with that, but basically:
  • upon each commit to the main repo, pot files are generated and pushed to the translations repo (for the future: ideally, after that pot update, we should push templates to Transifex instead of letting them autoupdate)
  • daily, the translation repo pulls updates from Transifex
  • in the future, we should run the check script and also lint po files, and follow translation issues in that repo; we should be able to eliminate issues such as #4250
This system still has some rough edges, but here are my comments after developing it:
  • the concerns are nicely separated now: all interaction with Transifex happens in the translations repo. That's a really good thing, and this separate repo will be the perfect place to collaborate with translators and fix long-standing issues such as #4250
  • the base git repo only sends generated pot files, using messages.json and updateTemplates.py. I deleted all the other scripts and all the .tx/config files
  • the translation repo holds the majority of the i18n Python scripts, which no longer need to know about the structure of the data/ dir; that should make their maintenance simpler
  • the checkDiff.py script does not work with git anymore; it was developed specifically for SVN! So the translations git repo gets all po/pot updates even if they are irrelevant. That is not a big deal for that auxiliary repo, but it is another dealbreaker against the idea of having translations in the main git repo
  • this new repo adds some complexity to the migration, as those scripts need to receive changes for that new infrastructure
  • the nightly repo (and the git repo) can fetch the translations without using svn export... but git does not have that "export" feature, so the new fetch script relies on creating a temporary shallow clone, which is a bit yucky
  • the Jenkins setup is much more complex now, with two extra jobs (maybe three if we add a push job to Transifex) on top of the nightly-build job. It is impractical to commit and push to git using Jenkins; this is definitely not a standard use case, and my pipelines may break in the future if Jenkins features evolve (this reinforces my prejudice against letting bots push to git)
  • the Jenkins setup around the translation updates is not covered by Gitea features; it will be impractical to set up post-commit triggers for this
In a nutshell, this alternative way of handling translations, on the pros side: separates issues, addresses some of your concerns, and gives us much more flexibility in our interactions with Transifex, with room for improvement; on the cons side: is more complex to maintain for Jenkins, and would require extra development work for you and other Python devs around the migration.
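The shallow-clone trick mentioned above can be sketched as follows. The repo contents and paths are made up; the point is that a depth-1 clone followed by deleting `.git` approximates what `svn export` gives for free:

```shell
#!/bin/sh
# Sketch: emulate "svn export" with git via a temporary shallow clone.
# A stand-in source repo is built locally so the sketch is self-contained.
set -e
src=$(mktemp -d); dst=$(mktemp -d); tmp=$(mktemp -d)
git init -q -b main "$src"
printf 'msgid ""\nmsgstr ""\n' > "$src/public.pot"
git -C "$src" add public.pot
git -C "$src" -c user.email=ci@example.org -c user.name=ci commit -qm "pot update"
# The actual "export": a one-commit clone, then drop the git metadata.
git clone -q --depth 1 "file://$src" "$tmp/clone"
rm -rf "$tmp/clone/.git"
cp -R "$tmp/clone/." "$dst"
```

A depth-1 clone still transfers one full commit's worth of objects plus ref negotiation, which is indeed heavier than a plain export; `git archive --remote` is the closer equivalent, but it requires server-side support.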
  11. Proof submitted by PM, I merged the user's temporary account into their main account, with restored access.
  12. Hello, what is your main account? I can see and check whether your IPs match. Otherwise you'll have to find another way to prove that you are behind the other account. If we have proof, we can merge your accounts without problem, but otherwise we obviously won't do it.
  13. @Dunedan Thinking about it, if you want to give a hand, you could try and take a look at creating that translations repo. You should have access to create repositories on Gitea under the 0ad/ organization. Right now I'm focused on shaders and other Jenkins improvements, I'm not going to look at translations until next week.
  14. If the exported translations are not committed, that feedback does not actually happen... Do you think it could create actual issues? Yes, the first one isn't possible. The second one I hadn't considered, but maybe it would be nice. I am thinking this "translations" repository could even allow us to store the translations of official/semi-official mods. Those would not be pulled into the git repo and the nightly build of 0ad, but they could be synced with Transifex, allowing our translator community to translate them. The modders would then pull translations into their mods when they release them, and push pot files to the repository whenever they wish. For the third one, I answer further below. Well, it wouldn't be possible to review changes to the libraries themselves, but changes to the 0 A.D. code would still be reviewed and even tested by CI. This is indeed problematic, but it is the same situation as today. Take for instance D5002: the contributor cannot publish the SM upgrade itself inside that diff; the diff only consists of the changes in the game. Right now that situation is too complicated for our CI pipelines to handle (arc patch doesn't work well with binaries), so the builds for those diffs always fail and make a lot of noise. With git, it would be possible for a contributor to send big binaries in their PR for initial review of the whole. Then we could commit the update to source-libs and have a final round of CI on the final PR, which would just update the SVN revision in the libraries/ scripts, before committing. So that addresses your concern. windows-libs changes are currently unreviewable, as they are just me or Stan sending new binaries over SVN. At least with the new system we could send the new binaries, then create a PR that just updates the SVN revision in the libraries/ scripts, so that CI can run on it. So it's still a bit unreviewable, but it will actually go through CI before commit. Ah, that's right!
Well, it is arbitrary, or at least based on feelings (but I think it's important to consider those). One of my issues is: those commits would lack justification. The history of the repo would contain a lot of commits that do not address an issue; those would just be commits happening because "translations are updated every Monday and Friday". Moreover, those commits have the big downside of highlighting low development activity. It feels very bad to see autobuild committing more than the team in times of reduced activity. This is awful for morale. My proposal allows us to have daily updated translations, which is better for testing, but if we keep translations in the git repo, having daily updates would dwarf actual human contributions. Hence, if my current system really irks you, your second proposal above is the only one I would find acceptable. Basically I think that the issue with SPIR-V and with translations is the same: we have files that are needed for the end user (including nightly testers) but which cannot be committed to the git repo (for translations, replace "cannot be" with "I would rather they weren't"). I figured that if I generated shaders in the nightly build and provided a way to export them into the git repo if developers needed them, I could just have the translations follow suit, and fix two problems with one consistent solution. You can contribute PRs to the private dev-migration repo on Gitea, but I agree it's a difficult project to collaborate on. Your thoughtful feedback is much appreciated though. By "large" I was thinking about translations (and "very" with shaders on top), but you're right that going from 100 MiB to 700 MiB would probably not bother users too much (I, on the other hand, would be sad to miss the opportunity to reduce the size by such a factor!), and indeed I hadn't realized that increasing the frequency of translation commits would not increase the size that much.
Thanks again for your feedback. I'll try and think about the idea of a cleaner, separate repo for the translations.
  15. By the way, since I also read about that on IRC: It is very likely that my urgent work after the migration will be a (partial or full) SpiderMonkey upgrade. Having handled several of those now, I am thrilled (that's not an exaggeration) about performing that future upgrade under the new git setup I designed, mostly because CI will finally be able to handle it. I think I'm a good judge of whether the setup is adapted to performing library upgrades. If I stumble on rough edges, that will be the opportunity to address them, and if I don't, it will be a good example in the repo history of how to perform such an upgrade in the new setup. To reiterate, maybe it would be even better with submodules, but I don't know enough about them to design a submodule-based setup myself. I'm not against using them in the future.