Version Control System


Recommended Posts

As far as I can see, Git's handling of binary assets just does not work well. Linus designed Git for text-based code on the kernel project, so its binary capabilities are very limited. And GitHub may start wanting cash as the repos grow: binary data are just blobs that Git has no handle on.

Enjoy the Choice :)

  • Like 3

10 hours ago, Samulis said:

so I would encourage that we consider an option that has some kind of at least somewhat functional GUI-based option

I think that for either SVN or Git there are suitably easy-to-use GUIs, so IMO this should be taken care of either way.

10 hours ago, Samulis said:

On the other hand, if something is so easy that even a total beginner can contribute to it without any kind of technical hurdle, it may lead to a risk of lower quality code or assets being contributed.

I agree, but that could be avoided through better documentation about which patches / art / sound assets are actually wanted and what quality is expected. Having a lower barrier of entry doesn't mean you have to accept anything, and it is totally fine to decline patches / pull requests when they don't meet the quality standard. The important thing, so that people don't get mad about this, is having a proper reason why something is declined and having a place / code of conduct / design document where all the quality standards are described.

  • Like 3

2 hours ago, maroder said:

Having a lower barrier of entry doesn't mean you have to accept anything and it is totally fine to decline patches / pull requests when they don't meet the quality standard

It is in fact totally up to the reviewers/upstream to merge, i.e. nothing gets into the upstream repo from a fork without being explicitly merged.

 

Besides the technical decision of which VCS to use, if the decision goes to Git, it is more or less a question of which self-hosted server suits best.

GitLab is a little monster: built-in CI, lots of features. The same goes for GitHub's server offering, but there are also lighter ones providing just the basics; it really depends on what is required.

The main point of Git, besides its considerable technical merits, is that vast pool of devs and easy collaboration.

It's no problem to include Git repos hosted on another platform.

  • Like 1

17 hours ago, bb_ said:

How does that work with versioning of the binaries? And with clones?

https://dzone.com/articles/git-lfs-why-and-how-to-use Basically it would solve the size issue on the server :) 

17 hours ago, bb_ said:

First and foremost, I find that all code, art and anything else must be in one and the same repo. Obviously there must be a clear directory structure. But for some patches (the secondary attack patch #252 is just one example) it is required to change both code and art at the same time. Having to propose different patches for different repos will make life a lot harder.

While I generally agree that it would be more complicated, I do believe that things with different lifetimes, e.g. the CI pipelines or the lobby, should be in another repository.

Actually, feel free to prove me wrong, but I don't think anyone in the team clones the audio, art, and art_source folders, which are in the same repository as the main SVN. Nor do we have the design document.

In the future, with an empires_ascendant mod in a separate repo, we could do hotfixes through mod.io.

17 hours ago, bb_ said:

We have been around a long time, the world around us will change  and we don't know what will happen. Therefore I will strongly plea for keeping sovereignty. Make sure everything is available in a self-hosted variant. Using third-party services is perfectly fine with me, however we should depend minimally on them. Whether it is to avoid a total collapse of the service, changes in the service itself, or changes in the terms, it shouldn't matter us. We should be able to continue, and at any point be able to decide to stop using any service.

No argument on that end, which is why I'm currently pushing for a self-hosted GitLab (and hence the Git migration).

 

17 hours ago, bb_ said:

 

Testing should be made easy for dev-version testers: we need them, and they might not be as experienced as devs when it comes to handling code bases. Therefore precompiled binaries (autobuilds) should be directly available in the repo. This might also help devs on some platforms.

A few people on Linux complained about the presence of Windows binaries in the repo, because that makes the download heavier for no reason. The Git migration ticket planned to have a download script for the Windows platform, just like we can download the SpiderMonkey tarball. I think this could be acceptable if it works straight out of the box, e.g. double-clicking a script would pull all the binaries and update the relevant ones using MD5 or SHA-1 or something.
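A minimal sketch of what such an update script could look like, assuming a hypothetical manifest of `sha1  path` lines and a made-up download URL (the actual ticket may specify something entirely different):

```shell
#!/bin/sh
# Hedged sketch of the updater idea described above: keep a manifest of
# "sha1  path" lines and only (re)download files whose checksum differs.
# BASE_URL and the manifest layout are hypothetical, not 0 A.D.'s actual setup.
set -eu
BASE_URL="https://example.org/windows-binaries"   # hypothetical

# Demo setup: one tracked file plus a manifest describing it.
workdir=$(mktemp -d)
cd "$workdir"
printf 'demo binary contents\n' > demo.dll
sha1sum demo.dll > manifest.txt

while read -r sum path; do
    if [ -f "$path" ] && [ "$(sha1sum "$path" | cut -d' ' -f1)" = "$sum" ]; then
        echo "up to date: $path"   # prints: up to date: demo.dll
    else
        echo "fetching:   $path"
        # curl -fsSL -o "$path" "$BASE_URL/$path"   # real download step
    fi
done < manifest.txt
```

A real version would uncomment the `curl` line; the point is that unchanged binaries are skipped, so repeated runs only download what actually changed.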

  • Like 3

17 hours ago, maroder said:

What does the choice of hosting provider mean for the barrier of entry and the possible pool of contributors?

This would be the biggest argument for GitHub. It has an extremely large and active community of developers.

In my experience, GitHub's ability to bring valuable contributions is overrated.
I even took down the GitHub mirrors of the software I maintain a couple months ago, after years with no contributions coming this way (while we still get contributions on our dedicated GitLab instance).

  • Like 3

The scope of the discussion isn't particularly clear; going by the title and the poll it seems to be a debate of SVN vs. Git. I'm a proponent of Git.

For the most part either works, and whoever prefers to work with the other one can do so independently of what the master repo uses. So what would one gain from migrating to Git? There is quite a list, but I'll limit myself to what I think is most important for 0 A.D., mostly based on commit history and what comes to mind right now.

  1. Git, unlike SVN, respects authorship. Pyrogenesis/0 A.D. is a volunteer project, and as such attribution is of paramount importance for many if not most.
  2. Git workflows encourage committing often. As such, commits are far more likely to be properly split, i.e. do one thing only and do it properly.
  3. Git workflows don't distinguish between the diff and the commit. Often the subject and message of a commit are even more important than the code change itself; using a Git workflow, both are made part of the review process.
  4. With Git you push commits. Bad commits, which happen to the best of us, can be fixed before publicizing them.
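Point 4 can be shown in a few commands; this throwaway-repo sketch amends a botched commit message before it is ever pushed:

```shell
#!/bin/sh
# Minimal demonstration: a bad commit can be rewritten freely as long as it
# hasn't been pushed anywhere. Runs entirely in a throwaway repo.
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com   # local identity just for the demo
git config user.name  "Demo"

echo 'hello' > notes.txt
git add notes.txt
git commit -qm 'tpyo in subject'        # oops

# Fix the last commit's message (or content) before anyone sees it:
git commit -q --amend -m 'Add notes.txt'

git log --format=%s -1                  # prints: Add notes.txt
```

`git rebase -i` extends the same idea to reordering, squashing, or splitting a whole series of unpushed commits.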

 

Some points made in this thread I think are worth addressing with a few words.

Git and binary handling: Well, the current binaries in the tree and their history are a non-issue when using Git. Stan mentioned LFS; there are other means of binary handling as well, each with its own pros and cons. Discussing them might make sense, but it can be skipped just as well.

Autobuild: It's _very poor_ practice to store build artifacts in the repo, irrespective of VCS. Also, last I checked there were more Linux than Windows users, therefore we need to replace them immediately with Linux binaries to maximize the use for our users. ;) A proper solution, if people want to use prebuilt artifacts, is to add a script which fetches them from CI. This would work for any commit and not just selected ones, plus for all supported platforms. Regular proper pre-releases for testing could be provided as well, so pure testers wouldn't have to bother with a VCS at all.

Split repo: If pyrogenesis/0ad is considered an engine plus a game, and not just a game, a split is almost mandatory. We could have a very lengthy discussion about this, but it is out of scope here. First, yes, a proper split is more complex than putting public into its own repo. Second, a large part of why SpiderMonkey as a dependency sucks for us/distros is that it lives in the same repo as Firefox.

  • Like 2

6 hours ago, Stan` said:
On 25/11/2021 at 8:10 PM, bb_ said:

How does that work with versioning of the binaries? And with clones?

https://dzone.com/articles/git-lfs-why-and-how-to-use Basically it would solve the size issue on the server :) 

Don't think you answered my question. How does git-lfs work with the versioning history of binaries? And does a clone automatically inherit the linkage?

 

30 minutes ago, hyperion said:

it seems a debate svn vs git

It is NOT. I tried to make clear in the first post what this thread is about, namely gathering ideas, wishes, etc. regarding VCS.

  • Like 1

57 minutes ago, hyperion said:

Git and binary handling: Well, the current binaries in tree and their history are a non issue when using git.

I'd think that too from the file sizes of the images.

A dedicated art/image repo might do, included with a submodule even.

It takes probably 3-4 submodules given the current source tree, maybe more, depending on preferences.
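For illustration, here is roughly what mounting a hypothetical dedicated art repo as a submodule would look like; the names `art.git` and `binaries/data/art` are invented for the demo, and local throwaway repos stand in for real remotes (recent Git needs `protocol.file.allow=always` for file-path submodules):

```shell
#!/bin/sh
# Sketch of the split-repo idea: a separate art repository pulled into the
# main tree as a submodule. All paths here are hypothetical demo values.
set -eu
base=$(mktemp -d)

# A stand-in for a dedicated art repository.
git init -q "$base/art.git"
git -C "$base/art.git" -c user.email=a@b -c user.name=demo \
    commit -q --allow-empty -m 'initial art'

# The main repo, with the art repo mounted at binaries/data/art.
git init -q "$base/main"
cd "$base/main"
git -c protocol.file.allow=always \
    submodule add "$base/art.git" binaries/data/art

cat .gitmodules   # records the submodule's path and URL
```

A plain clone would then need `git clone --recurse-submodules` (or a follow-up `git submodule update --init`) to also fetch the art, which is part of why submodules get called a PITA.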

  • Like 1

What are the current limitations and what are possible solutions for that?

The current limitation is that Phabricator is no longer actively maintained by its parent corporation. This is it; this is the only major cause for action that I can see. So we need to be careful to avoid creating new problems in our suggested approach to solving this major problem.

What is required from the version control system and CI?

  • Security
  • Reliability
  • Longevity
  • Companies involved are allies of FLOSS instead of enemies
  • Ease of use, particularly for newer contributors
  • Development data is not fragmented

Which features are required?

  • A solution for storing binary files

What is only nice to have?

  • GPG signing of commits or at least tags and pull requests
  • 2-factor authentication
  • End-to-end encrypted backup of data
  • Easy export of data

What are the pros and cons for them?

I will address two of the traits that are, in my opinion, key requirements: Security and Ease of Use.

 

Security
SolarWinds was hit by malware that targeted the build environment. It was described in some articles as "IT's Pearl Harbor". I encourage people to personally examine the security of the build environment, if only briefly. If you can think of vulnerabilities, then so can the bad guys.

One step to achieving security of the software supply chain is to cryptographically sign patches or tags and pull requests, as well as releases.
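As a concrete sketch, enabling signed commits in Git is a per-repo configuration. The key id below is a placeholder, and the actual signing/verification commands, which need a real GPG key, are shown only as comments:

```shell
#!/bin/sh
# Hedged sketch of commit/tag signing setup in Git. The key id is a
# placeholder; no real key is imported in this demo.
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q

git config user.signingkey 0123456789ABCDEF   # placeholder key id
git config commit.gpgsign true                # sign every commit by default

# With a real key imported, the workflow would be:
#   git commit -S -m "Change X"    # signed commit
#   git tag -s v1.0 -m "Release"   # signed (annotated) tag
#   git verify-commit HEAD
#   git verify-tag v1.0

git config commit.gpgsign   # prints: true
```

Forges can then surface verification status per commit/tag, but as noted later in the thread, this protects the source history, not the build chain itself.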

Also, a fundamental method of achieving security is to minimize the attack surface. Don't use unnecessary software, and don't use overly complex software. Some of these CI/CD systems have more than 1 million lines of code. Do you think they've been reviewed? Do you think they've been designed with security in mind? I like the advice, "Trust, but verify." If you weren't involved in the development process of the CI/CD tool, and you haven't read the source code, then you're trusting something but not verifying it.
 

Ease of Use

We need to address the wants and needs of stakeholders who are not participating in this conversation, but who are nevertheless valued contributors to the project. We have more than 20 developers and artists, and probably a handful of them have strong IT skills.

Visualize a scenario where you have little knowledge about source control system inner workings and little free time to learn about them. You just want to make content or contribute a feature or bug fix. If the VCS is complex with a confusing user interface and inconsistent or invalid metaphors, are you going to spend a lot of time reading documentation and watching tutorial videos in order to contribute your improvements to WFG? Or, are you going to throw up your hands and not contribute the improvements at all?

That decision depends on the individual, but the decision about which VCS we use can have an impact on how many contributors we have in the long term.

Edited by dave_k
  • Like 2

20 hours ago, Loki1950 said:

As far as I can see, Git's handling of binary assets just does not work well. Linus designed Git for text-based code on the kernel project, so its binary capabilities are very limited. And GitHub may start wanting cash as the repos grow: binary data are just blobs that Git has no handle on.

I wonder if it would be best to continue using SVN for the handling of binary assets, and how well it would work to use an SVN repo as a Git submodule (there are some results when I search "using svn repo as git submodule").

On 25/11/2021 at 6:54 AM, Stan` said:

This is fun :)  But I didn't know it existed so thanks :)

https://docs.gitlab.com/ee/user/project/merge_requests/allow_collaboration.html

Some of the original quote is missing here, but to the question of whether or not maintainers can edit pull requests: on GitHub as well, it can be done. On a GitHub PR, there's a default option:

 

[screenshot of the GitHub PR option]

Also, when reviewing the PR (diff), maintainers, or even reviewers with no write access, can insert a suggestion, and the original submitter can then click to commit the suggestion.

On 25/11/2021 at 3:32 AM, Stan` said:

Yeah, submodules are generally a PITA. I mentioned them for the sake of the argument, because I believe more things should be split from the repo, which is not a consensus inside the team; see the spoiler in the first post.

I haven't worked with submodules enough yet to form an opinion on whether or not subs are a PITA. I struggle with them sometimes just because I don't use them regularly and forget some of the basic Git commands for using them. Based on the little experience I have with them so far, I'm of the opinion they should generally only be used -- and it's when they're most practical to have them included as submodules -- if they are completely independent of the project, such as third-party libraries (although there are exceptions).

 

For anyone that hasn't worked with submodules yet, you might want to check out the SuperTux project and follow their instructions for fetching the submodules and building the project.

 

As for pull requests on GitHub, if anyone wants to experiment or practice, feel free to make a pull request on community maps 2. You can just edit any file, add "Hello World", and make your pull request. (GitLab PRs are pretty similar, as far as I know.) And of course you can also make practice PRs on your own repo, or on other repos that already exist for that purpose.

Edited by andy5995
  • Like 2

It seems like we haven't done enough "comparison shopping", because I found several CI/CD tools that support Svn, Git, and even Mercurial.

Apache's comparison of CI/CD tools

How seriously is Svn being considered as a solution? It seems like it is completely possible. The decision process for selecting CI/CD tools would need to be less rushed.

These are the options that I would suggest.

Tools that support Svn and Git, are actively developed and maintained, and allow self-hosting

The first two options have commits within the past few days. The second two options have most recent commits as of 2 years ago.


OpenProject has 359K lines of code, and 4 documented security vulnerabilities since 2017.

Tuleap has 1.3M lines of code, and 14 documented security vulnerabilities since 2014.

FusionForge has 702K lines of code, and 3 documented security vulnerabilities since 2013.

Redmine has 414K lines of code, and 44 documented security vulnerabilities since 2008.

Gitlab has 1.6M lines of code, and 601 documented security vulnerabilities since 2013.

The only project that seems to encourage its contributors to write secure code is Tuleap. But they also acknowledge that their codebase is huge, and some parts are 20 years old and probably have flaws.

This is part of the basis for my recommendations for the CI/CD tools listed above, in order of preference from highest to lowest.

Whatever CI/CD tools we consider choosing, we should do at least a cursory evaluation of how securely they were programmed. After skimming this article, I can see how difficult it is to write secure apps with Ruby. And PHP wasn't designed with security in mind, so it may also be difficult to write secure apps in PHP.

Also, see the guides on SQL Injection Prevention and PHP Configuration, and the OWASP Top 10 categories of vulnerabilities.

My opinion of the ranking of web-based languages from most likely to be secure to least likely: Python, Perl, PHP, Ruby, Go, Java.

Justification: Python and Go are the only type-safe and memory-safe languages used by CI/CD tools in the lists that I have found. It is not possible to use Go with private modules, so Go requires trusting a third party. Therefore, I consider Python to be the best language, as long as pip is NOT used to install modules. As a result, I recommend using a traditional distro for hosting the CI/CD tools. Java has had numerous articles written about its vulnerability to arbitrary code execution.

Apache Bloodhound is the only Python-based CI/CD tool that supports SVN, as far as I'm aware. I don't see active development on it, though. The last release appears to be in 2015.

Maybe there are more actively developed CI/CD tools out there that I haven't found. If you know of options that aren't listed on the Apache comparison page, besides OpenProject, then please suggest additional options for us to consider.

Edited by dave_k
  • Like 1

4 hours ago, dave_k said:

One step to achieving security of the software supply chain is to cryptographically sign patches or tags and pull requests, as well as releases.

 

This would not, in any way, prevent a SolarWinds scenario - the releases were properly signed in that attack.  You'd have to have a second, separate, build chain rebuilding releases to compare output hashes to catch it.

 

4 hours ago, andy5995 said:

I haven't worked with submodules enough to form an opinion yet whether or not subs are are PITA. I struggle with sometimes just because I don't use them regularly and forget some of the basic git commands for using them. Based on the little experience I have with them so far, I'm of the opinion they should generally only be used -- and it's when they're most practical to have them included as submodules -- if they are completely independent of the project, such as third-party libraries (although there are exceptions).

What I personally tend to prefer is uploading private (as in, not widely published, not that they're hidden) package repositories.  Gitlab, at least, has several built-in package repositories (NuGet, maven, PyPi) that can be used for uploading/hosting these kinds of things, and I believe several other CI systems do as well.  And in some cases that's not necessary - recent versions of cmake, I believe, can grab git repositories as project includes (assuming the referenced project also uses cmake).

12 hours ago, Stan` said:

A few people on Linux complained about the presence of Windows binaries in the repo, because that makes the download heavier for no reason. The Git migration ticket planned to have a download script for the Windows platform, just like we can download the SpiderMonkey tarball. I think this could be acceptable if it works straight out of the box, e.g. double-clicking a script would pull all the binaries and update the relevant ones using MD5 or SHA-1 or something.

I personally feel no build output should ever be persisted to a repository, and no third-party binaries if you can avoid it.  Back when I made my few contributions and also saw the earlier git/svn debates, my feeling was that an artists'/testers' script was the way to go, too.  I think my only recommendation here is to make it so that you could pass version numbers to be able to work against a particular build (just stating what I assume was implied).  For that matter, it's possible to install hooks into various parts of git (I don't know about svn, but possibly there too) - UnrealEngine, for example, has a script that you're supposed to execute on first clone; thereafter, any time you switch branch or pull updates, it downloads the relevant third-party dependencies and common content.
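The hook mechanism mentioned above can be sketched in a few lines. This toy `post-checkout` hook just drops a marker file where a real one (UnrealEngine-style) would download third-party dependencies; the script name in the comment is hypothetical:

```shell
#!/bin/sh
# Toy post-checkout hook demo: after every branch switch (or pull), Git runs
# .git/hooks/post-checkout, which here only writes a marker file. A real hook
# would invoke a dependency-download script instead.
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name  Demo
git commit -q --allow-empty -m 'initial'

cat > .git/hooks/post-checkout <<'EOF'
#!/bin/sh
# A real hook would call e.g. ./get-windows-binaries.sh here (hypothetical).
echo "deps refreshed for $(git rev-parse --abbrev-ref HEAD)" > .deps-refreshed
EOF
chmod +x .git/hooks/post-checkout

git checkout -q -b feature   # triggers the hook
cat .deps-refreshed          # prints: deps refreshed for feature
```

The caveat is that hooks live in `.git/hooks` and are not cloned, so projects usually ship a one-time setup script that installs them, as the UnrealEngine example does.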

 

I personally prefer working with Git, being a programmer, because it makes diffs and branching so much easier for text assets.  In most common cases, assuming submissions are based on branches/pull requests (instead of submitted diffs), there's usually not much visible difference between Git and SVN, though - it's only once you have multiple people modifying the same file that Git and PRs really help.  Other than typing `git` instead of `svn`, I don't think artists would really be affected.

  • Like 1
  • Thanks 1

13 hours ago, artoo said:

I'd think that too from the file sizes of the images.

A dedicated art/image repo might do, included with a submodule even.

It takes probably 3-4 submodules given the current source tree, maybe more, depending on preferences.

Splitting anywhere other than at project boundaries is a bad idea IMHO. For the sake of argument, let's assume binary assets are an issue, or are anticipated to become one; then LFS is a superior solution to some hand-crafted custom setup.

For me, engine and game are two different projects. Two projects can live in the same repo in theory, but it takes a lot of discipline; in practice this rarely ever works. Have a look at 0 A.D.'s premake.lua, with its lofty goal of building multiple static libs so they can be used on their own outside of pyrogenesis. In practice there is tight coupling, even plenty of cyclic dependencies, and stuff in third_party was made dependent on pyrogenesis as well. Then look at the directory public, supposedly the game, which contains plenty of stuff that belongs to the engine, making the engine unusable without the game. The engine having a proper API one day is pretty much out of reach in such a setup.

 

10 hours ago, dave_k said:

This is part of the basis for my recommendations above for CI/CD tools, listed above in order of preference from highest to lowest.

pyrogenesis has 0 CVEs listed, so it must be the most secure piece of software ever ...

  • Like 2
  • Thanks 1

2 hours ago, hyperion said:

For me engine and game are two different projects

Sure.

I'd put the public mod in a repo, and the source dir in a repo. Details to be solved; art, specifically images, may go in a repo too.

Optionally, maybe mod gui specific stuff to be separated.

I would place a bet on devs getting frustrated over time if images go in the source code repo.

Edited by artoo
  • Like 2

18 hours ago, hyperion said:

Split repo: If pyrogenesis/0ad is considered an engine plus a game and not just a game a split is almost mandatory.

There is an important caveat: split repos should be kept in sync via some relation mechanism.

3 hours ago, hyperion said:

pyrogenesis has 0 cve listed, so the most secure piece of software ever ...

Someone doubts? :cool:

  • Like 1
  • Haha 2

19 hours ago, bb_ said:

How does git-lfs work with the versioning history of binaries?

Updating is, as far as I know, just drop-and-replace, with no binary diffs present, as Git just versions the "pointer" to the file.

I am fairly certain GitHub already charges for LFS if it exceeds a limit, so that's something to consider as well when binary files have long histories, as I would assume all copies are kept.
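To make the "pointer" concrete: with LFS the repository versions a small text stub per binary (format per the Git LFS specification), while the actual content lives on the LFS server. The oid and size values below are invented for illustration:

```shell
#!/bin/sh
# What an LFS pointer looks like in the repo: a tracking rule in
# .gitattributes plus a tiny text stub in place of each binary.
# The oid/size values here are made up for the demo.
set -eu
cd "$(mktemp -d)"

# The tracking rule `git lfs track "*.png"` would write:
printf '*.png filter=lfs diff=lfs merge=lfs -text\n' > .gitattributes

# The pointer stub Git versions in place of the actual image:
cat > example.png <<'EOF'
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393
size 12345
EOF

cat example.png
```

So each new version of an image replaces the stub, and Git's own history only ever holds these few bytes per revision, while every full binary version accumulates on the LFS server, which is where the storage bill comes from.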

Edited by smiley
  • Like 1

1 hour ago, smiley said:

I am fairly certain GitHub already charges for LFS if it exceeds a limit, so that's something to consider as well when binary files have long histories, as I would assume all copies are kept.

We'd self host the LFS server as well anyway.

It seems the drawback would be that you need to install Git LFS on top of Git, and you might have to use `git lfs pull` instead of `git pull` to get some of the images. It writes the images to a folder, IIRC?

 

 

  • Like 1

  • 2 weeks later...

Thanks to everyone posting their input. I have updated the posts at the top of this thread. Also, I made a short list of the limitations in our current setup. Feel free to yell if I missed points. Between the listed points I feel there is some overlap.

To start with the top point (not that there is any particular order, feel free to comment on the other points too) "Phabricator not being maintained anymore". If we want to migrate, we can migrate anywhere.  @dave_k already posted some ideas on possible systems before:

Obviously there is also the Phabricator fork, Phorge. And surely there are more.

What are everyone's thoughts on these? Are there migration scripts available for any of these? Do we lose any data currently in Phabricator? What features do we lose? What do we gain? Can the Trac or forum data be somehow incorporated? How does it work with external links to the current Phabricator pages? How does any choice relate to the feature list at the top of this thread?

  • Like 1
  • Thanks 1

Actually, I don't quite understand the suggestion, @dave_k.

While those tools can incorporate a VCS, aren't they mainly project planning or agile development tools?

From my quick look I can't find much information about how making a diff and then discussing / reviewing / requesting changes works there.

And for the security concerns: I think this is a case of who looks the most finds the most.

 

Generally, GitLab meets all(?) of the requirements and has migration scripts (or at least many useful-looking results on Google) for the migration from Trac and Phabricator.

  • Like 1

On 07/12/2021 at 10:03 PM, bb_ said:

Obviously there is also the Phabricator fork, Phorge. And surely there are

Do note however that they are considering dropping subversion as mentioned by @s0600204 here. Source: https://d.i10o.ca/tmp/phabricator-future/

Also, ironically, our Git repo is about 10 GB (5 GB .git and 5 GB actual files uncompressed for a given revision) while Phabricator is currently taking 138 GB and growing. Last time I deleted 10 GB worth of logs.

As for migration, here is something from Trac to Gitlab https://github.com/tracboat/tracboat 

Phabricator, however, will be more tricky; we currently have: (maybe @vv221 has some ideas)

Jenkins

  • Migrating Jenkinsfile to YAML.

 

 

  • Like 1

On 26/11/2021 at 10:25 PM, dave_k said:

What is only nice to have?

  • GPG signing of commits or at least tags and pull requests
  • 2-factor authentication

That is unnecessary. 0 A.D. does not use the practice of "committing on trust".

Nice code from a "random internet user" will be added.

Any code from the "main administrator" will still be reviewed.

 

On 26/11/2021 at 10:25 PM, dave_k said:

SolarWinds was hit by malware that targeted the build environment. It was described in some articles as "IT's Pearl Harbor". I encourage people to personally examine the security of the build environment, if only briefly. If you can think of vulnerabilities, then so can the bad guys.

One step to achieving security of the software supply chain is to cryptographically sign patches or tags and pull requests, as well as releases.

Source code signing does not protect against MITM injection into the build system. Replace the compiler's files with unsigned ones and it is almost impossible to detect.

  • Like 1

The way I see it, we have 3 main components:

  • the actual 'reference' VCS
  • the development tool. That's currently Phabricator, which is somewhat git/svn agnostic, unlike say gitlab or GitHub, but like some other alternatives.
  • the CI/CD pipeline, currently Jenkins.

The VCS question is only relevant when it is determined by our development tool, which should IMO be the focus. Phabricator works appropriately but has been somewhat of a pain to maintain, and I think we should exclude any tool that leads to more headaches in that space.

I do believe the 'reference' VCS should be somewhat independently hosted and backed up, so we don't risk losing the code or our independence there. However, I also think the 'work' VCS could be something separate, with synchronisation scripts between the two. For what it's worth, if we could have a single tool abstracting GitLab PRs, GitHub PRs, Phabricator diffs and so on, I would be very happy. Exporting data from that system should be possible, but I also believe that's mostly a free space everywhere.

My ranking of the issues noted by bb in the second post + 1 personal take

Spoiler

MUST

  • No more sysadmin effort than current Phabricator (see below) < this is my own requirement, new
  • Possible to work completely remotely (locyneah) < not sure what this means because we essentially kinda do already?
  • Easy export of data (dave_k)
  • Usable by everyone from anywhere (bb)
  • Supported on maintained platforms (Stan)

SHOULD

  • Low sysadmin efforts (Stan)
  • Proper binary handling (Freagarach, dave_k)
  • Longevity (dave_k)
  • Partial checkouts (Freagarach)
  • Reliable (dave_k)
  • Branching (locyneah, maroder)
  • Back up on devs machines (Freagarach) < sort of a free space tbh
  • Autobuild in repo (maroder, bb)
  • Simple straightforward workflow (bb)
  • Low threshold for new contributors (Stan, bb, dave_k)

NICE

  • Security (dave_k)
  • All code art etc. in one place (bb, dave_k)
  • Companies involved are FLOSS minded (dave_k)  << mostly in that we should be able to export our data if we need to, but I doubt somewhat it's a problem.
  • Possibility to self-host (locyneah, bb)
  • Possibility for bots (elexis)
  • Easy rebasing (maroder) < sorta already possible though we're technically on svn thanks to git mirror + phab workflow
  • Easily properly credit authors of patches, even if they don't have direct commit access (Stan)

DON'T CARE

  • Used in mods and other FLOSS projects (Stan, Locyneah, samulis)
  • Public in production third party back up using different vcs (bb)
  • 2FA (dave_k)
  • Including (parts of) other (third party) repo's in ours (submodules) (Stan)
  • GPG signing of commits (dave_k)
  • Support for GUI based workflow (samulis) < though sorta goes with the 'easily usable for newcomers'.

  • Like 2

On 08/12/2021 at 4:18 PM, wraitii said:

Phabricator works appropriately but has been somewhat of a pain to maintain, and I think we should exclude any tool that leads to more headaches in that space.

I think this should be the focus. We don’t want to "burn" people with the administration of the tools that will be adopted.

This is the main reason I’m mostly advocating in favour of Gitlab, because as a Gitlab administrator and maintainer myself this is something I could help with both in the short term (helping in the setup and migration) and long term (helping in the maintenance and administration).

I don’t think it is a perfect solution, I actually have more than a couple issues with Gitlab, but I have been working with it long enough to know of its shortcomings and can help working around them. I would still be willing to help if another solution is chosen, of course, but I would be much less efficient with other tools if I need to learn how to use/maintain them from scratch.

  • Like 1
  • Thanks 1

On 08/12/2021 at 12:05 PM, Stan` said:

Do note however that they are considering dropping subversion as mentioned by @s0600204 here. Source: https://d.i10o.ca/tmp/phabricator-future/

For the second link, I suppose you refer to:

Quote
2021-06-02 18:03:08 <epriestley> A fork also creates an opportunity to make the project at least slightly more manageable: you could drop Subversion and Mercurial support, drop like 5-10 applications that see very little use (e.g., Releeph is broken; Phortune is unlikely to be useful for a non-commercial maintainer). You could drop Arcanist entirely.

It seems that epriestley (the old maintainer of Phabricator) is just posting some suggestions on what one could do. It doesn't seem to contain any evidence that SVN is or will actually be dropped. Not sure what the official line is (I heard some conflicting info from @Freagarach on IRC). Would be good to figure out.

On 08/12/2021 at 12:05 PM, Stan` said:

Also, ironically, our Git repo is about 10 GB (5 GB .git and 5 GB actual files uncompressed for a given revision) while Phabricator is currently taking 138 GB and growing. Last time I deleted 10 GB worth of logs.

This is an unfair comparison. A Git repo contains far less than a Phabricator instance: think of all the discussions, attachments of whatever kind, non-committed patches, etc. The fair comparison would be to compare the size of VCSs (Git, SVN, etc.) with the size of development tools like Phabricator. If anyone has some data on either of these, please add them to the thread.

On 08/12/2021 at 4:18 PM, wraitii said:

For what it's worth, if we could have a single tool abstracting GitLab PRs, GitHub PRs, Phabricator diffs and so on, I would be very happy

Not the only person asking: https://secure.phabricator.com/D8775?id=20822. Also may I add https://www.tuleap.org/integration/tuleap-gitlab-integration-why-and-how-to-use-it and https://docs.gitlab.com/ee/integration/github.html. Feel free to list similar pages for other solutions.

  • Like 1
