The World Before Git (osshistory.org)
35 points by janvdberg on Dec 20, 2023 | 34 comments


Unpopular opinion, but I still really like SVN, and I prefer Mercurial and Fossil to Git as distributed systems. You can install TortoiseSVN for a non-technical user, give them a quick rundown on how to check out asset files from SVN, how to lock and unlock files when they need them, and how to commit when they're done, and they can more or less understand it and work with it. GFL getting someone to understand how to use Git LFS or deal with merge conflicts on asset files.

Git's LFS/annex extensions are serviceable for developers and technical folks, but they really are subpar compared to centralized version control options.


One of the conceits people forget about git is that it really is meant for text only.

I'm one of those weirdos who actually likes git LFS, but only because I admit all its shortcomings. But it is working really well for my game dev project. Not what I use to manage my asset library, though. It is what I use to sync the most up-to-date binary files, which is really the use case it is maintained for anyway (think syncing build artifacts or large data sets).
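For the curious, the setup is roughly just this (the patterns here are only examples):

    git lfs install                      # one-time: set up the LFS filters and hooks
    git lfs track "*.blend" "*.png"      # writes the patterns to .gitattributes
    git add .gitattributes
    git commit -m "track binary assets with LFS"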


What do you use for assets then, if you don't mind my asking..?


Right now it is managed using blender, and then I back it up and version it manually, which is pretty common for blender workflows.

I am planning an overhaul in a bit. I looked into SVN, but would probably rather use git lfs, honestly. There are also options for uncompressed files to be stored as text, but it doesn't seem like you can get away from binary data completely. I haven't tried too hard to focus on it yet. It is somewhat frustrating that there isn't a clearly good free option.


I used SVN a lot before hg and git came out, and liked it, but once distributed systems were available it was just so much easier to clone copies and keep them wherever I wanted that I switched over and never looked back. I prefer hg for many of my personal projects (and not just coding--I store things like my blog in hg repos) because I find it simpler than git for basic repo operations.


I used to use HG for my personal projects but these days I use fossil and keep the database in dropbox, works shockingly well.


There are so many other industries where versioning is also an issue, but which lack the tools to properly manage it.

The start of my career was in the pro audio industry, where "copy and rename" of mixing session project files is still common practice:

    {songTitle}_final_rev2-2021.08.03-FINAL-MIX.ptx
A VCS working on text files, where deltas/diffs are cheap, helps a lot here, versus saving binary project files, where diffs/merges make little sense and can corrupt the output.


> versus saving binary project files where diffs/merges make little sense and can corrupt the output

Diffs and merges with binary files can work just fine. Look at rsync as an example. The problem is that it can take... too... much... time. Text is easy to diff because we can simplify the problem down to lines separated by `[\r]\n`. Being able to split a large text file into individual lines to delta is a much easier problem.

Binary data is much more difficult. So the problem isn't corrupting an output file, but just how much time it can take to figure out the most efficient way to delta a file. That time cost is what makes the workflow of revision control too unwieldy.
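For a concrete sense of the gap, compare a line-based diff with a binary delta tool (bsdiff is just one example):

    diff -u old.c new.c > fix.patch        # line-based delta: cheap, human-readable
    bsdiff old.bin new.bin delta.bin       # binary delta: must search for matches with no line boundaries
    bspatch old.bin rebuilt.bin delta.bin  # rebuild the new file from old.bin + the delta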


Someone needs to make a visual diff for audio files.


The world after git is jujutsu. Once you get your workflow down, it's so much more useful than git because you can easily switch from any commit to another by editing it or checking it out; the latter creates a new commit that becomes your working copy (the working copy is itself a commit), while the former lets you amend the commit directly.

https://martinvonz.github.io/jj/v0.12.0/

You can squash your working copy into the previous commit with one command. All these things, to the uninitiated, probably don’t sound like much. All I can say is the UX is much more attuned to the way I work. Much quicker and more intuitive. It is exactly how I would build a vcs if I needed to and knew how.
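Roughly, the commands behind all that (jj ~0.12 syntax; the revision arguments are placeholders):

    jj new <rev>     # "check out": start a new working-copy commit on top of <rev>
    jj edit <rev>    # make <rev> itself the working-copy commit, so changes amend it
    jj squash        # fold the working-copy commit into its parent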

I also like that it is built in rust. Since the vcs is really just a big data structure, I feel like rust is a good choice because of how it forces correct code to some degree and doesn’t allow cycles, etc.


> ClearCase by Atria Software introduced a robust branching model, and the ability to configure workflows to the preferences of individual teams within enterprises.

Clearcase was awesome and terrible at the same time. Being able to version directories was somewhat novel then, so that was nice. Config specs - what a cool idea. Felt like a bunch of folks who had worked on filesystems/NFS asked themselves "hey what if we made a filesystem that could give a window into a VCS?". So cool, so compelling and yet so many devils in those details. At the time, things like clearmake and derived objects had such tremendous potential.


Well, git getting developed didn't really do much to move people off Subversion: at the time, multiple DVCSes were being developed, like Mercurial, Bazaar (derived from Arch) and Git...

What got people to switch to Git (arguably, worst UX among them, but fastest for most operations) was GitHub combined with the rising popularity of Linux. But mostly GitHub.


I remember this differently. SVN had great _branching_ but horrific merge support. They resisted improvements to merging, because they wanted something theoretically perfect.

Git came along with great merge support and the entire world loved it.


Global ignores also came quite late for svn. Until then you needed to add an svn:ignore property to the root directory and then apply it recursively (which could then go wrong).
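For reference, the dance looked roughly like this, versus the single global-ignores line you can put in the client config today:

    svn propset -R svn:ignore "*.o" .    # stamp the property onto every directory in the tree
    # vs. one line in ~/.subversion/config once global ignores existed:
    # global-ignores = *.o *.class build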


Before Git, Perforce was state of the art.

Before that it was SVN (open source, decent) and tools like IBM Clearcase (not good, file locking, etc)

Perforce was very easy to use, and honestly git was a step backwards in overall usability.


After Git, Perforce is still state of the art for any project that involves large binary files.

Perforce is still easier to use than Git.

It’s a shame that VCS software has stagnated on Git.


Easy to use until you used "streams." Pulling a file down to a stream of a stream was painful.


How dare you, for a good decade there I forgot that IBM Clearcase exists.

Now I'm ruined, again.


I would like to insert a step 0: no version control whatsoever, which is what currently happens with the vast majority of VBA, and with its top contender for replacement, Power Automate.


I would also like to add what happened after git. Since git isn't the last word in version control, it would be useful to put git into perspective.


I truly hope so, but there is so much momentum that it is going to be tough to topple. Any competitor which does not offer a git compatibility interface is dead on arrival.

I dislike git for many reasons, but to get people to change is going to take a big "thing" that git cannot do. Incremental improvements are not going to justify the switching cost.


I remember downloading/emailing patch files around and groaning when your local changes had made it impossible to get the patch file to work smoothly, so you ended up going through line by line trying to figure out what was supposed to go where, when...
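The workflow in question, roughly, for anyone who never had to live it (paths made up):

    diff -ruN project-orig/ project-mine/ > my-changes.patch   # produce the patch to mail around
    patch -p1 < my-changes.patch                                # apply it in someone else's tree
    # when hunks don't apply, patch drops *.rej files and you merge them by hand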


How is that different from conflict resolution today?

Git essentially automated patch file management, but the hard problem of incompatible changes is just as hard today.


There was less automation and as far as I recall, none of us had "branches" -- everyone worked on a local trunk and everything would diverge rapidly. I wasn't working professionally at the time (I was 11-13), so I wasn't exactly up-to-date on the latest modern conveniences then either. :)

It was also much more difficult to figure out changes across multiple files with a single low-resolution (800 x 600 at the time, I think) monitor on a computer that could barely keep a few files open simultaneously.

Another big pain point was spending an hour or more downloading a patch (via dialup) only to realize that you needed other patches or files before you could actually start working on anything, or that you'd downloaded the wrong file, etc.

So there were many little inconveniences that added up to make it tremendously painful (in my memories) compared to today's experiences.


With the CRTs of yore, 800x600 was extremely rare: you had a fixed amount of bandwidth to work with, so you could balance refresh rate with resolution, and 1280x1024 or something wasn't rare at all on 15"/17" screens. Only with "HD video" and TFT screens did we end up getting stuck on 720p (or 1366x768, yikes) screens for a decade before FullHD started appearing on screens again.

Again, that's unrelated to "patch management": I am not disputing we have better experience overall today, but none of that is due to git.


I also think you could have local branches in SVN, but I might be misremembering things.


Around 2003-2006 immediately before git (and mercurial) there were a bunch of new open source distributed VCSes. http://moinmo.in/NewVCS lists a few of them from the time.

Bazaar, Bazaar-NG, Codeville, Darcs, Monotone

Git and Mercurial used ideas from a few of them and made them faster and more usable. For git it helped that there was a large project primed to switch to a new VCS.

https://dwheeler.com/essays/scm.html is another good source from that period

(disclosure I helped with Monotone a bit)


Right before git made it big, I got so fed up with SVN I decided to build my own.

It was UI-based: hit a button and it would make a zip of just the files that had changed, alongside an XML manifest of all files with their hash, permissions, and metadata. Every so many commits it would make a full backup.
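A rough shell sketch of the idea (entirely hypothetical, not the actual tool):

    find . -type f -newer .last_snapshot | zip snapshot-0042.zip -@   # zip only files changed since the last snapshot
    find . -type f -print0 | xargs -0 sha256sum > manifest-0042.txt   # record every file and its hash
    touch .last_snapshot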

Worked really well, even built a visual difftool and history explorer - IMHO nicer than most git UIs these days.

No mechanism for branches or working with other devs but it worked really well for a small team working on a shared SMB server in 2007.

Thought about trying to commercialize it but git took off and I fell in love. Wrote a tool that converted that history to git and never looked back.


I suspect if only merit was involved, Fossil would be in more widespread use than Git.

https://sqlite.org/whynotgit.html


The world before git had one thing that git lost: a strong notion of what "upstream" means. Git nowadays is used just like any other centralized SCM tool of yore; at the end of the day you push your changes to the master repo and go home. GitHub may have made git accessible and popular, but it ironically did so by turning it into just another centralized SCM.


It's trivial to decentralize any given process with which you're involved, if you care to.

The fact that the overwhelming majority of users prefer central workflows isn't a bad thing about Git.


TFS, and before that VSS, were also major systems in use by companies. In TFS, creating a new branch was a time-consuming operation. But it did have one interesting feature, which was the ability to exclusively check out a file. Interesting until the person holding the lock left the company with the file still checked out.


It's kind of insane that we all just put up with the Git UX. It's so bad, the author named it Git. But we all just keep using it as is.


What's so bad about Git UX? There are many GUIs for Git. Most people use one instead of typing commands directly. Most modern IDEs have one built-in too.



