Friday, July 03, 2009

Discouraged by version control

So I watch the Git vs Mercurial vs bazaar vs SVN low-intensity conflict simmer on the intarweb and I wonder... Is it me, or has all that stuff suddenly become Too Hard to use? Where did it all deviate from the Unix philosophy and go down the Enterprise Way?

I think I'm just going to go back to the old way: compressed archives of (few) directories.
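The "old way" amounts to nothing more than a dated tarball per checkpoint. A minimal sketch (the project name and file are illustrative, not anything specific I'm working on):

```shell
#!/bin/sh
# One-command "commit": snapshot the project directory into a dated archive.
mkdir -p myproject snapshots
echo 'int main(void) { return 0; }' > myproject/main.c

# Timestamped tarball; compressed archives of (few) directories, as promised.
tar czf "snapshots/myproject-$(date +%Y%m%d-%H%M%S).tar.gz" myproject

# "Checkout" is just extraction, e.g.:
#   tar xzf snapshots/myproject-20090703-120000.tar.gz
```

No branching, no merging, no staging area; the entire "history model" fits in one line of shell, which is rather the point.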

For those who ask indignantly (and not on this blog, since the comment count is almost negative) about the multi-member team working on massive software, I reply: the Rule of Modularity (http://www.faqs.org/docs/artu/ch01s06.html#id2877537), which I transcribe here purely for effect.

As Brian Kernighan once observed, “Controlling complexity is the essence of computer programming” [Kernighan-Plauger]. Debugging dominates development time, and getting a working system out the door is usually less a result of brilliant design than it is of managing not to trip over your own feet too many times.

Assemblers, compilers, flowcharting, procedural programming, structured programming, “artificial intelligence”, fourth-generation languages, object orientation, and software-development methodologies without number have been touted and sold as a cure for this problem. All have failed as cures, if only because they ‘succeeded’ by escalating the normal level of program complexity to the point where (once again) human brains could barely cope. As Fred Brooks famously observed [Brooks], there is no silver bullet.

The only way to write complex software that won't fall on its face is to hold its global complexity down — to build it out of simple parts connected by well-defined interfaces, so that most problems are local and you can have some hope of upgrading a part without breaking the whole.
