Monday, July 27, 2009
Friday, July 03, 2009
I think I'm just going to go back to the old way: compressed archives of a few directories.
For those who ask, indignantly (and not on this blog, since the comment count is almost negative), about multi-member teams working on massive software, I reply with the Rule of Modularity (http://www.faqs.org/docs/artu/ch01s06.html#id2877537), which I transcribe here purely for effect.
As Brian Kernighan once observed, “Controlling complexity is the essence of computer programming” [Kernighan-Plauger]. Debugging dominates development time, and getting a working system out the door is usually less a result of brilliant design than it is of managing not to trip over your own feet too many times.
Assemblers, compilers, flowcharting, procedural programming, structured programming, “artificial intelligence”, fourth-generation languages, object orientation, and software-development methodologies without number have been touted and sold as a cure for this problem. All have failed as cures, if only because they ‘succeeded’ by escalating the normal level of program complexity to the point where (once again) human brains could barely cope. As Fred Brooks famously observed [Brooks], there is no silver bullet.
The only way to write complex software that won't fall on its face is to hold its global complexity down — to build it out of simple parts connected by well-defined interfaces, so that most problems are local and you can have some hope of upgrading a part without breaking the whole.
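The quoted principle can be sketched in a few lines: keep the parts simple, and let them touch each other only through a narrow, well-defined interface, so that replacing one part is a local change. Everything named below (the `Storage` interface, `MemoryStorage`) is my own illustration, not anything from the post or from ESR's book:

```python
# Sketch of the Rule of Modularity: simple parts connected by a
# well-defined interface. All names here are illustrative.
from typing import Protocol


class Storage(Protocol):
    """The well-defined interface: the rest of the program sees only this."""

    def save(self, key: str, value: str) -> None: ...
    def load(self, key: str) -> str: ...


class MemoryStorage:
    """One simple part. A file- or network-backed version could replace it
    without touching any code that depends only on the Storage interface."""

    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def save(self, key: str, value: str) -> None:
        self._data[key] = value

    def load(self, key: str) -> str:
        return self._data[key]


def remember_greeting(store: Storage, name: str) -> str:
    # Depends only on the interface, so most problems stay local:
    # a bug in MemoryStorage can't hide anywhere but MemoryStorage.
    store.save("greeting", f"hello, {name}")
    return store.load("greeting")
```

The point isn't the toy example itself; it's that the caller's hope of "upgrading a part without breaking the whole" rests entirely on the narrowness of that interface.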
Then more interesting stuff here from Jacob Kaplan-Moss, with lively discussion including Ian Bicking.
I left a comment there:
See also: http://chrismahan.blogspot.com/2008_10_01_archive.html
I've just taken another interesting detour while rereading the above. The rulers in Dune relied on the mentats to determine long-term strategy as well as implementation details. The elder Herbert continues to amaze me.