All Comments
TopTalkedBooks posted at August 19, 2017
Never meaningfully demonstrated? Have you tried to verify the claim?

Go read Peopleware. You will find descriptions of carefully controlled coding comparisons that routinely found a factor-of-10 productivity difference between experienced programmers working on the same task, along with a discussion of which organizational factors lead to those productivity differences.

There is other research on the topic as well. For instance, "individual differences" research found a 30-fold difference between mediocre programmers and the top programmers. If you want citations on how that research was done and what exactly it found, that is where I'd look.

TopTalkedBooks posted at August 19, 2017
A lot of these come from Robert Glass's excellent book "Facts and Fallacies of Software Engineering".

It's quite concise (224 pages), but it's chock full of quite excellent advice. Each one of these points (and many others) is fleshed out in a separate chapter that gives a good deal of background, clarification, and supporting evidence.

TopTalkedBooks posted at August 19, 2017
I've never been a fan of extended analogies as a way of explaining things. At some point you need to understand the subject for itself, and not in terms of something else.

That said I strongly second his point about the value of keeping designs simple. And I have an interesting piece of evidence in support of it.

I just read that book, and one of the "facts" that Robert Glass brings up is that a 25% increase in requirements results in a 100% increase in software complexity.

I put "facts" in quotes because the claim is supported by only one study, and I have lots of questions about how one would define key terms like "software complexity". However, based on experience, I am inclined to believe that something like this is true. If you further believe (as I do) that the source of the requirements doesn't matter, then adding internal requirements about design is going to result in complications.
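As a back-of-the-envelope check (my own arithmetic, not from the book): if you model complexity as growing like requirements raised to some power k, the claimed numbers imply a strongly superlinear exponent:

```python
import math

# Glass's reported numbers: +25% requirements -> +100% complexity.
# If complexity ~ requirements**k, then 1.25**k == 2, so solve for k.
k = math.log(2) / math.log(1.25)
print(round(k, 2))  # roughly 3.11
```

That is, under this (purely illustrative) power-law model, complexity grows roughly with the cube of the requirements count, which matches the intuition that small scope increases are disproportionately expensive.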

As a result I am inclined towards simple, clean designs. I gladly introduce polymorphism, abstraction layers, and so on fairly frequently. But only if I see a concrete benefit from doing so. If I am not convinced that this piece of abstraction produces a net win within the context of this software solution, I don't do it.

(For the record, I am not a software master by any means. In Zed Shaw's terms I'd be an expert, although early mentoring by the right person kept me from ever suffering from the love of complexity that he paints all experts with.)

TopTalkedBooks posted at August 20, 2017

This is a really tough issue. See what Robert Glass has to say on the subject in his book "Facts and Fallacies of Software Engineering". (Note that Amazon has books available new for less than they're available second-hand! [as of 2009-01-05 12:20 -08:00]; also at Google books.)

TopTalkedBooks posted at August 20, 2017

The book Facts and Fallacies of Software Engineering states this fact: "Modification of reused code is particularly error-prone. If more than 20 to 25 percent of a component is to be revised, it is more efficient and effective to rewrite it from scratch." The numbers come from statistical studies performed on the subject. I think the numbers may vary with the quality of the code base, but taking this statement into account, in your case it seems more efficient and effective to rewrite it from scratch.
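The rule of thumb above can be sketched as a tiny decision helper. This is a hypothetical function of my own, not anything from the book; the threshold is a judgment call, not a hard law:

```python
def prefer_rewrite(modified_fraction: float, threshold: float = 0.25) -> bool:
    """Glass's heuristic: if more than ~20-25% of a reused component
    must be revised, rewriting from scratch tends to be more efficient
    and less error-prone than modifying the existing code."""
    if not 0.0 <= modified_fraction <= 1.0:
        raise ValueError("modified_fraction must be between 0 and 1")
    return modified_fraction > threshold

print(prefer_rewrite(0.10))  # False: modest changes, reuse the component
print(prefer_rewrite(0.40))  # True: heavy revision, consider a rewrite
```

In practice you would also weigh code-base quality and test coverage, which is exactly why the comment above hedges on the numbers.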
