Rapid Development: Taming Wild Software Schedules
https://www.amazon.com/Software-Estimation-Demystifying-Deve...
and
https://www.amazon.com/Rapid-Development-Taming-Software-Sch...
Also think: if the business does project A, how much money does that make for us or save for us? From a developer standpoint, I have gotten the business people to tell me that a project's value is a fraction of its cost. You fold that one and free up resources for project B, where the value is a multiple of the cost.
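As a back-of-the-envelope sketch of that comparison (all figures invented for illustration):

```python
# Hypothetical ROI check; every number here is invented.
def net_value(value_to_business, cost_to_build):
    """Return net value; negative means the project destroys money."""
    return value_to_business - cost_to_build

# Project A: the business values it at a fraction of what it costs to build.
project_a = net_value(value_to_business=50_000, cost_to_build=200_000)
# Project B: the other way around.
project_b = net_value(value_to_business=200_000, cost_to_build=50_000)

print("Project A net:", project_a)  # -150000 -> fold it, free up the team
print("Project B net:", project_b)  # 150000 -> do this one instead
```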
Two books help with methods of doing this.
Rapid Development: Taming Wild Software Schedules: https://www.amazon.com/Rapid-Development-Taming-Software-Sch...
Software Estimation: Demystifying the Black Art: https://www.amazon.com/Software-Estimation-Demystifying-Deve...
This likely does mean you'll need to be allowed to deploy automated test cases, develop documentation, and go the whole nine yards, because it's going to be far harder to hit any reasonable deadline if you don't have a known development pipeline.
You'll need to reduce your uncertainty as much as you can to even have a chance - and even then, things will still blindside you and double (or more) the actual vs. the original estimate.
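A toy simulation of that effect (the distributions and probabilities are assumed, purely to illustrate): ordinary tasks come in somewhere between 1x and 3x their estimate, and the occasional task blindsides you and blows up well beyond that.

```python
# Toy model of schedule uncertainty; all parameters are invented.
import random

def simulate_project(n_tasks=20, estimate_per_task=5):
    """Sum up simulated actuals for a project estimated at n_tasks * estimate_per_task."""
    total = 0.0
    for _ in range(n_tasks):
        overrun = random.uniform(1.0, 3.0)  # ordinary estimation error
        if random.random() < 0.1:           # occasional blindside
            overrun *= 2
        total += estimate_per_task * overrun
    return total

random.seed(42)
original = 20 * 5
actual = simulate_project()
print(f"estimated {original} days, simulated actual {actual:.0f} days")
```

Because the per-task overrun never goes below 1x, the simulated actual is always at least the original estimate, which is the asymmetry the comment is pointing at.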
After a number of iterations of this, you converge on a baseline architecture that tries to support future change without being excessively focused on YAGNI. Over-focus on YAGNI often leads to architectures so inflexible that when you DO need something, you can't add it without a demolition crew[1].
People should also remember the context in which YAGNI came up: Kent Beck, Ward Cunningham, Smalltalk, and the C3 project. This particular combination of people, project, and process made YAGNI feasible. And the C3 project, which spawned XP, was not as successful as folklore would have it[2].
It's not always so. Imagine if YAGNI had been the focus of Roy Fielding and the HTTP specification, which has lasted remarkably well precisely because its architecture allows for future change.
Scrum and other processes that claim adherence to the Agile Manifesto work best in certain contexts, where code turnaround can be fast, mistakes do not take the company down and fixes can be redeployed quickly and easily, the requirements are not well-understood and change rapidly, and the architecture is not overly complex.
Many other projects don't fit this model, and people seem to think that so-called "Agile" (which even the creators of the Manifesto complain is not a noun), and mostly Scrum, is the sole exemplar of productivity.
The fact is that there are many hybrid development processes, described by Steve McConnell[3] long before the word "agile" became a synonym for "success", that may be more suitable to projects that have different dynamics.
An example of such a project could be one that does have well-defined requirements (e.g. implementing something based on an international standard) and will suffer from a pure "User Story" approach.
Let's be much more flexible about how we decide to develop, and accept that you need to tailor the approach to the nature of the project, based on risk, longevity, and other factors, and not on dogma.
And let's not underplay the extreme importance of a well-thought-out architecture in all but the most trivial of projects.
[1]: https://www.amazon.com/Rapid-Development-Taming-Software-Sch...
https://www.amazon.com/Rapid-Development-Taming-Software-Sch...
for the engineering and peopleware considerations, which is maybe 1/3 of "right".
Another 1/3 of "right" is the path from conceptual design to database modelling and realizing operations on the database from code. If this part is well planned, the code almost writes itself and the customer can always be right, because the answer to "can you do this small thing?" is always "yes!"
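One way to picture "the code almost writes itself" is deriving the routine operations directly from the data model. A minimal sketch, with invented table and column names, generating parameterized CRUD SQL from a single table description:

```python
# Sketch: derive CRUD statements from one table description.
# Table/column names are invented; a real system would also handle types,
# keys, and relationships from the conceptual design.
def make_crud(table, columns):
    cols = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    return {
        "create": f"INSERT INTO {table} ({cols}) VALUES ({placeholders})",
        "read":   f"SELECT {cols} FROM {table} WHERE id = ?",
        "update": f"UPDATE {table} SET "
                  + ", ".join(f"{c} = ?" for c in columns)
                  + " WHERE id = ?",
        "delete": f"DELETE FROM {table} WHERE id = ?",
    }

sql = make_crud("customers", ["name", "email"])
print(sql["create"])  # INSERT INTO customers (name, email) VALUES (?, ?)
```

When the model is right, answering "can you do this small thing?" is often just another row in the table description rather than new bespoke code.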
The other 1/3 of "right" is the content of the computer science curriculum. Some of this is practically math, such as combinatorics and algorithm analysis. You would also have to take some classes in areas such as compilers, computer architecture, operating systems, etc. For the average person who wants transferable skills, I say go for compiler construction, because small simple compilers are useful and the methods used in compiler construction are useful for other kinds of programs. Also, compilers interact with the processor and operating system, so you can learn some of that by learning compilers.
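To make "small simple compilers are useful" concrete, here is a deliberately tiny sketch: a recursive-descent parser that compiles infix arithmetic to a stack-machine program and then runs it. It handles only integers, the four operators, and parentheses; error handling is omitted on purpose.

```python
# Toy compiler: infix arithmetic -> stack-machine code -> result.
import re

def tokenize(src):
    return re.findall(r"\d+|[+\-*/()]", src)

def parse(tokens):
    # Grammar: expr := term (('+'|'-') term)* ; term := factor (('*'|'/') factor)*
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def factor():
        nonlocal pos
        tok = tokens[pos]; pos += 1
        if tok == "(":
            code = expr()
            pos += 1  # skip ')'
            return code
        return [("push", int(tok))]
    def term():
        nonlocal pos
        code = factor()
        while peek() in ("*", "/"):
            op = tokens[pos]; pos += 1
            code += factor() + [("op", op)]
        return code
    def expr():
        nonlocal pos
        code = term()
        while peek() in ("+", "-"):
            op = tokens[pos]; pos += 1
            code += term() + [("op", op)]
        return code
    return expr()

def run(program):
    # Execute the compiled stack-machine program.
    stack = []
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a // b}
    for kind, arg in program:
        if kind == "push":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[arg](a, b))
    return stack[0]

print(run(parse(tokenize("2 + 3 * (4 - 1)"))))  # 11
```

The same tokenize/parse/evaluate shape shows up in config readers, query languages, and template engines, which is why the technique transfers so well.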
Another path that gets closer to the metal is to do some embedded development; for instance, program a microcontroller to talk to the computer in your car. Operating systems for tiny machines are all the rage these days and are easy to learn because they are themselves tiny.
https://www.amazon.com/Rapid-Development-Taming-Software-Sch...
"Point the boat in the right direction and row" is the dominant paradigm in the industry.
The concept of "technical debt" is, I think, harmful because it is a two-word phrase that stops thought. In the real world, if you want to take on commercial debt, the bank and/or bondholders and originators are going to want to see a detailed financial analysis indicating that they will get their money back.
Probably 80% of effort on software is maintenance, but it is rare for any project to start out thinking about the cost of maintenance.
A good phrase would be "least cost development", which is really the idea behind the book "Rapid Development":
https://www.amazon.com/Rapid-Development-Taming-Software-Sch...
The short of it is that working to a realistic plan, with the right amount and kind of planning, is much more "rapid" than the typical "let's point the boat in the right direction and row for a while" strategy that is rooted in wishful thinking.
That said, there is often some simple strategy that gets 70% of the answers right, and it is correct to get that into place (at whatever scale is required) and then think about developing a strategy that gets another 7% of the remainder right, then 3%, and so on. It could be very dangerous to start from the other end by implementing strategies that add 1% or 0.1% of gain. That is, "pick the low-hanging fruit", but with a plan to pick the rest of it!
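A quick numeric sketch of that layering, reading "7% then 3%" as fractions of what each earlier strategy left unsolved (one plausible reading of the comment):

```python
# Each successive strategy solves a share of the *remaining* problem.
def coverage(shares):
    covered = 0.0
    for s in shares:
        covered += (1.0 - covered) * s
    return covered

# 70% first, then 7% of what's left, then 3% of what's left after that.
print(round(coverage([0.70, 0.07, 0.03]) * 100, 1))  # 72.9
```

The gains shrink fast, which is exactly why starting from the 0.1% end is so dangerous: you pay full engineering cost for almost no coverage.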
Ok, it's a little more complex than that. In project management people talk about the "triple constraints" of budget, time, and scope.
The relationship between budget and time is not a simple tradeoff. To some extent you can accelerate a software project by increasing the budget, but that extent is limited: maybe you can accelerate the schedule by 30% relative to a "least cost" plan.
Fred Brooks learned these limitations the hard way in the 1960s, and he wrote The Mythical Man-Month so you don't have to! One problem is that if you add more people to a project, it takes time and attention to onboard them, time that could otherwise be used to get the project done.
A counter to that is that sometimes buying hardware, software, or services can greatly accelerate the project. For instance, if you are training neural networks on a MacBook, it is probably worth every penny to get a real desktop PC and put a 1080 Ti graphics card in it, or to spend some money on cloud computing.
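The buy-vs-rent side of that is simple arithmetic. A hedged sketch with invented prices (real GPU and cloud prices vary widely):

```python
# All prices invented for illustration; check current hardware/cloud rates.
def breakeven_hours(hardware_cost, cloud_rate_per_hour):
    """GPU-hours of use after which owning beats renting."""
    return hardware_cost / cloud_rate_per_hour

hours = breakeven_hours(hardware_cost=2000.0, cloud_rate_per_hour=0.90)
print(f"break-even after ~{hours:.0f} GPU-hours")  # ~2222 hours, roughly 3 months of 24/7 use
```

Either way, the point stands: relative to developer salaries, the hardware is cheap.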
Many managers look at cost as a function of the deadline; that is, they see the cost of the project as a function of the time you are tied up doing it, so if they can compress the deadline, the cost goes down (or so they think).
That leads to the "phony deadline", which has no real basis. One problem is that, setting out without a realistic plan, you are likely to make mistakes that will draw out the project, add to costs, and possibly make the project fail.
Most of what I say above is laid out in more detail here:
https://www.amazon.com/Rapid-Development-Taming-Software-Sch...
The flip side of that is the hard deadline, where the job might as well not be done if it is not done on time; for instance, you need to get a grant application in before a certain day, or you are putting together a demo to show at a trade show on a particular date.
It is important to understand the actual nature of the deadlines you are up against, what flexibility you have, what impact being late has on the business, etc.
That leaves "scope" as the area with wiggle room. Probably there are some features of the project that can be dropped or modified, and management may be able to hit the deadline by dropping features it can afford to drop. This is one big advantage of "agile" methods: if you have something that sorta-kinda works at the 25% mark of the project, and then you hit the deadline with the most important 80% of the functionality, you are doing better than most people.
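Trading scope for schedule can be sketched as a greedy selection: keep the highest value-per-day features until the deadline budget is spent. All feature names and figures below are invented, and greedy selection is only an approximation of the true knapsack optimum, but it captures the idea:

```python
# Sketch of scope-for-schedule tradeoff; features and numbers are invented.
def fit_scope(features, days_available):
    """features: list of (name, value, days). Returns names kept within budget."""
    kept, used = [], 0
    # Highest value-per-day first (greedy, not guaranteed optimal).
    for name, value, days in sorted(features, key=lambda f: f[1] / f[2], reverse=True):
        if used + days <= days_available:
            kept.append(name)
            used += days
    return kept

backlog = [("login", 10, 2), ("reports", 8, 5), ("export", 3, 1), ("themes", 1, 4)]
print(fit_scope(backlog, days_available=8))  # ['login', 'export', 'reports']
```

The low-value "themes" feature is the one that falls off the deadline, which is the whole point of negotiating scope instead of faking the schedule.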
https://www.amazon.com/Software-Estimation-Demystifying-Deve...
For someone who "jumps into the code" and "design[s] as they go", I would say writing anything, including a functional spec, is better than your current method. A great deal of time and effort can be saved if you take the time to think it through and design it before you even start.
You'll find that most places will have some variation of these three documents. The functional spec can be lumped into the design document.
I'd recommend reading Rapid Development if you're not convinced. You truly can get work done faster if you take more time to plan and design.
These two are great resources on SPM and methods therein:
Software Project Survival Guide - Steve McConnell
Rapid Development - Steve McConnell
This is a good resource on Scrum, an agile method of development:
Agile Software Development with Scrum
If you haven't read Rapid Development, I highly recommend it. What makes it super pertinent now is how it leaves off just as it should start talking about Agile. So it gives you a thorough background of "how we got here". When you read the sections on iterative development and iterative prototyping, it will make any additional research you do on Agile that much more meaningful.