In the first minutes of navigating an unfamiliar code base that you have to build upon, there is always a certain allure to restarting from null. If only you could start with an empty module, nothing but some specifications, a few months’ time, and lots of pizza and coffee. You would come out at the other end with a super-clean, neatly designed code base, everything perfectly tested and even better documented (if your team does documentation at all). Then the changes you are struggling with, which would take months to implement on the current code base anyway, would become completely trivial.
Some projects deserve a rewrite, no matter how you look at it: if the code is fundamentally wrong because its author did not know the language and tried to write C in Python, for example. But let’s be honest: the same sentiment creeps in even when working on a medium-sized project whose complexity comes from the problem domain, not from zany design decisions or bad coding style. The attraction of what could be called an editor tabula rasa is enormous, and it derives from shedding the load of earlier design decisions and the difficult bugs. The new bugs and design dead ends are not there to be seen yet, of course; this ‘delayed negative reinforcement’ has led to many a rewrite, not always with positive results.
Most software development methodologies feed the charm of the clean slate by taking the missing functionality as their starting point. There is a gap where a piece of software or a feature ought to be, and the programmer’s job is to fill that hole. Fortunately, the myth of a perfect specification for this hole has been discarded by what is now called agile, but even in agile project management, the guiding principle is developing a product from the ground up, given a prioritized backlog and a host of other people to ask whether they are happy with the direction the work is taking. The programmer’s job is therefore to pile up code that builds functionality, making the existing pile bigger and, by definition, more valuable.
The problem inherent in the daily work of the programmer, however, is a different one: you have to improve the platform you are moving on while at the same time moving forward. Improving-while-moving is necessary whenever what you work on serves as a tool for the future: libraries that become infrastructure, deployment tooling, code and project management tools, and so on. I ran into a great analogy for this mode of working in a philosophy paper I had to read back at university. Otto Neurath, a philosopher among other things, argued against Carnap’s claim that we could start from primitive statements of observation (‘protocol sentences’) and build all of science on these, and made the following comparison:
There is no way of taking conclusively established pure protocol sentences as the starting point of the sciences. No tabula rasa exists. We are like sailors who must rebuild their ship on the open sea, never able to dismantle it in dry-dock and to reconstruct it there out of the best materials. Only the metaphysical elements can be allowed to vanish without trace. Vague linguistic conglomerations always remain in one way or another as components of the ship.
The ship you are sailing on has to stay afloat. There are bills to be paid: a web service that has to stay available, or a desktop program that needs a new version to generate revenue for the next quarter. So you have to keep things working while at the same time replacing parts and improving quality. You can’t just pull the ship into the dock for a few months and revamp everything at once; that would amount to sinking the ship.
The kind of improvement you have to make goes a level higher than the purely technical, because, as any scrum coach will tell you, it’s about improving the process. Both as a team member and as an individual, inspecting and adapting the whole process, including the technologies you use, is the order of the day. While you are working on new features, you also have to work on how your team functions, and on how you as an individual can write better software. This means learning new things, trying out different technologies, and adopting those that promise more value for you.
There is a second sense to this higher-level improvement, and it becomes visible if you think of the parallel to Neurath’s topic, how science is made. The abstractions you make at various levels when building software are also tools, and part of your platform. As your code base mutates with the requirements, your abstractions will have to change too, and the worst thing you can do is let them freeze and dictate how you must bend over backwards to fit their frame, instead of bending and remaking the abstractions yourself. These abstractions are never strict models that translate one-to-one into code, however; they are ‘vague linguistic conglomerations’, as Neurath calls them, and talking about them requires a different vocabulary than code does. Object-oriented programming was an attempt to bring these conglomerations into the domain of a programming language, decreasing the distance between the nouns and verbs of natural language and those of code.
Whoever takes rebuilding-while-floating seriously should obviously dedicate more time to refactoring, and much more attention to what agile development methods call technical debt. Since I count abstractions among the products and tools of software development, it follows that debt from past decisions is not limited to the technical sphere. More important than getting rid of this debt, however, is a mental shift towards thinking of software development as building tools for the future. One trick many developers use, for example, is writing the code that uses an API first, and then writing the API from this specification; this way, you ensure that you are in the mindset of building something that won’t only work, but will also be used by others to build further.
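This consumer-first trick can be sketched concretely. In the following minimal Python sketch, all names (`ReportStore`, `save`, `latest`) are hypothetical, invented for this illustration: the client code is written first, as the specification, and the API is then implemented to satisfy it.

```python
# Step 1: write the code you wish you could write. This is the
# 'specification'; it exists before the API does, and forces you to
# design the API from the consumer's point of view.
def client_code(store):
    store.save("weekly", {"visits": 1024})
    report = store.latest("weekly")
    return report["visits"]


# Step 2: implement just enough API to make the client code work.
class ReportStore:
    def __init__(self):
        self._reports = {}

    def save(self, name, data):
        # Keep a history per report name, so 'latest' is meaningful.
        self._reports.setdefault(name, []).append(data)

    def latest(self, name):
        return self._reports[name][-1]


print(client_code(ReportStore()))  # → 1024
```

The point is the order of the two steps, not the particular classes: because the calling code came first, the API ends up shaped by how it will actually be used rather than by its internal data structures.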