The third decade of programming: Just what *is* Productivity?

by Adam Zachary Wasserman, April 4th, 2018

Part two of Chapter Three in a very personal history of programming. Read Part one here.

Productivity

The word productivity comes with a lot of baggage. For many years programmer productivity was measured in lines of code per day/week/month. There are several flaws in this method.

Let’s take a small game program called Life and have two programmers of different skill levels write it in two languages: C and assembly. In C this is a 57-line program. The assembly language version is 97 lines. Let’s say that the C programmer takes sixty seconds per line (which is actually very fast) and that the assembly programmer writes code even faster, at the rate of one line every 40 seconds. It will take about 65 minutes to create the assembly language version versus 57 minutes for the C version: roughly 14% more time, even though the assembly programmer spends a third less time on each line and so produces 50% more lines of code per hour.
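The arithmetic is worth spelling out. Here is a minimal sketch in C that checks the numbers above; the line counts and per-line speeds are the same illustrative figures from the example, nothing more:

```c
#include <stdio.h>

int main(void) {
    double c_minutes   = 57 * 60 / 60.0;   /* 57 lines at 60 s/line = 57 minutes  */
    double asm_minutes = 97 * 40 / 60.0;   /* 97 lines at 40 s/line ~64.7 minutes */

    printf("C:        %.1f minutes\n", c_minutes);
    printf("Assembly: %.1f minutes\n", asm_minutes);
    printf("Extra time for assembly: %.1f%%\n",
           (asm_minutes - c_minutes) / c_minutes * 100.0);
    /* prints 13.5 -- roughly 14% once the minutes are rounded */
    return 0;
}
```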

Can we say that the assembly language programmer was more productive? If we measure by lines of code per day, she definitely was. If we measure by time spent to achieve results, no. Things get even less clear when you consider that the two programmers may be making different salaries: a programmer who writes fewer lines might also be paid less, and therefore *cost* less “per line”.

Can we at least compare two programmers making the same salary, working in the same language? Let us consider this case: two programmers write the game of Life in C. Captain Slow, as we shall call him, takes an entire day to write 97 lines of highly optimized code that takes 100 kilobytes of disk and loads in less than a second. Captain Showboat takes the same day but manages to find a way to use 970 lines of code where 97 would do just fine, producing a program that takes up one megabyte of disk and 10 seconds to load.

Now tell me: who is the more productive programmer?

More code does not automatically make a program better. Do we really want to incentivize people to write more lines than necessary? Bill Gates famously derided measuring productivity by lines of code, calling it a race “to build the world’s heaviest airplane”.¹ Later on in the series, I’ll quote Larry Wall on the same basic idea.

Another favorite unit of measure is “function points”, a conceptual unit that is supposed to represent a discrete piece of functionality for the end user. For example, the ability to log in would be one function point (FP). The ability to change your profile photo would be another. In the real world, almost nobody counts function points or has the ability to track how much programmer time was spent on a given FP. Therefore, almost everybody who measures productivity in FP/day “cheats” by using industry-standard conversion factors that say: *this* language typically uses this many lines of code per FP, and that language typically uses that many. Then you take the lines of code and divide by the conversion factor, which turns them into function points. Obviously, this is no better than the first method. There are two or three other methods, none of them any better.
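To see how circular this “backfiring” conversion is, here is a minimal sketch; both the project size and the lines-per-FP factor are hypothetical figures of the kind the industry tables publish, not measured values:

```c
#include <stdio.h>

/* "Backfiring": deriving function points from lines of code.
   Both numbers below are hypothetical, for illustration only. */
int main(void) {
    double lines_of_c   = 2560.0;   /* lines of C in some imaginary project */
    double loc_per_fp_c = 128.0;    /* illustrative table factor for C      */

    /* "Function points" computed this way are just lines of code
       divided by a constant: the same flawed metric in disguise. */
    printf("Function points: %.0f\n", lines_of_c / loc_per_fp_c);   /* 20 */
    return 0;
}
```

Divide by a constant and the units change, but the incentive to pad the line count stays exactly the same.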

Therefore, when I write about productivity in software development, I am appealing to common sense, not metrics. I consider an improvement in productivity to be anything that helps get the software product successfully completed faster or better without costing more.

Of novices and masters

My reason for saying that C could possibly have set productivity back by a few decades is deeply connected to the central thesis of my book: that modern programming is artisanal and cannot succeed without master programmers. My absolute favorite programming joke (from the Jargon File, of course) is written in the style of a koan, a riddle intended to help a Zen monk achieve enlightenment.

A novice was trying to fix a broken Lisp machine by turning the power off and on.

Knight, seeing what the student was doing, spoke sternly: “You cannot fix a machine by just power-cycling it with no understanding of what is going wrong”.

Knight turned the machine off and on.

The machine worked.

Tom Knight, one of the Lisp² machine’s principal designers, knew what he was doing when he power-cycled the machine, which is why it worked for him and not for the novice, who had “no understanding of what is going wrong”.

C is like that; it is almost assembly language. It is powerful, and it is dangerous. It is magic. In the hands of a master programmer, the proper incantations in C can be used to write an operating system, another language, or to operate devices that our very lives depend upon, such as the anti-lock brakes in a car. In the hands of people not quite sure of what they are doing, the magic spell could destroy a piece of hardware, wipe out data, or leave a subtle bug that will not be discovered for a long time and will be almost impossible to find once the undesired side effects are noticed.

C brought programming back to the days of Mel (see the previous chapter in this series), whose brilliant trickery would have been impossible in FORTRAN, COBOL, or ALGOL. It would also be impossible in the other new languages becoming available at the time: BASIC, Pascal, Forth, and Smalltalk. Yet in C you have only to point to a memory address and away you go. Just like Mel.
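To make that concrete, here is a minimal sketch of both faces of the trick; the device register address is hypothetical, chosen only for illustration:

```c
#include <stdint.h>

/* "Point to a memory address and away you go." The address below is
   hypothetical; on real hardware it would come from the datasheet. */
#define UART_TX (*(volatile uint8_t *)0x4000C000u)

/* In a master's hands: send a byte straight out a serial port. */
void uart_putc(uint8_t c) {
    UART_TX = c;
}

/* In careless hands: the very same freedom. One step past the end of a
   16-byte buffer silently corrupts whatever lives next door. */
void subtle_bug(char buf[16]) {
    buf[16] = '\0';   /* off-by-one write: undefined behavior in C */
}
```

The compiler accepts both functions without complaint; only the programmer knows which one is an incantation and which one is a time bomb.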

And here’s the thing:

If every programmer were as good as Mel, this kind of programming would be just lovely. We would all be using optimized code that ran super-fast and didn’t make mistakes. But the unfortunate reality is that not every programmer is that good. And just as C was released, things were about to get a lot worse.

Barbarians at the gate

Up until now, because of the cost of computers, there had been mostly two kinds of programmers: a) professionals working for government or large corporations, and b) academics, often postgraduates, using programming for research. Both kinds were almost exclusively self-taught, very smart, and almost always possessed of a real affinity or talent for coding.

But now, in the seventies, two important things changed. First, academic institutions were creating curricula specifically for programming and designing new “teaching languages”, and, most importantly, undergraduates (and the occasional lucky high schooler) were being given time-sharing accounts specifically for the purpose of learning how to program.

The second thing that changed was the advent of personal computers, first as kits assembled at home, then later as attractively packaged consumer goods that sold for the same price as an old used car. It was going to get a lot easier for those who did have an affinity to teach themselves. The first mass-produced computer kit, the Altair 8800, shipped 5,000 units in 1975, its first year of production. In three years (1977–1979) the two top personal computers — the Tandy TRS-80³ and the Apple II — sold more than 150,000 units. That is approximately how many computers existed in the entire world at the beginning of the decade. The BASIC language was just about the only software that was included.

It just got a heck of a lot easier to get anywhere near a computer to begin with.

Hundreds of thousands of new programmers were either being taught or teaching themselves, and they would soon be unleashed into a world that was hungry for computer programs, dramatically swelling the ranks of the profession.

It was no longer so very hard to become a programmer, and programming was becoming known as a promising career that one could train for. Once the object of passion and care, programming had started the journey to becoming a lucrative career that would eventually attract all sorts of people who, with no particular passion or affinity for programming, were only in it for the money.

Robert Martin, of whom I will speak later in this book, is outspoken about an inexplicable shift he has seen in his lifetime: the cohort of programmers was fairly gender-balanced in the fifties and sixties, and then very quickly turned into a male-dominated field in the seventies.

It occurs to me that a plausible explanation for the disparity could be the shift of programming from being an obscure vocation that only attracted people based on merit and passion, to being an attractive and lucrative career that was becoming as respected as electrical engineering.

When programming was an obscure vocation, there was little competition. I think it is possible that once programming became a lucrative and desirable career, competition turned fierce, and in a male-dominated society, institutional bias ensured that the specialized education and choice careers went to men.

Of myth and man

The last thing I want to tell you about from the seventies is the publication in 1975 of The Mythical Man-Month. This oft-quoted book by a former IBM manager named Frederick P. Brooks Jr. describes the lessons he learned ten years earlier while managing one of the biggest software development projects to that date: OS/360, the operating system for the IBM System/360.

It is astonishing to me that in re-reading it now, I cannot find a single analysis or observation that is not as pertinent and trenchant today as it was over forty years ago when this book of essays first came out. The Mythical Man-Month was a seminal work. It is impossible to overstate the impact and influence it has had on the entire computing industry. Although I think no one at the time recognized it, it was the first comprehensive software development lifecycle (SDLC) methodology. Not only did it describe the challenges faced in systems programming, it prescribed (in over 150 pages):

  • the phases of SDLC
  • the allocation of time for each cycle
  • how to estimate
  • how to staff
  • how to structure the organization
  • how to manage the design of the program
  • how to plan iteration
  • documentation and governance
  • communications protocols
  • how to measure productivity
  • some special concerns around code design
  • deliverables and artifacts
  • tools and frameworks
  • how to debug and plan releases
  • how to manage the process


It is true that a version of this step-by-step sequence of phases had appeared in W.W. Royce’s 1970 paper, Managing the Development of Large Software Systems, but contrary to what many people believe, Royce was not advocating its use. In fact, he issued warnings about using it. He was simply describing the common practice of the time and some of the problems associated with it, and, as Brooks did five years later, he made some proposals on how to mitigate the issues.

However, Royce’s recommendations filled ten pages, compared to the 150 pages of detailed prescriptions that are in The Mythical Man-Month. Royce’s paper is not so much a methodology as a call to action. That is why I credit Brooks with the first written methodology.

I’ll talk more about methodologies a few chapters from now, but not just yet. There are still a few stories to tell about languages in the eighties.

“Do NOT simply read the instructions in here without understanding what they do.”

~ The configuration file that comes bundled with the Apache webserver


[1] “The Physicist”. Wired, 1995, https://www.wired.com/1995/09/myhrvold/.

[2] You will probably remember I mentioned LISP in Chapter One, an influential computer language in use since the late 1950s.

[3] My little brother was one of those people. I, a musician, was interested in his TRS-80 for all of ten minutes one day. If you had told either one of us that day that he would become a professional artist, and I would be a computer geek, we both would have thought you were completely crazy.