In the eighties, a new kind of computer started to become common: the workstation.¹ Workstations were powerful professional versions of personal computers that cost as much as a new car; consumer personal computers only cost as much as a used car. There were SGIs used for engineering tasks such as digital animation, Apollos for newspaper layout, Suns running computing centers and computer-aided design (CAD), and the NeXT workstation, which became the world’s first web server. Here’s an interesting fact: from a programmer’s perspective, the iPhone is the direct descendant of NeXT.
Many of the programs for these new computers were written in new languages. For the first time since COBOL, languages were going to have a greater effect on productivity than hardware or the OS. Of the many languages that emerged during this decade (at least 65 of them), three developed for these workstations left a lasting impact on programming for all computers.
Smalltalk because it profoundly influenced almost every language that came after it.
C++ (in part two of this article) because even though it did not change the craft of programming as much as the other two, it is so universal it has to be mentioned.
And finally, Objective-C (in part two of this article) because it is a story of how the class libraries that come packaged with a language can dramatically improve productivity and be so flexible that they can power hundreds of generations of devices over decades.
Smalltalk was actually first written in 1972 but was kept within the Xerox Palo Alto Research Center (PARC) for eight years. In 1980, a new version was released and distributed to four companies for peer review and implementation: Hewlett-Packard, Digital Equipment Corp, Tektronix, and Apple. I estimate that Smalltalk ranks equal to C in terms of importance and impact upon computers and programming productivity.
One of the many ways in which it was influential was that it came with a development environment. Integrated Development Environments (IDEs) are commonplace today, but prior to Smalltalk most programmers wrote code with a text editor (something that is making a bit of a comeback in recent years). Smalltalk came with a special tool that not only organized the code in a repository, but also helped you see which bits of code were connected to which others, and allowed examination of the code as it ran. Before this, programmers had to embed debugging messages (sent to themselves while the program was running) to see how far the program executed before crashing. They had to use hardware features to step through code one instruction at a time, and would take snapshots of memory called core dumps.
Smalltalk was the first language to include a development environment that supported live examination of the program as it ran, which meant that developers could instantly see all the implications of any code they wrote. Want to know if an index starts at zero or if it starts at one? Just try it out — “ask the compiler”. It is faster than looking it up. Smalltalk was the most interactive programming language ever to appear.
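A modern reader can get a faint echo of that interactivity from any language with an interactive interpreter. As an illustration (in Python rather than Smalltalk), here is the indexing question above answered by simply trying it:

```python
# A small echo of Smalltalk's "ask the compiler": instead of looking up
# whether indexing starts at zero or one, just try it in the interpreter.
letters = ["a", "b", "c"]
print(letters[0])  # prints a — so indexing starts at zero
```

Smalltalk went much further than this, letting you inspect and modify live objects in a running program, but the spirit is the same: the fastest documentation is the running system itself.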
Many modern languages such as Java would be completely unusable without some of the IDE concepts that Smalltalk pioneered.
Smalltalk was also the first general-purpose Object Oriented Programming (OOP) language. Almost every language in use today uses objects. Smalltalk showed us how.
Objects were seen as a way to get control of code reuse. Since the earliest days of programming, coders had reused favorite bits of code to implement routines and functions common to most programs. As ever, this reuse was highly idiosyncratic, varying in every way from one programmer to another. Worse, bug fixes and enhancements made to one copy were not available to all the other programs into which the code had been copied.
Objects work through a “class library”. The classes are templates or blueprints for objects, and the main code of the program uses them to “instantiate”, or build, objects. Since the program’s code is separate from the class library, a class can be updated with a bug fix; any program that uses that library can then be recompiled and the fix is automatically incorporated, without rewriting any of the program code. At least in theory.
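The separation described above can be sketched in modern terms. This is purely illustrative Python (the class and names are invented, not from Smalltalk); the comments mark where the library code would live separately from the program code:

```python
# --- "class library" side: the blueprint lives here, not in programs ---
class Rectangle:
    """A template from which programs instantiate rectangle objects."""
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def area(self):
        # A bug fix made here reaches every program that uses the
        # library, without anyone rewriting their program code.
        return self.width * self.height

# --- "program" side: instantiate objects from the library's blueprint ---
desk = Rectangle(120, 60)
door = Rectangle(90, 200)
print(desk.area())  # 7200
print(door.area())  # 18000
```

In a real project the class would sit in its own module or compiled library, so a fix there benefits every program that links against it on the next build.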
Almost all popular languages today use this obvious improvement over cutting and pasting scraps of code from a notebook.
Objects are an abstraction layer, which is what you call it when underlying complexity is hidden behind a model. A good example of that is television channels. Channels are abstractions of radio frequencies. When you tune a television to “channel 2” what you are actually doing is adjusting the television’s circuitry so that it picks up the radio frequencies between 54 and 60 megahertz and ignores all other radio signals.
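The channel example can be made concrete with a tiny sketch (illustrative Python; the table uses the real US VHF low-band frequencies, but the function names are invented):

```python
# A TV "channel" is an abstraction over radio frequencies: the viewer
# thinks in channel numbers, the circuitry thinks in megahertz.
CHANNEL_BAND_MHZ = {
    2: (54, 60),   # "channel 2" really means the 54-60 MHz band
    3: (60, 66),
    4: (66, 72),
}

def tune(channel):
    """Translate the abstraction (a channel) into the underlying reality."""
    low, high = CHANNEL_BAND_MHZ[channel]
    return f"filtering {low}-{high} MHz, ignoring everything else"

print(tune(2))  # filtering 54-60 MHz, ignoring everything else
```

The viewer never needs to know the table exists; that is the whole point of an abstraction layer.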
FORTRAN and other languages are abstractions of assembly code, and assembly code is an abstraction of machine code. Object-oriented languages added a third level of abstraction.
Modern software uses many levels of abstraction. When you interact with a web page you are dealing with at least a dozen layers of abstraction. The “button” you click is really a region of the screen which is caught by the operating system, passed to the web browser, which compares it to a pixel map in memory which was rendered from a “Document Object Model” which was constructed by interpreting HTML (and CSS and JavaScript), which calls HTTP functions that in turn call TCP/IP protocols, which deliver messages to and responses from an “Application Programming Interface” which is usually built from one or more “frameworks” which are abstractions of the language they are written in, which are themselves abstractions of assembly language, which is an abstraction of machine code.
And this is highly simplified. The reality is that there are actually many more layers involved.
However, in the eighties, programmers were still working pretty “close to the metal”. There were only the operating system abstraction and the language abstraction(s). And good programmers were aware of what was below. Good programmers, even if they did not write in assembly code, knew how to read it, and knew how their code translated to assembly — at least in areas critical for performance.
Apple Computer was strongly influenced by the work being done at Xerox PARC. Apple had made a deal letting Xerox buy 100,000 shares of Apple stock for $1M (worth $22M one year later, and worth $6.7B today) in exchange for two one-day demonstrations of the technology at PARC in 1979. Xerox did okay.
Unlike the Xerox management, Apple management was intent upon putting these ideas into the hands of customers. Naturally some of the best minds at Xerox eventually came to work for Apple, and ideas came with them.
Although Apple had access to Smalltalk, it used Pascal as its official development language, and so it ended up extending Pascal to include objects. This was Object Pascal. Apple then created an object library, or framework, called MacApp, which was intended to deliver major productivity gains for developers. It missed the mark: it had too many levels of inheritance and was too complex. Instead of simplifying development, it made it more complex in many ways. Which is why you have probably never heard of it.
[1] If a train station is where the train stops, what’s a work station? (typical programmer humor)
This article is an excerpt from my upcoming book The Chaos Factory, which explains why most companies and governments can’t write software that “just works”, and how it can be fixed.