Back at the dawn of the modern computing era, one Donald Knuth published a book titled "The Art of Computer Programming" that soon became a bible of software engineering and shaped generations of computer programmers to come. The book, intended as a compendium of computer algorithms, also put forward a very important idea: software engineering is a form of art. At the time, that was the prevailing point of view.
That was then.
Fast forward to the "mobile first", "internet of everything" interconnected ultra-computerized era of today. Computers have changed a lot, but what has changed even more is the perception of computer programming as a profession. Once considered a form of applied art, programming today is nothing more than a relatively well-paying job, one that requires sitting in a large dimly lit room and staring at a laptop screen for hours. Long gone are the days of programming competitions, replaced by "hackathons" that usually boil down to insanely long work days thinly veiled as a "party". Forget about finding the most elegant solution to the problem! The motto of the day is "ship it now, fix it later". Cheaper by the dozen.
So how did we get here?
The advent of mobile brought about a lot of revolutionary changes (well documented and discussed all over the media), but one of the major paradigm shifts that came along with it gets very little attention. The mobile era put the concept of disposable technology front and center and conditioned the crowds to expect crashes and failures. Before, errors and bugs were grudgingly accepted. Now, they are expected. Nobody blinks when an app crashes - we just start it again. It's perfectly fine when the phone (or laptop, for that matter) freezes - we just reboot it. With ever-faster delivery cycles, code has become disposable, and so has the approach to programming in general.
Writing quality code (never mind "elegant" - just stable, readable code) is no longer practical and is no longer expected. Who cares how good or bad your code is when it will be thrown away in a year or two anyway? Keenly aware of the cultural shift, most major tech companies are slashing their testing departments and passing the buck to developers and end users.
How do the people who work in the industry feel about all that? How does the "technical talent" relate to the fact that their work is becoming more and more mechanical, less and less creative? A few years ago, PayScale.com published a comprehensive job satisfaction survey covering well over a hundred different industries. How did computer folks fare? Take a look:
The first number is the median pay, followed by the percentage of people who see their job as highly meaningful. The last column is the share of people who are satisfied with their job.
Fascinating, isn’t it?
The computer "talent" group came in well below bartenders. Below waiters and waitresses. And slightly above dishwashers. Only 29% of programmers see their job as meaningful. The vast majority see it as a waste of time.
Houston, we have a problem.
A number of problems, in fact. Those problems have nothing to do with technology and everything to do with the human side of the profession. As computer technology became disposable, computer professionals became disposable too. And they are keenly aware of that. The internet opened the borders and brought about global competition where the "talent" is measured in cents and most gigs are awarded based on the price point. Cheaper by the dozen. There is no point in developing a unique vision and style. There is no point in trying to be the best. Employers want a cheap disposable commodity, and they want a lot of it.
And you know what? They are getting it.