When programming became a chore

Smári McCarthy
Published in ITNEXT · Mar 27, 2020 · 13 min read

A conversation I had with one of my teachers in ninth grade still comes back to me occasionally. We were discussing how to interact with people, and how to do so without coming off as an arrogant asshole. This was certainly an issue that warranted discussion. She knew I liked programming, and suggested approaching it as a problem that needed solving. At some point, she suggested the business jargon of “solutions”, having heard that from somebody in the tech industry. “Only corporate drones use that kind of term,” I fired back. “It’s a rejection of creativity.”

While I’d like to believe I’ve learned at least a little bit about not being an arrogant ass in the last twenty years, I still appreciate 15-year-old me’s observation. A few years later, a friend who worked for the company I had been thinking of when I disparaged the corporate drones complained that his work had become an endlessly repetitive exercise in reimplementing CRUD ─ Create, Read, Update, Delete.

Since then a lot has changed. Both the software industry and I have become more mature. For the industry, this means different languages and trends, greater heaps of abstraction layers and more powerful libraries that do the heavy lifting; the exciting parts of software development have become a rare laboratory task, while the day-to-day of most programmers has been reduced to a mind-numbing chore that a friend aptly summarized as “mental health prostitution.”

For me, it means having moved on to a line of work that has more to do with interacting with people than writing code. Still, in my free time, I regularly return to my roots, because for all the faults of the modern software industry and its technological underpinnings, I still enjoy programming, I still consider myself a programmer, and I still care about what direction technology takes.

Does software have a philosophical doctrine?

On some level, what we’re dealing with is a failure of philosophy. As technology has been increasingly commoditized and the economic pressures mounted, a lot of virtues have been tossed out the window in the service of perceived efficiency gains.

These virtues included deep meditation on problems, understanding of the entire system, but also more generally a thirst for knowledge. The culture of software development used to be thick with literary references, puns, off-kilter humor and true artistry ─ it’s rare to happen upon these today, and when one does, it’s a delight that almost certainly comes out of Maker culture, hacker spaces, or from the occasional wizard who’s somehow managed to not get sucked in.

On some level it might be argued that society at large has slipped away from the ideals of Kant’s “Copernican revolution” or even enlightenment values in recent decades. Depending on your worldview, you could attribute this to a lot of factors. But whether you take a Huxleyan “amusing ourselves to death” view, a more positive McLuhanesque view of us simply being too busy exploring the noosphere to care about such contrivances as enlightenment, or perhaps some kind of more Marxist view, ultimately it comes down to the question of to what ends do we seek to develop technology? And to that, I think there’s great wisdom to be found in Edward R. Murrow’s analysis of an earlier technology:

This instrument can teach, it can illuminate; yes, and even it can inspire. But it can do so only to the extent that humans are determined to use it to those ends. Otherwise, it’s nothing but wires and lights in a box. There is a great and perhaps decisive battle to be fought against ignorance, intolerance and indifference.

However, a lack of virtue is hardly a useful description of the problem, and even if it were, it would not provide any roadmap to a solution. When we try to peel back the monotony in software development, it seems that there are a few root causes.

  1. A lot of time is spent on boilerplate plumbing for CRUD operations.
  2. Too much effort is spent wrestling with tool chains.
  3. Faulty abstractions and metaphors are exhausting for humans and computers alike.
  4. Creativity and skill have been usurped by pressure to ship a product.

Each of these deserves some dedicated inspection, but it’s worth stating that in no way are all programmers or all programming jobs dealing with all or even some of these issues. My claim is more that the majority of software developers experience at least some of these issues on a day-to-day basis, and that this is leading to a reduction in motivation, a loss of interest in technical excellence, a sloppy attitude towards technology, and perhaps most perniciously, a sense that as long as it works, it’s fine.

For an absurdist existentialist like myself, this fairly flimsy utilitarian approach leaves a lot to be desired. But it’s not hard to see how a “good enough” doctrine might emerge from a culture where illumination and inspiration are more the product of industrial manufacturing than a rich inner life, and where ignorance, intolerance and indifference are thought so natural that it’s become a recipe for electoral success.

Boilerplate plumbing, or, how Super Mario predicted the Fourth Industrial Revolution

At the bleeding edge of software development, there’s a lot of crazy space ninja shit going on. From speech recognition to automated drug discovery, from neural links to deep learning driven automation, there’s a lot of suave stuff on the horizon.

And yet, the vast majority of software development is chasing bugs, making templates for websites, and wiring up different things that write to, read from, and manipulate data in databases. A not too uncommon task for the modern day programmer is writing something like this:

function api_get_user_profile(request) {
    // Method check, parameter check, lookup, serialize: pure plumbing.
    if (request.method != 'POST') { return http.response(405); }
    if (!('userid' in request.params)) { return http.response(400); }

    var user = db.users.get({ id: request.params['userid'] });
    if (user) {
        return http.response(json.serialize(user.get_public_info()));
    }
    return http.response(404);
}

In this kind of task set, your job is to make a pipeline that fetches data from the database and delivers it to the next component in the chain. Then write the pipeline that receives that data from the API, or what-have-you, and figures out what to do with it. Sometimes you’ll have 2–3 different individual pieces of plumbing before you actually get to do anything with the data. In the majority of situations, you’re just displaying the data, possibly inside form elements, so that it can be modified via other, similar pipelines.
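
The next link in the chain is rarely more inspiring. Here is a minimal sketch of what the receiving end might look like in the browser, assuming for the sake of illustration that the endpoint above is mounted at /api/user_profile and takes its parameters as posted JSON:

async function load_user_profile(userid) {
    // Plumbing, stage two: ask the API for the data the API asked the database for.
    const response = await fetch('/api/user_profile', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ userid: userid })
    });
    if (!response.ok) { throw new Error('profile lookup failed: ' + response.status); }
    const profile = await response.json();

    // Plumbing, stage three: pour each field into a form element so the same
    // data can flow back out through a near-identical "update" pipeline.
    for (const [key, value] of Object.entries(profile)) {
        const input = document.querySelector('[name="' + key + '"]');
        if (input) { input.value = value; }
    }
}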

You are in a maze of twisty passages, all alike.

To be fair, this kind of drudgery isn’t exactly useless ─ it’s just tedious. And increasingly it’s automated in various ways. There are entire frameworks dedicated to making this as painless as possible, although this just reduces the programmer to a curator of pipelines rather than their creator.
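
For a sense of what being a curator looks like, here is a hedged sketch using an Express-style route and the same kind of hypothetical db module as above; most frameworks differ mainly in how much of this they generate for you:

const express = require('express');   // the framework does the heavy lifting
const db = require('./db');           // hypothetical data access module

const app = express();

// The "creative" part of the job, more or less in its entirety:
// map a URL pattern onto a query and a serialization.
app.get('/users/:id', (req, res) => {
    const user = db.users.get({ id: req.params.id });
    if (!user) { return res.status(404).end(); }
    res.json(user.get_public_info());
});

app.listen(3000);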

Perhaps in some way this is what our childhood prepared us for. A great many modern day programmers were raised on games like Super Mario. And while I’m the last person to doubt the entertainment value of discovering time after time that the princess is indeed in another castle, there is a degree to which the game mechanics are acclimatizing us to this drudgery: instead of insisting on complex puzzles, challenging tasks and creativity, you’re weaned on mindlessly weaving through a series of (largely linear) tubes, stomping on the occasional Goombah who gets in your way.

The story we’re currently told is that all the aforementioned crazy space ninja shit is “The Fourth Industrial Revolution”, and that it is going to fundamentally change the way everybody lives their lives. This narrative begs us to be excited to discover that the princess is in another castle, as somewhere, probably in the cooler campuses of Silicon Valley, the future is being created.

Not by you though. Your job, until it gets automated into oblivion by a majestic Go-playing, self-driving, drug-discovering AI, is to continue stomping on Goombahs.

For the vast majority of people in the tech industry, the Fourth Industrial Revolution isn’t a glamorous worldview of technological excellence. It’s fighting with your coffee machine’s internet-enabled bean grinder while your browser decides how many gigabytes of memory it needs to invoke the JavaScript molasses your framework’s documentation site requires in order for you to remember the correct way to verify user permissions.

And at some point we all catch ourselves thinking, “if only I had a tool that did this mindless shit for me.”

The Right Tool for the Right Job

In Shop Class as Soulcraft, Matthew B. Crawford argues for the virtue of understanding one’s tools as a form of deep meditation: “The truth, of course, is that creativity is a by-product of mastery of the sort that is cultivated through long practice. It seems to be built up through submission (think a musician practicing scales, or Einstein learning tensor algebra). Identifying creativity with freedom harmonizes quite well with the culture of the new capitalism, in which the imperative of flexibility precludes dwelling in any task long enough to develop real competence.”

I feel this so deeply, it hurts. As a 36-year-old on my fourth or fifth career so far, I feel like I’ve attained mastery in competent incompetence: adapting my skills to whatever problem is at hand, pattern matching against previous experience, constantly out of my depth treading water, and yet somehow managing to mostly not fuck things up. And I’m not alone.

Modernity requires hyperconnectivity, continuous partial attention, and an infinity of meetings. We live in an era of fast cuts and subliminal messages, overlaid with a soundtrack of speed metal. When I asked on Twitter what programmers find themselves spending their time doing, the responses followed a pattern:

“The task I waste the most time on, is trying to regain focus and head-state after I have lost it.”

and

“In all seriousness, it depends a lot on what stage the project is in. Early on it is a lot of boilerplate and slowly trends towards mostly crying because I don’t understand cache invalidation and multi threaded computing.”

In this environment, we push to modularize our knowledge and abstract it away from us. Make a tool for every occasion, and then instead of struggling with remembering techniques, we remember how to search for the tools we know exist. And so, to figure out whether a number is odd or even, instead of:

number & 1

people do:

const isOdd = require('is-odd');
isOdd(number);

This may seem like a contrived example, but the is-odd package referred to above has 600 thousand weekly downloads. And sure, to the degree it is a contrivance, it’s one that glosses over the shortcomings of JavaScript’s type system. But it’s also a reality that rings true throughout modern software.
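
To be fair to the package’s users, the one-liner really is leakier in JavaScript than it looks, which is presumably part of why the package exists at all. A quick sketch of the failure modes, along with the few lines of validation you could write by hand instead:

// The bitwise trick silently coerces its operand to a 32-bit integer:
1.5 & 1;      // 1, so a number that is neither odd nor even reports "odd"
'7' & 1;      // 1, because strings get coerced to numbers first
NaN & 1;      // 0, so NaN quietly becomes "even"

// A hand-rolled check that fails loudly instead, which is roughly the
// kind of checking such a package presumably does for you:
function isOddByHand(value) {
    if (!Number.isInteger(value)) {
        throw new TypeError('expected an integer, got ' + typeof value);
    }
    return Math.abs(value % 2) === 1;
}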

This has correctly been criticized as cargo cult programming, but I feel that all the snark about NPM-for-everything and Stack Overflow helpers in editors ignores the actual problem, which is that most programmers, and indeed most people, simply don’t have the time or mindspace to actually get to know their tools properly.

It doesn’t help that most of the tools don’t actually do what you need them to do. This is one of Jonathan Blow’s observations; he suggests that “[IDEs are] this weird Frankenstein’s beast the main job of which is to make up for the fact that most languages are underspecified.” There are detailed instructions on how to interpret a variable declaration, but no guidance on how to assemble multiple code files into a single program. The compiler understands how to tokenize and parse the input stream, but doesn’t know anything about building, testing or debugging. Blow argues in various talks that development environments tend to ignore how development actually happens and what people are actually trying to accomplish; instead we get layers upon layers of systems that don’t do what we want them to do, and even when we can repurpose them for those tasks, they still work fairly badly.
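
The gap doesn’t go unfilled, of course; it gets filled with configuration. A rough sketch of the familiar pile, with the specific tools standing in for whichever ones your project has accreted, since the language itself defines none of this and yet none of it is optional:

{
    "scripts": {
        "build": "webpack --mode production",
        "test": "jest",
        "lint": "eslint ."
    }
}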

While some are undoubtedly entirely happy to do CRUD-plumbing for a living, the culture that created that need, when mixed with an increasingly bad culture of misfit tool chains, leads to diminishing returns in terms of software quality.

The sum of all of this is that we have distracted programmers using bad tools to solve problems they don’t have time to fully understand. If this isn’t a recipe for poor job satisfaction and eventual burnout, to say nothing of poor software quality, I don’t know what is.

The subtle art of not repeating yourself

When parsing a preferences file takes long enough that it’s worth showing it in a dialog.

There is a principle in software development, called “DRY” or “Don’t Repeat Yourself”, which is typically stated as “every piece of knowledge must have a single, unambiguous, authoritative representation within a system”.

It’s easy to see programmers seeking to uphold this principle in their projects, whether through carefully crafted class hierarchies or sensible data structure selection. But looking beyond that specific, syntactically exposed layer, I see much less effort put into applying the DRY principle to the abstractions themselves and the metaphors that are employed in the design.
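
A small illustration of what I mean, with the caveat that the three layers here are stand-ins: no individual line is repeated, yet the same piece of knowledge, “an account needs a valid email address”, is restated three times in three notations, each free to drift away from the others.

// 1. In the browser, as form validation:
function validate_signup_form(form) {
    return form.email !== '' && form.email.includes('@');
}

// 2. In the API layer, as a guard clause (http.response as in the earlier sketch):
function api_create_user(request) {
    const email = request.params['email'];
    if (!email || !email.includes('@')) { return http.response(400); }
    return http.response(json.serialize(db.users.create({ email: email })));
}

// 3. In the database, as a schema constraint, here just carried around as a string:
const create_users_table =
    "CREATE TABLE users (email TEXT NOT NULL CHECK (email LIKE '%@%'))";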

Object-oriented class hierarchies are a basic example of this. Computer scientists are trained to believe in the idea of clean classes that communicate only with their subordinate objects and conform to an easily recognized pattern. But in practice this fails fast as you end up building increasingly convoluted mechanisms to pass messages across to another part of your program without violating some rigorously held conventions.

People are told to observe DRY, but they’re also told to lean on well defined design patterns. This is a contradiction: if you are using design patterns, you are repeating yourself ─ albeit on a more abstract, intangible level.

Tendencies like these lead to a world where compound lossy abstractions cause orders-of-magnitude slowdowns in software operation. Underlying this is a very common contempt for the hardware design of the system the software runs on. Wastefulness is considered both acceptable and normal, on the grounds that hardware is cheap and programmers expensive.

Thus, in general benchmarking, you can assume a Python program to be around 37 times slower than an equivalent C program. Individual instantiation of hundreds of objects in memory (RAII style, for instance) frequently leads to cache performance hits of 100–500x. And as a result, splash screens, loading dialogs and even interstitial websites abound where they needn’t exist.
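
To make the layout point concrete, here is a sketch rather than a benchmark; the actual ratios depend entirely on the runtime, the allocator and the workload, but the shape of the problem is visible even in a few lines:

// One object per record: a million separate heap allocations, with each
// x and y reached through a pointer to wherever the allocator put it.
const points = [];
for (let i = 0; i < 1000000; i++) {
    points.push({ x: Math.random(), y: Math.random() });
}
let objectSum = 0;
for (const p of points) { objectSum += p.x + p.y; }

// The same data as one contiguous block, which is the access pattern
// the cache hierarchy was actually designed for.
const flat = new Float64Array(2 * 1000000);
for (let i = 0; i < flat.length; i++) { flat[i] = Math.random(); }
let flatSum = 0;
for (let i = 0; i < flat.length; i++) { flatSum += flat[i]; }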

Maintenance of these increasingly absurd abstractions is a contributing factor to programmer unhappiness. But it’s also a major factor in code being slow, buggy and insecure.

When your job is to provide dogma-compliant plumbing between different libraries that pass incompatible data structures around in cache-inefficient ways, instead of finding ways to apply the logic of those libraries to the problem you’re trying to solve, then your CPU is going to be unhappy, your users are going to be unhappy, and you’re going to be unhappy.

Monotony killed the programming star

She doesn’t know what the project is — that’s classified — or what it’s called. It’s just her project. She shares it with a few hundred other programmers, she’s not sure exactly who. And every day when she signs on to it, there’s a stack of memos waiting for her, containing new regulations and changes to the rules that they all have to follow when writing code for the project. These regulations make the business with the bathroom tissue seem as simple and elegant as the Ten Commandments. ─ Neal Stephenson, Snow Crash

It’s hard not to think about this scene when considering the modern programming job. Or, indeed, any number of Dilbert or User Friendly strips. In practice it’s not so much that people don’t know what their project is; it’s that they often might as well not.

You would be hard pressed to find a programmer who didn’t get into the line of work out of a thirst for creative outlet, technical curiosity, and a desire for mastery over the wires and lights in the boxes that, if not illuminate and inspire us, then at least amuse us. But you’d be equally hard pressed to find a random sampling of programmers who feel like they spend most of their time doing creative, technical work, instead of plumbing their way through a monotonous mind numbing chore.

But remember: Repetition is practice. By doing the same thing again and again, be it plumbing, implementing quicksort, or building a web framework, you’re honing your skills. But are you practicing the right thing? Are you developing your skills, or just goofing around in an abstraction layer that lies within your comfort zone?

The flipside is that there might be a reason people keep implementing new frameworks ─ it turns out, making frameworks, or learning new ones, is often more engaging than using them if the task you’re using them for is monotonous. Everybody knows the guy who’s used five different frameworks just for fun.

Learning and growing as a person is perhaps both anathema and intrinsic to repeating yourself: you must repeat yourself to attain mastery; but the true master never allows himself to become stuck at a particular layer, never accumulates the ideological baggage of believing they’ve found the one true way, and perhaps most importantly, exhibits the “bad practice” of insisting on taking the time to figure things out for themselves, even when there are proven optimal solutions available.

But, to zoom out a bit, it’s worth noting that it’s not just in technology that this is the case. Everything I’ve said about the software industry so far could easily be translated to any of my last four or five careers. It certainly fits my work in politics with alarming ease. I just spent a week of back-to-back meetings about the economic response to the Covid-19 pandemic, and feel like it’s all been an exercise in rearranging the deck chairs on the Titanic. Our abstractions, metaphors, tool chains and overbearing philosophical doctrine make a mockery of any attempt to creatively solve the actual problems.

At the same time, I’m feeling that after two weeks of isolation at home after being diagnosed with this weird virus, for the first time in a long time I have the mind space to actually think about what’s wrong. And I’d be lying if I said I wasn’t at least mildly tempted to remain in self-isolation for a few more weeks, in the hopes that a draft new philosophy shows up.

At least on the technology front, unlike politics, there is a very real sense that things could be different. With a correct amount of criticism of our underlying assumptions, the correct amount of effort at building masterful tools and taking the time to master them, and with the correct amount of quiet introspection about the guiding philosophies of the current era, perhaps someday we can make programming fun again.

Interested in ecosystems and sustainability, society and technology, networks and markets. Founder & CEO of Ecosophy. Former member of the Icelandic Parliament.