OK, it's not really a lost art
July 31, 2018
I was surprised to learn that an old blog entry was being discussed on reddit. Maybe I should update my comments a little. It seemed odd to me that a blog I had abandoned (_mea culpa_) had generated three emails to me in a single week. Finally someone pointed out to me that it had been reposted and was getting a little bit of attention.
I haven't read all the responses from the numerous redditors. Maybe I'll read more later. This is my tentative response.
Many have suggested that the speckled background is ugly and hard to read. Ugly, probably. Hard to read-- well, it looked OK on my screen. But, point taken, it's gone.
Someone used the term "elitist sour grapes." That is probably largely correct. :)
Someone else said you shouldn't trust someone who claims to be a web developer but doesn't know CSS. That, I would say, is 100% correct.
But you see, I do not claim to be a web developer. In fact, I specifically claim that I do not know web development.
I realize that web guys have collectively made billions, maybe even trillions of dollars over the years. And it is an absolutely necessary part of modern society, and I don't want to demean the people who are good at it and who want to do it.
But web coding as such just doesn't interest me. I'm not good at it, and I don't want to be good at it. I'd rather do other things.
I'm not a programming "god" or "rock star," by the way. I'm just a guy who's been doing this a long time, but has not stuck to one thing long enough to be considered brilliant.
The user LondonPilot raises some very interesting points. I wish I could buy him a beer or six (or substitute beverage of choice).
In fact, his argument is sort of the flip side of mine-- and arguably the more important angle on the whole situation. I would even say that I have made the same argument at different times and places in the past.
Here's a good example. I recall a time in the early 90s, when personal computers were less common. I had two friends who had recently gotten married: he was an electrical engineer, and she majored (I think) in English. This was 1992, when Windows 3.1 was hip and trendy in some circles, and DOS was still very much alive.
Jon slightly bemoaned the fact that Joyce used a Mac rather than a PC. He made the comment that "people don't learn anything about computers that way." I suspect that, from his hardware-oriented point of view, even the DOS command line seemed super-high-level.
In some ways, I sympathized. But I made the counter-argument that a toaster is meant to toast bread, not to teach us about electricity. (He hated this analogy.) But I often felt the computer should become invisible-- that the proper response to "What are you doing?" should be "I'm writing a paper" or "I'm designing a graphic" or whatever-- it should never be "I'm using the computer." And we've made some great progress there in recent decades.
Let me amend what I said before. I don't mean to backtrack (though I probably will), but to clarify and expand.
First of all, there are in fact certain small lessons from computer science that even a modern "clean hands" developer should know. I cannot count the number of times I have seen a programmer (and I really wanted to put quotes around that) who was absolutely incensed that (2.0/3.0)*3.0 was only 1.99999999 or thereabouts. I have seen people who were livid and demanded that this bug be fixed.
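The exact digits depend on the floating-point format involved, but the underlying surprise is easy to reproduce. A minimal Python sketch of the same phenomenon, using the classic `0.1 + 0.2` example:

```python
import math

# Binary floating-point cannot represent most decimal fractions exactly,
# so "obvious" decimal identities fail by a tiny amount. This is correct
# IEEE 754 behavior, not a bug to be fixed.
a = 0.1 + 0.2
print(a)         # 0.30000000000000004, not 0.3
print(a == 0.3)  # False

# The right response is to compare with a tolerance, not demand exactness:
print(math.isclose(a, 0.3))  # True
```

The lesson a "clean hands" developer needs isn't how IEEE 754 rounding works internally, just that exact equality on floats is a trap.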
I have also had difficulty explaining that no matter how fast your processor is, an algorithm that is O(N^2) can eventually bite you. I have seen people who didn't know that hard disk access is typically slower than memory access. I could go on.
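The post doesn't name a specific algorithm, but a common illustration of the O(N^2) point is duplicate detection. This is a hypothetical sketch, not anything from the original discussion:

```python
def has_duplicates_quadratic(items):
    # Compares every pair of elements: O(N^2).
    # Fine for 100 items; painful for 1,000,000,
    # no matter how fast the processor is.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    # A set membership test is O(1) on average,
    # so the whole scan is O(N).
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

At N = 1,000,000 the quadratic version does on the order of 500 billion comparisons while the linear one does a million lookups, which is the kind of gap that no processor upgrade papers over.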
Having said that: You can get an awful lot done these days without any CS coursework. You can create things that are useful, worthwhile, and certainly financially successful.
And there is nothing magical about a degree anyway. What counts is domain knowledge-- and I mean enough knowledge to be truly effective, not just "enough to fool your manager" (and perhaps not just "enough to pass a boot camp").
So getting back to the excellent comments by LondonPilot-- I think perhaps there should be another field or discipline split off from computer science. This would be parallel to what I believe happened with electrical engineering and computer science.
I studied CS for 6 years, and as far as I recall, I only had two or three EE courses (including one lab). To the generation before mine, that might seem like a horrifying concept.
So let's imagine a new major. I don't know what to call it-- maybe "High-Level Computing"? Not a great neologism, but I'll run with it for now. An HLC major would certainly have one or two courses that were from traditional CS (just as I myself had my fleeting glimpse at EE material). But the rest would be more pragmatic, more about gluing this piece of tech to that one, properly trusting that the components were created and tested by CS people-- just as I properly trust that my CPU works as advertised and never try to build my own.
As LondonPilot said, this is not a bad thing. This is how human progress works.
I do wish that more web developers had deeper knowledge. But (and here I may truly be backtracking)-- it does not have to be super-deep knowledge. I probably said it did, but I was wrong, of course. As long as there are cars, there will be people who understand the workings of the engines; but a self-driving car will be perfectly useful in getting from point A to point B. Part of the point of technology is to make it easier to accomplish tasks with less effort and less specialized knowledge.
So my modest proposal would be: Let CS give birth to a new field just as EE gave birth to CS. EE did not die, and CS will not die in the imaginable future. Let's just admit these are different ways of thinking now, and should perhaps be different tracks.
Thank you to all who responded. Maybe I'll dust off this blog again soon.