January 21, 2020
An old friend of mine asked me this in email last night...
Thanks for taking the time here. My younger son is interested in pursuing a career in computer science. He's a freshman at Kennesaw State right now, but he really struggles with the idea of taking two years of classes that he has very little interest in. There are three schools in Atlanta with 8- to 16-week full-day immersive courses that focus solely on technology: The Iron Yard, Digital Crafts, and General Assembly. In the world of computer science, what is the opinion of programs like this? Is a 4-year degree viewed as more valuable?

This morning, this was my off-the-cuff answer:
Well, here is my opinion... I'm speaking as someone with two degrees and therefore six years in comp sci, as well as 30 years' experience. In other words, I'm old-fashioned, a freaking dinosaur. :)
There is a whole generation of computer people who are very different from the previous generation. CS used to be (and really still is) a very deep and demanding field.
But two or three things have happened. First of all, computers are pretty much universal now. Do you know anyone under 80 who doesn't have one? Second, the World Wide Web (born in 1989 as a "face" on the more bare-bones Internet) started being invaded by the general public in 1995 or so. As usage became more common, it became more important to everyday life. And third, the tools we use to create applications have gotten a bit more sophisticated. It's easier to use these tools, just as it's easier to drive a car now than it was in 1905.
What all this means is that there are more kinds of "computer person" than there ever were before. There are many thousands of "programmers" who never took a programming course. This is both good and bad.
It means that a person can get the little things done while knowing very little. But it also means that this person probably will never learn enough to get the big things done.
To be honest, I get secretly frustrated with the lower-level people who now exist in giant hordes. (I rarely tell anyone that.) To me, they are like people who have decided to learn 5% of their field in order to get a few things done, have some fun, and make a living.
These people use tools to create little applications for everyday use. But remember: The tools themselves are also software, and they are software at a level far beyond anything these people could dream of creating. They use languages, editors, compilers, and operating systems; but they don't have the first clue about how to create any of these things, or even how they really work.
In 20 years, some race car driver is going to hold a kid in his lap, and the kid is going to say, "Well, I can drive a car, too. Everybody can drive a car. You just push the green button and say, 'Take me to Wal-Mart.'" And I kind of feel like that race car driver sometimes.
Bringing it back to computers... A friend of mine had a 7-year-old who taught himself a fancy tool and used it to create a game for the iPhone. Sure, it was a simple game. But he did something by himself that a PhD could barely have done 30 years ago (and not on a handheld device).
To some extent, my complaints are valid. And to some extent, it's just me being an old-timer and a curmudgeon.
Sometimes I interact with certain people and their complaints and questions show where they have reached the boundaries of their knowledge. And sometimes it's shocking to me. I think: You should have learned that in your first computer science class in your first semester. And then I remember: This person never had a real computer science class at all.
Of course, generations before me look at me the same way. In the 1940s, you pretty much had to have a degree in electrical engineering in order to see or touch a computer. In the 70s and early 80s, you just had to be a science or engineering major. In the 90s, you just had to have a credit card (or have parents with a credit card).
I'm very lost with hardware, by the way. I live on the keyboard. Every time I ever opened up a computer to try to change something, I screwed up two other things. I'm strictly a software guy. I've known people who made fun of me for that, and even older people who laughed a little at me because I didn't know how to use a soldering iron.
It all depends on what a person really wants to do. If you want to make online stores and shopping carts and web forms and pretty pictures and social media apps, then you don't need a degree in computer science anymore. Not even an associate's degree.
If you want to make really interesting, exciting things that have never existed before, if you want to make a tiny little difference in the industry and change the world just a little bit, then you do need that degree. If you want to make the tools and libraries that the lower-level people use, you do need that degree.
Or look at it this way: If you want to build doghouses, just pick up some skills with hammer and nails, and then go for it. If you want to be an architect who designs and builds skyscrapers, then go get a degree in architecture first. But please (speaking again as a curmudgeon), don't learn to build doghouses and call yourself an architect.
I hope this silly rant is helpful in some way. :)
Ask me any more detailed questions if you want to.