I read an article today on Read Write Web decrying the Computer Science degree as something that may no longer be necessary for a fruitful career in technology. I don’t agree with this statement. Well, at least not entirely.
I admit, I learned a lot of the web development skills that come naturally to me simply by being on the internet beginning at a ridiculously young age. And with the plethora of online learning tools available now (from technical blogs to iTunes U offering free videos of classes at Stanford), you can really gain a lot of knowledge online at little to no cost. Definitely cheaper than my degree at NYU, anyway.
Also, sure, GPA at a top school (or any school, for that matter) is not a clear indicator of whether a given person will be successful in a particular role. I know lots of folks who flunked (read: drank) through their first year in college and ended up turning things around, graduating and becoming responsible and accomplished adults.
And then, of course, there’s the value of experience. Experience, in my opinion, is the best instructor you can ever have. Banging your head in front of a computer screen for hours because you forgot one (ONE!) semicolon in your code will teach you a very valuable lesson. Doing freelance work or gathering a smattering of industry jobs will definitely also help you gain experience as you go.
But…
I’ve been seeing a disturbing trend. I’ve met people who say they are proficient in JavaScript, for example, but really only understand abstracted frameworks (like jQuery). I’ve come across people who know all the right buzzwords to use but don’t really understand how logic works. I’ve been in “bootcamp” classes where prerequisite programming knowledge is required (CS 101, basically), and people don’t know what a “variable” or an “object” is.
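And that baseline really is tiny. A couple of lines of JavaScript (the names here are invented purely for illustration) cover both terms:

```js
// A variable: a named value you can read and change later
var greeting = "Hello, world";

// An object: related data and behavior grouped under one name
var person = {
  name: "Ada",
  sayHi: function () {
    console.log(greeting + ", I'm " + this.name);
  }
};

person.sayHi(); // logs: Hello, world, I'm Ada
```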
Look, I’m not saying that I’m the best programmer in the world. I’m also not saying that obtaining a Computer Science degree will make you better than someone who doesn’t have one. However, what a Computer Science degree did for me was give me a world within which I can frame problems and solutions. I understand that most programming languages contain similar constructs, so the only barrier to entry is just getting the syntax and best practices down (that’s the stuff of experience). Plus, I got to spend four years immersing myself in all that. It’s hard to find that long an uninterrupted chunk of time outside of college.
I was at a conference last Friday (GothamJS), and Tom MacWright said something that really resonated with me.
“Use abstractions for efficiency, not ignorance.”
That’s exactly the problem I’m seeing with new programmers. They rely on abstractions just to get the job done, which unfortunately results in them not REALLY LEARNING ANYTHING. Whether it’s leaning heavily on jQuery or another framework when vanilla JS would’ve been sufficient, or (RAGE!) copying and pasting solutions directly from Stack Overflow or some other tech solutions site, I see it far too often from novices.
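To be clear, I have nothing against jQuery itself; the problem is reaching for it without knowing what it does for you under the hood. Here is a trivial, made-up example (the .alert class name is hypothetical) of the kind of task I mean:

```js
// With jQuery: hide every element that has the class "alert"
$('.alert').hide();

// The same thing in plain JavaScript, no library required
var alerts = document.querySelectorAll('.alert');
for (var i = 0; i < alerts.length; i++) {
  alerts[i].style.display = 'none';
}
```

If you could write the second version yourself, using the first one is efficiency. If you couldn’t, it’s ignorance.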
It makes me really sad. I love to encourage people (especially ladies, if you are reading this) to get into coding. I’ve done career day at various middle schools in the area telling kids that coding is a way for them to gain some control in their lives — for once they can tell something what to do! — but with that great power comes great responsibility. You have to be responsible for the code you push out into the world. This was also something that was echoed at GothamJS. If you don’t understand what you are creating, how can you possibly take responsibility for it? Also, if you don’t understand what you are creating, how are you sure it is solving a problem? And how are you sure it is solving the RIGHT problem? And how are you sure that you aren’t introducing MORE problems (especially performance-related problems)?
Now, that said, I think there’s a lot of “real world” stuff for which my Computer Science degree didn’t necessarily 100% prepare me. But I have a feeling that’s pretty much college in general and a much larger problem to solve. I’ve always felt that Computer Science degree programs should offer “tracks” that can help students gain more marketable specialized skills. I was always interested in web development, so I took EVERY web development class offered at the higher levels. But if you are interested in networking, for example, your “track” should be specialized so you end up taking all the networking classes offered, and so on. Seems to me akin to how you don’t just major in “Engineering,” but rather you might select Mechanical Engineering or Electrical Engineering. Perhaps Computer Science needs to be a smidgen less broad, at least at the undergraduate level, to better prepare students for life after college.
And, conversely, “learn to code” bootcamps may need to incorporate a little more theory into their offerings. My experience has been that “breadth” of knowledge — knowing how to discover solutions to problems that may not exist yet! — can be a great complement to “depth” of knowledge.
To leave you with a very real-life example, consider Flash. Flash (and ActionScript, the language that powered it) used to be a coveted skill for a web professional to have. Since the proliferation of the iPad and the emergence of client-side methods for simple animation (HTML5, CSS3, JavaScript frameworks), the demand for Flash skills has diminished. If you focused on learning Flash without truly understanding the underlying programming concepts that made ActionScript work, you might be shit out of luck right now. However, if you learned basic programming constructs as you learned ActionScript, you might now be applying those same concepts to JavaScript (or some other scripting language) with great success.
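That transfer isn’t hypothetical: ActionScript 3 and JavaScript both grew out of ECMAScript, so the fundamentals carry over almost line for line. As a rough sketch (the numbers are made up), a loop that totals some scores looks like this in JavaScript, and the ActionScript 3 version differs only in its type annotations and in using trace() instead of console.log():

```js
// Total up an array of (made-up) scores
var scores = [88, 92, 75];
var total = 0;
for (var i = 0; i < scores.length; i++) {
  total += scores[i];
}
console.log(total); // 255
```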
The moral of this story? Understanding the LOGIC behind WHY things work the way they do, by whatever means you get there, is the REAL must-have skill.