Colleges Don’t Teach Useful Software Development (Do They?)

With regard to what I shared in the previous post – What Do You Expect From Being a Software Developer? – the author provided 10 things that I found worth sharing (and now considering).

At the very least, it’s something to keep in mind for those of you looking to enter the industry or already working in it. At most, the article provides a solid perspective from one person’s experience (which likely echoes that of many others, too).

This led me to think about my experience, both in college and in my career in software development, and how it relates to the ten points mentioned in the last post. It’s easy to say colleges don’t teach useful software development, but how true is that? Again, I’m not arguing against the original post. But perhaps my perspective is a little different (though no less valid).

I’m not going to give a deep dive into my entire history doing this (who would want to read that, anyway? 🙂), but why not give my perspective on the ten points mentioned?


Distinguishing Between Computer Science and Software Development

I think of computer science and software development as two different things.

This is an important distinction: if you’re talking to someone who works in the field of computer science, they may or may not be in the field of software development. If you talk to someone in the field of software development, though, it’s likely they’re doing exactly what the title implies.

And as the field has grown over just the last 10-15 years, entire degree programs or certification programs specifically for software development have emerged (as opposed to those just for computer science). That said, there are also degrees and certifications with a foundation in computer science and a focus in software engineering, software development, web development, and so on.

All of this is worth mentioning because when we start talking about whether college will prepare you for a career in software development, I think it’s important to be nuanced enough to map what the college program was teaching against what you’re expecting from the job market.

“College Will Not Prepare You for the Job”

This is where, at least to some degree, I think it’s important to distinguish between computer science and software development. When I was in school, I majored in computer science with a focus in software engineering.

If I had to summarize it, I ended up with a lot of classes based in computer science – that is, the mathematical parts – and then courses on software engineering – that is, how to analyze, design, build, and/or maintain software. There were also times when I worked as a teaching assistant and participated in internships to get some experience. This will be relevant in future posts.

So there’s my context for the experience I’ve had thus far. In the original article, Mensur writes:

The college will prepare you for some basics, but what most of the colleges teach is so far away from day-to-day jobs. Most of the professors who teach at universities are not good software engineers.

Only a small percentage of them even worked as software engineers. Also, the university curriculums are heavily outdated. They trot years behind the software development market needs.


There are three main points in this passage I found most relevant to what I’ve done over the last 16 years (including what I’m doing now).

1. Most Professors Are Not Good Software Engineers

I can definitively say that in my experience, you could tell who the academics were and who the practitioners were. To summarize, it’d be something like:

Not all academics are practitioners, but all practitioners are academics.

Even then, though, you have to take it with a grain of salt: if someone is a professor at a university, then even those with actual industry experience are still, by implication, working on something academic in nature outside the classroom.


Further, I can unequivocally say the professors who had the greatest impact on my career and my interest in software as a career were those who had spent time outside of an academic setting and worked in the industry (in fact, I still remember their names and many of the projects we did for those classes).

It’s disingenuous to say those who have not worked in the industry aren’t good professors. They are. Case in point: I had a Systems and Networks professor who taught a two-semester class covering a lot of material. We talked about the foundations of modern computing, working from logic gates up through x86 assembly and C. And at the time, that was really relevant information (the iPhone had not yet come out, Apple Silicon wasn’t a thing, and AMD and Intel were the two primary processor competitors).

I know: giving two examples from my own experience doesn’t make for a solid case in one direction or the other. But the point isn’t to write a persuasive article or anything like that.

The point is simply this:

  • The majority of the professors who prepared us for working in the industry were those who had already been there.
  • Those who had not worked in the industry weren’t equipping us for the job specifically, but they were teaching foundational concepts and teaching us how to think.

And I’ll have more to say about that later.

2. University Curriculum Is Heavily Outdated

This point isn’t going to be very long because I don’t have much to say about it. Generally, it’s a strong maybe from me. I’m sure it depends on the university or the program in which you’re enrolled.

At the time, I didn’t feel like much of the content I was learning was out of date. But it’s hard to know when you’re in school, right? What do you have to judge it against?

But looking back, only one course I took felt out of date, and it wasn’t so much the content as the language we were using: Smalltalk.

There’s a caveat here, though: part of the reason we were tasked with using Smalltalk was to help inculcate the idea of object-oriented programming where everything is an object that receives messages (versus classes whose functions are invoked on an instance of the class).

So even that served its purpose, because it taught us how to conceptually adapt to an unfamiliar paradigm.
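
To make that distinction concrete, here’s a minimal sketch – in Java, since that was one of the other languages we used – contrasting the familiar “invoke a method on an instance” style with Smalltalk’s “send a message to an object” framing. The Account class and its methods are hypothetical, purely for illustration:

```java
// Hypothetical example: the mainstream style most of us knew coming in,
// a method invoked on an instance of a class.
public class Account {
    private int balance = 0;

    public void deposit(int amount) {
        this.balance += amount;
    }

    public int balance() {
        return this.balance;
    }

    public static void main(String[] args) {
        Account account = new Account();
        account.deposit(100); // "call the deposit method on this instance"

        // In Smalltalk, the same step is framed as sending the object a
        // #deposit: message, which the object responds to however it likes:
        //
        //   account deposit: 100.
        //   Transcript showln: account balance printString.
        //
        // Everything in Smalltalk -- numbers, classes, even blocks of code --
        // is an object that receives messages, which is the mindset the
        // course was trying to inculcate.
        System.out.println(account.balance());
    }
}
```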

And for those who are wondering, there were a lot of languages used in school at the time (Java, C, Python, Smalltalk, JavaScript, SQL, and so on). So to say my curriculum was heavily outdated isn’t as true for me as it might be for others.

Using Smalltalk? Probably outdated. But the concepts taught in that class (such as object-oriented analysis and design), along with designing or improving object-oriented systems, were useful.

Building a network-connected photo sharing application in Java to help understand sockets and sending large pieces of data across a network? Definitely more useful (the language being less important than the actual work being done).
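
I no longer have that project, but the core idea – streaming a large file across a TCP socket in fixed-size chunks rather than one giant write – looks roughly like this sketch. The host, port, and file name are placeholder assumptions, not details from the original assignment:

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

public class PhotoSender {
    public static void main(String[] args) throws Exception {
        // Placeholder host, port, and file path -- swap in real values.
        try (Socket socket = new Socket("localhost", 9000);
             InputStream photo = new FileInputStream("photo.jpg");
             OutputStream out = socket.getOutputStream()) {

            // Send the photo in fixed-size chunks instead of loading the
            // whole file into memory; the same read/write loop works no
            // matter how large the image is.
            byte[] buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = photo.read(buffer)) != -1) {
                out.write(buffer, 0, bytesRead);
            }
            out.flush();
        }
    }
}
```

The lesson that stuck wasn’t the API so much as the shape of the loop: read a bounded chunk, write it, repeat until the stream is exhausted.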

I could pull other examples, but I’m trying to use the ones most significant and relevant to the point I’m making.

Now, I’ve probably forgotten which things were no longer needed – as is apt to happen – but one thing I distinctly remember learning, and only using in my first job out of school, was UML diagrams.

I get the point of them, and I understand what they’re supposed to convey, but rarely do teams have the time to produce them when tasked with a project. Further, it’s harder to do when working with a system that already exists.

That will come up in another post, though.

3. Learn How to Learn

As I mentioned in the first point, the biggest advantage that I came away with when graduating with my degree was that I had learned how to learn.

No, I didn’t know all the languages, but I didn’t need to. No, I didn’t know all the various IDEs, database systems, and operating systems inside and out. But I knew how to learn and adapt. And that one skill alone has likely been the one that’s paid dividends as I’ve continued through my career.

So if you’re in a program – be it something from freeCodeCamp, a course at a local college, a university program, or anything in between – don’t forget to focus on the why behind the principles being taught. And if that means going to office hours, doing independent reading, or meeting with the professor or a group of other students, then do it.

Learning the why behind something helps you start formulating your own ways of learning something new. And when you understand the why, you understand how the people before you arrived at what they’re doing.


Nothing I’ve shared here is a rebuttal to the previous article, nor is it meant to be argumentative in nature. Just as the original linked article laid out one person’s own experience, I’m doing the same here.

It’s a good exercise to write something other than programming examples, to evaluate my own career thus far, and to potentially help others who are reading this.

There’s the topic of greenfield projects coming up, though. And that’ll be fun.
