There seem to be a few different schools of thought when it comes to teaching the latest programming languages, tools, frameworks, and so on. At one extreme are the computer science academics who say that the specific choices they make in their classes are irrelevant; it's the underlying concepts that students learn that matter. At the other extreme are those who say that theoretical concepts aren't what students need to get a job; it's knowing the most up-to-date tools that matters, and most of what is learned in a theoretical computer science degree is never used in the workplace.
Like most extreme viewpoints, I think both contain morsels of truth but ultimately talk past important nuances.
Let's tackle the theoretical computer scientist extreme first. If I were to ask a lecturer why they chose the languages they teach in their module, I would hazard a guess that the reply would be that it's what they are most familiar with. Familiarity is important. Any tool takes time to learn. Some are easier to learn than others, but ultimately, if you need to use a new tool, you first need to learn how to use it. With respect to programming languages, some are easier to learn in general than others, and some are easier to learn if you already know a similar language. This is because programming languages, much like spoken languages, can be roughly grouped into paradigms: families of common ideas implemented in slightly different ways. The spoken-language analogy would be that it's easier for a French speaker to learn Spanish than it is for them to learn Mandarin. However, there is a real cost to learning a new language, so wouldn't it be better if students learning for the first time, while they're still grappling with the basics, were less likely to have to bear that cost again when they enter the workplace?
Of course colleges can't teach every language a student may encounter in the workplace; that would be ridiculous. But aiming to teach current languages built on ideas common to many other modern languages goes a long way.
The more you use a language, the better you become at using it. Over time, developers build fluency in a language. Ask any dev and they'd probably be able to tell you their top two or three most fluent languages; mine are probably Python, SQL, and JavaScript. Building fluency in a popular, modern programming language is extremely useful (even with AI in the picture), and I think colleges should keep their language choices consistent across modules to give students the time to build that fluency.
Okay, so what about the “theory is irrelevant” argument? A more common and more nuanced version I heard recently is, “Just let them build something and they'll learn what they need to get the job done.” In a recent interview with Andrej Karpathy, I heard him refer to this dichotomy as depth-first versus breadth-first knowledge. Depth-first is a lot of what you do on the job: learning bits and pieces here and there to get the job done. Breadth-first is much more of what formal education attempts to provide; there's a lot more of “We're learning this now because, trust me, you'll need it later.” Both are, to me anyway, very valuable. As someone who has flipped between work and education quite a bit, I can say that when you're in the depths of one, you crave the other. Engineers often feel that they have no time to “properly learn” something, while students feel that they're not building enough practical stuff. Perhaps the ideal is a perfect split between the two: sufficiently tricky practical projects, but with the time and formal backing to “properly learn” the underlying concepts. Of course, that split is very difficult to achieve in practice, which is why we've all felt the imbalance at one point or another.
If you fail to dig into the underlying concepts, you'll end up with brittle knowledge: thrown into a language you don't know, you'll feel completely lost. However, if you learn a bit about why things are done the way they are, you'll spot commonalities even in a language you've never seen before, which makes learning it so much easier.
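To make that concrete, here's a tiny, hypothetical illustration (my own example, not from any particular module): the idea of mapping a function over a collection shows up in Python, JavaScript, and most other modern languages, with only the surface syntax changing.

```python
# Squaring a list of numbers in Python using a list comprehension.
# The underlying idea, mapping a function over a collection, carries over
# almost verbatim to other modern languages. For example, in JavaScript:
#   const squares = numbers.map(n => n ** 2);
numbers = [1, 2, 3, 4, 5]
squares = [n ** 2 for n in numbers]
print(squares)  # [1, 4, 9, 16, 25]
```

Recognising the shared idea is what lets a developer who is fluent in one of these languages pick up another far faster than someone starting from scratch.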
As individuals, I think we all have preferences for one mode or the other. I'm personally fascinated by how things work, so I often tend more toward theory than practice. But no matter which you prefer, it's important to give yourself a healthy dose of both to become a well-rounded practitioner. Sometimes this may feel uncomfortable, unproductive, or pointless, but as long as it's not all that you do, I don't think it will be a waste of your time.
I have been teaching only for a very short time and have been practicing as an engineer for a relatively short time too. But my current approach is to teach programming modules with current but stable languages, tools, and frameworks; to stay consistent with the tools other modules in the course teach, so students have time to build fluency in a language; and to endeavour to strike a healthy 50/50 balance between theory and practice, even if I don't quite get it right.