I would argue that Python, being a simple, easy-to-learn language, allows you to focus on other aspects of CS, e.g. complex algorithms. Rather than faffing about with memory management etc., you can really study algorithms without the language getting in the way. Of course, this depends upon the professors actually teaching this "hard stuff" :)
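For instance (a sketch of my own, not from any particular syllabus), Dijkstra's shortest-path algorithm fits in about a dozen lines of plain Python using only the standard library's heapq, so class time can go to the algorithm's invariants rather than to allocation and pointer bookkeeping:

    import heapq

    def dijkstra(graph, source):
        # graph maps node -> list of (neighbor, edge_weight)
        dist = {source: 0}
        heap = [(0, source)]                      # (distance so far, node)
        while heap:
            d, node = heapq.heappop(heap)
            if d > dist.get(node, float("inf")):  # stale entry, skip
                continue
            for neighbor, weight in graph.get(node, []):
                nd = d + weight
                if nd < dist.get(neighbor, float("inf")):
                    dist[neighbor] = nd
                    heapq.heappush(heap, (nd, neighbor))
        return dist

    # dijkstra({"a": [("b", 2), ("c", 5)], "b": [("c", 1)]}, "a")
    # -> {"a": 0, "b": 2, "c": 3}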
Having interviewed a number of graduates from code camps, they're definitely just chasing the salary.
Most of them have no actual passion for computing, their scope of knowledge is superficial, and they're asking for six-figure salaries out of the gate.
I had a relatively simple coding assignment (shouldn't take more than 15 minutes) that I would use to weed out those that were just copying and pasting sample code. It required a very large number of values and added an additional profiling step to it. The sample code wasn't performant with a very large number of values, and was painfully slow to use unless you made minor adjustments to a few things.
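To illustrate the pattern (my reconstruction of the kind of thing involved, not the actual assignment): sample code that tests membership against a list is O(n) per lookup and crawls once the input is large, while the "minor adjustment" of building a set first makes it near-instant.

    import time

    values = list(range(200_000))
    targets = list(range(0, 200_000, 200))    # 1,000 lookups

    # As sample code often ships it: list membership is O(n) per lookup.
    start = time.perf_counter()
    hits_slow = sum(1 for t in targets if t in values)
    print(f"list lookups: {time.perf_counter() - start:.2f}s")

    # The minor adjustment: build a set once, O(1) average per lookup.
    lookup = set(values)
    start = time.perf_counter()
    hits_fast = sum(1 for t in targets if t in lookup)
    print(f"set lookups:  {time.perf_counter() - start:.4f}s")

    assert hits_slow == hits_fast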
Back when we used to interview in person, we used to have a computer and screen in the office running Linux. The number of people who couldn't handle a terminal even to type "ls" was either remarkable or shocking depending on your point of view. We're talking about people who claimed years of Linux experience, applying for Linux programming and administration jobs.
"If this weeds out the people who are just there for the paychecks, it might actually be a relief to get back to teaching students who genuinely want to learn about computing."
It's not going to work that way. I was genuinely interested and took many high level electives. I felt the program was very good 15 years ago at the school I attended. I also got an MSIS at a different school, but feel that one was not any more advanced than BS, just a faster pace and weirdly less coding. I did well for years at my job. Now it looks like I might lose my job and probably won't get another IT one. I will probably end up working at Walmart or something.
This is how I've been feeling through the whole situation. I got my current job at the bottom of the last slump the other year so it doesn't seem to be affecting me.
Still I've been careful to set my life up so I could go many years without employment if I had to. It's hard to trust the rest of the economy in general.
About 15 years ago when I started my degree there were both the “I want a good job” people and another crowd that I’d describe as having followed a thought process of “I like video games; I want to make video games; I should study comp sci”. At that time at least I think the video game crowd was even less equipped than the job crowd. Not to disparage video games, they are a majority of my free time, but those who were joining the field to _have fun_ are going to have an even harder time than those looking for work.
Lots of discussion about choice of programming language in the comments below.
- In principle, it should not matter at all, but there are practical reasons why one PL may be better than another in a particular school or context.
- But, all this "choice of PL" discussion is really a discussion about CS1. A CS degree has at least seven other courses -- assuming 1 CS course per semester -- and in practice many more than that. So, if you're going to ask questions about CS1, the question to ask is, "Does CS1 setup students to succeed in the advanced courses?" Classically, these were courses in compilers, operating systems, networking, and so on. These days, you can add distributed computing, machine learning, etc. (but don't subtract the classics).
There really is too much hand holding of university students nowadays. I don't think degrees are really equivalent to what they were thirty years ago. Back then the university was about weeding out the wasters and lazy folk who didn't study. College courses are meant to be hard for a reason. Don't get me started with that extra credit crap.
Thanks for sharing. Is this similar to what attracts students to medicine (a near-guaranteed position with a high salary)? But the med-school-to-full-time-physician pipeline is long and can weed out. CS is a difficult subject; certain ways of thinking are difficult but certainly can be learned, like recursive thinking.
Did colleges expand their computer science departments or even just create them to meet the demand for the degree? The pipeline to possible employment with a CS degree is quite short, doesn't require residency and board-certification so it's a quicker route to employment, but then you are competing with peers with stronger backgrounds and educations and seasoned professionals for the same positions.
Salary isn't traditionally what motivates people to study medicine. It's prestige. The difficulty is part of the prestige, which is probably why they still do things like memorize long lists of cranial nerve names. I haven't heard that they have a problem with dropout rates.
A good CS education only gives you prestige with fellow nerds.
I actually think prestige is a contributing factor for CS as well. People assume you must be smart to be a software engineer, and FAANG companies are prestigious to normal people because they have name recognition. Definitely not on the same level as a Doctor/Surgeon/Lawyer or whatever but certainly could be more than a typical 4-year degree will get you. And I suppose there's also the fact that those companies were viewed very differently 10-15 years ago and now there is a lot more cynicism about big tech in general.
Yes, prestige, perception of self by others, but certainly salary and job guarantee are attractors to medicine. First hand anecdata from educators and doctors alike supports this.
I started my degree in 1999, and even then this was already a factor. I hope more articles like this are published, and people who aren't really interested in computing stop choosing this path.
Thank you for saying this because it feels like these people entering the industry in such numbers over the 2010s completely killed what made this job fun in the first place. I call them "ticket completers". Sure they can mechanically perform the minimum requirements of the job, but there is zero interest at all in what is actually being done; just following PM directions to the letter with no further thought. The whole spirit of innovation and curiosity and discovery has been lost, replaced by lifestyle seekers who look at you like an insane person when trying to talk about software in the abstract (ha!).
The hackers and nerds will be just fine. They are like gold when we find them now. But if this makes CS "uncool" again, I am all for it.
"The hackers and nerds will be just fine. They are like gold when we find them now."
This is not at all my experience. One of the problems I face is that many of those PMs, and companies in general, want mindless ticket completers. My current job just wants us to grind through the Jira backlog. They have no interest in anything else and crush any such interest out of you, too.
> The hackers and nerds will be just fine. They are like gold when we find them now. But if this makes CS "uncool" again, I am all for it.
Think about how AI can help students cheat nowadays. You could still cheat previously, but now a CS-degree seeker can have an AI do the entirety of school work for them (with exception of say pen-and-paper tests). Imagine how the quality of new graduates drops with regard to the understanding and abilities you highlight as crucial to being effective in software, and how those that do understand are even more valuable relatively, but perhaps harder to find in the noise.
When most jobs just want you to be a ticket completer, the cheaters will do just fine if they can do it faster. The rest of us will be considered slow and discarded. It's happening to me.
Yes, they can be ticket punchers more easily; they're kind of trained to do that. But there are certainly levels of achievement that aren't possible with a foundation that lacks grounding and true understanding.
Do you mind elaborating here on what is happening to you? It seems worthwhile information to add to the discussions ongoing for this post.
The short of it is that the team just wants high throughput but doesn't care about improving the system health or process efficiency. I tend to consider multiple aspects of the work including those areas. But if you just want someone to turn out tickets, I tend to be slower unless the task is simple or repetitive. I have a disability and graying hair, so my options are limited. I'm going to fail my PIP later this month and I'll probably end up working at Walmart.
I use Copilot, but it isn't that helpful. There is no morale - I'm so burnt out I fantasize about getting hit by a bus. No team will touch me on a PIP.
If genuinely feeling suicidal for any length of time, please reach out for help. :) From your many posts, some of which I replied to before, I believe you work at a soul-destroying place which isn't doing you any good, and I understand you're kinda stuck there cos of health insurance etc, but what I'd say is try to live for today, not worry too much. IF they fire you, f** 'em. I bet your skills are better and more useful than you currently think. If you did get kicked out, you may well find something better. To me there doesn't seem to be evidence you'll be stuck working at Walmart. Maybe you could get a job in local state government? (OK not federal cos that's been gutted) Not stunningly paid but perhaps a nicer working environment?
Hackers can love hacking as a hobby outside their job but find the job itself soul-crushing. More likely, the hackers and nerds who can withstand the corporate demand to grind will be left, and the rest get filtered. Plenty of talented people can end up outside the cog machine, cooking up apps nobody but they care about.
I'd argue it is actually more of a broad trend i.e. the boom in computer science enrollment over the last 20 years has been driven mostly by people chasing a better return on investment in the increasing cost of the average four-year degree, and software pays better than the average four-year degree. I do think that college being cheaper on average would help at least somewhat with CS being such a popular major.
If you were around in 2000 and 2009, you've seen this before. Our field has ups and downs, and every time we hit the bottom people say it won't come back. It will.
We had some cleaning up to do. I was a hiring manager during COVID and the resumes I saw were unbelievable. People with "web" boot camps being considered for 6 figure salaries. People who had absolutely no business being in this field were being hired.
It was due to the easy money from low interest rates. This field has always had solid salaries, but some people were making a million to sit in meetings and integrate frameworks into me-too websites.
The hammer is coming down and is unfortunately hitting many good people too. But they will recover while the people who shouldn't be here will move on. Don't get your HVAC repair certification quite yet. Stop complaining about AI and go study it (the hard stuff not ChatGPT for dummies).
Well put. And this happened back in 2000, and 2009. I had people who I knew from direct experience to be non-technical slackers tell me about their IT Director jobs. I knew it wouldn't last then, and I'm not surprised now. Just get out of your comfort zone and start looking, and if you are in a bad situation don't be cowed. If you are truly technical this is always valued, despite the easy answers from ChatGPT, you must understand what it is telling you to really make use of it.
It's not AI, it's outsourcing that's really killing IT jobs. Even from relatively cheap Eastern Europe, projects are being moved to India, Vietnam, and the Philippines.
I don't read too much into the fact that unemployment for nutrition science is at 0.4% - that doesn't mean those people are all working as nutritionists or even in a job that requires a degree. You can see this clearly in the underemployment rate which is 45%+.
Likewise, the top unemployment rate (9.4%) of those with an anthropology major probably doesn't mean all those people are living under a bridge - a fair number of them will be pretty well off, living off their parents and knew going in that their field doesn't hire millions.
So what to make of IT grads having high unemployment rates (but low underemployment rates! bottom 5 in those)? I feel some more on-the-ground reporting is needed.
The quotes from randos reacting in this article don't really help. "Every kid with a laptop thinks they're the next Zuckerberg, but most can't debug their way out of a paper bag," because debugging (like Zuck!) is computer science, apparently.
> So what to make of IT grads having high unemployment rates (but low underemployment rates! bottom 5 in those)?
That's a very important observation. It's much better to be in a field with a 6% unemployment rate than a 60% underemployment rate (like criminal justice, performing arts, and, surprisingly, medical technicians).
1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
2. AI… sort of. It’s a lousy replacement for serious engineering talent, but the bosses are so enticed by reduced labor costs (and reduced employee leverage) that they will keep trying even if the stuff doesn’t work. Expectations are going up, teams are shrinking, and junior roles are vanishing.
3. Reputation collapse. Remember how we dismissed Michael O. Church as a crank? His writing style was grating (and has improved immensely) but he was right about everything, five years before anyone else. In 2009, we were “good rich people” in contrast to Wall Street. Now we’re Public Enemy #1 and, while we don’t all deserve it, our industry’s leadership does. This doesn’t hurt big tech companies because they’re invincible monopolies, but it has ended the era in which even non-tech companies wanted three or four “data scientists.”
> “Learn to code” ceases to be good advice if too many people do.
I believe "learn to code" is a great advice, nonetheless; the skill is highly applicable. The bad idea is thinking that alone will land you a cushy job.
I'll observe that, at a top-rate tech school I'm pretty familiar with, major + computing is a very prevalent option in a lot of the majors. As an undergrad (pre-PCs), I graduated with one computer programming course in FORTRAN and that was pretty much the only time I touched a computer keyboard undergrad. You can't really do that today in engineering/sciences.
Anecdotally, but talking to a lot of people who really have their ears to the ground, the junior roles thing seems to be very real. It probably isn't just AI--with more senior folks probably more available, why hire juniors--but seems to be pretty pronounced (with probably the corollary that bootcamps are probably a bad idea these days). Which isn't a great trend if real.
Both things can be true. It can be tough for juniors and tough for seniors who haven’t kept up and are just trying to cruise. Not that age discrimination isn’t a thing.
'Learn to code' is great advice for anybody. If you're a biology major and need to check the world molecules database (forgot the name, sorry), being able to write your own query goes a long way despite the no-code solutions.
It's mainly #1. For 20 years now we've been hearing non-stop about how computer science is this magical major where anyone can sleepwalk out of college into a 150k job. Parents have been pushing their kids into it whether they are interested or not. Colleges have been taking advantage by pushing sub-par programs and boosting graduation rates. The end result is a large number of CS graduates who can't write a for loop in an interview (and will then loudly complain about how the interview process is unfair).
I just spoke with a chem prof who said that a lot of PhD students in the degree sign up not because they want to do science but because of the salary bump the degree provides in industry.
I guess that is a natural dynamic in our economic/belief system, in which all central planning must be inherently bad, so we always pay the on-demand price instead of the bulk price, and every mis-timing has to cost a lifetime of being wrong afterwards…
Do you have a link to the post (or posts) from Michael O. Church? I have a vague recollection of the idea but I would like to reread it with what I know today!
I actually think some of big tech cough Apple cough is a decent short right now. I wanted to do it back in December but it's hard to bring yourself to short the largest companies like that.
Tesla, both the company and the stock, is pretty complicated. I certainly wouldn't short it right now.
The problem with many of these tech companies is that they've been so successful at abusing their users that they've quit putting energy into developing their products. HP and Sonos are two good recent examples of how this ends.
Tesla doesn't seem to be doing that right now. The big thing you'd be betting on (long or short) is how successful the robotaxi and Optimus will be. I'm not optimistic about either of those (the robotaxi seems like it should be practical; it's more about the particular execution), but I also wouldn't be willing to bet against them.
“Learn to code” ceases to be good advice if too many people do.
Completely disagree. No matter what job you end up with, you will almost certainly be able to do it a bit better if you know how to code. Knowing how to code is basically always a plus when applying for a job. However "just learn to code a little bit, and nothing else" is probably bad advice.
Everyone has finite amounts of ‘shits’ to give (albeit some activities multiply instead of subtract on that front!). If they spend it on coding instead of something else, hopefully it was worth it eh?
> 1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
"Learn to code" was the scam to address the so-called "skills shortage" BS in programming. Even worse, the skills that was pushed were also the most automatable: HTML, CSS and especially Javascript just to get $250k roles which was the most unsustainable ZIRP era to happen.
Now you won't see the influencers screaming about web developer roles given the current massive flush in those who joined because of the $$$ just to rearrange a <div> or adding accessiblity styling for 6 figures.
‘Skills shortage’ is similar to complaining about STEM shortages. It’s mostly BS.
The complaint isn’t about n people not being available, it’s about n people not being available for x low price, or z terrible working conditions.
No matter how cheap or how widely available, some folks will still complain because for some folks, even if they had to pay $0, it still would be ‘too much’ if people also demanded human rights.
It’s similar to the ‘where have all the good men gone’, or ‘why don’t people want to work anymore?’, etc. complaints.
STEM is fairly meaningless in an employment context because biology/chemistry/math undergrads are generally in a different category than at least some engineering grads. And it's actually reasonable to think that those engineering grad salary expectations should be roughly in the ballpark of other professionals. They certainly used to be.
> Every kid with a laptop thinks they're the next Zuckerberg, but most can't debug their way out of a paper bag
I feel like I've seen this quote many times over the years.
Also, how do they calculate employment rate? If you get a job at McDonald's while having a civil engineering degree or nutrition science, that counts as employed as well, no?
Would be good to see how many are actually employed in their field of study
> If you get a job at McDonald's while having a civil engineering degree
That would be under_employment (vs un_employment).
Un_employment refers to people actively seeking work but unable to find it, while under_employment encompasses individuals who are working but not fully utilizing their skills or working fewer hours than they would like
Bad term. If I get employed as a quantitative trader on Jane Street after completing a philosophy major am I underemployed because I'm not writing papers on ontology? Why do other people get to say what my "full utilization" is without even knowing me?
Underemployment as "not working as many hours as you'd like" is the standard definition, and that one actually does seem to respect people's interiority.
> Bad term. If I get employed as a quantitative trader on Jane Street after completing a philosophy major am I underemployed because I'm not writing papers on ontology?
No, not by the common definition of underemployment. You're not over-qualified to work at Jane Street and presumably you want to work there.
But it would be worth tracking if you wanted to work in academia and ended up at Jane Street. It's about measuring labor demand vs. supply, because labor supply is difficult to measure over time (because people don't just sit forever waiting for a job in their field to open).
> Underemployment as "not working as many hours as you'd like" is the standard definition
These are related concepts and tracked for similar reasons. You're "not working as many hours as you'd like at a job you're qualified for and would like to have". The number of hours you're working at that desired job is 0, and you're replacing it with some undesired job instead.
I maintain it is patently silly to use any definition of underemployment that can expand to include a full-time quantitative trader on Jane Street, even theoretically, but I respect your commitment to the bit.
If you want to get technical and read the small print, in this study "the underemployment rate is defined as the share of graduates working in jobs that typically do not require a college degree"
Since most people working at Jane Street have a college degree, you would not be considered 'underemployed' in this particular study.
(I think you are right to ask if a survey can accurately capture "underemployment", there are many problems with the definition and how to capture the right information to measure it.)
You're darn right I am right to question it. These questions only leave me further convinced this is a bad term.
"""
93. Would you rather have a job more closely related to your education, training and experience?
94. Considering your education, training and experience, do you feel that you are overqualified for your current job?
95. Considering your education, training and experience, do you feel that you have been overqualified for most of your jobs?
"""
93 is not a question I suspect most people answer faithfully. Because most people with tertiary education could probably find such a job - but it would be at a substantial pay cut. Yet the angle of compensation is nowhere to be found in the question itself.
94 is subject to the same bias that makes 90% of people think they're in the top 50% of driving, parenting, lovemaking and/or karaoke.
95 has that same issue, but also brings in a narcissistic wound aspect to it. No, of course you're better than all of those hams, shams, and japeths who you worked with/under/over through the years.
These numbers are all hard to measure. I only more or less worked in my engineering field of study (not CS) for 3 years but, other than going back to grad school for 2, was never underemployed by any serious definition of the term.
I left the software engineering field about 17 years ago to become a high school teacher. One of the things I taught was computer science (to high schoolers) and I recall sitting in many meetings of HS CS educators discussing the upcoming critical shortage of workers with CS degrees. I would tell them I left the field because there wasn't much work, and they would look at me like I was crazy. "Something wrong with that guy... He can't find work when there's a CRITICAL SHORTAGE of workers!!"
It’s a boom/bust profession that has been in a long boom.
Many of the big companies that have been on hiring orgies are advertising-dependent. Ads are the thing that gets slashed heading into a bad economy, and we’re in an economic mess that is going to get a lot worse.
20ys ago: you must study CS it's in high demand RN!
10ys ago: don't even apply w/o a master's degree!
2ys ago: sorry we're full!
1 week ago: you must study ML it's in high demand RN!
There are so many graduates who are not worth the paper their degree is printed on that it would be laughable (if it weren't sad).
That's a good part of the reason why hiring processes are so long and you need to re-check everything people are supposed to know. Filtering out hundreds of candidates to get a mediocre one at best, thousands to get a really good one.
There are job openings, but just having a piece of paper is not enough to get to those.
AI tools have made recruiting a miserable experience for everyone involved, there's so much cheating in applicants and you waste so much time filtering those out and sadly, good candidates sometimes get lost in the noise.
Networking is what has the highest signal-to-noise ratio. A good recommendation from someone you trust helps a lot, but it penalizes people who are just starting their careers and have smaller networks.
"Despite computer science being ranked as number one by the Princeton Review for college majors, the tech industry may not be living up to graduates' expectations.
When it came to undergraduate majors with the highest unemployment rates, computer science came in at number seven, even amid its relative popularity.
The major saw an unemployment rate of 6.1 percent, just under those top majors like physics and anthropology, which had rates of 7.8 and 9.4 percent respectively.
Computer engineering, which at many schools is the same as computer science, had a 7.5 percent unemployment rate, calling into question the job market many computer science graduates are entering.
On the other hand, majors like nutrition sciences, construction services and civil engineering had some of the lowest unemployment rates, hovering between 1 percent to as low as 0.4 percent.
This data was based on The New York Fed's report, which looked at Census data from 2023 and unemployment rates of recent college graduates."
This is nothing (it will get worse) compared to what will happen in 2030.
Just look at what has happened in the last 5 to 6 months since this prediction was made [0]. The definition of "AGI" has been hijacked to mean all sorts of things by the companies that operate the AI systems, who even conflict with each other on timeframes and goals.
But the true definition of "AGI" is the blueprint inside the WEF's Future of Jobs Report 2025 [1], with its deadline of 2030 and its mass layoffs: 40% of employers admit they anticipate reducing their workforce where AI can automate tasks, as I said before [2].
So what AGI actually means is a 10% global unemployment increase by 2030 or 2035, with all those savings going to the AI companies.
> with all those savings going to the AI companies
I'm not even sure those savings will "go" anywhere, they will just stay with the companies. Right now, if I use my $20/mo ChatGPT subscription to automate away my secretary's job ($3,000/mo or whatever), it's not like those $3,000/mo is going to OpenAI. And I don't think in the future they will be able to jack up prices, because foundational LLM models have become a race to the bottom.
Great point! And also an uncomfortable truth: this sort of use (to replace human jobs) will be a net negative on GDP. That secretary will now have to make do with a lot less money, so they'll make fewer/cheaper purchases, etc.
However, the "number go up" crowd doesn't give a fuck about the secretaries -- so they will chant "AI! AI! AI!" to juice the stock and make out like bandits, while they still can.
CS professor with 15 years experience.
The massive boom in computer science enrollment over the last 20 years has been driven mostly by people chasing tech salaries, not by any real interest in computing itself. These students often show up completely unprepared for how difficult CS actually is, and universities have responded by dumbing down their programs to keep everyone happy and enrolled.
If this weeds out the people who are just there for the paychecks, it might actually be a relief to get back to teaching students who genuinely want to learn about computing.
I think CS departments are at least partially responsible for this development. They know that most of the students applying care nothing about Computer Science, have no interest in Computer Science and will never learn Computer Science, yet they keep accepting (and graduating) them. If CS departments actually wanted to teach CS, then they would advocate for setting up a new series of departments/degrees with names like Software Development and Engineering or Application Design and UX, and send most of the students there. Then those who want to learn/teach Computer Science can learn/teach Computer Science without having to deal with a classroom full of people who really don't want to be there.
When education becomes a business, students become your customers. And the last thing you want to do is piss off your customers by failing them.
But if you can sell those customers a better product at the same price, then perhaps everybody will be happier. As it is now, no one is happy. CS staff are annoyed that the students don't want to learn CS, and the students are annoyed since most of the CS they are forced to learn isn't relevant to the web developer job they want to apply for after graduation.
There's an argument that a huge amount of the specific engineering theory that students learn never gets applied in a lot of jobs. I used some for a few years in mechanical engineering, but not really a whole lot. A lot more of the job was common sense in managing projects. And, while I took a programming course (wouldn't call it CS), it probably didn't help me in my job much more than the limited programming I took in high school did.
Well, according to the linked data, 94%+ of CS graduates have well-paying jobs in the industry, so I'm not sure "no one is happy" is accurate.
University departments do not always have the kind of autonomy your post implies. It is common for the university’s central administration to dictate how many students they must let in, how much money they get per student, and hence how many they can fail without going into the red.
What could possibly go wrong making education a for-profit business? /s
None of this has to be for-profit. It just requires the university administration to put its wants, priorities, and head count (!) above the interests of the individual departments and of the students.
Making the whole thing a non-profit or a charity won’t solve this.
Even 20 years ago back when I was in college you had a sizable portion of kids who came in to study computer science thinking it would be fun and games. They were then made to study formal logic in their first semester and debug segfaults in gdb in the next, and by the end of the first year pretty much all of them had switched majors.
Anecdotally I've heard that very few CS programs even use C++ anymore, and schools now favor Python because students find it more accessible.
There's definitely this disconnect between programming and computer science. For better or worse, a lot of top schools don't even really teach programming in their computer science programs. It's something you just pick up on your own.
But I'm not sure that using Python as the specific tool is so bad--based on the MOOC, that's what MIT uses in Intro to Algorithms. It may be better than spending a lot of time on the vagaries of C++, which are certainly relevant to systems programming (though that's probably slowly switching to Rust), if your focus is on algorithms and other design details.
> […] a lot of top schools don't even really teach programming in their computer science programs. It's something you just pick up on your own.
"Computer science is no more about computers than astronomy is about telescopes, biology is about microscopes or chemistry is about beakers and test tubes. Science is not about tools. It is about how we use them, and what we find out when we do."
* https://quoteinvestigator.com/2021/04/02/computer-science/
* https://en.wikiquote.org/wiki/Computer_science#Disputed
Perhaps a trade school would be better if someone wants to focus on 'just' programming.
I think completely separating it like that goes too far in the other direction. The absolute best undergrad classes I had were the two where the lectures were entirely on the theory side, then the projects were practical implementations of the theory.
Totally. One consequence which I'm somewhat ambivalent about is that people with interest in engineering fields other than computer science may not be expected to have the degree of familiarity with the tools of the trade that budding computer scientists do. But, given the popularity of computer science as a major, it's probably inevitable to expect that freshmen at least in top schools have done at least some programming.
Being able to understand computer science and apply it is called software ENGINEERING for a reason, and it's a lot more complicated than 'just programming'.
You sound like a physicist who thinks mechanical engineers are unnecessary because we have physicists and car mechanics.
> You sound like a physicist who thinks mechanical engineers are unnecessary because we have physicists and car mechanics.
Or I sound like someone who recognizes that physics and computer science, and mechanical engineering and computer engineering / programming, are different areas of activity.
Well, essentially you have the language used in the class match the subject being taught.
Python gets language difficulty out of the way of learning a given algorithm. Bonus points for making running-time differences blatant when introducing Big O notation: the kids can actually "feel it" in an in-your-face kind of way.
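A sketch of that classroom moment (my own example, not from any particular course): time a textbook O(n^2) insertion sort against the built-in O(n log n) sort while doubling n, and one column roughly quadruples at each step while the other barely moves.

    import random
    import time

    def insertion_sort(a):                    # the textbook O(n^2) algorithm
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]               # shift larger elements right
                j -= 1
            a[j + 1] = key

    for n in (1_000, 2_000, 4_000, 8_000):
        data = [random.random() for _ in range(n)]
        t0 = time.perf_counter()
        insertion_sort(data[:])               # sort a copy in place
        slow = time.perf_counter() - t0
        t0 = time.perf_counter()
        sorted(data)                          # Timsort, O(n log n)
        fast = time.perf_counter() - t0
        print(f"n={n:5d}  insertion: {slow:6.3f}s  built-in: {fast:.4f}s")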
Systems, it's different, as you say.
Compilers. Different AI courses. And on and on. For each, you may have legitimate reasons to use a different language.
The concern starts to grow when Python is being used across many courses to the exclusion of any other language or technology. That's the issue that's growing across CS departments right now. Couple that with kids who have no interest in learning other languages on their own and voilà! You have an issue with uninterested kids graduating, but now they're also unprepared.
No argument. 6.001, I think it is, goes into sorting algorithms, Big O, etc., and Python seems a pretty good match, especially for an ostensibly intro course. Rust (or C++) is clearly a better match for other purposes and would probably deserve some sort of class for the associated concepts even if the student already sort of knew how to "program."
I tried to study IT (a lighter version of CS) without any prior programming experience, and our first programming language was Java, because apparently with Java you "write once, run anywhere." But when I saw Java's syntax, I was like... this is not happening. That was the golden age of mobile apps, and our focus was on mobile development since the web was not sexy anymore. I'm actually happy that I quit, because all I heard were horror stories about Java mobile development on Android. My real interest is web development, and if I had to choose all over again I would rather learn JavaScript. I can only imagine how hard it is for someone with no prior CS or programming experience to learn computer architecture, assembly, C, and C++ for the first time.
I took the MIT MOOC CS Intro a few years back for a combination of refresher (though I wasn't actually a CS major) and just to kick the tires. Good Lord, that would have kicked my butt back when I was a freshman absent prior programming experience, admittedly when computer programming was much less a norm than it is today.
I had a high school BASIC class but that was about it.
> But when I saw Java's syntax, I was like....this is not happening.
No, you're good, this is the natural reaction of basically all programmers except for those strange beasts known as Java programmers, who believe that verbosity, needlessly complex yet organized in a twisted sense, is nirvana, in the same way the accountant sees the tax return as nirvana. Enthusiasts of many other languages, such as C, Python, or Lisp, will also get a bad taste from Java. Of course there are other gnarly languages, such as APL or SAS.
There seem to be relatively few C/C++ jobs, so they went with what the market wanted (Java). They still exposed you to other types of languages, such as assembly, JS, and even COBOL. I really don't think going with Python except for an intro/cross-major class is a good idea, since it's so simple and there aren't that many jobs with it.
>I really don't think going with Python except for an intro/cross-major class is a good idea, since it's so simple and there aren't that many jobs with it.
Python's ecosystem is massive and there are lots of use cases[0]: data analysis, machine learning, deep learning, and basically all the rest of AI run on Python.
[0] https://lh3.googleusercontent.com/keep-bbsk/AFgXFlJPnxraSopK...
Image that I shared was from my Google Keep account and apparently it was set private afterwards by Google so don't click the link....here is the alternative link: https://media.springernature.com/full/springer-static/image/...
The ecosystem is massive, but most CS jobs when looking at job boards either don't use it or use it just as a secondary (similar to SQL).
If you're saying that all of AI runs on Python, then what's the problem here? Implicitly, students will need to learn Python as part of their AI class.
I recently heard someone say "I've never had a job that didn't involve at least some Python". I think that's true of almost all computer science jobs in the current market.
Python is the primary language for scientific computing and the secondary language for a good number of other tasks.
Yeah, but I'd work it in the same way it's actually used: as a secondary language in one of the data-related classes, such as a database course where you learn SQL and then have the students do calculations on the results in Python. Or do it in one of the courses used as an introduction or for non-CS students minoring in CS (like business majors). The real thing is that if you learn other stuff like Java, Python is super easy to pick up and doesn't need to be formally taught.
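A sketch of what that pairing might look like in such a course (table and numbers invented for illustration), using only the standard library's sqlite3: the SQL does the grouping, Python does the arithmetic on the result set.

    import sqlite3

    # In-memory database standing in for the course's sample data.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("north", 120.0), ("north", 80.0), ("south", 200.0)],
    )

    # The SQL part: group and aggregate.
    rows = conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    ).fetchall()

    # The Python part: calculations on the result set.
    total = sum(amount for _, amount in rows)
    for region, amount in rows:
        print(f"{region}: {amount:.2f} ({amount / total:.0%} of total)")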
I work at a fintech company doing Spring Boot, and the only Python I've seen here is in some CI scripts (which I can see but do not have access to edit; we have a team that maintains CI). Everything else is Java, Kotlin, or JS for some websites, and that's about it. I've heard the AI teams use Python, but while pretty much all the devs in my team know Python, we don't use it at all.
Similar here. One exception is that we had some Lambdas written in Python. Now there is a push to replace them with Go because of the potential money savings from faster execution times.
>I recently heard someone say "I've never had a job that didn't involve at least some Python".
Tell me you've been in the field for less than 10 years without telling me.
Python is a joke language with a joke name. The only reason it ever caught on for AI is that someone wrote a few good math libraries for it in the 2000s, and its rise is entirely incidental to that.
I was taught Python in class twenty years ago. I use/used it for systems roles. I don't use it as much anymore but your comment is complete nonsense.
Are you trying to get banned from hackernews? You can't go around calling people's comments complete nonsense; you called me full of shit last week. Do you even like this community? We try really hard to be civil to each other in this place. https://news.ycombinator.com/newsguidelines.html
While C++ is unquestionably still widely used, outside of legacy and some niches (like embedded) it's not in the top 3 "best" choices. Why teach a more complicated language with a poorer ecosystem and toolchain than is actually needed?
I think there's still value in starting with C and C++, to see where it all came from and how much tooling and DX have improved, but I can't really blame courses for jumping directly to the more useful things.
Maybe it's selection bias because I'm an above average C++ programmer but all three large tech companies I've worked at used C++ very heavily.
The first real language I learned was C++ and I was decent at it (best in class but mostly stuck with console based programs, nothing advanced like GUIs that year). I have never used it at work. If you look at job boards, it's overwhelmingly stuff like Java, JS/TS frameworks, etc. Why teach it when it seems like less than 10% of postings are looking for it? Might as well teach other languages and just learn it if you need it.
Frameworks are the opposite of what universities should be teaching. I had this argument with another classmate when I was there who said they should be doing that. The point of a university is to give students the skills to communicate the ideas needed to use and develop frameworks effectively, not to teach you particular frameworks.
Especially considering that the hot framework you are studying will probably be out of fashion by the time you graduate. If I was paying American college tuition rates and sat in class studying Next.js I'd demand a refund.
> Why teach it when it seems like less than 10% of postings are looking for it?
And we're back to the discussion of what is the point of a University CS education. I would argue that learning something like C++ is important for the same reason something like Lisp or Haskell is important. Not because it will necessarily help you get a job, but because it introduces you to new concepts and a new way of thinking about programming and computation that will be useful no matter what language you end up programming in for a living.
I don't think there are any main concepts that I learned in C++ that weren't also covered in other courses, such as Java, assembly, and COBOL. If we're going to teach different types of languages then they should be truly different, such as the ones just mentioned.
Of course you can teach most concepts in most languages, but when it comes to understanding concepts like stack vs heap, pointers to a value vs the value itself, move vs copy semantics, pass by reference vs pass by value, explicit vs implicit memory allocation and deallocation, and so on, I found C++ pretty useful.
Sure, and those are pretty much all things you learn in assembly with the added benefit of understanding the structure (registers).
University isn't there to teach any languages. But it is there to provide the foundation and theory. If somebody wants practical C++ then go to a further education college.
I think that's what the professor meant when s/he said that schools are "dumbing down" curricula. The amount of Python being taught is a bit concerning even from the perspective of personal economics. If I were paying all that money every term in tuition, I'd want them to teach me the hard stuff, not the language I can learn in a weekend while I'm shirtless on the couch watching GameDay. It's like no one ever stops for a moment and says, "Wait, why am I paying this much money to learn a language that's so easy my English Lit friend already knows it inside out?"
If a school is teaching JavaScript or Python, you kind of already know that the program is more "money grab" than "study of computing technologies".
I don’t think I could disagree more strongly.
College should not be about teaching a specific language. It should be teaching the programming skills needed to pick up any language. Python is just as good as C++ in this regard. In fact, if Python is an easier on-ramp that gets people excited about programming and shows them what's possible before C/C++ crushes their soul, then I say go for it.
In college, I regularly wrote my programs in PHP, a language I had taught myself before college, and then converted them to C to submit my homework/tests. While PHP was obviously much slower to run, it let me iterate and develop faster than my peers.
In fact, I find it borderline fraudulent that so many colleges waste time on a language that most graduates will never use. Python knowledge is way more useful than C++ knowledge in my opinion, especially for a new grad.
Then again, I have a very dim view of college CS programs as a whole. They aren't just fighting the "last war"; they are fighting a war from decades ago. Almost everything I used in my first job was something I taught myself, not something I learned in college. That was one big reason why I dropped out of college my junior year: I wasn't learning anything that was useful for my field. The professors were pedantic and cared about silly things like making sure I put a semicolon at the end of each of the SQL queries I wrote for an exam.
I would argue that Python, being a simple, easy-to-learn language, allows you to focus on other aspects of CS, e.g. complex algorithms. Rather than faffing about with memory management, you can really study algorithms without the language getting in the way (see the sketch below). Of course, this depends upon the professors actually teaching this "hard stuff" :)
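As one concrete example (my own sketch, not from any particular syllabus): Dijkstra's shortest-path algorithm fits in roughly fifteen lines of standard-library Python, and every line is about the algorithm itself rather than allocation or pointer bookkeeping:

    # Dijkstra's shortest paths with a binary heap; no memory management
    # in sight, so the algorithm is the whole lesson.
    import heapq

    def dijkstra(graph, source):
        """graph: {node: [(neighbor, weight), ...]} -> {node: distance}"""
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, node = heapq.heappop(heap)
            if d > dist.get(node, float("inf")):
                continue  # stale queue entry
            for neighbor, weight in graph[node]:
                nd = d + weight
                if nd < dist.get(neighbor, float("inf")):
                    dist[neighbor] = nd
                    heapq.heappush(heap, (nd, neighbor))
        return dist

    example = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
    print(dijkstra(example, "a"))  # {'a': 0, 'b': 1, 'c': 3}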
Having interviewed a number of graduates from code camps, they're definitely just chasing the salary.
Most of them have no actual passion for computing, their scope of knowledge is superficial, and they're asking for six-figure salaries out of the gate.
I had a relatively simple coding assignment (it shouldn't take more than 15 minutes) that I would use to weed out those who were just copying and pasting sample code. It fed in a very large number of values and added a profiling step. The sample code wasn't performant at that scale and was painfully slow unless you made minor adjustments to a few things; think of something in the spirit of the sketch below.
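The parent doesn't give the details, so this is only a guess at the flavor of such a screen: a linear scan over a list that looks fine on toy inputs, versus the one-line "minor adjustment" (building a set) that fixes it at scale:

    # Illustrative only: the naive version degrades badly on large inputs;
    # the fix is a one-liner a copy-paster is unlikely to spot.
    import time

    values = list(range(50_000))
    probes = list(range(49_000, 51_000))

    start = time.perf_counter()
    hits_slow = sum(1 for p in probes if p in values)  # O(n) per lookup
    slow = time.perf_counter() - start

    lookup = set(values)  # the "minor adjustment"
    start = time.perf_counter()
    hits_fast = sum(1 for p in probes if p in lookup)  # O(1) average lookup
    fast = time.perf_counter() - start

    assert hits_slow == hits_fast
    print(f"list scan: {slow:.2f}s, set lookup: {fast:.4f}s")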
Back when we used to interview in person, we used to have a computer and screen in the office running Linux. The number of people who couldn't handle a terminal even to type "ls" was either remarkable or shocking depending on your point of view. We're talking about people who claimed years of Linux experience, applying for Linux programming and administration jobs.
"If this weeds out the people who are just there for the paychecks, it might actually be a relief to get back to teaching students who genuinely want to learn about computing."
It's not going to work that way. I was genuinely interested and took many high-level electives. I felt the program was very good 15 years ago at the school I attended. I also got an MSIS at a different school, but I feel that one was no more advanced than the BS, just a faster pace and, weirdly, less coding. I did well for years at my job. Now it looks like I might lose my job and probably won't get another IT one. I will probably end up working at Walmart or something.
This is how I've been feeling through the whole situation. I got my current job at the bottom of the last slump the other year so it doesn't seem to be affecting me.
Still I've been careful to set my life up so I could go many years without employment if I had to. It's hard to trust the rest of the economy in general.
About 15 years ago when I started my degree there were both the “I want a good job” people and another crowd that I’d describe as having followed a thought process of “I like video games; I want to make video games; I should study comp sci”. At that time at least I think the video game crowd was even less equipped than the job crowd. Not to disparage video games, they are a majority of my free time, but those who were joining the field to _have fun_ are going to have an even harder time than those looking for work.
Lots of discussion about choice of programming language in the comments below.
- In principle, it should not matter at all, but there are practical reasons why one PL may be better than another in a particular school or context.
- But, all this "choice of PL" discussion is really a discussion about CS1. A CS degree has at least seven other courses -- assuming 1 CS course per semester -- and in practice many more than that. So, if you're going to ask questions about CS1, the question to ask is, "Does CS1 setup students to succeed in the advanced courses?" Classically, these were courses in compilers, operating systems, networking, and so on. These days, you can add distributed computing, machine learning, etc. (but don't subtract the classics).
There really is too much hand-holding of university students nowadays. I don't think degrees are really equivalent to what they were thirty years ago. Back then, university was about weeding out the wasters and the lazy folk who didn't study. College courses are meant to be hard for a reason. Don't get me started on that extra-credit crap.
Thanks for sharing. Is this similar to what has been attracting students to medicine (a guaranteed position with a high salary)? But the med-school-to-full-time-physician pipeline is long and can weed people out. CS is a difficult subject; certain ways of thinking are difficult but can certainly be learned, like recursive thinking.
Did colleges expand their computer science departments, or even just create them, to meet the demand for the degree? The pipeline from a CS degree to possible employment is quite short: it doesn't require residency and board certification, so it's a quicker route to a job, but then you are competing for the same positions with peers who have stronger backgrounds and educations, and with seasoned professionals.
Salary isn't traditionally what motivates people to study medicine. It's prestige. The difficulty is part of the prestige, which is probably why they still do things like memorize long lists of cranial nerve names. I haven't heard that they have a problem with dropout rates.
A good CS education only gives you prestige with fellow nerds.
I actually think prestige is a contributing factor for CS as well. People assume you must be smart to be a software engineer, and FAANG companies are prestigious to normal people because they have name recognition. Definitely not on the same level as a Doctor/Surgeon/Lawyer or whatever but certainly could be more than a typical 4-year degree will get you. And I suppose there's also the fact that those companies were viewed very differently 10-15 years ago and now there is a lot more cynicism about big tech in general.
Absolutely, and the _prestige_ of being a CS person that has a high salary. Society admires those with wealth.
Yes, prestige and how others perceive you, but salary and job security are certainly attractors to medicine too. First-hand anecdata from educators and doctors alike supports this.
I started my degree in 1999, and even then this was already a factor. I hope more articles like this are published, and that people who aren't really interested in computing stop choosing this path.
Thank you for saying this, because it feels like the people entering the industry in such numbers over the 2010s completely killed what made this job fun in the first place. I call them "ticket completers". Sure, they can mechanically perform the minimum requirements of the job, but there is zero interest at all in what is actually being done; just following PM directions to the letter with no further thought. The whole spirit of innovation and curiosity and discovery has been lost, replaced by lifestyle seekers who look at you like an insane person when you try to talk about software in the abstract (ha!).
The hackers and nerds will be just fine. They are like gold when we find them now. But if this makes CS "uncool" again, I am all for it.
"The hackers and nerds will be just fine. They are like gold when we find them now."
This is not at all my experience. One of the problems I face is that many of those PMs, and companies in general, want mindless ticket completers. My current job just wants us to grind through the Jira backlog. They have no interest in anything else and will crush it out of you too.
> The hackers and nerds will be just fine. They are like gold when we find them now. But if this makes CS "uncool" again, I am all for it.
Think about how AI can help students cheat nowadays. You could cheat before too, but now a CS-degree seeker can have an AI do the entirety of their school work (with the exception of, say, pen-and-paper tests). Imagine how the quality of new graduates drops with regard to the understanding and abilities you highlight as crucial to being effective in software, and how those who do understand become even more valuable relatively, but perhaps harder to find in the noise.
When most jobs just want you to be a ticket completer, the cheaters will do just fine if they can do it faster. The rest of us will be considered slow and discarded. It's happening to me.
Yes, they can be ticket punchers more easily, since they're kind of trained to do that. But there are certainly levels of achievement that just aren't reachable from a foundation that lacks grounding and true understanding.
Do you mind elaborating here on what is happening to you? It seems worthwhile information to add to the ongoing discussion on this post.
The short of it is that the team just wants high throughput but doesn't care about improving the system health or process efficiency. I tend to consider multiple aspects of the work including those areas. But if you just want someone to turn out tickets, I tend to be slower unless the task is simple or repetitive. I have a disability and graying hair, so my options are limited. I'm going to fail my PIP later this month and I'll probably end up working at Walmart.
Hm, I see. Do you use a coding assistant? Do you see value in keeping up, or is your morale diminishing? Can you change teams, positions, or focus?
I use Copilot, but it isn't that helpful. There is no morale - I'm so burnt out I fantasize about getting hit by a bus. No team will touch me on a PIP.
If you're genuinely feeling suicidal for any length of time, please reach out for help. From your many posts, some of which I replied to before, I believe you work at a soul-destroying place which isn't doing you any good, and I understand you're kinda stuck there cos of health insurance etc, but what I'd say is: try to live for today and not worry too much. If they fire you, f** 'em. I bet your skills are better and more useful than you currently think. If you did get kicked out, you may well find something better. To me there doesn't seem to be evidence you'll be stuck working at Walmart. Maybe you could get a job in local or state government? (OK, not federal, cos that's been gutted.) Not stunningly paid, but perhaps a nicer working environment?
Hackers can love hacking as a hobby outside their job but find the job itself soul-crushing. More likely, the hackers and nerds who can withstand the corporate demand to grind will be left, and the rest get filtered out. Plenty of talented people can end up outside the cog machine, cooking up apps nobody but they care about.
I'd argue it is actually part of a broader trend: the boom in computer science enrollment over the last 20 years has been driven mostly by people chasing a better return on the rising cost of the average four-year degree, and software pays better than the average four-year-degree job. I do think that college being cheaper on average would help at least somewhat with CS being such a popular major.
If you were around in 2000 and 2009, you've seen this before. Our field has ups and downs, and every time we hit the bottom people say it won't come back. It will.
We had some cleaning up to do. I was a hiring manager during COVID and the resumes I saw were unbelievable. People with "web" boot camps were being considered for six-figure salaries. People who had absolutely no business being in this field were being hired.
It was due to the easy money from low interest rates. This field has always had solid salaries, but some people were making a million to sit in meetings and integrate frameworks into me-too websites.
The hammer is coming down and is unfortunately hitting many good people too. But they will recover, while the people who shouldn't be here will move on. Don't get your HVAC repair certification quite yet. Stop complaining about AI and go study it (the hard stuff, not ChatGPT for Dummies).
Well put. And this happened back in 2000 and 2009. I had people who I knew from direct experience to be non-technical slackers tell me about their IT Director jobs. I knew it wouldn't last then, and I'm not surprised now. Just get out of your comfort zone and start looking, and if you are in a bad situation, don't be cowed. If you are truly technical, that is always valued; despite the easy answers from ChatGPT, you must understand what it is telling you to really make use of it.
It's not AI, it's outsourcing that's really killing IT jobs. Even from relatively cheap Eastern Europe, projects are being moved to India, Vietnam, and the Philippines.
I've also been hearing this for 25 years now... Outsourcing also happens in waves.
The source is here https://www.newyorkfed.org/research/college-labor-market#--:...
I don't read too much into the fact that unemployment for nutrition science is at 0.4% - that doesn't mean those people are all working as nutritionists or even in a job that requires a degree. You can see this clearly in the underemployment rate which is 45%+.
Likewise, the top unemployment rate (9.4%) among anthropology majors probably doesn't mean all those people are living under a bridge; a fair number of them are pretty well off, living off their parents, and knew going in that their field doesn't hire millions.
So what to make of IT grads having high unemployment rates (but low underemployment rates! bottom 5 in those)? I feel some more on-the-ground reporting is needed.
The quotes from randos reacting in this article don't really help. "Every kid with a laptop thinks they're the next Zuckerberg, but most can't debug their way out of a paper bag," because debugging (like Zuck!) is computer science, apparently.
> So what to make of IT grads having high unemployment rates (but low underemployment rates! bottom 5 in those)?
That's a very important observation. It's much better to be in a field with a 6% unemployment rate than a 60% underemployment rate (like criminal justice, performing arts, and, surprisingly, medical technicians).
Causes:
1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
2. AI… sort of. It’s a lousy replacement for serious engineering talent, but the bosses are so enticed by reduced labor costs (and reduced employee leverage) that they will keep trying even if the stuff doesn’t work. Expectations are going up, teams are shrinking, and junior roles are vanishing.
3. Reputation collapse. Remember how we dismissed Michael O. Church as a crank? His writing style was grating (and has improved immensely), but he was right about everything, five years before anyone else. In 2009, we were "good rich people" in contrast to Wall Street. Now we're Public Enemy #1 and, while we don't all deserve it, our industry's leadership does. This doesn't hurt big tech companies because they're invincible monopolies, but it has ended the era in which even non-tech companies wanted three or four "data scientists."
> “Learn to code” ceases to be good advice if too many people do.
I believe "learn to code" is a great advice, nonetheless; the skill is highly applicable. The bad idea is thinking that alone will land you a cushy job.
I'll observe that, at a top-rate tech school I'm pretty familiar with, major + computing is a very prevalent option in a lot of the majors. As an undergrad (pre-PCs), I graduated with one computer programming course in FORTRAN and that was pretty much the only time I touched a computer keyboard undergrad. You can't really do that today in engineering/sciences.
Anecdotally, but talking to a lot of people who really have their ears to the ground, the junior roles thing seems to be very real. It probably isn't just AI--with more senior folks probably more available, why hire juniors--but seems to be pretty pronounced (with probably the corollary that bootcamps are probably a bad idea these days). Which isn't a great trend if real.
Five years ago, maybe even two years ago, we were hearing a lot about "age discrimination" and how seniors couldn't get hired.
Maybe it's just a phase?
Or maybe today's juniors are different than the juniors were five years ago. And maybe that's because of AI.
Both things can be true. It can be tough for juniors and tough for seniors who haven’t kept up and are just trying to cruise. Not that age discrimination isn’t a thing.
"Learn to code" is great advice for anybody. If you're a biology major and need to check the world molecules database (forgot the name, sorry), being able to write your own query goes a long way despite the no-code solutions.
It's mainly #1. For 20 years now we've been hearing non-stop about how computer science is this magical major where anyone can sleepwalk out of college into a 150k job. Parents have been pushing their kids into it whether they are interested or not. Colleges have been taking advantage by pushing sub-par programs and boosting graduation rates. The end result is a large number of CS graduates who can't write a for loop in an interview (and will then loudly complain about how the interview process is unfair).
I just spoke with a chem prof who said that a lot of PhD students in the program sign up not because they want to do science but because of the salary bump the degree provides in industry.
I guess that is a natural dynamic in our economic/belief system, in which all central planning must be inherently bad, so we must always pay the on-demand price instead of the bulk price, and every mis-timing mistake has to cost a lifetime of being wrong afterwards…
Do you have a link to the post (or posts) from Michael O. Church? I have a vague recollection of the idea but I would like to reread it with what I know today!
He has a blog at antipodes dot Substack but I am terrified to link it because people who say anything good about him tend to get banned.
He seems to have moved toward CS theory, AI, and literary fiction.
I actually think some of big tech cough Apple cough is a decent short right now. I wanted to do it back in December but it's hard to bring yourself to short the largest companies like that.
Some of big tech (cough Tesla cough) has been ripe for shorting for many years now, but the market can stay irrational longer than you can stay solvent ;)
Tesla, both the company and the stock, is pretty complicated. I certainly wouldn't short it right now.
The problem with many of these tech companies is that they've been so successful at abusing their users that they've quit putting energy into developing their products. HP and Sonos are two good recent examples of how this ends.
Tesla doesn't seem to be doing that right now. The big thing you'd be betting on (long or short) is how successful the robotaxi and Optimus will be. I'm not optimistic about either of them (the robotaxi seems like it should be practical; it's more about the particular execution), but I also wouldn't be willing to bet against them.
There isn’t even a rational argument regarding Tesla stock. Trying to do so is folly. It’s a meme/cult stock.
What happened to Michael O. Church? I enjoyed his writings
Here you go:
https://news.ycombinator.com/item?id=10017538
He wasn’t wrong.
> “Learn to code” ceases to be good advice if too many people do.
Completely disagree. No matter what job you end up with, you will almost certainly be able to do it a bit better if you know how to code. Knowing how to code is basically always a plus when applying for a job. However "just learn to code a little bit, and nothing else" is probably bad advice.
Everyone has finite amounts of ‘shits’ to give (albeit some activities multiply instead of subtract on that front!). If they spend it on coding instead of something else, hopefully it was worth it eh?
> 1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
"Learn to code" was the scam to address the so-called "skills shortage" BS in programming. Even worse, the skills that was pushed were also the most automatable: HTML, CSS and especially Javascript just to get $250k roles which was the most unsustainable ZIRP era to happen.
Now you won't see the influencers screaming about web developer roles, given the current massive flush of those who joined because of the $$$ just to rearrange a <div> or add accessibility styling for six figures.
"Skills shortage" isn't BS. But minimal front-end learnings aren't the fix.
‘Skills shortage’ is similar to complaining about STEM shortages. It’s mostly BS.
The complaint isn’t about n people not being available, it’s about n people not being available for x low price, or z terrible working conditions.
No matter how cheap or how widely available, some folks will still complain because for some folks, even if they had to pay $0, it still would be ‘too much’ if people also demanded human rights.
It’s similar to the ‘where have all the good men gone’, or ‘why don’t people want to work anymore?’, etc. complaints.
STEM is fairly meaningless in an employment context because biology/chemistry/math undergrads are generally in a different category than at least some engineering grads. And it's actually reasonable to think that those engineering grad salary expectations should be roughly in the ballpark of other professionals. They certainly used to be.
Who the fuck is Michael O. Church?
The important thing here isn't who he is, but how he was treated when he presented the idea that techies were going to lose their reputation.
Sometimes it's best to begin with the aftermath: https://news.ycombinator.com/item?id=10017538
> Every kid with a laptop thinks they're the next Zuckerberg, but most can't debug their way out of a paper bag
I feel like I've seen this quote many times over the years.
Also, how do they calculate the employment rate? If you get a job at McDonald's while holding a civil engineering or nutrition science degree, that counts as employed as well, no?
It would be good to see how many are actually employed in their field of study.
> If you get a job at McDonald's while having a civil engineering degree
That would be _under_employment (vs. _un_employment).
_Un_employment refers to people actively seeking work but unable to find it, while _under_employment encompasses individuals who are working but not fully utilizing their skills or working fewer hours than they would like.
Bad term. If I get employed as a quantitative trader on Jane Street after completing a philosophy major am I underemployed because I'm not writing papers on ontology? Why do other people get to say what my "full utilization" is without even knowing me?
Underemployment as "not working as many hours as you'd like" is the standard definition, and that one actually does seem to respect people's interiority.
> Bad term. If I get employed as a quantitative trader on Jane Street after completing a philosophy major am I underemployed because I'm not writing papers on ontology?
No, not by the common definition of underemployment. You're not over-qualified to work at Jane Street and presumably you want to work there.
But it would be worth tracking if you wanted to work in academia and ended up at Jane Street. It's about measuring labor demand vs. supply, because labor supply is difficult to measure over time (because people don't just sit forever waiting for a job in their field to open).
> Underemployment as "not working as many hours as you'd like" is the standard definition
These are related concepts and tracked for similar reasons. You're "not working as many hours as you'd like at a job you're qualified for and would like to have". The number of hours you're working at that desired job is 0, and you're replacing it with some undesired job instead.
I maintain it is patently silly to use any definition of underemployment that can expand to include a full-time quantitative trader on Jane Street, even theoretically, but I respect your commitment to the bit.
If you want to get technical and read the small print, in this study "the underemployment rate is defined as the share of graduates working in jobs that typically do not require a college degree"
Since most people working at Jane Street have a college degree, you would not be considered 'underemployed' in this particular study.
It depends if you want to be a philosopher or not. The surveys do actually ask this question, see questions #93-95 here for example: http://gpiatlantic.org/pdf/communitygpi/communitypart2b.pdf
(I think you are right to ask if a survey can accurately capture "underemployment", there are many problems with the definition and how to capture the right information to measure it.)
You're darn right I am right to question it. These questions only leave me further convinced this is a bad term.
""" 93. Would you rather have a job more closely related to your education, training and experience?
94. Considering your education, training and experience, do you feel that you are overqualified for your current job?
95. Considering your education, training and experience, do you feel that you have been overqualified for most of your jobs? """
93 is not a question I suspect most people answer faithfully. Because most people with tertiary education could probably find such a job - but it would be at a substantial pay cut. Yet the angle of compensation is nowhere to be found in the question itself.
94 is subject to the same bias that makes 90% of people think they're in the top 50% of driving, parenting, lovemaking and/or karaoke.
95 has that same issue, but also brings in a narcissistic wound aspect to it. No, of course you're better than all of those hams, shams, and japeths who you worked with/under/over through the years.
These numbers are all hard to measure. I only more or less worked in my engineering field of study (not CS) for 3 years but, other than going back to grad school for 2, was never underemployed by any serious definition of the term.
I left the software engineering field about 17 years ago to become a high school teacher. One of the things I taught was computer science (to high schoolers) and I recall sitting in many meetings of HS CS educators discussing the upcoming critical shortage of workers with CS degrees. I would tell them I left the field because there wasn't much work, and they would look at me like I was crazy. "Something wrong with that guy... He can't find work when there's a CRITICAL SHORTAGE of workers!!"
It’s a boom/bust profession that has been in a long boom.
Many of the big companies that have been on hiring orgies are advertising-dependent. Ads are the thing that gets slashed heading into a bad economy, and we're in an economic mess that is going to get a lot worse.
20ys ago: you must study CS it's in high demand RN! 10ys ago: don't even apply w/o a master's degree! 2ys ago: sorry we're full! 1 week ago: you must study ML it's in high demand RN!
It's old news that markets are dynamic.
Depends on who you ask. It's news to graduates who find themselves w/o a job and zillions in debt.
Regular people, not really interested in tech, chasing "easy tech salaries" and coming up against the fact that it's a hard industry.
There are so many graduates who are not worth the paper their degree is printed on that it would be laughable if it weren't sad.
That's a good part of the reason why hiring processes are so long and you need to re-check everything people are supposed to know. Filtering out hundreds of candidates to get a mediocre one at best, thousands to get a really good one.
There are job openings, but just having a piece of paper is not enough to get to those.
AI tools have made recruiting a miserable experience for everyone involved, there's so much cheating in applicants and you waste so much time filtering those out and sadly, good candidates sometimes get lost in the noise.
Networking is what has the highest signal to noise ratio. A good recommendation from someone you trust helps a lot, but it penalizes people just starting their careers and have smaller networks.
It's a sad state of affairs.
This is inevitable with any boom/bust cycle. Employment in tech just can’t possibly grow exponentially forever.
"Despite computer science being ranked as number one by the Princeton Review for college majors, the tech industry may not be living up to graduates' expectations.
When it came to undergraduate majors with the highest unemployment rates, computer science came in at number seven, even amid its relative popularity.
The major saw an unemployment rate of 6.1 percent, just under those top majors like physics and anthropology, which had rates of 7.8 and 9.4 percent respectively.
Computer engineering, which at many schools is the same as computer science, had a 7.5 percent unemployment rate, calling into question the job market many computer science graduates are entering.
On the other hand, majors like nutrition sciences, construction services and civil engineering had some of the lowest unemployment rates, hovering between 1 percent to as low as 0.4 percent.
This data was based on The New York Fed's report, which looked at Census data from 2023 and unemployment rates of recent college graduates."
Source:
https://www.newyorkfed.org/research/college-labor-market (requires Javascript)
Data: (no Javascript required)
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
Civil Engineering 1.0%
Aerospace Engineering 1.4%
Mechanical Engineering 1.5%
Chemical Engineering 2.0%
Electrical Engineering 2.2%
General Engineering 2.4%
Miscellaneous Engineering 3.4%
Computer Science 6.1%
Computer Engineering 7.5%
This is nothing (it will get worse) compared to what will happen in 2030.
Just look at what has happened in the 5 to 6 months since this prediction was made [0]. The definition of "AGI" has been hijacked to mean all sorts of things by the companies that operate the AI systems, which even conflict with each other on timeframes and goals.
But the real, working definition of "AGI" is the blueprint inside the WEF's Future of Jobs Report 2025 [1], with its deadline of 2030: mass layoffs, with 40% of employers admitting they anticipate reducing their workforce where AI can automate tasks, as I said before [2].
So what AGI actually means is a 10% global unemployment increase by 2030 or 2035, with all those savings going to the AI companies.
[0] https://news.ycombinator.com/item?id=42490692
[1] https://www.weforum.org/publications/the-future-of-jobs-repo...
[2] https://news.ycombinator.com/item?id=42652402
> with all those savings going to the AI companies
I'm not even sure those savings will "go" anywhere; they will just stay with the companies. Right now, if I use my $20/mo ChatGPT subscription to automate away my secretary's job ($3,000/mo or whatever), it's not like that $3,000/mo is going to OpenAI. And I don't think they will be able to jack up prices in the future, because foundational LLM models have become a race to the bottom.
Great point! And also an uncomfortable truth: this sort of use (to replace human jobs) will be a net negative on GDP. That secretary will now have to make do with a lot less money, so they'll make fewer/cheaper purchases, etc.
However, the "number go up" crowd doesn't give a fuck about the secretaries -- so they will chant "AI! AI! AI!" to juice the stock and make out like bandits, while they still can.
90% of them are fullstack Node/React devs i think.