Hacker News — bluefirebrand's comments

I could "write" this code the same way, it's easy

Just copy and paste from an open source relational db repo

Easy. And more accurate!


It is a Rust reimplementation of SQLite. Not exactly just "copy and paste"

The actual task is usually to mix what looks like a dozen different open source repos combined, but taking just the necessary parts for the task at hand and adding glue / custom code for the exact thing being built. While I could do it, an LLM is much faster at it, and most importantly I would not enjoy the task.

> it is hard to square this with people who are also saying not to worry about AI displacement because there's limitless demand for software.

Well, that's easy to square: the idea that there is limitless demand for software is nonsense. Pure fiction


Companies take your unemployment length as a negative signal

"He's been unemployed for 13 months? Why doesn't anyone want to hire him? Must be something wrong with him"


It's typically easiest to find a job when you have a job.

Maybe. Probably? But I also sense a fallacy here. I could get a new job tomorrow. Maybe it took me 8 years to find that job and I didn’t realize that because I was employed the whole time.

Does that make sense?


People wonder why something was picked over before committing to it, that's all it comes down to

Focus on what you can control, and you can control the perception of that. If you are interested in money, professional validation, and corporate structure, go that way.

You can try to alter the culture's fundamental assumptions when you're done.


I don't think it's very silent?

Maybe people just have their fingers in their ears but this has been a problem for years now


> I've worked in specialized fields where it takes YEARS for the right candidate to even start looking for jobs. You need to have the job listings up and ready

If this is true then those shouldn't even be public job postings. That sort of critical position is for headhunters


> If this is true then those shouldn't even be public job postings.

Why? Not everyone is on LinkedIn or has an updated profile.

Some of the best candidates I've hired were people who were in other states who were planning to move, but waiting for the right job opportunity to come up.

We also used recruiters.

Why does it make people so angry that we posted job listings for real jobs that we were really hiring for?


If your real candidate pool is so small that you're effectively targeting a handful of people worldwide then you aren't "really hiring"

Yet somehow we really did hire people.

If only we had listened to HN comments and given up instead


Or... your company should be training potential replacements. This is what the US military and "white shoe" consulting companies do. While expensive, it guarantees that critically needed skilled staff are always available.

I recommend the article "Up or Out: Solving the IT Turnover Crisis" [0] which gives a reasonable argument for doing exactly that.

Notes:

0 - https://thedailywtf.com/articles/up-or-out-solving-the-it-tu...


> In 2025, the US exported more crude oil and petroleum products

Another way to read this is "The US is still producing enough crude oil and petroleum to be a net exporter of these products"


> The point is to solve problems, right?

I want to solve problems correctly and in a high quality manner.

LLMs do not enable me to do this any better in my opinion. They enable me to do it faster* but worse

* I'm not entirely sold on it being actually faster either


There’s a place for everything.

Most coding tasks take place outside of pure tech companies, if I’d venture a guess.

And let’s be honest, enterprises in general do not value that quality - and they face very little in terms of technical challenges that can’t be solved by code on stack-overflow or github.

What most enterprises lack is knowledge about themselves though - this is more a business problem than a technical one however.


Yeah. It remains an open question if they'll be a net positive in the end. If they end up being helpful, good. And if they end up being mostly a waste of our time, then we'll go back to where we were. More or less.

It seems like there's still some juice to squeeze from this technology, though. So my money is on a net positive. For now.

Even if they end up being bad practice for production code, we can probably agree that they're decent at mock-ups, experimentation, and quick proof of concepts. That at least has some value.

I see a lot of people with zero coding/engineering experience trying to make their own products, and very few of those products with long term staying power. We've got a long way to mature with how we use this tech. This moment feels like the introduction of the home microwave: A lot of terrible, terrible meals were cooked while people briefly forsook their stove to use the miraculous microwave for everything. Eventually people figured out what tasks the microwave was suitable for and then went back to the oven for all but simple re-heating.


I've been feeling this a lot

I don't know what I'm going to do next but I have very little interest in or respect for LLM assisted coding, so I think the industry is likely not a good place for me anymore

If I could retire I would but I'm not quite 40. I don't have the savings to stop working now. So I'll figure out something new I guess

What a bummer. I loved my career to this point and I'm very sad that LLMs have ruined it

I guess I'm glad I'm not alone in this


What have they ruined, exactly? You can do all the same things you used to, can’t you?

You can't get paid doing them.

If coding goes away, decades of experience become worthless instantly. Not all of it, but the vast majority, enough to justify starting over in another career.

In that world, it will have become more cost-effective for most companies to spend most of their budget on inference vendors and employ a few low-paid LLM wranglers, even if the final output is of terrible quality. No point in competing for that kind of employment experience with that kind of pay.


I really don't get this point of view at all. I acknowledge that two years into my quarter century of experience, most of what I knew was easily replaceable by the AI of today. After two decades of experience, however, syntax and specific algorithm and language knowledge was perhaps 10% of my value, nowhere near the vast majority.

The idea that low-paid LLM wranglers are going to push out the experienced engineers just doesn't wash. What I think is much more likely to happen is that the number of software engineers greatly reduces, but the remaining ones actually get paid more, because writing code is no longer the long pole, and having fewer minds designing the system at a high level will allow for more cohesive higher-level design and less focus on local artisanal code quality.

To be honest, AI is just the catalyst and excuse for correcting the overhiring that happened due to the gold rush of the last 20 years: the internet and smartphone revolutions, zero interest rates, and the pandemic effect.


> language knowledge was perhaps 10% of my value, nowhere near the vast majority.

Do you not see LLMs catching up with your experience fast?

You might not lose your job, but you'll definitely have to take a pay cut


> What I think is much more likely to happen is the number of software engineers greatly reduces, but the remaining ones actually get paid more.

You realize that this is contradictory, right? If the number of competitors remains the same, yet there are far fewer jobs, it's a buyer's market: companies have to offer very little to find someone desperate enough.

> It will allow for more cohesive higher-level design, and less focus on local artisanal code quality.

I don't buy this, LLM code is extremely bloated. It never reuses abstractions or comes up with novel designs to simplify systems. It can't say no, it just keeps bolting on code. In a very very abstract sense you might be right, but that's outside the realm of engineering, that's product design.


You raise some good points about the economics, that's where I feel the least confident, but let me explain my reasoning.

Software has eaten the world, and thus the value of maintaining software has never been higher. Engineers are the people who understand how software works. Therefore, unless we move away from software, the value of software engineering remains high.

AI does not reduce software; it increases the amount of software, makes messier software, and generally increases the surface area of what needs to be maintained. I could be wrong, but as impressive as LLMs' language and code processing capabilities are, I believe there is a huge chasm between the human intent of systems and their implementation that will likely never be crossed, one that only human engineers can actually bridge. And even if I'm wrong, there's another headwind: as Simon Willison has pointed out, you can't hold an LLM accountable, and therefore corporate leaders are very unlikely to put AI in any position of power, because all the experience and levers they have for control are based on millennia of evolution and a shared understanding of human experience; in short, they want a throat to choke.

The other factor is that while AI can clearly replace rote coding today, I think the demos oversell the utility of that software. Sure, it's fine to get started, but you quickly paint yourself into a corner if you attempt to run a business on that code over time, where UX cohesion, operational stability, and data integrity are paramount and not something that can be solved for without a lot of knowledge and guardrails.

So net of all this, where I think we land is that a lot of jobs based purely on knowledge of one slow-changing system and specific code syntax will go away, but there will be engineers who maintain all the same code; they'll just cover more scope with LLM-assisted tools. You put your finger on something: I do believe this moves engineering closer to product design. But I still think there's a huge amount on the engineering side that LLMs won't be able to do any time soon (for both the technical and the social reasons stated above), and ultimately I don't see the boundary the same way you do; as software engineers we have always had to justify our systems by their real-world interaction.


> Software is everywhere and thus the value of maintaining software and the value of software engineering remains high.

This is an unfinished argument. What if we get coding agents to maintain software? What if frequent rewriting becomes cheap enough? Something that's a tenth or one hundredth of your salary doesn't have to be good to make for a good business decision. Why do you think every native application has been replaced by slop made up of 10 layers of JS frameworks on top of electron? Nothing matters as long as the product is cheap and fast to pump out, barely works on modern hardware, and makes dough.

> AI does not reduce software, it increases the amount of software.

There's not infinite demand for software. If AI inference costs take 50% of the prior payroll expenses, while making a company twice as efficient, that means we need 4 times as much demand in software engineering at the same salary for everyone to keep their job. What new or improved subscription, app, website, device, or other software product does the world need right now? 99.9% of people use the same 5 apps. Most of their free time, attention, and disposable income has already been captured by trash that is unbeatable due to network effects. Are we all going to sell shitty LLM frontends to businesses until they notice they could have done the same thing themselves? There might be an explosion in new software, but no one there to care about using it.

> I believe there is a huge chasm that will likely never be crossed between the human intent of systems and their implementation that only human engineers can actually bridge.

Maybe, or the AI might just be missing context. Think of all the unwritten culture, practices, and conversations the LLM hasn't been made aware of.

> In short they want a throat to choke.

You're responsible for those under you anyway, this doesn't help. Banking on those in charge being irrational forever in a way that is bad for business, and without ever noticing, is a bad gamble.

> The other factor is that while AI can clearly replace rote coding today [...], X is not something that can be solved for without a lot of knowledge and guardrails.

I'm talking about the world the AI-maximalists predict is rapidly approaching, not where we are today. None of that knowledge and none of those guardrails are hard to grasp intellectually, compared to advanced mathematics for example. Put your institutional knowledge in a .md file and add another agent that enforces guardrails in a loop. The only way out I see is a situation where there are complex patterns that we intuitively grasp, but can't articulate. Patterns that somehow span too much data or don't have enough examples for LLMs to pick up on.

> There will be engineers who maintain all the same code, they'll just cover more scope with LLM assisted tools.

So fewer jobs with lesser qualifications?

> Ultimately I don't see the boundary the same way you do, as software engineers we have always had to justify our systems by their real world interaction.

I've seen the way engineers design products, and I like products designed by engineers, but no layperson does. Laypeople don't want power, privacy, or agency. They don't care about how things work, and they lie to themselves and others about what they really want. They don't want a native desktop app that streams high-quality audio from a self-hosted collection; they want a subscription that autoplays algorithmic slop through a React Native app on their iPhone. This example only applies to mass-market software, but I'm sure it's not much different in other fields. Do you really think you're better at appealing to/fleecing customers than people with actual UX, marketing, and behavioral psychology experience? Engineers keep thinking they could do everyone else's job, but they don't do so well in practice.


I'm sort of shocked at how little of my argument seemed to land with you in any way. I'm wondering how many cycles of software hype have you been through? Were you here for the PC revolution, the .com era, smartphone mass adoption?

There's a lot of what-ifs, and worst case scenarios in your reply that I simply don't find likely. I am not drinking the koolaid from the AI maximalists or the doomers. I could be wrong of course, no one can predict the future, but to me the very real, novel and broad utility of LLMs that we are just learning to harness combined with the investment outlook are leading to a mania that has people overestimating where things will land when the dust settles. If I'm wrong then I guess I'll join the disenfranchised masses picking up pitchforks, but I'm not going to waste time worrying about that until I see more evidence that it's actually going that badly.

So far what I see is that software engineers are the ones getting the most actual utility out of AI tooling. The reason is that it still requires precision of thought and specificity to get anything sustainable out of AI coding tools. Note this doesn't mean that engineers can design better apps than proper designers; rather, my point is that designers and other disciplines still can not go much further than prototypes. They still need engineers to write the prompts, test the output, maintain the system, and debug things when they go wrong. I have worked long enough with large cross-functional teams to know that the vast majority of folks in non-engineering functions simply can not get enough specificity and clarity into their requests to let an LLM turn them into a working system that holds up over time. They will hit a wall very quickly where new features add bugs faster than they improve things, and the whole thing collapses under its own weight like a mansion of popsicle sticks.

And by the way, I don't consider AI-assisted coding to require less qualification than regular coding. Sure, you don't need to know as much syntax or as many algorithms, but you absolutely need data modeling, performance, reliability, debugging, consistency, and migration knowledge in order to use AI to contribute to any software that powers a real business. And yeah, you might need to develop your product and business sensibilities, but to me that's what's been happening throughout the history of computing. Wiring up ENIAC certainly required qualifications that were not needed for assembly programming, which in turn required things that C programmers did not need, and so forth; harnessing the increasing compute power and complexity required new qualifications. I don't think AI will ultimately be that different. It will change the way we work; it doesn't replace what senior engineers do.


> What I think is much more likely to happen is the number of software engineers greatly reduces

So you just believe you'll be one of the ones left behind?

Best of luck to you


I'm not spending my precious time on this earth reviewing code from my coworkers that they couldn't be bothered to write without using LLMs

And that's really just the tip of the iceberg. LLM usage metrics being introduced by management to ensure the licenses they pay for are being used. New productivity metrics that require LLMs and low standards to reach, and that's before we even get into my ethical problems with the technology

So, yes. Their existence is ruining my love for technology


My coworkers have started writing code with Copilot. It's kind of okay, but also not really.

I've been enjoying teaching them how the things they're producing with LLMs work, because they have no idea and constantly break their builds because of it. At the same time it helps me improve my craft, because I get to refine the bits of which I don't have full understanding, as well as see some implementations I wouldn't have voluntarily chosen previously, which allows me to explore their benefits and limitations. LLMs actually make this process slightly less painful, because at least now when I send them away to work for the day they have something to review when we next meet, versus pre-LLM days when they would basically have written nothing because they were stuck.

I still don't use LLMs to code beyond whatever search providers auto-provide when I'm looking up documentation. I don't think I'm good enough to use them. Maybe one day. But for now I don't have to, because I'm not facing the breadlines for writing things myself.


I mean, if it’s that bad they’ll be released from service at your company, and you’ll be recognized as superior. This only helps you.

In my experience management only cares about velocity, not quality. I believe this is pretty universal across the industry

When hand writing code we could strike a tolerable balance between quality and velocity. With LLM coding we cannot. Velocity is high, quality is low. I don't believe there is any fixing that despite what the many LLM coding shills on this website would have you believe


I've been feeling it too.

Up until a few weeks ago, I've been able to successfully avoid using AI at work. But then mandates happened and now I'm being forced to use them. Absolutely no guidance from leadership though. "Just figure it out amongst yourselves". Other folks in the company have similar reservations but I feel I'm the only one who has very strong feelings about it in the "morality" and ethical sense. I just can't ignore what the tech is built upon and is doing to other people. All so people like us can open 20 PRs in a day. PRs that don't even get merged because no one can keep up with reviews. For tickets that before would've been labeled as "not worth it". For a job that wasn't even that hard to begin with.

Funny thing is that 1-2 years ago when it was all still new I was more open minded. It was a shiny new tool and naturally I would like to try it out. I was one of the first finding potential use-cases for the team. But the more I looked and learned about it the more I hated it.

And I am not a Luddite. Before all of this, I would personally spend my _free_ time using and reading about random and often obscure tools and languages like Lisp, Clojure, Slackware. I'd spend hours curating my Emacs config. I was learning "k8s the hard way" back when it was the hottest tech thing. Does that sound like a Luddite?

I don't have the privilege to just pivot to another career so I have no choice but to stick it out. My only consolation is that when my kids are working age in a couple of years, when they ask me what the fuck happened, I can look them in the eye and honestly say that I did what I can and I did not cheer it on.


Are you replying to the right post? This post is about someone excited to learn again.

Straight up, why aren’t you excited to learn?


I view using LLMs as the absolute opposite of learn which is why I'm turned off of them

I think I misread this post, though. I initially read it as someone who was excited to leave software and learn something new.

Your post made me re-read it, now I'm not sure. Maybe the author is excited to learn a new LLM based workflow. If so, you're right that I have nothing in common with how they feel.


Why have cars become so expensive?

My understanding is that the used car market was gutted by "cash for clunkers" style government programs

Used to be more used cars on lots, so used cars were more affordable

May not be the whole story but it seems likely it played a part


Very true, but you're a decade or two late for that. IIRC Cash4Clunkers put like a $3k floor on used car value (~$5-7k in today's dollars) meaning you'd never sell your old car for $2k to an individual when you could sell it to the government for $3k.

Per google it was started in 2009, which means any car worth less than $5k around 17 years ago isn't materially impacting new or used car prices today.


Huh

I could have sworn the cash for clunkers thing was much more recent. Thank you for the correction


It was like 700,000 vehicles. Enough to have an impact, but we buy like 15 million cars a year, it's gonna smooth out fairly quickly.

And yet even when headlines do start saying that, it will still be stupid to use AI to file your taxes

I'm truly just blown away by how much trust people are putting in these systems. It makes me feel like I'm losing my mind

