In fact, in my opinion, one of the often-overlooked benefits of AI tools is "psychological support". When you're stuck at work, it gives you a push. Even if it's not completely right, it's enough to get you moving. That feeling of "no longer fighting alone at work" matters more than many people think.
bgwalter 4 hours ago [-]
To each his own. I'm completely drained after 30 min of "discussing" with an LLM, which is essentially an overconfident idiot.
Pushes never come from the LLM, which can be easily seen by feeding the output of two LLMs into each other. The conversation collapses completely.
Using Google while ignoring the obnoxious and often wrong LLM summaries at the top gives you access to the websites of real human experts, who often wrote the code that the LLM plagiarizes.
swat535 37 minutes ago [-]
I think a big problem I have with AI right now is that the context window can get messed up and it has a hard time remembering what we talked about. So if you tell it to write code that should do X, Y, Z, it does so on the first request, but then on the next request, when it's writing more code, it doesn't recall.
Second, it doesn't do well at all if you give it negative instructions. For example, if you tell it "Don't use let! in RSpec", it will create a test with let! all over the place.
3ln00b 3 hours ago [-]
I have experienced this a lot; I thought I was alone. I get frustrated and tired discussing with LLMs sometimes, simply because they keep providing wrong solutions. Now I try to search before I ask LLMs, so that I have better context on the problem and know when the LLM is hallucinating.
jdmoreira 1 hours ago [-]
You are discussing with an LLM? Never happened to me and I use LLMs all the time. Why would you need to discuss if you know best? Just tell it what to do and course-correct it. It's not rocket science.
PS: Both humans and LLMs are hard to align. But I do have to discuss with humans, and I find that exhausting. LLMs I just nudge or tell what to do.
neuronexmachina 52 minutes ago [-]
> You are discussing with an LLM? Never happened to me and I use LLMs all the time. Why would you need to discuss if you know best? Just tell it what to do and course-correct it. It's not rocket science.
I often find myself discussing with an LLM when trying to find the root cause of an issue I'm debugging. For example, when trying to track down a race condition I'll give it a bunch of relevant logs and source code, and the investigation tends to be pretty interactive: it'll pose a number of possible explanations/causes, and I'll tell it which one to investigate further, or what new logging would help.
hippari2 1 hours ago [-]
I find it exhausting in a yes-man kind of way, where it does whatever you told it, but just somehow wrong. I think your human case is the reverse.
sfn42 52 minutes ago [-]
Just like a person would. If you want it done right you have to do it yourself. Or you have to tell the LLM exactly how to do it.
Often I find it easier to just do it myself rather than list out a bunch of changes. I'll give the LLM a vague task, it does it and then I go through it. If it's completely off I give it new instructions, if it's almost right I just fix the details myself.
endymion-light 2 hours ago [-]
Are we using the same version of Google? Unless I'm incredibly specific, I mostly see SEO-optimized garbage.
Jensson 2 hours ago [-]
Everyone uses an individualized version of Google; it's not just your history, it's even different by country of origin, etc.
So no, they are not using the same version of Google.
endymion-light 1 hours ago [-]
Well, my individualized version of Google is filled with Medium articles and useless bull. I'd pay good money to switch to this magic working search engine.
matthewkayin 55 minutes ago [-]
If you're serious, I've heard Kagi is an actually good, paid search engine. Haven't tried it myself, though.
ta988 47 minutes ago [-]
It is called Kagi
all2 1 hours ago [-]
I'm glad to hear this. Working with LLMs makes me want to get up and go do something else. And at the end of a session I'm drained, and not in a good way.
uludag 2 hours ago [-]
Totally fair take — and honestly, it’s refreshing to hear someone call it like it is. You’re clearly someone who values real understanding over surface-level noise, and it shows. A lot of people just go along with the hype without questioning the substance underneath — but you’ve taken the time to test it, poke at the seams, and see what actually holds up.
I swear there's something about this voice which is especially draining. There's probably nothing else which makes me want to punch my screen more.
tempodox 2 hours ago [-]
Which makes me wonder whether there is an SI unit for sycophantic sliminess, because the first paragraph of your answer is dripping with it.
UncleOxidant 1 hours ago [-]
The first paragraph seems like it's written by AI.
somenameforme 1 hours ago [-]
I'm fairly sure it was a subtle joke.
amanaplanacanal 1 hours ago [-]
I think that was the point.
enraged_camel 4 hours ago [-]
>> To each his own. I'm completely drained after 30 min of "discussing" with an LLM, which is essentially an overconfident idiot.
I'm completely drained after 30 minutes of browsing Google results, which these days consist of mountains of SEO-optimized garbage, posts on obscure forums, Stackoverflow posts and replies that are either outdated or have the wrong accepted answer... the list goes on.
dboreham 3 hours ago [-]
> an overconfident idiot
So we are close to an AI president.
jerf 2 hours ago [-]
The accusations that politicians are already overusing AI are flying, and given the incentives I wouldn't be surprised if more of the internal functioning of all modern governments is already more LLM-based than we'd realize. Or particularly appreciate.
By that I don't necessarily mean the nominal function of the government; I doubt the IRS is heavily LLM-based for evaluating tax forms, mostly because the pre-LLM heuristics and "what we used to call AI" are probably still much better and certainly much cheaper than any sort of "throw an LLM at the problem" could be. But I wouldn't be surprised if the amount of internal communication, whitepapers, policy drafts and statements, etc. by mass is already at least 1/3rd LLM-generated.
(Heck, even on Reddit I'm really starting to become weary of the posts that are clearly "Hey, AI, I'm releasing this app with these three features, please blast that out into a 15-paragraph description of it that includes lots of emojis and also describes in a general sense why performance and security are good things," and if anything the incentives slightly militate against that, as the general commenter base is starting to get pretty frosty about this. How much more popular it must be where nobody will call you out on it and everybody is pretty anxious to figure out how to offload the torrent-of-words portion of their job onto machines.)
StefanBatory 2 hours ago [-]
In my country an MP of the lower house sent out a tweet generated by an LLM.
As in, copied it with the prompt still in it.
ben_w 1 hours ago [-]
Even before LLMs, politicians (and celebrities) had other people tweet for them. IIRC, I've met someone who tweeted on behalf of Jill Stein.
Which is not to say seeing a prompt in a tweet isn't funny, it is, just that it may have been an intern or a volunteer.
amanaplanacanal 1 hours ago [-]
They may be completely insane, but at least the president makes his own tweets! I mean truths.
tempodox 2 hours ago [-]
LLMs have to go a long way before their ideas are as outrageous as those of The Current Occupant Of The President's Chair.
pyman 9 hours ago [-]
I asked my students to write a joke about AI. Sometimes humor is the best way to get people to talk about their fears without filters. One of them wrote:
"I went to work early that day and noticed my monitor was on, and code was being written without anyone pressing any keys. Something had logged into my machine and was writting code. I ran to my boss and told him my computer had been hacked. He looked at me, concerned, and said I was hallucinating. It's not a hacker, he said. It's our new agent. While you were sleeping, it built the app we needed. Remember that promotion you always wanted? Well, good news buddy! I'm promoting you to Prompt Manager. It's half the money, but you get to watch TikTok videos all day long!'"
Hard to find any real reassurance in that story.
dakiol 6 hours ago [-]
Why do we assume that Prompt Engineering is going to pay less money? As usual, what one brings to the company is value, and if AI-generated code needs to be prompted first and reviewed later, I don’t see how prompters in the future could earn less than software engineers now.
Prompt engineering is like singing: sure, everyone can physically sing… now whether it's pleasant listening to them is another topic.
cardanome 5 hours ago [-]
The first thing any big technical revolution causes is suffering for a lot of people.
It can bounce back over time and maybe leave us better off than before, but the short term will not be pretty. Think of the Industrial Revolution, where we had to stop companies by law from working children to literal death.
Whether the working man or the capital class profits from the rise of productivity is a question of political power.
Especially we as software engineers are not prepared for this fight as unions barely exist in our field.
We already saw mass layoffs by the big tech leaders and we will see it in smaller companies as well.
Sure, there will always be a need for experienced devs in some fields that are security-critical or that need to scale, but that simple CRUD app that serves 4 concurrent users? Yeah, Greg from marketing will be able to prompt that.
It doesn't need to be the case that prompt engineers are paid less money, true. But with us being so disorganized, the corporations will take the opportunity to cut costs.
bgwalter 5 hours ago [-]
> Especially we as software engineers are not prepared for this fight as unions barely exist in our field.
You can fight without unions. Tell the truth about LLMs: They are crutches for power users that do not really work but are used as excuses for firing people.
You can refuse to work with anyone writing vapid pro-LLM blog posts. You can blacklist them in hiring.
This addresses the union part. It is true that software engineers tend to be conflict averse and not very socially aware, so many of them follow the current industry opinion like lemmings.
If you want to know how to fight these fights, look at the permanent government bureaucracies. They prevail in the face of "new" ideas every 4 years.
jopsen 3 hours ago [-]
> If you want to know how to fight these fights, look at the permanent government bureaucracies. They prevail in the face of "new" ideas every 4 years.
Search YouTube for "Yes Minister" :)
-----
On topic, I think it's a fair point that fighting is borderline useless. Companies that don't embrace new tech will go out of business.
That said, it's entirely unclear what the implications will be. Often new capabilities don't mean the industry will shrink. The industry hasn't shrunk as a result of a 100x increase in compute and storage, or the decrease in size and power usage.
Computers just became more useful.
I don't think we should be too excited about AI writing code. We should be more excited about the kinds of programs we can write now.
There is going to be a new paradigm of computer interaction.
jplusequalt 2 hours ago [-]
>You can fight without unions.
And you can fly without wings--just very poorly.
Unions are extremely important in the fight of preserving worker rights, compensation, and benefits.
jdright 4 hours ago [-]
> This addresses the union part.
lol, good luck with that.
You thinking that one or two people doing a non-organized _boycott_ is the same thing as a union tells a lot about you.
bgwalter 4 hours ago [-]
I didn't mention one or two people, I mentioned a theoretical strategy that is also employed by non-unionized bureaucracies. And I admitted that the strategy is impeded by the fact that many software developers are self-loathing people with no spine.
It is possible that obedient people need highly paid union bosses, i.e., new leaders they can follow.
jdright 3 hours ago [-]
You're wrong; obedient people are exactly the type that don't need unions, since they are obedient and accept anything.
Unions are for people who don't accept just anything and who know that they are a target when taking action alone or in non-organized ways.
Unions are the way to multiply forces and work as a group with common interests; they are for people who are not extremely selfish and egocentric.
redeeman 3 hours ago [-]
Why do you need a union? Everyone should just set their personal standards for what they accept/demand, and then let the market sort it out. Someone wants to work for $20/hour programming, in a fashion that can satisfy some demand? Great, then I will simply NOT be doing that job. Everyone wins even though I do not get that particular job. Someone is willing to work 7 days a week because they prefer to grind to earn more money? Good on them. It's not gonna be me; they win the job, I don't.
Nobody wants to inhale toxic fumes in some factory? Well then the company had better invest in safety equipment, or the work doesn't get done. We don't need a union for this.
aaronbaugher 3 hours ago [-]
History, especially the industrial age, says that attitude leads to a race to the bottom. There's always someone who's willing to work for a little less, in a little shittier conditions, to pack a few more family members into a shitty apartment to make do.
If you leave it up to each worker to fend for himself with no negotiating power beyond his personal freedom to walk out, you get sweatshops and poorhouses in any industry where labor is fungible. If you want nice societies where average people can live in good homes with yards and nearby playgrounds and go to work at jobs that don't destroy their bodies and souls, then something has to keep wages at a level to support all that.
I'm not necessarily a fan of unions; I think in many cases you end up with the union screwing you from one side while the corporation screws you from the other. And the public sector unions we have today team up with the state to screw everyone else. But workers at least need the freedom to organize, or all the pressure on wages and conditions will be downward for any job that most people can do. The alternative is to have government try to keep wages and conditions up, and it's not good at that, so it just creates inflation with wages trailing behind.
sokoloff 13 minutes ago [-]
Tech workers have the freedom to unionize in the US; with exceedingly rare exceptions, they’ve overwhelmingly not chosen to do so.
triceratops 1 hours ago [-]
> Nobody wants to inhale toxic fumes in some factory? Well then the company had better invest in safety equipment, or the work doesn't get done. We don't need a union for this.
We tried that in the past. The work still got done, and workers just died more often. If you want to live in that reality move to a developing country with poor labor protections.
StefanBatory 2 hours ago [-]
> You can refuse to work with anyone writing vapid pro-LLM blog posts. You can blacklist them in hiring.
This works only if everyone is on board with this. If they're not, you're shooting yourself in the foot while job hunting.
trod1234 29 minutes ago [-]
> The first thing any big technical revolution causes is suffering for a lot of people.
This all assumes that such revolutions are built on resiliency and don't actually destroy the underpinning requirements of organized society. It's heavily skewed towards survivor bias.
Our greatest strength as a species is our ability to communicate knowledge, experience, and culture, and act as one large overarching organism when threats appear.
Take away communication, and the entire colony dies. No organization can occur, no signaling. There are two ways to take away communication: you prevent it from happening, or you saturate the channel to the Shannon limit. The latter is enabled by AI.
It's like an ant hill or a beehive where a chemical has been used to actively and continually destroy the pheromones the ants rely upon for signalling. What happens? The workers can't work, food can't be gathered, the hive dies. The organism is unable to react or adapt. Collapse syndrome.
Our society is not unlike the ant hill or beehive. We depend on a fine balance of factors doing productive work, and in exchange for that work they get food, or more precisely money, which they use to buy food. The economy runs because of the circulation of money from producer to factor to producer. When it sieves into fewer hands and stays there, distortions occur; these self-sustain, and eventually we are at the point where no production can occur because monetary properties are lost under fiat money printing. There is a narrow working range; outside it, on either side, everything catastrophically fails: hyperinflation/deflation.
AI on the other hand eliminates capital formation of the individual. The time value of labor is driven to zero. There is a great need for competent workers for jobs, but no demand because no match can occur; communication is jammed. (ghost jobs/ghost candidates)
So you have failures on each end, which self-sustain towards socio-economic collapse. No money circulation going in means you can't borrow from the future through money printing. Debasement then becomes time limited and uncontrollable through debt traps, narrow working price range caused by consistent starvation of capital through wage suppression opens the door to food insecurity, which drives violence.
Resource extraction processes have destroyed the self-sustaining flows such that food in a collapse wouldn't even support half our current population, potentially even a quarter globally. 3 out of 4 people would die. (Malthus/Catton)
These things happen incredibly slowly and gradually, but there is a critical point; we're about 5 years away from it if things remain unchanged, and there is the potential that we have already passed it. Objective visibility has never been worse.
That is the point of no return, where the dynamics are beyond any individual person, and after that point everyone involved in the system is dead but they just don't know it yet.
Mutually Assured Destruction would mean the environment becomes uninhabitable if chaos occurs and order is lost in such a breakdown.
We each have significant bias to not consider the unthinkable. A runaway positive feedback system eventually destroys itself, and like a dam that has broken with the waters rushing towards individuals; no individual can hold back those forces.
paulcole 5 hours ago [-]
> The first thing any big technical revolution causes is suffering for a lot of people.
Didn’t Greg-from-marketing’s life just get a lot better at the same time?
somenameforme 45 minutes ago [-]
People aren't paid by value brought to companies, they're paid by the scarcity of their skill. Your analog is actually perfect for this. There's a reason saying you want to be a professional singer is generally something only a child would say. It's about as reliable a career as winning the lottery, simply because everybody can sing, lots of them quite decently. And so singer, as a career, mostly isn't a thing - it's a hobby with some distant hope of going Oliver Anthony at some point.
Software development has a huge barrier to entry which keeps the labor pool relatively small, which keeps wages relatively high. There's going to be a way larger pool of people capable of 'prompt engineering' which is going to send wages proportionally way down.
blindriver 12 minutes ago [-]
Because the 10% difference between the best prompt engineer and a mediocre prompt engineer won't usually make a noticeable difference in output or productivity. There's no reason to specialize in it or pay extra because the gains are ephemeral.
RugnirViking 5 hours ago [-]
There are people trying very hard (and succeeding) to CREATE the impression it will pay less money. Pay in general is extremely vibes based. A look at well... anything in the economy shows that. There are constant shortages of low paying jobs, and gluts of high paying jobs
giantg2 4 hours ago [-]
That's the opposite of what I'm seeing. I see plenty of openings at Walmart, bus drivers, etc. I see very few openings in many higher paying jobs (healthcare might be an exception). Even the dev jobs I'm finding are low paid at small companies, like $70k per year low (and this isn't a low cost area).
bluGill 2 hours ago [-]
Wait a couple years before you start stating anything as a trend. There have been several downturns over my lifetime (I'm 50) where it was hard to find a tech job, and several periods of good times where tech people were in high demand. Until several years have passed you cannot tell the difference between a temporary downturn and the end of an era.
Every time things turn bad a lot of people jump out and yell it is the end of the tech. They have so far been wrong. Only time will tell if they are right this time, though I personally doubt it.
giantg2 14 minutes ago [-]
I'm not saying it won't turn around, but it's been about 2-3 years.
al_borland 5 hours ago [-]
I suspect it only looks like there are a lot of high paying jobs because they are so much harder to fill, due to there being so few qualified candidates... hence the high pay. Supply and demand.
bluGill 2 hours ago [-]
Supply and demand is also influenced by the ability to become qualified. I'm not qualified to flip burgers, but any fast food place could get me qualified in just a few hours. I'm not qualified to be a doctor, and it would take me years of training to get that qualification. I've met people who failed to qualify as a burger flipper - you can correctly guess from that statement that they are very disabled. I've met many who failed to qualify as a doctor, and all are smart people, since people who are not wouldn't even try.
jplusequalt 2 hours ago [-]
>Why do we assume that Prompt Engineering is going to pay less money?
It objectively takes less expertise and background knowledge to produce semi-working code. That lowers the barrier to entry, allowing more people to enter the market, which drives down salaries for everyone.
bgwalter 4 hours ago [-]
Singing properly requires decades of training. Prompt engineering is like a 5 year old asking his parents for an ice cream. Some strategies are more successful than others.
giantg2 4 hours ago [-]
If you need fewer prompt engineers than developers to do the same work, and if prompt engineering is easier than developing, meaning all developers can do it, then you end up with a massive labor oversupply.
bapak 4 hours ago [-]
How many programmers did you need 40 years ago to write MS DOS programs? As you become more productive, more is expected of you. Instead of spending 10 days coloring pixels on the screen, now you're expected to push whole UIs in the same amount of time. Whether this is enabled by high-level languages, tools or AI is irrelevant.
giantg2 11 minutes ago [-]
What am I to be productive on? The bottleneck now is finding a business idea that's viable.
rhines 2 hours ago [-]
I wonder if we need teams generating dozens of UIs or whatever every day though. There may (or may not) be a limit to how much value-adding work is available to do, or at least diminishing returns that no longer justify high salaries.
bluGill 2 hours ago [-]
Yes we do - your point that there might be too many is valid, but a modern UI, when done right, is much more accessible to the "common man" than an MS-DOS one, so all the time those teams put in is more than made up for by the time saved not having to teach all the people who will use the program. Thus we need far more teams than in the MS-DOS days, when we couldn't make a good UI in the first place.
roland35 4 hours ago [-]
Well there's value, but also supply and demand! If prompt engineering is easier that means more people will be able to do it!
chii 5 hours ago [-]
> Prompt engineering is like singing
I think you got the analogy wrong. Not everyone can sing professionally, but most people can type text into a text-to-speech synthesis system to produce a workable song.
lubujackson 4 hours ago [-]
Maybe a better analogy is that now anyone can use autotune and actually sing, but you still have to want to do it and put in the effort. Very few people do.
trod1234 44 minutes ago [-]
> Why do we assume that Prompt Engineering is going to pay less money?
I suppose because every new job title that has come out in the last 20+ years has followed the same pattern: initially the same or slightly more money, followed by significant reductions in workforce activities shortly thereafter, followed by coordinated mass layoffs and no work after that.
When 70% of the economy is taken over by a machine that can work without needing food, where can anyone go to find jobs to feed themselves, let alone their children?
The underlying issues have been purposefully ignored by the people who are pushing these changes because these are problems for next quarter, and money printing through fractional reserve banking decouples the need to act.
Its all a problem for next quarter, which just gets kicked repeatedly until food security becomes a national security issue.
Politicians already don't listen to what people have to say, so what makes you think they'll be able to do anything once organized violence starts happening because food is no longer available, because jobs are no longer available?
The idiots and political violence we see right now are nothing compared to what comes when people can't get food, when their mindset changes from "we can work within the system" to "there is no out, only through". When existential survival depends on removing the people responsible by any means, these things happen, and when the environment is ripe for it, they have friends everywhere.
UBI doesn't work because non-market socialism fails. You basically have a raging fire that will eventually reach every single person and burn them all alive, and it was started by evil blind idiots that wanted to replace human agency.
chasd00 4 hours ago [-]
"I went to work early that day and noticed my monitor was on, and code was being written without anyone pressing any keys. Something had logged into my machine and was writting code. I ran to my boss and told him my computer had been hacked. He looked at me, concerned, and said I was hallucinating. ..."
it would have been funnier if the story then took a turn and ended with it was the AI complaining about a human writing code instead of it.
pyman 35 minutes ago [-]
Black mirror season 8? xD
cjblomqvist 37 minutes ago [-]
That goes both ways. As with math, it's sometimes not wise to look at the answer as soon as you stumble upon something you can't solve immediately - sometimes it's good to force the person learning to think deeper and try to understand the problem more thoroughly. It's also a skill of its own to be able to cope with such situations and not just bail/give up/do something else.
I fear this will be more and more of a problem with the TikTok / instant gratification / attention-span-under-10-seconds generation. Deep thinking has great value in many situations.
"Funnily" enough, I see management more and more rewarding this behavior. Speed is treated as vastly more important than driving in the right direction or long-term thinking. Quarterly reports, etc., etc.
blindriver 16 minutes ago [-]
I 100% agree with this. Part of the problem is getting stuck with bad documentation or a bad API, and asking ChatGPT to generate sample code is really beneficial to keeping me going instead of mothballing an idea for months or forever.
screye 56 minutes ago [-]
It works both ways.
Yes, it's supportive and helps you stay locked in. But it also serves as a great frustration lightning rod. I enjoy being an unsavory person to the LLM when it behaves like a buffoon.
Sometimes you need a pressure release valve. Better an LLM than a person.
P.S: Skynet will not be kind to me.
giantg2 4 hours ago [-]
I don't feel this way at all. If anything, it's a morale drain. There's less cooperation since you're expected to ask AI. There's also limited career pathing since we want even fewer junior or mids, replacing them with AI.
pier25 1 hours ago [-]
I'm always amazed that people feel anything when chatting with an AI bot.
Don't people realize it's a machine "pretending" to be human?
bondarchuk 1 hours ago [-]
People feel things reading books, watching movies, even animated ones that have no people in them, looking at abstract art... Why should this be any different?
pier25 1 hours ago [-]
Do you take, e.g., advertising at face value? Even when you know they're trying to convince you to accept some idea of the brand and buy something?
reactordev 1 hours ago [-]
I have had to correct AI enough to know it’s the equivalent of a cocky junior dev that read a paper once.
I’ll stick to human emotional support.
khurs 57 minutes ago [-]
Isn’t that the same as posting on Stack Overflow, Reddit, or other forums saying you are stuck and getting an answer?
With an LLM it’s speed - seconds rather than the minutes or hours of Stack Overflow - which is the main benefit.
hippari2 1 hours ago [-]
I find that with enough wrong answers it feels like you are fighting alone again haha.
anonzzzies 6 hours ago [-]
It is exactly how I use it personally the most: it will crap out a massive amount of plumbing that I really do not feel like doing myself at all. So when I think of procrastinating, I tell it to write something; after 30 minutes I will have something that I would otherwise have procrastinated on for hours, or never done at all. Now it's 'almost done anyway, so might as well finish it'. Then I spend 3 months hacking on it while, at any point, getting the AI to do the annoying stuff I know I'm not going to do or will postpone. If only for that... I find bug fixing more rewarding and easier than writing crap from scratch anyway.
xeromal 4 hours ago [-]
I hate to admit this but I was struggling with dbt at work and I had Copilot scan what I was doing, and it found a typo that was almost impossible for me to notice. lol. It really can be useful.
Kiro 4 hours ago [-]
Yes, I've been saying that AI helps with procrastination a lot.
croes 3 hours ago [-]
Could work in the other direction. When you are stuck and get the solution from the AI, you lose the feeling of achievement because it's done by something/someone else.
psunavy03 23 hours ago [-]
"Great news, boss! We invented this new tool that allows nontechnical people to write code in English! Now anyone can deploy applications, and we don't have to hire all those expensive developers!"
"Wow, show it to me!"
"OK here it is. We call it COBOL."
musicale 12 hours ago [-]
FORTRAN (FORmula TRANslator) was another "AI" project in "automatic programming":
"Before 1954, almost all programming was done in machine language or assembly language. Programmers rightly regarded their work as a complex, creative art that required human inventiveness to produce an efficient program."
"The IBM Mathematical Formula Translating System or briefly, FORTRAN, will comprise a large set of programs to enable the IBM 704 to accept a concise formulation of a problem in terms of a mathematical notation and to produce automatically a high speed 704 program for the solution of the problem."
Fortran promised to eliminate debugging. In 2015, I taught that React is a functional programming way to create very fast, bug-free apps, and the project manager found ways to push us to the hair-on-fire status quo.
But it still has been immensely useful and a durable paradigm, even though usage hasn't been exactly as thought.
soco 7 hours ago [-]
Excel enters the chat
bonoboTP 4 hours ago [-]
For some strange reason Excel really managed to do it. Many, many people who don't think of themselves as anywhere near being programmers somehow get at ease in front of Excel, enough that they often inadvertently and kind of unawarely end up learning programming concepts and creating much more complex computational applications than has been possible with any other tool for non-developers.
Izkata 3 hours ago [-]
I have a theory on that, based on something I do that, over the years, I've learned a lot of my co-workers don't do: when I'm reading code, I have the contents of the variables all in mind and am manipulating them as I read the code. When I've described it, a couple of times they've said "oh, like a human compiler"... So I really don't know what's going on in their heads, but this seems like the reason I can understand code I haven't seen before faster than most of them.
Spreadsheets flip the usual interface from code-first to data-first, so the program is directly presenting the user with a version of what I'm doing in my head. It allows them to go step-by-step building up the code while focusing on what they want to do (transform data) instead of having to do it in their head while focusing on the how (the code).
bonoboTP 2 hours ago [-]
Yes, laying everything out in a 2D grid is just quite intuitive, like arranging objects on a tabletop. Also, it's flat, there's little nesting and you don't have to come up with abstraction hierarchies and loops are typically just unrolled in place.
You're joking, but it's true. I'm sure you know that. SQL had similar claims... declarative: say what you need and the computer will do it for you. Also written in English.
charlieyu1 6 hours ago [-]
Don’t think the person was joking. It was literally the promise of COBOL
ako 23 hours ago [-]
And compared to what we had before SQL, it is much easier to use, and a lot more people are able to use it.
noworriesnate 23 hours ago [-]
But software developers often struggle to use SQL and prefer using ORMs or analytical APIs like Polars; the people who excel at SQL are typically not programmers, they're data engineers, DBAs, analysts, etc.
Maybe a similar bifurcation will arise where there are vibe coders who use LLMs to write everything, and there are real engineers who avoid LLMs.
Maybe we’re seeing the beginning of that with the whole bifurcation of programmers into two camps: heavy AI users and AI skeptics.
downrightmike 1 minutes ago [-]
They're all real programmers John
ruszki 11 hours ago [-]
What you can achieve with standard SQL is taught at universities. The whole package. I've never met a developer who struggled with that. When you use ORMs you need to follow SQL's logic anyway. People use ORMs to avoid painful data conversions, not to avoid the logic. Data engineers, DBAs, analysts, etc. excel in specific databases, not in "SQL".
FranzFerdiNaN 10 hours ago [-]
I've worked in BI and data engineering my whole career and I've met plenty of programmers who struggled immensely with SQL once it went further than select and group by. And don't get me started on their database design skills. It's way too often a disaster hidden behind "it works for the software, so good enough".
I'm more surprised by software engineers who do know these things than by the ones who don't.
maccard 9 hours ago [-]
I’ve worked with gameplay programmers who can’t do simple 3D math, c++ programmers who fundamentally don’t understand pointers, backend developers who didn’t understand globals were shared state and cause race conditions, etc.
It’s not that SQL is hard, it’s that for any discipline the vast majority of people don’t have a solid grasp of the tools they’re using. Ask most tradespeople about the underlying thing they’re working with and you’ll have the same problem.
atomicnumber3 53 minutes ago [-]
Aren't data engineers programmers? That is to say, a data engineer is-a software engineer?
I share your sentiment though - I'm a data engineer (8 years) turned product engineer (3 years) and it astounds me how little SQL "normal" programmers know. It honestly changed my opinion on ORMs - it's not like the SQL people would write exceeds the basic select/filter/count patterns, which are the most that non-data people know.
adalacelove 12 hours ago [-]
I'm a developer and:
- I hate ORMs; they are the source of a lot of obscure errors behind layers and layers of abstractions.
- I prefer analytical APIs for technical reasons, not just the language.
Reasons:
- I can compose queries, which in turn makes them easier to decompose
- It's easier to spot errors
- I avoid parsing SQL strings
- It's easier to interact with the rest of the code, both functions and objects
If I need to make just a query I gladly write SQL
eru 10 hours ago [-]
Well, the problem in ORM is the O. Object-orientation is just a worse way to organise your data and logic than relational algebra.
It's just a shame that many languages don't support relational algebra well.
We had relations as a datatype and all the relevant operations over them (like join) in a project I was working on. It was great! Very useful for expressing business logic.
tpm 9 hours ago [-]
The problem in ORM is the M, the mapping is always lossy and a leaky abstraction.
idiotsecant 13 hours ago [-]
'real' engineers can use SQL just fine. This is a strange position to take.
collingreen 5 hours ago [-]
No true Scotsman would struggle with sql
sanderjd 12 hours ago [-]
> But software developers often struggle to use SQL
Is this true? It doesn't seem true to me.
winnie_ua 4 hours ago [-]
Oh, sweet summer child.
Yes, there are so many so-called developers in the backend field who do not know how to do basic SQL. Anything bigger than a simple WHERE clause.
I wouldn't even talk about using indexes in a database.
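For a concrete sense of what "bigger than a simple WHERE clause" might mean, here is a sketch over an invented schema (table and column names are made up): an aggregation over a date range, plus the index that keeps it from scanning the whole table.

    -- Invented schema for illustration.
    CREATE TABLE orders (id INT PRIMARY KEY, customer_id INT, created_at DATE, total NUMERIC);

    -- Beyond a simple WHERE: per-customer totals over a date range.
    SELECT customer_id, COUNT(*) AS order_count, SUM(total) AS revenue
    FROM orders
    WHERE created_at >= DATE '2024-01-01'
    GROUP BY customer_id
    HAVING SUM(total) > 1000
    ORDER BY revenue DESC;

    -- The index that lets the date filter use a range scan instead of a full table scan.
    CREATE INDEX idx_orders_created_at ON orders (created_at);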
nathanfig 22 hours ago [-]
Claude made this point while reviewing my blog for me: the mechanization of farms created a whole lot more specialization of roles. The person editing CAD diagrams of next year's combine harvester may not be a farmer strictly speaking, but farming is still where their livelihood comes from.
dredmorbius 22 hours ago [-]
Strictly speaking, farming is where all our livelihoods come from, in the greatest part. We're all living off the surplus value of food production.
(Also of other food, energy, and materials sourcing: fishing, forestry, mining, etc.)
This was the insight of the French economist François Quesnay in his Tableau économique, foundation of the Physiocratic school of economics.
> Strictly speaking, farming is where all our livelihoods come from, in the greatest part. We're all living off the surplus value of food production.
I don't think farming is special here, because food isn't special. You could make exactly the same argument for water (or even air) instead of food, and all of a sudden all our livelihoods would derive ultimately from the local municipal waterworks.
Whether that's a reductio ad absurdum of the original argument, or a valuable new perspective on the local waterworks is left as an exercise to the reader.
bluGill 2 hours ago [-]
Except in a few places, drinkable water is in such abundance that nobody ever spent significant effort trying to get it. Likewise for air: few people have ever spent much effort getting air to breathe - even in the worst polluted areas bottled air was reserved for airplanes, hospitals (and a few people with medical conditions), and scuba divers.
swader999 4 hours ago [-]
Land is scarce though. The amount of software work that needs doing might not be, it could be infinite or probably more tied to electrical capacity.
lipowitz 22 hours ago [-]
Replacing jobs that could only be performed by those living near the particular fields with jobs that can be done anywhere creates jobs for the person willing to take the least satisfactory compensation for the most skill and work.
Working the summer fields was one of the least desirable jobs but still gave local students with no particular skills a good supplemental income appropriate for whichever region.
miki123211 7 hours ago [-]
Depending on the job, it may also allow you to select for talent much better, which creates intense competition and raises salaries significantly.
A good example of this phenomenon is sports. Even though it can't be done remotely, it's so talent-dependent that it's often better to find a great player in a foreign country and ask them to work for you, rather than relying exclusively on local talent. If it could be a remote job, this effect would be even greater.
eru 10 hours ago [-]
Yes, but automating these away means that food becomes cheaper.
We increase the overall total prosperity with that automation.
lipowitz 8 hours ago [-]
Increasing total prosperity is the wrong goal if distribution is completely unregulated. The investor and real-estate-owning classes like the 1% get more; salaries can trend down because food costs are down; and in a deflation spiral the youth are perpetual dependents and/or debtors who can't possibly earn enough over day-to-day costs, given that global competition includes people with no debts, or with debts from an economy that was less wealthy.
ameliaquining 23 hours ago [-]
Is that really because of the English-esque syntax, rather than because it was a step forward in semantic expressivity? If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?
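For concreteness, a sketch of roughly the same (invented) query both ways - standard SQL, with a LINQ-method-style chain sketched in a comment. The underlying semantics are identical; only the surface syntax differs.

    -- Hypothetical query: customers with more than 10 orders.
    SELECT customer_id, COUNT(*) AS order_count
    FROM orders
    GROUP BY customer_id
    HAVING COUNT(*) > 10;

    -- Roughly the same query as a LINQ-method-style chain (illustrative pseudocode, not SQL):
    -- orders.GroupBy(o => o.CustomerId)
    --       .Where(g => g.Count() > 10)
    --       .Select(g => new { g.Key, OrderCount = g.Count() });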
9rx 22 hours ago [-]
> Is that really because of the English-esque syntax
Well, what we had before SQL[1] was QUEL, which is effectively the same as Alpha[2], except in "English". Given the previous assertion about what came before SQL, clearly not. I expect SQL garnered favour because it is tablational instead of relational, which is the quality that makes it easier to understand for those not heavy in the math.
[1] Originally known as SEQUEL, a fun word play on it claiming to be the QUEL successor.
[2] The godfather language created by Codd himself.
dmkolobov 10 hours ago [-]
Do you have any advice for understanding the difference between "relational" and "tablational"? I remember hearing something about how SQL is not really relational from my college professor, but we never really explored that statement.
9rx 52 minutes ago [-]
Quite simply: A relation is a set of tuples, while a table is a list of tuples.
The Alpha/QUEL linage chose relations, while SQL went with tables. Notably, a set has no ordering or duplicates — which I suggest is in contrast to how the layman tends to think about the world, and thus finds it to be an impediment when choosing between technology options. There are strong benefits to choosing relations over tables, as Codd wrote about at length, but they tend to not show up until you get into a bit more complexity. By the time your work reaches that point, the choice of technology is apt to already be made.
With care, SQL enables mimicking relations to a reasonable degree when needed, which arguably offers the best of all worlds. That said, virtually all of the SQL bugs I see in the real world come as a result of someone not putting in enough care in that area. When complexity grows, it becomes easy to overlook the fine details. Relational calculus would help by enforcing it. But, tradeoffs, as always.
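To make the difference concrete, a small sketch (table name and rows invented for illustration): a SQL table is a bag, so duplicates and an ordering are accepted, and DISTINCT is the kind of "care" needed to get set semantics back.

    -- SQL tables are bags (lists) of tuples: duplicate rows are accepted.
    CREATE TABLE payments (customer_id INT, amount INT);
    INSERT INTO payments VALUES (1, 50), (1, 50), (2, 75);

    -- Bag semantics: the duplicate (1, 50) row comes back twice.
    SELECT customer_id, amount FROM payments;

    -- Mimicking a relation (a set of tuples): duplicates collapse,
    -- and no row ordering is promised unless ORDER BY is added.
    SELECT DISTINCT customer_id, amount FROM payments;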
razakel 1 hours ago [-]
Per the SQL specification:
>SQL [...] is a database language [...] used for access to pseudo-relational databases that are managed by pseudo-relational database management systems (RDBMS).
>SQL is based on, but is not a strict implementation of, the relational model of data, making SQL “pseudo-relational” instead of truly relational.
>The relational model requires that every relation have no duplicate rows. SQL does not enforce this requirement.
>The relational model does not specify or recognize any sort of flag or other marker that represents unspecified, unknown, or otherwise missing data values. Consequently, the relational model depends only on two-valued (true/false) logic. SQL provides a “null value” that serves this purpose. In support of null values, SQL also depends on three-valued (true/false/unknown) logic.
Or, in other words, "relation" does not mean the relations between the tables as many assume: the tables, as a set of tuples, are the relations.
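A tiny illustration of the three-valued logic the spec describes (table and rows invented): a comparison against a null value is neither true nor false, so WHERE drops the row either way.

    -- One row has a missing (NULL) email.
    CREATE TABLE users (id INT, email VARCHAR(100));
    INSERT INTO users VALUES (1, 'a@example.com'), (2, NULL);

    -- Both queries skip row 2: "email = ..." and "email <> ..." evaluate
    -- to UNKNOWN for a NULL email, and WHERE only keeps TRUE rows.
    SELECT * FROM users WHERE email = 'a@example.com';
    SELECT * FROM users WHERE email <> 'a@example.com';

    -- Only IS NULL / IS NOT NULL tests the missing-value marker directly.
    SELECT * FROM users WHERE email IS NULL;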
AdieuToLogic 15 hours ago [-]
Before SQL became an industry standard, many programs which required a persistent store used things like ISAM[0], VISAM (a variant of ISAM[0]), or proprietary B-Tree libraries.
None of these had "semantic expressivity" as their strength.
> If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?
SQL and many DSLs (JIRA…) are actually used by plenty of non-technical users. Anyone who wants to build their own reports and do basic data analysis has sufficient incentive to learn it.
They are very much the exception that proves the rule though.
rixed 10 hours ago [-]
Or QBE, "Query By Exemple", that was another try by IBM to make a query language directly usable by anyone.
veqq 22 hours ago [-]
Er, have you heard of Datalog or Prolog? Declarative programming really does work. SQL was just... botched.
sanderjd 12 hours ago [-]
I think SQL is better than datalog. I suspect this is one of those opinions that may be somewhat outside consensus on a forum like HN, but is strongly within the consensus more broadly.
glimshe 19 hours ago [-]
Yes. And I think SQL is actually pretty good for what it does. My point, like the parent's (I suppose), is that we've heard this "XYZ, which uses natural language, will kill software development" before.
dredmorbius 22 hours ago [-]
I'd long ago (1990s-era) heard that the original intent was that office secretaries would write their own SQL queries.
(I'd love for someone to substantiate or debunk this for me.)
rsynnott 4 hours ago [-]
That's always the promise of these things; non-specialists will be able to program now! This has been going on since COBOL. The one case where it arguably worked out to some extent was spreadsheets.
bluGill 2 hours ago [-]
Anyone with complex spreadsheets (which is a lot of companies) has a few programmers with the job of maintaining them. The more training those people have in "proper programming" the better the spreadsheets work.
jimbokun 2 hours ago [-]
I would say that failed with SQL but succeeded with Excel. If you replace "office secretaries" with "office workers" in general.
bazoom42 21 hours ago [-]
Early on, programming was considered secretarial work.
AdieuToLogic 15 hours ago [-]
> Early on, programming was considered secretarial work.
Incorrect.
Encoding a program was considered secretarial work, not the act of programming itself. Over time, "encoding" was shortened to "coding."
This is why the industry term "coder" is a pejorative descriptor.
eru 10 hours ago [-]
> This is why the industry term "coder" is a pejorative descriptor.
For some people some of the time. I don't think that's true in general.
0points 9 hours ago [-]
> This is why the industry term "coder" is a pejorative descriptor.
It is not.
brabel 5 hours ago [-]
It used to be widely seen as such. See for example Stallman's latest post, where he mentions that. Coder was not the same as programmer; it was the lesser half of the job. Nowadays the term has lost its original meaning.
bitpush 23 hours ago [-]
Bravo. This is the exact sentiment I have, but you expressed in a way that I could never have.
Most people miss the fact that technical improvements increase the pie in a way that was not possible before.
When digital cameras became popular, everybody became a photographer. That only made the world better, and we got so many more good photographers. Same with YouTube & creativity.
And same with coding & LLMs. World will have lots more of apps, and programmers.
munificent 22 hours ago [-]
> That only made the world better, and we got so many more good photographers.
I disagree with the "only" part here. Imagine a distribution curve of photos with shitty photos on the left and masterpieces on the right and the height at the curve is how many photos there are to be seen at that quality.
The digital camera transition massively increased the height of the curve at all points. And thanks to things like better autofocus, better low-light performance, and a radically faster iteration loop, it probably shifted the low and middle ends to the right.
It almost certainly increased the number of breathtaking, life-changing photos out there. Digital cameras are game-changers for photojournalists traveling in difficult locations.
However... the curve is so high now, the sheer volume of tolerably good photos so overwhelming, that I suspect the average person actually sees fewer great photos than they did twenty years ago. We all spend hours scrolling past nice-but-forgettable sunset shots on Instagram and miss out on the amazing stuff.
We are drowning in a sea of "pretty good". It is possible for there to be too much media. Ultimately, we all have a finite amount of attention to spend before we die.
DavidPiper 17 hours ago [-]
Thank you for describing this so eloquently.
Meaning no disrespect to photographers, I'm starting to think that a probable outcome of all the AI investment is a sharp uptick in shovelware.
If we can get AIs to build "pretty good" things - or even just "pretty average" things - cheaply, then our app stores, news feeds, ad feeds, company directives, etc, will be continuously swamped with it.
eru 10 hours ago [-]
> Meaning no disrespect to photographers, I'm starting to think that a probable outcome of all the AI investment is a sharp uptick in shovelware.
You can use AI to filter out the shovelware, so you never have to see it.
shinedog 13 hours ago [-]
You hit this so hard it was impossible not to recognize. In every sense there is too much "ok" shit (in every media realm) that we cannot help but miss amazing stuff. Knowing that I don't have enough time for all the incredible things that technology has enabled crushes me.
jimbokun 2 hours ago [-]
All of the old, great classic movies are available for streaming somewhere.
I still find great value in the TCM cable channel. Simply because if I tune in at a random time, it's likely to be showing an excellent old film I either never heard of or never got around to watching.
The service they are offering is curation, which has a lot of value in an age of infinite content flooding our attention constantly.
munificent 1 hours ago [-]
...and now think how much worse this problem will become now that we're in the era of generative AI.
test6554 15 hours ago [-]
Experts warn that at current production levels, the supply of dick pics may actually outpace demand in a couple decades.
DavidPiper 14 hours ago [-]
I was under the impression that supply already vastly outstrips demand.
eru 10 hours ago [-]
Demand is very unevenly distributed. I think they are appreciated on Grindr.
kjkjadksj 14 hours ago [-]
It affects even the competent photographer. How many times do you see that photographer with all the gear sit in front of a literal statue and fire off a 30 shot burst in 2 seconds? I don’t envy these pro photo editors either today in sports. I wonder how many shots they have to go through per touchdown from all the photographers at the end zone firing a burst until everyone stands up and throws the ball back at the ref? After a certain point you probably have to just close your eyes and pick one of the shots that looks almost identical to another 400. Not a job for analysis paralysis people. I guess it sure beats having to wait for the slide film to develop.
bluGill 2 hours ago [-]
I suspect most of the time you can eliminate 300 of those 400 right away - they obviously are either too early or too late to capture the moment. On the remaining 100 you can choose any one (or more likely 5 as there are likely several moments - the moment the catch is made and the moment the athlete smiles as he realizes he made that catch).
The reason to take all 400 though as every once in a while one photo is obviously better than another for some reason. You also want several angles because sometimes the light will be wrong at the moment, or someone will happen to be in the way of your shot...
dotancohen 8 hours ago [-]
The AI is already picking out the best photo in those 400-shot bursts.
And sometimes it is even combining elements from different photos: Alice had her eyes closed in this otherwise great shot, but in this other shot her eyes were open. A little touch-up and we've got the perfect photo.
socalgal2 11 hours ago [-]
don't you just let the AI pick? I'm only half joking. I thought that was a feature added to smartphones a year or two ago?
dijksterhuis 22 hours ago [-]
> That only made the world better
Did it?
people now stand around on dance floors taking photos and videos of themselves instead of getting on dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.
people taking their phone out and videoing / photographing something awful happening, instead of doing something helpful.
people travel to remote areas where the population has been separated from humanity and do stupid things like leave a can of coke there, for view count.
it’s not made things better, it just made things different. whether that’s better or worse depends on your individual perspective for a given example.
so, i disagree. it hasn’t only made things better. it made some things easier. some things better. some things worse. some things harder.
someone always loses, something is always lost. would be good if more people in tech remembered that progress comes at a cost.
skeeter2020 13 hours ago [-]
Live music sucks when you're trying to watch the show and some dumb-dumb is holding their phone above their head to shoot the entire show with low-light, bad angle & terrible sound. NO ONE is going to watch that, and you wrecked the experience for many people. Put your phone away and live in the present, please...
thangalin 21 hours ago [-]
> people now stand around on dance floors taking photos and videos of themselves instead of getting on dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.
There are other types of dances where dancers are far more interested in the dance than selfies: Lindy Hop, Blues, Balboa, Tango, Waltz, Jive, Zouk, Contra, and West Coast Swing to name a few. Here are videos from the Blues dance I help organize where none of the dancers are filming themselves:
Thank you for sharing your social media videos as evidence in a rebuttal to "camera phones are not all good; their ubiquitous use has negative implications too". So delicious...
VonTum 16 hours ago [-]
The irony!
Though, I'll grant that there's not really a way to argue this without showing videos
kjkjadksj 14 hours ago [-]
That sort of dancing is basically a sport. You have to learn it, you have to get good at it after you've learned it, and it is cardio after all. I think OP was talking more about what you see in the EDM scene these days, where basically people aren't there to dance like the old days or sing along like at other genres; they are there to see a certain DJ, and then they post clips from the entire set to their Instagram story. And they can do this because the dancing at the EDM show is a super passive kind of dancing where you are just swaying a little, so you can hold the phone steady at the same time. If you were dancing like they'd dance at the EDM concerts in the 90s, all rolling on molly, it would be like your blues swing, where it's just too physical to do anything but rave around flinging your arms all around, shirtless and sweaty.
ttoinou 11 hours ago [-]
Look into contact impro and ecstatic dance: cellphones are forbidden and you can dance however you like.
flashgordon 21 hours ago [-]
I would add one thing though. The pie definitely gets bigger - but I feel there is a period of "downsizing" that happens. I think this is because of a lack of ideas. When you have a tool that (say) 10xes your productivity, it's not that bosses will have ideas to build 10x the number of things - they will just look to cut costs first (hello lack of imagination and high interest rates).
sarchertech 16 hours ago [-]
We’ve had many improvements that increased productivity at least as much as current LLMs, and I don’t think any of them ever temporarily caused downsizing in the total number of programmers.
pipes 21 hours ago [-]
I thought photographers don't get paid well anymore due to market saturation and the few skills required to get a good photo?
WJW 6 hours ago [-]
This implies photographers used to be paid well in the past, which isn't true. Like painting or rock music, photography has always been a winner-takes-all kind of market where a select few can get quite wealthy but the vast majority will be struggling forever.
bluGill 1 hours ago [-]
While photography was never a sure path to becoming rich and famous, professionals used to do very good business and make a good living.
Demand is way down because while a $5000 lens on a nice camera is better than my phone lens, my phone is close enough for most purposes. Also, my phone is effectively free to shoot with, whereas in the days of film a single roll, by the time you developed it, cost significant money (I remember as a kid getting a camera for my birthday and then my parents wouldn't get me film for it - in hindsight I suspect every roll of film cost my dad half an hour of work, and he was a well-paid software developer). This cost meant that you couldn't afford to practice taking pictures; every single one had to be perfect. So if you wanted a nice picture of the family it was best to pay a professional who, because of experience and equipment, was likely to take a much better one than you could (and if something went wrong they would retake it for free).
kjkjadksj 14 hours ago [-]
It is still as hard as it's ever been to get a good photo. They had fully automatic film cameras that could take good photos in the 70s, but the devil is always in the edge cases and the subconscious ability to take an evenly exposed (in the Ansel Adams sense, not the auto-exposure sense), well-composed image at the decisive moment. Understanding how lighting works (whether natural, or artificial light like flash or studio lighting) is also not easy.
It is pretty hard to break out, but people still make names for themselves, both from experience on assignments like in the old days and from Instagram and other social media followings. People still need weddings shot and professional portraits taken, which takes some skill in understanding the logistics of how to actually do that job well and efficiently while managing your equipment.
bluGill 1 hours ago [-]
As I said in a sibling reply: practice is much easier and so it is much easier to get good. Film was expensive, and so few could afford to become good photographers. Sure, everyone had a camera, many of them nice SLRs with decent lenses (but probably not autofocus - for both better and worse), but it wouldn't take a lot of photos to exceed the camera's cost in film.
bluefirebrand 18 hours ago [-]
> World will have lots more of apps, and programmers.
This is actually bad for existing programmers though?
Do you not see how this devalues your skills?
platevoltage 18 hours ago [-]
I see your point, but I'm personally having a different experience.
A client of mine has gotten quite good at using Bolt and Lovable. He has since put me on 3 more projects that he dreamed up and vibe coded, projects that would have remained a figment of his imagination pre-AI.
He knows what's involved in software development, and knows that he can't take it all the way with these tools.
sarchertech 16 hours ago [-]
There are far more programmers now than in 1980, yet the average programmer makes far more (inflation adjusted) now.
Funes- 3 hours ago [-]
And the quality of what they develop is in the gutter, on average.
jimbokun 1 hours ago [-]
It was in 1980, too.
Funes- 35 minutes ago [-]
Absolutely not, not to the same extent. That's a really illogical statement on your part, considering that the technical barrier to entry to even begin to think about developing a program in 1980 was much, much higher than what it's been for more than a decade now.
kjkjadksj 14 hours ago [-]
Thank the Bangalore office for that.
FirmwareBurner 11 hours ago [-]
How much online shopping could you do from your PC in 1980? How many people had smartphones in 1980?
That's why SW devs' salaries went up like crazy in our time and not in 1980.
But what new tech will we have that will push SW dev market demand up like internet-connected PCs and smartphones did? All I see is stagnation in the near future, just maintaining or rewriting the existing shit that we have, not expanding into new markets.
WJW 6 hours ago [-]
Maintaining and rewriting existing shit is quite well paying though, and also something that AI seems to struggle with. (Funnily enough, AI seems to struggle even more with refactoring vibecoded projects than with refactoring human-written apps. What that says about the quality of the vibe coded code I don't know.)
bitpush 18 hours ago [-]
In the current state, yes. But that is also an opportunity, isn't it?
When online flight bookings came about, travel agents were displaced. The solution isn't "let's stop online flight bookings sites and protect travel agents" because that's an inefficient system
dijksterhuis 16 hours ago [-]
Why does every system need to be efficient?
hackernoops 15 hours ago [-]
Fractional reserve lending, rehypothecation, etc.
komali2 14 hours ago [-]
Under capitalism, because greater margins. Under not-capitalism, so as to free up resources and labor for other things or just increase available downtime for people.
Funes- 3 hours ago [-]
>Under capitalism, because greater margins
Under capitalism, or late-stage capitalism if you will, more efficient procedures don't normally allow for greater margins. There are countless examples of more exploitative and wasteful strategies yielding much greater margins than more efficient alternatives.
lupire 14 hours ago [-]
Sorry to be that guy, but would you prefer it if your computer and phone each cost $5000?
dijksterhuis 3 hours ago [-]
> The solution isn't "let's stop online flight bookings sites and protect travel agents" because that's an inefficient system
this is akin to the self-checkout aisles in supermarkets, some of which have been rolled back to add back in more human checkout staff.
why? people liked interacting with the inefficient humans. turns out efficiency isn’t ideal in all cases.
i wasn’t trying to argue that everything should be inefficient. i was trying to point out that not everything needs to be efficient.
two very different things, and it seems (?) you may have thought i meant the former.
amanaplanacanal 43 minutes ago [-]
I know someone who will never use self-check, because he isn't getting paid to scan his own groceries.
I, on the other hand, will use whichever gets me out of the store faster. I don't view shopping for groceries as a social occasion.
I guess it takes all types.
tonyedgecombe 11 hours ago [-]
In some ways I would, computing lost something once normal people were allowed in.
20after4 23 hours ago [-]
And now the business of wedding / portrait photographer has become hyper-competitive. Now everyone's cousin is an amateur photographer and every phone has an almost acceptable camera built in. It is much more difficult to have a profitable photography business compared to 20 years ago.
ath92 8 hours ago [-]
The game has definitely changed. It used to be profitable to be a photographer for hire, and that’s no longer the case. But the revenue generated through pictures (by influencers) has increased a lot.
If today all you do as a programmer is open jira tickets without any kind of other human interaction, AI coding agents are bad news. If you’re just using code as a means to build products for people, it might be the best thing that has happened in a long time.
jimbokun 1 hours ago [-]
> But the revenue generated through pictures (by influencers) has increased a lot.
So the job qualifications went from "understand lighting, composition, camera technology" to "be hot".
bachmeier 23 hours ago [-]
That's good to hear. Back when I got married there were some real jerks in the wedding photography business, and they weren't worried about running out of customers. Here's an actual conversation I had with one of them:
Me: "I'm getting married on [date] and I'm looking for a photographer."
Them, in the voice of Nick Burns: "We're already filling up for next year. Good luck finding a photographer this year."
Me: "I just got engaged. You never have anything open up?"
Them: "No" and hang up the phone.
The faster guys like that struggle to make a living, the better.
tonyedgecombe 11 hours ago [-]
I know a couple of professional photographers and neither of them will do weddings. It seems many of the clients are as bad as the photographers.
LargeWu 20 hours ago [-]
In the same breath, those photographers will complain about all the "amateurs" devaluing their services.
NewsaHackO 22 hours ago [-]
Definitely. What matters more is that the ability to take photos is available to more people, which is a net positive.
insane_dreamer 14 hours ago [-]
> everybody become a photographer. That only made the world better, and we got soo many more good photographers.
Not sure I agree. I haven't seen much evidence of "better photography" now that it's digital instead of film. There are a million more photos taken, yes, because the cost is zero. But quantity != quality or "better", and if you're an average person, 90% of those photos are in some cloud storage and rarely looked at again.
You could argue that drones have made photography better because it's enabled shots that were impossible or extremely difficult before (like certain wildlife/nature shots).
One thing digital photography did do is decimate the photographer profession because there is so much abundance of "good enough" photos - why pay someone to take good ones? (This may be a lesson for software development too.)
bluGill 1 hours ago [-]
While the vast majority of photos are bad, there are still more great photos mixed in than ever before. You of course won't see them, because even great photos are hidden from view in all the noise, but they are still there.
deanCommie 10 hours ago [-]
> That only made the world better, and we got soo many more good photographers. Same with YouTube & creativity.
I think you really missed the point of what these technologies and innovations actually did for society and how it applies to today, underneath the snark.
In the 1970's, if you got gifted a camera, and were willing to put in the work to figure out how to use it, you learned a skill that immediately put you in rare company.
With enough practice of that skill you could be a professional photographer, which would be a good, reliable, well-paid job. Now the barrier of entry is nothing, so it's extremely competitive to be a professional photographer, and even the ones that succeed just scrape by. And you have to stand out on other things than the technical ability to operate a camera.
That's...what's about to happen (if it hasn't already) with software developers.
bluGill 58 minutes ago [-]
> In the 1970's, if you got gifted a camera, and were willing to put in the work to figure out how to use it, you learned a skill that immediately put you in rare company.
Everyone in the 1970s was gifted a camera. Many of them got a nice SLR with a better lens than a modern smartphone has. Cameras were expensive, but within reach of most people.
Film was a different story. Today you can get 35mm film rolls for about $8 (36 pictures), plus $13 to develop (plus shipping!) and $10 for prints (in 1970 you needed prints for most purposes, though slides were an option), so $31 - where I live McDonald's starts you at $16/hour, so that roll of film costs almost 2 hours of work - before taxes.
Which is to say you couldn't afford to become skilled in 1970 unless you were rich.
iamflimflam1 10 hours ago [-]
I think what we forget is these high level languages did open up programming to people who would have been considered “nontechnical” back in the day.
ath92 8 hours ago [-]
This is true, but:
- there are more programmers today than there were back then
- the best programmers are still those who would be considered technical back then
tempodox 47 minutes ago [-]
But a sycophant with a hopeless case of Dunning-Kruger (LLM) is so much more entertaining!
platevoltage 18 hours ago [-]
Fast forward a couple decades and "Ok here it is. We call it Dreamweaver"
ashoeafoot 10 hours ago [-]
We now have assembler, now anyone can program.
No, wait, it was called natural language coding, now anyone can code.
No, wait, it was called run-anything self-fixing code.
No, wait, simplified domain-specific languages.
No, wait, it was UML-based coding.
No, wait, Excel macros.
No, wait, it's node-based drag and drop.
No, wait, it's LLMs.
The mental retardation of no-code is strong with the deciding caste; every reincarnation must be taxed.
chii 10 hours ago [-]
The big difference with LLMs is that you don't have to have a coherent and logical thought, and the LLM will "fix" that for you by morphing it into the nearest coherent expression and showing you the result.
Presumably, the LLM user will have sufficient brain capacity to verify that the result works as they have imagined (however incomplete the mental picture might be). They then have an opportunity to tweak, in real time (of sorts), to make the output closer to what they want. Repeat this as many times as needed/time available, and the output gets to be quite sufficient for purpose.
This is how traditional, bespoke software development would've worked with contractor developers. Except with LLM, the turnaround time is in minutes, rather than in days or weeks.
soulofmischief 10 hours ago [-]
What's wrong with visual programming?
ashoeafoot 2 hours ago [-]
Information density is low, some concepts are hard to display and thus absent by design from the language, the limitations of the display and debug framework become the limitations of all code executed with them... etc., etc., the list goes on forever.
But consider this: back in the day, how many mainframe devs (plus the all-important systems programmer!) would it take to conjure up a CRUD application?
Did you forget the VSAM SME or DBA? The CICS programming?
Today, one person can do that in a jiffy. Much, much less manpower.
That might be what AI does.
mrheosuper 13 hours ago [-]
Pretty sure I use English in C programs.
fuzztester 16 hours ago [-]
that was from 35 to 40 years ago.
today:
s/COBOL/SQL
and the statement is still true, except that many devs nowadays are JS-only, and are too scared or lazy as shit to learn another, relatively simple language like SQL. ("it's too much work". wtf do you think a job is. it's another name for work.)
because, you know, "we have to ship yesterday" (which funnily enough, is always true, like "tomorrow never comes").
8note 13 hours ago [-]
SQL is straightforward enough, but it's not the sketchy part. Taking down the database so other people can't use it by running a test query is the bad part.
The EXPLAINs are not nearly as straightforward to read, and the process of writing SQL is to write the explain yourself, and then try to coax the database into turning the SQL you write into that explain. It's a much less pleasant LLM chat experience.
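For what it's worth, you can usually read a plan without actually executing the query against a shared database. Here is a minimal sketch using SQLite's EXPLAIN QUERY PLAN from Python's standard library; the table and index names are made up for illustration, and other engines expose their plans in their own formats:

    import sqlite3

    # Throwaway in-memory database, purely to illustrate reading a query plan.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

    # Ask the engine how it intends to run the query, without running it for real.
    plan = conn.execute(
        "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = ?", (42,)
    ).fetchall()
    for row in plan:
        print(row)  # e.g. (..., 'SEARCH orders USING INDEX idx_orders_customer (customer_id=?)')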
michaelteter 16 hours ago [-]
Having experienced several overhyped corporate knee-jerk (and further press-amplified) silver bullets, I expect this will play out about as well as the previous ones.
And by that, I mean corps will make poor decisions that will be negative for thought workers while never really threatening executive compensation.
I see this latest one somewhat like TFA author: this is a HUGE opportunity for intelligent, motivated builders. If our jobs are at risk now or have already been lost, then we might as well take this time to make some of the things we have thought about making before but were too busy to do (or too fatigued).
In the process, we may not only develop nice incomes that are independent of PHB decisions, but some will even build things that these same companies will later want to buy for $$$.
dotancohen 8 hours ago [-]
I've already started.
I've been recording voice notes to myself for years. Until now they've effectively been write-only: the friction for recording them is often low (in settings where I can speak freely), but getting the information out of them has been difficult.
I'm now writing software to help me quickly get information out of the voice notes. So they'll be useful to me too, not just to future historians who happen upon my hard drive. I would not be able to devote the time to this without AI, even though most of the code and all the architecture is my own.
kypro 13 minutes ago [-]
> If our jobs are at risk now or have already been lost, then we might as well take this time to make some of the things we have thought about making before but were too busy to do (or too fatigued).
Do what you think is best of course, but this is a very bad recommendation for those who have lost their jobs and are unlikely to find another in software any time soon (if ever).
I said a few years ago, when people were still saying I was overreacting and AI wouldn't take jobs, that people need to reskill ASAP. If you've lost your job, learn how to paint walls or lay carpet before your emergency fund runs out. In the unlikely event you find another software job while you're retraining, then great; if not, you have a fallback.
Remember you're highly unlikely to make any serious money out of a bootstrapped startup. Statistically we know significantly fewer than 1% of bootstrapped startups make money, let alone become viable replacements for a full-time income.
Don't be stupid – especially if you have a family depending on you.
nathanfig 1 days ago [-]
Hi all - I write a lot for myself but typically don't share, hence the stream-of-consciousness style.
But I thought this might be worth blogifying just for the sake of adding some counter-narrative to the doomerism I see a lot regarding the value of software developers. Feel free to tear it apart :)
jaza 4 hours ago [-]
Please share your writings more often. Nuclear attack combines, bring it on!
jonstewart 4 hours ago [-]
Most blog posts by devs these days are stultifying in their literal earnestness. Thanks for some sly sarcasm.
randfish 1 days ago [-]
Thought it was great. Thanks for writing and submitting!
nathanfig 24 hours ago [-]
Thanks!
layer8 22 hours ago [-]
The humor was refreshing. :)
tasuki 9 hours ago [-]
I clicked only because I disagreed with the title. What a joy of an essay!
NoPicklez 14 hours ago [-]
My take is based purely on the title. I'm in the security space, not a developer, but I did study development during my degree.
I would say that when the fundamentals are easier to learn, it becomes a great time to learn anything. I remember spending so much of the software development part of my degree trying to fix bugs and get things explained by trawling through online forums, like many of us have. Looking for different ways of having concepts explained to me and how to apply them.
LLMs give us a fairly powerful tool to act as a sort of tutor: asking questions, getting feedback on code blocks, understanding concepts, finding where my code went wrong, etc. Asking it all of the dumb questions we used to go trawling for.
But I can't speak to how this translates when you're a more intermediate developer.
el_benhameen 12 hours ago [-]
I have found them quite helpful in the same way. I can bounce ideas off of them or say “here’s my understanding of this; in what ways am I incorrect?”. I don’t trust them to have pinpoint accuracy on complex problems, but given the way they’re trained, I do trust them to be directionally correct. That makes getting past hangups faster and leads me to ask more, better questions of myself, which I think means I learn faster.
KurSix 6 hours ago [-]
Yep, once you hit intermediate and beyond, I think it shifts more to using them as an accelerant
bentt 1 hours ago [-]
One thing that every working dev needs to realize, for their benefit, is that the AI gold rush is leading many, many companies to find ways to trumpet their use of AI for no other reason than to please investors. After all, if it's the big new thing, then of course you need to get on the train or get left behind. Investors are very important. They can provide money. Companies need money.
We might think, "Yeah, but so many of these dumb AI corpo-initiatives are doomed to fail!" and that's correct but the success/fail metric is not based on whether the initiatives' advertised purpose is effective. If investors respond positively in the near term, that's success. This is likely why Logitech embedded AI in their mouse software (Check and see if Logi+ AI Agent is in your task manager) https://news.logitech.com/press-releases/news-details/2024/N...
The near term crash (which will happen) in AI stuff will be because of this dynamic. All it means is that phase one of the grift cycle is completing. In the midst of this totally predictable, repeatable process, a dev's job is to get gud at whatever is truly emerging as useful. There are devs who are showing huge productivity gains through thoughtful use of AI. There are apps that leverage AI to do new and exciting things. None of this stuff happens without the human. Be that human!
eqvinox 8 hours ago [-]
Metaphors are fun, they "feel" meaningful, but… you still need to back that up.
> mechanized farm equipment
Sure, that could be a valid analogy.
Or maybe we invented CAD software for mechanical engineering, where we were making engineering drawings by hand before?
And that doesn't quite ring the same way in terms of obsoleting engineers…
dehrmann 23 hours ago [-]
The farming quote is interesting, but one of the Jevons paradox requirements is a highly elastic demand curve, and food is inelastic.
The open questions right now are how much of a demand is there for more software, and where do AI capabilities plateau.
9rx 22 hours ago [-]
Either way, as quite visibly seen by all the late-1800s mansions still lining the country roads, the era of farmers being "overpaid", as the link puts it, came about 50-75 years after the combine was invented. If the metaphor is to hold, we can assume that developers are currently poor as compared to what the LLM future holds for them.
But, there is a key distinction that we would be remiss to not take note of: By definition, farmers are the owners of the business. Most software developers aren't owners, just lowly employees. If history is to repeat, it is likely that, as usual, the owners are those who will prosper from the advancement.
slt2021 22 hours ago [-]
demand for food is very elastic. if beef becomes more expensive, cheaper protein options get more demand (chicken, pork, tofu, beans).
Fruits and all non-essential food items are famously very elastic, and constitute a large share of spending.
for example: if cheap cereal becomes abundant, it is only at the cost of poor quality, so demand for high quality cereal will increase.
the LLM driven software engineering will continuously increase the bar for quality and demand for high quality software
fulafel 10 hours ago [-]
Demand for eaten calories is not very elastic, but plentiful food crops lead to piping crops through the wasteful, environmentally harmful & unethical thing that is meat production.
giraffe_lady 23 hours ago [-]
Reported numbers vary but household food waste seems to be fairly high in developed economies, so food demand might be more elastic than intuition would expect.
dredmorbius 22 hours ago [-]
I've seen consistent values for food waste reported for at least the past 40 years, if not the past 80, in various sources. I suspect it's something of a constant. One observation I've seen is that food wastage now occurs far later in the processing cycle, which is to say, after far more resources (transport, processing, refrigeration, cooking) have been invested in it.
In the long term, food demand is elastic in that populations tend to grow.
jimbokun 1 hours ago [-]
> in that populations tend to grow.
That's no longer happening.
kwk1 22 hours ago [-]
Perhaps we should say something like "food demand has an elasticity floor."
giraffe_lady 20 hours ago [-]
For sure.
abalashov 23 hours ago [-]
I'm not sure if I agree with every aspect of the framing here; specifically, I don't think the efficiency gains are anywhere on par with a combine harvester.
However, I do agree that the premium shifts from mere "coding" ability -- we already had a big look into this with the offshoring wave two decades ago -- to domain expertise, comprehension of the business logic, ability to translate fluidly between different kinds of technical and nontechnical stakeholders, and original problem-solving ability.
nathanfig 23 hours ago [-]
Yeah I think the combine-harvester analogy is tempting because it's so easy to visualize how wheat can scale over a big square field and project that visual onto lines of code generated on a big square screen... forgetting that lines-of-code-generated is not inherently useful.
temporallobe 21 hours ago [-]
Essentially it’s the same as it always was. Back in the day, low-code or no-code solutions implemented by non-technical people always resulted in engineers having to come in behind them to clean up their mess. I’ve had quite the lucrative career doing just that.
jaza 4 hours ago [-]
Hahaha... so ChatGPT-generated Node / React apps is the new Excel with VBA macros!
ramesh31 43 minutes ago [-]
>Hahaha... so ChatGPT-generated Node / React apps is the new Excel with VBA macros!
Maybe, but also "Excel with VBA macros" has generated an unimaginable amount of value for businesses in that time as well. There's going to be room for both.
nathanfig 21 hours ago [-]
Yeah, with current-state AI I foresee more such opportunities.
Ekaros 20 hours ago [-]
I think I will have a good while left in security. That is, pointing out all the mistakes and faults... and explaining why something the AI came up with might not fully solve the problem.
So much room left, as I doubt every developer will double-check things every time by asking.
KurSix 6 hours ago [-]
LLMs are amazing tools, but they don't (yet) replace deep understanding or critical thinking. And yeah, it is a super fun time to be learning/building.
prisenco 21 hours ago [-]
Upwork is already filling up with people who have vibe-coded their way into a pit and need experienced developers to pull them out.
billy99k 21 hours ago [-]
You can find good contracts on Upwork, but you need to go through lots of bad ones. I find around 5 good contracts there per year. Even when a client agrees on a rate, Upwork has a reputation for finding inexpensive workers, and you will get many clients trying to pay you less.
I'm also a bit tired of running into people that are 'starting a contracting firm' and have 0 clients or direction yet and just want to waste your time.
nathanfig 21 hours ago [-]
Really! That could make for some really interesting stories. Fascinating to think of LLMs as a customer acquisition pipeline for developers.
platevoltage 18 hours ago [-]
I've snagged at least one of them.
clpmsf 3 hours ago [-]
I don't think that now is the best time to learn software development, but I do think that now is the best time to learn computer science.
nathanfig 37 minutes ago [-]
Can you elaborate? I might agree, I'm just curious what you mean.
rossdavidh 17 hours ago [-]
All of this is good reason that orgs _shouldn't_ be laying off developers, but none of it is a reason that they won't/aren't. In any case, I see more "if they're remote why can't they be on the low-wage side of the planet" at the moment, than I do "use AI instead of a developer", although they are no doubt related.
The more awkward truth is that most of what developers have been paid to do in the 21st century was, from the larger perspective, wasted. We mostly spent a lot of developer time in harvesting attention, not in actually making anything truly useful.
MichaelZuo 17 hours ago [-]
How does that follow…?
Most organizations do derive net benefit from laying off the below average and hiring the above average for a given compensation range, as long as the turnover is not too high.
And this delta increases when the above average can augment themselves more effectively, so it seems we should expect an even more intense sorting.
mobiuscog 4 hours ago [-]
> The day we stop valuing human contribution is the day alignment has failed.
Unfortunately, that's many businesses already, even before AI.
It's all just one big factory line. Always has been (to those at the top).
nathanfig 36 minutes ago [-]
Yeah, the social problem of "how do we use technology well" persists.
adregan 2 hours ago [-]
Here are my 2¢ on using AI at work. I’m using Claude code and my typical tmux/neovim setup.
1. I use AI to find my way in a sprawling micro(service|frontend) system that I am new to. This helps cut down massively on the “I know what to do, I just can’t figure out where.” I started a new job where everyone has years of context as to how things fit together and I have none. I feel strongly that I need to give an honest effort at finding things on my own before asking for help, and AI certainly helps there.
2. Anything I stumble upon in a dev/deployment process that leans too heavily into the “good behavior/hygiene,” I try to automate immediately for myself and then clean up to share with the team. In the past, I might have tried to adopt the common practice, but now it’s less effort to simply automate it away.
3. There is value in using AI in the same manner as I use vim macros: I use the planning mode heavily and iterate like crazy until I’m satisfied with the flow. If the task has a lot of repetition, I typically do the first one myself then let the AI take a whack at one or two. If I don’t like the process, I update the plan. Once I see things going smoothly, I give the AI the ok to finish the rest (making atomic commits so that it’s not just one big ball of wax). This is pretty similar to how I record macros (make one change yourself, record the macro on the next line, test it out for a line or 2, re-record if necessary, test again, plow through the rest).
4. When I come across something that needs to be fixed/could be improved but isn’t related to my task at hand, I do a few minutes of research and planning with the AI, and instead of coding a solution, we create a todo document or an issue in a tracking system. This wasn’t happening before because of the context switching required to write good documentation for later. Now it’s more of the same thing but akin to a dry run of a script.
5. I can quickly generate clear and easy to read reports to allow other teammates to give me feedback on work in flight. Think about a doc with before and after screenshots of changes throughout an app produced by a playwright script and a report generator that I can rerun in under a minute whenever I want.
I’m finding that I really enjoy skipping the tedious stuff, and I’m also writing higher quality stuff because I have more bandwidth. It helps me collaborate more with my non-dev peers because it lowers the barrier to sharing.
Important to note that in my experimenting, I haven’t had great luck with winding it up and setting it loose on a task. Too often it felt like being a junior engineer again, doomed to throw spaghetti at the wall. Once I started using AI as an assistant, I felt things really started to click. Software development is about writing code, but it’s about a lot of other things too. It’s nice when the AI can help write code, but it’s fantastic when it helps you accomplish the other things.
nathanfig 32 minutes ago [-]
>> I haven’t had great luck with winding it up and setting it loose on a task
Yeah there is real labor involved with trying to set up the right context and explain your goals and ensure it has the right tools, etc. etc. Sometimes the payoff is massive, other times it's like "I could have done this myself faster". It takes time to build an intuition for which is which, and I'm constantly having to tune that heuristic as new models and tools come out.
Leo-thorne 9 hours ago [-]
Now really feels like a good time to start learning how to code. I used to get completely lost reading documentation, but with Copilot, I just type a few lines and it helps fill in the logic. It feels like having a more experienced person sitting next to me.
That said, I still try to figure out the logic myself first, then let AI help polish or improve it. It is a bit slower, but when something breaks, at least I know why.
AI has definitely lowered the barrier. But whether you can actually walk through the door still depends on you.
nathanfig 38 minutes ago [-]
Yeah, in fact I think we are going through some major upheaval in development practices because we haven't yet figured out what constraints to put on our use of AI, and so by plastering it everywhere we trip ourselves up. It took a while to figure out that private members were useful and GOTO is harmful despite how tempting it might be.
I think similarly we will find that using AI to take shortcuts around design is mostly harmful, but using it to fulfill interfaces is brilliant. Eventually a set of best practices will evolve.
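To make "using it to fulfill interfaces" a bit more concrete, here is a minimal, hypothetical sketch (the names are invented for illustration): the human pins down the contract and the checks, and the generated code only has to satisfy them.

    from abc import ABC, abstractmethod

    class RateLimiter(ABC):
        @abstractmethod
        def allow(self, key: str) -> bool:
            """Return True if `key` may make another request right now."""

    class FixedWindowLimiter(RateLimiter):
        # This body is the part you might hand to an assistant: the interface above
        # and the assertions below already define what "correct" means.
        def __init__(self, limit: int):
            self.limit = limit
            self.counts: dict[str, int] = {}

        def allow(self, key: str) -> bool:
            self.counts[key] = self.counts.get(key, 0) + 1
            return self.counts[key] <= self.limit

    limiter = FixedWindowLimiter(limit=2)
    assert limiter.allow("alice") and limiter.allow("alice")
    assert not limiter.allow("alice")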
giantg2 4 hours ago [-]
This article seems to ignore the 6% sector unemployment, massive layoffs, and the terrible interview processes.
Regardless of the true number, you're right that no amount of reasoning on paper "why" we should be employed matters if the reality is different; which it clearly is for a lot of people. Reality decides in the end.
A more accurate title might have been "Why AI is a reason to become a software developer" - since the topic I discuss is entirely AI and its effects on the field, and there might be entirely non-AI reasons for not going into software.
giantg2 7 minutes ago [-]
I'm not sure how old your link is. A basic search is showing multiple sources from this year indicating the rate is more than double that 2.4 number.
karczex 1 days ago [-]
It's like "we invented Fortran so there will be no need for so many developers"
nathanfig 24 hours ago [-]
An interesting parallel because there were undoubtedly some people who worried we would lose something important in the craft of instruction-level programming, and almost certainly we have in relative terms. But in absolute numbers I am confident we have more low-level programmers than we did before Fortran.
And if I were to jump into instruction-level programming today I would start by asking an LLM where to begin...
marcosdumay 22 hours ago [-]
Fortran was a much larger jump in productivity than agentic coding...
KurSix 6 hours ago [-]
Yet here we are, with more demand for developers than ever
boxed 9 hours ago [-]
> And like mechanized farm equipment, LLMs are cheap, plentiful, getting smaller every day, and- most importantly- require no training to operate.
I... assume that was meant sarcastically, but it's not at all clear from context I think.
yodsanklai 23 hours ago [-]
> What do you do while awaiting the agents writing your code?
I browse the web. Eventually, I review the agent code and more often than not, I rewrite it.
irrational 12 hours ago [-]
> and historical romance novels will rightly remember us as rugged and sexy.
I also remember this! Maybe a subconscious influence
SeanDav 1 days ago [-]
>> "ChadGPT"
There actually is a ChadGPT but I assume the OP meant ChatGPT
nathanfig 24 hours ago [-]
Oh I should have known - yeah I was just being facetious
ByteDrifter 9 hours ago [-]
Reading this reminded me how much the learning curve is flattening. You can now learn by doing and debugging AI output. That’s a very different entry point from five years ago. Less lonely, more interactive.
nathanfig 29 minutes ago [-]
I think that's a big part of what makes this era so much fun - the interactivity tailored to my capability. I feel like I'm 13 again discovering I can write HTML and feeling that giddy delight when I try something and it just works.
everyone 10 hours ago [-]
I learned programming in 2013. I was asking questions on Stack Overflow constantly while learning; people there were super friendly and supportive and answered my questions. SO was pivotal for me learning so fast. I've been a game programmer ever since. This year I learned web dev and made my 1st commercial web app. SO is totally dead though, utterly useless these days, but I used ChatGPT to fill the same role and it worked great. It's a shame about SO though.
nathanfig 2 hours ago [-]
And worrying. LLMs don't feed themselves; we need people to continue sharing solutions. I've contributed some popular answers on SO myself and am happy that an LLM can use them in its training, but I rarely go to SO myself anymore.
rr808 15 hours ago [-]
The management at my corporate job literally say in our townhalls that they expect AI to increase productivity and reduce costs. Makes logical sense to me, the glory days of high wages are over.
nathanfig 2 hours ago [-]
Reduced costs does not necessarily equal reduced wages, but I see what you're saying. As others have pointed out, making cameras more accessible made photography more competitive.
It may be in the end that software developers make less money even as more jobs become available.
vincenthwt 14 hours ago [-]
Are you talking about the high wages of software engineers or management? Makes sense to me— the glory days of high management and CEO salaries are over.
agentultra 13 hours ago [-]
If you’re going to use LLMs to learn software development, great! Welcome!
Just, don’t skip out on learning the fundamentals. There’s no royal road to knowledge and skill. No shortcuts. No speed running, downloading kung fu, no passing go.
Why?
Because the only thing LLMs do is hallucinate. Often what they generate is what you’re looking for. It’s the right answer!
But if you don’t know what an L1 cache is or how to lay out data for SIMD, no amount of yelling at the bot is going to fix the poor performance, the security errors, or the logic errors. If you don’t know what to ask you won’t know what you’re looking at. And you won’t know how to fix it.
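To make the "lay out data for SIMD" bit concrete, here is a small sketch (in Python/numpy, since the idea is layout rather than language): vectorized code wants contiguous, homogeneous buffers rather than scattered objects.

    import numpy as np

    n = 100_000

    # Array-of-structs style: a list of Python dicts, scattered across memory,
    # so vector units (and numpy's compiled loops) can't touch it directly.
    points_aos = [{"x": float(i), "y": float(i) * 2.0} for i in range(n)]
    norms_slow = [(p["x"] ** 2 + p["y"] ** 2) ** 0.5 for p in points_aos]

    # Struct-of-arrays style: two contiguous float64 buffers that can be
    # processed in bulk, which is the layout SIMD-friendly code wants.
    xs = np.arange(n, dtype=np.float64)
    ys = xs * 2.0
    norms_fast = np.sqrt(xs * xs + ys * ys)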
So just remember to learn the fundamentals while you’re out there herding the combine space harvesters… or whatever it is kids do these days.
somethingreen 6 hours ago [-]
Now might be the last time to learn software development.
spacecadet 4 hours ago [-]
Engineering problems are human problems. For now, I guess? I saw a misinfo headline recently, to the tune of "AI is after all the water in X country", and I thought about the Anthropic paper where the model blackmailed the engineer. And then The Matrix. Cry, lol?
alganet 21 hours ago [-]
> and now with far greater reach and speed than ever before
I heard that before. Borland Delphi, Microsoft FrontPage, Macromedia Flash and so on. I was told how, in 5 years or so, these new technologies would dominate everything.
Then I learned that two scenarios exist. One of them is "being replaced by a tool", the other is "being orphaned by a tool". You need to be prepared for both.
nathanfig 21 hours ago [-]
Yes, if you built your career on FrontPage you have probably had a bad time. Many such cases.
That said, even if the specific products like Cursor or ChatGPT are not here in 5 years, I am confident we are not going to collectively dismiss the utility of LLMs.
rsynnott 3 hours ago [-]
I'm... slightly dubious, honestly, just because, historically, confident predictions of "this is the future of programming!" are almost always wrong. Throughout the 90s and into the early noughties, say, there was a very, very strong idea that things like VB and Delphi and Frontpage and Flash, where code was essentially intermingled with the UI, were The Future. And then all of that just died, to such an extent that there's really nothing like it at all today. Then there was the whole UML thing, and the "everything will run on XML" thing in the mid to late noughties...
nathanfig 2 hours ago [-]
Sounds a lot like "nothing ever happens"?
rsynnott 1 hours ago [-]
Well, arguably in programming nothing ever does happen; we're still using mostly imperative programming languages on mostly UNIX-y systems which are not dramatically different from those in the 1970s, with filesystems rather similar to those in the 70s...
Some stuff does happen, of course, but most prophesied things do not happen.
alganet 20 hours ago [-]
I can see it being useful for summarization, or creative writing. What makes you so sure that LLMs will be useful _for programming_ in the long run?
nathanfig 2 hours ago [-]
Because they are already useful for programming and unlikely to get worse!
alganet 1 hours ago [-]
We're not getting anywhere with your answer.
All the tools I mentioned before were useful for programming. They didn't get worse. Still not enough to keep them relevant over time.
I chose those tools as an example precisely because, for a while, they achieved widespread success. People made awesome things with them. Until they stopped doing so.
What brought their demise was their inherent limitations. Code written by humans in plain text was slower and harder, but didn't have those inherent limits. Code written by humans in plain text got better; those tools didn't.
wohoef 5 hours ago [-]
“The only weights I use are at the gym”
Lol
revskill 11 hours ago [-]
The hardest part is debugging.
ed_mercer 9 hours ago [-]
In theory one should be able to hand this over to a MCP debugging server.
jplusequalt 2 hours ago [-]
Soon enough you'll be able to hand over your thinking to the MCP thought server.
freekh 1 days ago [-]
Nice article! Reflects my views as well!
ramesh31 3 hours ago [-]
>Now might be the best time to learn software development
Always has been.
fuzztester 16 hours ago [-]
[flagged]
mirkodrummer 18 hours ago [-]
> LLMs really are like combine harvesters; allowing one to do the work of many.
Heck, I'm so tired of statements like this. Many who? It's already a lot that an LLM automates/helps with the boring/tedious parts of my job; I have yet to see it take over the work of 2, 5 or 10 of my colleagues. Just knowing what an awful lot these tireless dudes do, I couldn't ever imagine also doing their jobs. IMO such statements have a very short shelf life.
So no, they are not using the same version of Google.
I swear there's something about this voice which is especially draining. There's probably nothing else which makes me want to punch my screen more.
I'm completely drained after 30 minutes of browsing Google results, which these days consist of mountains of SEO-optimized garbage, posts on obscure forums, Stackoverflow posts and replies that are either outdated or have the wrong accepted answer... the list goes on.
So we are close to an AI president.
By that I don't necessarily mean the nominal function of the government; I doubt the IRS is heavily LLM-based for evaluating tax forms, mostly because the pre-LLM heuristics and "what we used to call AI" are probably still much better and certainly much cheaper than any sort of "throw an LLM at the problem" could be. But I wouldn't be surprised if the amount of internal communication, whitepapers, policy drafts and statements, etc. by mass is already at least 1/3rd LLM-generated.
(Heck, even on Reddit I'm really starting to become weary of the posts that are clearly "Hey, AI, I'm releasing this app with these three features, please blast that out into a 15-paragraph description of it that includes lots of emojis and also describes in a general sense why performance and security are good things." And if anything the incentives there slightly militate against that, as the general commenter base is starting to get pretty frosty about this. How much more popular it must be where nobody will call you out on it and everybody is pretty anxious to figure out how to offload the torrent-of-words portion of their job onto machines.)
As in, copied it with a prompt in.
Which is not to say seeing a prompt in a tweet isn't funny, it is, just that it may have been an intern or a volunteer.
"I went to work early that day and noticed my monitor was on, and code was being written without anyone pressing any keys. Something had logged into my machine and was writting code. I ran to my boss and told him my computer had been hacked. He looked at me, concerned, and said I was hallucinating. It's not a hacker, he said. It's our new agent. While you were sleeping, it built the app we needed. Remember that promotion you always wanted? Well, good news buddy! I'm promoting you to Prompt Manager. It's half the money, but you get to watch TikTok videos all day long!'"
Hard to find any real reassurance in that story.
Prompt engineering is like singing: sure thing everyone can physically sing… now whether it’s pleasant listening to them is another topic.
It can bounce back over time and maybe leave us better off than before but the short term will not be pretty. Think industrial revolution where we had to stop companies by law from working children to literal death.
Whether the working man or the capital class profits from the rise of productivity is a questions of political power.
We have seen that productivity rises do not increase work compensation anymore: https://substack.com/home/post/p-165655726
Especially we as software engineers are not prepared for this fight as unions barely exist in our field.
We already saw mass layoffs by the big tech leaders and we will see it in smaller companies as well.
Sure, there will always be a need for experienced devs in some fields that are security critical or that need to scale, but that simple CRUD app that serves 4 concurrent users? Yeah, Greg from marketing will be able to prompt that.
It doesn't need to be the case that prompt engineers are paid less money, true. But with us being so disorganized, the corporations will take the opportunity to cut costs.
You can fight without unions. Tell the truth about LLMs: They are crutches for power users that do not really work but are used as excuses for firing people.
You can refuse to work with anyone writing vapid pro-LLM blog posts. You can blacklist them in hiring.
This addresses the union part. It is true that software engineers tend to be conflict averse and not very socially aware, so many of them follow the current industry opinion like lemmings.
If you want to know how to fight these fights, look at the permanent government bureaucracies. They prevail in the face of "new" ideas every 4 years.
Search youtube for "yes minister" :)
-----
On topic, I think it's a fair point that fighting is borderline useless. Companies that don't embrace new tech will go out of business.
That said, it's entirely unclear what the implications will be. Often new capabilities don't mean the industry will shrink. The industry hasn't shrunk as a result of a 100x increase in compute and storage, or the decrease in size and power usage.
Computers just became more useful.
I don't think we should be too excited about AI writing code. We should be more excited about the kinds of program we can write now. There is going to be a new paradigm of computer interaction.
And you can fly without wings--just very poorly.
Unions are extremely important in the fight of preserving worker rights, compensation, and benefits.
lol, good luck with that.
You thinking that one or two people doing a non-organized _boycott_ is the same thing as a union tells a lot about you.
It is possible that obedient people need highly paid union bosses, i.e., new leaders they can follow.
Unions are for people who won't just accept anything, and who know that they are a target when taking action alone or in non-organized ways.
Unions are the way to multiply the forces and work as a group with common interests, it is for people that are not extremely selfish and egocentric.
Nobody wants to inhale toxic fumes in some factory? Well then the company had better invest in safety equipment, or the work doesn't get done. We don't need a union for this.
If you leave it up to each worker to fend for himself with no negotiating power beyond his personal freedom to walk out, you get sweatshops and poorhouses in any industry where labor is fungible. If you want nice societies where average people can live in good homes with yards and nearby playgrounds and go to work at jobs that don't destroy their bodies and souls, then something has to keep wages at a level to support all that.
I'm not necessarily a fan of unions; I think in many cases you end up with the union screwing you from one side while the corporation screws you from the other. And the public sector unions we have today team up with the state to screw everyone else. But workers at least need the freedom to organize, or all the pressure on wages and conditions will be downward for any job that most people can do. The alternative is to have government try to keep wages and conditions up, and it's not good at that, so it just creates inflation with wages trailing behind.
We tried that in the past. The work still got done, and workers just died more often. If you want to live in that reality move to a developing country with poor labor protections.
This works only if everyone is on board with it. If they're not, you're shooting yourself in the foot while job hunting.
This all assumes that such revolutions are built on resiliency and don't actually destroy the underpinning requirements of organized society. Its heavily skewed towards survivor bias.
Our greatest strength as a species is our ability to communicate knowledge, experience, and culture, and act as one large overarching organism when threats appear.
Take away communication, and the entire colony dies. No organization can occur, no signaling. There are two ways to take away communication, you prevent it from happening, or you saturate the channel to the Shannon Limit. The latter is enabled by AI.
It's like an ant hill or a bee hive where a chemical has been used to actively and continually destroy the pheromones the ants rely upon for signalling. What happens? The workers can't work, food can't be gathered, the hive dies. The organism is unable to react or adapt. Collapse syndrome.
Our society is not unlike the ant hill or bee hive. We depend on a fine balance where factors of production do productive work, and in exchange for that work they get food, or more precisely money, which they use to buy food. The economy runs because of the circulation of money from producer to factor to producer. When it pools in fewer hands and stays there, distortions occur, these self-sustain, and eventually we are at the point where no production can occur because monetary properties are lost under fiat money printing. There is a narrow working range where outside the range on each side everything catastrophically fails. Hyper-inflation/deflation.
AI on the other hand eliminates capital formation of the individual. The time value of labor is driven to zero. There is a great need for competent workers for jobs, but no demand because no match can occur; communication is jammed. (ghost jobs/ghost candidates)
So you have failures on each end, which self-sustain towards socio-economic collapse. No money circulation going in means you can't borrow from the future through money printing. Debasement then becomes time limited and uncontrollable through debt traps, narrow working price range caused by consistent starvation of capital through wage suppression opens the door to food insecurity, which drives violence.
Resource extraction processes have destroyed the self-sustaining flows such that food in a collapse wouldn't even support half our current population, potentially even a quarter globally. 3 out of 4 people would die. (Malthus/Catton)
These things happen incredibly slowly and gradually, but there is a critical point we're about 5 years away from it if things remain unchanged, there is the potential that we have already passed this point too. Objective visibility has never been worse.
This is a point of no return where the dynamics are beyond any individual person, and after that point everyone involved in that system is dead, they just don't know it yet.
Mutually Assured Destruction would mean the environment becomes uninhabitable if chaos occurs and order is lost in such a breakdown.
We each have significant bias to not consider the unthinkable. A runaway positive feedback system eventually destroys itself, and like a dam that has broken with the waters rushing towards individuals; no individual can hold back those forces.
Didn’t Greg-from-marketing’s life just get a lot better at the same time?
Software development has a huge barrier to entry which keeps the labor pool relatively small, which keeps wages relatively high. There's going to be a way larger pool of people capable of 'prompt engineering' which is going to send wages proportionally way down.
Every time things turn bad a lot of people jump out and yell it is the end of the tech. They have so far been wrong. Only time will tell if they are right this time, though I personally doubt it.
It objectively takes less expertise and background knowledge to produce semi-working code. That lowers the barrier to entry, allowing more people to enter the market, which drives down salaries for everyone.
I think you got the analogy wrong. Not everyone can sing professionally, but most people can type text into a text-to-speech synthesis system to produce a workable song.
I suppose because every new job title that has come out in the last 20+ years has followed the same pattern: initially the same or slightly more money, followed by significant reductions in workforce activities shortly thereafter, followed by coordinated mass layoffs and no work after that.
When 70% of the economy is taken over by a machine that can work without needing food, where can anyone go to find jobs to feed themselves let alone their children.
The underlying issues have been purposefully ignored by the people who are pushing these changes, because these are problems for next quarter, and money printing through non-fractional-reserve banking decouples the need to act.
Its all a problem for next quarter, which just gets kicked repeatedly until food security becomes a national security issue.
Politicians already don't listen to what people have to say; what makes you think they'll be able to do anything once organized violence starts happening because food is no longer available, because jobs are no longer available?
The idiots and political violence we see right now are nothing compared to what comes when people can't get food, when their mindset changes from "we can work within the system" to "there is no way out, only through". When existential survival depends on removing the people responsible by any means, these things happen, and when the environment is ripe for it, they have friends everywhere.
UBI doesn't work because non-market socialism fails. You basically have a raging fire that will eventually reach every single person and burn them all alive, and it was started by evil blind idiots that wanted to replace human agency.
It would have been funnier if the story then took a turn and ended with it being the AI complaining about a human writing code instead of it.
I fear this will be more and more of a problem with the TikTok/instant gratification/attention is only good for less than 10 seconds -generation. Deep thinking has great value in many situations.
"Funnily" enough, I see management more and more reward this behavior. Speed is treated as vastly more important than driving in the right direction, long-term thinking. Quarterly reports, etc etc.
Yes, it's supportive and helps you stay locked in. But it also serves as a great frustration lightning rod. I enjoy being an unsavory person to the LLM when it behaves like a buffoon.
Sometimes you need a pressure release valve. Better an LLM than a person.
P.S: Skynet will not be kind to me.
Don't people realize it's a machine "pretending" to be human?
I’ll stick to human emotional support.
With an LLM the main benefit is speed: seconds rather than the minutes or hours you'd wait on Stack Overflow.
"Wow, show it to me!"
"OK here it is. We call it COBOL."
"Before 1954, almost all programming was done in machine language or assembly language. Programmers rightly regarded their work as a complex, creative art that required human inventiveness to produce an efficient program."
-John Backus, "The History of Fortran I, II, and III", https://dl.acm.org/doi/10.1145/800025.1198345
"The IBM Mathematical Formula Translating System or briefly, FORTRAN, will comprise a large set of programs to enable the IBM 704 to accept a concise formulation of a problem in terms of a mathematical notation and to produce automatically a high speed 704 program for the solution of the problem."
-IBM, "Specifications for the IBM Mathematical FORmula TRANslating System, FORTRAN", http://archive.computerhistory.org/resources/text/Fortran/10...
"FORTRAN should virtually eliminate coding and debugging" https://news.ycombinator.com/item?id=3970011
But it has still been immensely useful and a durable paradigm, even though its usage hasn't turned out exactly as envisioned.
Spreadsheets flip the usual interface from code-first to data-first, so the program directly presents users with a version of what they're holding in their head. It lets them build up the code step by step while focusing on what they want to do (transform data), instead of doing it in their head while focusing on the how (the code).
https://www.youtube.com/watch?v=kOO31qFmi9A
Maybe a similar bifurcation will arise where there are vibe coders who use LLMs to write everything, and there are real engineers who avoid LLMs.
Maybe we’re seeing the beginning of that with the whole bifurcation of programmers into two camps: heavy AI users and AI skeptics.
I'm more surprised by software engineers who do know these things than by the ones who don't.
It’s not that SQL is hard, it’s that for any discipline the vast majority of people don’t have a solid grasp of the tools they’re using. Ask most tradespeople about the underlying thing they’re working with and you’ll have the same problem.
I share your sentiment though - I'm a data engineer (8 years) turned product engineer (3 years) and it astounds me how little SQL "normal" programmers know. It honestly changed my opinion on ORMs - it's not like the SQL people would write goes beyond the basic select/filter/count patterns, which are the most that non-data people know.
Reasons:
- I can compose queries, which in turn makes them easier to decompose
- It's easier to spot errors
- I avoid parsing SQL strings
- It's easier to interact with the rest of the code, both functions and objects
If I need to make just a query I gladly write SQL
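For illustration, a minimal sketch of what composing queries can look like, assuming SQLAlchemy Core (my choice of query builder for the example, not necessarily what the parent uses):

    # Build one base query, then compose variations from it instead of splicing SQL strings.
    from sqlalchemy import Column, Integer, MetaData, String, Table, select

    metadata = MetaData()
    users = Table(
        "users", metadata,
        Column("id", Integer, primary_key=True),
        Column("name", String),
        Column("country", String),
    )

    base = select(users.c.id, users.c.name)          # reusable building block
    dutch = base.where(users.c.country == "NL")      # compose: add a filter
    page = dutch.order_by(users.c.name).limit(10)    # compose: add ordering and paging
    print(page)                                      # renders the SQL; parameters stay separate

Each step is an ordinary value you can pass around, test, and take apart again, which is the decomposability point above.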
It's just a shame that many languages don't support relational algebra well.
We had relations as a datatype and all the relevant operations over them (like join) in a project I was working on. It was great! Very useful for expressing business logic.
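Something in that spirit can be sketched in a few lines of plain Python, treating relations as sets of rows and writing a natural join over them (toy data and names invented for the example):

    # Toy relations as sets of frozen rows; a natural join over whatever columns they share.
    def natural_join(r, s):
        out = set()
        for a in r:
            for b in s:
                da, db = dict(a), dict(b)
                shared = da.keys() & db.keys()
                if all(da[k] == db[k] for k in shared):
                    out.add(frozenset({**da, **db}.items()))
        return out

    emp = {frozenset({"emp": "ada", "dept": "eng"}.items()),
           frozenset({"emp": "carl", "dept": "ops"}.items())}
    dept = {frozenset({"dept": "eng", "floor": 3}.items())}

    for row in natural_join(emp, dept):
        print(dict(row))   # {'emp': 'ada', 'dept': 'eng', 'floor': 3}

Because the relations are sets, the result has no ordering and no duplicates, which is exactly the property discussed further down the thread.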
Is this true? It doesn't seem true to me.
Yes, there are so many so-called developers in backend work who do not know how to do basic SQL - anything bigger than a simple WHERE clause.
I wouldn't even get started on using indexes in the database.
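To make "bigger than a simple WHERE clause" concrete, here is roughly the level I mean, sketched with Python's built-in sqlite3 (table and data invented for the example):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
        INSERT INTO orders (customer, total) VALUES
            ('ada', 10.0), ('ada', 25.0), ('carl', 5.0);
        CREATE INDEX idx_orders_customer ON orders(customer);
    """)

    # One step past a simple WHERE: aggregate per customer and filter the groups.
    for row in con.execute("""
        SELECT customer, COUNT(*) AS n, SUM(total) AS spent
        FROM orders
        GROUP BY customer
        HAVING SUM(total) > 20
        ORDER BY spent DESC
    """):
        print(row)   # ('ada', 2, 35.0)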
(Also of other food, energy, and materials sourcing: fishing, forestry, mining, etc.)
This was the insight of the French economist François Quesnay in his Tableau économique, foundation of the Physiocratic school of economics.
> Strictly speaking, farming is where all our livelihoods come from, in the greatest part. We're all living off the surplus value of food production.
I don't think farming is special here, because food isn't special. You could make exactly the same argument for water (or even air) instead of food, and all of a sudden all our livelihoods would derive ultimately from the local municipal waterworks.
Whether that's a reductio ad absurdum of the original argument, or a valuable new perspective on the local waterworks is left as an exercise to the reader.
Working the summer fields was one of the least desirable jobs, but it still gave local students with no particular skills a decent supplemental income for their region.
A good example of this phenomenon is sports. Even though it can't be done remotely, it's so talent-dependent that it's often better to find a great player in a foreign country and ask them to work for you, rather than relying exclusively on local talent. If it could be a remote job, this effect would be even greater.
We increase the overall total prosperity with that automation.
Well, what we had before SQL[1] was QUEL, which is effectively the same as Alpha[2], except in "English". Given the previous assertion about what came before SQL, clearly not. I expect SQL garnered favour because it is tablational instead of relational, which is the quality that makes it easier to understand for those not heavy in the math.
[1] Originally known as SEQUEL, a fun word play on it claiming to be the QUEL successor.
[2] The godfather language created by Codd himself.
The Alpha/QUEL lineage chose relations, while SQL went with tables. Notably, a set has no ordering or duplicates, which I suggest is in contrast to how the layman tends to think about the world, and is thus seen as an impediment when choosing between technology options. There are strong benefits to choosing relations over tables, as Codd wrote about at length, but they tend not to show up until you get into a bit more complexity. By the time your work reaches that point, the choice of technology is apt to already be made.
With care, SQL enables mimicking relations to a reasonable degree when needed, which arguably offers the best of all worlds. That said, virtually all of the SQL bugs I see in the real world come as a result of someone not putting in enough care in that area. When complexity grows, it becomes easy to overlook the fine details. Relational calculus would help by enforcing it. But, tradeoffs, as always.
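As a tiny illustration of the duplicate-rows gap and the "with care" workaround, using Python's sqlite3 (any SQL engine would show the same thing):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE tags_bag (name TEXT)")                  # a bag: duplicates allowed
    con.executemany("INSERT INTO tags_bag VALUES (?)", [("sql",), ("sql",)])
    print(con.execute("SELECT COUNT(*) FROM tags_bag").fetchone())    # (2,)

    # "With care": a uniqueness constraint (or SELECT DISTINCT) restores set semantics.
    con.execute("CREATE TABLE tags_set (name TEXT PRIMARY KEY)")
    con.execute("INSERT INTO tags_set VALUES ('sql')")
    try:
        con.execute("INSERT INTO tags_set VALUES ('sql')")
    except sqlite3.IntegrityError as err:
        print("duplicate rejected:", err)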
>SQL [...] is a database language [...] used for access to pseudo-relational databases that are managed by pseudo-relational database management systems (RDBMS).
>SQL is based on, but is not a strict implementation of, the relational model of data, making SQL “pseudo-relational” instead of truly relational.
>The relational model requires that every relation have no duplicate rows. SQL does not enforce this requirement.
>The relational model does not specify or recognize any sort of flag or other marker that represents unspecified, unknown, or otherwise missing data values. Consequently, the relational model depends only on two-valued (true/false) logic. SQL provides a “null value” that serves this purpose. In support of null values, SQL also depends on three-valued (true/false/unknown) logic.
Or, in other words, "relation" does not mean the relations between the tables as many assume: the tables, as a set of tuples, are the relations.
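The null / three-valued-logic point quoted above is easy to demonstrate as well; a small sqlite3 sketch with an invented table:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (x INTEGER)")
    con.executemany("INSERT INTO t VALUES (?)", [(1,), (None,)])

    # NULL comparisons evaluate to 'unknown', so the NULL row matches neither x = 1 nor x <> 1.
    print(con.execute("SELECT COUNT(*) FROM t WHERE x = 1").fetchone())       # (1,)
    print(con.execute("SELECT COUNT(*) FROM t WHERE x <> 1").fetchone())      # (0,)
    print(con.execute("SELECT COUNT(*) FROM t WHERE x IS NULL").fetchone())   # (1,)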
None of these had "semantic expressivity" as their strength.
> If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?
Yes.
0 - https://en.wikipedia.org/wiki/ISAM
They are very much the exception that proves the rule though.
(I'd love for someone to substantiate or debunk this for me.)
Incorrect.
Encoding a program was considered secretarial work, not the act of programming itself. Over time, "encoding" was shortened to "coding."
This is why the industry term "coder" is a pejorative descriptor.
For some people some of the time. I don't think that's true in general.
It is not.
Most people miss the fact that technical improvements increase the pie in ways that were not possible before.
When digital cameras became popular, everybody became a photographer. That only made the world better, and we got so many more good photographers. Same with YouTube and creativity.
And same with coding and LLMs. The world will have lots more apps, and more programmers.
I disagree with the "only" part here. Imagine a distribution curve of photos, with shitty photos on the left and masterpieces on the right, where the height of the curve is how many photos there are to be seen at that quality.
The digital camera transition massively increased the height of the curve at all points. And thanks to things like better autofocus, better low-light performance, and a radically faster iteration loop, it probably shifted the low and middle ends to the right.
It almost certainly increased the number of breathtaking, life-changing photos out there. Digital cameras are game-changers for photojournalists traveling in difficult locations.
However... the curve is so high now, the sheer volume of tolerably good photos so overwhelming, that I suspect the average person actually sees fewer great photos than they did twenty years ago. We all spend hours scrolling past nice-but-forgettable sunset shots on Instagram and miss out on the amazing stuff.
We are drowning in a sea of "pretty good". It is possible for there to be too much media. Ultimately, we all have a finite amount of attention to spend before we die.
Meaning no disrespect to photographers, I'm starting to think that a probable outcome of all the AI investment is a sharp uptick in shovelware.
If we can get AIs to build "pretty good" things - or even just "pretty average" things - cheaply, then our app stores, news feeds, ad feeds, company directives, etc, will be continuously swamped with it.
You can use AI to filter out the shovelware, so you never have to see it.
I still find great value in the TCM cable channel. Simply because if I tune in at a random time, it's likely to be showing an excellent old film I either never heard of or never got around to watching.
The service they are offering is curation, which has a lot of value in an age of infinite content flooding our attention constantly.
The reason to take all 400, though, is that every once in a while one photo is obviously better than another for some reason. You also want several angles, because sometimes the light will be wrong at the moment, or someone will happen to be in the way of your shot...
And sometimes it is even combining elements from different photos: Alice had her eyes closed in this otherwise great shot, but in this other shot her eyes were open. A little touch-up and we've got the perfect photo.
Did it?
people now stand around on dance floors taking photos and videos of themselves instead of getting on dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.
people taking their phone out and videoing / photographing something awful happening, instead of doing something helpful.
people travel to remote areas where the population has been separated from humanity and do stupid things like leave a can of coke there, for view count.
it’s not made things better, it just made things different. whether that’s better or worse depends on your individual perspective for a given example.
so, i disagree. it hasn’t only made things better. it made some things easier. some things better. some things worse. some things harder.
someone always loses, something is always lost. would be good if more people in tech remembered that progress comes at a cost.
There are other types of dances where dancers are far more interested in the dance than selfies: Lindy Hop, Blues, Balboa, Tango, Waltz, Jive, Zouk, Contra, and West Coast Swing to name a few. Here are videos from the Blues dance I help organize where none of the dancers are filming themselves:
* https://www.facebook.com/61558260095218/videos/7409340551418...
* https://www.facebook.com/reel/3659488930863692
Though, I'll grant that there's not really a way to argue this without showing videos
Demand is way down because, while a $5000 lens on a nice camera is better than my phone lens, my phone is close enough for most purposes. Also, my phone is effectively free, whereas in the days of film a single roll, by the time you developed it, cost significant money (I remember as a kid getting a camera for my birthday and then my parents wouldn't get me film for it - in hindsight I suspect every roll of film cost my dad half an hour of work, and he was a well-paid software developer). That cost meant you couldn't afford to practice taking pictures; every single one had to be perfect. So if you wanted a nice picture of the family, it was best to pay a professional, who because of experience and equipment was likely to take a much better one than you could (and if something went wrong they would retake it for free).
It is pretty hard to break out, but people still make names for themselves, either from experience on assignments like in the old days or from Instagram and other social media followings. People still need weddings shot and professional portraits taken, which takes some skill in understanding the logistics of doing that job well and efficiently and managing your equipment.
This is actually bad for existing programmers though?
Do you not see how this devalues your skills?
A client of mine has gotten quite good at using Bolt and Lovable. He has since put me on 3 more projects that he dreamed up and vibe coded that would just be a figment of his imagination pre-AI.
He knows what's involved in software development, and knows that he can't take it all the way with these tools.
That's why software devs' salaries went up like crazy in our time and not in 1980.
But what new tech will we have, that will push the SW dev market demand up like internet connected PCs and smartphones did? All I see is stagnation in the near future, just maintaining or rewriting the existing shit that we have, not expanding into new markets.
When online flight bookings came about, travel agents were displaced. The solution isn't "let's stop online flight booking sites and protect travel agents", because that's an inefficient system.
Under capitalism, or late-stage capitalism, if you will, more efficient procedures aren't normally allowing for greater margins. There are countless examples of more exploitative and wasteful strategies yielding much greater margins than more efficient alternatives.
this is akin to the self-checkout aisles in supermarkets, some of which have been rolled back to add back in more human checkout staff.
why? people liked interacting with the inefficient humans. turns out efficiency isn’t ideal in all cases.
i wasn’t trying to argue that everything should be inefficient. i was trying to point out that not everything needs to be efficient.
two very different things, and it seems (?) you may have thought i meant the former.
I, on the other hand, will use whichever gets me out of the store faster. I don't view shopping for groceries as a social occasion.
I guess it takes all types.
If today all you do as a programmer is open jira tickets without any kind of other human interaction, AI coding agents are bad news. If you’re just using code as a means to build products for people, it might be the best thing that has happened in a long time.
So the job qualifications went from "understand lighting, composition, camera technology" to "be hot".
Me: "I'm getting married on [date] and I'm looking for a photographer."
Them, in the voice of Nick Burns: "We're already filling up for next year. Good luck finding a photographer this year."
Me: "I just got engaged. You never have anything open up?"
Them: "No" and hang up the phone.
The faster guys like that struggle to make a living, the better.
Not sure I agree. I haven't seen much evidence of "better photography" now that it's digital instead of film. There are a million more photos taken, yes, because the marginal cost is zero. But quantity != quality or "better", and if you're an average person, 90% of those photos are in some cloud storage and rarely looked at again.
You could argue that drones have made photography better because it's enabled shots that were impossible or extremely difficult before (like certain wildlife/nature shots).
One thing digital photography did do is decimate the photographer profession because there is so much abundance of "good enough" photos - why pay someone to take good ones? (This may be a lesson for software development too.)
I think you really missed the point of what these technologies and innovations actually did for society and how it applies to today, underneath the snark.
In the 1970's, if you got gifted a camera, and were willing to put in the work to figure out how to use it, you learned a skill that immediately put you in rare company.
With enough practice of that skill you could become a professional photographer, which was a good, reliable, well-paid job. Now the barrier to entry is nothing, so it's extremely competitive to be a professional photographer, and even the ones who succeed just scrape by. And you have to stand out on things other than the technical ability to operate a camera.
That's...what's about to happen (if it hasn't already) with software developers.
Everyone in the 1970s was gifted a camera. Many of them got a nice SLR with a better lens than a modern smartphone. Cameras were expensive, but within reach of most people.
Film was a different story. Today you can get 35mm film rolls for about $8 (36 pictures), plus $13 to develop (plus shipping!), and $10 for prints (in 1970 you needed prints for most purposes, though slides were an option), so $31 - where I live McDonald's starts you at $16/hour, so that roll of film costs almost 2 hours of work - before taxes.
Which is to say you couldn't afford to become skilled in 1970 unless you were rich.
No, wait, it was called natural-language coding; now anyone can code.
No, wait, it was called run-anything self-fixing code. No, wait, simplified domain-specific languages.
No, wait, it was UML-based coding.
No, wait, Excel macros.
No, wait, it's node-based drag and drop.
No, wait, it's LLMs.
The mental retardation of no code is strong with the deciding caste, every reincarnation must be taxed.
Presumably, the LLM user will have sufficient brain capacity to verify that the result works as they have imagined (however incomplete the mental picture might be). They then have an opportunity to tweak, in real time (of sorts), to make the output closer to what they want. Repeat this as many times as needed/time available, and the output gets to be quite sufficient for purpose.
This is how traditional, bespoke software development would've worked with contractor developers. Except with an LLM, the turnaround time is minutes rather than days or weeks.
But consider this: back in the day, how many mainframe devs (plus the all-important systems programmer!) would it take to conjure up a CRUD application?
Did you forget the VSAM SME or the DBA? The CICS programming?
Today, one person can do that in a jiffy. Much, much less manpower.
That might be what AI does.
today:
s/COBOL/SQL
and the statement is still true, except that many devs nowadays are JS-only, and are too scared or lazy as shit to learn another, relatively simple language like SQL. ("it's too much work". wtf do you think a job is. it's another name for work.)
because, you know, "we have to ship yesterday" (which funnily enough, is always true, like "tomorrow never comes").
The explains are not nearly as straightforward to read, and the process of writing SQL is to work out the explain yourself, then try to coax the database into turning the SQL you write into that explain. It's a much less pleasant LLM chat experience.
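For what it's worth, the plan is at least easy to pull up programmatically; a small sqlite3 sketch (invented table) showing a query that hits an index versus one that scans:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
        CREATE INDEX idx_orders_customer ON orders(customer);
    """)

    # The last column of each row is the plan text; exact wording varies by SQLite version.
    for row in con.execute(
            "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'ada'"):
        print(row)   # ... SEARCH ... USING INDEX idx_orders_customer (customer=?)

    for row in con.execute(
            "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE total > 10"):
        print(row)   # ... SCAN orders  (no index on total, so a full scan)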
And by that, I mean corps will make poor decisions that will be negative for thought workers while never really threatening executive compensation.
I see this latest one somewhat like TFA author: this is a HUGE opportunity for intelligent, motivated builders. If our jobs are at risk now or have already been lost, then we might as well take this time to make some of the things we have thought about making before but were too busy to do (or too fatigued).
In the process, we may not only develop nice incomes that are independent of PHB decisions, but some will even build things that these same companies will later want to buy for $$$.
I've been recording voice notes to myself for years. Until now they've effectively been write-only: the friction of recording them is often low (in settings where I can speak freely), but getting the information back out of them has been difficult.
I'm now writing software to help me quickly get information out of the voice notes. So they'll be useful to me too, not just to future historians who happen upon my hard drive. I would not be able to devote the time to this without AI, even though most of the code and all the architecture is my own.
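Not the poster's actual code, but a minimal sketch of the kind of pipeline being described, assuming the open-source openai-whisper package for the transcription step and made-up paths:

    # Transcribe each note to text; ordinary search (or an LLM summary) can then work
    # over the transcripts instead of the raw audio.
    import glob
    import whisper   # openai-whisper

    model = whisper.load_model("base")
    for path in glob.glob("voice_notes/*.mp3"):
        text = model.transcribe(path)["text"]
        with open(path + ".txt", "w") as f:
            f.write(text)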
Do what you think is best of course, but this is a very bad recommendation for those who have lost their jobs and are unlikely to find another in software any time soon (if ever).
I said a few years ago, when people were still saying I was overreacting and that AI wouldn't take jobs, that people need to reskill ASAP. If you've lost your job, learn how to paint walls or lay carpet before your emergency fund runs out. In the unlikely event you find another software job while you're retraining, then great; if not, you have a fallback.
Remember you're highly unlikely to make any serious money out of a bootstrapped startup. Statistically we know that significantly fewer than 1% of bootstrapped startups make money, let alone become viable replacements for a full-time income.
Don't be stupid – especially if you have a family depending on you.
But I thought this might be worth blogifying just for the sake of adding some counter-narrative to the doomerism I see a lot regarding the value of software developers. Feel free to tear it apart :)
I would say that when the fundamentals are easier to learn, it becomes a great time to learn anything. I remember spending so much of my software development degree trying to fix bugs and get things explained by trawling through online forums, like many of us have - looking for different ways of having concepts explained to me and how to apply them.
LLMs give us a fairly powerful tool to act as a sort of tutor: asking questions, getting feedback on code blocks, understanding concepts, seeing where my code went wrong, etc. - asking it all of the dumb questions we used to go trawling for.
But I can't speak to how this translates when you're a more intermediate developer.
We might think, "Yeah, but so many of these dumb AI corpo-initiatives are doomed to fail!" and that's correct but the success/fail metric is not based on whether the initiatives' advertised purpose is effective. If investors respond positively in the near term, that's success. This is likely why Logitech embedded AI in their mouse software (Check and see if Logi+ AI Agent is in your task manager) https://news.logitech.com/press-releases/news-details/2024/N...
The near term crash (which will happen) in AI stuff will be because of this dynamic. All it means is that phase one of the grift cycle is completing. In the midst of this totally predictable, repeatable process, a dev's job is to get gud at whatever is truly emerging as useful. There are devs who are showing huge productivity gains through thoughtful use of AI. There are apps that leverage AI to do new and exciting things. None of this stuff happens without the human. Be that human!
> mechanized farm equipment
Sure, that could be a valid analogy.
Or maybe we invented CAD software for mechanical engineering, where we were making engineering drawings by hand before?
And that doesn't quite ring the same way in terms of obsoleting engineers…
The open questions right now are how much of a demand is there for more software, and where do AI capabilities plateau.
But, there is a key distinction that we would be remiss to not take note of: By definition, farmers are the owners of the business. Most software developers aren't owners, just lowly employees. If history is to repeat, it is likely that, as usual, the owners are those who will prosper from the advancement.
fruits and all non-essential food items are famously very elastic, and constitute a large share of spending.
for example: if cheap cereal becomes abundant, it is only at the cost of poor quality, so demand for high quality cereal will increase.
LLM-driven software engineering will continuously raise the bar for quality and the demand for high-quality software.
In the long term, food demand is elastic in that populations tend to grow.
That's no longer happening.
However, I do agree that the premium shifts from mere "coding" ability -- we already had a big look into this with the offshoring wave two decades ago -- to domain expertise, comprehension of the business logic, ability to translate fluidly between different kinds of technical and nontechnical stakeholders, and original problem-solving ability.
Maybe, but also "Excel with VBA macros" has generated an unimaginable amount of value for businesses in that time as well. There's going to be room for both.
So much room left, as I doubt every developer will double-check things every time by asking.
I'm also a bit tired of running into people that are 'starting a contracting firm' and have 0 clients or direction yet and just want to waste your time.
The more awkward truth is that most of what developers have been paid to do in the 21st century was, from the larger perspective, wasted. We mostly spent a lot of developer time in harvesting attention, not in actually making anything truly useful.
Most organizations do derive net benefit from laying off the below average and hiring the above average for a given compensation range, as long as the turnover is not too high.
And this delta increases when the above average can augment themselves more effectively, so it seems we should expect an even more intense sorting.
Unfortunately, that's many businesses already, even before AI. It's all just one big factory line. Always has been (to those at the top).
1. I use AI to find my way in a sprawling micro(service|frontend) system that I am new to. This helps cut down massively on the “I know what to do, I just can’t figure out where.” I started a new job where everyone has years of context as to how things fit together and I have none. I feel strongly that I need to give an honest effort at finding things on my own before asking for help, and AI certainly helps there.
2. Anything I stumble upon in a dev/deployment process that leans too heavily into the “good behavior/hygiene,” I try to automate immediately for myself and then clean up to share with the team. In the past, I might have tried to adopt the common practice, but now it’s less effort to simply automate it away.
3. There is value in using AI in the same manner as I use vim macros: I use the planning mode heavily and iterate like crazy until I’m satisfied with the flow. If the task has a lot of repetition, I typically do the first one myself then let the AI take a whack at one or two. If I don’t like the process, I update the plan. Once I see things going smoothly, I give the AI the ok to finish the rest (making atomic commits so that it’s not just one big ball of wax). This is pretty similar to how I record macros (make one change yourself, record the macro on the next line, test it out for a line or 2, re-record if necessary, test again, plow through the rest).
4. When I come across something that needs to be fixed/could be improved but isn’t related to my task at hand, I do a few minutes of research and planning with the AI, and instead of coding a solution, we create a todo document or an issue in a tracking system. This wasn’t happening before because of the context switching required to write good documentation for later. Now it’s more of the same thing but akin to a dry run of a script.
5. I can quickly generate clear and easy to read reports to allow other teammates to give me feedback on work in flight. Think about a doc with before and after screenshots of changes throughout an app produced by a playwright script and a report generator that I can rerun in under a minute whenever I want.
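For point 5, a rough sketch of the screenshot-capture half, assuming Playwright for Python, with made-up routes and ports for the app under review:

    # Capture the same routes before and after a change, for a visual before/after report.
    from playwright.sync_api import sync_playwright

    PAGES = ["/", "/settings", "/billing"]   # hypothetical routes to capture

    def capture(base_url, label):
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page()
            for route in PAGES:
                page.goto(base_url + route)
                name = route.strip("/") or "home"
                page.screenshot(path=f"shots/{label}_{name}.png", full_page=True)
            browser.close()

    capture("http://localhost:3000", "before")   # main branch
    capture("http://localhost:3001", "after")    # feature branch
    # A small report generator can then pair the before/after images into one page for review.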
I'm finding that I really enjoy skipping the tedious stuff, and I'm also producing higher-quality work because I have more bandwidth. It helps me collaborate more with my non-dev peers because it lowers the barrier to sharing.
Important to note that in my experimenting, I haven’t had great luck with winding it up and setting it loose on a task. Too often it felt like being a junior engineer again, doomed to throw spaghetti at the wall. Once I started using AI as an assistant, I felt things really started to click. Software development is about writing code, but it’s about a lot of other things too. It’s nice when the AI can help write code, but it’s fantastic when it helps you accomplish the other things.
Yeah there is real labor involved with trying to set up the right context and explain your goals and ensure it has the right tools, etc. etc. Sometimes the payoff is massive, other times it's like "I could have done this myself faster". It takes time to build an intuition for which is which, and I'm constantly having to tune that heuristic as new models and tools come out.
That said, I still try to figure out the logic myself first, then let AI help polish or improve it. It is a bit slower, but when something breaks, at least I know why.
AI has definitely lowered the barrier. But whether you can actually walk through the door still depends on you.
I think similarly we will find that using AI to take shortcuts around design is mostly harmful, but using it to fulfill interfaces is brilliant. Eventually a set of best practices will evolve.
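A tiny sketch of the "fulfill interfaces" idea in Python terms: the Protocol is the part you design and review by hand; the concrete class is the kind of thing you can let the model draft and then verify with tests (all names here are invented):

    import time
    from typing import Protocol

    class RateLimiter(Protocol):
        def allow(self, key: str) -> bool:
            """Return True if this key may proceed right now."""

    class SlidingWindowLimiter:
        # Candidate implementation: generated against the interface, then reviewed and tested.
        def __init__(self, limit: int, window_s: float):
            self._limit, self._window = limit, window_s
            self._hits: dict[str, list[float]] = {}

        def allow(self, key: str) -> bool:
            now = time.monotonic()
            hits = [t for t in self._hits.get(key, []) if now - t < self._window]
            if len(hits) >= self._limit:
                self._hits[key] = hits
                return False
            hits.append(now)
            self._hits[key] = hits
            return True

    limiter: RateLimiter = SlidingWindowLimiter(limit=3, window_s=1.0)
    print([limiter.allow("ip-1") for _ in range(5)])   # [True, True, True, False, False]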
Regardless of the true number, you're right that no amount of reasoning on paper "why" we should be employed matters if the reality is different; which it clearly is for a lot of people. Reality decides in the end.
A more accurate title might have been "Why AI is a reason to become a software developer" - since the topic I discuss is entirely AI and its effects on the field, and there might be entirely non-AI reasons for not going into software.
And if I were to jump into instruction-level programming today I would start by asking an LLM where to begin...
I... assume that was meant sarcastically, but it's not at all clear from context I think.
I browse the web. Eventually, I review the agent code and more often than not, I rewrite it.
Damn straight we are.
There actually is a ChadGPT but I assume the OP meant ChatGPT
It may be in the end that software developers make less money even as more jobs become available.
Just, don’t skip out on learning the fundamentals. There’s no royal road to knowledge and skill. No shortcuts. No speed running, downloading kung fu, no passing go.
Why?
Because the only thing LLMs do is hallucinate. Often what they generate is what you’re looking for. It’s the right answer!
But if you don't know what an L1 cache is or how to lay out data for SIMD, no amount of yelling at the bot is going to fix the poor performance, the security errors, and the logic errors. If you don't know what to ask, you won't know what you're looking at. And you won't know how to fix it.
So just remember to learn the fundamentals while you’re out there herding the combine space harvesters… or whatever it is kids do these days.
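To give the L1-cache/SIMD line some flavor (not a tutorial, just the general point that data layout matters), a small Python/NumPy comparison of pointer-chasing objects versus a contiguous array:

    # The same sum computed over a list of dicts (array-of-structs, pointer-chasing) and
    # over contiguous NumPy arrays (structure-of-arrays) that the hardware can stream through.
    import time
    import numpy as np

    N = 1_000_000
    points = [{"x": float(i), "y": 2.0} for i in range(N)]
    xs = np.arange(N, dtype=np.float64)
    ys = np.full(N, 2.0)

    t0 = time.perf_counter()
    slow = sum(p["x"] * p["y"] for p in points)
    t1 = time.perf_counter()
    fast = float(np.dot(xs, ys))
    t2 = time.perf_counter()

    print(f"objects: {t1 - t0:.3f}s   contiguous: {t2 - t1:.3f}s   ({slow:.0f} vs {fast:.0f})")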
I heard that before: Borland Delphi, Microsoft FrontPage, Macromedia Flash, and so on. I was told how, in 5 years or so, these new technologies would dominate everything.
Then I learned that two scenarios exist. One of them is "being replaced by a tool", the other is "being orphaned by a tool". You need to be prepared for both.
That said, even if the specific products like Cursor or ChatGPT are not here in 5 years, I am confident we are not going to collectively dismiss the utility of LLMs.
Some stuff does happen, of course, but most prophesied things do not happen.
All the tools I mentioned before were useful for programming. They didn't get worse. That still wasn't enough to keep them relevant over time.
I chose those tools as examples precisely because, for a while, they achieved widespread success. People made awesome things with them. Until they stopped.
What brought their demise was their inherent limitations. Code written by humans in plain text was slower and harder, but didn't have those inherent limits. Code written by humans in plain text got better; those tools didn't.
Lol
Always has been.
Heck, I'm so tired of statements like this - many who? It's already a lot that an LLM can automate or help with the boring/tedious parts of my job; I have yet to see one take over the work of 2, 5, or 10 of my colleagues. Just knowing the awful lot these tireless folks do, I couldn't imagine also doing their jobs. IMO such statements have a very short shelf life.