"Great news, boss! We invented this new tool that allows nontechnical people to write code in English! Now anyone can deploy applications, and we don't have to hire all those expensive developers!"
"Wow, show it to me!"
"OK here it is. We call it COBOL."
glimshe 11 hours ago [-]
You're joking, but it's true. I'm sure you know that. SQL had similar claims... Declarative: say what you need and the computer will do it for you. Also written in English.
ako 11 hours ago [-]
And compared to what we had before SQL, it is much easier to use, and a lot more people are able to use it.
noworriesnate 11 hours ago [-]
But software developers often struggle to use SQL and prefer using ORMs or analytical APIs like polars; the people who excel at SQL are typically not programmers, they're data engineers, DBAs, analysts, etc.
Maybe a similar bifurcation will arise where there are vibe coders who use LLMs to write everything, and there are real engineers who avoid LLMs.
Maybe we’re seeing the beginning of that with the whole bifurcation of programmers into two camps: heavy AI users and AI skeptics.
adalacelove 16 minutes ago [-]
I'm a developer and:
- I hate ORMs, they are the source for a lot of obscure errors behind layers and layers of abstractions.
- I prefer analytical APIs for technical reasons, not just the language.
Reasons:
- I can compose queries, which in turn makes them easier to decompose
- It's easier to spot errors
- I avoid parsing SQL strings
- It's easier to interact with the rest of the code, both functions and objects
If I need to make just a query I gladly write SQL
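To make the composability point concrete, here is a minimal sketch, assuming a recent polars (the library mentioned upthread); the orders dataframe and its column names are invented for illustration.

    import polars as pl

    # Hypothetical orders data; in a raw-SQL workflow this would live in a database.
    orders = pl.DataFrame({
        "customer": ["ann", "bob", "ann", "cho"],
        "region": ["eu", "us", "eu", "us"],
        "amount": [120.0, 75.5, 30.0, 210.0],
    })

    # Predicates are ordinary values: easy to name, reuse, combine, and test in isolation.
    eu_only = pl.col("region") == "eu"
    big_spender = pl.col("amount") > 100

    def top_customers(df: pl.DataFrame, predicate: pl.Expr) -> pl.DataFrame:
        """Compose a filter with a group-by instead of splicing SQL strings."""
        return (
            df.filter(predicate)
            .group_by("customer")
            .agg(pl.col("amount").sum().alias("total"))
            .sort("total", descending=True)
        )

    print(top_customers(orders, eu_only & big_spender))

The equivalent raw-SQL workflow would build one opaque query string, which is harder to decompose, reuse piecemeal, or check from the host language.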
idiotsecant 33 minutes ago [-]
'real' engineers can use SQL just fine. This is a strange position to take.
nathanfig 10 hours ago [-]
Claude made this point while reviewing my blog for me: the mechanization of farms created a whole lot more specialization of roles. The person editing CAD diagrams of next year's combine harvester may not be a farmer strictly speaking, but farming is still where their livelihood comes from.
dredmorbius 10 hours ago [-]
Strictly speaking, farming is where all our livelihoods come from, in the greatest part. We're all living off the surplus value of food production.
(Also of other food, energy, and materials sourcing: fishing, forestry, mining, etc.)
This was the insight of the French economist François Quesnay in his Tableau économique, foundation of the Physiocratic school of economics.
lipowitz 10 hours ago [-]
Replacing jobs that could only be performed by those living near the particular fields with jobs that can be done anywhere means the work goes to whoever is willing to take the least satisfactory compensation for the most skill and work.
Working the summer fields was one of the least desirable jobs but still gave local students with no particular skills a good supplemental income appropriate for whichever region.
ameliaquining 11 hours ago [-]
Is that really because of the English-esque syntax, rather than because it was a step forward in semantic expressivity? If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?
AdieuToLogic 3 hours ago [-]
Before SQL became an industry standard, many programs which required a persistent store used things like ISAM[0], VISAM (a variant of ISAM[0]), or proprietary B-Tree libraries.
None of these had "semantic expressivity" as their strength.
> If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?
Yes.
> Is that really because of the English-esque syntax
Well, what we had before SQL[1] was QUEL, which is effectively the same as Alpha[2], except in "English". Given the previous assertion about what came before SQL, clearly not. I expect SQL garnered favour because it is tablational instead of relational, which is the quality that makes it easier to understand for those not heavy in the math.
0 - https://en.wikipedia.org/wiki/ISAM
[1] Originally known as SEQUEL, a fun word play on it claiming to be the QUEL successor.
[2] The godfather language created by Codd himself.
kayodelycaon 5 hours ago [-]
SQL and many DSLs (JIRA…) are actually used by plenty of non-technical users. Anyone who wants to build their own reports and do basic data analysis has sufficient incentive to learn it.
They are very much the exception that proves the rule though.
veqq 10 hours ago [-]
Er, have you heard of datalog or Prolog? Declarative programming really does work. SQL was just... Botched.
glimshe 7 hours ago [-]
Yes. And I think SQL is actually pretty good for what it does. My point, like the parent's (I suppose), is that we've heard this "XYZ, which uses natural language, will kill software development" before.
dredmorbius 10 hours ago [-]
I'd long ago (1990s-era) heard that the original intent was that office secretaries would write their own SQL queries.
(I'd love for someone to substantiate or debunk this for me.)
bazoom42 9 hours ago [-]
Early on, programming was considered secretarial work.
AdieuToLogic 3 hours ago [-]
> Early on, programming was considered secretarial work.
Incorrect.
Encoding a program was considered secretarial work, not the act of programming itself. Over time, "encoding" was shortened to "coding."
This is why the industry term "coder" is a pejorative descriptor.
bitpush 11 hours ago [-]
Bravo. This is the exact sentiment I have, but you expressed in a way that I could never have.
Most people miss the fact that technical improvements increase the pie in a way that was not possible before.
When digital cameras became popular, everybody became a photographer. That only made the world better, and we got soo many more good photographers. Same with YouTube & creativity.
And the same with coding & LLMs. The world will have lots more apps, and programmers.
munificent 10 hours ago [-]
> That only made the world better, and we got soo many more good photographers.
I disagree with the "only" part here. Imagine a distribution curve of photos with shitty photos on the left and masterpieces on the right and the height at the curve is how many photos there are to be seen at that quality.
The digital camera transition massively increased the height of the curve at all points. And thanks to things like better autofocus, better low light performance, and a radically faster iteration loop, it probably shifted the low and middle ends to the right.
It almost certainly increased the number of breathtaking, life-changing photos out there. Digital cameras are game-changers for photographic journalists traveling in difficult locations.
However... the curve is so high now, the sheer volume of tolerably good photos so overwhelming, that I suspect the average person actually sees fewer great photos than they did twenty years ago. We all spend hours scrolling past nice-but-forgettable sunset shots on Instagram and miss out on the amazing stuff.
We are drowning in a sea of "pretty good". It is possible for there to be too much media. Ultimately, we all have a finite amount of attention to spend before we die.
DavidPiper 5 hours ago [-]
Thank you for describing this so eloquently.
Meaning no disrespect to photographers, I'm starting to think that a probable outcome of all the AI investment is a sharp uptick in shovelware.
If we can get AIs to build "pretty good" things - or even just "pretty average" things - cheaply, then our app stores, news feeds, ad feeds, company directives, etc, will be continuously swamped with it.
shinedog 1 hours ago [-]
You hit this so hard it was impossible not to recognize. In every media realm there is so much "ok" shit that we cannot help but miss the amazing stuff. Knowing that I don't have enough time for all the incredible things that technology has enabled crushes me.
test6554 3 hours ago [-]
Experts warn that at current production levels, the supply of dick pics may actually outpace demand in a couple decades.
DavidPiper 2 hours ago [-]
I was under the impression that supply already vastly outstrips demand.
kjkjadksj 2 hours ago [-]
It affects even the competent photographer. How many times do you see that photographer with all the gear sit in front of a literal statue and fire off a 30 shot burst in 2 seconds? I don’t envy these pro photo editors either today in sports. I wonder how many shots they have to go through per touchdown from all the photographers at the end zone firing a burst until everyone stands up and throws the ball back at the ref? After a certain point you probably have to just close your eyes and pick one of the shots that looks almost identical to another 400. Not a job for analysis paralysis people. I guess it sure beats having to wait for the slide film to develop.
dijksterhuis 10 hours ago [-]
> That only made the world better
Did it?
people now stand around on dance floors taking photos and videos of themselves instead of getting on dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.
people taking their phone out and videoing / photographing something awful happening, instead of doing something helpful.
people travel to remote areas where the population has been separated from humanity and do stupid things like leave a can of coke there, for view count.
it’s not made things better, it just made things different. whether that’s better or worse depends on your individual perspective for a given example.
so, i disagree. it hasn’t only made things better. it made some things easier. some things better. some things worse. some things harder.
someone always loses, something is always lost. would be good if more people in tech remembered that progress comes at a cost.
thangalin 9 hours ago [-]
> people now stand around on dance floors taking photos and videos of themselves instead of getting on dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.
There are other types of dances where dancers are far more interested in the dance than selfies: Lindy Hop, Blues, Balboa, Tango, Waltz, Jive, Zouk, Contra, and West Coast Swing to name a few. Here are videos from the Blues dance I help organize where none of the dancers are filming themselves:
* https://www.facebook.com/61558260095218/videos/7409340551418...
* https://www.facebook.com/reel/3659488930863692
Thank you for sharing your social media videos as evidence in a rebuttal to "camera phones are not all good; their ubiquitous use has negative implications too". So delicious...
VonTum 4 hours ago [-]
The irony!
Though, I'll grant that there's not really a way to argue this without showing videos
kjkjadksj 2 hours ago [-]
That sort of dancing is basically a sport. You have to learn it, you have to get good at it after you learned it, and it is cardio after all. I think op was talking more about what you see in the edm scene these days. Where basically people aren’t there to dance like the old days or sing along like other genres, they are there to see a certain DJ and then they will post clips from the entire set on their instagram story. And they can do this because the dancing they are doing at the edm show is super passive kind of dancing where you are just swaying a little so you can hold the phone stably at the same time. If you were dancing like how they’d dance at the edm concerts in the 90s all rolling on molly it would be like your blues swing where its just too physical to do anything but rave around flinging your arms all around shirtless and sweaty.
skeeter2020 58 minutes ago [-]
Live music sucks when you're trying to watch the show and some dumb-dumb is holding their phone above their head to shoot the entire show with low-light, bad angle & terrible sound. NO ONE is going to watch that, and you wrecked the experience for many people. Put your phone away and live in the present, please...
flashgordon 9 hours ago [-]
I would add one thing though. The pie definitely gets bigger - but I feel there is a period of "downsizing" that happens. I think this is because of a lack of ideas. When you have a tool that (say) 10xes your productivity, it's not that bosses will have ideas to build 10x the number of things - they will just look to cut costs first (hello lack of imagination and high interest rates).
sarchertech 4 hours ago [-]
We’ve had many improvements that increased productivity at least as much as current LLMs, and I don’t think any of them ever temporarily caused downsizing in the total number of programmers.
pipes 9 hours ago [-]
I thought photographers don't get paid well anymore due to market saturation and the few skills required to get a good photo?
kjkjadksj 2 hours ago [-]
It is still as hard as it's been to get a good photo. They had full auto film cameras that could take good photos in the 70s but the devil is always the edge cases and the subconscious ability to take an evenly exposed (in the Ansel Adams definition not auto camera exposure definition), well composed image at the decisive moment. Understanding how lighting works (either natural, or different artificial light like flash or studio lighting) is also not easy.
It is pretty hard to break out but people still make names for themselves either from experience on assignments like the old days but also from instagram and other social media followings. People still need weddings shot and professional portraits taken which takes some skill in understanding the logistics of how to actually do that job well efficiently and managing your equipment.
bluefirebrand 6 hours ago [-]
> World will have lots more of apps, and programmers.
This is actually bad for existing programmers though?
Do you not see how this devalues your skills?
platevoltage 6 hours ago [-]
I see your point, but I'm personally having a different experience.
A client of mine has gotten quite good at using Bolt and Lovable. He has since put me on 3 more projects that he dreamed up and vibe coded that would just be a figment of his imagination pre-AI.
He knows what's involved in software development, and knows that he can't take it all the way with these tools.
sarchertech 4 hours ago [-]
There are far more programmers now than in 1980, yet the average programmer makes far more (inflation adjusted) now.
kjkjadksj 2 hours ago [-]
Thank the Bangalore office for that.
bitpush 6 hours ago [-]
In the current state, yes. But that is also an opportunity, isn't it?
When online flight bookings came about, travel agents were displaced. The solution isn't "let's stop online flight booking sites and protect travel agents" because that's an inefficient system.
dijksterhuis 4 hours ago [-]
Why does every system need to be efficient?
hackernoops 3 hours ago [-]
Fractional reserve lending, rehypothecation, etc.
komali2 2 hours ago [-]
Under capitalism, because greater margins. Under not-capitalism, so as to free up resources and labor for other things or just increase available downtime for people.
lupire 3 hours ago [-]
Sorry to be that guy, but would you prefer it if your computer and phone each cost $5000?
20after4 11 hours ago [-]
And now the business of wedding / portrait photographer has become hyper-competitive. Now everyone's cousin is an amateur photographer and every phone has an almost acceptable camera built in. It is much more difficult to have a profitable photography business compared to 20 years ago.
bachmeier 11 hours ago [-]
That's good to hear. Back when I got married there were some real jerks in the wedding photography business, and they weren't worried about running out of customers. Here's an actual conversation I had with one of them:
Me: "I'm getting married on [date] and I'm looking for a photographer."
Them, in the voice of Nick Burns: "We're already filling up for next year. Good luck finding a photographer this year."
Me: "I just got engaged. You never have anything open up?"
Them: "No" and hang up the phone.
The faster guys like that struggle to make a living, the better.
LargeWu 8 hours ago [-]
In the same breath, those photographers will complain about all the "amateurs" devaluing their services.
NewsaHackO 10 hours ago [-]
Definitely. What matters more is that the ability to take photos is available to more people, which is a net positive.
insane_dreamer 2 hours ago [-]
> everybody became a photographer. That only made the world better, and we got soo many more good photographers.
Not sure I agree. I haven't seen much evidence of "better photography" now that it's digital instead of film. There are a million more photos taken, yes, because the cost is zero. But quantity != quality or "better", and if you're an average person, 90% of those photos are in some cloud storage and rarely looked at again.
You could argue that drones have made photography better because it's enabled shots that were impossible or extremely difficult before (like certain wildlife/nature shots).
One thing digital photography did do is decimate the photographer profession because there is so much abundance of "good enough" photos - why pay someone to take good ones? (This may be a lesson for software development too.)
platevoltage 6 hours ago [-]
Fast forward a couple decades and "Ok here it is. We call it Dreamweaver"
today:
s/COBOL/SQL
and the statement is still true, except that many devs nowadays are JS-only, and are too scared or lazy as shit to learn another, relatively simple language like SQL. ("it's too much work". wtf do you think a job is. it's another name for work.)
because, you know, "we have to ship yesterday" (which funnily enough, is always true, like "tomorrow never comes").
8note 44 minutes ago [-]
SQL is straightforward enough, but it's not the sketchy part. taking down the database so other people can't use it by running a test query is the bad part.
the explains are not nearly as straightforward to read, and the process of writing SQL is to write the explain yourself, and then try to coax the database into turning the SQL you write into that explain. it's a much less pleasant LLM chat experience.
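For anyone who has not poked at a query plan before, here is a minimal sketch using Python's built-in sqlite3 (the table, index, and query are made up); real databases' EXPLAIN output is considerably more involved than SQLite's, which is part of the parent's complaint.

    import sqlite3

    # Throwaway in-memory database with a hypothetical table and index.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

    query = "SELECT customer, SUM(amount) FROM orders WHERE customer = ? GROUP BY customer"

    # Ask the engine how it intends to execute the query instead of guessing;
    # each returned row describes a step, e.g. a SEARCH ... USING INDEX entry
    # rather than a full table SCAN.
    for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("ann",)):
        print(row)

    conn.close()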
nathanfig 14 hours ago [-]
Hi all - I write a lot for myself but typically don't share, hence the stream-of-consciousness style.
But I thought this might be worth blogifying just for the sake of adding some counter-narrative to the doomerism I see a lot regarding the value of software developers. Feel free to tear it apart :)
layer8 10 hours ago [-]
The humor was refreshing. :)
randfish 13 hours ago [-]
Thought it was great. Thanks for writing and submitting!
nathanfig 12 hours ago [-]
Thanks!
michaelteter 4 hours ago [-]
Having experienced several overhyped corporate knee-jerk (and further press-amplified) silver bullets, I expect this will play out about as well as the previous ones.
And by that, I mean corps will make poor decisions that will be negative for thought workers while never really threatening executive compensation.
I see this latest one somewhat like TFA author: this is a HUGE opportunity for intelligent, motivated builders. If our jobs are at risk now or have already been lost, then we might as well take this time to make some of the things we have thought about making before but were too busy to do (or too fatigued).
In the process, we may not only develop nice incomes that are independent of PHB decisions, but some will even build things that these same companies will later want to buy for $$$.
NoPicklez 2 hours ago [-]
My take just purely based on the title, I'm in the security space not a developer but I did study it during my degree.
I would say that when the fundamentals are easier to learn it becomes a great time to learn anything. I remember spending so much of my degree during software development trying to fix bugs and have things explained by trawling through online forums like many of us have. Looking for different ways of having concepts explained to me and how to apply them.
LLMs give us a fairly powerful tool that acts as a sort of tutor: asking questions, getting feedback on code blocks, understanding concepts, seeing where my code went wrong, etc. Asking it all of the dumb questions we go trawling for.
But I can't speak to how this translates when you're a more intermediate developer.
rossdavidh 5 hours ago [-]
All of this is good reason that orgs _shouldn't_ be laying off developers, but none of it is a reason that they won't/aren't. In any case, I see more "if they're remote why can't they be on the low-wage side of the planet" at the moment, than I do "use AI instead of a developer", although they are no doubt related.
The more awkward truth is that most of what developers have been paid to do in the 21st century was, from the larger perspective, wasted. We mostly spent a lot of developer time in harvesting attention, not in actually making anything truly useful.
MichaelZuo 5 hours ago [-]
How does that follow…?
Most organizations do derive net benefit from laying off the below average and hiring the above average for a given compensation range, as long as the turnover is not too high.
And this delta increases when the above average can augment themselves more effectively, so it seems we should expect an even more intense sorting.
dehrmann 11 hours ago [-]
The farming quote is interesting, but one of the Jevons paradox requirements is a highly elastic demand curve, and food is inelastic.
The open questions right now are how much demand there is for more software, and where AI capabilities plateau.
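For readers who don't have the term handy, a rough textbook formalization (a standard definition, not something from the thread): the price elasticity of demand is

    \varepsilon = \frac{\Delta Q / Q}{\Delta P / P}

An efficiency gain acts like a drop in the effective price of the service; roughly speaking, when |\varepsilon| > 1 (elastic demand) the induced extra consumption can outweigh the efficiency saving, which is the Jevons-paradox case, while estimates for food overall put |\varepsilon| well below 1, which is the parent's point about inelastic demand.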
9rx 10 hours ago [-]
Either way, as quite visibly seen from all the late-1800s mansions still lining the country roads, the era of farmers being "overpaid", as the link puts it, came about 50-75 years after the combine was invented. If the metaphor is to hold, we can assume that developers are currently poor as compared to what the LLM future holds for them.
But, there is a key distinction that we would be remiss to not take note of: By definition, farmers are the owners of the business. Most software developers aren't owners, just lowly employees. If history is to repeat, it is likely that, as usual, the owners are those who will prosper from the advancement.
slt2021 10 hours ago [-]
demand for food is very elastic. if beef becomes more expensive, cheaper protein options get more demand (chicken, pork, tofu, beans).
fruits and all non-essential food items are famously very elastic, and constitute a large share of the spending.
for example: if cheap cereal becomes abundant, it is only at the cost of poor quality, so demand for high quality cereal will increase.
the LLM driven software engineering will continuously increase the bar for quality and demand for high quality software
giraffe_lady 11 hours ago [-]
Reported numbers vary but household food waste seems to be fairly high in developed economies, so food demand might be more elastic than intuition would expect.
dredmorbius 10 hours ago [-]
I've seen consistent values for food waste reported for at least the past 40 years, if not the past 80, in various sources. I suspect it's something of a constant. One observation I've seen is that food wastage now occurs far later in the processing cycle, which is to say, after far more resources (transport, processing, refrigeration, cooking) have been invested in it.
In the long term, food demand is elastic in that populations tend to grow.
kwk1 10 hours ago [-]
Perhaps we should say something like "food demand has an elasticity floor."
giraffe_lady 8 hours ago [-]
For sure.
abalashov 11 hours ago [-]
I'm not sure if I agree with every aspect of the framing here; specifically, I don't think the efficiency gains are anywhere on par with a combine harvester.
However, I do agree that the premium shifts from mere "coding" ability -- we already had a big look into this with the offshoring wave two decades ago -- to domain expertise, comprehension of the business logic, ability to translate fluidly between different kinds of technical and nontechnical stakeholders, and original problem-solving ability.
nathanfig 11 hours ago [-]
Yeah I think the combine-harvester analogy is tempting because it's so easy to visualize how wheat can scale over a big square field and project that visual onto lines of code generated on a big square screen... forgetting that lines-of-code-generated is not inherently useful.
temporallobe 9 hours ago [-]
Essentially it’s the same as it always was. Back in the day, low-code or no-code solutions implemented by non-technical people always resulted in engineers having to come in behind them to clean up their mess. I’ve had quite the lucrative career doing just that.
nathanfig 9 hours ago [-]
Yeah, with current-state AI I foresee more such opportunities.
Ekaros 8 hours ago [-]
I think I will be good for a while in security. That is, pointing out all the mistakes and faults... And telling why something the AI came up with might not fully solve the problem.
So much room left, as I doubt every developer will double-check things every time by asking.
prisenco 10 hours ago [-]
Upwork is already filling up with people who have vibe-coded their way into a pit and need experienced developers to pull them out.
billy99k 9 hours ago [-]
You can find good contracts on Upwork, but you need to go through lots of bad ones. I find around 5 good contracts there per year. I find that even when a client agrees on a rate, Upwork has the reputation of finding inexpensive workers, and you will get many clients trying to pay you less.
I'm also a bit tired of running into people that are 'starting a contracting firm' and have 0 clients or direction yet and just want to waste your time.
nathanfig 9 hours ago [-]
Really! That could make for some really interesting stories. Fascinating to think of LLMs as a customer acquisition pipeline for developers.
platevoltage 6 hours ago [-]
I've snagged at least one of them.
agentultra 59 minutes ago [-]
If you’re going to use LLMs to learn software development, great! Welcome!
Just, don’t skip out on learning the fundamentals. There’s no royal road to knowledge and skill. No shortcuts. No speed running, downloading kung fu, no passing go.
Why?
Because the only thing LLMs do is hallucinate. Often what they generate is what you’re looking for. It’s the right answer!
But if you don’t know what an L1 cache is or how to lay out data for SIMD, no amount of yelling at the bot is going to fix the poor performance, the security errors, and the logic errors. If you don’t know what to ask, you won’t know what you’re looking at. And you won’t know how to fix it.
So just remember to learn the fundamentals while you’re out there herding the combine space harvesters… or whatever it is kids do these days.
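To make the data-layout point concrete, here is a toy Python/NumPy sketch (the record fields are invented); the list of dicts plays the array-of-structs role, while NumPy's contiguous arrays stand in for the struct-of-arrays layout that SIMD-friendly, cache-friendly code wants.

    import numpy as np

    n = 100_000

    # Array-of-structs: each record is a separate Python object, so values of the
    # same field are scattered across memory and processed one at a time.
    particles = [{"x": float(i), "y": float(2 * i)} for i in range(n)]
    total_aos = sum(p["x"] * p["y"] for p in particles)

    # Struct-of-arrays: each field is one contiguous buffer, the layout that
    # vectorized kernels (NumPy's compiled loops, hand-written SIMD) stream through.
    xs = np.arange(n, dtype=np.float64)
    ys = 2.0 * xs
    total_soa = float(xs @ ys)

    assert np.isclose(total_aos, total_soa)  # same answer, very different memory behavior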
karczex 12 hours ago [-]
It's like "we invented Fortran so there will be no need for so many developers"
nathanfig 12 hours ago [-]
An interesting parallel because there were undoubtedly some people who worried we would lose something important in the craft of instruction-level programming, and almost certainly we have in relative terms. But in absolute numbers I am confident we have more low-level programmers than we did before Fortran.
And if I were to jump into instruction-level programming today I would start by asking an LLM where to begin...
marcosdumay 10 hours ago [-]
Fortran was a much larger jump in productivity than agentic coding...
yodsanklai 11 hours ago [-]
> What do you do while awaiting the agents writing your code?
I browse the web. Eventually, I review the agent code and more often than not, I rewrite it.
I also remember this! Maybe a subconscious influence
rr808 3 hours ago [-]
The management at my corporate job literally say in our townhalls that they expect AI to increase productivity and reduce costs. Makes logical sense to me, the glory days of high wages are over.
vincenthwt 2 hours ago [-]
Are you talking about the high wages of software engineers or management? Makes sense to me— the glory days of high management and CEO salaries are over.
SeanDav 12 hours ago [-]
>> "ChadGPT"
There actually is a ChadGPT but I assume the OP meant ChatGPT
nathanfig 12 hours ago [-]
Oh I should have known - yeah I was just being facetious
freekh 12 hours ago [-]
Nice article! Reflects my views as well!
alganet 9 hours ago [-]
> and now with far greater reach and speed than ever before
I heard that before. Borland Delphi, Microsoft FrontPage, Macromedia Flash and so on. I learned how, in 5 years or so, these new technologies would dominate everything.
Then I learned that two scenarios exist. One of them is "being replaced by a tool", the other is "being orphaned by a tool". You need to be prepared for both.
nathanfig 9 hours ago [-]
Yes, if you built your career on FrontPage you have probably had a bad time. Many such cases.
That said, even if the specific products like Cursor or ChatGPT are not here in 5 years, I am confident we are not going to collectively dismiss the utility of LLMs.
alganet 8 hours ago [-]
I can see it being useful for summarization, or creative writing. What makes you so sure that LLMs will be useful _for programming_ in the long run?
mirkodrummer 6 hours ago [-]
> LLMs really are like combine harvesters; allowing one to do the work of many.
Heck, I'm so tired of statements like this. Many who? It's already a lot that an LLM automates/helps with the boring/tedious parts of my job; I have yet to see it take over for 2, 5 or 10 of my colleagues. Just knowing what an awful lot these tireless dudes do, I couldn't ever imagine also doing their jobs. IMO such statements have a very short shelf life.
fuzztester 4 hours ago [-]
>What do you do while awaiting the agents writing your code?
>ChadGPT (sic) suggests exercise, and so I drop and do twenty. Planks are good too but tempt me to use Gemini Flash where a thinking model would be better. And it's hard to explain at the company summit why I'm suddenly sporting washboard abs and biceps the size of hams. "Oh, uhh, definitely NOT AI," I lie, blushing profusely, "The only weights I use are at the gym! Haha."
oh God, not yet another chi-chi / hipster post.
oh God, yes it is exactly that.
bailing now ...