I saw someone use the term "orchestration", which seems to be the word for building software using LLM tools.
It made me think of the conductor, seemingly the least skilled job in the orchestra. All you do is wave the baton; no need to ever play an instrument. If LLMs are doing the hard part (writing code), then we can be the conductor waving the baton.
But of course the visuals are misleading. Being a conductor doesn't take the least skill; it takes the most. A conductor hears every instrument individually, knows the piece intimately, and through conducting brings a unique expression to a familiar work.
LLMs have automated the musician part. They'll play whatever you want. No doubt a powerful tool in the hands of a skilled conductor. And an incredible tool for someone who can't play to generate music for themselves.
There's no shortage of "I built it and they won't come" posts here on HN, predating LLMs by decades. Because code has never been the hard part of "software as a business". LLMs have driven this point home. Code has never been cheaper. Business has never been harder.
This "orchestration" software is about people trying to increase productivity by running many instances of a coding agent on the same project, without stepping on each other too much. It doesn't seem to be fully baked yet. A "shared nothing" architecture where you work have each instance work on a distinct project seems simpler if you want to spin more plates.
> LLMs have made the musician part automated. They'll play whatever you want
I like your metaphor, even as someone who can be a bit skeptical of the overly broad promises of LLMs/AI. But I do think this statement is too generous. It implies way too much actual musical ability. It also implies that everything I can imagine musically is possible, which it just isn't; there are limitations, just like with real musicians.
If we want to really make the metaphor work, it's an orchestra full of very informed people who have read a lot about music, have an idea of what their instrument should sound like, and can even make whatever they're holding sound like the appropriate instrument most of the time, sort of. With our direction, our "conducting," their success rate goes up.
But ultimately: they aren’t real musicians, they aren’t holding the right instruments, and they haven’t actually been taught how to read music. They are just often good at sort of making it work in a way that approximates what we want.
Can't read this; every paragraph ends with "it's not x, it's y". Just give me the prompt so I can read the real insights you have and not the generated fluff.
I randomly skipped to five different paragraphs and each one ended with a "!x but y" logical statement, just formatted differently most of the time. Crazy how you can't unsee it.
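The tic is mechanical enough that you could flag it with a few lines of code. A rough sketch (the regex is a loose approximation for illustration, not a reliable detector):

    import re

    # Loosely matches "not X. It's Y" / "isn't X, it's Y" style
    # contrast framing near the end of a paragraph.
    CONTRAST = re.compile(
        r"\b(?:not|isn't|aren't|wasn't)\b[^.!?]{0,80}[.,;]?\s*"
        r"(?:it's|it is|they're)\b",
        re.IGNORECASE,
    )

    def flag_paragraphs(text):
        for para in text.split("\n\n"):
            if CONTRAST.search(para[-200:]):  # only look near the end
                yield para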
A sibling [dead] comment to mine is a rebuttal to "just post the prompt", itself expanded into several paragraphs that each say nearly nothing, including this gem:
> "That’s not a critique of the writing. It’s a diagnosis"
I miss when people just typed their thoughts concisely and hit send without passing it to an inflater. I'd maybe have a chance of understanding the sibling comment's point.
The "barrier to entry for building software" has not collapsed, as it was never about "where engineering shifts from writing code to shaping systems". It has always been about understanding the problem to solve and doing so in a provably correct manner.
Another way to reify this is:
When making software, remember that it is a snapshot of your understanding of the problem. It states to all, including your future self, your approach, clarity, and appropriateness of the solution for the problem at hand. Choose your statements wisely.
With LLM usage there's another necessary step: you need to distill that understanding into text. I see people put significant time into LLM workflows. But LLM coding quality will be solved by the AI companies in due time. What's less likely to be solved, on comparable timeframes, is the creation of the input text artefact, containing your world model, from which good programs emerge when future LLMs ingest it. That is what takes up my time: building the textual wellspring for my project. The code is relatively ephemeral.
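As a simplified sketch of that workflow, assuming the OpenAI Python SDK; "SPEC.md" is an invented name for the durable world-model document, and the generated module is treated as disposable output:

    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # SPEC.md is the durable artefact: the problem description,
    # constraints, and domain vocabulary distilled into text.
    spec = Path("SPEC.md").read_text()

    resp = client.chat.completions.create(
        model="gpt-4o",  # any capable model
        messages=[
            {"role": "system",
             "content": "Generate a complete Python module implementing this spec."},
            {"role": "user", "content": spec},
        ],
    )

    # The code is derived output; edits go into SPEC.md,
    # and the module is regenerated from it.
    Path("generated.py").write_text(resp.choices[0].message.content)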
In other words: yes, we have CNC machines and electric saws and whatnot, reliable to a certain degree (you can still injure yourself badly), but that doesn't remove the need for a carpenter, because a carpenter also knows how to make a hammer from scratch even if he never makes one in his entire life.
When you pay for anything, you are basically exchanging money so that someone else takes care of a problem you have. Obviously, if you are paying, you expect the result to be of good quality. Software is no different; AI won't change that fact, and engineering is about creating robust solutions at the lowest price. Just my 2 cents.
Solving problems should be an obsession rather than building. AI has fueled way too many builders, while edge cases and lifecycle maintenance of the code are more of an afterthought.
> People are increasingly building tools to solve a single, specific problem exactly once—and then discarding them. It is software as a disposable utility, designed for the immediate "now" rather than the distant "later."
Yes! This is 100% it.
This is a net good for everyone because it brings basic programming literacy to the masses and culls a lot of the junk projects and SaaS scams littering GitHub.
It means people can focus on the problems that actually matter.
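To make "disposable utility" concrete, it's the kind of one-off script below (file and column names invented), written in minutes, run once, and thrown away:

    import csv
    from pathlib import Path

    # One-shot job: pull the email column out of a vendor export,
    # deduplicate, write a plain list. Run once, then delete.
    with Path("vendor_export.csv").open(newline="") as f:
        emails = sorted({row["email"].strip().lower()
                         for row in csv.DictReader(f)})
    Path("emails.txt").write_text("\n".join(emails))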
AI doesn't have any impact on the need for accountable humans to write code.
The scratchpad analogy is so good. Most mature business software is almost literally like a tome of legal documents that have to be edited carefully, but that doesn't have anything to do with the napkin in your pocket.
In a way it's good, but as far as energy usage goes, it sucks.
Not only does it take way more energy to write software with LLMs than by "hand", now everyone is repeating work many times over to write the same tools.
From a freedom standpoint, one could argue it gives users the most freedom to have what they want and need. But it's very bad from an energy efficiency point of view.
For the love of all that is holy, I cannot read another 5-page AI post that could've been like 200 words. Just make it a paragraph or two and write using your brain, people. Does everything have to be run through an AI? I'm sure there are some decent ideas in here, but I'm not wasting my time reading this slop.
A title per paragraph (slight exaggeration), half of them of the form "The X, The Y, The Z". Every section ends with the "it's not x; it's y" contrast framing.
But really, the only issue is that it's monotone LinkedIn-style insight fluff and you can't tell where the prompt ends and the LLM crap begins. I expect something interesting was put into the LLM, but the LLM has destroyed the author's ability to communicate it to me effectively.
Google, Apple, Meta, X, Bluesky, Shopify, Stripe and all the big software companies must be really shaking in their boots for disruption against the army of vibe coders. /s
Why would any big software company need to care? There are so many small businesses with unique problems with no current off-the-shelf software solutions because they've always been too niche to justify the time and expense of bespoke development. Now that door is open. Big software companies can keep servicing big businesses and mass markets, while opportunities abound for anyone else willing to innovate on smaller problems. Not everything needs to be built to scale.
What a random set of companies to choose. You'd probably need to think critically about each one of those when assessing the accuracy of your statements.
> Code has never been cheaper. Business has never been harder.

So what you wrote does not bode well for the profession.
> I miss when people just typed their thoughts concisely and hit send without passing it to an inflater

This isn't mind control, just language evolution quietly nudged by AI. ;)
> must be really shaking in their boots for disruption against the army of vibe coders

(They are actually laughing at all of them.)
All of the named companies have network effects, distribution, and trust.
Not quite so easy to copy. Disposable LLM-generated code without users is cheap, which is the point of the article.