Most SaaS these days is really about gatekeeping some resource (typically data, or a marketplace), so while you can replicate the gate, it's really unlikely you can replicate the resource on your own. To pick a random example, I could vibe-code a social network, but I can't make people switch to it.
Every time I read a statement like "You don’t install software anymore; you ask for it", it's never some typical non-technical office worker, someone actually using the technology this way, making the claim. When I check who is making it, it's always some foaming-at-the-mouth tech founder shilling their AI/no-code platform who is about to run out of money.
Well, their point has much more substance than your "this is ad hominem" slur, since the blog author didn't disclose that his opinion piece is directly tied to the business model of his startup (he wants to sell you the solution, so he is clearly biased). Some people may therefore feel deceived.
What I'm saying about the "ideas" is that they are entirely based on a delusion which is stated as a fact, but only because the author has a strong incentive to believe the delusion.
I have had the bright idea of replacing a couple of the SaaS products I rely on with the assistance of AI. I will note that I do consider LLMs a genuine productivity boost. Observations:
1) It took a lot longer than I had hoped and I doubt the hours spent are worth the cost savings. Turns out a lot of SaaS I use are actually quite detailed products with lots of features and functionality that take a lot of time and consideration to replicate effectively.
2) Once you finish building that SaaS replacement, you are signing up to operate, maintain and secure it as long as you are running it. More time investment.
My realization is that, yes, AI can help me replace a couple other products I use and some of it is an interesting exercise. But signing up to build and manage a dozen services for my own use that I was previously paying a nominal fee for (in addition to all my other professional commitments) is probably not the utopia I might have thought it was.
The chatbot gurus 10 years ago were parroting the same thing: nobody needs an app, everyone'll use a chatbot. And now it's "nobody needs a SaaS, everyone'll use an LLM." Sound familiar? You got one part of the whole thing wrong: "The large language model has effectively read the entire internet’s worth of React docs and StackOverflow threads, so it produces reliable code in that stack with minimal fuss."
I have mixed thoughts on this post. I’m a pretty senior product designer. V0, Gemini and these other tools can build pretty impressive prototypes. I am repeatedly asked how I am using these AI tools in my workflow. The answer is: I’m not really.
These tools are a novelty now. But I do think these will improve drastically in the next 5 or so years. When they have more context about the existing code base and design system and customer issues I think we’ll see a big leap forward.
Does that mean SaaS is dead?
Maybe small CRUD apps with few users? I think a lot of people on this forum miss that with enterprise software, businesses are buying processes. The software is just the codified way to execute on that.
AI is already making these processes different. But I think we’re probably going to see more SaaS to replace old and support new processes.
And is ChatGPT a SaaS? If anything SaaS is more alive than ever... I see people forking over hundreds of dollars per month for API tokens to LLM inference endpoints, without giving it a second thought...
ChatGPT is not a SaaS in the relevant sense, because, if you had a magic genie that could do the work of a hundred 99th-percentile software engineers for free, that would not suffice to let you clone it. You would also need to acquire tens of thousands of GPUs and put them to work training the foundation model, which reportedly would cost a billion dollars or more. LLMs are very unlikely to move the needle on this requirement even if they keep becoming more and more capable.
One could sort of analogize it in this respect to an IaaS like AWS rather than a SaaS; you're paying them a big monthly premium primarily so that you can avoid substantial physical capex, not to avoid the costs of software development.
(I agree that the post's framing is deeply silly in a number of respects; it's just that this specific objection doesn't go through.)
I don't think this is likely to be a real issue in the grand scheme of things.
SaaS customers pay for a product that provides a workflow and a structure for solving their problem without them having to think too hard about a solution. They are outsourcing the problem-solving work, not just the programming effort.
If you must worry, you should be thinking about the new wave of competitors and copycats using LLMs to replace their programming effort. They are the ones coming to eat your lunch if you don't have a defensible market position.
My experience at the minute is that AI is good at writing functionally correct code, but bad at writing clean code. That's why it quickly gets out of hand if you rely solely on the vibes and don't manually refine the output: keep building without refining and it quickly becomes non-functional spaghetti. I've found it's quite good at "colouring in": you sketch out (or partially sketch out) what you want using descriptive function names, and the AI fills in the blanks, as in the sketch below. Once a boilerplate pattern is established and you need to repeat it multiple times, that's where it really shines and saves a lot of time.
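To make the "colouring in" point concrete, here's a minimal sketch of the kind of skeleton I mean: descriptive names and docstrings only, bodies left blank for the model to fill. The expense-report domain and every name in it are made up purely for illustration:

    # Skeleton handed to the assistant: names and docstrings carry the spec,
    # the bodies are the blanks to be "coloured in".
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Expense:
        incurred_on: date
        category: str
        amount_cents: int

    def load_expenses_from_csv(path: str) -> list[Expense]:
        """Parse a CSV with columns date, category, amount into Expense records."""
        raise NotImplementedError  # assistant fills this in

    def total_by_category(expenses: list[Expense]) -> dict[str, int]:
        """Sum amounts per category, in cents."""
        raise NotImplementedError  # assistant fills this in

    def format_monthly_report(expenses: list[Expense]) -> str:
        """Render a plain-text report grouped by category, largest total first."""
        raise NotImplementedError  # assistant fills this in

The names and docstrings do most of the specifying; the bodies are exactly the repetitive part the model reproduces well.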
SaaS, or software more broadly, is domain knowledge + quality code.
Does AI generate high-quality code now? Yes, but not always. Will it close the gap in the long run? Maybe, but it's unlikely. Chatbots are software themselves. And mass production of a commodity requires a standard assembly line. The majority of software products cannot be mass produced.
> Want to import recipes from a website or pull your bank transactions into your new dashboard? Just ask. In this future of open pipes…
If we truly had open pipes devoid of ads, tracking, subscriptions, custom data formats and other obstacles to third parties, we wouldn’t have needed AI or ephemeral software. There would have been an abundance of third-party software for commonly found backends.
> Thanks to large AI models, we’re now on the cusp of user interfaces built from simple prompts.
Idk. If this were true, I feel like we’d be seeing the implosion of a lot of companies’ valuations. The “we’re on the cusp” claim keeps being presented uncritically, as if progress in LLMs hadn’t completely plateaued over the past year.
I don’t think we’re anywhere close to replacing real humans with brains. If anything, it’s becoming clearer by the day that LLMs are not doing anything like the abstract symbolic reasoning needed to build true quality software.
Would that be the same cusp self-driving cars were on 5 years ago? I think the benefits of LLMs are farther in the future but the (massive) costs are here today. Folks, invest your retirement savings carefully.
We take a lot of things today at face value, and that's not necessarily how things will keep working.
WHY do I still need to update fields in a CRM? Why doesn't it just know that I've had an e-mail/meeting (auto-transcribed and tracked) and update them accordingly? A rough sketch of that idea is below.
The AI-native variants of CRMs will look different, for sure.
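The mechanical half of that wish is easy enough to sketch. Here's a minimal illustration, assuming a placeholder call_llm for whatever model endpoint you'd wire in and a plain dict standing in for the CRM record; every name here is hypothetical:

    import json

    def call_llm(prompt: str) -> str:
        # Placeholder for whichever model endpoint you actually use.
        raise NotImplementedError("wire up your model of choice here")

    PROMPT_TEMPLATE = """Extract CRM field updates from this meeting transcript.
    Return a JSON object with any of: next_step, deal_stage, budget, decision_maker.
    Transcript:
    {transcript}
    """

    def extract_crm_updates(transcript: str) -> dict:
        """Ask the model for structured field updates and parse its JSON reply."""
        reply = call_llm(PROMPT_TEMPLATE.format(transcript=transcript))
        return json.loads(reply)

    def apply_updates(crm_record: dict, updates: dict) -> dict:
        """Merge the extracted fields into the CRM record (a plain dict here)."""
        return {**crm_record, **updates}

The glue isn't the hard part; trusting the extraction enough to let it write to the system of record unattended is.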
See, the "SaaS is dead" falls in a similar trope like "I can build facebook/uber/twitter/instagram/blabla in a weekend". They make this assumption because they usually lack experience. Sure you can, and soon you will learn what 80/20 means in software development and why most people prefer to rather pay for a known working product than to wrangle the 80/20 prompt after prompt.
This is pretty similar to contemplating rewriting an app. There's usually a lot of hard-won knowledge and nuance built into legacy software, edge cases that are already dealt with. Rewriting the app may be a good idea, but you have to ensure you're not losing what the legacy code learned.
You can certainly make one-off apps to deal with things as they come up. But unless you are already an expert on what you need, you will still spend time building (or vibe coding), reacting to domain holes that you could also just spend $20/mo to ignore entirely.
Recipe sites across the internet and your banking app don't expose open API endpoints. Maybe the AI could crawl them, but does that work at scale? Can it do so without errors or getting blocked?
And for lots of apps, it's about users and data. There's been a million Twitter clones over the years but very few ever caught on. An AI generated Twitter won't get users...
Every. Time.
Their other blog post talks about how writing code is a thing of the past. They also have a gem where they say:
> Think garbage collection versus free(): once the platform got smarter, we happily stopped counting pointers.
No.. no we didn't.
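GC did take the pointer counting off our hands, but the lifetime thinking didn't go away. A tiny, made-up Python example of the kind of retention the collector won't save you from: a long-lived registry keeps every handler reachable until someone remembers to unregister it.

    # GC frees unreachable objects, but reachability is still your problem:
    # this module-level registry keeps every handler alive for the life of
    # the process unless it is explicitly unregistered.
    _handlers = []

    def register(handler):
        _handlers.append(handler)

    def unregister(handler):
        _handlers.remove(handler)

    class ExpensiveSession:
        def __init__(self):
            self.buffer = bytearray(10_000_000)   # ~10 MB held per session
            register(self.on_event)               # forget to unregister and it leaks

        def on_event(self, event):
            self.buffer.extend(event)

        def close(self):
            unregister(self.on_event)             # still required, GC or not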
The gap is smaller for sure, but it's not gone.
I feel like an old timer, looking back on history repeating itself.