DRAM alternates between feast and famine; it's the nature of a business where the granularity of investment is so huge (you have a fab or you don't, and they cost billions - tens of billions by now). So, it will swing back. Unfortunately it looks like maybe 3-5 years on average, from some analysis here:
https://storagesearch.com/memory-boom-bust-cycles.html
(That's just me eyeballing it, feel free to do the math)
In a traditional pork cycle there's a relatively large number of players and a relatively low investment cost. The DRAM market in the 1970s and 1980s operated quite similarly: you could build a fab for a few million dollars, and DRAM could be made in a fab that also churned out regular logic - it's how Intel got started! There were dozens of DRAM-producing companies in the US alone.
But these days the market looks completely different. It is roughly equally divided between SK Hynix, Micron, and Samsung. Building a fab costs billions and can easily take 5 years - if not a decade - from start to finish. Responding to current market conditions is basically impossible; you have to plan for the market you expect years from now.
Ignoring the current AI bubble, DRAM demand has become relatively stable - and so has the price. Unless there's a good reason to believe the current buying craze will last over a decade, why would the DRAM manufacturers risk significantly changing their plans and potentially creating an oversupply in the future? It's not like the high prices are hurting them...
Also, current political turbulence makes planning for the long term extremely risky.
Will the company be evicted from the country in 6 months? A year? Will there be 100% tariffs on competitors' imports? Or 0%? Will there be an anti-labor gov't in effect when the investment matures, or a pro-labor one?
The bigger the investment, the longer the investment timeframe, and the more modest the returns - the harder it is to make the investment happen.
High risk requires a correspondingly high potential return.
That everyone has to pay more for current production is a side effect of the uncertainty, because no one knows the odds of even future production actually happening, let alone the next fancy whiz-bang technology.
I wouldn't be so sure. I've seen analyses making the case that this new phase is unlike previous cycles and DRAM makers will be far less willing to invest significantly in new capacity, especially in consumer DRAM rather than enterprise DRAM or HBM (and even there, there's still a significant risk of the AI bubble popping). The shortage could last a decade. Right now DRAM makers are benefiting to an extreme degree, since they can basically demand any price for what they're making now, reducing the incentive even more.
The most likely direct response is not new capacity, it's older capacity running at full tilt (given the now higher margins) to produce more mature technology with lower requirements on fabrication (such as DDR3/4, older Flash storage tech, etc.) and soak up demand for these. DDR5/GDDR/HBM/etc. prices will still be quite high, but alternatives will be available.
...except current peak in demand is mostly driven by build-out of AI capacity.
Both inference and training workloads are often bottlenecked on RAM speed, and trying to shoehorn older/slower memory tech in there would require a non-trivial amount of R&D to widen the memory bus on CPUs/GPUs/NPUs, which is unlikely to happen - those are in very high demand already.
Even if AI stuff really does need DDR5, there must be lots of other applications that would ideally use DDR5 but can make do with DDR3/4 if there's a big difference in price.
Yes, but if new capacity is also redirected to be able to be sold as enterprise memory, we won't see better supply for consumer memory. As long as margins are better and demand is higher for enterprise memory, the average consumer is screwed.
I mean, the only difference we care about is how much of it is actual RAM vs HBM (to be used on GPUs) and how much it costs. We want it to be cheap. So yes, there's a difference if we're competing with enterprise customers for supply.
I don't really understand why every little thing needs to be spelled out. It doesn't matter. We're not getting the RAM at an affordable price anymore.
Doesn't the same factory produce enterprise (i.e. ECC) and consumer (non-ECC) DRAM?
If there is high demand for the former due to AI, they can increase its production to generate higher profits. This cuts the production capacity of consumer DRAM and leads to higher prices in that segment too. Simple supply & demand at work.
Conceptually, you can think of it as "RAID for memory".
A consumer DDR5 module has two 32-bit-wide buses, each implemented using, for example, 4 chips that handle 8 bits apiece, operating in parallel - just like RAID 0.
An enterprise DDR5 module has 40-bit-wide buses implemented using 5 chips each. The memory controller uses those 8 additional bits to store parity calculated over the 32 regular bits - just like RAID 4 (or RAID 5, I haven't dug into the details too deeply). The whole magic happens inside the controller; the DRAM chip itself isn't even aware of it.
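To make the RAID analogy concrete, here's a minimal Python sketch of the RAID-4-style idea: parity is just the XOR of the data lanes, and a failed lane can be rebuilt from the survivors. (Illustrative only - real DDR5 ECC uses proper SECDED-class codes in the memory controller, not bare XOR.)

    def parity(lanes):
        # XOR across lanes, as a RAID-4-style parity chip would store it
        p = 0
        for lane in lanes:
            p ^= lane
        return p

    # Four 8-bit "chips" carrying a 32-bit word, plus one parity chip
    data = [0xDE, 0xAD, 0xBE, 0xEF]
    p = parity(data)

    # If chip 2 fails, rebuild its byte from the survivors plus parity
    recovered = parity([data[0], data[1], data[3], p])
    assert recovered == data[2]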
Given the way the industry works (some companies do DRAM chip production, it is sold as a commodity, and others buy a bunch of chips to turn them into RAM modules) the factory producing the chips does not even know if the chips they have just produced will be turned into ECC or non-ECC. The prices rise and fall as one because it is functionally a single market.
Each memory DIMM/stick is made up of multiple DRAM chips. ECC DIMMs have an extra chip for storing the error-correcting parity data.
The bottleneck is with the chips and not the DIMMs. Chip fabs are expensive and time consuming, while making PCBs and placing components down onto them is much easier to get into.
A LOT of businesses learned during Covid they can make more money by permanently reducing output and jacking prices. We might be witnessing the end times of economies of scale.
The idea is someone else comes in that's happy to eat their lunch by undercutting them. Unfortunately, we're probably limited to China doing that at this point as a lot of the existing players have literally been fined for price fixing before.
I think in part it is a system-level response to the widespread just-in-time approach of those businesses' clients. A just-in-time client is very "flexible" on price when supply is squeezed. After this back and forth, I think we'll see a return to some degree of supply buffering (warehousing) to dampen the supply and price shocks in the pipelines.
No, a wafer is very much not a wafer. DRAM processes are very different from making logic*. You don't just make memory in your fab today and logic tomorrow. But even when you stay in your lane, the industry operates on very long cycles and needs scale to function at any reasonable price at all. You don't just dust off your backyard fab to make the odd bit of memory whenever it is convenient.
Nobody is going to do anything if they can't be sure that they'll be able to run the fab they built for a long time and sell most of what they make. Conversely fabs don't tend to idle a lot. Sometimes they're only built if their capacity is essentially sold already. Given how massive the AI bubble is looking right now, I personally wouldn't expect anyone to make a gamble building a new fab.
* Someone explained this at length on here a while ago, but I can't seem to find their comment. Should've favorited it.
> Sometimes they're only built if their capacity is essentially sold already.
"Hyperscalers" already have multi-year contracts going. If the demand really was there, they could make it happen. Now it seems more like they're taking capacity from what would've been sold on the spot or quarterly markets. They already made their money.
Well, I've experienced both to some degree in the past. The previous long stretch of very similar hardware performance was when PCs were exorbitantly expensive and the Commodore 64 was the main "home computer" (at least in my country), over the late 80s and early 90s.
That period of time had some benefits. Programmers learned to squeeze absolutely everything out of that hardware.
Perhaps writing software for today's hardware is again becoming the norm rather than being horribly inefficient and simply waiting for CPU/GPU power to double in 18 months.
I was lucky. I built my AM5 Ryzen 7950X PC with 2x48GB DDR5 two years ago. I just bought a 4x48GB kit a month ago with the idea of building another home server with the old 2x48GB kit.
Today my old G.Skill 2x48GB kit costs double what I paid for the 4x48GB.
Furthermore, I bought two used RTX 3090s (for AI) back then. A week ago I bought a third one for the same price (for VRAM in my server).
I think that goes to show that official inflation benchmarks are not very practical/useful in terms of buckets of things that people actually buy or desire. If the bucket that measured inflation included computer parts (GPUs?), food, and housing - i.e. all the things a geek really needs - inflation would be way higher...
I just gave up and built an AM4 system with a 3090 because I had 128GB of DDR4 UDIMMs on hand; the whole build cost less than the memory alone would have for an AM5/DDR5 build.
Really wish I could replace my old Skylake-X system, but even DDR4 RDIMMs for an older Xeon are crazy now, let alone DDR5. Unfortunately I need slots for 3x Titan Vs for the 7.450 TFLOPS each of FP64. Even the 5090 only does 1.637 TFLOPS for FP64, so just hoping that old system keeps running.
If you don't need full IEEE-754 double precision, the Ozaki scheme (emulation with tensor cores) might do the trick. It's been added (just a little bit) to cuBLAS recently.
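For the curious, the core trick behind such emulation schemes is splitting each double into lower-precision slices whose pairwise products are exact, then summing the partial products. Here's a minimal sketch using the classic Dekker split - the actual Ozaki scheme (and the cuBLAS implementation) slices differently and targets tensor cores, so treat this purely as an illustration of the flavor:

    import numpy as np

    def dekker_split(a):
        # Split a double (53-bit significand) into two ~26-bit halves,
        # exactly: a == hi + lo.
        c = np.float64((1 << 27) + 1) * a
        hi = c - (c - a)
        lo = a - hi
        return hi, lo

    a, b = np.float64(1.0) / 3, np.float64(np.pi)
    ah, al = dekker_split(a)
    bh, bl = dekker_split(b)

    # Each partial product is exact (26+26 significand bits fit in 53),
    # so narrower multiply units could compute them; summing the partials
    # in FP64 recovers the full-precision product up to final rounding.
    emulated = ah * bh + ah * bl + al * bh + al * bl
    print(emulated, a * b)  # the two agree to within a final rounding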
My 64gb DDR5 kit started having stability issues running XMP a few weeks out of warranty. I bought it two years ago. Looked into replacing it and the same kit is now double the price. Bumping the voltage a bit and having better cooling gets it through memtest thankfully. The fun of building your own computer is pretty much gone for me these days.
I guess I lucked out. I bought a 768GB workstation (with 9995wx CPU and rtx 6000 Pro Blackwell GPU) in August. 96GB modules were better value than 128GB. That build would be a good bit pricier today looks like.
Interesting that Samsung put their prices up 60% today, and a retailer who bought their stock at the old price feels compelled to put their prices up 2.5x.
When the AI bubble bursts we can get back to the old price
Such is life. I suggest finding a less volatile hobby, like crocheting.
Actually, the textile market is pretty volatile in the US these days with Joann out of business. Pick your poison, I guess? There's little room for stability in a privately-owned world.
Last night, while writing a LaTeX article, with Ollama running for other purposes, Firefox with its hundreds of tabs, and multiple PDF files open, my laptop's memory usage spiked to 80GB... And I was happy to have 128GB. The spike was probably due to some process stuck in an effing loop, but the process consuming more and more RAM didn't have any impact on the system's responsiveness, and I could calmly quit VSCode and restart it with all the serenity I could have in the middle of the night.
Is there even a case where more RAM is not really better, except for its cost?
On consumer chips, the more memory modules you have, the slower they all run. I.e. a single module of DDR5 might run at 5600MHz, but if you have four of them they all get throttled to 3800MHz.
Mainboards have two memory channels, so you should be able to reach 5600MHz on both, and dual-slot mainboards have better routing than quad-slot mainboards. This means the practical limit for consumer RAM is 2x48GB modules.
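Back-of-the-envelope, that throttling costs about a third of peak bandwidth - a quick sketch using the transfer rates quoted above (illustrative figures; attainable speeds vary by board, ICs, and silicon lottery):

    # DDR5 moves 8 bytes per DIMM per transfer, over two channels
    # on consumer boards; MT/s figures are the ones quoted above.
    def peak_gb_per_s(mt_per_s, channels=2, bytes_per_transfer=8):
        return mt_per_s * 1e6 * channels * bytes_per_transfer / 1e9

    print(peak_gb_per_s(5600))  # ~89.6 GB/s with two DIMMs at rated speed
    print(peak_gb_per_s(3800))  # ~60.8 GB/s with four DIMMs throttled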
If you are working on an application that has several services (database, local stack, etc.) as docker containers, those can take up more memory. Especially if you have large databases or many JVM services, and are running other things like an IDE with debugging, profiling, and other things.
Likewise, if you are using many local AI models at the same time, or some larger models, then that can eat into the memory.
I've not done any 3D work or video editing, but those are likely to use a lot of memory.
Having recently upgraded from 96GB to 192GB, I'm pretty happy. I run many containers, have 20 windows of VSCode, and so on. Plus AI inference on CPU when 48GB of VRAM is not enough.
So glad I bought 128GB of DDR5 for my desktop a year ago... I usually don't need it all, but it was cheap at the time. Most I use it for is CPU offloading for LLMs too big for my 3090 and running 10 or so small VMs for my projects.
I'm especially annoyed that this is most likely intentional.
(Not at all)OpenAI saw they were falling behind their competitors (GPT-5 and 5.1 were progressively worse for my use case - actual problem solving and tweaking existing scripts - while the competition kept getting better; Claude Sonnet was miles ahead, and I used GPT only due to the lower price). Now open-weights models like Qwen3 and Kimi K2 have not only exceeded their capability - you can run them at home if you have the hardware, or for peanuts on a variety of providers. Cheaper hardware like Strix Halo (and Nvidia DGX) made 128GB of VRAM achievable for enthusiasts. And Google is eating their lunch with Gemini.
All while their CFO starts talking about the government bailing them out of spending they cannot possibly fund.
Of course they will attempt to blow up the entire hardware market, so if the AI flops they will at least be able to rent out hardware like AWS does.
- the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns. Even if it squeezes out every single other sector that happens to want to use SDRAM to do things OTHER than buffer memory before it's fed into a PCIe lane for a GPU.
- I'm really REALLY glad I decided to buy brand new gaming laptops for my wife and me just a couple months ago, after not having upgraded our gaming laptops for 7 and 9 years respectively. It seems like gamers are going to have this the worst - GPUs have been f'd for a long time due to crypto and AI, and now even DRAM isn't safe. Plus SSD prices are going up too. And unlike many other DRAM users, where it's a business thing and they can to some degree just hike prices to cover, gamers are obviously not running businesses. It's just making the hobby more expensive.
It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
There's too much group-think in the executive class. Too much forced adoption of AI, too much bandwagon hopping.
The return-to-office fad is similar, a bunch of executives following the mandates of their board, all because there's a few CEOs who were REALLY worked up about it and there was a decision that workers had it too easy. Watching the executive class sacrifice profits for power is pretty fascinating.
Edit: A good way to decentralize the power and have better decision making would be to have less centralized rewards in the capital markets. Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed. Most market economics assumes that there's somewhat equal decision-making power amongst the econs. We are quickly trending away from that.
The funniest thing is that somehow the executive class is even more out of touch than they used to be.
At least before there was a certain common baseline, derived from everyone watching the same news and reading the same press. Now they are just as enclosed in their thought bubbles as everyone else. It is entirely possible for a tech CEO to have a full company of tech workers despising the current plan, and yet be constantly reinforced by LinkedIn and ChatGPT.
The out of touch leader is a trope that I'm willing to bet has existed as long as we've had leaders.
I remember first hearing the phrase "yes man" in relation to a human ass kisser my dad worked with in like 1988.
It's very easy to unknowingly surround yourself with sycophants and hangers-on when you literally have more money than some countries. This is true now and has been true forever. I'm not sure they're more out of touch, so much as we're way more aware?
It's more than the fact they are surrounded by sycophants. It's also that, despite the mythology the executive-worship-industry tries to paint, CxOs and board members of companies are just not very creative or visionary people. They largely spend their time looking at their peers and competitors for hints about what they should be doing. And today, those hints all are "do AI". They're not sitting down and deriving from first principles that AI is the way--they're seeing their buddies steering other companies and they're all saying AI is the way, so they say AI is the way, too.
> They're not sitting down and deriving from first principles that AI is the way--they're seeing their buddies steering other companies and they're all saying AI is the way, so they say AI is the way, too.
I think you're underestimating a bit. We must implement AI because they were able to sell it so well that they got billions of dollars from investors (see all the money coming from Qatar/Saudi Arabia etc.). That's a lot of money coming in that allows them to innovate/etc.
Sounds quite a bit like the stock market. The more sober and cynical of them see fads as fads - irrational but powerful movements - and ride the waves, selling to a greater fool.
Out-of-touch leaders have existed for millennia. "The Emperor's New Clothes" was published in 1837 as a retelling of a much older folk tale. Sima Qian criticizes out-of-touch lords and emperors in his book about ancient history, written in the 1st century BC. Maybe there is even older evidence.
No surprise, the CxO class barely lives in the same physical world as us peasants. They all hang out together in their rich-people restaurants and rich-people galas and rich-people country clubs and rich-people vacation spots, socializing with other rich-people and don't really have a lot of contact with normal people, outside of a handful of executive assistants and household servants.
This was Lina Khan's big thing, and I'd argue that our current administration is largely a result of Silicon Valley no longer being able to get exits in the form of mergers and IPOs.
Perhaps a better approach to anti-monopoly and anti-trust is possible, but I'm not sure anybody knows what that is. Khan was very well regarded and I don't know anybody who's better at it.
Another approach would be a wealth and income taxation strategy to ensure sigmoid income for the population. You can always make more, but with diminishing returns to self, and greater returns to the rest of society.
Sorry, how did she stand in the way of IPOs? She was against the larger players providing easy off-ramps to smaller players but I don’t recall anything about IPOs. Indeed, Figma’s IPO is precisely because she undid the pending Adobe / Figma merger if I recall correctly.
A better approach might be to farm out shares to stakeholders. That seems a lot more dynamic and self-correcting than periodic taxation battles after the fact.
Khan was largely ineffectual. The current administration, if it can be blamed on SV at all, is more likely to be the result of Harris's insanely ill-timed proposal to tax unrealized capital gains just as election season was kicking into high gear.
IMO Khan was by far the best we've had in at least two decades. Her FTC even got a judge to rule to break up Google! The biggest downside Khan had was being attached to a one-term president. There just aren't that many court cases against trillion-dollar companies that you can take from investigation to winning the appeal in 4 years.
All true, and I'm not making a value statement about whether her influence was good or bad. However, Khan only threatened the oligarchs' companies, while Harris point-blank threatened their fortunes.
Don't pick a fight with people who buy ink by the barrel and bandwidth by the exabyte-second. Or at least, don't do it a month before an election.
The oligarchs hated Khan with the intensity of a thousand burning suns. If you listened to All-In, all they were doing was ranting about her and Gary Gensler.
That being said, Kamala's refusal to run on Khan's record definitely helped cost her the election. She thought she could play footsie with Wall Street and SV by backchanneling that she would fire Khan, so she felt like she couldn't say anything good about Khan without upsetting the oligarchs - but what Khan was doing was really popular.
Samsung lost a large percentage of market share to their competitors in the last couple years, so I'm pretty sure they already have to participate in markets.
I think a better solution is an exponential tax on company size. I.e. once a company starts to earn above, say, 1 billion, its income will be taxed at an ever-increasing rate. Or put another way: use taxes to break the power-law, winner-takes-all distribution of company sizes into a Gaussian one.
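A toy sketch of the shape being proposed (the threshold, base rate, and growth constant here are placeholders, not a worked-out policy):

    def marginal_rate(income_billions, threshold=1.0, base=0.20, k=0.5):
        # Flat base rate up to the threshold; above it, the marginal
        # rate climbs toward (but never reaches) 100% with size.
        if income_billions <= threshold:
            return base
        excess = income_billions - threshold
        return base + (1 - base) * (1 - 2 ** (-k * excess))

    for size in (0.5, 1, 2, 5, 20):
        print(size, round(marginal_rate(size), 3))
    # -> 0.2, 0.2, 0.434, 0.8, 0.999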
> I think a better solution is an exponential tax on company size. I.e. once a company starts to earn above, say, 1 billion, its income will be taxed at an ever-increasing rate.
This is in the right spirit but you want two things to be different about it.
The first is that the threshold for a given industry doesn't make sense as a dollar amount, it makes sense as a market share percentage. Having more than 15% market share should be a thing companies don't want, regardless of whether it's a $100 trillion industry or a $100 million one.
And the second is that taxes create a perverse incentive for the government. You absolutely do not want the government to have even more of a financial incentive to sustain and create more of the companies of that size. What you want is to have fewer of them.
So, what you want is a rule that if a company has more than 15% market share, the entire general public is allowed to sue them into bankruptcy for the offense of market consolidation. Which also removes the problem where they buy off the government prosecutors, because if they commit the offense then anybody can sue them.
And who determines what makes for a good market share size to be the threshold?
And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (but you wouldn't know it). It's bad policy imho.
A better way would be for gov't to increase competition by adding supply, or demand, whichever one is the bottleneck to competition. If a company such as AWS is gaining a lot of market share but its profit margins are still high, then the gov't should incentivize competition by funding or giving loans to businesses that want to compete with AWS.
However, if AWS's profit margins, even at high market share, remain very low (e.g., Amazon's commerce side), then there's no need for the gov't to "step in" at all, as there would be no incentive for any competitor to enter the market due to low margins.
The goal is to not have it happen: the company is going to see that they're only slightly below the threshold and voluntarily split themselves into smaller pieces to buy themselves a safety margin, because if they don't, everybody knows the lawsuits are going to vaporize them once they exceed the threshold.
> And who determines what makes for a good market share size to be the threshold?
Anything in the vicinity of 5%-15% would be fine.
> And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (but you wouldn't know it).
This is extremely rare and the circumstances where it happens aren't a mystery. It's when entering the market has extremely high fixed costs but then the unit cost of usage is negligible, e.g. it costs a huge amount of money to install water and sewer but then the incremental cost of someone washing their hands is insignificant.
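In textbook terms this is the declining-average-cost case: with fixed cost F and negligible marginal cost c, the average cost

    AC(q) = F/q + c

falls monotonically with quantity q, so splitting the market across several providers only duplicates F.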
For those things you either have the government do them, or if it's a private company then it's a regulated utility which is completely banned from anything that even vaguely resembles vertical integration as the price of being allowed to have more than the threshold amount of market share.
> A better way would be for gov't to increase competition by adding supply, or demand, whichever one is the bottleneck to competition.
The problem is generally caused by the incumbents capturing the government and then enacting rules that inhibit rather than increase competition. That's why you need anyone to be able to initiate the lawsuit, so they can't capture the government department which is supposed to be thwarting them because then it's the entire public.
So why not solve this issue directly? Transparency, auditing, public awareness, etc. are needed to prevent regulatory capture. Public apathy is the reason why it is currently "easy" to capture regulators.
The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay for a lawsuit (which has costs) when the result doesn't net them more profit. So unless the lawsuit enables the accuser to wholesale take a piece of that company as private property from the owners - which no current law would allow, nor are there precedents for - why would anyone expend private money for a public good?
And in any case, I don't see the apathy going away, even if the lawsuit were free. The same apathy is what currently allows regulatory capture in the first place. So solving public apathy, first and foremost, is the solution.
> Transparency, auditing, public awareness, etc. are needed to prevent regulatory capture. Public apathy is the reason why it is currently "easy" to capture regulators.
It's mostly easy because the people doing it are good at lying. When they create a rule it isn't called the "mandate this company's product rule" or the "increase fixed costs to lock out smaller competitors rule", it's sold as a safety measure or consumer protection or some other pretext, even though the effect is to raise costs to the benefit of the companies getting the money or exclude competitors to the benefit of the incumbents.
Or they simply don't prosecute antitrust violations, and then there is nothing to audit because there is nothing happening, meanwhile people are kept distracted with other things.
> The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay for a lawsuit (which has costs) when the result doesn't net them more profit.
It does net them more profit. The premise is that having more than the threshold amount of market share is a strict liability antitrust violation, which allows any customer or prospective customer (i.e. anyone) to sue them for it. The person who files the lawsuit would get the money, the same as someone who sues a company for pollution or fraud.
The point of letting people sue you for polluting or fraud or, in this case, market consolidation, isn't to make plaintiffs rich, it's to deter the thing you don't want companies to do. The goal isn't to have a lot of lawsuits, the goal is to have companies not want the market to consolidate and actively prevent it because if it happens they'll get sued.
> So solving public apathy, first and foremost, is the solution.
Apathy is cyclical. People don't care until the problem gets bad enough, then they care enough to demand change and make it go away for a while, then they stop caring until it gets bad enough again.
But you don't want people to have to die or get severely abused before the problem gets addressed. What you want is to change the structure of the system to prevent it from getting that bad to begin with, by making sure that the power to nip the problem in the bud (i.e. stop market consolidation at 5% or 15% instead of 50% or 90%) is held by someone who will actually exercise it, which can be accomplished by granting that power to everyone affected, which in this context is each and every member of the public.
This would permanently increase DRAM prices. Memory fabricators either earn billions of dollars in income each year or they can't keep going. There are no little Mom and Pop businesses that can do photolithography on leading process nodes.
Chip fabs used to be like book publishers; you don't have to own a printing press to be an author. Carver Mead even described his vision of the industry that way.
Nowadays you have to get your cell libraries and a large chunk of your toolchain from the fab. Of course it's laundered through Cadence+Synopsys, but it's still coming from the fab. You have to buy your masks from the fab (heck, they aren't even allowed to leave the fab, so do you really own them?). And on and on.
For the record I don't agree with the "exponential" part, but otherwise this is an underappreciated and powerful technique.
In another comment you proposed a sane version of the parent proposal. I wouldn't have commented if fpoling had originally floated that scheme. I was mainly objecting to drastically increasing taxes "once a company starts to earn above, say, 1 billion" without regard for the minimum viable scale of different businesses.
I can still make a book like that in my basement. People do this as a hobby now. You can still build chips like that in your garage. People do this as a hobby now.
These things DO NOT SCALE... you can't have 10,000 people running printing presses in their basements to crank out the NYT every day. A modern chip fab has more in common with the NYT's printing plant than with what you can crank out in your garage.
Let's look at TSMC's plant in AZ. They went and asked Intel, "Hey, where are you sourcing your sulfuric acid from?" When they looked at the American vendors, TSMC asked Intel, "How are you working with this?" Intel's response was that it was the best they could get.
It was not.
TSMC now imports sulfuric acid from Taiwan, because it needs to be outrageously pure. Intel is doing the same.
Every single part, component, step, and setup in the chain is like that. There is so much arcane knowledge that the loss of workers represents a serious setback. There are people in the production chain, with PhDs, who are literally training their successors because that's sort of the only option.
Do you know who has been trying the approach you are proposing? China. It has not worked.
> I can still make a book like that in my basement. People do this as a hobby now. You can still build chips like that in your garage. People do this as a hobby now.
You can absolutely manufacture a convincingly-professional, current-generation book in your basement with a practically-small capital investment.
You cannot manufacture a convincingly-professional chip (being generous: feature size and process technology from the last two decades) in your basement without a 6-7 figure capital expenditure, and even then - good luck.
Is that revenue, or profit? If revenue, it'll slam certain kinds of high-volume low-profit businesses, and if it's profit then the company will just arrange to have big compensation "expenses" for executives.
The latter would have to be backstopped by taxes on individual income.
The sane version of this proposal omits the "exponential" part, applies to profits (net income), and makes the tax rate industry-specific (just like Washington State's revenue tax).
> There's too much group-think in the executive class.
I think this is actually the long tail of "too big to fail." It's not that they're all thinking the same way, it's that they're all no longer hedging their bets.
> we have made the rewards too extreme and too narrowly distributed
We give the military far too much money in the USA.
> We give the military far too much money in the USA.
~ themafia, 2025
(sorry)
On a more serious note the military is sure a money burning machine, but IMHO it's only government spending, when most of the money in the US is deliberately private.
The fintech sector could be a bigger representation of a money vacuuming system benefiting statistically nobody ?
It's around 3.4% of GDP. That puts us in the top 10% or so worldwide, but it's not ridiculously high. It's on a similar level as countries such as Morocco and Colombia, which aren't known for excessive military spending. It's still kind of high for a country with no nearby enemies, but for the most part, US military spending is large because the US economy is large.
It's around 16% of the total federal budget. To be fair, about 1/3 of "military spending" is actually salaries, medical, housing, and GI/retirement costs.
It's also the case that none of the CIA, NSA or DHS budgets show up under the military, even though they're performing some of the same functions that would be handled by militaries in other countries.
We also have "black appropriations." So the total of the spending on surveillance and kinetic operations is often unknowable. Add to this the fact the Pentagon has never successfully performed an audit and I think people are right to be suspicious of the topline "fraction of GDP" number.
Military spending is a type of welfare for the wealthy; it is one of the only forms of public or government spending that doesn't crowd out private investors the way public housing or publicly funded hospitals do. Those who benefit from high military spending and the contractor class often vote more conservative than typical for their demographic and economic peers. Spending has been high since WW2, with maybe a slight drop in the late 70s. The current stat of "3.4% of GDP" ignores the fact that a large part of our national debt is from military and war budgets. I saw a statistic in the mid-1990s that if we had kept our military budget at inflation-adjusted levels equal to 1976's, our debt would have gone to zero as early as 1994.
Our national debt is from our unwillingness to raise taxes to balance the budget. Federal spending is somewhat high historically, but not absurdly so. Relative to the economy, it's at about the same level as it was in the 1980s. Measured as a percentage of GDP, the current military budget is the lowest since before the Second World War, aside from a brief period at the end of the 1990s where it was slightly lower.
Comparing budgets by adjusting for inflation doesn't make any sense. A budget that served a country of 218 million in 1976 would, when adjusted for inflation, serve a country of 218 million in 2026. Percentage of GDP is what you want to look at.
But federal spending has been historically high ever since like the New Deal.
Budget-to-GDP ratio in the US is close to 40%. (On that note, you should really consider federal + state combined rather than just federal.)
In the early 1900s this same ratio was around 5-10%.
It has been increasing pretty much everywhere during the 20th century. It has made me wonder whether much of the prosperity we've seen and felt might not be a result of this ever-increasing percentage. Essentially we're spending more and more and that makes it feel like we're progressing faster than we are. Eventually it's going to have to stop though and I dread what happens when we do.
Exactly. So instead of electing the people who will allocate the resources, the people who were successful at one thing are given the right to manage resources for whatever they wish - and they can keep being very wrong for a very long time while other people are deprived of those resources due to the mismanagement and can't do anything about it.
In theory I guess this creates a demand that should be satisfied by the market, but in reality it seems like when wealth is too concentrated in the hands of the few who make all the decisions, the market is unable to act.
Centralized planning is needed in any civilization. You need some mechanism to decide where to put resources, whether it's organizing the school's annual excursion or constructing the national highway system.
But yeah, in the end companies behave in trends: if some companies do it, then the other companies have to do it too, even if this makes things less efficient or is outright hurtful. We can put that down to the human factor, but I think even if we replaced all CEOs with AIs, those AIs would all see the same information and make similar decisions based on it.
There is a Pascal's-wager argument to be had: for each individual company, the punishment for not playing the AI game and missing out on something big is bigger than the punishment for wasting resources by allocating them toward AI efforts, plus annoying customers with AI features they don't want or need.
> Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed.
The USA has rid itself of its barons multiple times. There are mechanisms in place, but I am not sure people are really going to exercise those means any time soon. If this AI stuff is successful in the real world as well, then increasing amounts of power will shift away from the people to those controlling the AI, with all the consequences that has.
If you get paid for being rich in proportion to how rich you are -- because that's how assets work -- it turns into an exponential, runs away, and concentrates power until something breaks.
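In the simplest model, returns proportional to wealth are exactly the exponential-growth equation

    dW/dt = rW  =>  W(t) = W_0 e^{rt}

so any persistent spread in r compounds fortunes apart without bound until something outside the market intervenes.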
How is this centralized planning? It's corporate decision-making operating in a free market to optimize for what the majority shareholders want (though the majority of shares are owned by few).
I think the implied thought (?) is there is a similarity between central planning and oligopoly bandwagoning. To my eye, the causes and dynamics are different enough to warrant bucketing them separately.
> Every corporation is a (not so) little pocket of centrally planned economy.
This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
The dominant criticism of central planning is trying to set production quantities without prices. Firms (generally) don’t do this.
A company prices resources within itself completely arbitrarily. How much an hour of employee A's work is worth within the company, and how much using a paperclip costs, has no relation to what these things actually cost in real money. Once they are acquired by the company, they are utilized not according to their value but according to central plans instead. This way a paperclip might get vastly overvalued and scarce while an hour of work is vastly undervalued and wasted.
This is why I think taxes on the very wealthy should be so high that billionaires can't happen. The usual reasons are either about raising revenue or are vague ideas about inequality. It doesn't raise enough revenue to matter, and inequality is a fairly weak justification by itself.
But the power concentration is a strong reason. That level of wealth is incompatible with democracy. Money is power, and when someone accumulates enough of it to be able to personally shake entire industries, it's too much.
You'll just get a different form of power concentration. Do you think the Soviet Union didn't have power concentration in individuals? Of course it did, that's why the general secretary of the party was more important than the actual heads of state and government.
>It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
No, it's pure capitalism where Atlas shrugged and ordered billions worth of RAM. You might not like it but don't call it "centralized planning" or "Soviet era".
We have been living on the investment of previous centuries and decades in the West for close to 40 years now. Everything is broken but that didn't matter because everything that needed a functioning physical economy had moved to the East.
AI is the first industrial breakthrough in a century that needs the sort of infrastructure that previous industrial revolutions needed: namely a ton of raw power.
The bubble is laying bare just how terrible infrastructure is, and how we've ignored trillions in maintenance to give a few thousand people tax breaks they don't really need.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
This resonates deeply, especially to someone born in the USSR.
This is part of how free markets self-correct: misallocate resources and you run out of resources.
You can blame irrational exuberance, bubbles, or whatnot, but markets are ultimately individual choices times economic power. AI, crypto, housing, dot-com, etc. - going back through history, they all had excess because it's not obvious when to join and when to stop.
The problem is that memory manufacturing is hard enough that there are essentially 3 major companies that do it globally: Samsung, SK Hynix, and Micron.
> This is part of how free markets self-correct: misallocate resources and you run out of resources.
Except that these corporations will almost certainly get a bail out, under the auspices of national security or some other BS. The current admin is backed by the same VCs that are all in on AI.
They're treating it as a "winner-takes-all" kind of business. And I'm not sure this is a reasonable bet.
The only way the massive planned investments make sense is if you think the winner can grab a very large piece of a huge pie. I've no idea how large the pie will be in the near future, but I'm even more skeptical that there will be a single winner.
What's odd about this is I believe there does exist a winner takes all technology. And that it's AR.
The more I dream about the possibilities of AR, the more I believe people are going to find it incredibly useful. It's just the hardware isn't nearly ready. Maybe I'm wrong but I believe these companies are making some of the largest strategic blunders possible at this point in time.
It’s maybe new to you (you’re one of today’s lucky 10,000!), but this kind of market failure has been going on since at least the South Sea Bubble and the tulip mania, if not all the way back to Roman times.
It's not exactly a new type of failure. It's roughly equivalent to Ricardian rent, or pecuniary externalities as the general term. Though I suppose this is a speculative variant, which could be worse somehow.
For example: allocating resources to only a few industries deprives everyone else - small players, hobbyists, gamers, tinkerers - of opportunities to play with their toys. And small players playing with random toys are a source of multiple innovations.
The tone from the AI industry sounds more like a dependent addict by comparison. They're well past the phase where they're enjoying their fix and into the "please, just another terawatt, another container-ship full of Quadros, to make it through the day" mode.
More seriously, I could see some legitimate value in saying "no, you can't buy every transistor on the market."
It forces AI players to think about efficiency and smarter software rather than just throwing money at bigger wads of compute. This might be part of where China's getting their competitive chops from-- having to do more with less due to trade restrictions seems to be producing some surprisingly competitive products.
It also encourages diversification. There is still no non-handwavey road to sustainable long-term profitability for most of the AI sector, which is why we keep hearing answers like "maybe the Extra Fingers Machine cures cancer." Eventually Claude and Copilot have to cover their costs or die. If you're Nvidia or TSMC, you might love today's huge margins and willing buyers for 150% of your output, but it's simple due diligence to make sure you have other customers available so you can weather the day the bubble bursts.
It's also a solid PR play. Making sure people can still access the hobbies they enjoy is an easy way to say you're on the side of the mass public. It comes from a similar place to banning ticket scalping or setting reasonable prices on captive concessions. The actual dollars involved are small (how many enthusiast PCs could you outfit with the RAM chips or GPU wafer capacity being diverted to just one AI data centre?) but it makes it look like you're not completely for sale to the highest bidder.
This happens when you get worse and worse inequality in buying power. The most accurate prediction of how this all plays out, I think, is what Gary Stevenson calls "The Squeeze Out" -> https://www.youtube.com/watch?v=pUKaB4P5Qns
Currently we are still at the stage of extraction from upper/middle-class retail investors and pension funds, being sucked up by all the major tech companies that are only focused on their stock price. They have no incentive to compete, because if they do, it will ruin the game for everyone. This gets worse, and the theory (and, somewhat, history) says it can lead to war.
Agree with the analysis or not, I personally think it is quite compelling to what is happening with AI, worth a watch.
I wonder, is there any way to avoid this kind of market failure? Even a planned economy could succumb to hype - promises that improved societal efficiency are just around the corner.
> Is there any way to avoid this kind of market failure?
There are potentially undesirable tradeoffs and a whole new game of cheats and corruption, but you could frustrate rapid, concentrated growth with things like an increasing tax on raised funds.
Right now, we basically let people and companies concentrate as much capital as they want, as rapidly as they want, with almost no friction, presumably because it helped us economically outcompete the adversary during the Cold War. Broadly, we're now afraid of having any kind of brake or dampener on investments and we are more afraid of inefficiency and corruption if the government were to intervene than we are of speculation or exploitation if it doesn't.
In democratically regulated capitalism, there are levers to pull that could slow down this kind of freight train before it were to get out of control, but the arguments against pulling them remain more thoroughly developed and more closely held than those in favor of them.
There is a way, and if anyone tells you we have to go full Hitler or Stalin to do it they are liars, because the last time we let inequality cook this hard, FDR and the New Deal figured out how to thread the needle and proved it could be done.
Unfortunately, that doesn't seem to be the flavor of politics on tap at the moment.
Sam Altman cornering the DRAM market is a joke, of course, but if the punchline is that they were correct to invest this amount of resources in job destruction, it's going to get very serious very quickly and we have to start making better decisions in a hurry or this will get very, very ugly.
Yeah I know HN is going to hate me for saying that.
If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Once "better served" is quantified, you know the coefficient for taxation.
Make no mistake, this coefficient will be a political football, and will be fought over, just like the Fed prime interest rate. But it's a single scalar instead of a whole executive-branch department and a hundred kilopages of regulations like we have in the antitrust-enforcement clusterfuck. Which makes it way harder to pull shenanigans.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Why? That's exactly the circumstance where the mere potential for small companies to pop up is enough to police the big company's behavior. You get lower costs (due to economies of scale) and a very low chance of monopolization, so everyone's happy. In the case of this DRAM/flash price spike, the natural "small" actors are fabs slightly off the leading edge, which will be able to retool their production and supply these devices at a higher profit.
>society is better served by having it produced by the few small companies than the one big company.
Well, assuming the scale couldn't be used for the benefit of society rather than to milk it dry. But yes, probably the best option with a reasonable chance at success - eventually, maybe.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
How so? Costs will be higher with multiple small producers, resulting in higher costs for customers. That's the opposite of "society is served better".
We draw the line at monopolies, which makes sense.
By the time a company becomes a monopoly, it is immensely powerful - politically and monetarily - and getting rid of it or splitting it up is near impossible. Monopoly laws are near impossible to apply, as the corporation has sufficient money and influence to turn politicians into servile puppets.
Best to nip corpos in the bud before they gain more revenue than a nation state and become "too big to fail".
Just like some of the crypto booms and busts, if you time it right this could be a good thing. Buy on a refresh cycle when AWS dumps a bunch of chips and RAM used or refurbished (some places even offer warranty, which is nice).
And if the market crashes or takes a big dip then temporarily eBay will flood with high end stuff at good prices.
Sucks for anyone who needs to upgrade in the next year or two though !
Markets are voting machines in the short term and weighing machines in the long term. We’re in the short term popularity phase of AI at the moment. The weighing will come along eventually.
> resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns
That's basically what the rich usually do. They command a disproportionate amount of resources and misallocate them freely on a whim, outside of any democratic scrutiny, squeezing an incredible number of people and small businesses out of something.
Whether that's a strength of the system or a weakness, I'm sure some research will show.
It's a little ironic to call this a market failure due to resource misallocation when prices are high - high prices are exactly how misallocation is avoided.
I'm a little suspicious that "misallocation" just means it's too expensive for you. That's a feature, not a bug.
> the insane frothing hype behind AI is showing me a new kind of market failure
I see people using "market failure" in weird ways lately. Just because someone thinks a use for a product isn't important, doesn't mean it's a market failure. It's actually the opposite - consumers are purchasing it at a price they value it.
Someone who doesn't really need 128GB of ram won't pay the higher cost, but someone who does need it will.
> … showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
Technically speaking, this is not a market failure. [1] Why? Per the comment above, it is the individuals that are acting irrationally, right? The market is acting correctly according to its design and inputs. The market's price adjustment is rational in response. The response is not necessarily fair to all people, but traditional styles of neoclassical economic analysis de-emphasize common notions of fairness or equality; the main goal is economic efficiency.
I prefer to ask the question: to what degree is some particular market design serving the best interest of its stakeholders and society? In democracies, we have some degree of choice over what we want!
I say all of this as a person who views markets as mechanisms, not moral foundations. This distinction is made clear when studying political economy (economics for policy analysis), though I think it sometimes gets overlooked in other settings.
If one wants to explore coordination mechanisms that can handle highly irrational demand spikes, you have to think hard. To some degree, one would have to give up a key aspect of most market systems — the notion of one price set by the idea of “willingness to pay”.
[1] Market failure is a technical term within economics meaning the mechanism itself malfunctions relative to its own efficiency criteria.
Or not cause inflation, rising cost of living, etc. People said the same about crypto GPUs, but it never really happened in the end. Those cheap pre-LHR RTX cards never really entered the picture.
OpenAI appears to have bought the DRAM not to use it (they are apparently buying it in unfinished form) but explicitly to take it off the market, cause this massive price increase, and squash competition.
I would call that market manipulation (or failure, if you wish) - in a just society Sam Altman would be heading to prison.
The market failure results from those people having way more money than logic and economic principles dictate they should. A person would normally have to make a lot of good decisions in a row to get that much money, and would be expected continue making good decisions, but also wouldn't live long enough to reach these extreme amounts. However, repeated misallocation by the federal government over the last several decades (i.e. excessive money printing) resulted in people getting repeatedly rewarded for making the right kind of bad economic decisions instead.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
As someone who advocates that we only use capitalism as a tool in specific areas and try to move past it in others, I'll defend it here by saying that's not really a market anymore when this happens.
Hyper-concentration of wealth is going to lead to the same issues that command economies have, where the low-level capital allocation (buying shit) isn't getting feedback from everyone involved and is just going off one asshole's opinion.
Not even. Tulips were non-productive speculative assets; NFTs were what the tulip was. The AI buildout is more like the railroad mania, in the sense that there is froth but productive utility is still the output.
The models that actually account for the productive output of these AI tools are a tiny fraction (actually) of the mania, and can be trivially produced at massive quantity without the spend that is currently ongoing.
The big bubble is because (like with tulips back then), there was a belief in a degree of scarcity (due to apparent novelty) that didn’t actually exist.
Just like the beautiful woman buying a luxury bag she doesn't actually need: we can sit here and judge her for it, but at the end of the day it's not our money she's buying Louis Vuitton with, and we're not the one she's going home with.
Anyone who owns shares in US companies (most people here) is both "going home with" the companies involved and buying "the bags".
Not to mention all the people buying the bonds used to fund the whole AI data center buildout, which is a ton of probably pension funds and old folks planning for retirement (also probably more than a few millionaire/billionaires!).
I don't know if the term console even makes sense any more. It's a computer without a keyboard and mouse. And as soon as you do that, it's a PC. So I don't see how this makes any sense or will ever happen.
> Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
Their inventories are not what consumers use.
Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different. And the memory for GPUs is normally soldered directly to the board (and of the GDDRn family, instead of the DDRn or LPDDRn families used by most CPUs).
As for GPUs, they're also different. Most consumer GPUs are PCIe x16 cards with DP and HDMI ports; most hyperscaler GPUs are going to have more exotic form factors like OAM, and not have any DP or HDMI ports (since they have no need for graphics output).
So no, unfortunately hyperscalers dumping their inventories would be of little use to consumers. We'll have to wait for the factories to switch their production to consumer-targeted products.
Edit: even their NVMe drives are going to have different form factors like E1.S and different connectors like U.2, making them hard for normal consumers to use.
I bet that friendly Chinese entrepreneurs will sell inexpensive E1.S to M.2 adapters, and maybe even PCIe riser cards for mounting an OAM module and a bunch of fans, and maybe even an HDMI output. Good hardware won't be wasted, given some demand.
I imagine the cost is primarily in the actual DRAM chips on the DIMM. So availability of RDIMMs on the market will affect DRAM prices anyway. These days lots of motherboards come with OCuLink, etc., and you can get a U.2 PCIe card for rather cheap.
I put together a small server with mostly commodity parts.
The problem is that it is not entirely clear that the hyperscalers are buying DDR5, instead it seems that supplies are being diverted so that more HBM/GDDR wafers can be produced.
HBM/GDDR is not necessarily as useful to the average person as DDR4/DDR5
I see it a bit differently. In marketing, companies like AppLovin with the Axon Engine and Zeta Global with Athena are already showing strong profitability, both in earnings and free cash flow. They’re also delivering noticeably higher returns on ad spend compared to pre-AI tools for their customers. This is the area I’m researching most closely, so I can only speak for marketing, but I’d love to hear from others seeing similar results in their industries.
It's a bit of a shame these AI GPUs don't actually have DisplayPort/HDMI output ports, because with all that VRAM they would potentially make really good, cheap, and powerful gaming cards.
Will just have to settle for insanely cheap second hand DDR5 and NVMe drives I guess.
AI GPUs suck for gaming. I have seen a video of a guy playing Red Dead Redemption 2 on an H100 at a whopping 8 FPS! That's after some hacks, because otherwise it wouldn't run at all.
AI GPUs are stripped of most things display-related to make room for more compute cores. So in theory, they could "work", but there are bottlenecks making that compute power irrelevant for gaming, even if they had a display output.
A single machine for personal inference on models of this size isn't going to idle so high that electricity becomes a problem. For personal use it wouldn't be under load often anyway, and if for some reason you can keep it under heavy load, it's presumably doing something valuable enough to easily justify the electricity.
If you can't afford the electricity to run the model on free hardware, you'd certainly never be able to afford the subscription to the same product as a service!
But anyway, the trick is to run it in the winter and keep your house warm.
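For a rough sense of scale, here's a back-of-the-envelope sketch in Python; the idle draw and electricity rate are assumed figures, not measurements:

    # Back-of-the-envelope idle-cost estimate. All inputs are assumptions.
    idle_watts = 300            # assumed idle draw of a big personal inference box
    rate_usd_per_kwh = 0.15     # assumed residential electricity rate
    hours_per_month = 24 * 30

    kwh = idle_watts / 1000 * hours_per_month   # ~216 kWh
    cost = kwh * rate_usd_per_kwh               # ~$32
    print(f"~{kwh:.0f} kWh/month, roughly ${cost:.0f}/month at idle")

At those (made-up) numbers, idle electricity is a rounding error next to the hardware itself.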
At $DAYJOB, we have had confirmed and paid-for orders cancelled within the last week due to price hikes. One DDR5 server configuration went from ~$13k to near $25k USD in a matter of days.
We also were looking for DDR4 memory for some older machines and that has shot up 2x as well.
I mentioned this previously in a thread about node sizes and got downvoted for it, but I stand by my opinion: the rest of the world, i.e. normal people, need China to become competitive in chip manufacturing.
Without that competition, everyday consumers are going to get priced out of the market by major corporations. We have reached a point in CPU technology where newer tech is no longer automatically cheaper and faster to make; therefore, we need more competition to keep prices down.
Companies that invested in CXL got their money's worth. CXL is basically older RAM connected over PCIe. Not only are you not throwing away RAM which cannot be used with the current generation of motherboards and chipsets, but you also have a way to get a lot of slower memory for applications that don't need the best and the newest.
We've been getting increasingly fucked for years on housing prices, healthcare, food, live entertainment, etc. Consumer electronics were one of the few areas where you could at least argue you were getting more value per dollar each year. GPUs have been a mess for a while now, but now it seems like it's just going to be everything.
Wild experience building a PC today and discovering the prices are less competitive with Macs than they've ever been. Building a well-appointed gaming/production/CAD rig is suddenly very expensive with RAM, GPU, and NVMe prices being so high.
Basically every integrated circuit is exempt from retaliatory tariffs, current custom MacBook Pros are shipping from China direct: which tariffs are you referring to?
> HBM chips are now emerging as another bottleneck in the development of those models. Both SK Hynix and Micron, an American chipmaker, have already pre-sold most of their HBM production for next year. Both are pouring billions of dollars into expanding capacity, but that will take time. Meanwhile Samsung, which manufactures 35% of the world’s HBM chips, has been plagued by production issues and reportedly plans to cut its output of the chips next year by a tenth.
Somebody do the math on when we will reliably start running out of grid power. Only then will this "AI buildout" slow down. Manufacturing generators is boring, and gets far less investment than manufacturing AI servers.
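Here's a crude version of that math; the data-center load, growth rate, and spare-capacity share are loud assumptions, and only the generation total is a rough public ballpark:

    # Sketch: years until AI/data-center load exhausts assumed grid headroom.
    US_GENERATION_TWH = 4_200   # approx. annual US electricity generation
    DC_LOAD_GW = 25             # assumed current average data-center draw
    GROWTH = 0.25               # assumed annual growth of that load
    HEADROOM_SHARE = 0.10       # assume ~10% of average output is spare

    headroom_gw = US_GENERATION_TWH * 1000 / 8760 * HEADROOM_SHARE  # ~48 GW
    years, load = 0, DC_LOAD_GW
    while load < headroom_gw:
        load *= 1 + GROWTH
        years += 1
    print(f"~{years} years until the assumed ~{headroom_gw:.0f} GW headroom is gone")

Under these made-up inputs it's only a few years, which is the point: at compounding growth rates the constraint arrives fast.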
That's why it's increasingly important to find answers for how to build these models sustainably. The approach of training with HUGE amounts of data requiring HUGE infra seems to have blinded the hype-bros: they are not planning to innovate to do it at small scale.
If we're going to see retailers price-gouging on DDR5, maybe people will be willing to buy slightly older gear with DDR4 (and corresponding motherboard and CPU).
Especially for systems for which the workloads are actually bound by GPU compute, network, or storage.
2 months ago there were a load of second-gen Xeon Scalable servers on offer. Now every one of them has had the RAM stripped out and it's just the chassis on offer.
I'm still on DDR4 but I hope this price gouging will be over by the time I need to finally upgrade :( I have a Ryzen so I did upgrade to the latest AM4 generation.
I just snagged an ASRock Rack mobo (X570), a 5900X, and 128GB ECC DDR4 for $680. Felt like a steal with how memory prices are going these days, ECC to boot.
Even if production capacity wasn't shared/shifting to the higher end products (which it seemingly is), there's certainly going to be an increase in demand for DDR4 as it acts as a substitute good. Prices are already up significantly.
Gamers Nexus is reporting increasing DDR4 prices, but it’s unclear to what extent it’s driven by the DDR5 market. DDR4 production is expected to be slowing anyway given the move to DDR5.
Haven't these memory companies been caught price fixing multiple times over the years? Just how sure are we the AI bubble is the entire reason for these absurd prices?
> Just how sure are we the AI bubble is the entire reason for these absurd prices?
We're not, and the market dictates that they don't have to talk to each other to know to jack up the prices.
This RAM price spike is leading into Nvidia's reporting for this quarter: gross margins were 70 percent. It's looking like their year-over-year doubling of margins is not because they came anywhere close to shipping double the number of units.
Meanwhile if you look at Micron their gross margin was 41% for fiscal year 2025, and 2024 looks to be 24%.
Micron and its peers are competing with Nvidia for shareholder dollars (the CEO's real customer). They're jacking up prices because there is enough of the market dumb enough to bear it right this second. And every CEO has to be looking at those numbers and thinking the same thing: "Where is my cut of the pie? Why aren't we at 60 percent?"
We're now at a point where hardware costs are going to inhibit development. Everyone short of the biggest players is now locked out, and that's not sustainable. Of the AI ventures there is only one that seems to have a reasonable product, and possibly reasonable financials. Many of the other players are unlikely to be able to weather the write-downs.
So first it was bitcoin/crypto, now it's AI. PC gaming is dead at this point. I wonder if it will force developers to care about doing more with less hardware and optimize now.
Many studios don't even hire rendering engineers anymore. Much of AAA is UE5 slop. And then there is the looming AI slop, which publishers are already thinking about. I think it'll burst at some point, but it'll get worse before it gets better.
I had a simple proxmox/k8s cluster going, and fitting RAM for nodes was the last on my list. It was cheapo ol' DDR4.
Where I live, the price for my little cluster project has gone up from around $400 in July (for a 5-node setup) to almost $2,000 right now. I just refreshed the page and it's up by 20% day-over-day. Welp. I guess the nodes are going to stay with 8GB sticks for a while.
My 2022 GPU has 24GB of RAM, about 50% more than what similar money buys today. It's fucked up and I'd rather slow down my spending and see the whole market go down than get scammed by hype.
This website mentions the price increase for DDR, but AI companies use Nvidia GPUs, which probably use HBM or GDDR. So I assume the respective price increase for soldered-on memory on graphics cards is even steeper.
Semiconductor companies have been bitten in the past by scaling up production into a bubble, so of course Samsung just raises prices. When you buy DRAM, remember that you are financing oligarchs and that Stargate has lied yet again.
Who am I kidding; such a high increase means these changes are here to stay. It’s not a gradual change at all.
128GB used to be $400 in June, and now it's over $1,000 for the same 2x64GB set.
I have no idea if/when prices will come back down but it sucks.
But people do need the current production.
https://www.tomshardware.com/pc-components/storage/perfect-s...
I don't really understand why every little thing needs to be spelled out. It doesn't matter. We're not getting the RAM at an affordable price anymore.
If there is high demand for the former due to AI, they can increase its production to generate higher profits. This cuts the production capacity for consumer DRAM and leads to higher prices in that segment too. Simple supply & demand at work.
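As a toy illustration of that capacity shift (the wafer pool and the price curve are both invented):

    # Toy model: one pool of wafers split between HBM and consumer DDR5.
    TOTAL_WAFERS = 100_000

    def ddr5_price_index(wafers):
        return 1_000_000 / wafers   # invented inverse-demand curve

    for hbm_share in (0.2, 0.4, 0.6):
        ddr5_wafers = TOTAL_WAFERS * (1 - hbm_share)
        print(f"HBM share {hbm_share:.0%} -> "
              f"DDR5 price index {ddr5_price_index(ddr5_wafers):.1f}")

Every wafer reallocated to HBM shrinks DDR5 supply, and the price index climbs accordingly.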
A consumer DDR5 module has two 32-bit-wide buses, each implemented using (for example) four chips that each handle 8 bits in parallel - just like RAID 0.
An enterprise DDR5 module has a 40-bit-wide bus implemented using 5 chips. The memory controller uses those 8 additional bits to store the parity calculated over the 32 regular bits - just like RAID 4 (or RAID 5, I haven't dug into the details too deeply). The whole magic happens inside the controller; the DRAM chip itself isn't even aware of it.
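Sticking with that RAID analogy, here's a toy XOR-parity sketch; real DDR5 ECC codes are more sophisticated than plain XOR, this only shows the idea of one extra chip protecting the rest:

    # Toy XOR parity over four "data chips", in the spirit of RAID 4.
    data_chips = [0b10110010, 0b01100111, 0b11110000, 0b00001111]

    parity = 0
    for byte in data_chips:
        parity ^= byte          # this is what the extra chip would store

    # If chip 2 is known-bad, XOR of the survivors plus parity rebuilds it:
    recovered = parity
    for i, byte in enumerate(data_chips):
        if i != 2:
            recovered ^= byte
    assert recovered == data_chips[2]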
Given the way the industry works (some companies do DRAM chip production, it is sold as a commodity, and others buy a bunch of chips to turn them into RAM modules) the factory producing the chips does not even know if the chips they have just produced will be turned into ECC or non-ECC. The prices rise and fall as one because it is functionally a single market.
Each memory DIMM/stick is made up of multiple DRAM chips. ECC DIMMs have an extra chip for storing the error-correcting parity data.
The bottleneck is with the chips and not the DIMMs. Chip fabs are expensive and time consuming, while making PCBs and placing components down onto them is much easier to get into.
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
Nobody is going to do anything if they can't be sure that they'll be able to run the fab they built for a long time and sell most of what they make. Conversely fabs don't tend to idle a lot. Sometimes they're only built if their capacity is essentially sold already. Given how massive the AI bubble is looking right now, I personally wouldn't expect anyone to make a gamble building a new fab.
* Someone explained this at length on here a while ago, but I can't seem to find their comment. Should've favorited it.
"Hyperscalers" already have multi-year contracts going. If the demand really was there, they could make it happen. Now it seems more like they're taking capacity from what would've been sold on the spot or quarterly markets. They already made their money.
It's kinda sad when you grow up in a period of rapid hardware development and now see 10 years going by with RAM $/GB prices staying roughly the same.
That period of time had some benefits. Programmers learned to squeeze absolutely everything out of that hardware.
Perhaps writing software for today's hardware is again becoming the norm rather than being horribly inefficient and simply waiting for CPU/GPU power to double in 18 months.
I was lucky. I built my AM5 7950X Ryzen PC with 2x48GB DDR5 two years ago. I just bought a 4x48GB kit a month ago with the idea of building another home server with the old 2x48GB kit.
Today my old G.Skill 2x48GB kit costs double what I paid for the 4x48GB.
Furthermore, I bought two used RTX 3090s (for AI) back then. A week ago I bought a third one for the same price (for VRAM in my server).
Really wish that I could replace my old Skylake-X system, but even DDR4 RDIMMs for an older Xeon are crazy now, let alone DDR5. Unfortunately I need slots for 3x Titan Vs for the 7.450 TFLOPS each of FP64. Even the 5090 only does 1.637 TFLOPS for FP64, so just hoping that old system keeps running.
Same order, same bill of materials, 17.5K USD per unit today.
That is roughly a $5.5k increase for 768GB of DDR5 ECC memory and the four 2TB NVMe SSDs.
Upgraded by adding 64GB. Last Friday I sold the 32GB I took out for what I paid for the 64GB in July... insane.
(Including the submitter. In their comment history is "Tip: You can sell used server RAM or desktop modules through BuySellRam to recover value from old hardware." at https://news.ycombinator.com/item?id=45800881 and all of the submissions of this domain are from this user: https://news.ycombinator.com/from?site=buysellram.com )
When the AI bubble bursts we can get back to the old price
Years, or when the AI bubble pops, whatever comes first.
Similar situation with QLC flash and HDDs btw.
Actually, the textile market is pretty volatile in the US these days with Joann out of business. Pick a poison, I guess? There's little room for stability in a privately-owned world.
Usually after the companies are fined for price-fixing
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
Close a few Chrome tabs, and save some DDR5 for the rest of us. :-)
RAM uses power.
If you are working on an application that has several services (database, local stack, etc.) as docker containers, those can take up more memory. Especially if you have large databases or many JVM services, and are running other things like an IDE with debugging, profiling, and other things.
Likewise, if you are using many local AI models at the same time, or some larger models, then that can eat into the memory.
I've not done any 3D work or video editing, but those are likely to use a lot of memory.
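If you want to see where the memory actually goes, here's a quick sketch with psutil (a third-party package, pip install psutil; the output will obviously vary by machine):

    import psutil

    vm = psutil.virtual_memory()
    print(f"total {vm.total / 2**30:.1f} GiB, "
          f"available {vm.available / 2**30:.1f} GiB")

    # Five largest processes by resident set size:
    procs = psutil.process_iter(['name', 'memory_info'])
    live = [p for p in procs if p.info['memory_info'] is not None]
    for p in sorted(live, key=lambda p: p.info['memory_info'].rss,
                    reverse=True)[:5]:
        print(f"{p.info['name']:<30} "
              f"{p.info['memory_info'].rss / 2**30:.2f} GiB")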
You're welcome.
(Not at all)OpenAI saw they were falling behind while their competitors kept getting better (GPT-5 and 5.1 were progressively worse for my use case: actual problem solving and tweaking existing scripts; Claude Sonnet was miles ahead, and I used GPT only due to the lower price). Now open-weights models like Qwen3 and Kimi K2 have exceeded their capability, and you can run them at home if you have the hardware, or for peanuts on a variety of providers. Cheaper hardware like Strix Halo (and Nvidia DGX) made 128GB of VRAM achievable for enthusiasts. And Google is eating their lunch with Gemini.
All while their CFO starts talking about the government bailing them out of spending they cannot possibly fund.
Of course they will attempt to blow up the entire hardware market, so that if the AI flops they will at least be able to rent you hardware like AWS.
- the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns. Even if it squeezes out every single other sector that happens to want to use SDRAM to do things OTHER than buffer memory before it's fed into a PCIE lane for a GPU.
- I'm really REALLY glad I decided to buy brand new gaming laptops for my wife and me just a couple months ago, after not having upgraded our gaming laptops for 7 and 9 years respectively. It seems like gamers are going to have it the worst - GPUs have been f'd for a long time due to crypto and AI, and now even DRAM isn't safe. Plus SSD prices are going up too. And unlike many other DRAM users, where it's a business thing and they can to some degree just hike prices to cover - gamers are obviously not running businesses. It's just making the hobby more expensive.
There's too much group-think in the executive class. Too much forced adoption of AI, too much bandwagon hopping.
The return-to-office fad is similar: a bunch of executives following the mandates of their boards, all because there are a few CEOs who were REALLY worked up about it and there was a decision that workers had it too easy. Watching the executive class sacrifice profits for power is pretty fascinating.
Edit: A good way to decentralize the power and get better decision making would be to have less centralized rewards in the capital markets. Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed. Most market economics assumes that there's somewhat equal decision-making power amongst the econs. We are quickly trending away from that.
At least before, there was a certain common baseline derived from everyone watching the same news and reading the same press. Now they are just as enclosed in their thought bubbles as everyone else. It is entirely possible for a tech CEO to have a full company of tech workers despising the current plan and yet be constantly reinforced by LinkedIn and ChatGPT.
I remember first hearing the phrase "yes man" in relation to a human ass kisser my dad worked with in like 1988.
It's very easy to unknowingly surround yourself with sycophants and hangers-on when you literally have more money than some countries. This is true now and has been true forever. I'm not sure they're more out of touch, as much as we're way more aware.
I think you're underestimating a bit. We must implement AI because they were able to sell it so well that they got billion-dollar investors (see all the money coming from Qatar/Saudi Arabia etc.). That's a lot of money coming in that allows them to innovate/etc.
Perhaps a better approach to anti-monopoly and anti-trust is possible, but I'm not sure anybody knows what that is. Khan was very well regarded and I don't know anybody who's better at it.
Another approach would be a wealth and income taxation strategy to ensure sigmoid income for the population. You can always make more, but with diminishing returns to self, and greater returns to the rest of society.
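A minimal sketch of the "sigmoid income" shape; the cap and scale are invented parameters, an illustration rather than a proposal:

    import math

    CAP = 10_000_000     # assumed asymptotic max yearly take-home
    SCALE = 5_000_000    # assumed pace at which returns diminish

    def take_home(pre_tax: float) -> float:
        # Shifted logistic: roughly one-for-one at the bottom, flat at the top.
        return CAP * (2 / (1 + math.exp(-pre_tax / SCALE)) - 1)

    for income in (50_000, 500_000, 5_000_000, 500_000_000):
        print(f"{income:>12,} -> {take_home(income):>12,.0f}")

Low earners keep essentially everything; past the scale point, each extra dollar earned adds less and less to personal take-home.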
Don't pick a fight with people who buy ink by the barrel and bandwidth by the exabyte-second. Or at least, don't do it a month before an election.
That being said, Kamala's refusal to run on Khan's record definitely helped cost her the election. She thought she could play footsie with Wall Street and SV by backchanneling that she would fire Khan, so she felt like she couldn't say anything good about Khan without upsetting the oligarchs, but what Khan was doing was really popular.
Well, assuming they haven't revived the cartel.
This is in the right spirit but you want two things to be different about it.
The first is that the threshold for a given industry doesn't make sense as a dollar amount, it makes sense as a market share percentage. Having more than 15% market share should be a thing companies don't want, regardless of whether it's a $100 trillion industry or a $100 million one.
And the second is that taxes create a perverse incentive for the government. You absolutely do not want the government to have even more of a financial incentive to sustain and create more of the companies of that size. What you want is to have fewer of them.
So, what you want is a rule that if a company has more than 15% market share, the entire general public is allowed to sue them into bankruptcy for the offense of market consolidation. Which also removes the problem where they buy off the government prosecutors, because if they commit the offense then anybody can sue them.
Who bears the costs of this suit?
And who determines what makes for a good market share size to be the threshold?
And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (but you wouldn't know). It's a bad set of policies, IMHO.
A better way would be for gov't to increase competition by adding supply, or demand, whichever one is the bottleneck to competition. If a company such as AWS is getting a lot of market share but its profit margins are still high, then the gov't should incentivize competition by funding or giving loans to businesses that want to compete with AWS.
However, if AWS's profit margins, even at high market share, remain very low (e.g., Amazon's commerce side), then there's no need for the gov't to "step in" at all, as there would be no incentive for any competitor to enter the market due to low margins.
The goal is to not have it happen, because the company is going to see that they're only slightly below the threshold and voluntarily split themselves into smaller pieces and buy themselves a safety margin because if they don't everybody knows the lawsuits are going to vaporize them once they exceed the threshold.
> And who determines what makes for a good market share size to be the threshold?
Anything in the vicinity of 5%-15% would be fine.
> And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (but you wouldn't know).
This is extremely rare and the circumstances where it happens aren't a mystery. It's when entering the market has extremely high fixed costs but then the unit cost of usage is negligible, e.g. it costs a huge amount of money to install water and sewer but then the incremental cost of someone washing their hands is insignificant.
For those things you either have the government do them, or if it's a private company then it's a regulated utility which is completely banned from anything that even vaguely resembles vertical integration as the price of being allowed to have more than the threshold amount of market share.
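To make the water-and-sewer case concrete with toy numbers:

    # Toy natural-monopoly arithmetic: huge fixed cost, negligible marginal cost.
    FIXED_COST = 1_000_000_000   # assumed cost to build the network once
    MARGINAL_COST = 0.01         # assumed cost per unit of usage
    USERS = 10_000_000

    def avg_cost_per_user(n_firms):
        # Each competitor duplicates the fixed cost; users split among them.
        return (FIXED_COST * n_firms + MARGINAL_COST * USERS) / USERS

    for n in (1, 2, 5):
        print(f"{n} firm(s): ${avg_cost_per_user(n):,.2f} per user")

Duplicating the network only multiplies the fixed cost, which is exactly why these cases get a utility regulator instead of competition.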
> A better way would be for gov't to increase competition by adding supply, or demand, whichever one is the bottleneck to competition.
The problem is generally caused by the incumbents capturing the government and then enacting rules that inhibit rather than increase competition. That's why you need anyone to be able to initiate the lawsuit, so they can't capture the government department which is supposed to be thwarting them because then it's the entire public.
So why not solve this issue directly? Transparency, auditing, public awareness, etc. are needed to prevent regulatory capture. Public apathy is the reason why it is currently "easy" to capture regulators.
The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay for a lawsuit (which has costs) when the result doesn't net them more profit. So unless the lawsuit enables the accuser to wholesale take a piece of that company as private property from the owners - which no law currently allows nor has precedents for - why would anyone expend private money for a public good?
And in any case, I don't see the apathy going away, even if the lawsuit were free. Because currently, the same apathy is allowing regulatory capture in the first place. So solving public apathy first, and foremost, is the solution.
It's mostly easy because the people doing it are good at lying. When they create a rule it isn't called the "mandate this company's product rule" or the "increase fixed costs to lock out smaller competitors rule", it's sold as a safety measure or consumer protection or some other pretext, even though the effect is to raise costs to the benefit of the companies getting the money or exclude competitors to the benefit of the incumbents.
Or they simply don't prosecute antitrust violations, and then there is nothing to audit because there is nothing happening, meanwhile people are kept distracted with other things.
> The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay for a lawsuit (which has costs) when the result doesn't net them more profit.
It does net them more profit. The premise is that having more than the threshold amount of market share is a strict liability antitrust violation, which allows any customer or prospective customer (i.e. anyone) to sue them for it. The person who files the lawsuit would get the money, the same as someone who sues a company for pollution or fraud.
The point of letting people sue you for polluting or fraud or, in this case, market consolidation, isn't to make plaintiffs rich, it's to deter the thing you don't want companies to do. The goal isn't to have a lot of lawsuits, the goal is to have companies not want the market to consolidate and actively prevent it because if it happens they'll get sued.
> So solving public apathy first, and foremost, is the solution.
Apathy is cyclical. People don't care until the problem gets bad enough, then they care enough to demand change and make it go away for a while, then they stop caring until it gets bad enough again.
But you don't want people to have to die or get severely abused before the problem gets addressed. What you want is to change the structure of the system to prevent it from getting that bad to begin with, by making sure that the power to nip the problem in the bud (i.e. stop market consolidation at 5% or 15% instead of 50% or 90%) is held by someone who will actually exercise it, which can be accomplished by granting that power to everyone affected, which in this context is each and every member of the public.
Chip fabs used to be like book publishers; you don't have to own a printing press to be an author. Carver Mead even described his vision of the industry that way.
Nowadays you have to get your cell libraries and a large chunk of your toolchain from the fab. Of course it's laundered through Cadence and Synopsys, but it's still coming from the fab. You have to buy your masks from the fab (heck, they aren't even allowed to leave the fab, so do you really own them?). And on and on.
For the record I don't agree with the "exponential" part, but otherwise this is an underappreciated and powerful technique.
I can still make a book like that in my basement. People do this as a hobby now. You can still build chips like that in your garage. People do this as a hobby now.
These things DO NOT SCALE... you can't have 10,000 people running printing presses in their basements to crank out the NYT every day. A modern chip fab has more in common with the printer for the NYT than it does with what you can crank out in your garage.
Let's look at TSMC's plant in AZ. They went and asked Intel, "Hey, where are you sourcing your sulfuric acid from?" When they looked at the American vendors, TSMC asked Intel, "How are you working with this?" Intel's response was that it was the best they could get.
It was not.
TSMC now imports sulfuric acid from Taiwan, because it needs to be outrageously pure. Intel is doing the same.
Every single part, component, step, and setup in the chain is like that. There is so much arcane knowledge that loss of workers represents a serious setback. There are people in the production chain, with PhDs, who are literally training their successors because that's sort of the only option.
Do you know who has been trying the approach you are proposing? China. It has not worked.
https://www.youtube.com/asianometry probably the best rough and ready education you can get on the industry.
> https://www.youtube.com/asianometry probably the best rough and ready education you can get on the industry.
I would take anything from that channel regarding China with a pinch of salt.
You can absolutely manufacture a convincingly-professional, current-generation book in your basement with a practically-small capital investment.
You cannot manufacture a convincingly-professional chip (being generous: feature size and process technology from the last two decades) in your basement without a 6-7 figure capital expenditure, and even then - good luck.
The latter would have to be backstopped by taxes on individual income.
I think this is actually the long tail of "too big to fail." It's not that they're all thinking the same way, it's that they're all no longer hedging their bets.
> we have made the rewards too extreme and too narrowly distributed
We give the military far too much money in the USA.
~ themafia, 2025
(sorry)
On a more serious note the military is sure a money burning machine, but IMHO it's only government spending, when most of the money in the US is deliberately private.
Could the fintech sector be a bigger representation of a money-vacuuming system benefiting statistically nobody?
It's also the case that none of the CIA, NSA or DHS budgets show up under the military, even though they're performing some of the same functions that would be handled by militaries in other countries.
We also have "black appropriations." So the total of the spending on surveillance and kinetic operations is often unknowable. Add to this the fact the Pentagon has never successfully performed an audit and I think people are right to be suspicious of the topline "fraction of GDP" number.
At least not in the data set I use:
https://www.usaspending.gov/explorer/agency
Comparing budgets by adjusting for inflation doesn't make any sense. A budget that served a country of 218 million in 1976 would, when adjusted for inflation, serve a country of 218 million in 2026. Percentage of GDP is what you want to look at.
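A quick worked comparison; the 1976 outlays figure is roughly right, while the inflation factor and GDP numbers are rough assumptions:

    # Constant real spending shrinks relative to a growing economy.
    budget_1976 = 0.37e12              # approx. 1976 federal outlays, USD
    cpi_multiplier = 5.4               # rough 1976 -> mid-2020s inflation factor
    gdp_1976, gdp_now = 1.9e12, 30e12  # rough nominal GDPs (assumptions)

    adjusted = budget_1976 * cpi_multiplier
    print(f"1976 budget, inflation-adjusted: ${adjusted / 1e12:.1f}T")
    print(f"As a share of 1976 GDP: {budget_1976 / gdp_1976:.0%}")      # ~19%
    print(f"Same real budget vs today's GDP: {adjusted / gdp_now:.0%}") # ~7%

The same real budget that was ~19% of GDP then would be ~7% of GDP now, which is why share-of-GDP is the meaningful comparison.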
Budget-to-GDP ratio in the US is close to 40%. (On that note, you should really consider federal + state combined rather than just federal.)
In early 1900s this same ratio was around 5-10%.
It has been increasing pretty much everywhere during the 20th century. It has made me wonder whether much of the prosperity we've seen and felt might not be a result of this ever-increasing percentage. Essentially we're spending more and more and that makes it feel like we're progressing faster than we are. Eventually it's going to have to stop though and I dread what happens when we do.
https://www.cbo.gov/publication/59946
In theory I guess this creates a demand that should be satisfied by the market, but in reality it seems like when wealth is too concentrated in the hands of the few who make all the decisions, the market is unable to act.
But yeah, in the end companies behave in trends; if some companies do it, then the other companies have to do it too, even if this makes things less efficient or is even hurtful. We can put that down to the human factor, but I think even if we replaced all CEOs with AIs, those AIs would all see the same information and make similar decisions based on it.
There is a Pascal's wager argument to be had: for each individual company, the punishment for not playing the AI game and missing out on something big is bigger than the punishment for wasting resources by allocating them towards AI efforts plus annoying customers with AI features they don't want or need.
> Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed.
The USA has rid itself of its barons multiple times. There are mechanisms in place, but I am not sure that people are really going to exercise those means any time soon. If this AI stuff is successful in the real world as well, then increasing amounts of power will shift away from the people to the people controlling the AI, with all the consequences this has.
It's a form of "centralized planning", except it's not centralized at all.
The only saving grace is that it can die and others will scoop up released resources.
When a country-level planned economy dies, people die and resources get destroyed.
This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
The dominant criticism of central planning is trying to set production quantities without prices. Firms (generally) don’t do this.
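In code form, with made-up linear curves:

    # Textbook profit maximization: expand while marginal revenue exceeds
    # marginal cost. Both curves are invented for illustration.
    def marginal_cost(q):
        return 2 + 0.01 * q       # rising MC

    def marginal_revenue(q):
        return 50 - 0.02 * q      # falling MR from a linear demand curve

    q = 0
    while marginal_revenue(q) > marginal_cost(q):
        q += 1                    # one more unit while it still adds profit
    print(f"Profit-maximizing quantity: ~{q}")   # MR == MC near q = 1600

Prices feed into both curves, so the firm is following the market, not a planner's quota.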
But the power concentration is a strong reason. That level of wealth is incompatible with democracy. Money is power, and when someone accumulates enough of it to be able to personally shake entire industries, it's too much.
No, it's pure capitalism where Atlas shrugged and ordered billions worth of RAM. You might not like it but don't call it "centralized planning" or "Soviet era".
We have been living on the investment of previous centuries and decades in the West for close to 40 years now. Everything is broken but that didn't matter because everything that needed a functioning physical economy had moved to the East.
AI is the first industrial breakthrough in a century that needs the sort of infrastructure that previous industrial revolutions needed: namely a ton of raw power.
The bubble is laying bare just how terrible infrastructure is and how we've ignored trillions of maintenance to give a few thousand people tax breaks they don't really need.
All the infrastructure will be useless when the data centers move to the next city/state offering a tax cut.
The British didn't industrialise India for a reason.
https://en.wikipedia.org/wiki/De-industrialisation_of_India
Is it?
This resonates deeply, especially to someone born in the USSR.
You can blame irrational exuberance, bubbles, or whatnot, but markets are ultimately individual choices times economic power. AI, crypto, housing, dotcom, etc., going back through history, all had excess because it's not obvious when to join and when to stop.
If it was a couple billion dollars of memory purchasing nobody would care.
It happens more often than you might expect.
The Onion Futures Act and what led to it is always a fun read: https://en.wikipedia.org/wiki/Onion_Futures_Act
Except that these corporations will almost certainly get a bail out, under the auspices of national security or some other BS. The current admin is backed by the same VCs that are all in on AI.
The only way the massive planned investments make sense is if you think the winner can grab a very large piece of a huge pie. I've no idea how large the pie will be in the near future, but I'm even more skeptical that there will be a single winner.
The more I dream about the possibilities of AR, the more I believe people are going to find it incredibly useful. It's just the hardware isn't nearly ready. Maybe I'm wrong but I believe these companies are making some of the largest strategic blunders possible at this point in time.
maybe AI cures cancer, or at least writes some code
The tone from the AI industry sounds more like a dependent addict by comparison. They're well past the phase where they're enjoying their fix and into the "please, just another terawatt, another container-ship full of Quadros, to make it through the day" mode.
More seriously, I could see some legitimate value in saying "no, you can't buy every transistor on the market."
It forces AI players to think about efficiency and smarter software rather than just throwing money at bigger wads of compute. This might be part of where China's getting their competitive chops from: having to do more with less due to trade restrictions seems to be producing some surprisingly competitive products.
It also encourages diversification. There is still no non-handwavey road to sustainable long-term profitability for most of the AI sector, which is why we keep hearing answers like "maybe the Extra Fingers Machine cures cancer." Eventually Claude and Copilot have to cover their costs or die. If you're Nvidia or TSMC, you might love today's huge margins and willing buyers for 150% of your output, but it's simple due diligence to make sure you have other customers available so you can weather the day the bubble bursts.
It's also a solid PR play. Making sure people can still access the hobbies they enjoy is an easy way to say you're on the side of the mass public. It comes from a similar place to banning ticket scalping or setting reasonable prices on captive concessions. The actual dollars involved are small (how many enthusiast PCs could you outfit with the RAM chips or GPU wafer capacity being diverted to just one AI data centre?) but it makes it look like you're not completely for sale to the highest bidder.
Currently we are still at the stage of extraction from the upper/middle class retail investors and pension funds being sucked up by all the major tech companies that are only focused on their stock price. They have no incentive to compete, because if they do, it will ruin the game for everyone. This gets worse, and the theory (and somewhat historically) says it can lead to war.
Agree with the analysis or not, I personally think it is quite compelling to what is happening with AI, worth a watch.
There are potentially undesirable tradeoffs and a whole new game of cheats and corruption, but you could frustrate rapid, concentrated growth with things like an increasing tax on raised funds.
Right now, we basically let people and companies concentrate as much capital as they want, as rapidly as they want, with almost no friction, presumably because it helped us economically outcompete the adversary during the Cold War. Broadly, we're now afraid of having any kind of brake or dampener on investments and we are more afraid of inefficiency and corruption if the government were to intervene than we are of speculation or exploitation if it doesn't.
In democratically regulated capitalism, there are levers to pull that could slow down this kind of freight train before it were to get out of control, but the arguments against pulling them remain more thoroughly developed and more closely held than those in favor of them.
Care to share some keywords here?
Unfortunately, that doesn't seem to be the flavor of politics on tap at the moment.
Sam Altman cornering the DRAM market is a joke, of course, but if the punchline is that they were correct to invest this amount of resources in job destruction, it's going to get very serious very quickly and we have to start making better decisions in a hurry or this will get very, very ugly.
Yeah I know HN is going to hate me for saying that.
If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Once "better served" is quantified, you know the coefficient for taxation.
Make no mistake, this coefficient will be a political football and will be fought over, just like the Fed prime interest rate. But it's a single scalar instead of a whole executive-branch department and a hundred kilopages of regulations like we have in the antitrust-enforcement clusterfuck. Which makes it way harder to pull shenanigans.
Why? That's exactly the circumstances where the mere potential for small companies to pop up is enough to police the big company's behavior. You get lower costs (due to economies of scale) and a very low chance of monopolization, so everyone's happy. In the case of this DRAM/flash price spike, the natural "small" actors are fabs slightly off the leading edge, which will be able to retool their production and supply these devices for a higher profit.
If that were true, "you're in Amazon's kill zone" wouldn't be something VC's say to startups. And yet, they do say that.
Well, assuming the scale couldn't be used for the benefit of society rather than to milk it dry. But yes, probably the best option that has a reasonable chance of success, eventually, maybe.
How so? Costs will be higher with multiple small producers, resulting in higher costs for customers. That's the opposite of "society is served better".
We draw the line at monopolies, which makes sense.
Best to nip corpos before they gain more revenue than a nation state and become "too big to fail".
And if the market crashes or takes a big dip then temporarily eBay will flood with high end stuff at good prices.
Sucks for anyone who needs to upgrade in the next year or two though!
That's basically what the rich usually do. They command a disproportionate amount of resources and misallocate them freely on a whim, outside of any democratic scrutiny, squeezing an incredible number of people and small businesses out of something.
Whether that's a strength of the system or a weakness, I'm sure some research will show.
It's a little ironic to call this a market failure due to resource misallocation, when prices being high is exactly how misallocation is avoided.
I'm a little suspicious that "misallocation" just means it's too expensive for you. That's a feature, not a bug.
I see people using "market failure" in weird ways lately. Just because someone thinks a use for a product isn't important doesn't mean it's a market failure. It's actually the opposite: consumers are purchasing it at a price that reflects how they value it.
Someone who doesn't really need 128GB of ram won't pay the higher cost, but someone who does need it will.
Technically speaking, this is not a market failure. [1] Why? Per the comment above, it is the individuals that are acting irrationally, right? The market is acting correctly according to its design and inputs. The market’s price adjustment is rational in response. The response is not necessarily fair to all people, but traditional styles of neoclassical economic analysis deaccentuate common notions of fairness or equality; the main goal is economic efficiency.
I prefer to ask the question: to what degree is some particular market design serving the best interest of its stakeholders and society? In democracies, we have some degree of choice over what we want!
I say all of this as a person who views markets as mechanisms, not moral foundations. This distinction is made clear when studying political economy (economics for policy analysis), though I think it sometimes gets overlooked in other settings.
If one wants to explore coordination mechanisms that can handle highly irrational demand spikes, you have to think hard. To some degree, one would have to give up a key aspect of most market systems — the notion of one price set by the idea of “willingness to pay”.
[1] Market failure is a technical term within economics meaning the mechanism itself malfunctions relative to its own efficiency criteria.
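To see why the textbook calls this allocation "efficient", a minimal willingness-to-pay sketch with invented names and numbers:

    # Scarce units go to whoever values them most, by willingness to pay.
    bids = {"hobbyist": 300, "gamer": 450, "startup": 900, "ai_lab": 5_000}
    SUPPLY = 2   # only two 128GB kits available (toy numbers)

    winners = sorted(bids, key=bids.get, reverse=True)[:SUPPLY]
    clearing_price = bids[winners[-1]]   # lowest winning bid clears the market
    print(winners, clearing_price)       # ['ai_lab', 'startup'] 900

Efficient by the price criterion, even though the hobbyist and the gamer walk away empty-handed, which is exactly the fairness gap noted above.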
It’s a classic ‘tulip bubble’.
There still isn't a clear path to profitability for any of these AI products and the capital expenditure has been enormous.
Wrong. E1.S and U.2 drives are still just NVMe over PCIe like every other modern SSD form factor.
Hate this AI timeline.
I picked up 32GB (2x16GB) DDR4 (CMK32GX4M2E3200C16) last September for $55. Now it's $155.
They’ve all pretty much 5x’ed YTD. That’s completely wild.
Not only is it impossible to build that much power generation on those timelines, it's also not possible to build enough GPUs to fill a purported tripling of US datacenter capacity. What's the ROI on giant empty warehouses full of empty server racks and no electricity?
They can afford to pay more.
Fabs are not wasting their time on DDR4 now.
https://m.youtube.com/watch?v=9hLiwNViMak
The music will stop, the question is when.
Enjoy your number-goes-up fad.
But for real, that sucks. The alternatives (much older, used RAM) may not be very attractive, depending on what you're doing.
They put ads in the refrigerators. Never buy Samsung anything ever again.
That includes everyone who works in supply chain at big tech. Permanent total boycott.
https://openai.com/index/samsung-and-sk-join-stargate
The Samsung announcement contains no reference to scaling up production:
https://news.samsung.com/ca/samsung-and-openai-announce-stra...