I'm struggling to understand why anyone would think Google is doing this for users' privacy. I admit I haven't dug into the details, but my first reaction was that Google runs a small model on the user's side because it's doing things that _can_ be done there, and Google doesn't want to waste its own compute doing them server-side. I'm pretty sure that whatever this thing is doing, Google could easily beam up a small amount of data, have a model churn on it, and spit the result back to the user's browser. But why do all that when you can just run some small inference on the user's device?
How's this conspiracy supposed to work? A technical audience that cares about privacy isn't going to be placated by 4GB sitting on disk; it's going to want some sort of analysis (like HTTP interception), or it won't use Chrome in the first place. A non-technical audience isn't going to make the association between 4GB of disk usage and the privacy implications.
1. I've got a Chrome local model stored on my drive
2. I see a heavily promoted "AI search" box in Chrome
Natural conclusion: when I use all the promoted AI features in Chrome, it's using the local AI model. This is not true; Google is being intentionally misleading.
I suspect the type of person who is even aware of this 4GB blob is the type of person who would research its usage. Pretty high Venn-diagram overlap.
That misses the original commenter's point. He is saying that the local model only powers things where privacy isn't particularly relevant, and that creates the illusion.
I had wondered if this was actually a bug and not intentional:
> When a user downloads or updates Chrome, Gemini Nano is downloaded on demand to ensure Chrome downloads the correct model for the user's hardware. The initial model download is triggered by the first call to a *.create() function (for example, Summarizer.create()) of any built-in AI API that depends on Gemini Nano.
This makes it sound possible that some part of Chrome, or perhaps a privileged website (i.e., google.com), could be invoking `*.create()` 100% of the time? I don't actually know that this is what's going on, or even whether it's likely, mind you.
https://developer.chrome.com/docs/ai/understand-built-in-mod...
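Per the quoted docs, one `create()` call is all it takes to trigger the download, while availability can be checked without side effects. A minimal sketch of how a consent gate could look; `getSummarizer` and the `userConsented` flag are my own illustration, not Chrome code, and only the `availability()`/`create()` names come from the docs:

```javascript
// Sketch: never let create() fire the multi-GB download without explicit opt-in.
// availability() resolves to 'unavailable' | 'downloadable' | 'downloading' | 'available'.
async function getSummarizer(api, userConsented) {
  const status = await api.availability();
  if (status === 'available') {
    return api.create(); // model already on disk, no download triggered
  }
  if (status === 'downloadable' && userConsented) {
    return api.create(); // user explicitly opted in to the download
  }
  return null; // otherwise, never silently kick off a download
}
```

In a real page you would pass the global `Summarizer` object as `api`; the point is that the docs describe the first `create()` call, not the availability check, as what starts the download.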
It is also quite ironic that one of the docs pages is titled "Inform users of model download", although it goes on to talk about notifying users about download time, not necessarily about getting their consent:
https://developer.chrome.com/docs/ai/inform-users-of-model-d...
This, and the VM bundle which reappears after you delete it. They say it's for Cowork and Claude Code, but if you don't use Cowork or CC sandboxing, it has no value. It's annoying considering I'm always finding things to delete on Apple's anaemic 512GB because I run out of space.
First IE, now Chrome. What gets into these companies' heads once they have the biggest market share? And the people working for these companies: how do you sleep at night, bro?
It's crazy to me how consumer computer storage has stalled at 2010 levels for so long. If anything, we're going backwards now in 2026. We should have many TBs in our home computers and laptops; instead most users are still stuck with 256GB, playing Tetris to fit even an average amount of data.
It is nothing. This whole fiasco is being blown way out of proportion when there are a hundred other issues with Chrome that we could be complaining about.
>It's crazy to me how consumer computer storage has stalled at 2010 levels for so long. If anything, we're going backwards now in 2026. We should have many TBs in our home computers and laptops; instead most users are still stuck with 256GB, playing Tetris to fit even an average amount of data.
Well we got to the point where you can have 8TB of slow storage or 256GB of faster storage and everyone chose speed.
> Well we got to the point where you can have 8TB of slow storage or 256GB of faster storage and everyone chose speed.
In 2014-2015, $100 would get you either 3TB of hard drive or 256GB of SSD.
In 2023-2024, $100 would get you 2TB of SSD. (For a few months even 3TB.)
So yeah, everyone chose the speed option, but the speed option should have kept growing. Outside of bargain-basement models, 1-2TB should have become the minimum size.
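Quick back-of-the-envelope on the figures quoted above: the $/GB of SSDs fell roughly eightfold over those nine years, so at a constant spend the $100 tier should have grown from 256GB to about 2TB.

```javascript
// $/GB from the quoted figures: $100 bought 256GB of SSD in 2014-2015,
// and 2TB (2000GB, marketing gigabytes) of SSD in 2023-2024.
const ssd2014 = 100 / 256;   // ≈ $0.39/GB
const ssd2023 = 100 / 2000;  // $0.05/GB

// At a constant $100 spend, capacity grew ~7.8x; had that been passed
// through to prebuilts, the baseline would be 1-2TB, not 256GB.
const capacityGrowth = ssd2014 / ssd2023; // ≈ 7.8
console.log(capacityGrowth);
```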
AI companies bought up much of the NAND manufacturing capacity, limiting what's available for consumer products. These data centers also use hard drives for some of their data storage.
I reckon until the recent AI-gobbles-everything-up phenomenon, this was mainly an Apple problem. Even fairly budget PCs come with at least 1TB of storage. Considering how scary NAND pricing gets much beyond 2TB, I'm not that surprised we don't see much beyond that.
Yes, but I don't think it was just Apple. The switch to charge-trap-based SSD storage set all pre-built consumer computers back a full decade in terms of storage size. We were only just getting back beyond 2010 levels when the megacorps started buying up all the flash fab capacity, and now even most of the HDD platters are going to enterprise.
A full decade is a bit of an exaggeration, not just in terms of storage capacity but especially when you consider that switching from HDDs to SSDs was a massive leap in performance for PCs and laptops.
There's no debating the performance. Charge-trap flash makes computing so much better. It's just a shame things went SSD-only. It really isn't an exaggeration when it comes to the actual storage space available per prebuilt.
I don’t know what pre-builts you’ve seen, but when I bought two mid-range laptops five years ago, all the models had between 500GB and 1TB of storage.
And it’s not a trap when most people aren’t going to fill 5TB of storage with their accounts spreadsheets but they are going to notice the performance difference between an SSD and a HDD.
Yep. 500GB-1000GB is a 2010 level of storage. And in my experience people fill it up with photos and videos and then move on to unreliable, expensive, slow externals.
I've got a cheap Chromebook I take when traveling, with a 32GB SSD; 4GB is a huge chunk of that. Not that it matters much: it already constantly complains to me about having no space available.
https://news.ycombinator.com/item?id=48019219
And I want $1 billion dollars.
Doesn’t mean someone’s going to give it to me.
You can find more info here: https://github.com/anthropics/claude-code/issues/22543#issue...
But maybe that was an error on my end, and it's worth a second shot.
I don't want that AI crap on my computer. This is like a Trojan horse.
I always avoided Chrome as much as possible; now I have a real reason to do so.
I wonder whether Chromium-based browsers do, or will do, the same?
Related:
Chrome removes claim of on-device AI not sending data to Google servers
https://news.ycombinator.com/item?id=48050964
> "Still using” the most popular browser in the market by an absolutely huge margin?
The strawman derail notwithstanding, the answer is no. No, they do not.