A recent (last year?) Windows update moved a lot of people's Documents folder into OneDrive, without asking. In some cases people lost data, in others it was a nuisance as all sorts of embedded or saved paths broke.
Yup, they managed to get my Dad on this. When he uninstalled it, it gives you a small warning ("oh, you might lose some data") when data loss is 100% guaranteed if you were over their 5 GB limit.
Absolutely insane, glad I had Backblaze to restore from, but it even remapped his Documents and other home folders to a new place, so using restore didn't immediately make the files appear.
> A recent (last year?) Windows update moved a lot of people's Documents folder into OneDrive, without asking.
Something similar happened to my Dropbox on Mac: the change to the macOS File Provider API meant that a lot of my locally synced files became cloud-sync only. Not cool when I was expecting limited internet access.
I still have no idea how to redownload and keep everything local while synced with cloud.
There are plenty of reasons to criticize OneDrive, and I personally would not use it. But I think comparing it with a vibe-coded, self-hosted weekend project is a bit of a stretch.
I think many people are just looking at the outcomes. "Slop" is now being retroactively applied to human-produced software as well, as a new adjective for software generally.
There is also a sentiment/understanding that Microslop has been pushing their devs to do the AI thing, and that it is resulting in more downtime and bugs. This is not limited to one company; it's more an artifact of the hype cycle. If anything, it's worse with the corporate product, because there should be more checks and balances, but here we are.
How can I practically verify 2TB of a life's worth of files while guaranteeing I won't have data loss due to some edge case or race condition that deletes my data?
Every time I've created my own backup script I realized knowing what to delete and when is not easy. IMO the practical solution to this is to just pay for more storage (within reason).
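For what it's worth, the "what to delete and when" logic gets much less scary once it's written down as an explicit retention policy instead of ad-hoc deletes. Here is a minimal Python sketch, assuming date-stamped snapshots; the function name and the keep-7-daily/4-weekly policy are illustrative, not from any particular tool:

```python
from datetime import date, timedelta

def snapshots_to_keep(snapshot_dates, today, keep_daily=7, keep_weekly=4):
    """Decide which dated snapshots survive: every snapshot from the last
    `keep_daily` days, plus the newest snapshot of each ISO week for
    `keep_weekly` weeks before that. Everything else is safe to delete."""
    keep = set()
    daily_cutoff = today - timedelta(days=keep_daily)
    weekly_cutoff = today - timedelta(weeks=keep_weekly)
    newest_per_week = {}
    for d in sorted(snapshot_dates, reverse=True):
        if d >= daily_cutoff:
            keep.add(d)                    # recent: keep every day
        elif d >= weekly_cutoff:
            week = d.isocalendar()[:2]     # (year, ISO week number)
            newest_per_week.setdefault(week, d)
    keep.update(newest_per_week.values())
    return keep
```

The point is that deletion becomes a pure function you can unit-test against synthetic dates before ever pointing it at real data.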
How can you guarantee you'll have access to your 2TB Google Drive when they ban your Google account for breaching terms or accidentally tripping a circuit breaker across one of their offerings?
This is unironically why I do not depend on Google for products this important. I do have premium Google Drive, as I needed barely over 15 GB, but my main cloud storage is Dropbox. A YT comment I made 10 years ago can't break Dropbox's TOS, and since premium storage is their whole business, they will take the product more seriously.
I also have a 14TB RAID 5 NAS at home. And my Desktop PC has 6TB of RAID 5 (had that first, mostly used for video games these days).
A backup is not something I fear of losing access to because by definition it’s a copy.
However, I am more afraid of my data being exfiltrated, and IMO there is more risk of that with a "vibe coded, 1 person, 1 week old" app than with any of the major providers.
> How can I practically verify 2TB of a life's worth of files while guaranteeing I won't have data loss due to some edge case or race condition that deletes my data?
Same with literally every other backup software. Have two, and test restorations regularly. It's not easy, but nothing worth it when you need it ever is.
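"Test restorations regularly" can be partly automated: hash the live tree, restore the backup into a scratch directory, and diff the two manifests. A rough stdlib-only sketch (the function names and policy are mine, not any backup tool's API):

```python
import hashlib
from pathlib import Path

def manifest(root):
    """Map each relative file path under `root` to a SHA-256 of its contents."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def diff_manifests(live, restored):
    """Return (paths missing from the restore, paths whose contents differ)."""
    missing = sorted(set(live) - set(restored))
    corrupted = sorted(p for p in live if p in restored and live[p] != restored[p])
    return missing, corrupted
```

An empty diff doesn't prove the backup is good, but a non-empty one reliably proves it's bad, which is the cheap half of the test.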
I will never use OneDrive, even if they paid me. I use Dropbox + a homelab NAS with RAID 5 + an old desktop PC with a RAID 5 array. I have a lot of RAW photos to keep.
I'll never forget when my Grandpa died 20 years ago, the first thing my dad did - even before telling us - was look for photos. His siblings did the same and they came up with a collage of around 30 photos I had never seen before that gave me a small glimpse of the highlights of his life.
My other grandpa controversially used a big chunk of their wedding money on a good camera. They traveled the world and lived abroad for several years right before and after my mom and aunt were born. Because of this, we are all able to see such a fascinating and meticulous glimpse into their lives. Each photo tells a story, even if the story is boring, and I really appreciate the small details. Even random pictures of cars that my grandpa thought were cool. Or the mean guard dog they had in Taiwan while it was still a puppy. Or my mom on the Trans-Siberian Railway in the middle of the Cold War.
These stories and my own appreciation of photography have made me realize how valuable every photo I have is, and I'm willing to put in effort to save them. When I'm old and dying of dementia, I'll be able to look back at my life in incredible detail one last time. Even the dumb memes I decided to save will tell a story.
I still have a deep appreciation for living in the moment and knowing not everything should be captured, but we live in an era where I have a really good camera in my pocket at all times, and the ability to store all those photos forever cheaply.
The selling point of Dropbox/Google Drive isn't the storage itself, but the apps for mobile and desktop operating systems that deeply integrate it into the OS, so it's just like a local folder that's magically synced.
So it's a cool project, but not really what I'd say is a Dropbox replacement.
That’s only an issue if you use Dropbox for sharing with non-Dropbox users, rather than for syncing files across devices and accounts, and having an extra versioned copy in the cloud.
Right - you pay for the GUI and the well-balanced user experience.
It's less about, strictly speaking, the storage.
Which is, in the end, true of a lot of tools where the underlying 'things' aren't particularly spectacular but rather it's the user experience that sells it
Yep, I use rsync to sync files/directories between my desktop, laptop, and even my phone (Android). Also to an external drive.
I ended up creating https://github.com/nickjj/bmsu which calls rsync under the hood but helps you build up a valid rsync command with no surprises. It also codifies each of your backup / restore strategies so you're not having to run massively long rsync commands each time. It's 1 shell script with no dependencies except rsync.
Nothing leaves my local network since it's all local file transfers.
Except that on macOS it uses the File Provider framework. So files that are rarely accessed get evicted from your local storage and synced back automagically when you access them, saving space on your disk, because on a Mac you can't upgrade your SSD without a soldering iron.
> but the apps for mobile and desktop operating systems that deeply integrate it into the OS, so it's just like a local folder that's magically synced
Which mobile OS would that be?
The big reason I stopped being excited about cloud storage is that on mobile, from what I can tell, none of the major providers care about the "folder that syncs" experience. You only get an app that lets you view remote storage. The only proper "folder that syncs" I've had working on my phone so far was provided via Syncthing, but maintaining that turned out to be more effort than my tiny attention span can afford these days.
On iOS, Dropbox integrates with the Files app. Since that was added a couple of years ago, I rarely have to open the Dropbox app itself. About the only time is when I want to restore an earlier version of a file.
You can also mark complete folders as “Make Available Offline”, which will keep their contents updated, though I don’t really use that personally.
The biggest benefit of the iOS Dropbox app is being able to search through the contents of all files. When accessing via the Files app, that is not possible, unfortunately.
Wow, I’m surprised. Of all my self-hosting solutions, this needs the least maintenance for me. I recently had to move to a fork of SyncTrayzor because a new project picked up support, but that’s the first time I’ve had to think about it in years. Wish Nextcloud and Immich were that easy!
I'm using iOS and macOS. On macOS I have the folder that syncs experience (I'm using Synology Drive, but Dropbox works the same way), on iOS I have the "browse remote files" experience but I can pin files I always want to keep available which is what I want.
Right. It's similar to Windows and Android experience. Thing is, for the latter, I don't want to "pin files I always want to keep available" - I want them to actually exist as files in the shared storage, so other apps can operate on them as if on regular files.
(Unfortunately, both mobile platforms themselves are actively fighting this approach, and instead encourage apps to keep data in private databases and never expose them as actual files.)
I get the pain point, but databases are a much better data model for multi-device, intermittently-connected sync. Filesystems just aren’t designed to be async and conflict-resolving.
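To make that concrete: sync engines typically keep a small database recording, per file, the last state both sides agreed on, so a conflict is detectable as "both sides diverged from the common base". A toy three-way check (illustrative only, not how any particular product stores its state):

```python
import sqlite3

def sync_action(base_hash, local_hash, remote_hash):
    """Three-way comparison of one file against the last synced state."""
    if local_hash == remote_hash:
        return "in_sync"
    if local_hash == base_hash:
        return "pull_remote"   # only the remote side changed
    if remote_hash == base_hash:
        return "push_local"    # only the local side changed
    return "conflict"          # both diverged: needs explicit resolution

# The per-file base state is exactly the kind of thing a filesystem has
# no slot for, hence the database:
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sync_state (path TEXT PRIMARY KEY, base_hash TEXT)")
```

Filesystems only give you the current bytes and an mtime; the "base" column is what turns "the files differ" into "who changed what".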
My only major complaint with gdrive on Mac (besides Apple and Google themselves, but I have to deal with them for work) is that you can't set the storage folder to an external location like you can on Windows. I don't want to be constantly loading/unloading media on my internal storage, but I don't have a choice without janky workarounds. It's a very frustratingly "Apple" thing to do.
Free, open source, works on computers and phones, can in most cases punch through NAT, and supports local discovery (LAN, multicast).
No googles, no dropboxes, no clouds, no AI training, no "my kid likes the wrong video on youtube, now our whole family lost access to every google account we had, so we lost everything, including family photos", just sync!
This is my go-to solution for code sync across a macOS laptop, Windows VMs, and Linux VMs to build and run/debug across environments. Unless something has changed, excluding build artifacts was always an issue with cloud sync providers.
Lately I have been doing more cross-compilation on macOS, then copying and running on those other machines for prototypes, but for IDE-based debugging it's great to edit locally or remotely and get it all synced to the target machine in seconds.
The only issue I have, with this amazing piece of software that I heavily use across multiple devices, is management of sync failures and exclusions via the UI. I have been using it for long enough to know the tips and tricks but it would be great for the web UI to allow easy management of conflict issues and the ability to mark files/folders as exclusions in a friendly manner.
That's a very typical HN/developer response, one that often underestimates the effort and overstates the interest people outside of tech/self-hosting circles have in a self-rolled solution. (https://news.ycombinator.com/item?id=47674176)
Given how many fuckups sync has had over its lifetime (at one point it basically asked for a re-login every day; at another it just corrupted data / didn't finish syncing), no.
Cool project, but calling it a Dropbox replacement is doing a disservice both to Dropbox and to people who might actually use this for important files. Dropbox is a sync engine; this is a web UI over object storage. Those are fundamentally different products solving fundamentally different problems.
Same for Drive, which has the office suite included in people's minds.
I'm definitely looking for an alternative. I'm taking Proton for a test drive this week because they have office tools, which I didn't know. I assume they're not as good, but hopefully good enough for my usage.
Not picking on you specifically, but I wonder how many people could roughly draw a cardboard box template correctly. It is an easier object than a bicycle, which people routinely have issues drawing, too.
Generally, there are no "missing pieces" in a cardboard box.
Fair, that template resolves to a box, but it's missing stuff like tabs to make the bottom properly stick, and it's probably not optimal in its use of cardboard. Also, it was designed in a minute in draw.io to make a stranger on the internet chuckle, so lots of constraints to fulfill.
Yes, and they have features like soft delete by default, with hard delete after X days, which makes it a very compelling backup choice (a guard against malware and mistakes). I'm a satisfied customer.
A family member has uploaded a backup of all of the family photos to Amazon's Glacier storage, on the order of a few hundred GB, and gleefully sends me screenshots of the <$1/mo charges.
AWS Glacier is cold storage, for things like legally mandated retention that you never need to access, or for humans, say digitizing your grandma's 35mm slides. It's not the same use-case as typical file backup, with performance that's probably not acceptable if you want a file (or even a listing) <now>. Good rule of thumb: Glacier is for things that you might need but ideally will never access again.
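The sub-dollar bill is plausible arithmetic, not a billing glitch. Using published list prices as I recall them (Glacier Deep Archive around $0.00099/GB-month vs S3 Standard around $0.023/GB-month; verify current pricing, and note retrieval adds per-GB and per-request fees on top):

```python
gb = 300                 # "a few hundred GB" of family photos
deep_archive = 0.00099   # assumed $/GB-month list price
s3_standard = 0.023      # assumed $/GB-month list price

print(f"Deep Archive: ${gb * deep_archive:.2f}/mo")  # well under a dollar
print(f"S3 Standard:  ${gb * s3_standard:.2f}/mo")
```

The ~20x gap is exactly the "might need it, hopefully never touch it" trade-off described above.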
Hah, wow. A post with an ID under 10k. Meanwhile this one is over 47M.
I didn't realize I've been reading HN nearly its whole existence. For all my complaining about what's happened to the internet since those days, HN has managed to stay high quality without compromising.
Instead, the reply/rebuttal almost always comes from a new person. It makes for nicer reading when you have 6 people in an argument keeping each other honest vs. 2.
Every so often someone is like, "Dropbox isn't that hard. Look at this amazing ZFS/whatever! So simple." Yeah, I keep paying Dropbox every year so I don't have to think about it. I shoot a sync off to Backblaze every once in a while.
I dislike Dropbox for reasons that aren't technical, but the big thing for me is that I want either E2EE, or control/ownership of where my data is stored. These are my personal files (no, not that kind of personal), I'm not just going to scatter them on the internet.
My solution so far has been NextCloud, but I'm getting pretty fed up with it. But not enough to actually do anything about it... yet.
> I dislike Dropbox for reasons that aren't technical, but the big thing for me is that I want either E2EE, or control/ownership of where my data is stored.
You could run something like Cryptomator on top of Dropbox:
I do agree with you at a philosophical level. I have worked in infosec long enough to know. I am pretty careful with what I upload. It's just hard. Every little home-hosted thing eats your time and takes effort. Even the "easiest" solutions have a real human cost if you are hosting them yourself.
My solution is… I have no fucks left to give about it. I haven’t for a long time. It works. My family will have all our photos and valuable sentimental data preserved. I keep a local backup. I spend my time on other stuff more valuable to me. If dropbox and backblaze disappear tomorrow oh well. If all my data is leaked? Knock yourself out. All the good stuff is in encrypted volumes.
My data has been breached 12 times by my count. /Twice/ by the OPM itself as I had a security clearance. DOGE goons have all our data and walked out with USB drives with all of it.
Equifax and credit agencies are a joke. My threat model is simple, nothing you can blackmail me with goes to the cloud. And that’s that. Breach me, try to hack my finances, try to steal my crypto. I have had my SIM transferred and someone unsuccessfully attacked my crypto accounts with it.
I love Dropbox, I pay annually. I use the open source client https://maestral.app/ on the Mac for workstation use, but also integrate other systems with my Dropbox account using their API. If someone built an open source Dropbox server that sat on top of S3 compatible storage, I would not only use it, but pay to have that optionality to get out of Dropbox if they ever enshittify. I can recognize form and function worth paying for ("value"), but still want an exit plan. It's not about the spend, it is about data sovereignty. This is colloquially referred to as “vendor and third party risk management.”
At the risk of a comment that doesn't age well: for most people on HN, I would definitely look into just using rclone. It also has a GUI for people who want that. rclone is mind-blowingly good. You can set up client-side encryption (so the object storage never sees the data, or even the filenames) and it's seamless. I'm a huge fan.
To be fair, I can't remember the last time I needed Dropbox or Google Drive, but I do use iCloud, since it comes with plenty of storage for my family plan. I don't send anyone files like back in the day where people would send me a Dropbox link and I'd send them one back.
The critical part of Dropbox is not just the storage layer but a combination of their client and server. Even small things like how do you handle conflicting writes to the same file from multiple threads, matter a great deal for data consistency and durability.
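A small example of that client-side logic: on a conflicting write, a mature sync client generally keeps both versions rather than silently dropping one, renaming the loser along the lines of Dropbox's "conflicted copy" files. A sketch of just the renaming step (the exact format Dropbox uses may differ; this is illustrative):

```python
from datetime import date
from pathlib import PurePosixPath

def conflicted_copy_name(path, host, when):
    """Name for the losing writer's version, so no data is discarded."""
    p = PurePosixPath(path)
    stamp = when.isoformat()
    return str(p.with_name(f"{p.stem} ({host}'s conflicted copy {stamp}){p.suffix}"))
```

Getting even this right under concurrent writes (what if the conflicted name already exists?) is the kind of edge case that makes a battle-tested sync engine worth paying for.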
A lot of the backend bucket providers can handle file versioning.
I too would like the answer to this concern because the features page doesn’t mention it. I want to be able to handle file version history.
I’m currently using Filen which I find very reasonable and, critically, it has a Linux client. But I wish it was faster and I wish the local file explorer integration was more like Dropbox where it is seamless to the OS rather than the current setup where you mount a network share.
Features I'm guessing this is missing (in no particular order): Recycle bin, Sharing with permissions, Editor, Versioning, Search, Partial sync, zone redundancy/backup, Windows, Android, Mac, and iOS clients
There was a similar program I used when I was working at a smaller company, where I built an automated backup system for their call-center recordings. It basically mapped an S3 bucket to a Windows drive letter, since the PBX call-recording software was running on a Windows server. It was a while ago, so I can't recall the name of the program.
I think the idea is that any S3-compatible API endpoint can be used. The code also clearly supports Backblaze and, more importantly, local blob storage.
FWIW, 2TB on R2 is $30/month and Wasabi is $14/month; both support S3, and neither has egress fees. Backblaze is $10/month for 2TB but has an egress quota after which there are fees, so Backblaze is the same cost as Dropbox, minus the egress quota. If Dropbox works for you, there's no reason to switch.
Just saying, this is not really fair: it's not like you use the whole 2TB, so you shouldn't compare it to a 2TB bucket. Most of these plans have limits to prevent abuse, but they're well beyond the "I need to care" level.
Maybe you use 1TB, maybe just 10GB. As a user of this site, I expect you know that a 10GB plan and a 1TB plan won't be priced that differently.
It's pretty magical. It nails the "online" vs "cloud only" paradigm via the SeaDrive client. I have it running on my file server, and now all my machines have access to terabytes of storage with local performance, since it can cache a subset of your content locally.
And since I can run the server on my LAN, the throughput is way better than Dropbox would be too.
If you are not ready to trust a vibe-coded app with your whole digital life, I recommend Filestash[1], an easy-to-install self-hosted frontend for virtually any type of storage. Written in Go, it can be extended with plugins and uses a local SQLite database.
I am using it with Hetzner Storage Box[2], which is insane value for money at 11 euro per 5 terabytes per month.
But you have to pay for your own S3 bucket as well... and it's generally several times more expensive per terabyte, though this depends on various factors. (Not to mention you might still have to pay Google if, e.g., your Gmail doesn't fit into the free tier anymore.)
If this is supposed to be financially motivated, the creator seems to have it somewhat backwards.
That is the feature that gives you your drive as a mounted file system that streams files as you need them.
It gives me the ease of having access to a giant amount of files stored in my gdrive without having to worry about the space they take up locally nor moving files up and down.
Actually, what solutions to that might already exist? I don't really use the web UI of gdrive as much as use it as a cloud disk drive.
Funny, I kept getting this "you are using 98% of your storage space" message with my 15GB, so I'd be finding/deleting old attachments. Eventually I was like, fine, I'll pay; it's like 48 cents a month or something.
Dropbox is a lot more than file storage. The syncing itself has been through serious tests to characterise its behaviour. Sure, some may not like the decisions taken to direct its sync behaviour one way or another but at least all these are known through formal testing.
I used to be excited by these kinds of tools; I love to self-host stuff. When I clicked on the link, I hesitated, suspecting "maybe it's LLM-generated". And sure enough, coming back to HN, the description says it is.
File sync can't be that hard! Then the first 3-way conflict arrives and everything explodes.
Don't misunderstand me, this is a cool idea. But if your turnaround time between ideating a project and pushing it to HN is a week, you don't understand the problem space. You didn't go through the pain of realizing its complexity. You didn't test things properly with your own data, lose a bunch of it, and fix the issues, or realize it was a bad idea and abandon it. I have no guarantee you'll still be there in a month to patch any vulnerabilities.
Not that any open-source project has had these kinds of guarantees until now, but the effort invested in getting to that point was at least a secondary indicator of who built it, their dedication, and their understanding of the space.
I bought a $35/mo 16TB server from OVH. I am running 2 replicas of Garage, one on this server. I am using this for backups for now, but I will probably also move my Nextcloud files and websites there. This is fine for now, and less pricey than any S3 provider I was able to find.
I was fishing for a nice price on a server like that for a few months.
Truth be told, I saw something similar but even cheaper last year, and I should have bought it then for $20, but I thought I didn't need it. Now those servers are more like $60/mo.
I am already using almost 4 TB just for backups, plus 2 TB for all of my files on the NAS.
As I wrote above, I will most probably move my Nextcloud instance's storage there, so I will already be using 6 TB of storage. With Garage's built-in instant replication between nodes, I should have an instant backup of my files.
Right now I do not have time, but it would be nice to move the storage of all my services there, so in case of trouble with one server I could instantly spin them up on another machine by mounting the S3 storage. Performance probably won't be great, but if the main machine goes down I will still be able to run, for example, my home automation on some secondary box without much hassle.
Anyway, having a dedicated server and backup storage for $30/mo does not seem unreasonable.
I love it! How would you position yourself relative to existing OSS products in the space like Filestash or Seafile? I'm trying to pick a solution at the moment, and the mobile experience matters a lot to me.
The thing I find interesting about apps made by Claude et al. is that they always fall back to using dotenv for configuration. I thought dotenv was on its way out! Personally, I've been using SOPS for this purpose.
Looks like a good lightweight solution for putting a front end and auth in front of object storage. One suggestion: add the license to the repo. The readme says "License: MIT", but there's no license file.
Feels like this is missing some of the key points of using generic bucket storage for me:
1. Archive pricing for really large old documents.
2. Cross-provider backups; especially for critical documents.
OK, I'll look at it later, but please use GitHub's "Releases" feature. It is the easiest way to tell your users that a new release is out, and GitHub can even send notifications. Thanks.
Samba, rsync, sshfs, even FUSE filesystems... but there's seemingly nothing that can keep your files yours across your own devices without extensive hacking/setup, suspicious EULAs, MitM exposure, etc. We can build it, but normies can't.
For all the people pooh-poohing this, I'm very interested in this business model (bring your own provider token) and this looks to be nicely done. I'm going to try it out. In particular I want to see if it supports payload encryption for the data in S3. I'd need that to be comfortable stashing all my personal data in AWS or Wasabi.
I use a mini PC with small SMB shares (less than 1 TB). This thing is on 24/7 but runs energy-efficiently.
When it's time to move data, I copy it to a Synology NAS that holds lots of TBs. That's also when I back up the really important stuff, which goes to a Hetzner Storage Box.
This is in Go, exposes both webdav and SFTP servers, with user and admin web interfaces. You can configure remotes, then compose user space from various locations for each user, some could be local, others remote.
I use the archive storage class on Google Cloud to store old movies, wedding videos, and pictures of old vacations.
For everything else I use a paid OneDrive subscription.
The biggest problems with S3-like storage are the user interface and predictable pricing: remember, you also pay for data retrieval and other storage API calls, whereas with Dropbox etc. you pay a fixed amount. Every year or so I roll data over into the bucket.
What do you think Windows ME stood for? Miserable Experience.
https://v2.spacedrive.com/
...most of the software industry is one rung above a back-alley smack dealer when it comes to the kind of business they run.
Most software developers are bartering for food and shelter. They're not curing cancer or building an essential new bridge for a community.
Yup, pay more but get 2 providers.
They too thought they were storing important history. Only for their heirs to bin their stuff in order to focus on their lives.
Be less needy. No one cares anyway.
FYI, a filesystem is a database too.
And ACID SQL databases ain't much better in that domain, from my perspective.
(not affiliated, just really love the software)
https://fev.al/assets/2026/Carboard-bank-secure-box-template...
1 TB is roughly 20-30 USD per month at AWS/GCP for storage alone, plus traffic and operations. R2 is slightly cheaper and includes traffic.
Compare that to, e.g., a Google AI plan where you get 5 TB of storage for the same price (25 USD/month), with Gemini Pro thrown in.
https://news.ycombinator.com/item?id=9224
https://cryptomator.org/
It even has (paid) iOS and Android apps for mobile access.
Good luck to everyone with my data :)
(25+ years in tech, ymmv)
I too would like the answer to this concern because the features page doesn’t mention it. I want to be able to handle file version history.
I’m currently using Filen which I find very reasonable and, critically, it has a Linux client. But I wish it was faster and I wish the local file explorer integration was more like Dropbox where it is seamless to the OS rather than the current setup where you mount a network share.
I'd rather control the whole stack, even if it means deploying my own hardware to one or more redundant, off-site locations.
Edit: Are there robust, open source, self-hosted, S3-compliant engines out there reliable and performant enough to be the backend for this?
But then you still need a bazillion dependencies and a db just to manage files already on your filesystem.
How much on S3? A LOT more.
Maybe you use 1TB, maybe just 10GB. As a user on this site, I expect you know that a 10GB plan and a 1TB plan won't be priced that differently.
https://www.seafile.com/en/home/
It's pretty magical. It nails the "online" vs "cloud only" paradigm via the SeaDrive client. I have it running on my file server, and now all my machines have access to terabytes of storage with local performance, since it can cache a subset of your content locally.
And since I can run the server on my LAN, the throughput is way better than Dropbox would be too.
I am using it with Hetzner Storage Box[2], which is insane value for money at 11 euro per 5 terabytes per month.
[1]: https://github.com/mickael-kerjean/filestash [2]: https://www.hetzner.com/storage/storage-box/
But you have to pay for your own S3 bucket as well... and it's generally several times more expensive per terabyte, though this depends on different factors. (Not to mention you might still have to pay for Google if your e.g. Gmail doesn't fit into the free tier anymore.)
If this is supposed to be financially motivated, the creator seems to have it somewhat backwards.
Doesn't that immediately compromise the whole purpose of the project?
Moreover, any reasonably adequate dev would work on expanding the Syncthing ecosystem, not reinventing the wheel.
Feature request: Google Drive for desktop.
That is the feature that presents your drive as a mounted file system and streams files as you need them.
It gives me easy access to the huge number of files stored in my gdrive without having to worry about the space they take up locally or about moving files up and down.
Actually, what solutions to that might already exist? I don't really use the web UI of gdrive as much as use it as a cloud disk drive.
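One existing option (an assumption on my part, not something mentioned in the thread) is rclone, which can mount a Google Drive remote as a local filesystem and stream files on demand. A minimal sketch, assuming rclone is installed and a remote named "gdrive" has already been set up via `rclone config`:

```shell
# Mount the "gdrive" remote (hypothetical name) at ~/gdrive.
# --vfs-cache-mode full streams files on demand and caches what
# you open locally, similar to Drive for desktop.
# --daemon runs the mount in the background.
rclone mount gdrive: ~/gdrive --vfs-cache-mode full --daemon
```

Unmounting is the usual `umount ~/gdrive` (or `fusermount -u ~/gdrive` on Linux).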
I'm a kid in a candy store playing around with this stuff.
Dropbox is a lot more than file storage. The syncing itself has been through serious tests to characterise its behaviour. Sure, some may not like the decisions taken to direct its sync behaviour one way or another but at least all these are known through formal testing.
"File sync can't be that hard!" Enter the first 3-way conflict, and everything explodes.
Don't misunderstand me, this is a cool idea. But if your turnaround time between ideating a project and pushing it to HN is a week, you don't understand the problem space. You didn't go through the pain of realizing its complexity. You didn't test things properly with your own data, lose a bunch of it and fix the issues, or realize it was a bad idea and abandon it. I have no guarantee you'll still be there in a month to patch any vulnerabilities.
Not that any open-source project had these kind of guarantees until now, but the effort invested in them to get to that point was at least a secondary indicator about who built it, their dedication, and their understanding of the space.
I would have considered it when rebuilding my media infra but haven't seen anything close to this
It was something like that. https://www.kimsufi.com/pl/?range=kimsufi&storage=12000%7C11...
As you can see, the price is 180% of that now for a bit more storage.
Yeah, I thought it would be the KS range, but I didn't see anything close in terms of pricing.
16TB for $35 is a no-brainer!
I'm currently migrating away from a KS because the disks are almost dead now so I had to go with another solution for TB storage.
But it was great when it was working!
Right now I don't have time, but it would be nice to move the storage for all of my services there, so that in case of trouble with one server I could instantly spin them up on another machine by mounting the S3 storage. Performance probably won't be great, but if the main machine goes down I'd still be able to run my home automation, for example, on some secondary machine without much hassle.
Anyway, having a dedicated server and backup storage for $30/mo does not seem unreasonable.
Example 2TB:
Google $10/mo vs S3 ~$45/mo?
You could get cheaper than Google Drive with Glacier tiers, but that's a different level of restrictions and still has retrieval fees.
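Back-of-the-envelope, taking S3 Standard's published first-tier rate of roughly $0.023 per GB-month as an assumption (retrieval and request fees not included):

```shell
# 2 TB = 2048 GB at ~2.3 cents per GB-month (assumed S3 Standard
# first-tier rate). Integer math in cents to avoid floats.
gb=2048
echo $(( gb * 23 / 10 ))   # monthly storage cost in cents, i.e. ~$47
```

That lines up with the ~$45/mo figure above; egress and API calls only push it higher.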
Sure, ChatGPT can help, but to use it reliably, you still need enough medical knowledge to ask good questions and evaluate the answers.
(and regarding contributors for all of his projects, it's mostly vibe-coded)
Old technology still works, even if it is old!
And so easy to set up on a home computer. Except it's not always on and doesn't come with backups.
I'm not saying S3 is where it's at but might need a bit more than just Samba. Or maybe you don't but people who need Dropbox do.
Turning on SMB is usually just a click of a button; even macOS supports it.
Any user technical enough to be able to set up an S3 bucket, Syncthing, Nextcloud or this "Locker" tool from OP can also set up an SMB share
I was responding to the above thread, where sharing files on an offline network is being discussed. Backups were not mentioned as a requirement.
But sharing a folder on my Mac with my wife’s MacBook has been a Google-diving, arcane command-line headache.
I would have thought sharing the folder, and marking ‘everyone’ for all the read/write modes would be enough. But, no.
I guess with APFS it’s a lot more fiddly. It’s not intuitive, and not in the info panel.
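For what it's worth, the arcane command-line route usually comes down to macOS ACLs rather than the classic permission bits. A hedged sketch (the user name and folder path are placeholders, not anything from this thread):

```shell
# Grant another local account full read/write on a shared folder,
# with the ACL inherited by new files and subfolders.
# "otheruser" and ~/SharedFolder are hypothetical examples.
chmod -R +a "otheruser allow read,write,delete,add_file,add_subdirectory,file_inherit,directory_inherit" ~/SharedFolder
```

The inherit flags are the part the Finder info panel never exposes, which is why "mark everyone read/write" alone doesn't stick for newly created files.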
The comment is disingenuous, though, since Locker doesn't need AWS S3 to function.
For a better alternative, run MinIO on a cloud provider of your choice, or stick with a secure option like Proton Drive.
> run MinIO
When people say "s3", they mean "any s3 compatible storage" in my experience, not "amazon s3 specifically" or just "s3 as a protocol".
I use a mini PC with small SMB shares (less than 1 TB). This thing is on 24/7 but runs energy-efficiently.
When it's time to move data, I copy it to a Synology NAS that holds lots of TBs. Then it's also time to back up the really important stuff, which goes to a Hetzner Storage Box[2].
[1]: https://en.wikipedia.org/wiki/Backup#3-2-1_Backup_Rule [2]: https://www.hetzner.com/storage/storage-box/
This is in Go, exposes both webdav and SFTP servers, with user and admin web interfaces. You can configure remotes, then compose user space from various locations for each user, some could be local, others remote.
For everything else I use a paid OneDrive subscription. The biggest problems with S3-like storage are the user interface and predictable pricing: remember, you also pay for data retrieval and other storage API calls, whereas with Dropbox etc. you pay a fixed amount. Every year or so I roll data over into the bucket.
But for infrequently accessed data it's fine.
Doesn’t require an external database (just an S3 bucket) and is a single binary. A web UI is shipping in the next few days.