I expect these comments to be full of agreement. Corporate behavior in the computer space leaves much to he desired.
I will however observe;
None of the supplied examples showed any form of network effect. It was all stuff you did at home.
Today, there are certainly options for personal computing for most everything- as long as network effects are not in play.
Those options may not be as convenient, as cheap, or as feature-rich as the invasive option. That's fair though - you decide what you want to prioritize.
Network effects are harder to deal with. To the extent that in order to be in community you need to adopt the software the community has chosen.
Not surprisingly, software producers that can build-in network effects, do so. It's excellent from a lock-in point of view.
The title of the article is perhaps then ironic. It's trivial to make computing personal. All the tools to do so already exist.
The issue is not Personal Computing. It's Community Computing.
Reminds me that back in 2001 I was suggesting the Squeak (Smalltalk) Foundation's purpose should include supporting collaborative software and the community around it -- beyond supporting just personal development. There were a bunch of interesting discussions related to that although ultimately the focus still ended up moving towards the narrower traditional Smalltalk culture idea of empowering the user rather than the community.
Example of something I wrote then:
https://lists.squeakfoundation.org/archives/list/squeakfound...
"I guess I always saw Squeak's purposes as a bit broader still, relating
more to "individual and group empowering transparent ubiquitous
computing". For example, I see Squeak concepts as providing an OS
neutral platform for various languages (Python, Lisp, Forth) [an open
source .NET] and I don't see how that is going to fit into a mission
statement that links to Squeak as is with a perception of a Smalltalk
environment only. Granted, most people joining this list may have no
direct interest in this, so that is not to say the purpose of the
organization should necessarily incorporate that, especially if it has
detrimental effect by making things too overly broad.
Here's an alternative -- anchor the effort on one side by Squeak and
have it open ended on the other. For example, "To assist the evolution
of a individual and group empowering transparent open-source ubiquitous
computing platform starting from the initial Squeak code base".
I don't think this is out of character with for example where Alan Kay
has wanted to go with Squeak in regards to a "Dynabook"."
Smalltalk (and Squeak and its derivatives) of course can empower groups like with, say, the Croquet project. What I was talking about then was mostly about emphasis around a common shared purpose (like the Chaordic Commons approach suggests) in the context of creating a formal "foundation" organization.
https://en.wikipedia.org/wiki/Croquet_Project
If I squint, it sort-of appears that JavaScript/asm.js (supporting a variety of transpired languages and running in every browser and communicating in a variety of ways) is a sort-of realization of part of that vision.
Due to the way iOS apps are sandboxed together with their user created content a lot of users have video projects that are locked into CapCut without an easy way to access them following the ban of the TikTok suite of apps. Remind me how your iPhone is yours, when your creations on your device can be locked away from you.
That's not iOS fault. Apps can store their files in a folder visible in the Files app, or can ask the user to open a file or folder from a file provider (also visible in Files app), or to save a file or folder in a file provider (always visible in the Files app).
It's not the 2011 iOS anymore, if an app today hides its video projects from the user, it's entirely the app fault.
Arguably this is still on Apple, because they don’t let you access the full filesystem as you can on other operating systems, and in particular because an app developer may rightfully want to create a class of internal-use files that are not explicitly exposed to the typical user, but would be available to users seeking them out.
I imagine, for example, that if the internal project files for a popular video editing app were accessible, we’d see competing and/or open source apps emerge that could parse them, were the original app to become suddenly unavailable. Instead they’re just lost because your phone won’t let you access them.
Blame can be shared. The OS vendor for providing a way for applications to hide files on the user's filesystem from the user, and the application for using it instead of making the user's documents available to the user. They are both working together in unison against the user.
Non-typical compared to what? It's not any better on Android, unless you root it. Google has been going out of its way to deny users access to data stored on their phone, by allowing and encouraging apps to claim sole ownership on data, as well as removing interoperability features (around which Android OS was initially designed), all in name of sekhurity.
In a pile of devices, Apples are non-typical. The number of users is not terribly relevant.
However, sure, lots of users chose Apple knowing exactly what it is. Apple's not going to change since their model clearly appeals to lots of people.
If you don't like Apple's model, then don't choose Apple devices. What everyone else chooses is somewhat irrelevant to you. (Other than network effects noted earlier.)
The two things you brush over are the most important though - and feed into each other: network effects are relevant (and very much so because they affect all sorts of things you can do with something) and they are directly influenced by the number of users, which makes them incredibly relevant. What others choose are also relevant because of these network effects.
I can hack up a "device" with a raspberry pi zero or whatever and call it "HaxyDeck" and claim it is all open to anyone who wants to tinker with it, but at the end it'd be irrelevant because only me (and perhaps a couple other people) would have it. The aspects you want to ignore (number of users, be something other than Apple, what others are using) would actually affect my use of HaxyDeck directly: since i'd be the only one (or one among a tiny number) using it, i'd be the only one having to make it do things i want, it wont have software from others, it wont support software other people may want to use for communication, software some services that theoretically have nothing to do with phones or computers (e.g. banks) wont work because HaxyDeck's userbase is irrelevant for them, etc. All of these have to do exactly with what others are doing.
Basically see how all the non-Android Linux phones (like PinePhone) are faring. You can't just ignore what effect having a large user base some platform (be it a device, an OS or even service) has and say "just use something different".
Well, I have access in Files to a lot of content from my apps - that’s a decision of the app creator to not use this and keep the content created in the locked area of the app.
For example, the apps from Omni do this, as do obsidian, Linea…
Obviously the blame lies on Apple for locking away your device's contents from you. Developers should not be able to have more control over what you can access on your device than you do. Even if they make bad choices (like making accessing the files hard) it should be you who has the final say, not them.
Apple making it possible for developers to make bad choices and go against users' control over their own devices is to blame.
If I understand what you're saying ... my music listening, magazine browsing, movie watching, are all offline these days (#fuckstreaming). I do 3D modeling in an offline app (FreeCAD), 2D "modeling" (Affinity Designer) in an offline app.
The internet is where I get ideas and news (and some of the above content — magazines as PDF for example).
So I guess the "network effect" I keep to as much of a minimum as I reasonably can?
(EDIT: oh, I don't really use my phone except as a camera and road navigator. I would love to have a completely offline map app that was decent.)
The reason they don’t grow is much more trivial - they simply have no sign up funnels and are visibly technically complicated. Every step and choice there is arcane and ideologic.
That’s it. Both Open Source and Federated thinks that distribution gateway federation something is something a user must know and be fond of. The user not only couldn’t care less but actively refuses this complexity because they cannot trust their own uneducated decisions. They go for the nearest CorpThing that seemingly just works for everyone and decides everything for them after they tap “Next” a few times.
I changed phones, and tried to log back into my Fediverse/Mastadon accounts. The app asks me which servers I'm on—I can't find the accounts in my password manager, can't figure out which servers they were, and the ones I thought I was on maybe don't exist anymore? Or were accessible in one but not another app.
So I managed to log into one of the 3 accounts I'm sure that I have still. And I'm a software nerd who makes "educated" decisions all the time around this stuff.
Protocol People really care about that, and you know what? It becomes their network effect. But it is a self-selecting network. The nature or design of what effects and attracts the network is the same mechanism for limiting its size.
TikTok, Instagram, Snapchat all focus on things that other people really care about—namely video creation, photo curation and ephemeral small network cohesion. and those focuses attract other userbases.
Probably, there's a lot more people who want to create and watch short videos than there are people who want to nerd out over what their 1/10,000 servers' community rules and protocol settings are.
You are way too nice on TikTok, Instagram, Snapchat. It's basically a giant narcissism enabler, just with different media.
The federated stuff is unsuccessful not just because of protocol stuff (if people really wanted, they would find a way) but because it's not cool yet.
The only reason people go on those networks is to try their luck at popularity and find a way to cash out in various manners. Other than that, there is not much point going on there, why would you waste time broadcasting all kinds of things you do instead of just doing more...
Which is related to the fact that OP doesn't understand that you can't fix a social and political problem with technology. Technology is always downstream of the establishment, whatever that looks like.
We only think of computing as "personal" at all because of that brief period in the 70s when very simple toy computers, just powerful enough to run a spreadsheet and play some basic games, became affordable.
But computing was invented to solve wartime problems of various kinds, including ballistic calculations, warhead design, cryptography, and intelligence analysis.
Almost immediately it moved into corporate accounting and reporting, and commercial science and engineering.
It took thirty years for it become "personal." Its roots are corporate and military, and it was never - ever - suddenly going to give those up.
Worse, a lot of open/free/etc "solutions" are built by people who like tinkering, for other people who like tinkering. That's fine when you're making an OS for a web server, but a disaster if you want technology that's open from the POV of the average non-technical user.
You can just about, now, start to imagine an anti=internet which is distributed, secure, non-corporate, and controlled by ordinary non-technical people telling smart agents what to do.
That might, just about, with many caveats (it's not hard to think of them), become a technological solution that builds a true decentralised network.
But for now we're stuck with corporate centralisation. And that's not going to be fixed by going back to 8-bit micros, or with a Linux phone.
I tried to install Element messenger and even asked several of my family members to install it for communication. With default server, it turned out to be extremely slow - like hours to send a small video. Looked into installing my own server, but the complexity scared me away, and I have 40 years of coding experience behind my belt. So we are back to whatsapp now.
I would disagree with the implication that everything has to grow, but solely on the grounds that I am not convinced human beings are psychologically mature enough as a species to be that connected with that many other humans and still retain their capacity for acting like a a good human.
The federated networks I am part of are pretty small and we have a lovely time sharing diverse interests, getting to know each other and even disagreeing sometimes without the blind hate, persistent negativity and gotch'a seeking you typically find on places like Facebook, Twitter and Reddit. Too much growth too quickly would destroy that, turning those small federated networks into another cesspool of bad behavior.
However, I am open to hearing why people disagree. My personal experience drives my opinion, so ymmv.
I'm not sure what you mean by lacking a network effect, unless you mean they don't yet have sufficient users to draw in new ones?
I personally switched to Lemmy from Reddit after the API debacle, and I've found it to be an extremely compelling platform exactly -because it was federated. I can curate my feed from hundreds of large and small instances with nary a corporation in sight! It's self-hosting as far as the eye can see, yet it has enough interesting content and discussions to keep me coming back, without any ads or algorithm trying to manipulate me.
It feels like 90's internet full of webrings, and it's glorious.
Well yes and no. There's no existing network (for something new), so certainly there's no network effect making them grow.
On the other hand the pitch to get people to join is weak. I don't pitch it to my friends because (currently) its a pretty poor experience compared to what they are already using.
I don't think the fediverse experience is poor, I rate it as superior to the walled gardens.
I don't pitch it to my friends because quantity invariably destroy quality, or at the very lease hide it behind a huge pile of dirt. I don't pitch it because people who are interested in a better internet already care and know how to find it. I don't want to ruin a nice well behaving network.
> They lack a network effect that makes them grow.
Isn't lack of fast growth a good thing? I swear, I left every social network in the two years after my mom joined.
At some point of a network popularity, it feels like there is an influx of people who want to talk to you but lack reading comprehension to read your answers.
Or maybe it's specifically that every "become popular fast" algorithm tries to repeatedly throw you to them.
Curating a corner of web for yourself takes time and effort, and if a social network popularity outpaces you, then you just can't do that.
wouldn't "growth" have to be reinterpreted for such technology? cause part of its appeal is to _not_ grow unchecked. you don't want everybody on there and you want the setup to be a little difficult. about as difficult as it was to hook up with the web in the 90s.
I'd largely agree that most of the components are there, however one thing I think that's very important but is perhaps missed with the focus on the PC is the phone.
Most people's primary, if not only, computing device is their phone - which at the same time is probably the most restricted device.
And if you wanted to build your own and connect to the mobile network - it's considerably harder than doing the same for a traditional personal computer.
I agree- though I think the problem is more that the focus of attention is not on making personal computing better, so it's withered. And some programs you could get as a buy once works offline experience are now subscription based -as-a-service
> I expect these comments to be full of agreement.
It's interesting there is a lot of agreement. In a way I'm surprised because I often get the impression a lot of people here have pretty well drunk the Kool-aid of corporatism.
It's the least surprising thing in the world! The article is a totally standard bit of left-tech activist writing of the sort that has been widely found online for decades. It used to be a staple of Slashdot, a staple of USENET and it's a staple of HN too. RMS made a living giving talks exactly like this.
What would be actually surprising is to read a full throated defense of modern tech and the companies that build it, and then see an HN thread full of agreement. It's certainly possible, I'd disagree with almost everything in the article. But the sort of people who disagree tend not to waste as much time on HN as me :)
I'm curious what you disagree with! Personally I understand the sentiment, but I'm not sure it's necessarily a bad thing that stuff get's more locked down. I've delved deep into custom roms and linux, riced my desktop and advocated for FOSS and discussed privacy concerns with friends and colleagues. But at some point you also need to work and be productive. Use the technology that's available. I need Office for my work, and I'd like to point my partner to a nice restaurant with Google Maps when we're on holiday. The Microsofts, googles and Apples of this world excel in actually delivering results. And it can be argued that that's more important than you really "owning" a device or a service.
The article is nearly pure ideology, with many statements that are just obviously wrong to anyone who isn't on the author's part of the left. For example:
"At its core, the PC movement was about a kind of tech liberty"
There was no such thing as the PC "movement". Personal computing was a market driven phenomenon in which competition drove the price of computing down far enough that people could afford to have one at home - that's it. It didn't represent any particular philosophy of society either. A microcomputer in the 80s was one of a wide mix of competing manufacturers, all of whom were much more closed than a modern computer. Proprietary hardware and software ruled the day. DRM was widely used in this era, including "hardware" DRM like code books or oddly manufactured floppy disks.
By the mid 90s IBM has fluffed its control over the PC platform, so hardware is at least getting more open and interoperable and you can "control" your device in the sense of adding more RAM or extension devices. Pretty useless to anyone who isn't a HW manufacturer but nice in terms of better enabling a free and competitive market, continuing the downwards pricing pressure.
But open source operating systems barely existed. Linux was just a few years old and most of the world was connected to the internet by a modem if at all - Windows 95 didn't even come with a TCP/IP stack so unless you happened to work at a university or other org with plentiful bandwidth, and have the time and patience to compile a lot of kernels, it was basically not possible to obtain an open source OS at all. DRM was still widespread, now with exciting things like USB dongles and garbled CD-ROMs.
The world this guy thinks existed never did. To the extent there was anything special about the microcomputer, it was that aggressive market competition made previously expensive devices cheap enough for people to buy at home. Nothing about this was a social movement though, and nothing about it came with any particular ideology of freedom or control. That's why words like "freedom" in the software context are indelibly associated not with the PC pioneers like Bill Gates or IBM but rather with RMS, who didn't develop for the PC at all. He was writing stuff like emacs and gcc for the existing proprietary UNIX big iron of the time, which were fully proprietary.
Arguably the modern computer is more open, more free and more hackable than any previous time in history. You can compile and run an open source web browser, run it on an open source OS that's booted by an open source BIOS, on an open source CPU, speaking openly documented protocols from A-Z. I don't remember any of that even being imaginable in the 80s or 90s.
The unsurprising thing is that people here think that this is left-tech activism. The true cool-aid[1] is this particular tech ideology which is all about “liberty” on the surface but is either agnostic of or embraces privatization.[2] Yeah, unsurprisingly the author explicitly embraces “the tech industry”. It’s just gone wrong or too far. It’s not like the good old privatization in the old days.
Wanting tech companies to be regulated more in this day and age of such extreme tech behemoth domination is left-wing activism in the same sense as (not being a Peter Thiel-style maniac) = left-wing.
Yes. Vast majority of computing is still under powered. Chromebook for example. Apple Silicon fanless MacBook Air only arrives in 2021. And I would argue if we want AR or latency sensitive applications our computing power is still off by at least an order of magnitude.
That’s not true in many domains where doing it on a personal computer would be either too long or too long in asfar as you are are skillful as using faster memory as cache.
video production, climate simulations, pdes, protein folding, etc etc
I agree with you; all of those needed vastly more computing than was available in a PC. If anything, the power of modern hardware has made a lot of it more available in personal workstations. Though it is true that hyped-for-the-masses personal computing devices are not optimized in that direction. You get what you buy.
The part that is especially annoying there is that it's not just about speed, but about AI tools being closely tied to a specific architecture. Lots of them only work on Nvidia cards, but not on AMD. A fallback to CPU is often not provided either. If you don't have enough VRAM a lot of them won't work at all, not just run slower.
Indeed, when a network effect can be monopolized, it'd be bad business not trying to become the monopoly.
There were of course "home computery" phenomenons with network effects: IRC and Usenet, for example. There are several reasons why they've fallen out of fashion, but corporations shepherding new users into silos is surely a big one. It's a classic tale of enthusiasts vs. the Powers That Be, although the iteration speed and overall impact is perhaps most noticeable in digital technology.
Perhaps we were naïve to think we'd be left alone with a good thing. I too hope for a comeback of "personal computing", but in every scenario conceivable to me, we end up roughly where we are now - unless also re-imagining society from first principles. And if we do that, the question is whether personal computing would have emerged at all.
Extrapolating this point outward, I don't think there is really any community computing.
Most people I know literally still to use the lowest common denominator of communications because corporates have managed to screw up interoperability in their land grabs to build walled gardens. The lowest common denominator in my area is emailing word documents or PDFs around. Same as we have been doing for the last 30 years. The network effect there was Word being the first thing on the market.
All other attempts have been entirely transient and are focused in either social matters or some attempt at file storage with collaboration bolted on the top. The latter, OneDrive being a particularly funny one, generally results in people having millions of little pockets of exactly what they were doing before with no collaboration or community at all.
If we do anything now it's just personal computing with extra and annoying steps.
And no, 99% of the planet doesn't use github. They just email shitty documents around all day. Then they go back home and stare at their transient worthless social community garbage faucet endlessly until their eyes fall shut.
Society doesn’t exist according to the extreme version of liberal ideology. We’re all just individuals with some negative freedoms. So yes, accoarding to that mindset nothing is wrong. Because we can all just individually opt out of these networks.
Yeah, but I think part of the point is people don't actually want or need network effects for a lot of things. Even where connection is needed, companies have used it to wedge in stuff that doesn't benefit users.
This article made me even more sad than I already was. I've just been reading about Bambu Lab (a leading 3d printer manufacturer, who introduced really good 3d printers a couple of years ago and really shook up the entire market) self-destructing itself and burning through all the goodwill accumulated over the years. They are working on closing down access to their printers, apparently with the end goal of locked-down subscription-based access. This is much like the path that HP followed with their printers.
I also write this on a Mac, where I'm watching with sadness the formerly great company being run by bean-counters, who worry about profits, not user experience. The Mac is being progressively locked down and many things break in the process. Yes, it is still better than Windows, where apparently the start menu is just advertising space and the desktop isn't mine, but Microsoft's, but the path is definitely sloping down.
It's just sad.
I don't know what else to write. There isn't much we can do, as long as everybody tolerates this.
My prediction, is that, in the not too far future, perhaps 20-25 years, with the "blessing" of national security, ads business and other big players, devices will be further locked down and tracked, INCLUDING personal computers.
A lot of people already don't own a computer nowadays, except for the pocket one. In that future PCs, if they still exist, perhaps are either thin clients connecting to the vast National Net, where you can purchase subscriptions for entertainment, or completely locked down pads that ordinary people do not even have the tool to open properly. Oh, all "enhanced" with AI agents of course. You might even get a free one every 5 years -- part of your basic income package.
They won't make learning low level programming or hardware hacking illegal, because they are still valuable skills, and some people need to do that anyway. But for ordinary people it's going to be a LOT tougher. The official development languages of your computer system are some sort of Java and Javascript variants that are SAFE. You simply don't get exposed to the lower level. Very little system level API is going to be exposed because you won't have to know. If you have some issues, submit a ticket to the companies who program the systems.
We are already halfway there. Politicians and super riches are going to love that.
Single player video games are not going to die. And the market seems to punish any push for always online model (which is obviously a scam). I say this because a bulk of market for personal computing is driven by video games.
anybody who really wants to learn to code will just install linux on an old clunker. every uni and highschool student i know who means business does this.
Anyone who wants to learn coding cannot do so in a locked down environment. This is why an iPad is actually detrimental to digital competency in the long run compared to personal computers. They aren't even safer, since these devices often have payment information baked into their being and your kid spending roblox bucks.
Also, you cannot experiment in a safe environment. Safe environments are adequate if your are infantile. But you stay that way if you don't get freedom.
i dont know any 15 year olds without access to a laptop in some form. mac os has a very posix shell. i did K&R c on an old macbook. stop wasting time worrying about this and go buy a 50 pack of old office thinkpads and give them to every teenager you know. and if they say no just break their phone in half. thank me later son i love you papa i wish i knew how much i loved arch linux before you died.
True and I do distribute machines where I can and people are generally very happy about it. MacOS is still a very useful system compared to iOS. I hope they adapt the latter to the former and not the other way around.
I'm scared too - the iPhone is pretty completely locked down. Microsoft Realllly wanted TPMs to be mandatory for win 11. It seems like only a matter of time that your OS will be inconfigurable.
There are billions of CPUs floating around and none of them are going to be magically prevented from running Linux. New computers in some markets like US and UK, I don’t know, maybe if a hot cyberwar with china breaks out there will be a ban on new PCs that can’t be secured by the NSA but I can’t relate to the hand wringing about losing control, we still live in a capitalist society and there’s enough people that hate Windows randomly breaking shit that there will be other options besides CoPilot PCs and Apple Intelligent Macs.
> There isn't much we can do, as long as everybody tolerates this.
I don't know if this will be effective in any way, but I've decided to start hosting services for my friends and family out of my closet. It seems that money destroys everything it touches, so it feels nice to be doing something for reasons besides money.
My friends and family are not particularly influential people, but I think it'll be nice to create a little pocket of the world who knows what it was like to not be exploited by their tech.
I just booted up a Creality ender 3 from 2017 that’s been dormant in my neighbors shed for years. My maker-friend scoffed and said toss it, get a bambu ! And yeah it’s a chore to level the bed every time I bumped into it and I might have to unclog the nozzle every once in a while but I did print a replacement part (the nut for the spool holder) and I know ill be able to maintain it, barebones means there’s not much to break.
As for operating systems, I’ve been daily driving Fedora 40 and now 41 with gnome environment and it’s been the best OS experience I’ve had yet, I haven’t had to open terminal once to configure a single thing, all my apps are installed from the software “store” GUI and I’ve got the sleep and wake behavior all dialed in the way I like.
It runs equally well on a 2014 Intel Mac mini and a 2024 Mac Studio via Asahi Linux, which was also a super simple install process (uninstalling it and reclaiming the disk space required me to reset the whole drive but pretty sure that was my fault for deleting the asahi partition the wrong way)
Anyway maybe give it a shot, and self hosting things is only getting easier, Jellyfin and Immich have changed my life, at least the virtual side of it :)
I know the usual comments will crop up but, now if ever is the best chance to give it a try, at least as a semi daily driver if you still want to play games or such.
I used KDE (24.04) for a while now. Also used Linux 2000-2008-ish. Have read APUE.
When I win-left/right a window and then resize it, then close it, now win-left/right always resizes it as previous one. There’s no way to reset it to 50:50 (unless logout).
Notification center is obnoxious. Regular “Program stopped working” with zero details. Why do I need this information. Copying a folder in dolphin pops up a progress notification that doesn’t go away when finished. You did something, expect a notification to pop up. You did nothing, still expect it.
Windows either steal focus or fail to steal it when needed, depending on your settings and astrology. Software update nags you to click update, then you click it, then it downloads something for a few minutes (you go doing your things), then a sudo password popup fails to get focus, and it all crashes after a timeout. Or things will pop up right in your face. VNC connection closed? Yes, bring that white empty window to front asap and tell “Connection closed” with an OK button.
Start menu was designed by an idiot, a wrong mousemove and you are in a wrong section. Sections reside exactly on the 80% of bezier paths from start menu to the section content and have zero activation timeout. So you have to maze your mouse around to avoid surprises. Logout, restart, shutdown, sleep buttons in the start menu all show the same fullscreen “dialog” that requires you to choose an action again. What’s the point of separate buttons even.
I could go on about non-working automatic vpn connections, mangled fonts/dpi/geometry if you vnc into a turned off physical display, console that loves printing ~[[A half of the times when you press an arrow. And so on and so forth, the list is so big I just can’t remember it.
Idk how Linux users are using Linux so that they do not meet any issues.
gnome fedora has been revelation, I love how the task switcher is a combination of Spotlight and Mission Control (to put it in macOS terms) and dragging windows to the edge works well out of the box. When I mouse over the volume slide and scroll, it works. I have it installed on an Intel Mac mini and an m1 MacBook Pro, suspend works.
I’ve given up on Debian a dozen times but feel I might actually have a future with Fedora.
Since I have to deal with it (work related activity), I'd like to know how much of the above "things" I can change in the system settings. My experience is this:
win-left/right: an unresolved issue in KDE tracker that seems to remain so.
notifications: enter every program in a long list and change settings (most realistic to change).
focus issues: tried all levels (named "low" to "extreme" without any explanation, none work as intended.
start menu vs launchers: I don't find creating a launcher for every app I have reasonable. Fix the menu then call it "a daily driver".
vpn autoconnect: is on, doesn't work.
mangled fonts/dpi/geometry: I'm all ears how to fix that.
I want to click an icon, not type "like everyone else". All this customizability goes out of the window when it becomes emotionally inconvenient, eh? I also don't want to remap win-left to:
And a similar abomination for win-right, which I would have to maintain somewhere. And in wayland, what are even the options?
Use a better font
I think you misunderstood this. While the physical display is "on", it all looks correct. When you turn it off and vnc into the main X display, it's all mangled, regardless of the font. The order of turn off / vnc into doesn't matter either.
Also, I don't want to globally disable notifications. Maybe I just have to globally disable graphics? That would indeed solve many issues with linux desktops.
Cause Linux Desktop is very far from being ready to use, complete or bug/stupidity-free. Cause you didn’t address even a half of the issues here and was only picking on trivial functions that you understand and found coping workarounds for. “Reject any solution”, lol, I have yet to see any solution apart from “turn it off completely” or “don’t use”. I did all
my due diligence, my complaints are not even remotely lazy.
And I want other people to know that, before they buy into fanboy advices from people who seem to either barely use anything in the OS beyond a browser, or are just lying to themselves. Answers like these speak even better than any of my complaints here could. Feel free to advise next time, I’ll be there as well.
Every discussion like this works the same. You mention a set of real use case issues and ask what to do, and all the advisors suddenly appear too busy to answer, with a rare exception of the most defensive deniers.
I switched to Linux a couple years ago and overall am glad I did, but it's only a partial solution.
As I see it, one way to phrase the problem is that Linux (along with its ecosystem) isn't really user-focused either. It's developer-focused. Tons of weird decisions get made that are grounded in developer desires quite removed from any user concerns. There are a lot of developers out there that do care about users, so often you get something decent, but it's still a bit off-center.
A great example is Firefox, which decided to break all extensions for developer-focused reasons (i.e., "too hard to maintain") and continues to make baffling UI changes that no one asked for. Another obvious example is the mere existence of various open-source software that is only distributed in source form, making it totally inaccessible to users who just want to click and install.
But mostly you just see it when you file a Github issue and a contributor/developer responds with something like "Sorry, that's not my priority right now". You see it when people reply with "PRs welcome". There is still a widespread mentality in the FOSS world that people who want features should be willing to somehow do at least part of the work themselves to make it happen. That's not user-focused.
Don't get me wrong, there's a ton of great open-source software out there and overall I think I'm happier with it than I would be with modern Windows (let alone MacOS; whether I'm happier than I was with Windows pre-10 is a tougher question). But basically what I mean is there are developers out there writing proprietary software who will implement features they actively dislike because they are told that users want them; that mindset is not so prevalent in the open source world.
> A great example is Firefox, which decided to break all extensions for developer-focused reasons (i.e., "too hard to maintain")
That was only a problem for extension developers. Users weren't really impacted as developers built new versions of popular extensions.
> and continues to make baffling UI changes that no one asked for.
No one ever asked for the iphone/smartphones, yet people buy them instead of dumb phones. My firefox has evolved a bit over the year if I look at former screenshots, but everything happened so gradually it has never been a problem for users.
And all kind of software do, not only FOSS.
> Another obvious example is the mere existence of various open-source software that is only distributed in source form, making it totally inaccessible to users who just want to click and install.
There are so many apps available through the software repos and flatpak packages that users who aren't into building a software from source shouldn't even feel concerned.
> But mostly you just see it when you file a Github issue and a contributor/developer responds with something like "Sorry, that's not my priority right now". You see it when people reply with "PRs welcome". There is still a widespread mentality in the FOSS world that people who want features should be willing to somehow do at least part of the work themselves to make it happen. That's not user-focused.
Prioritization is happening everywhere, in proprietary software too. Dev teams work with finite time and resource constraints.
PRs welcome is a bonus, not a con.
> But basically what I mean is there are developers out there writing proprietary software who will implement features they actively dislike because they are told that users want them; that mindset is not so prevalent in the open source world.
Mostly only when they are paid for it. And some proprietary dev also don't implement stuff they don't like. I don't think you can generalize, this behavior is not based on the choice of license.
Some FOSS projects also do work on some features if users raise a bounty for it.
Sure, I agree. That's basically all I'm saying. FOSS gets rid of the tracking and dark patterns but it's still not what I'd call user-focused. It's like in proprietary software the decisions are made based on what the developer wants, and in FOSS it's made based on what the developer wants. But in theory with FOSS there could be people out there who are taking the opportunity of freedom from profit-driven orientation to actually figure out what users want and do that with the same level of drive that proprietary companies apply to seek profit. But it doesn't happen. It's not terrible, it's not even bad, but it's not what I'd call truly user-focused.
I know what you're saying; sometimes open source is presented as the answer to all the user-hostile decay that the platform owners introduce, but you must prove yourself worthy through study and sacrifice. If you want to build your own system, great, but if you want to share the joys with others you cannot attract them with an austere religion.
Hobbyist developers develop software because it solves a need they have first, and they have fun doing that. If they don't have any fun or interest to do it, they lose motivation. Hobbyist developers are the primary users of the app they develop usually.
Commercial FOSS developers do have to take users into account and I think they do but they also have to seek profit.
I don't think there is another way unless government starts employing developers to develop FOSS software based on tax payers wishes.
I don't really disagree with anything you're saying, but it's all just another way of saying "Yes, FOSS is also not user-focused." I'm not saying FOSS is "supposed" to be anything else, I'm just saying that if you want user-focused software, you won't really get it by switching from profit-driven software to FOSS. You might get closer in some ways and further in others.
Government employing developers would be just another form of doing it for pay. There is another way, which is the same way that various other kinds of charitable things happen: through a desire to meet the needs of others rather than having "fun" or "interest" for the person doing the action. There are people who donate their time and energy to do things like give food to the homeless, clean up trash, or whatever. Obviously they derive some kind of satisfaction from it but I think many people who do these kinds of things wouldn't say they do it because it's "fun"; they do it because they think it meets a need that other people have. There could be software like that, but there isn't much of it.
The same way there isn't much people giving food to the homeless or clean up trash compared to the general population size.
You are looking for a unicorn imho. Having said that hobbyist developers, regardless if they do FOSS or freeware, are likely to make stuff that is in line with your particular needs because more often than not people have common needs. They may not agree or have time to implement every single feature you want but in a sense this is use-focused if not user-focused.
Linux definitely exists.... except that it isn't free from this philosophy either. From the "don't theme my apps" movement, to Wayland's "security above usability" philosophy... I recently even read about some kallsyms functions being unexported from an 5.x release because it could be used to lookup symbols and it shouldn't be that easy to access internal kernel symbols or something.
Not to mention many projects refusing to add configurability and accessibility, citing vague maintainibility concerns or ideological opposition.
Another blatant example is the 6.7 kernel merging anti-user "features" in AMDGPU... previously you could lower your power limits as much as you wanted, now you have to use a patched kernel to lower your PL below -10%...
Everywhere you go, you can find these user- and tinkerer-hostile decisions. Linux isn't much better than Windows for the semi-casual tinkerer either - at least on Windows you don't get told to just fork the project and implement it yourself.
I'm a bit hesitant to call this corporate greed as it's literally happening in the OSS ecosystem too. Sadly I don't have answers why, only more questions. No idea what happened.
> Everywhere you go, you can find these user- and tinkerer-hostile decisions. Linux isn't much better than Windows for the semi-casual tinkerer either - at least on Windows you don't get told to just fork the project and implement it yourself.
The obvious difference being that in Windows you can't even do that or (easily) apply a patch. Isn't this very ability to patch (or create a fork of) the kernel the opposite of being tinkerer-hostile?
> Linux definitely exists.... except that it isn't free from this philosophy either.
Yes it is, through the power of choice.
>From the "don't theme my apps" movement,
Which anyone is free to ignore and actively do.
> to Wayland's "security above usability" philosophy...
1. wayland is super usable right now and has been for at least a number of years so your statement is mostly a lie. Only thing missing right now are color management and HDR. This impact a small portion of the users who can still fallback to xorg.
2. we are free not to use it. Distributions made it a default choice only recently and you can still install and run xorg, and will so for pretty much as long as you want, especially as some distros are targeted at people not liking the mainstream choices.
> Not to mention many projects refusing to add configurability and accessibility, citing vague maintainibility concerns or ideological opposition.
So you are saying having opinions is bad?
You are still free to use whatever desktop you want or patch your kernel. You have the source and the rights to do whatever you want with it.
> Another blatant example is the 6.7 kernel merging anti-user "features" in AMDGPU... previously you could lower your power limits as much as you wanted, now you have to use a patched kernel to lower your PL below -10%...
I don't think putting safeguards in a GPU driver to make sure users don't fry their expensive GPU inadvertently is an attempt against your freedom. The kernel and gpu driver are still under an open source license that expressly permit you to do the modifications you want.
> Everywhere you go, you can find these user- and tinkerer-hostile decisions.
What is more tinkerable than having the source available and the right to modify them and do whatever you want with it?
I think you are mistaking user and tinkerer-hostile decisions with your and users excessive entitlement mentality. Developers have finite resources and can't possibly agree and accept all users suggestions and desires, and have to put limits on the scope of their projects so they can maintain it, support it and not be overwhelmed by bugs/issues. This is not about freedom.
I should probably have a pre-defined disclaimer "signature" whenever I write about Mac OS, since I always get this response.
I know Linux exists. In fact, I've been using it as my primary OS roughly from 1994 to 2006, and since then intermittently for some tasks, or as a main development machine for a couple of years. I wrote device drivers for Linux and helped with testing the early suspend/hibernate implementations. I'm all in support of Linux.
But when I need to get work done, I do it on MacOS, because it mostly works and I don't have to spend time on dealing with font size issues, window placement annoyances, GPU driver bugs, and the like. And I get drag&drop that works anywhere. All this makes me more efficient.
But I don't want to turn this discussion into a Linux vs MacOS advocacy thread: that's not what it's about. In fact, if we were to turn back to the main topic (ensh*ttification of everything around us), Linux would be there, too: my Ubuntu already displays ads in apt, as well as pesters me with FUD about security updates that I could get if I only subscribed. This problem is not magically cured by switching to any Linux, it's endemic in the world we live in today.
No, it really is cured by switching to Linux, or more precisely to free/libre software. Ubuntu introduced ads, so I switched to Mint. I could do that because the code is all GPL and the ecosystem is large enough that there were sufficient other people with beefs about Ubuntu to do something. The license and the ability of the community to fork are the keys.
Consumer software has gone straight downhill for the last 20 years and while the FOSS alternatives have some rough edges I always at least try them first. The outcome has been that I am shielded from most of the industry's worst excesses. Bad things happen, the world gets worse, and I just read about it, it doesn't affect me. I am more of a radical than the post author, I say in your personal life, roll it all back 100%, return to history, modernity is garbage, computing has taken a wrong turn because we have allowed it to be owned by illegal monopolies and cartels. I do make compromises in the software stack we use for business simply because my employees are not as radical as I am and I need to be able to work with normal humans.
> I do make compromises in the software stack we use for business simply because my employees are not as radical as I am and I need to be able to work with normal humans.
That becomes the problem. Not just in the business world either. Like if all your friends are communicating on platforms that are locked down and harvesting your data, how do you arrange to get together for a burger? If all the stores closed down and you can only buy things on Amazon, how do you clothe yourself? Obviously I'm exaggerating but the big problems of this situation arise precisely because most people don't realize it is a problem, and thus working "outside the system" requires an increasing amount of effort.
Your explanation why Linux isn't the solution is actually a massive pro in favor of Linux. There's nothing special about Ubuntu that's holding you hostage and if you wanted to switch distros, you could do it in an afternoon. Unlike switching from Mac or Windows which would take much longer and would probably never be a 100% migration.
It would be nice if we could trust corporations to stay some kind of course and have our best interests at heart, but they don't, and at some point it starts being our own fault if we keep enduring it. It then follows though that once you have full control over your tools, it's our own fault if we choose not to go solve the issues, but that doesn't feel entirely fair.
We can't personally be responsible for everything. So to bring it back home to enshitification, a free market, free from monopolies or duopolies, should be the solution. As one product gets shit, a hole in the market opens up and could be filled. That's not happening though, so what's going wrong? If it could happen anywhere it's Sillicon Valley, so much money and the culture of disruption and innovation, all the right skills are floating in the employment pool. But software keeps getting more and more shit.
macOS has been very conservative in redesigning the user experience; it's aging slowly like fine wine. There are a few hiccups occasionally but I feel it's a lot more polished and elegant compared to the quirkiness of earlier versions. I don't get this common sentiment that it was better in Snow Leopard etc.
Stability is great, power consumption is awesome since the introduction of the M-series chips and I can still do everything (and more) that I did on my mac 20 years ago. Yes there are some more hoops here and there but overall you have to keep in mind that macOS never fell into the quagmire Windows did with bloatware, malware and all the viruses (although I think the situation is much better today).
macOS has been walking a fine balance between lockdown and freedom, there is no reason to be alarmist about it and there are still projects like OpenCore Legacy Patcher that prove you can still hack macOS plenty to make it work on older macs.
We're eating good as mac users in 2025, I don't buy the slippery slope fallacy.
The new settings are half-baked and terrible. The OS pesters me constantly to update to the latest revision and I can't turn those messages off, not even close the notification without displaying the settings app. And I don't want the latest revision, because it is buggy, breaks a number of apps, and introduces the "AI" features that I do not want or need.
More and more often good apps get broken by the new OS policies (SuperDuper is a recent example).
The old style of system apps that did wonderful things is gone (think Quicktime player, or Preview), these apps are mostly neglected. The new style is half-baked multi-platform apps like settings, that do little, and what they do, they do poorly.
Unlike your parent comment I do think that Mac favored lockdown all the way.
But it does a wonderful job at doing so.
Macs feel less like a personal computer and more like an appliance. Which works great if you do things that don't require tinkering, like office tasks or even webdev.
And I do love Linux, specially the more hobbyist OS's like Gentoo or Nix.
But at some point in my life I decided to spend more of my time (aside work) with other parts of my life. And in result, having to spend a weeken to solve some weird usecase, be it the package manager or the WM, is a pain.
The compromise would be to use non-hobbyist Linux.
I have never spent a weekend fixing a problem. The worst I can remember was when an update early version of Ubuntu broke X Windows and the update had run on multiple machines in a small office so needed to be fixed multiple time, and there was a delay while they fixed the problem IIRC. Still, it was a few hours.
Even now, using Manjaro which is relatively likely to have problems, I have had no major issues so far.
I have not used Macs so cannot compare, but IMO Linux compares very favourably with Windows.
Mac users I know rave about it, but every time they come up with a specific example of why they are better it turns out to be something like functionality other OSes have. Sometimes Macs have the advantage of being preconfigured (e.g. copy and paste between devices required installing software on both and pairing - but a 10 min one off when you buy a new device is acceptable to me).
- Virtual Desktop per monitor? Nope, because Xorg didn't support it back then. And now it's a 10 year bug on the Kde bugtracker.
- A Dock? It worked ok. Until Wayland came and everything broke. It's supported now, but you have to clone the latest git commit of the biggest dock project which is not almost abandoned. And it breaks while compiling. A lot.
- Global Menu? The support is all over the place.
- Fractional scaling? It works. But in MacOS (with the help of an app, I admit) I can have incredible granularity.
On top of that, add the generally inferior hardware revolving a laptop, aside CPU, storage and Ram.
The MacOS desktop feels like a Gnome2 in an alternate universe where the devs never made bad decisions, and things like Wayland (1) never occurred.
(1) Not because the project itself, but the act of breaking compatibility and passing blame to other people has sent decades of FOSS development and manpower down the drain.
I know Linux since Slackware 2.0, have used most well known commercial UNIX systems, and rather use GNU/Linux on VMs, instead of laptops, as I had enough of this.
Last attempt to try otherwise was a UEFI bios, without fallback to legacy BIOS, that just couldn't get along with whatever top distro from Distrowatch I would try to.
Apple, Google, Microsoft walled gardens are more confortable to stay on, and as long as I can have some kind of ISO C and ISO C++ support for everything else that depends upon them, I am good.
That said, my current laptop came with Linux preinstalled so I knew I would not have hardware issues.
I would also rather have one off issues to install than unpredictable issues later on. Subjective preference, of course. I have had lots of issues with Android. Never at the start, but with app upgrades.
My ASUS 1215B EEE PC, the netbook generation, came with Linux as typical of them, had wlan problems, and after the AMD driver was replaced by the open source one, it never achieved the same OpenGL capabilities as it had originally, and hardware video decoding never worked after Flash was gone.
As a user of mac for 7 years (went back to windows since around 2018), I’m watching my mac buddy from time to time and it’s not what you are describing. It definitely gets stupid-level worse every year and yes, Snow Leopard was peak mac indeed.
never fell into the quagmire Windows did with bloatware, malware and all the viruses (although I think the situation is much better today).
Windows has 10:1 program ratio compared to Mac. You can usually choose from full-bloat gamified experience to a simple tool doing its job. Windows itself is crap by default though, but that’s at least fixable if you know what you want and where to look at.
> macOS has been very conservative in redesigning the user experience; it's aging slowly like fine wine. There are a few hiccups occasionally but I feel it's a lot more polished and elegant compared to the quirkiness of earlier versions. I don't get this common sentiment that it was better in Snow Leopard etc.
I completely get this sentiment. MacOS since Big Sur has had a quite indeterministic UI, from scrollbars that only appear halfway of the time, to the new app toolbars that truncate filenames and hide the search bar in the overflow menu. The Settings app is still worse than what it replaced, the Mail app keeps randomly appearing, the random confirmation pop-ups are more common than in Windows Vista.
Snow Leopard was (and still is) a bliss, it didn't nag you with "this app is dangerous" bullshit, built-in apps that look and work like an intern's multiplatform UI practice, constant updates... It was an OS primarily designed to let people get work done, not to encourage them to spend all their time on Apple TV/Music/...
I used OSX from 10.0.0.4 to 10.4 and it was OK then. I recently had to use a Mac for work something-something-tree-whatever and it's slow even on an M1, it's double the weight of an X1 carbon, and the window manager hasn't evolved meaningfully and is junky. I haven't had so many troubles with arranging windows on two screens in almost 20 years.
Maybe people have been slowly boiled? I got my partner on a Mac 10 years ago but would not get her another Mac. Apple's push to make evetything e-waste, foxconn, and the general surveillance in the name of security ensure that. My observation is less that it has aged "like a fine wine" and more that Macs become prisons shaped like a computer.
I’ve moved away from MacOS for a number of reasons, specificslly the strenghts you highlight I consider weaknesses:
- the oh-so-pretty GUI takes up too much white space, I have little left for actual content. I dont need massive icons with wide spacing since, I need the opposite.
- what is the deal with hiding folders from Finder (like ~/Libraries). Do I honestly need the command line to open this directory?
- every iteration after SnowLeopard I feel their “features” are going backwards and taking away “usability” from me.
- OpenGL stuck on 4.1, Vulkan (Molten) is a 3rd party hack. Seriously?
- its become a case of one step forward, 2 steps backwards.
> burning through all the goodwill accumulated over the years
Bambu Lab never really had any goodwill. No one liked the fact they were proprietary in a previously very open market. They just made really good, affordable printers, and those who were more interested in making stuff than in supporting an open community got them, sometimes reluctantly.
And BTW, Macs have always been locked down and backwards compatibility is not their priority (which means stuff broke). They cared a lot about their user experience though, I don't know the situation now, but I don't think it can be worse than Windows.
There is heaps you can do, but admittedly not all of it will you want to, and not all of the results will be equivalent.
For one though, you can support open source software, especially linux OS's. Similarly, ditch the Bambu. There are countless better and more open printers out there, and you can DIY excellent 3D printers that get great results.
I think that's the point of difference between now and the past, information has spread so far, and people have fought so hard for open source software and hardware, that we actually have a good defence against corporate greed. You accept some compromise and work a little harder for it, but it's really not that bad.
You cannot simultaneously complain about companies closing systems off and give Apple any credit at all for the past 20 years of operation. They are the absolute worst offender in the industry without exception.
And no, if the axis you are measuring on is openness versus locked down then Microsoft is not worse. You have simply been brainwashed.
> There isn't much we can do, as long as everybody tolerates this.
So you should become the change you want to see, shouldn't you? Try switching to Linux until it works for you. Debian is rock-solid today; xfce is so good that my non-technical relatives use it every day with no complains. I'm using a GNU/Linux phone as a daily driver and I'm sick of iPhone and Mac users complaining that everything is going downhill without taking any actions. I'm also sick of articles without any suggestions of what to do, when all tools are available today. No, it's not trivial. But it's doable.
1. If you were a bit more familiar with Apple history, you'd know that the Mac was actually Steve Job's push to make things more proprietary and locked down, not less. Make of that what you will.
2. If your ideological stance is in opposition to companies like Microsoft/Apple/etc. and you work in the tech industry, the most effective action you can take as an individual is to deny them your labor.
These kinds of articles pop up all the time, along with all the "Web 3" ideas, and all of them seem to view the past with a sort of rose-tinted nostalgia, forgetting that the corporate business world of the 80s, 90s, and early 2000s was just as sleazy and run by assholes as it is today; the only difference is that the technology is finally catching up with the ambitions of said sleazy assholes and allowing them to do what they've been trying to do since the outset, i.e. grow into enormous ungovernable conglomerates and wield godlike omnipotent control over the flow of information.
As a matter of fact, this stink of sleaziness that permeated the early Web was so prominent and overpowering that it played a key role in the rise of these huge companies like Google. Google's algorithms and page crawlers were not that revolutionary or different from anything the other search engines were doing; Google just happened to be in a position where they were sitting on lots of cash and were able to run a search engine for several years with no ads or clutter or any of the other annoyances of its competitors, seemingly providing a free service that asks nothing in return. They made this part of their carefully curated public image, of being the hip and cool tech company with the "don't be evil" mantra. They probably burned through ungodly amounts of money doing things this way, but once all the competing search engines withered away and died and Google had the entire market cornered they grew into a multi-trillion dollar megacorporation and are now unstoppable and now all their services they provide are deteriorating because they have no competition.
Ironically, it was this false underdog narrative, the idea of the young trendy cool tech companies overthrowing the stuffy old corporate tech companies, that sort of paved the way for the tech industry to become more monopolized and horrible than ever. And now it's happening again with lots of "Web3" companies trying to present themselves as the new champions, who will overthrow the stuffy old corporate tech companies like Google and bring us into a new era of the Web that is even worse than this one.
>Google's algorithms and page crawlers were not that revolutionary or different from anything the other search engines were doing;
Back in 1998, Google's algorithm ("pagerank") of weighting href backlinks using linear algebra was revolutionary compared to the other search engines like Yahoo, Lycos, Infoseek, AltaVista, etc that were built on TF-IDF (term frequency-inverse document frequency)[1].
The more simplistic TF-IDF approach of older search engines suffered from "keyword stuffing" such as invisible words at the bottom of the HTML page for SEO. Google's new search engine was an immediate improvement because it surfaced more relevant pages that beat the useless pages with keyword stuffing. At the time, Google Search results were truly superior to the junk Yahoo and AltaVista was showing.
A compelling story, but Google became profitable in 2001 shortly after the introduction of AdWords, three years after its founding. At the time their funding was $25 million.
> the young trendy cool tech companies overthrowing the stuffy old corporate tech companies, that sort of paved the way for the tech industry to become more monopolized and horrible than ever
Not following the thread here. Do you think the web would be less monopolized if Altavista or Yahoo had won?
I don't believe it makes any difference at all. The transition from a free web, made by people for the people, to the collection of corporate walled gardens we have today would have happened regardless, it was simply the natural progression of things - that we failed to recognize and avert in time. Initiatives like making computing personal again are exactly what's needed if we want to go back.
> Google's algorithms and page crawlers were not that revolutionary or different from anything the other search engines were doing;
Google was revolutionary when it launched. It was clean, super fast, and had way superior search results. It blew the competition away. Within weeks of Google's launch techies started scolding people for using AltaVista or Yahoo, when they should be using something better.
Oh yeah — who doesn't remember all the "Pamela Anderson" meta tags (thousands) people would put into their HTML files to drive up the page rankings on the various web crawlers.
So easy to game the system before Google. (Now easy again judging by the shitty results I've been getting for years now.)
I find it hard to imagine that today's iteration of Google is what Larry and Sergey had in mind when they initially founded the company, or what Paul Buchheit had in mind when he was working on the earliest forms of Gmail. I don't think that "Don't be evil" was tongue-in-cheek back then.
The company had a legitimate business model, was innovative, agile and profitable from early on. It rightly earned a lot of respect.
But something went wrong at some point. It's debatable when, why or how, but it happened.
That isn't true. People made fun of software that showed ads. Exception was shareware, but it did that only for the software itself.
The braindead hordes accepting things they couldn't really understand did have a negative effect on overall quality.
Just before someone argues against the misanthropy in my comment, some of my most loved family members belong to the braindead horde. I love them, but their failure in education makes the landscape worse for everyone. And it is also very visible and not something imaginary.
Today we accept our OS spying on us, showing us ads, paternalizing its users with updates and the whole mobile catastrophe is a dilemma in itself. Smartphones are powerful devices but the software landscape disabled a whole dimension of software and is responsible for unnecessary waste.
Yes, it got worse on the software department. A few less driver issues because a lot of companies and hardware suppliers were consolidated is not a win.
And honestly, it isn't really hard to notice these changes at all.
Google is a good example. It didn't have better search, but its site wasn't plastered in ugly advertising from top to bottom. This was quite a factor in its success. Clean, fast, good. Not the nightmare it did on Android, where every app onboarding is a horror story in a thousand popups. There are profound differences in quality, intelligence and ability.
This article really resonated with me. Unfortunately I think things aren't going back. What the article doesn't appreciate--and we techies don't either--is just how much the scale of today's tech market absolutely dwarfs the scale of the tech market back in the days before the internet.
The market wanted growth. Early tech companies, like Microsoft, Apple, eBay, and then Google, went from zero to huge in a very short period of time. But companies like the FAANGs kept up the absurd levels of growth (20+% YoY growth in the case of Google) that Wall Street got hooked on, and it's been on a drug binge ever since. The result is that we have multiple trillion dollar companies that will...never not want to be a trillion dollar company.
The total amount of money in the PC market was miniscule compared to today, and the internet and its online retail plus ads bonanza even dwarfed that. The PC software market, the video games industry, everything--it was all so much smaller. As the internet swallowed the world, it brought billions of users. And those billions of users can only use so many devices and so many games and spreadsheets and stuff. They had to be made into cash cows in other ways.
The tech market just has to keep growing. It's stuck tripping forward and must generate revenue somehow to keep the monsters' stomachs fed (and their investors too). We will never be free of their psychotic obsession with monetization.
And advertising is soooo insidious. Everything looks like it's free. But it isn't. Because our eyeballs and our mindshare is for sale. And when they buy our eyeballs their making back those dollars of us--it's the whole point. So whether you like it or not, you're being programmed to spend money in other parts of your life that you wouldn't otherwise. It cannot move any direction but falling forward into more consumerism.
I'm afraid I'm a doomer in this regard. We're never going back to not being bothered to death by these assholes who want to make money off us 24/7.
It is the legal system that hasn't caught up with how tech scales seemingly small damage.
What were small conflicts of interest before (a little trash here or there, a little use of personal information for corporate instead of customer benefit here or there, ...) now scales to billions of people. And dozens of transactions, impressions, actions, points of contact, etc., a day for many of us.
That not only makes it more pervasive, but massively profitable, which has kicked in a feedback loop for sketchy behavior, surveillance, coercion, gatekeeping, etc., driven by hundreds of billions of dollars of revenue and trillions in potential market caps.
Things that were only slightly unethical before, now create vast and growing damage to our physical and mental environments.
It should simply be illegal to use customer information in a way not inherent to the transaction in question. Or to gather data on customers from other sources. Or share any of that data.
It should be illegal, to force third party suppliers to pay a tax to hardware makers, for any transaction that doesn't require their participation. And participation cannot be made mandatory.
Etc.
One commonality here, is that there is often a third party involved. Third party gatekeeper. Third party advertisers. Third parties introduce conflicts. (This is different from non-personalized ads on a site they have relevance for, which are effectively two independent, 2-party transactions.)
Another commonality, is the degree to which many third party actors, those we know, and many we never hear of, who "collude" with respect to dossiers, reaching us, and milking us by many coordinated means.
> It is the legal system that hasn't caught up with how tech scales seemingly small damage.
Most administrations are squishy-soft on corporate crime. If there were regular antitrust prosecutions, violations of Federal Trade Commission regulations were crimes, wage theft was treated as theft, forging safety certifications was prosecuted as forgery, and federal law on warranties was strictly enforced, most of the problems would go away.
In the 1950s and 1960s, all that was normal. The Americans who lived through WWII were not putting up with that sort of thing.
The economy was also wildly different back then - there were massive, fundamental, competitive advantages the US was continuing to reap due to being on the winning side of WW2 (in every way).
For instance, nearly every country was paying the US loans back, in USD, or was having to depend on the US in some way.
Nearly every other country in the world had their industrial base (and often male population) crushed in the war.
Etc.
Those things cost money/effort, and require a consistent identity and discipline.
In some respects, I agree. Yet I don't think we have to put up with it all of the time. Most of the technology in our life is either frivilous or has a workable alternative. It is not as though we have to abandon technology in, or even current technology in pursuit of the personal. Yes, it involves making more careful decisions. Yes, it will likely be limited to people with technical knowledge. On the other hand, that was true of computing in the 1980's and largely true of computing in the 1990's.
In many respects, we are also better off than we were in the 1980's. There are more of us, we are connected globally, and the tools that we have access to are significantly better. We also have a conceptual framework to work within. Technically speaking, Free Software may have existed back then but few people even knew of it. People were struggling with ideas like public domain software (rarely with an understanding of what that meant). If you wanted to make money, outside of traditional publishing channels, you were usually toying with ideas like shareware (where you had pretty much no control over distribution). If you wanted to spend money of software, outside of traditionally published stuff, chances are that you had to send cheques or cash to somebody's house.
And then there is communicating with likeminded people. We may like to complain about things like Discord or Reddit, but they are not the only players on the block. Plenty of people still run small or private forums. Yeah, they can be hard to find. On the other hand, that has more to do with the noise created by the marketplace rather than their lack of presence.
The problem with the nimby/ecofascist/exclusionary perspectives is the obvious retort is always "okay, yes there are too many people in this domain. The solution then is for you to quit, not me." And substitute whichever group doesn't encompass you which usually falls along racial, gender, or class lines. At the end of it, no one wants to fall on their sword for everyone else.
The thing is the older I get, the more it does seem like at the very least we are not growing pie in a number of areas (the example at the top of my mind is academia) and sometimes it just seems like an easier solution is to decrease the numerator. But I don't know how you can do that and justify it morally, both to society and to yourself.
It's time we give up on the majority of people who don't care for freedom and focus on the few that do.
Unfortunately at the time we need them the most pretty much every pro-user organization is imploding because everyone and their grandmother wants to turn them into vehicles for whatever their pet cause is.
Also, even if they're not, they're getting squeezed out. It's hard to stay afloat trying to just do a thing without your eye on the "prize" of getting bought out by Google et al.
>What the article doesn't appreciate--and we techies don't either--is just how much the scale of today's tech market absolutely dwarfs the scale of the tech market back in the days before the internet.
I understand it and know it. But I don't appreciate it either (in the sense of liking it).
I mean, the solution is inside your definition of the problem. Infinite capital growth isn't possible. They will either finally make their products unusable or collapse. When they have collapsed enough and we have reached the plateau of innovation someone will make some basic device interoperable with everything and leave us be to count their millions instead of billions.
Its just another bubble, one predicated on mining the users rather than expanding the product.
I think it's easy to forget that computing technology is a tool. Of course it was bound to be huge today, because it's supposed to be a tool in the toolbox of every company. It wasn't as big back then because not every industry could incorporate it right away, knew how to, or was interested in doing so.
It's not bad that it's big. It only needs to grow because the rest of the economy needs to grow.
I am also afraid you're a doomer in this regard. You don't think the bigwigs with their fax machines in the 1980s wanted to make money off of us 24/7? Of course they did.
Tech is scary in the sense that it's now gone quite a bit beyond the understanding of the average joe. Even most of us on this site probably don't fully understand how much detail data can paint a picture of a person. There are companies that probably know something about me that I don't even know.
I guess I don't know how to alleviate that feeling, and maybe it's the correct default assumption to be a doomer. It certainly would be very helpful if the US treated the situation more like the EU treats the situation.
It's tiny, clearly built with love for the user, doesn't do a heck of a lot, and has some interesting ideas that are just fun to mess around in. And unlike some of the similar retrocomputing OS's (which are also lovely but grounded in old fashioned design), genode feels like a glimpse into the good future.
That looks like the most radical/unusual operating system thing I have seen in recent memory. Not sure how practical it is, but kudos for trying something so different.
It's so cool, I could talk about it forever. It's practical enough for the devs to use it as a daily driver (though with linux in VirtualBox or Seoul for some things like running their builds) and theres a few businesses built on it.
But nowhere near as practical as Linux at the moment of course
I had not heard of Genode/Sculpt, but it looks interesting. These days, I feel like if I boot a new operating system, I have no idea what all it's doing and whether or not things are secure--I'm basically relying on the operating system to have good defaults. And then it's so easy to screw something up!
I like the idea of Qubes and it looks like Genode might be an even better idea...
It's a very similar philosophy to Qubes - one of their open challenges is to port the qubes infrastructure over since qubes is (in theory at least) hypervisor independent. https://genode.org/about/challenges Which would be nice since NOVA hypervisor is dramatically less code then Xen and Nitpicker/Dialog for the management console is dramatically less code than Fedora.
I've looked into it briefly but it seems like too much work for me right now.
The True Genode Way of course is that everything worth having would eventually be ported as a native genode component instead of a qubes style VM. They've put a lot of effort into making that as easy as they can with Goa (a nix-inspired package management and build tool) and adding to their C standard library and ports of popular 3rd party libs like SDL
Not sure! They have a system set up for porting drivers from Linux into userspace components so it bats above its size.
From their description: "It is tested best on laptops of the Lenovo X and T series (X220, X250, X260, T430, T460, T470, T490)",
200 isn't on the list but you'd probably have about as good a time as you can
We are loosing personal computing, because most communities in FOSS and related movements are too much into individualism. Comments in this thread are clearly showing that.
Human emancipation always required mass political movement, and such movement requires some sense of common purpose and solidarity - to sacrifice some of our individuality. If they lost that, they lost their cause.
Society cannot be changed without changes in policy, and changes in policy require political power to enact them. You do not gain this kind of power through constant forking, dramas, and even gazillion lines of code using AGPL license.
Convenience kills. I think every sane individual in the world knows the article speaks the truth, and I think everyone wants this to happen. But corporations are not individuals, corps are their own life-form, and even though humans make up the corp, the corp is not human, it is not even inhumane, it is a whole different thing, and the humans that operate it has very little influence on it.
So, as far as a corp can understand anything, it can't understand this human article. I don't know if one can write articles that a corp can understand, maybe it cannot understand medium in the same way we can.. It seems to act based on information it sees in "markets" and "consumer behaviour", and we don't yet know how to write an article with those (even if "vote with your money" was once believed to be it, until we discovered that mankind as a whole is not an individual that can make a decision)
Humans definitely have control over how a corp behaves. The argument that they don’t is just a convenient way to absolve a small minority of greedy people from blame for the harm they created as the direct result of their greed.
> Humans definitely have control over how a corp behaves.
I will counter this with: The argument that the humans that make up a corp are in control over it, and that the corp behaviour simply results from their flawed and greedy characters, is just a convenient way to blame someone because the real problem of understanding what kind of entities corps are, and how to influence and control them is too hard.
then maybe corps should be dissolved. publicly traded corps are here to maximize profit by any means necessary, and there are individuals that work for them, and also those who invest in them. at the end of the day corp is a group of humans with certain interests.perhaps, rouge corps that act against humanity should dissolved and / or be heavily taxed with taxation helping the public overcome the hurdles created by this corp. in reality, politicians are helping these corps grow even bigger.
I'm not defending corps. I'm not defending the people that make them up. I'm not attacking them either per se. I'm just saying that maybe viewing them in a different perspective, from a different level of abstraction, may unlock a new solution space.
When dealing with people, we tend to view them as as distinct entities from their individual cells and neurons when we argue about their behavior. We don't talk about this or that individual neuron causing a human to take an action, if anything, we may discuss a group of neurons, but often, we argue about the entire "brain chemistry", even if it's strictly true that some group of distinct neurons are "responsible" for the action the human takes, then so is the bones in their hands, the fibres in their muscles, because they didn't refuse (to refuse, is to no longer be part of that body). Maybe it's the moral thing to do, for a cell to refuse to be part of the immoral human, but it does not absolve the human from responsibility, and it does not put all the responsibility on the indvidual cells that make it up, humans are complex organism, corps are made out of humans, they are even more complex. Treating them as a collection of humans is what we have done so far, and while we've gained some satisfaction seeing a (too few) very disgusting people getting what they deserved, it's not changed the overall behaviour of the companies, because, killing or changing one neuron won't change a brain, replace neuron with human and brain with corp.
On a higher level of abstraction, the corp as a form of life, maybe a cancer, maybe a rat, or something that could potentially be a positive thing, we may start a new way of reasoning about and with them.
After all, I can talk to you, I know how to do that, but I can't talk to your neurons directly, I don't understand their modes of communication, it's on a different level from me.. This is why psychiatry is behind, we don't know how brains work well enough, we can give medications to take the worst out of them, like treatments for adhd, or schizophrenia, but they don't work on the brain in a coherent way, they work on the individual neurons in a very crude way, and so, the effects are nowhere near perfect, and the side effects can be almost as bad, or in some cases worse than the decease.
At no point has anyone ever said a cooperation is controlled by a single entity. It’s already acknowledged that a corporation behaves according to a culture.
What I’m saying is that culture isn’t its own uncontrollable entity independent of influence from the people that run the corporation. A company’s culture is dictated by the people who lead and make decisions for that corporation. A culture is driven from the top down.
The problem isn’t understanding what type of entity a corporation is, it’s fining people who are both motivated to make the change but also has the power to make any changes.
The real hard part is working against the rigged system. People who can enact change won’t because it’s not profitable. Whether you’re the MP bribed, sorry I mean “lobbied”, by corporations, or you’re the corporate director that had to navigate the cutthroat ranks to reach your position, there’s literally no personal interest to do the right thing. Literally everyone who can control these beasts suffer from massive conflicts of interest.
So the problem isn’t understanding the problem. We already know what the problem is. We just don’t care enough to change it.
Corps are an alien life-form. They are permitted to kill without themselves being killed in return or in retribution or as punishment. They are an abomination.
And they are the product of a fluke of legal-economic history.
I agree, I think it would be very reasonable to instate the death penalty for corps. It's difficult though because they've already identified legal structures that make them very difficult to kill and even more difficult to kill in a way where they can't reassemble. Identifying an effective way to truly and permanently kill a corp, so that the death penalty would make a real difference, would be a good start.
No, the humans in direction has the responsible per actions that inhumane made, if you think because you can not change the current corp rule and agree with did a imoral thing well sorry but you is the problem, if you do not want made anything imoral, ilegal or anything you need resigned immediately and think what you want, corporation have accountability in person if you remove this you remove the responsibility and the consequences well the world need understand consequences of choice
So far, your view (that the individuals making up the corp can significantly influence its behaviour) has been the approach the world has taken so far, and yet, corps are not exactly getting more well behaved.
Please reconsider whether continuing to do the same (simple and easy) thing we've always done, that we've already seen does not work.
Take a step back, re-evaluate the problem in a new context, even if you don't end up agreeing with my perspective, attempting to think about the situation in a new light might be helpful.
It might be that something in the way corps grow up, maybe their environment (regulations, lack of same, incentives, consumerism, trade, markets) may influence them to grow into the immoral unethical monsters they often become.
Maybe we should consider them too dangerous and harmful, and simply destroy them, I'm not convinced that's better, but maybe there's a way to understand them at a different level, that allows us to "write articles" they understand well enough to actually adjust their behavior (and not just try and circumvent whatever "obstacle" has been put in their path).
If you stop treating psychopaths like empaths they are very easy to get along with and very useful. Empaths will respect you if you are loving, caring, generous etc. For psychopaths those are weaknesses to be exploited. They will respect you if they are afraid of you and if you help them in a humiliating way. They will go out of their way to return the favor.
As society keeps forcing them into pretend empathy they know every detail about it. They can exploit it and imitate it but they cant hide how precious their ego is to them. It sticks out like a sore thumb.
Corporate creatures are similar, they simply don't share our emotions. That doesn't mean they don't understand or wont cater to them.
> Corporate creatures are similar, they simply don't share our emotions. That doesn't mean they don't understand or wont cater to them.
I agree, in many ways corps exhibit the same behavior as psychopaths, after all, people is the thing they subside on.
I think their way of cognition is even more alien to a normal human than that of a psychopath is to a normal human.
They may share a fundamental, that we can't make them truly understand why something is wrong, but we may be able to come up with consequences that deter them from undesired behavior.
One difference though, is that we can't make it illegal to be a psychopath, only make some of the actions that only a psychopath would perform illegal.. We could make it illegal to be a corp (I'm not saying this is going to be a better idea.. But we could make it legal to kill corps, if only we can find out how to kill them so they actually die of it [1]).
This crapification of tech is just a microcosm of where America is headed as a whole. I found a blog recently that talks about it in depth, and I find it hard to argue with any of the analysis or conclusions [1].
Hard to ignore the signs that the US is an empire in decline, heading towards collapse.
> How many Nintendo Entertainment System games sustained themselves with in-app purchases and microtransactions? What more did the console ask of you after you bought a cartridge? Maybe to buy another one later if it was fun?
True, but unlike the Apple II, the NES was not an open system. The NES had hardware DRM, which allowed Nintendo to control what games were published for the system and to charge licensing fees (much as Nintendo, or Apple, do today). Nintendo also tried (unsuccessfully) to shut down Game Genie.
In 1992, you more options. Your friends could tell you for free, you could stumble on them yourself, or you could get them with a magazine or book (which you didn't necessarily have to buy, you could just flip through it at the store and memorize the cheats.)
> you'd call the Sega Hotline on a premium phone number
I remember the ads for that but I've never met a single person who did that. (Or whose parents would be okay with it.) Cheat codes were either shared by word of mouth among friends or in magazines. Or you bought a game genie, but that was more for messing around with a game's mechanics than actual, blatant cheating.
Tbf, very few Switch games "sustain themselves with in-app purchases and microtransactions". Especially relative to the number of games which is an order of magnitude greater.
The web as an app platform form was primarily pushed by Netscape to circumvent Microsoft’s monopoly. Sun tried the same with Java. Both of these efforts led directly to every problem the author is complaining about. There was no utopia. The world went from an expensive monopoly under IBM to an expensive monopoly under WinTel to an expensive duopoly under Microsoft and Apple.
If people dislike exploitative SaaS and content platforms, stop using them. No one is forcing anyone. Plenty of people use home servers and Linux. Go for it. There are also tools like Chris Titus’s UWU to make Windows more tolerable, and MS still sells an office suite that can be installed locally. You don’t even have to “sign in” with it, though you can.
I’ve lived through several distinct eras of computing. This one may not be the most exciting, but it’s by far the best. You can use SaaS or locally installed stuff, and emulators (both hardware and software) exist to keep the older stuff alive. Even better, I don’t have to panic save every 5 seconds, reboot my computer every hour, and my computer can come with me. I don’t get disconnected when someone places a call, and while some software is expensive, it’s cheaper than it used to be when inflation adjusted.
Go fire up an Apple II, an H89, a TRS-80, or a PET without any modern supplementation and tell me that those are preferable. You may groan about Google, but go back to purchasing tons of manuals that may or may not be specific to your machine, read through them only to find no answer and proceed to play detective for a few weeks. How much more productive is your time with a dang search engine?
That's a side effect of the way we've educated the market to expect everything to be "free." That leaves the only option available being indirect monetization through ads or in-app purchases or something similar to that.
It was funny when the corporatists here blamed consumers for using free (or “free” or whatever) services, falling into the FB trap etc. As if that mattered at all? As if FB and its ilk wouldn’t use the Free strategy every time in order to grow humoungous or fail? Of course they would bet their money on network effects over getting money in the short term—there’s no point in a boutique social network. Those hypothetical consumers who wanted to pay upfront for a 2K user social network (patently irrational but okay) would have remained unserviced.
Once upon a time, it was illegal to discount something to gain market share and then charge extra once you've bullied out your competition. Technically it's still illegal, but good luck finding enforcement.
This is called dumping and yes it was and maybe still is illegal with things like commodities and manufactured goods.
It was never enforced with software or services. If it had the entire standard VC startup playbook would be different.
It’s also never been enforced internationally. China has arguably been subsidizing its industries and effectively dumping cheap manufactured goods for years to become the workshop of the world, and it works.
> "Our economy isn’t one that produces things to be used, but things that increase usage."
...the quote, *AS A SOUNDBITE*, only sounds good on a surface level, but collapses under the slightest test. All products in some form or another increase the usage of resources in order to reach a certain goal.
The article, where the quote originates from, contextualizes the quote marking (a) the difference between products in service of an actual goal, (b) products that are only meant to look good on a balance sheet, and (c) how companies have morphed towards (b) in order to attract investor funds and increase share prices / company market values.
The quote, BY ITSELF AND WITHOUT CONTEXT, is a twisted Neo-luddist version of its original self.
I think it means increase usage of the thing itself, and I think it's a good insight. While there is a natural supply and demand curve, unscrupulous growth-focused businesses optimize their products (unhealthy food) and services (gambling, social media, mobile games) for high levels of consumption (at least for a portion of vulnerable users), irrespective of harmful effects. It's the tobacco industry model reborn.
I think a more generous interpretation of this is simply one that is critiquing planned obsolescence and addicting algorithm. Some things need to increase usage by nature, but how many services have you used that really needed a subscription as a necessary model to work?
My hot take is planned obsolescence doesn't exist.
It's a side effect of items being built to cost, and the marketing phenomenon that consumers follow fashion trends.
Your car doesn't have planned obsolescence: it has a warranty period. If you want a longer one, you'll pay more because that is not a free service to provide.
I love the "which part of.." examples of companies and services that the author lists, along with the screenshots. I know that nostalgic feelings tend to not be an accurate representation of the past, but I do know that I used to look at a lot of those companies and products with some admiration. No, things were not perfect back then, but a lot of these products had a level of innocence, goodwill or benevolence that does not exist today. They seemed more rooted in innovation than value extraction at all costs.
Today, I look at those same companies with absolute derision over their completely unethical and hostile approaches to the world, the economy and dealing with the people that use or rely on them.
Worse, my ability to get excited about new companies, products, services and innovations has been completely blunted by the expectation that anyone working on something I think is "cool" will inevitably be co-opted by people who have the worst instincts: those who actually have no respect for technology or computing and view people as less than human, simply entities from which maximum value must be extracted at any cost.
I would argue that computing has never been more personal if you’re willing to put in a little effort. The advent of containerization, miniaturization of PC’s, and overall drop in cost of technology has allowed anyone to run there own personal intranet, homelab, whatever.
If you want to run your own little silo completely disconnected from your fellow human beings then it has never been easier. But that was never really all difficult in the first place. I don't think it's truly the problem that needs to be solved.
Buy-in from the community is indeed the hard part but I have friends running irc and phpbb we hang out in, and matrix is more or less viable to self host for a group chat, it’s just that 100x more people are using Discord and Signal because of network effects, your one account can give you access to a million communities.
I guess activitypub and matrix are meant to be similar in that regard but for whatever reason the learning curve is just a little steeper, so you have to be motivated by ideology to put up with the gaps in usability
Yeah, that network effect is always the hard part once you want/need to reach out past your personal circle. Even with a personal circle it can ha hard to make people use a better but smaller service.
I mean, the building blocks are there, but so much has moved into "the cloud". You can't run Just Photoshop anymore, you can only run CC that sends all your images to them, you can't run Word without running Office 365, you cant run most games without Steam, etc etc. And all of the exciting new software is -As-A-Service
So while there's more options now for homelab things the overall ecosystem has moved strongly away so there's a lot more to avoid
One of the aspects I've wondered about is the software bundled with either the OS or by the device manufacturer. When PCs (broadly, not just IBM compatibles) were penetrating into the home market they would often come with a suite of tools or demos to show what it could do, or let you create things even if they were the basic editions. Before the internet became part of the furniture, if you'd spent several hundreds on some hot technology there was a good chance you'd buy print magazines for it as well, and they would come with cover discs that exposed people to a lot of what was possible with computing.
Without wanting to sound like a stick in the mud, the focus of computing has definitely changed now. I see it as an interesting thought exercise on how to get someone running around with what is usually a marvel of computing in their pocket to try and imagine that is not the apex of computing, whether to explore what other means of computing offer or what comes next besides a slightly better version of what we have now.
> I see it as an interesting thought exercise on how to get someone running around with what is usually a marvel of computing in their pocket to try and imagine that is not the apex of computing, whether to explore what other means of computing offer or what comes next besides a slightly better version of what we have now.
That is a great way of thinking about it and I'm curious what you've come up with. I think it's a pretty hard sell for most people, especially for things like messaging that have become very central to daily life. Also, there's a big difference between convincing someone to try something a bit less mainstream and convincing them to reject the mainstream version. Like, you may be able to get someone to install LibreOffice but it's a lot harder to get them to uninstall Excel.
Anecdotally, I've found that people who have some other kind of retro/niche/subculture interest can be somewhat more receptive to the idea that the newest thing isn't necessarily the greatest. Like someone who's into hunting for vintage clothes, or woodworking, or whatever. Ironically such people are on average more tech-averse than a typical "normie", but they often understand the concept that it can be useful to actually put effort into getting something that's not just whatever's handed to you. In a way the insidious aspect of recent tech is the way it's conditioned people to expect that they shouldn't have to think much about how to do things, and to just want "smart" technology that reduces decisions.
My $dayjob involves lots of officework as in paperwork.
You bet your fucking ass I'm using Microsoft Office and reliant on it to a bloody fault. I literally and sincerely can't rely on LibreOffice to open and save documents in ways that everyone else would. If I use LibreOffice then at best I'll embarass myself, at worst I'll waste someone's time and either way I risk losing business for no good reason.
A Microsoft 365 subscription (or even buying an Office 2024 license) is chump change because I know I am speaking Industry, not Libre. Nobody understands nor gives a damn about Libre in the real, professional world because the lingua franca is Industry aka Microsoft Office.
Yup, anyone who suggests LibreOffice (or some other alternative suite) as a sufficient substitute for MS Office is someone who's never actually done real-world work with external stakeholders. Practically everyone else uses MS Office, which means you must too.
I learned this first-hand when I was a grad student and had to write a technical report to submit to our government funders (using their required MS Office templates and all). I used LibreOffice and saved it as DOCX and everything looked great. Then my advisor opened it on his computer running MS Office and asked me WTF was up with all the mangled formatting.
But corporate computing is also suffering from the deteriorating user experience.
I have access to large amounts of harware and software through my employer. And while Microsoft Office is unavoidable, I hate it everytime I open Word or Excel (daily), even if it is on my company machine.
The privacy concerns are arguable even more concerning on a company machine. I wish there was a feasible alternative.
> you can only run CC that sends all your images to them
It doesn’t send all your images to Adobe, it only sends the ones you choose to save in the cloud.
If you want to complain about having to subscribe to the software instead of purchasing outright, then do that. Don’t complain about something that isn’t happening.
Aside from the Cloud bullshit, Photoshop's auto-update is a pain. After some regression I disabled updates right after they fixed it. But recently when I open Photoshop, it's started giving me a nagging popup about being out-of-date. It's done more-or-less the same thing for many years, just take my subscription money and leave me alone!
Your definition of "anyone" is pretty skewed toward the tech-savvy.
Spend some time in the tech support desk of a mobile phone store to get an idea of the general level of technical sophistication of the average person. Average folks are not running containers. They're not installing... anything... except maybe an app from an App Store. Half of them aren't sure what a file is.
But the hardware availability and affordability gives them the opportunity to learn and experiment if they want to.
Even tech-savvy people couldn't do that a decade or 2 ago. Not on a budget.
There are three reasons why this happens. The first is described perfectly by bjornn:
> <...> corporate business world of the 80s, 90s, and early 2000s was just as sleazy and run by assholes as it is today; the only difference is that the technology is finally catching up with the ambitions of said sleazy assholes and allowing them to do what they've been trying to do since the outset, <...>
Second, computers are cheap now. They are no longer for the financial and/or intellectual elite.
Third, there is an overall culture/intellecual/value decline in the Western world. Probably because life after the Cold War was too easy. Now many(?) young people, at least in America, can't write by hand, and men who cut their genitals are not considered to be in need of very serious therapy, and Harvard students support HAMAS, and so on.
> At its core, the PC movement was about a kind of tech liberty—–which I’ll define as the freedom to explore new ideas, control your own creative works, and make mistakes without punishment.
The PC movement of the 90s, where it feels like this author is reminiscing, was about arbitraging the delta between what the tech could do and the literacy and expertise in government.
> But over the past decade in particular, the Internet and digital rights management (DRM) have been steadily pulling that control away from us and putting it into the hands of huge corporations.
This period of computing was notable for how a bunch of nerds figured out how to use new networking technology to stretch/abuse/violate/break copyright and fair-use laws around media.
So many ways to get ripped content then. It was fun for a teenager and felt like a victimless loophole. It both opened a bloom of interesting new creative works, but also decimating existing markets and systems, so that eventually new monopolists Netflix and Spotify could take over.
But the conditions and tools available then never went away, and private personal computing is more available today than ever before. In a few hours someone can read a few tutorials and buy/build and run a whole redudant content serving service for all of their personal needs, while writing or conglomerating a whole system of tools and capabilities to automate or agument nearly anything they could think of.
I agree with the article but it's a shame it just ends with some call for legislation.
The government is not going to save you.
If you want that better future, you need to build it. Look at the Steam Deck for example that goes against the grain on all these matters (right to repair, mostly FOSS, unrestricted, etc.) and it's been a huge success.
We need a mobile platform like PinePhone / Librem to have the same level of success and reliability.
The Steam Deck is a great example, and I attribute its success to riding that line between usability and customization. I can get into Linux and mess around, or I can never touch the back-end stuff and just play my games.
SteamDeck would be a great example if they pushed for GNU/Linux gaming, as it is, it remains to be seen for how long Microsoft will tolerate Proton, and in what way they will respond.
I think people are ready (if not yearning) for a much larger, personal web, built with a different set of incentives. The problem appears to be that the technical class currently lacks the imagination (or more specifically, a kind of epistemological hunger, a craving to deepen the mystery of their craft) to synthesize the new reality of the web with the freedom of the old. I see what the designers are working on, and there's clearly a very large gap of communication between what people want to see in the Web, and what the people in charge of the Web can be bothered to make.
I've been working to build a company on my own hoping to fill that gap - I tell the career SWEs in my social circle "I want to give people the true freedom of creating whatever you want on the web," and I just get blank looks, ha :p
There’s plenty of people with the imagination, skills, and doing their best. If you don’t see them you’re not looking hard enough.
The problem is that those people have families to feed and clothe and housing and utilities to pay for and you can’t expect them to work for free (or a pittance) when they’d need to be paid a high 5 figure/low 6 figure salary to be able to afford their basic cost of living.
Users broadly don’t want to pay and will turn up their nose at having to spend $50 a year on a service or $10 on an app built by honest people with privacy and respect of the user in mind (when they don’t have any issues blowing hundreds of dollars on much more ridiculous things that don’t respect them as customers, but that’s another story…)
And on top of that, how do you make your services known when trillion dollar companies will always beat you in ad spending while offering a free product they have hundreds of people working on?
As an example from just a couple days ago, Read.cv just announced they were shutting down and acqui-hired by Perplexity even though they were a lean 3-person team with a monetized product that their users loved. They were at it for 4 years and couldn’t make it work.
Very sincerely: good luck, I hope you succeed in your goals.
But just as sincerely, if you truly believe the real problem is that the technological class lacks an “epistemological hunger” and not the basic money/visibility issues I raised above, you’re in for a rude awakening.
yeah, tough times, taking home 6 figures with good benefits year after year, stock prices at historic highs, taking their picks from the housing bubble, voting for idiots while the world burns... I don't quite get why you're complaining to me about these things, but obviously I'm not talking about people who are struggling to "feed and clothe" their families?
> you’re in for a rude awakening
I've seen how techies spend their time, I'm not the one in for a rude awakening.
A web browser that has all the shit the big tech companies added to make it hard to create a good web browser would be a good start. A web browser does not need webgl or wasm. I want to log into my bank to see how much money I have.
I work for a solar CAD+CRUD app that runs in browser. WebGL is more than fun stuff. Wasm is becoming The Way to run applications in a portable way in servers.
I want to log into my bank to see how much money I have.
The most infuriating part is that this is perfectly doable with 90s web technology. Even encryption was already available in the form of SSL, and that's arguably the only thing which has improved since then. The majority of technological "progress" has merely been the reinvention of existing technology in more inefficient ways.
Good rant. The answer is decentralization technologies as much as it is anything else. Open protocols that create holistic but freely evolving systems without a central gatekeeper. That's how you compete with the technopolies.
Maybe legislation and culture or something can help also, but it will be most effective if part of that is adopting and spreading the right technology to facilitate those changes.
For a while—in the ’80s, ’90s, and early 2000s—it felt like nerds were making the world a better place. Now, it feels like the most successful tech companies are making it worse.
Don’t forget Zuckerberg, Musk, Bezos were nerds - don’t blame everything on „corporations”. That is also nerds after they got ahold of influence and money - that is how the story ends.
I 100% want the shift towards this as probably everyone in this comment section right here.
But how do we sell to the layman that he is missing something, which he never experienced in the first place? Sadly, I believe we are doomed to be niche.
I actually think going back is a good idea. Throw in an almost free Raspberry Pi Pico, install a 6502-based machine emulator, make it launch at startup and straight away go full screen and you are good to go.
I'm thinking about experimenting that with myself and my son when he is older. But he is of the impatient type so maybe this is a bad idea as vintage computers typically need more focus and research.
Maybe a DOS emulator then. It has better tools and games.
I always thought it would be cool to have a PC that you just turn on and in 200 milliseconds, you're at a Python REPL or something. Like the old Commodore 64 that booted you right to BASIC from ROM. No POST, no beep, no detecting this and initializing that. No fucking splash screen. Just power switch on and BAM, you're at the prompt.
According to a Google search, a typical C64 was on in 3 seconds. Maybe I've got rose colored glasses about how long it took, because my recollection was that it was basically instant.
Hell, my monitor can't even turn on in 3 seconds. You hit the power button and it gets busy doing... something... who knows what... not turning on, that's for sure.
As you can see, it is currently written in Haskell, as a PoC. But I am re-writing it in Rust, for reasons that also include making it possible to do something like you ask for.
I'm thinking what is a best solution for quick game programming. Python still seems to be a little heavy. What do you think? I used to read a few magazines that have Python game program listings which is very cool, but sadly it stopped publishing a few years ago.
our brains adapt for "better" fast - faster feedback loops, crisper images, higher resolution, etc - and it's really hard to go back. I'd imagine it's something around the reward system of the brain. I grew up with Z80s and CRT monitors, and my brain remembers Enduro as a vivid and fast game. Reality is completely different when you try it out for a second, after having lived in an era of iPhones and 8k displays. It's crazy how uncomfortable even a device from 5 years ago feels...
This is what I'm afraid of. Back then I had the patience to tinker with a 8086 IBM PC (to no avail as my father wanted me to study for IOI which bored me out so he never taught me anything about computer) because it was the new thing and had some games I can play. Nowadays my 4 year old son is already hooked to my in-laws' Tiktok. We are trying to pull him off but unless they leave it's going to be difficult.
I don't know what the future lies. I'm fully prepared that my son is going to have zero common interest with me.
Pedantic, but the Pico is a microcontroller that'd be running a very thin emulator and would have no operating system, so no concept of applications, "full screen" or video output at all. A Pi Zero would be great for that purpose, though.
Might be a bit too bespoke, but there's a company that actually repurposes old industrial x86 motherboards and builds old DOS/Windows 9x PCs for modern usage.
I guess I am in the minority and being contrarian again.
I don't have a problem with lock down, repair etc. At least not with the current iteration of computing.
While they are bad, these lock down means less competitions. And having less competition means less choices, etc, all of that leads to my final point which in my view is perhaps the most important, these company create crap products, software and services.
If they had continue to innovate and push I would have less of a problem. Look at Microsoft, and now to a less extend Apple as well. In the pursuit of more revenue they now make crap.
Therefore the more personal computing in my view isn't the benefits listed. It is to keep the company itself honest. To make them aware they need to innovate. To give a damn, to make something better.
Also, looking at abandoned blogs and old photos of people lying next to their computers from the early 2000s is so interesting. It captures a time when people truly connected with their machines and made them part of their identity.
Hmm - rather than identity, I suspect it was more of a marketing trope.
I associate this genre of photo with the photo-shoots with Gates, Jobs and others. All the interviews and full page ads in the 80s 90s had variations of sitting/lying on desks, hugging CRT monitors or the classic folded-arms lean on a CRT from behind.
I don't recall old-school blogs doing this or really having author photos at all (photos on that bandwidth/hosting?!) but I imagine whenever a blogger was interviewed for print media they would lean on the "computer person" standards.
It's a nice article, but like so many I feel like it has a reluctance to address some of the issues head-on. Like this:
> I’m not calling the tech industry evil.
Well. . . why not? I think at this point the tech industry is evil. Not in the sense that water is wet, and maybe not even in the sense that mammals birth live young, but sort of in the sense that ice occurs near the Earth's poles. There are some bits and pieces here and there that don't follow the pattern but they are the exception and they're getting smaller.
That doesn't mean that technology is evil, but the ways its being used and developed often are.
And that gets to another aspect of this that I feel like people in the tech world sometimes overlook when talking about this: enshittification is not a technological phenomenon. It's a social phenomenon that is driven by our socio-legal apparatus which allows and encourages high-risk pursuit of sky-high profits. Corporate raiding a la the Sears debacle, consolidation in grocery stores, even something like tobacco/health or oil/climate-change coverups, all these are forms of enshittification. Tech is the most prominent venue, maybe because it's especially suited to certain forms of vendor lock, but it's by no means the only one.
Enshittification happens because we are not willing to take a sledgehammer to the idea that making businesses bigger and bigger is a good thing. Timid reforms focused on things like data privacy are woefully inadequate. Large companies need to be directly dismantled, wealth needs to be directly deconcentrated, and broad swaths of behavior that are currently happening left and right need to become the kind of thing that will land you in prison for 10 years.
I'm not optimistic this is going to happen without some kind of "hitting bottom", though, whatever form that may take.
Maybe I'm too cynical, but too many people in power directly benefit fom enshittification for anything about it to change. Even just the problem of fixing the housing market while the majority of politicians own several properties is an example of this. There's zero incentive for anything to change.
I'm cynical enough to basically agree. :-) I do think it may change, but that's what I mean about hitting bottom. It may have to get so bad that there is some kind of violent uprising or societal collapse/fracturing.
Just by the by, ironically, I've heard, from my tenuous connections into ideological spheres outside my own, that a decent number of people voted for Trump out of a similar desire to shake the foundations of the system. Of course they've likely been hoodwinked, but I think the opportunity is there for a Bernie Sanders-esque person on the other side to make some change by whipping up a frenzy of rage at the status quo and promising to get out the pitchforks. The question is whether such a frenzy can be accompanied by enough cool-headed calculation to be effective.
It's very easy to make computing personal: unplug it from the Internet. Unfortunately, the options are much more limited and very soon you discover, boring.
When someone can't accept that other people value different things than they do, and that person also has uncommon values, they will spend a lot of time upset that everyone is wrong. They'll look for something that explains why everyone is choosing to be stupid and evil, and they'll try to find that comfort in the worldview that made them upset in the first place, retreating ever further from the chance to get real answers.
When people with that kind of worldview roll their eyes at empathy, or scoff at any need to see beyond their own opinions[0], they are all but guaranteed to seal themselves inside.
FOSS, decentralized, et al. attract a lot of people with those worldviews, and that story is the story of them failing with consumers over and over and over, doubling down on what failed every time.
For me, this was the most touching part. The rest I mostly agree with, but have thought or read many times before.
> And there’s another problem. Very soon, we might be threatening the continuity of history itself with technologies that pollute the historical record with AI-generated noise. It sounds dramatic, but that could eventually undermine the shared cultural bonds that hold cultural groups together.
> That’s important because history is what makes this type of criticism possible. History is how we know if we’re being abused because we can rely on written records of people who came before (even 15 minutes ago) and make comparisons. If technology takes history away from us, there may be no hope of recovery.
History is written by the winner. So I would not say that historical records themselves have a value, but historical studies overall including archaeology, cross-checking multiple scripts from different sources.
So noise has always been there as well as methods do deal with noise.
The author claims, "Internet surveillance, the algorithmic polarization of social media, predatory app stores, and extractive business models have eroded the freedoms the personal computer once promised, effectively ending the PC era for most tech consumers."
I'm not required to use social media and extractive business models. Intenet surveillance is lamentable but I don't see why he thinks app stores are predatory. The PC is still mostly a force for freedom. The privacy losses are more than offset by the gains of communicating with everyone on the planet.
App stores are middlenen who dictate what users get to see and consume, while taking 30% from what's approved. I think thars eye predatory part.
Apple is the much more obvious offender, even for stuff not traditionally stigmatized against. Microsoft struggled to release Xcloud becsuse Apple didn't want a game streaming service on IOS. Meanwhile, steaming music, videos, and anything that works on its purposefully botched internet was fine.
>The privacy losses are more than offset by the gains of communicating with everyone on the planet.
Definitely a contentious take these days, given recent events.
That's kinda what I was thinking too. There is a privacy loss for sure, but the average consumer also gained things for that loss.
Maybe Amazon in 2000 wasn't so icky but there was also no free same day shipping. Apple II could be repaired without "special tools" but those machines were huge, heavy, mostly empty space, and gap and glass alignment was way worse. I wish I could say something smart about Windows 95 but I've worked hard to erase it from my memory, so I can't. :)
Electronics things, just in general, did a lot less in the past. With that comes good and bad.
Privacy is a trade-off and right now the general public doesn't place a high value on privacy so they're happy to trade it away for anything. Honestly I understand it. I'm convinced I'm going to get bombarded with marketing nonsense regardless so I might as well get something for it.
> I wish I could say something smart about Windows 95 but...
Remember how its uptime was limited to 49.7 days because of a timer's numeric overflow (and in something like an audio driver, too, it shouldn't have been system critical). Good times.
A lot of computing in the 90s and earlier was terribly unstable. And that was without considering how prevalent viruses were in the 90s, too.
You're equating avoiding mass-market social media with becoming a hermit on a remote island.
But you're here, saying that on HN.
I've seen people say similar things on Reddit, in IRC channels, on blogs, Gemlogs, Mastodon posts, and other similar venues, without realizing the irony of it.
Bizarre stretch of logic to extract an irony. Meta platforms in particular (facebook and whatsapp) are in many countries an almost exclusive intermediary to any online communication. You are basically incapacitated if you do not use them.
Thats true, but its theoretical. As an individual you frequently face a take it or leave it option. Have you ever tried to move an existing community e.g., from Whatsapp to Signal or can you do anything if an entire country has chosen to reside on Facebook? In the short term, you either accept defeat and learn to love the adtech bomb or you withdraw into the digital wilderness. In the long term... we are all dead.
> Have you ever tried to move an existing community e.g., from Whatsapp to Signal or can you do anything if an entire country has chosen to reside on Facebook?
What does an "entire country" have to do with it? People move online communities between platforms all the time -- and many communities have presences on multiple platforms.
> In the short term, you either accept defeat and learn to love the adtech bomb or you withdraw into the digital wilderness.
I'm just not seeing the argument here. Suppose you've got 50 users on Discord and would prefer to move to Matrix. So you post a link to the Matrix channel on your Discord server, lock stuff for further posting in Discord, and update external links and documents. People do this sort of stuff all the time without being "defeated".
yeah, thats pretty clear. Because you choose to focus on cases where you do have agency to do something, e.g. its my discord and I am moving us to matrix - and goodbye to those who will not migrate.
Now think about an established group where you are a simple member and you say, "hey folks, why don't we move to something that is better for us, no ads, no data collection, etc."? And they look at you with glazed eyes, and... shrug, and that is the end of the conversation. Now what Don Quixote?
> What does an "entire country" have to do with it?
In countries with high facebook/meta adoption if you want up-to-date information about an event or an establishment it may only exist on meta platforms. Only larger entities can afford to have an independent website, and many such sites are typically in a state of disrepair and neglect.
As an individual trying to go against so-called network effects most of the time you have very little leverage. Its really fighting against wind mills.
The previous comment was complaining that we can't improve the situation, as there are no viable alternatives, in a discussion that is taking place on one of the alternatives. That's the irony.
> I'm not required to use social media and extractive business models.
Most people do use them though.
> The privacy losses are more than offset by the gains of communicating with everyone on the planet.
I completely disagree. Most people aren't actually communicating. At least not in any form that matters. The drastic increase in loneliness and depression that correlates with the increase in connectivity should at least show that more social media doesn't mean more happy.
Sadly relatable to a lot of real life. Modern people don't talk "to" each other some portion of the time. They talk past each other in this mimicry of discussion.
We’ve been working on a new kind of home computer for a few years now based around microcontrollers. Unlike a traditional setup, it’s aimed at replacing the traditional light switch to provide environmental awareness and bring families closer together. Although the base OS is not open source, the SDK is totally scriptable, meaning as an owner you will be able to trace and understand the device fully.
Examples of what it can do
- Autonomous lighting with mmWave radar with 180 degrees fov and ambient light sensor
- Recording of temperature, humidity, barometric pressure and VOX to onboard SQLite database at a chosen interval.
- Onboard web server, which serves as dashboard and configuration page.
- Communication platform with integrated microphone (hardware indicator light, off by default) and speakers. I’m also experimenting with talking to LLMs like this.
And many more things. If you’d like to reach us hello [at] sentionic.com
Yes. The thinking was: displays are all around you already. Why not build an ambient computing platform. If it replaces infrastructure in your home it needs to provide enough value. I think we’ve covered that now, but it’s taken a few years.
We have to be cautious about the attention-grabbing stuff that's everywhere now, but we're only human and easily tempted. This is bad for unsupervised children/teens. Even young adults can be pretty immature now.
When I go out for a walk in the forest, I see maybe one or two people walking a dog. Where is the rest of humanity? Watching TV, playing a mobile game, whatever...
I keep telling VCs and Angel investors that now is the time to look for relatively under-valued high growth startups using Reinforcement Learning to solve real world B2B / engineering / logistics problems ...
The balanced NPU/GPU/CPU with close cache ram, such as the recent Lunar Lake chip, coupled with better integrated GPUs in consumer class laptops, and a nice web GPU webgl api .. make for a lot of capability for problems that are well solved by Monte Carlo simulation ... and RL generally.
3D apps are now really doable in browser on current devices... and the browser is a great delivery device for applications, avoiding platform specific installs and dependency hell.
If we wanted to replicate the 90s, 00s, and even the 2010s era gaming experience - mostly single player, no micro-transactions, and so on - how do we do that today? Is there way to discover games that aren't trying to extract as much money from you as possible?
There are actually a fair number of indie games like that. Of course, a lot of them are only available via something like Steam, which is a networked platform that has some dubious aspects. GOG is somewhat better (e.g., you can download offline installers that are supposed to be self-contained and work even if GOG disappears). The harder part is finding out which such games are good. The best way I've found is finding communities of people who play them and recommend them to one another. Of course, those communities nowadays are often on a platform like Reddit or Discord which are themselves in the process of or ripe for enshittification. But for the games themselves, it's possible.
I have taken a further step back and started using an electric typewriter for my thoughts and ideas. It’s been good for me since I can’t switch away to another task without having to stand up and go to my other desk. I have written more down in the last month than I have in the past two years.
> For example, which part of the Apple II was predatory?
How about the price?
A quick googling suggests that it cost ~6,500 USD (in today's money) to buy an Apple II when it launched. Obviously it was a different time, but that sort of price today would likely be called predatory by at least some people.
Personally, I feel passionately about being held back from innovating due to legal and corporate anti-competitive barriers. I feel that these artificial barriers contribute to a less competitive environment and increases the problem of e-waste.
# The barriers put in place by companies to prevent their hardware from being tinkered with
Locked bootloaders and the absence of hardware documentation to base the development of drivers on is an example of this.
This prevents the community from taking over when a device reaches end of life or expanding a device to be more useful/open than the company originally intended.
Examples of this are:
- Apple's M-series Macs/MacBooks. While the hardware is remarkable, Apple's anti-competitive practices manifest in MacOS holding the devices back from their potential. Asahi Linux is an indicator of demand and its success is remarkable given what they are up against. If Apple was compelled to provide reference documentation of their hardware sufficient for driver development then the resulting alternative operating systems introduces competition to an otherwise stagnant market.
- Microsoft's Surface laptops and broadly the new X-Elite hardware lineup shares the same criticism as Apple's platform
- Mobile phones. Imagine an iPhone running Android. Imagine a Galaxy, Pixel, etc running Linux where Android apps are executed within Waydroid containers? Not going to happen because we are either blocked by bootloaders or blocked by a lack of drivers (deliberately hidden by manufacturers)
- Better health trackers. Imagine buying a FitBit, installing an community maintained operating system that has no subscription fees and handles health inference through transparent algorithms that can be contributed to by academics around the world.
# No "right to repair" software as it's practically illegal
It's virtually illegal to repair software. Decompiling software and fixing it, even if it's end of life, can land you in court.
There are so many software projects out there that I would personally love to revive. Think of games like Heroes of Might and Magic 3
Anyway, I've been ranting too much on this topic but you get the idea. I wish governments would grant people protection to tinker/improve hardware AND software and compel corporations to provide sufficient documentation to practically enable that.
In the 90s I had no DRM or walled garden OS, and I couldn't stream movies or do online banking from my phone.
In the 2020s I can do both of those things from walled garden OSs with DRM. The option to use an offline OS is still here too, although I can't do those things with it. That's a step forward?
- legalizing DRM circumvention (repealing DMCA section 1201)
- comprehensive privacy legislation (US adopting GDPR+)
- right to repair legislation.
- antitrust action and enforcement against tech companies
As he notes, these are currently non-starters because of legislative and regulatory capture.
But even if they were implemented, laws and regulatory enforcement might not have the desired effect. Surveillance capitalism, adtech, and data brokers seem to be surprisingly GDPR-resistant, and California's CCPA seems to have had minimal effect. Right-to-repair is limited by miniaturization and component integration, and the result seems to be Apple's impractical and expensive repair kits. Antitrust seems to be ineffective (see IBM, Microsoft.) Even the DMA, carefully crafted to target Apple, Google, Meta et al. (and to extract billions from them for noncompliance) doesn't seem to be affecting the dominance of those companies just yet.
there's only two things that can change how companies operate: stop buying their products or services. Another, keep informing people about what they're actually buying, since companies like to keep that hidden.
Until a lot of people start questioning their habits as consumers, companies won't change.
Things change. Some for the better, others for the worse, but not all these changes are bad and some are inevitable, and it's not s if the past was all roses anyway.
What is a coin in an arcade videogame if it's not a microtransaction?
Software as a service is just different, and it's not all bad. You have automated upgrades, a consistently funded developer that can better plan and deliver updates, if you only need the software for a short period of time it can be cheaper. Frankly the packaged software approach was a kludge due to the technical limitations of the time. Now if big releases make sense developers do that, if incremental updates over time provided as a service make sense, they can do that.
Most of the section on what we can do about all this is focused on stuff that didn't exist in the past. The internet and online services, social media. Going back to the past wouldn't be to do those in some ideal way that used to exist, it would mean not doing them at all. Sure.
There is no ideal past to go back to without pulling the online plug. However, that plug isn't going away, and we don't actually want it to. The "How we can reclaim control" bit at the end is mostly correct, but it's really about coming to grips with managing the new reality, not going back to a situation we've outgrown.
On my days those microtransactions were 20 escudos, 25 escudos, 50 escudos, 50 euros (great deal with this one, 1 euro == 200.482 escudos), 1 euro, eventually arcades died after this.
Not who you replied to, but one view might be something along the lines of "We can't break up these American tech monopolies because at least they're American. If we break up these companies European or - worse! - Chinese or Russian companies will use their outsized scale to simply replace the American companies we kneecapped and now we're worse off".
That is, some people might see it as a lesser of two evils situation where you're choosing the domestic monopolist rather than the foreign one. I think there are certain worldviews where this is worth enough to sacrifice privacy, competition, etc.
A depent system is a stable system . If the population has diabetes, it can not riot. If the computing can not be done without a network , you can not plot any systemic change revolution.
One must stop seeing things from a technical tradeoff perspective and start perceiving them from a political stable due to hostages perspective. The arguments make sense because they must.
Just as your iPhone or Android sends GB of data back and forth without your knowledge or approval, our dear friends in Microsoft want a piece of that pie.
Most people are already consumed by the 'machine'. Those who resist will stick to Win10 as much as possible (I got a 2014 laptop that runs Win10Pro and runs perfectly) and my gaming desktop doesn't need an upgrade for another 10 years.
All we need is selective updates, run privacy/blocking tools, change our hosts file, run a firewall (WindowsFirewallControl) on MediumFiltering, etc.
Unfortunately only few can do such fine-tuning to their PCs (a few thousand people in the 8bn people).
The rest will be consumed by the 'machine'. I've mentioned on another topic/comment. We are cattle. We push back very rarely and on very few topics.
I am a Gibson-kinda-guy. Take the $200. Give me the OS. Stay away.
While there are nice ideas in general, too much of it is looking at the past with rose colored glasses. And this makes the argument to go back to these ideals kinda icky. If we really want to do something, we should have a real critical look at why we're here in the first place IMHO, and this isn't it.
> For a while—in the ’80s, ’90s, and early 2000s—it felt like nerds were making the world a better place.
The nerds (dare I say "we" ?) made the world a different and more connected place, with clear evolutions in regarding finance, productivity and science.
Does it make the world a better place ? Did the productivity and finance improvements bring a better and more welcoming society for instance ?
It can be argued either way, but that question can't be glossed over as a given IMHO.
Then there is no reflection on how computing has become a commodity. It still needs more freedom and control, but these two ideals don't mean the same thing if you're a 30yo single DevOps engineer or a 50yo at home parent watching over 5 kids. Both need computing, but the purpose and intricate needs are completely different. Focusing only on one because it's easier kinda misses the point IMHO (and we're back to the role of technology and how exactly it makes the world better)
I admit that I did not RTFA, but for a while I've been thinking that with the ascendancy of dumbed-down, peck-to-get-your-reward touchscreen devices... computers are returning to the domain of computing enthusiasts.
I remember taking my Atari 400 on family vacations, reading Compute magazine by the pool and learning a shitload about programming just by reading. Oh and yes, I did ride my bike and have friends and play baseball and go to the beach. Computers were just another fun thing to do. And eventually I put what I learned to use at Apple and several other big-name companies.
Today, the dominant platform (Windows) is an execrable, intolerable shitshow of anti-user arrogance and aggression and abuse. Apple's platforms are better, but I have little confidence in how long that'll last. In the end, I guess we're going back to "nerds" using real computers running Linux, and pigeons pecking at big colored buttons on touchscreens to get their reward pellets.
Very good summary of all the things that have gone wrong with tech (and this is a long and growing list). The part on how to recover from this unfortunate mess is a bit handwavy though.
Not that there are ready made solutions that are being ignored, but if we are going to move beyond conceptual statements it will require some pretty potent medicines that can start fighting the cancer by taking it head on.
The enshittification is now in an advanced stage and the billions of addicted masses an enormous inertial weight. Witness e.g., the grotesque politics around the tiktok non-ban.
Imho a key ingredient is to ditch the focus on the "personal" and start thinking of "interpersonal computing" (just made that term up). Basically personal computing that is network-first, web-native. The owner-operator is empowered to join the matrix, find their way around without gatekeepers, connect with agency, exchange, filter, process with helpful and transparent algorithms and get on top of the informatiom firehose. Nothing radically new in terms of hardware or software, just rearranged furniture to serve citizens, not some digital oligarchy.
The huge success of social media is because it tapped into the immense sociability of our species. Somehow we need to reclaim that trait for the good side of technology, with devices and software that are actually desirable without being leeches that suck society dry.
Yeah people often say "you love things that are invented before you're 35, and you hate the things after that".
But do young people really love this hyper-commercial internet these days? All the subscription services? The empty social media content?
I do see what they mean a bit because I'm pretty sceptical of AI, though I did set up my own server to experiment with it in a way where my stuff doesn't end up in the cloud.
Young people use social media as time-wasters, self-promotion avenues, or ways to boost their business and use group chats to do most of their actual communications. Techies really love to get angry about social networks but the reality is younger folks know how to treat social media the same way older people know to take random internet comments with a grain of salt.
The techie web isn't coming back and wishing it so won't do so. You can always just drop into an IRC network or Mastodon server with other nerds, but the days that everyone on the web was a techie nerd with general-purpose computing interests is long gone.
Well I don't really get angry about socials. Just don't use them.
> The techie web isn't coming back and wishing it so won't do so. You can always just drop into an IRC network or Mastodon server with other nerds, but the days that everyone on the web was a techie nerd with general-purpose computing interests is long gone.
Well in that way it's still there. It lives on here on HN and the other places you mention. Probably as big as it was in those days. I don't think it's really gone. Just the internet grew around it with all the commercial BS and big tech companies viewing users as products.
Also, us techies manage to avoid the worst of that with adblockers, pay wall blockers, pirate video downloads, self hosted services etc. I probably see only a handful of ads every day. Even my phone blocks most of them. I also have custom scripts for the sites I frequent the most to make them more info dense like hacker news (and remove most of the big photos)
Honestly, techies like us will always have an edge on how we produce and consume digital information. Folks who do home maintenance like jobs can often do repairs and customizations on their on homes or rental properties much cheaper than the average person. Plenty of other jobs give comparative advantages. These comparative advantages are IMO a fun output of human diversity.
> Every generation looks back and says, “Things used to be better,” whether they are accurate or not.
Yep, earlier eras of computing were characterized by more user control, less surveillance and fewer predatory business models. Yet it’s important not to overlook the progress we’ve made. Modern tech is vastly more powerful, accessible, and interconnected than what came before. And in a case of a tech world nostalgia should inspire action for improvement
The political approach in the article might help a bit in forcing some design changes across the hardware and software industries:
> "We need comprehensive privacy legislation in the United States that enshrines individual privacy as a fundamental right. We need Right to Repair legislation that puts control of our devices back into our hands—and also DRM reform, especially repealing Section 1201 of the DMCA so we can control the goods we own and historians can preserve our cultural heritage without the need for piracy."
Laws alone won't be enough - there's a need for new design approaches for production devices and systems. For each of the above:
-Expanding high-speed internet to all regions of the country is a positive, but privacy is limited because metadata is visible, and if we assume all nations are tapping into the trunk of the internet and collecting everything that transits their systems, this means strong encryption should be the concept around which all communication systems are built, so that at least the content of the messages can't be read.
-Right-to-Repair should extend to device design goals in which maintenance and replacement of components is intended and user alterations and upgrades aren't actively blocked. Batteries should be relatively easy to replace, etc.
-For cultural history preservation, allow archivists to bypass DRM and store offline backups of materials. Also make it easy to become an archivist and build communities of archivists.
1. Basically all the examples given in the article are of tech that is better now:
- Nintendo Switch games don't have microtransactions now
- VHS are unplayable now because people no longer have the machines. You can still buy anything on Blu-ray and own it forever but most people prefer the convenience of not needing a machine and disc collection.
- On Amazon now, nearly literally anything you could find in a box store is available, and you can have most of it in 1 day. Just buy from reputable brands or Amazon itself and you will be fine.
- There are so many benefits of a smartphone -- maps, internet browser for emergencies, music streaming, audiobooks, 2-factor. Flip phones are still around but no one uses them
- Google search is barely even needed now because of chatgpt, which also doesn't have ads and seo trash
- Ubuntu is better than Microsoft 95 and doesn't track you
- Social media is worse now, I'll concede here.
2. The article seemingly champions personal liberty and then has a section titled "How we can reclaim control". How about we let consumers decide what they want? If you don't like microtransactions don't buy games with microtransactions, eg.
3. It's ironic that the community run by the premier tech vc seems so against capitalism.
The embrace of Apple’s ecosystem shows that many people, especially those in tech, prioritise this aspect less. I just wish people would stop doing mental gymnastics to "justify" their choice—not because I think it’s hypocritical, but because I believe there’s no wrong choice here, only personal preference. If you feel the need to justify your decision, it might be because it doesn’t fully align with your true values.
The Californian Ideology is either pro-corporate or corporate-naive. Either technology itself is deterministically going to democratize things (like the Internet was an auto-democratic force, they thought—or now AI is going to help liberate everyone, hah) and/or you just don’t need to worry about private interests.
But private (corporate) interests didn’t just come out of nowhere. The Internet was created by the US state (federal) sector and then handed over for commercialization around the Clinton era. Should anyone be surprised about the turn of events?
Now the author, just as naive as the rest, talks about reigning in corprorate interests by enacting laws. And who is gonna make the politicians do that? The rich control the government. Many of them are the tech-rich.
Biden said in his farewell address that he was worried about a rising Tech Robber Baron era. Yikes. Someone should have done something about that. Like the departing president, perhaps?[1]
All of this was mainly done by the rich. Not by nerds (because not all nerds are rich). But the less wealthy California nerds who bought into the Californian Ideology helped it along.
[1] It’s not that he did nothing. It’s that he did a half-spirited job of it. If he really meant and was motivated by his own words he would have done more.
1. They are doing a little bit of revisionist history, as the industry was fiercly capitalist and proprietary at that time.
2. This topic really does feel rather beaten to death and I think the target audience is not getting any new information.
Speaking specifically about the revisionist history part:
> At its core, the PC movement was about a kind of tech liberty—–which I’ll define as the freedom to explore new ideas, control your own creative works, and make mistakes without punishment.
Was it? The PC has its roots in IBM, and it became the target product to clone because, since the project was something of a sidenote to IBM's main business, IBM was too cheap/lazy/wahtever to develop proprietary parts. They cobbled together a system that was easy to clone, perhaps entirely by accident.
The PC wasn't a universal compatible open standard because of tech liberty, it was a compatible standard because (among other reasons) Microsoft introduced a new OS business model where PC clones fighting each other over low margins benefitted Microsoft. Before Microsoft DOS, each PC was its own moat with its own hardware, its own operating system, and its own proprietary software. Microsoft made everything easy and wonderful as long as you kept using Windows.
Apple operated back with the OS/hardware/software moat back then and that's essentially how they continue to operate. They are the only company that survived after that era using that fully proprietary business model and still operates that way.
As another commenter pointed out, Nintendo was ruthless about hardware DRM and was a full blown monopoly in their heyday. That's why your parents always call it "Nintendo" instead of "video games," because there was no other vendor anywhere near as successful at that time.
Another example of a lack of tech liberty, "Don't Copy that Floppy" was all over the place, a phrase that I've heard injected into Computer Chronicles episodes. Companies were doing all kinds of things to try and prevent you from inspecting, modifying, and copying their software.
The Linux kernel didn't exist until 1991, and most UNIX flavors were proprietary.
The only reason that era didn't have invasive privacy and data extraction problems is because it wasn't feasible, not because it was an era and movement that had excellent tech liberty.
Compare that to today, and it's actually today that's much more of an era of personal computing freedom. I certainly wasn't using an open source web browser, open source IDE, open source server operating system, open source graphics driver, open source PDF editor/viewer, or much other open source software in the 90's. It would have been unthinkable back then to use an open source program to do something like 3D graphics rendering, that would have been reserved for 5-figure Silicon Graphics workstations. And good luck replacing Adobe with something open source.
Hosting a major commercial website for a fortune 100 company on an open source operating system? You would be laughed out of town.
Most older closed systems were vanquished by the end of the 80s, early 90s.
Unfortunately, the mobile revolution didn't work that way. Regular folks don't care about open, flexible, and cheap. Only convenient and cheap. The gravity of those folks has led us here.
I will however observe;
None of the supplied examples showed any form of network effect. It was all stuff you did at home.
Today, there are certainly options for personal computing for most everything- as long as network effects are not in play.
Those options may not be as convenient, as cheap, or as feature-rich as the invasive option. That's fair though - you decide what you want to prioritize.
Network effects are harder to deal with. To the extent that in order to be in community you need to adopt the software the community has chosen.
Not surprisingly, software producers that can build-in network effects, do so. It's excellent from a lock-in point of view.
The title of the article is perhaps then ironic. It's trivial to make computing personal. All the tools to do so already exist.
The issue is not Personal Computing. It's Community Computing.
Example of something I wrote then: https://lists.squeakfoundation.org/archives/list/squeakfound... "I guess I always saw Squeak's purposes as a bit broader still, relating more to "individual and group empowering transparent ubiquitous computing". For example, I see Squeak concepts as providing an OS neutral platform for various languages (Python, Lisp, Forth) [an open source .NET] and I don't see how that is going to fit into a mission statement that links to Squeak as is with a perception of a Smalltalk environment only. Granted, most people joining this list may have no direct interest in this, so that is not to say the purpose of the organization should necessarily incorporate that, especially if it has detrimental effect by making things too overly broad. Here's an alternative -- anchor the effort on one side by Squeak and have it open ended on the other. For example, "To assist the evolution of a individual and group empowering transparent open-source ubiquitous computing platform starting from the initial Squeak code base". I don't think this is out of character with for example where Alan Kay has wanted to go with Squeak in regards to a "Dynabook"."
Smalltalk (and Squeak and its derivatives) of course can empower groups like with, say, the Croquet project. What I was talking about then was mostly about emphasis around a common shared purpose (like the Chaordic Commons approach suggests) in the context of creating a formal "foundation" organization. https://en.wikipedia.org/wiki/Croquet_Project
If I squint, it sort-of appears that JavaScript/asm.js (supporting a variety of transpired languages and running in every browser and communicating in a variety of ways) is a sort-of realization of part of that vision.
It's not the 2011 iOS anymore, if an app today hides its video projects from the user, it's entirely the app fault.
I imagine, for example, that if the internal project files for a popular video editing app were accessible, we’d see competing and/or open source apps emerge that could parse them, were the original app to become suddenly unavailable. Instead they’re just lost because your phone won’t let you access them.
If you chose to use iAnything then it's a bit late to start complaining about lock in now.
However, sure, lots of users chose Apple knowing exactly what it is. Apple's not going to change since their model clearly appeals to lots of people.
If you don't like Apple's model, then don't choose Apple devices. What everyone else chooses is somewhat irrelevant to you. (Other than network effects noted earlier.)
I can hack up a "device" with a raspberry pi zero or whatever and call it "HaxyDeck" and claim it is all open to anyone who wants to tinker with it, but at the end it'd be irrelevant because only me (and perhaps a couple other people) would have it. The aspects you want to ignore (number of users, be something other than Apple, what others are using) would actually affect my use of HaxyDeck directly: since i'd be the only one (or one among a tiny number) using it, i'd be the only one having to make it do things i want, it wont have software from others, it wont support software other people may want to use for communication, software some services that theoretically have nothing to do with phones or computers (e.g. banks) wont work because HaxyDeck's userbase is irrelevant for them, etc. All of these have to do exactly with what others are doing.
Basically see how all the non-Android Linux phones (like PinePhone) are faring. You can't just ignore what effect having a large user base some platform (be it a device, an OS or even service) has and say "just use something different".
They are hardly irrelevant, especially if you like money.
For example, the apps from Omni do this, as do obsidian, Linea…
Let’s assign the blame where it should be here.
Obviously the blame lies on Apple for locking away your device's contents from you. Developers should not be able to have more control over what you can access on your device than you do. Even if they make bad choices (like making accessing the files hard) it should be you who has the final say, not them.
Apple making it possible for developers to make bad choices and go against users' control over their own devices is to blame.
The internet is where I get ideas and news (and some of the above content — magazines as PDF for example).
So I guess the "network effect" I keep to as much of a minimum as I reasonably can?
(EDIT: oh, I don't really use my phone except as a camera and road navigator. I would love to have a completely offline map app that was decent.)
That's what's wrong with the various "federated" social networks. They lack a network effect that makes them grow.
That’s it. Both Open Source and Federated thinks that distribution gateway federation something is something a user must know and be fond of. The user not only couldn’t care less but actively refuses this complexity because they cannot trust their own uneducated decisions. They go for the nearest CorpThing that seemingly just works for everyone and decides everything for them after they tap “Next” a few times.
So I managed to log into one of the 3 accounts I'm sure that I have still. And I'm a software nerd who makes "educated" decisions all the time around this stuff.
Protocol People really care about that, and you know what? It becomes their network effect. But it is a self-selecting network. The nature or design of what effects and attracts the network is the same mechanism for limiting its size.
TikTok, Instagram, Snapchat all focus on things that other people really care about—namely video creation, photo curation and ephemeral small network cohesion. and those focuses attract other userbases.
Probably, there's a lot more people who want to create and watch short videos than there are people who want to nerd out over what their 1/10,000 servers' community rules and protocol settings are.
The federated stuff is unsuccessful not just because of protocol stuff (if people really wanted, they would find a way) but because it's not cool yet.
The only reason people go on those networks is to try their luck at popularity and find a way to cash out in various manners. Other than that, there is not much point going on there, why would you waste time broadcasting all kinds of things you do instead of just doing more...
We only think of computing as "personal" at all because of that brief period in the 70s when very simple toy computers, just powerful enough to run a spreadsheet and play some basic games, became affordable.
But computing was invented to solve wartime problems of various kinds, including ballistic calculations, warhead design, cryptography, and intelligence analysis.
Almost immediately it moved into corporate accounting and reporting, and commercial science and engineering.
It took thirty years for it become "personal." Its roots are corporate and military, and it was never - ever - suddenly going to give those up.
Worse, a lot of open/free/etc "solutions" are built by people who like tinkering, for other people who like tinkering. That's fine when you're making an OS for a web server, but a disaster if you want technology that's open from the POV of the average non-technical user.
You can just about, now, start to imagine an anti=internet which is distributed, secure, non-corporate, and controlled by ordinary non-technical people telling smart agents what to do.
That might, just about, with many caveats (it's not hard to think of them), become a technological solution that builds a true decentralised network.
But for now we're stuck with corporate centralisation. And that's not going to be fixed by going back to 8-bit micros, or with a Linux phone.
The federated networks I am part of are pretty small and we have a lovely time sharing diverse interests, getting to know each other and even disagreeing sometimes without the blind hate, persistent negativity and gotch'a seeking you typically find on places like Facebook, Twitter and Reddit. Too much growth too quickly would destroy that, turning those small federated networks into another cesspool of bad behavior.
However, I am open to hearing why people disagree. My personal experience drives my opinion, so ymmv.
I personally switched to Lemmy from Reddit after the API debacle, and I've found it to be an extremely compelling platform exactly -because it was federated. I can curate my feed from hundreds of large and small instances with nary a corporation in sight! It's self-hosting as far as the eye can see, yet it has enough interesting content and discussions to keep me coming back, without any ads or algorithm trying to manipulate me.
It feels like 90's internet full of webrings, and it's glorious.
On the other hand the pitch to get people to join is weak. I don't pitch it to my friends because (currently) its a pretty poor experience compared to what they are already using.
I don't pitch it to my friends because quantity invariably destroy quality, or at the very lease hide it behind a huge pile of dirt. I don't pitch it because people who are interested in a better internet already care and know how to find it. I don't want to ruin a nice well behaving network.
At some point of a network popularity, it feels like there is an influx of people who want to talk to you but lack reading comprehension to read your answers. Or maybe it's specifically that every "become popular fast" algorithm tries to repeatedly throw you to them.
Curating a corner of web for yourself takes time and effort, and if a social network popularity outpaces you, then you just can't do that.
Most people's primary, if not only, computing device is their phone - which at the same time is probably the most restricted device.
And if you wanted to build your own and connect to the mobile network - it's considerably harder than doing the same for a traditional personal computer.
It's interesting there is a lot of agreement. In a way I'm surprised because I often get the impression a lot of people here have pretty well drunk the Kool-aid of corporatism.
What would be actually surprising is to read a full throated defense of modern tech and the companies that build it, and then see an HN thread full of agreement. It's certainly possible, I'd disagree with almost everything in the article. But the sort of people who disagree tend not to waste as much time on HN as me :)
"At its core, the PC movement was about a kind of tech liberty"
There was no such thing as the PC "movement". Personal computing was a market driven phenomenon in which competition drove the price of computing down far enough that people could afford to have one at home - that's it. It didn't represent any particular philosophy of society either. A microcomputer in the 80s was one of a wide mix of competing manufacturers, all of whom were much more closed than a modern computer. Proprietary hardware and software ruled the day. DRM was widely used in this era, including "hardware" DRM like code books or oddly manufactured floppy disks.
By the mid 90s IBM has fluffed its control over the PC platform, so hardware is at least getting more open and interoperable and you can "control" your device in the sense of adding more RAM or extension devices. Pretty useless to anyone who isn't a HW manufacturer but nice in terms of better enabling a free and competitive market, continuing the downwards pricing pressure.
But open source operating systems barely existed. Linux was just a few years old and most of the world was connected to the internet by a modem if at all - Windows 95 didn't even come with a TCP/IP stack so unless you happened to work at a university or other org with plentiful bandwidth, and have the time and patience to compile a lot of kernels, it was basically not possible to obtain an open source OS at all. DRM was still widespread, now with exciting things like USB dongles and garbled CD-ROMs.
The world this guy thinks existed never did. To the extent there was anything special about the microcomputer, it was that aggressive market competition made previously expensive devices cheap enough for people to buy at home. Nothing about this was a social movement though, and nothing about it came with any particular ideology of freedom or control. That's why words like "freedom" in the software context are indelibly associated not with the PC pioneers like Bill Gates or IBM but rather with RMS, who didn't develop for the PC at all. He was writing stuff like emacs and gcc for the existing proprietary UNIX big iron of the time, which were fully proprietary.
Arguably the modern computer is more open, more free and more hackable than any previous time in history. You can compile and run an open source web browser, run it on an open source OS that's booted by an open source BIOS, on an open source CPU, speaking openly documented protocols from A-Z. I don't remember any of that even being imaginable in the 80s or 90s.
Wanting tech companies to be regulated more in this day and age of such extreme tech behemoth domination is left-wing activism in the same sense as (not being a Peter Thiel-style maniac) = left-wing.
[1] https://news.ycombinator.com/item?id=42769886
[2] A tech-specific offshoot of the half-a-century long propaganda campaign to associate “liberty” with “capitalism”
In the past you could do almost anything on a personal computer, it was generally about as fast as mainframe or high end workstation.
Training large AI models is currently impossible for most home users. They just do not have the processing power.
For all the examples mentioned in the parent article, PCs were significantly under-powered compared to workstations, much less main frames.
An explosion of hardware development between 2005 and 2020 has lead to an era where hardware outperformed software needs. Now software is catching up.
But there have always been use cases for high end hardware, and always will be.
video production, climate simulations, pdes, protein folding, etc etc
There were of course "home computery" phenomenons with network effects: IRC and Usenet, for example. There are several reasons why they've fallen out of fashion, but corporations shepherding new users into silos is surely a big one. It's a classic tale of enthusiasts vs. the Powers That Be, although the iteration speed and overall impact is perhaps most noticeable in digital technology.
Perhaps we were naïve to think we'd be left alone with a good thing. I too hope for a comeback of "personal computing", but in every scenario conceivable to me, we end up roughly where we are now - unless also re-imagining society from first principles. And if we do that, the question is whether personal computing would have emerged at all.
Most people I know literally still to use the lowest common denominator of communications because corporates have managed to screw up interoperability in their land grabs to build walled gardens. The lowest common denominator in my area is emailing word documents or PDFs around. Same as we have been doing for the last 30 years. The network effect there was Word being the first thing on the market.
All other attempts have been entirely transient and are focused in either social matters or some attempt at file storage with collaboration bolted on the top. The latter, OneDrive being a particularly funny one, generally results in people having millions of little pockets of exactly what they were doing before with no collaboration or community at all.
If we do anything now it's just personal computing with extra and annoying steps.
And no, 99% of the planet doesn't use github. They just email shitty documents around all day. Then they go back home and stare at their transient worthless social community garbage faucet endlessly until their eyes fall shut.
No problem at all.
If you don't want or need network effects then don't use them.
Yes, the point is, it's evil to do that.
I also write this on a Mac, where I'm watching with sadness the formerly great company being run by bean-counters, who worry about profits, not user experience. The Mac is being progressively locked down and many things break in the process. Yes, it is still better than Windows, where apparently the start menu is just advertising space and the desktop isn't mine, but Microsoft's, but the path is definitely sloping down.
It's just sad.
I don't know what else to write. There isn't much we can do, as long as everybody tolerates this.
My prediction, is that, in the not too far future, perhaps 20-25 years, with the "blessing" of national security, ads business and other big players, devices will be further locked down and tracked, INCLUDING personal computers.
A lot of people already don't own a computer nowadays, except for the pocket one. In that future PCs, if they still exist, perhaps are either thin clients connecting to the vast National Net, where you can purchase subscriptions for entertainment, or completely locked down pads that ordinary people do not even have the tool to open properly. Oh, all "enhanced" with AI agents of course. You might even get a free one every 5 years -- part of your basic income package.
They won't make learning low level programming or hardware hacking illegal, because they are still valuable skills, and some people need to do that anyway. But for ordinary people it's going to be a LOT tougher. The official development languages of your computer system are some sort of Java and Javascript variants that are SAFE. You simply don't get exposed to the lower level. Very little system level API is going to be exposed because you won't have to know. If you have some issues, submit a ticket to the companies who program the systems.
We are already halfway there. Politicians and super riches are going to love that.
Also, you cannot experiment in a safe environment. Safe environments are adequate if your are infantile. But you stay that way if you don't get freedom.
I don't know if this will be effective in any way, but I've decided to start hosting services for my friends and family out of my closet. It seems that money destroys everything it touches, so it feels nice to be doing something for reasons besides money.
My friends and family are not particularly influential people, but I think it'll be nice to create a little pocket of the world who knows what it was like to not be exploited by their tech.
As for operating systems, I’ve been daily driving Fedora 40 and now 41 with gnome environment and it’s been the best OS experience I’ve had yet, I haven’t had to open terminal once to configure a single thing, all my apps are installed from the software “store” GUI and I’ve got the sleep and wake behavior all dialed in the way I like.
It runs equally well on a 2014 Intel Mac mini and a 2024 Mac Studio via Asahi Linux, which was also a super simple install process (uninstalling it and reclaiming the disk space required me to reset the whole drive but pretty sure that was my fault for deleting the asahi partition the wrong way)
Anyway maybe give it a shot, and self hosting things is only getting easier, Jellyfin and Immich have changed my life, at least the virtual side of it :)
I know the usual comments will crop up but, now if ever is the best chance to give it a try, at least as a semi daily driver if you still want to play games or such.
I used KDE (24.04) for a while now. Also used Linux 2000-2008-ish. Have read APUE.
When I win-left/right a window and then resize it, then close it, now win-left/right always resizes it as previous one. There’s no way to reset it to 50:50 (unless logout).
Notification center is obnoxious. Regular “Program stopped working” with zero details. Why do I need this information. Copying a folder in dolphin pops up a progress notification that doesn’t go away when finished. You did something, expect a notification to pop up. You did nothing, still expect it.
Windows either steal focus or fail to steal it when needed, depending on your settings and astrology. Software update nags you to click update, then you click it, then it downloads something for a few minutes (you go doing your things), then a sudo password popup fails to get focus, and it all crashes after a timeout. Or things will pop up right in your face. VNC connection closed? Yes, bring that white empty window to front asap and tell “Connection closed” with an OK button.
Start menu was designed by an idiot, a wrong mousemove and you are in a wrong section. Sections reside exactly on the 80% of bezier paths from start menu to the section content and have zero activation timeout. So you have to maze your mouse around to avoid surprises. Logout, restart, shutdown, sleep buttons in the start menu all show the same fullscreen “dialog” that requires you to choose an action again. What’s the point of separate buttons even.
I could go on about non-working automatic vpn connections, mangled fonts/dpi/geometry if you vnc into a turned off physical display, console that loves printing ~[[A half of the times when you press an arrow. And so on and so forth, the list is so big I just can’t remember it.
Idk how Linux users are using Linux so that they do not meet any issues.
I’ve given up on Debian a dozen times but feel I might actually have a future with Fedora.
We open systemsettings and change things. We also use a launcher, not the start menu.
win-left/right: an unresolved issue in KDE tracker that seems to remain so.
notifications: enter every program in a long list and change settings (most realistic to change).
focus issues: tried all levels (named "low" to "extreme" without any explanation, none work as intended.
start menu vs launchers: I don't find creating a launcher for every app I have reasonable. Fix the menu then call it "a daily driver".
vpn autoconnect: is on, doesn't work.
mangled fonts/dpi/geometry: I'm all ears how to fix that.
~[[A: would love to know.
You can use xmodmap to remap any key to anything you want.
> I don't find creating a launcher for every app I have reasonable
?????????????????????????????????????????????????????????????????????? Just press alt+space and type like everyone else.
> mangled fonts/dpi/geometry: I'm all ears how to fix that.
Use a better font. For some reason they all use the android noto crap by default, so that must be changed if you want to see letters.
Use a better font
I think you misunderstood this. While the physical display is "on", it all looks correct. When you turn it off and vnc into the main X display, it's all mangled, regardless of the font. The order of turn off / vnc into doesn't matter either.
Also, I don't want to globally disable notifications. Maybe I just have to globally disable graphics? That would indeed solve many issues with linux desktops.
Feel free :)
Cause Linux Desktop is very far from being ready to use, complete or bug/stupidity-free. Cause you didn’t address even a half of the issues here and was only picking on trivial functions that you understand and found coping workarounds for. “Reject any solution”, lol, I have yet to see any solution apart from “turn it off completely” or “don’t use”. I did all my due diligence, my complaints are not even remotely lazy.
And I want other people to know that, before they buy into fanboy advices from people who seem to either barely use anything in the OS beyond a browser, or are just lying to themselves. Answers like these speak even better than any of my complaints here could. Feel free to advise next time, I’ll be there as well.
Every discussion like this works the same. You mention a set of real use case issues and ask what to do, and all the advisors suddenly appear too busy to answer, with a rare exception of the most defensive deniers.
As I see it, one way to phrase the problem is that Linux (along with its ecosystem) isn't really user-focused either. It's developer-focused. Tons of weird decisions get made that are grounded in developer desires quite removed from any user concerns. There are a lot of developers out there that do care about users, so often you get something decent, but it's still a bit off-center.
A great example is Firefox, which decided to break all extensions for developer-focused reasons (i.e., "too hard to maintain") and continues to make baffling UI changes that no one asked for. Another obvious example is the mere existence of various open-source software that is only distributed in source form, making it totally inaccessible to users who just want to click and install.
But mostly you just see it when you file a Github issue and a contributor/developer responds with something like "Sorry, that's not my priority right now". You see it when people reply with "PRs welcome". There is still a widespread mentality in the FOSS world that people who want features should be willing to somehow do at least part of the work themselves to make it happen. That's not user-focused.
Don't get me wrong, there's a ton of great open-source software out there and overall I think I'm happier with it than I would be with modern Windows (let alone MacOS; whether I'm happier than I was with Windows pre-10 is a tougher question). But basically what I mean is there are developers out there writing proprietary software who will implement features they actively dislike because they are told that users want them; that mindset is not so prevalent in the open source world.
That was only a problem for extension developers. Users weren't really impacted as developers built new versions of popular extensions.
> and continues to make baffling UI changes that no one asked for.
No one ever asked for the iphone/smartphones, yet people buy them instead of dumb phones. My firefox has evolved a bit over the year if I look at former screenshots, but everything happened so gradually it has never been a problem for users.
And all kind of software do, not only FOSS.
> Another obvious example is the mere existence of various open-source software that is only distributed in source form, making it totally inaccessible to users who just want to click and install.
There are so many apps available through the software repos and flatpak packages that users who aren't into building a software from source shouldn't even feel concerned.
> But mostly you just see it when you file a Github issue and a contributor/developer responds with something like "Sorry, that's not my priority right now". You see it when people reply with "PRs welcome". There is still a widespread mentality in the FOSS world that people who want features should be willing to somehow do at least part of the work themselves to make it happen. That's not user-focused.
Prioritization is happening everywhere, in proprietary software too. Dev teams work with finite time and resource constraints.
PRs welcome is a bonus, not a con.
> But basically what I mean is there are developers out there writing proprietary software who will implement features they actively dislike because they are told that users want them; that mindset is not so prevalent in the open source world.
Mostly only when they are paid for it. And some proprietary dev also don't implement stuff they don't like. I don't think you can generalize, this behavior is not based on the choice of license.
Some FOSS projects also do work on some features if users raise a bounty for it.
Sure, I agree. That's basically all I'm saying. FOSS gets rid of the tracking and dark patterns but it's still not what I'd call user-focused. It's like in proprietary software the decisions are made based on what the developer wants, and in FOSS it's made based on what the developer wants. But in theory with FOSS there could be people out there who are taking the opportunity of freedom from profit-driven orientation to actually figure out what users want and do that with the same level of drive that proprietary companies apply to seek profit. But it doesn't happen. It's not terrible, it's not even bad, but it's not what I'd call truly user-focused.
I just want tools to be more accessible.
Hobbyist developers develop software because it solves a need they have first, and they have fun doing that. If they don't have any fun or interest to do it, they lose motivation. Hobbyist developers are the primary users of the app they develop usually.
Commercial FOSS developers do have to take users into account and I think they do but they also have to seek profit.
I don't think there is another way unless government starts employing developers to develop FOSS software based on tax payers wishes.
Government employing developers would be just another form of doing it for pay. There is another way, which is the same way that various other kinds of charitable things happen: through a desire to meet the needs of others rather than having "fun" or "interest" for the person doing the action. There are people who donate their time and energy to do things like give food to the homeless, clean up trash, or whatever. Obviously they derive some kind of satisfaction from it but I think many people who do these kinds of things wouldn't say they do it because it's "fun"; they do it because they think it meets a need that other people have. There could be software like that, but there isn't much of it.
The same way there isn't much people giving food to the homeless or clean up trash compared to the general population size.
You are looking for a unicorn imho. Having said that hobbyist developers, regardless if they do FOSS or freeware, are likely to make stuff that is in line with your particular needs because more often than not people have common needs. They may not agree or have time to implement every single feature you want but in a sense this is use-focused if not user-focused.
Not to mention many projects refusing to add configurability and accessibility, citing vague maintainibility concerns or ideological opposition.
Another blatant example is the 6.7 kernel merging anti-user "features" in AMDGPU... previously you could lower your power limits as much as you wanted, now you have to use a patched kernel to lower your PL below -10%...
Everywhere you go, you can find these user- and tinkerer-hostile decisions. Linux isn't much better than Windows for the semi-casual tinkerer either - at least on Windows you don't get told to just fork the project and implement it yourself.
I'm a bit hesitant to call this corporate greed as it's literally happening in the OSS ecosystem too. Sadly I don't have answers why, only more questions. No idea what happened.
The obvious difference being that in Windows you can't even do that or (easily) apply a patch. Isn't this very ability to patch (or create a fork of) the kernel the opposite of being tinkerer-hostile?
Yes it is, through the power of choice.
>From the "don't theme my apps" movement,
Which anyone is free to ignore and actively do.
> to Wayland's "security above usability" philosophy...
1. wayland is super usable right now and has been for at least a number of years so your statement is mostly a lie. Only thing missing right now are color management and HDR. This impact a small portion of the users who can still fallback to xorg.
2. we are free not to use it. Distributions made it a default choice only recently and you can still install and run xorg, and will so for pretty much as long as you want, especially as some distros are targeted at people not liking the mainstream choices.
> Not to mention many projects refusing to add configurability and accessibility, citing vague maintainibility concerns or ideological opposition.
So you are saying having opinions is bad?
You are still free to use whatever desktop you want or patch your kernel. You have the source and the rights to do whatever you want with it.
> Another blatant example is the 6.7 kernel merging anti-user "features" in AMDGPU... previously you could lower your power limits as much as you wanted, now you have to use a patched kernel to lower your PL below -10%...
I don't think putting safeguards in a GPU driver to make sure users don't fry their expensive GPU inadvertently is an attempt against your freedom. The kernel and gpu driver are still under an open source license that expressly permit you to do the modifications you want.
> Everywhere you go, you can find these user- and tinkerer-hostile decisions.
What is more tinkerable than having the source available and the right to modify them and do whatever you want with it?
I think you are mistaking user and tinkerer-hostile decisions with your and users excessive entitlement mentality. Developers have finite resources and can't possibly agree and accept all users suggestions and desires, and have to put limits on the scope of their projects so they can maintain it, support it and not be overwhelmed by bugs/issues. This is not about freedom.
I know Linux exists. In fact, I've been using it as my primary OS roughly from 1994 to 2006, and since then intermittently for some tasks, or as a main development machine for a couple of years. I wrote device drivers for Linux and helped with testing the early suspend/hibernate implementations. I'm all in support of Linux.
But when I need to get work done, I do it on MacOS, because it mostly works and I don't have to spend time on dealing with font size issues, window placement annoyances, GPU driver bugs, and the like. And I get drag&drop that works anywhere. All this makes me more efficient.
But I don't want to turn this discussion into a Linux vs MacOS advocacy thread: that's not what it's about. In fact, if we were to turn back to the main topic (ensh*ttification of everything around us), Linux would be there, too: my Ubuntu already displays ads in apt, as well as pesters me with FUD about security updates that I could get if I only subscribed. This problem is not magically cured by switching to any Linux, it's endemic in the world we live in today.
Consumer software has gone straight downhill for the last 20 years and while the FOSS alternatives have some rough edges I always at least try them first. The outcome has been that I am shielded from most of the industry's worst excesses. Bad things happen, the world gets worse, and I just read about it, it doesn't affect me. I am more of a radical than the post author, I say in your personal life, roll it all back 100%, return to history, modernity is garbage, computing has taken a wrong turn because we have allowed it to be owned by illegal monopolies and cartels. I do make compromises in the software stack we use for business simply because my employees are not as radical as I am and I need to be able to work with normal humans.
That becomes the problem. Not just in the business world either. Like if all your friends are communicating on platforms that are locked down and harvesting your data, how do you arrange to get together for a burger? If all the stores closed down and you can only buy things on Amazon, how do you clothe yourself? Obviously I'm exaggerating but the big problems of this situation arise precisely because most people don't realize it is a problem, and thus working "outside the system" requires an increasing amount of effort.
We can't personally be responsible for everything. So to bring it back home to enshitification, a free market, free from monopolies or duopolies, should be the solution. As one product gets shit, a hole in the market opens up and could be filled. That's not happening though, so what's going wrong? If it could happen anywhere it's Sillicon Valley, so much money and the culture of disruption and innovation, all the right skills are floating in the employment pool. But software keeps getting more and more shit.
macOS has been very conservative in redesigning the user experience; it's aging slowly like fine wine. There are a few hiccups occasionally but I feel it's a lot more polished and elegant compared to the quirkiness of earlier versions. I don't get this common sentiment that it was better in Snow Leopard etc.
Stability is great, power consumption is awesome since the introduction of the M-series chips and I can still do everything (and more) that I did on my mac 20 years ago. Yes there are some more hoops here and there but overall you have to keep in mind that macOS never fell into the quagmire Windows did with bloatware, malware and all the viruses (although I think the situation is much better today).
macOS has been walking a fine balance between lockdown and freedom, there is no reason to be alarmist about it and there are still projects like OpenCore Legacy Patcher that prove you can still hack macOS plenty to make it work on older macs.
We're eating good as mac users in 2025, I don't buy the slippery slope fallacy.
The new settings are half-baked and terrible. The OS pesters me constantly to update to the latest revision and I can't turn those messages off, not even close the notification without displaying the settings app. And I don't want the latest revision, because it is buggy, breaks a number of apps, and introduces the "AI" features that I do not want or need.
More and more often good apps get broken by the new OS policies (SuperDuper is a recent example).
The old style of system apps that did wonderful things is gone (think Quicktime player, or Preview), these apps are mostly neglected. The new style is half-baked multi-platform apps like settings, that do little, and what they do, they do poorly.
But it does a wonderful job at doing so.
Macs feel less like a personal computer and more like an appliance. Which works great if you do things that don't require tinkering, like office tasks or even webdev.
And I do love Linux, specially the more hobbyist OS's like Gentoo or Nix.
But at some point in my life I decided to spend more of my time (aside work) with other parts of my life. And in result, having to spend a weeken to solve some weird usecase, be it the package manager or the WM, is a pain.
I have never spent a weekend fixing a problem. The worst I can remember was when an update early version of Ubuntu broke X Windows and the update had run on multiple machines in a small office so needed to be fixed multiple time, and there was a delay while they fixed the problem IIRC. Still, it was a few hours.
Even now, using Manjaro which is relatively likely to have problems, I have had no major issues so far.
I have not used Macs so cannot compare, but IMO Linux compares very favourably with Windows.
Mac users I know rave about it, but every time they come up with a specific example of why they are better it turns out to be something like functionality other OSes have. Sometimes Macs have the advantage of being preconfigured (e.g. copy and paste between devices required installing software on both and pairing - but a 10 min one off when you buy a new device is acceptable to me).
It's the desktop space that annoys me.
- Virtual Desktop per monitor? Nope, because Xorg didn't support it back then. And now it's a 10 year bug on the Kde bugtracker.
- A Dock? It worked ok. Until Wayland came and everything broke. It's supported now, but you have to clone the latest git commit of the biggest dock project which is not almost abandoned. And it breaks while compiling. A lot.
- Global Menu? The support is all over the place.
- Fractional scaling? It works. But in MacOS (with the help of an app, I admit) I can have incredible granularity.
On top of that, add the generally inferior hardware revolving a laptop, aside CPU, storage and Ram.
The MacOS desktop feels like a Gnome2 in an alternate universe where the devs never made bad decisions, and things like Wayland (1) never occurred.
(1) Not because the project itself, but the act of breaking compatibility and passing blame to other people has sent decades of FOSS development and manpower down the drain.
Last attempt to try otherwise was a UEFI bios, without fallback to legacy BIOS, that just couldn't get along with whatever top distro from Distrowatch I would try to.
Apple, Google, Microsoft walled gardens are more confortable to stay on, and as long as I can have some kind of ISO C and ISO C++ support for everything else that depends upon them, I am good.
That said, my current laptop came with Linux preinstalled so I knew I would not have hardware issues.
I would also rather have one off issues to install than unpredictable issues later on. Subjective preference, of course. I have had lots of issues with Android. Never at the start, but with app upgrades.
never fell into the quagmire Windows did with bloatware, malware and all the viruses (although I think the situation is much better today).
Windows has 10:1 program ratio compared to Mac. You can usually choose from full-bloat gamified experience to a simple tool doing its job. Windows itself is crap by default though, but that’s at least fixable if you know what you want and where to look at.
I completely get this sentiment. MacOS since Big Sur has had a quite indeterministic UI, from scrollbars that only appear halfway of the time, to the new app toolbars that truncate filenames and hide the search bar in the overflow menu. The Settings app is still worse than what it replaced, the Mail app keeps randomly appearing, the random confirmation pop-ups are more common than in Windows Vista.
Snow Leopard was (and still is) a bliss, it didn't nag you with "this app is dangerous" bullshit, built-in apps that look and work like an intern's multiplatform UI practice, constant updates... It was an OS primarily designed to let people get work done, not to encourage them to spend all their time on Apple TV/Music/...
Maybe people have been slowly boiled? I got my partner on a Mac 10 years ago but would not get her another Mac. Apple's push to make evetything e-waste, foxconn, and the general surveillance in the name of security ensure that. My observation is less that it has aged "like a fine wine" and more that Macs become prisons shaped like a computer.
(Edit: s/has/hasn't/)
- the oh-so-pretty GUI takes up too much white space, I have little left for actual content. I dont need massive icons with wide spacing since, I need the opposite.
- what is the deal with hiding folders from Finder (like ~/Libraries). Do I honestly need the command line to open this directory?
- every iteration after SnowLeopard I feel their “features” are going backwards and taking away “usability” from me.
- OpenGL stuck on 4.1, Vulkan (Molten) is a 3rd party hack. Seriously?
- its become a case of one step forward, 2 steps backwards.
- I can go on and on, but you get the gist.
Bambu Lab never really had any goodwill. No one liked the fact they were proprietary in a previously very open market. They just made really good, affordable printers, and those who were more interested in making stuff than in supporting an open community got them, sometimes reluctantly.
And BTW, Macs have always been locked down and backwards compatibility is not their priority (which means stuff broke). They cared a lot about their user experience though, I don't know the situation now, but I don't think it can be worse than Windows.
For one though, you can support open source software, especially linux OS's. Similarly, ditch the Bambu. There are countless better and more open printers out there, and you can DIY excellent 3D printers that get great results.
I think that's the point of difference between now and the past, information has spread so far, and people have fought so hard for open source software and hardware, that we actually have a good defence against corporate greed. You accept some compromise and work a little harder for it, but it's really not that bad.
And no, if the axis you are measuring on is openness versus locked down then Microsoft is not worse. You have simply been brainwashed.
> The Mac is being progressively locked down
> There isn't much we can do, as long as everybody tolerates this.
So you should become the change you want to see, shouldn't you? Try switching to Linux until it works for you. Debian is rock-solid today; xfce is so good that my non-technical relatives use it every day with no complains. I'm using a GNU/Linux phone as a daily driver and I'm sick of iPhone and Mac users complaining that everything is going downhill without taking any actions. I'm also sick of articles without any suggestions of what to do, when all tools are available today. No, it's not trivial. But it's doable.
1. If you were a bit more familiar with Apple history, you'd know that the Mac was actually Steve Job's push to make things more proprietary and locked down, not less. Make of that what you will.
2. If your ideological stance is in opposition to companies like Microsoft/Apple/etc. and you work in the tech industry, the most effective action you can take as an individual is to deny them your labor.
Yet he accidentally made OS X considerably more open than its predecessor by [I presume] pure accident?
As a matter of fact, this stink of sleaziness that permeated the early Web was so prominent and overpowering that it played a key role in the rise of these huge companies like Google. Google's algorithms and page crawlers were not that revolutionary or different from anything the other search engines were doing; Google just happened to be in a position where they were sitting on lots of cash and were able to run a search engine for several years with no ads or clutter or any of the other annoyances of its competitors, seemingly providing a free service that asks nothing in return. They made this part of their carefully curated public image, of being the hip and cool tech company with the "don't be evil" mantra. They probably burned through ungodly amounts of money doing things this way, but once all the competing search engines withered away and died and Google had the entire market cornered they grew into a multi-trillion dollar megacorporation and are now unstoppable and now all their services they provide are deteriorating because they have no competition.
Ironically, it was this false underdog narrative, the idea of the young trendy cool tech companies overthrowing the stuffy old corporate tech companies, that sort of paved the way for the tech industry to become more monopolized and horrible than ever. And now it's happening again with lots of "Web3" companies trying to present themselves as the new champions, who will overthrow the stuffy old corporate tech companies like Google and bring us into a new era of the Web that is even worse than this one.
Back in 1998, Google's algorithm ("pagerank") of weighting href backlinks using linear algebra was revolutionary compared to the other search engines like Yahoo, Lycos, Infoseek, AltaVista, etc that were built on TF-IDF (term frequency-inverse document frequency)[1].
The more simplistic TF-IDF approach of older search engines suffered from "keyword stuffing" such as invisible words at the bottom of the HTML page for SEO. Google's new search engine was an immediate improvement because it surfaced more relevant pages that beat the useless pages with keyword stuffing. At the time, Google Search results were truly superior to the junk Yahoo and AltaVista was showing.
[1] https://en.wikipedia.org/wiki/Tf%E2%80%93idf
> the young trendy cool tech companies overthrowing the stuffy old corporate tech companies, that sort of paved the way for the tech industry to become more monopolized and horrible than ever
Not following the thread here. Do you think the web would be less monopolized if Altavista or Yahoo had won?
I don't believe it makes any difference at all. The transition from a free web, made by people for the people, to the collection of corporate walled gardens we have today would have happened regardless, it was simply the natural progression of things - that we failed to recognize and avert in time. Initiatives like making computing personal again are exactly what's needed if we want to go back.
Google was revolutionary when it launched. It was clean, super fast, and had way superior search results. It blew the competition away. Within weeks of Google's launch techies started scolding people for using AltaVista or Yahoo, when they should be using something better.
So easy to game the system before Google. (Now easy again judging by the shitty results I've been getting for years now.)
The company had a legitimate business model, was innovative, agile and profitable from early on. It rightly earned a lot of respect.
But something went wrong at some point. It's debatable when, why or how, but it happened.
The braindead hordes accepting things they couldn't really understand did have a negative effect on overall quality.
Just before someone argues against the misanthropy in my comment, some of my most loved family members belong to the braindead horde. I love them, but their failure in education makes the landscape worse for everyone. And it is also very visible and not something imaginary.
Today we accept our OS spying on us, showing us ads, paternalizing its users with updates and the whole mobile catastrophe is a dilemma in itself. Smartphones are powerful devices but the software landscape disabled a whole dimension of software and is responsible for unnecessary waste.
Yes, it got worse on the software department. A few less driver issues because a lot of companies and hardware suppliers were consolidated is not a win.
And honestly, it isn't really hard to notice these changes at all.
Google is a good example. It didn't have better search, but its site wasn't plastered in ugly advertising from top to bottom. This was quite a factor in its success. Clean, fast, good. Not the nightmare it did on Android, where every app onboarding is a horror story in a thousand popups. There are profound differences in quality, intelligence and ability.
The market wanted growth. Early tech companies, like Microsoft, Apple, eBay, and then Google, went from zero to huge in a very short period of time. But companies like the FAANGs kept up the absurd levels of growth (20+% YoY growth in the case of Google) that Wall Street got hooked on, and it's been on a drug binge ever since. The result is that we have multiple trillion dollar companies that will...never not want to be a trillion dollar company.
The total amount of money in the PC market was miniscule compared to today, and the internet and its online retail plus ads bonanza even dwarfed that. The PC software market, the video games industry, everything--it was all so much smaller. As the internet swallowed the world, it brought billions of users. And those billions of users can only use so many devices and so many games and spreadsheets and stuff. They had to be made into cash cows in other ways.
The tech market just has to keep growing. It's stuck tripping forward and must generate revenue somehow to keep the monsters' stomachs fed (and their investors too). We will never be free of their psychotic obsession with monetization.
And advertising is soooo insidious. Everything looks like it's free. But it isn't. Because our eyeballs and our mindshare is for sale. And when they buy our eyeballs their making back those dollars of us--it's the whole point. So whether you like it or not, you're being programmed to spend money in other parts of your life that you wouldn't otherwise. It cannot move any direction but falling forward into more consumerism.
I'm afraid I'm a doomer in this regard. We're never going back to not being bothered to death by these assholes who want to make money off us 24/7.
What were small conflicts of interest before (a little trash here or there, a little use of personal information for corporate instead of customer benefit here or there, ...) now scales to billions of people. And dozens of transactions, impressions, actions, points of contact, etc., a day for many of us.
That not only makes it more pervasive, but massively profitable, which has kicked in a feedback loop for sketchy behavior, surveillance, coercion, gatekeeping, etc., driven by hundreds of billions of dollars of revenue and trillions in potential market caps.
Things that were only slightly unethical before, now create vast and growing damage to our physical and mental environments.
It should simply be illegal to use customer information in a way not inherent to the transaction in question. Or to gather data on customers from other sources. Or share any of that data.
It should be illegal, to force third party suppliers to pay a tax to hardware makers, for any transaction that doesn't require their participation. And participation cannot be made mandatory.
Etc.
One commonality here, is that there is often a third party involved. Third party gatekeeper. Third party advertisers. Third parties introduce conflicts. (This is different from non-personalized ads on a site they have relevance for, which are effectively two independent, 2-party transactions.)
Another commonality, is the degree to which many third party actors, those we know, and many we never hear of, who "collude" with respect to dossiers, reaching us, and milking us by many coordinated means.
Most administrations are squishy-soft on corporate crime. If there were regular antitrust prosecutions, violations of Federal Trade Commission regulations were crimes, wage theft was treated as theft, forging safety certifications was prosecuted as forgery, and federal law on warranties was strictly enforced, most of the problems would go away.
In the 1950s and 1960s, all that was normal. The Americans who lived through WWII were not putting up with that sort of thing.
For instance, nearly every country was paying the US loans back, in USD, or was having to depend on the US in some way.
Nearly every other country in the world had their industrial base (and often male population) crushed in the war.
Etc.
Those things cost money/effort, and require a consistent identity and discipline.
In many respects, we are also better off than we were in the 1980's. There are more of us, we are connected globally, and the tools that we have access to are significantly better. We also have a conceptual framework to work within. Technically speaking, Free Software may have existed back then but few people even knew of it. People were struggling with ideas like public domain software (rarely with an understanding of what that meant). If you wanted to make money, outside of traditional publishing channels, you were usually toying with ideas like shareware (where you had pretty much no control over distribution). If you wanted to spend money of software, outside of traditionally published stuff, chances are that you had to send cheques or cash to somebody's house.
And then there is communicating with likeminded people. We may like to complain about things like Discord or Reddit, but they are not the only players on the block. Plenty of people still run small or private forums. Yeah, they can be hard to find. On the other hand, that has more to do with the noise created by the marketplace rather than their lack of presence.
Why is this good?
The thing is the older I get, the more it does seem like at the very least we are not growing pie in a number of areas (the example at the top of my mind is academia) and sometimes it just seems like an easier solution is to decrease the numerator. But I don't know how you can do that and justify it morally, both to society and to yourself.
Unfortunately at the time we need them the most pretty much every pro-user organization is imploding because everyone and their grandmother wants to turn them into vehicles for whatever their pet cause is.
I understand it and know it. But I don't appreciate it either (in the sense of liking it).
Its just another bubble, one predicated on mining the users rather than expanding the product.
It's not bad that it's big. It only needs to grow because the rest of the economy needs to grow.
I am also afraid you're a doomer in this regard. You don't think the bigwigs with their fax machines in the 1980s wanted to make money off of us 24/7? Of course they did.
Tech is scary in the sense that it's now gone quite a bit beyond the understanding of the average joe. Even most of us on this site probably don't fully understand how much detail data can paint a picture of a person. There are companies that probably know something about me that I don't even know.
I guess I don't know how to alleviate that feeling, and maybe it's the correct default assumption to be a doomer. It certainly would be very helpful if the US treated the situation more like the EU treats the situation.
It's tiny, clearly built with love for the user, doesn't do a heck of a lot, and has some interesting ideas that are just fun to mess around in. And unlike some of the similar retrocomputing OS's (which are also lovely but grounded in old fashioned design), genode feels like a glimpse into the good future.
But nowhere near as practical as Linux at the moment of course
I think it'd be very cool to have a fully verified kernel...
I like the idea of Qubes and it looks like Genode might be an even better idea...
I've looked into it briefly but it seems like too much work for me right now.
The True Genode Way of course is that everything worth having would eventually be ported as a native genode component instead of a qubes style VM. They've put a lot of effort into making that as easy as they can with Goa (a nix-inspired package management and build tool) and adding to their C standard library and ports of popular 3rd party libs like SDL
They dont assume you want a RAM-Only filesystem. By default it starts out completely immutable with nothing being able to save anything anywhere.
If you want to save anything to a hard drive you have to enable that driver because they don't assume that you'd need one.
Copy and paste is an optional extra to install
It's wild :p
From their description: "It is tested best on laptops of the Lenovo X and T series (X220, X250, X260, T430, T460, T470, T490)", 200 isn't on the list but you'd probably have about as good a time as you can
Nobody fears a toothless dog.
So, as far as a corp can understand anything, it can't understand this human article. I don't know if one can write articles that a corp can understand, maybe it cannot understand medium in the same way we can.. It seems to act based on information it sees in "markets" and "consumer behaviour", and we don't yet know how to write an article with those (even if "vote with your money" was once believed to be it, until we discovered that mankind as a whole is not an individual that can make a decision)
I will counter this with: The argument that the humans that make up a corp are in control over it, and that the corp behaviour simply results from their flawed and greedy characters, is just a convenient way to blame someone because the real problem of understanding what kind of entities corps are, and how to influence and control them is too hard.
When dealing with people, we tend to view them as as distinct entities from their individual cells and neurons when we argue about their behavior. We don't talk about this or that individual neuron causing a human to take an action, if anything, we may discuss a group of neurons, but often, we argue about the entire "brain chemistry", even if it's strictly true that some group of distinct neurons are "responsible" for the action the human takes, then so is the bones in their hands, the fibres in their muscles, because they didn't refuse (to refuse, is to no longer be part of that body). Maybe it's the moral thing to do, for a cell to refuse to be part of the immoral human, but it does not absolve the human from responsibility, and it does not put all the responsibility on the indvidual cells that make it up, humans are complex organism, corps are made out of humans, they are even more complex. Treating them as a collection of humans is what we have done so far, and while we've gained some satisfaction seeing a (too few) very disgusting people getting what they deserved, it's not changed the overall behaviour of the companies, because, killing or changing one neuron won't change a brain, replace neuron with human and brain with corp.
On a higher level of abstraction, the corp as a form of life, maybe a cancer, maybe a rat, or something that could potentially be a positive thing, we may start a new way of reasoning about and with them.
After all, I can talk to you, I know how to do that, but I can't talk to your neurons directly, I don't understand their modes of communication, it's on a different level from me.. This is why psychiatry is behind, we don't know how brains work well enough, we can give medications to take the worst out of them, like treatments for adhd, or schizophrenia, but they don't work on the brain in a coherent way, they work on the individual neurons in a very crude way, and so, the effects are nowhere near perfect, and the side effects can be almost as bad, or in some cases worse than the decease.
What I’m saying is that culture isn’t its own uncontrollable entity independent of influence from the people that run the corporation. A company’s culture is dictated by the people who lead and make decisions for that corporation. A culture is driven from the top down.
The problem isn’t understanding what type of entity a corporation is, it’s fining people who are both motivated to make the change but also has the power to make any changes.
The real hard part is working against the rigged system. People who can enact change won’t because it’s not profitable. Whether you’re the MP bribed, sorry I mean “lobbied”, by corporations, or you’re the corporate director that had to navigate the cutthroat ranks to reach your position, there’s literally no personal interest to do the right thing. Literally everyone who can control these beasts suffer from massive conflicts of interest.
So the problem isn’t understanding the problem. We already know what the problem is. We just don’t care enough to change it.
And they are the product of a fluke of legal-economic history.
Please reconsider whether continuing to do the same (simple and easy) thing we've always done, that we've already seen does not work.
Take a step back, re-evaluate the problem in a new context, even if you don't end up agreeing with my perspective, attempting to think about the situation in a new light might be helpful.
It might be that something in the way corps grow up, maybe their environment (regulations, lack of same, incentives, consumerism, trade, markets) may influence them to grow into the immoral unethical monsters they often become.
Maybe we should consider them too dangerous and harmful, and simply destroy them, I'm not convinced that's better, but maybe there's a way to understand them at a different level, that allows us to "write articles" they understand well enough to actually adjust their behavior (and not just try and circumvent whatever "obstacle" has been put in their path).
As society keeps forcing them into pretend empathy they know every detail about it. They can exploit it and imitate it but they cant hide how precious their ego is to them. It sticks out like a sore thumb.
Corporate creatures are similar, they simply don't share our emotions. That doesn't mean they don't understand or wont cater to them.
I agree, in many ways corps exhibit the same behavior as psychopaths, after all, people is the thing they subside on.
I think their way of cognition is even more alien to a normal human than that of a psychopath is to a normal human.
They may share a fundamental, that we can't make them truly understand why something is wrong, but we may be able to come up with consequences that deter them from undesired behavior.
One difference though, is that we can't make it illegal to be a psychopath, only make some of the actions that only a psychopath would perform illegal.. We could make it illegal to be a corp (I'm not saying this is going to be a better idea.. But we could make it legal to kill corps, if only we can find out how to kill them so they actually die of it [1]).
[1] https://www.youtube.com/watch?v=H73L7M0xsnM
Hard to ignore the signs that the US is an empire in decline, heading towards collapse.
1. https://www.oftwominds.com/blogjun24/negativity6-24.html
True, but unlike the Apple II, the NES was not an open system. The NES had hardware DRM, which allowed Nintendo to control what games were published for the system and to charge licensing fees (much as Nintendo, or Apple, do today). Nintendo also tried (unsuccessfully) to shut down Game Genie.
https://en.wikipedia.org/wiki/CIC_(Nintendo)#10NES
If you wanted to cheat in 1992, you'd call the Sega Hotline on a premium phone number and they'd give you cheat codes.
It's the same thing, just a different medium and middleman.
I remember the ads for that but I've never met a single person who did that. (Or whose parents would be okay with it.) Cheat codes were either shared by word of mouth among friends or in magazines. Or you bought a game genie, but that was more for messing around with a game's mechanics than actual, blatant cheating.
If people dislike exploitative SaaS and content platforms, stop using them. No one is forcing anyone. Plenty of people use home servers and Linux. Go for it. There are also tools like Chris Titus’s UWU to make Windows more tolerable, and MS still sells an office suite that can be installed locally. You don’t even have to “sign in” with it, though you can.
I’ve lived through several distinct eras of computing. This one may not be the most exciting, but it’s by far the best. You can use SaaS or locally installed stuff, and emulators (both hardware and software) exist to keep the older stuff alive. Even better, I don’t have to panic save every 5 seconds, reboot my computer every hour, and my computer can come with me. I don’t get disconnected when someone places a call, and while some software is expensive, it’s cheaper than it used to be when inflation adjusted.
Go fire up an Apple II, an H89, a TRS-80, or a PET without any modern supplementation and tell me that those are preferable. You may groan about Google, but go back to purchasing tons of manuals that may or may not be specific to your machine, read through them only to find no answer and proceed to play detective for a few weeks. How much more productive is your time with a dang search engine?
"Our economy isn’t one that produces things to be used, but things that increase usage."
There never was a choice.
We're seeing the "free" version of that.
It was never enforced with software or services. If it had the entire standard VC startup playbook would be different.
It’s also never been enforced internationally. China has arguably been subsidizing its industries and effectively dumping cheap manufactured goods for years to become the workshop of the world, and it works.
...the quote, *AS A SOUNDBITE*, only sounds good on a surface level, but collapses under the slightest test. All products in some form or another increase the usage of resources in order to reach a certain goal.
https://www.wheresyoured.at/the-anti-economy/
The article, where the quote originates from, contextualizes the quote marking (a) the difference between products in service of an actual goal, (b) products that are only meant to look good on a balance sheet, and (c) how companies have morphed towards (b) in order to attract investor funds and increase share prices / company market values.
The quote, BY ITSELF AND WITHOUT CONTEXT, is a twisted Neo-luddist version of its original self.
It's a side effect of items being built to cost, and the marketing phenomenon that consumers follow fashion trends.
Your car doesn't have planned obsolescence: it has a warranty period. If you want a longer one, you'll pay more because that is not a free service to provide.
Today, I look at those same companies with absolute derision over their completely unethical and hostile approaches to the world, the economy and dealing with the people that use or rely on them.
Worse, my ability to get excited about new companies, products, services and innovations has been completely blunted by the expectation that anyone working on something I think is "cool" will inevitably be co-opted by people who have the worst instincts: those who actually have no respect for technology or computing and view people as less than human, simply entities from which maximum value must be extracted at any cost.
I guess activitypub and matrix are meant to be similar in that regard but for whatever reason the learning curve is just a little steeper, so you have to be motivated by ideology to put up with the gaps in usability
While im willing to out up with it, its a hard sell to get your friends to use something worse
So while there's more options now for homelab things the overall ecosystem has moved strongly away so there's a lot more to avoid
Without wanting to sound like a stick in the mud, the focus of computing has definitely changed now. I see it as an interesting thought exercise on how to get someone running around with what is usually a marvel of computing in their pocket to try and imagine that is not the apex of computing, whether to explore what other means of computing offer or what comes next besides a slightly better version of what we have now.
That is a great way of thinking about it and I'm curious what you've come up with. I think it's a pretty hard sell for most people, especially for things like messaging that have become very central to daily life. Also, there's a big difference between convincing someone to try something a bit less mainstream and convincing them to reject the mainstream version. Like, you may be able to get someone to install LibreOffice but it's a lot harder to get them to uninstall Excel.
Anecdotally, I've found that people who have some other kind of retro/niche/subculture interest can be somewhat more receptive to the idea that the newest thing isn't necessarily the greatest. Like someone who's into hunting for vintage clothes, or woodworking, or whatever. Ironically such people are on average more tech-averse than a typical "normie", but they often understand the concept that it can be useful to actually put effort into getting something that's not just whatever's handed to you. In a way the insidious aspect of recent tech is the way it's conditioned people to expect that they shouldn't have to think much about how to do things, and to just want "smart" technology that reduces decisions.
Gog & itch & humble are great and as good as steam if they have what you want but the collection is a lot smaller
You bet your fucking ass I'm using Microsoft Office and reliant on it to a bloody fault. I literally and sincerely can't rely on LibreOffice to open and save documents in ways that everyone else would. If I use LibreOffice then at best I'll embarass myself, at worst I'll waste someone's time and either way I risk losing business for no good reason.
A Microsoft 365 subscription (or even buying an Office 2024 license) is chump change because I know I am speaking Industry, not Libre. Nobody understands nor gives a damn about Libre in the real, professional world because the lingua franca is Industry aka Microsoft Office.
I learned this first-hand when I was a grad student and had to write a technical report to submit to our government funders (using their required MS Office templates and all). I used LibreOffice and saved it as DOCX and everything looked great. Then my advisor opened it on his computer running MS Office and asked me WTF was up with all the mangled formatting.
I have access to large amounts of harware and software through my employer. And while Microsoft Office is unavoidable, I hate it everytime I open Word or Excel (daily), even if it is on my company machine.
The privacy concerns are arguable even more concerning on a company machine. I wish there was a feasible alternative.
> you can only run CC that sends all your images to them
It doesn’t send all your images to Adobe, it only sends the ones you choose to save in the cloud.
If you want to complain about having to subscribe to the software instead of purchasing outright, then do that. Don’t complain about something that isn’t happening.
They do use the images you upload to train their (non-generative) AI for things like background removal.
Spend some time in the tech support desk of a mobile phone store to get an idea of the general level of technical sophistication of the average person. Average folks are not running containers. They're not installing... anything... except maybe an app from an App Store. Half of them aren't sure what a file is.
> <...> corporate business world of the 80s, 90s, and early 2000s was just as sleazy and run by assholes as it is today; the only difference is that the technology is finally catching up with the ambitions of said sleazy assholes and allowing them to do what they've been trying to do since the outset, <...>
Second, computers are cheap now. They are no longer for the financial and/or intellectual elite.
Third, there is an overall culture/intellecual/value decline in the Western world. Probably because life after the Cold War was too easy. Now many(?) young people, at least in America, can't write by hand, and men who cut their genitals are not considered to be in need of very serious therapy, and Harvard students support HAMAS, and so on.
The PC movement of the 90s, where it feels like this author is reminiscing, was about arbitraging the delta between what the tech could do and the literacy and expertise in government.
> But over the past decade in particular, the Internet and digital rights management (DRM) have been steadily pulling that control away from us and putting it into the hands of huge corporations.
This period of computing was notable for how a bunch of nerds figured out how to use new networking technology to stretch/abuse/violate/break copyright and fair-use laws around media.
So many ways to get ripped content then. It was fun for a teenager and felt like a victimless loophole. It both opened a bloom of interesting new creative works, but also decimating existing markets and systems, so that eventually new monopolists Netflix and Spotify could take over.
But the conditions and tools available then never went away, and private personal computing is more available today than ever before. In a few hours someone can read a few tutorials and buy/build and run a whole redudant content serving service for all of their personal needs, while writing or conglomerating a whole system of tools and capabilities to automate or agument nearly anything they could think of.
The government is not going to save you.
If you want that better future, you need to build it. Look at the Steam Deck for example that goes against the grain on all these matters (right to repair, mostly FOSS, unrestricted, etc.) and it's been a huge success.
We need a mobile platform like PinePhone / Librem to have the same level of success and reliability.
I've been working to build a company on my own hoping to fill that gap - I tell the career SWEs in my social circle "I want to give people the true freedom of creating whatever you want on the web," and I just get blank looks, ha :p
The problem is that those people have families to feed and clothe and housing and utilities to pay for and you can’t expect them to work for free (or a pittance) when they’d need to be paid a high 5 figure/low 6 figure salary to be able to afford their basic cost of living.
Users broadly don’t want to pay and will turn up their nose at having to spend $50 a year on a service or $10 on an app built by honest people with privacy and respect of the user in mind (when they don’t have any issues blowing hundreds of dollars on much more ridiculous things that don’t respect them as customers, but that’s another story…)
And on top of that, how do you make your services known when trillion dollar companies will always beat you in ad spending while offering a free product they have hundreds of people working on?
As an example from just a couple days ago, Read.cv just announced they were shutting down and acqui-hired by Perplexity even though they were a lean 3-person team with a monetized product that their users loved. They were at it for 4 years and couldn’t make it work.
https://read.cv/a-new-chapter
> I've been working to build a company on my own
Very sincerely: good luck, I hope you succeed in your goals.
But just as sincerely, if you truly believe the real problem is that the technological class lacks an “epistemological hunger” and not the basic money/visibility issues I raised above, you’re in for a rude awakening.
> you’re in for a rude awakening
I've seen how techies spend their time, I'm not the one in for a rude awakening.
The most infuriating part is that this is perfectly doable with 90s web technology. Even encryption was already available in the form of SSL, and that's arguably the only thing which has improved since then. The majority of technological "progress" has merely been the reinvention of existing technology in more inefficient ways.
In the name of a "better experience".
I know, because that is exactly what many of our customers did on the agency I worked during those days.
Maybe legislation and culture or something can help also, but it will be most effective if part of that is adopting and spreading the right technology to facilitate those changes.
Don’t forget Zuckerberg, Musk, Bezos were nerds - don’t blame everything on „corporations”. That is also nerds after they got ahold of influence and money - that is how the story ends.
But how do we sell to the layman that he is missing something, which he never experienced in the first place? Sadly, I believe we are doomed to be niche.
I'm thinking about experimenting that with myself and my son when he is older. But he is of the impatient type so maybe this is a bad idea as vintage computers typically need more focus and research.
Maybe a DOS emulator then. It has better tools and games.
According to a Google search, a typical C64 was on in 3 seconds. Maybe I've got rose colored glasses about how long it took, because my recollection was that it was basically instant.
Hell, my monitor can't even turn on in 3 seconds. You hit the power button and it gets busy doing... something... who knows what... not turning on, that's for sure.
It probably took several seconds for your CRT to get bright enough. At least that is my guess. (I had a similar recollection.)
That is where I plan to go with SPADE, https://hackage.haskell.org/package/spade
As you can see, it is currently written in Haskell, as a PoC. But I am re-writing it in Rust, for reasons that also include making it possible to do something like you ask for.
It has a graphical stack. Still a bit underpowered, but enough to implement uxn[2], which many people use for writing fun little games.
[1]: http://duskos.org/
[2]: https://100r.co/site/uxn.html
I don't know what the future lies. I'm fully prepared that my son is going to have zero common interest with me.
https://pixelx86.com
I don't have a problem with lock down, repair etc. At least not with the current iteration of computing.
While they are bad, these lock down means less competitions. And having less competition means less choices, etc, all of that leads to my final point which in my view is perhaps the most important, these company create crap products, software and services.
If they had continue to innovate and push I would have less of a problem. Look at Microsoft, and now to a less extend Apple as well. In the pursuit of more revenue they now make crap.
Therefore the more personal computing in my view isn't the benefits listed. It is to keep the company itself honest. To make them aware they need to innovate. To give a damn, to make something better.
I associate this genre of photo with the photo-shoots with Gates, Jobs and others. All the interviews and full page ads in the 80s 90s had variations of sitting/lying on desks, hugging CRT monitors or the classic folded-arms lean on a CRT from behind.
https://i.pinimg.com/originals/dd/97/ed/dd97ed2a239c725ebe57...
https://i.pinimg.com/originals/13/b1/3a/13b13a8c0bc7ee256b37...
https://wolfsheadonline.com/wp-content/uploads/2007/02/Nolan...
https://alchetron.com/cdn/dan-bricklin-6a3581d7-0d8c-4413-91...
I don't recall old-school blogs doing this or really having author photos at all (photos on that bandwidth/hosting?!) but I imagine whenever a blogger was interviewed for print media they would lean on the "computer person" standards.
> I’m not calling the tech industry evil.
Well. . . why not? I think at this point the tech industry is evil. Not in the sense that water is wet, and maybe not even in the sense that mammals birth live young, but sort of in the sense that ice occurs near the Earth's poles. There are some bits and pieces here and there that don't follow the pattern but they are the exception and they're getting smaller.
That doesn't mean that technology is evil, but the ways its being used and developed often are.
And that gets to another aspect of this that I feel like people in the tech world sometimes overlook when talking about this: enshittification is not a technological phenomenon. It's a social phenomenon that is driven by our socio-legal apparatus which allows and encourages high-risk pursuit of sky-high profits. Corporate raiding a la the Sears debacle, consolidation in grocery stores, even something like tobacco/health or oil/climate-change coverups, all these are forms of enshittification. Tech is the most prominent venue, maybe because it's especially suited to certain forms of vendor lock, but it's by no means the only one.
Enshittification happens because we are not willing to take a sledgehammer to the idea that making businesses bigger and bigger is a good thing. Timid reforms focused on things like data privacy are woefully inadequate. Large companies need to be directly dismantled, wealth needs to be directly deconcentrated, and broad swaths of behavior that are currently happening left and right need to become the kind of thing that will land you in prison for 10 years.
I'm not optimistic this is going to happen without some kind of "hitting bottom", though, whatever form that may take.
Just by the by, ironically, I've heard, from my tenuous connections into ideological spheres outside my own, that a decent number of people voted for Trump out of a similar desire to shake the foundations of the system. Of course they've likely been hoodwinked, but I think the opportunity is there for a Bernie Sanders-esque person on the other side to make some change by whipping up a frenzy of rage at the status quo and promising to get out the pitchforks. The question is whether such a frenzy can be accompanied by enough cool-headed calculation to be effective.
When people with that kind of worldview roll their eyes at empathy, or scoff at any need to see beyond their own opinions[0], they are all but guaranteed to seal themselves inside.
FOSS, decentralized, et al. attract a lot of people with those worldviews, and that story is the story of them failing with consumers over and over and over, doubling down on what failed every time.
[0] https://news.ycombinator.com/item?id=29114882
Yeah sure BlueSky uses Mastodon and it’s more successful but that’s because it’s BlueSky which is not mastodon.
> And there’s another problem. Very soon, we might be threatening the continuity of history itself with technologies that pollute the historical record with AI-generated noise. It sounds dramatic, but that could eventually undermine the shared cultural bonds that hold cultural groups together.
> That’s important because history is what makes this type of criticism possible. History is how we know if we’re being abused because we can rely on written records of people who came before (even 15 minutes ago) and make comparisons. If technology takes history away from us, there may be no hope of recovery.
So noise has always been there as well as methods do deal with noise.
I'm not required to use social media and extractive business models. Intenet surveillance is lamentable but I don't see why he thinks app stores are predatory. The PC is still mostly a force for freedom. The privacy losses are more than offset by the gains of communicating with everyone on the planet.
Apple is the much more obvious offender, even for stuff not traditionally stigmatized against. Microsoft struggled to release Xcloud becsuse Apple didn't want a game streaming service on IOS. Meanwhile, steaming music, videos, and anything that works on its purposefully botched internet was fine.
>The privacy losses are more than offset by the gains of communicating with everyone on the planet.
Definitely a contentious take these days, given recent events.
This is allowed on iOS and Xcloud is still not available.
Why can’t we have both?
Maybe Amazon in 2000 wasn't so icky but there was also no free same day shipping. Apple II could be repaired without "special tools" but those machines were huge, heavy, mostly empty space, and gap and glass alignment was way worse. I wish I could say something smart about Windows 95 but I've worked hard to erase it from my memory, so I can't. :)
Electronics things, just in general, did a lot less in the past. With that comes good and bad.
Privacy is a trade-off and right now the general public doesn't place a high value on privacy so they're happy to trade it away for anything. Honestly I understand it. I'm convinced I'm going to get bombarded with marketing nonsense regardless so I might as well get something for it.
Remember how its uptime was limited to 49.7 days because of a timer's numeric overflow (and in something like an audio driver, too, it shouldn't have been system critical). Good times.
A lot of computing in the 90s and earlier was terribly unstable. And that was without considering how prevalent viruses were in the 90s, too.
Ofcourse you are not required. You also free to retire as a hermite in a remote island.
For anybody hoping to be a non-conspicuous part of society, refusing to condone abusive tech services extracts an ever growing toll.
But you're here, saying that on HN.
I've seen people say similar things on Reddit, in IRC channels, on blogs, Gemlogs, Mastodon posts, and other similar venues, without realizing the irony of it.
What does an "entire country" have to do with it? People move online communities between platforms all the time -- and many communities have presences on multiple platforms.
> In the short term, you either accept defeat and learn to love the adtech bomb or you withdraw into the digital wilderness.
I'm just not seeing the argument here. Suppose you've got 50 users on Discord and would prefer to move to Matrix. So you post a link to the Matrix channel on your Discord server, lock stuff for further posting in Discord, and update external links and documents. People do this sort of stuff all the time without being "defeated".
yeah, thats pretty clear. Because you choose to focus on cases where you do have agency to do something, e.g. its my discord and I am moving us to matrix - and goodbye to those who will not migrate.
Now think about an established group where you are a simple member and you say, "hey folks, why don't we move to something that is better for us, no ads, no data collection, etc."? And they look at you with glazed eyes, and... shrug, and that is the end of the conversation. Now what Don Quixote?
> What does an "entire country" have to do with it?
In countries with high facebook/meta adoption if you want up-to-date information about an event or an establishment it may only exist on meta platforms. Only larger entities can afford to have an independent website, and many such sites are typically in a state of disrepair and neglect.
As an individual trying to go against so-called network effects most of the time you have very little leverage. Its really fighting against wind mills.
https://knowyourmeme.com/editorials/guides/what-is-the-we-sh...
Most people do use them though.
> The privacy losses are more than offset by the gains of communicating with everyone on the planet.
I completely disagree. Most people aren't actually communicating. At least not in any form that matters. The drastic increase in loneliness and depression that correlates with the increase in connectivity should at least show that more social media doesn't mean more happy.
Examples of what it can do - Autonomous lighting with mmWave radar with 180 degrees fov and ambient light sensor - Recording of temperature, humidity, barometric pressure and VOX to onboard SQLite database at a chosen interval. - Onboard web server, which serves as dashboard and configuration page. - Communication platform with integrated microphone (hardware indicator light, off by default) and speakers. I’m also experimenting with talking to LLMs like this.
And many more things. If you’d like to reach us hello [at] sentionic.com
When I go out for a walk in the forest, I see maybe one or two people walking a dog. Where is the rest of humanity? Watching TV, playing a mobile game, whatever...
The balanced NPU/GPU/CPU with close cache ram, such as the recent Lunar Lake chip, coupled with better integrated GPUs in consumer class laptops, and a nice web GPU webgl api .. make for a lot of capability for problems that are well solved by Monte Carlo simulation ... and RL generally.
3D apps are now really doable in browser on current devices... and the browser is a great delivery device for applications, avoiding platform specific installs and dependency hell.
Not convinced the average user cares enough to accept the compromises though and the marketing budgets of big tech is a force to reckon with too
How about the price?
A quick googling suggests that it cost ~6,500 USD (in today's money) to buy an Apple II when it launched. Obviously it was a different time, but that sort of price today would likely be called predatory by at least some people.
# The barriers put in place by companies to prevent their hardware from being tinkered with
Locked bootloaders and the absence of hardware documentation to base the development of drivers on is an example of this.
This prevents the community from taking over when a device reaches end of life or expanding a device to be more useful/open than the company originally intended.
Examples of this are:
- Apple's M-series Macs/MacBooks. While the hardware is remarkable, Apple's anti-competitive practices manifest in MacOS holding the devices back from their potential. Asahi Linux is an indicator of demand and its success is remarkable given what they are up against. If Apple was compelled to provide reference documentation of their hardware sufficient for driver development then the resulting alternative operating systems introduces competition to an otherwise stagnant market.
- Microsoft's Surface laptops and broadly the new X-Elite hardware lineup shares the same criticism as Apple's platform
- Mobile phones. Imagine an iPhone running Android. Imagine a Galaxy, Pixel, etc running Linux where Android apps are executed within Waydroid containers? Not going to happen because we are either blocked by bootloaders or blocked by a lack of drivers (deliberately hidden by manufacturers)
- Better health trackers. Imagine buying a FitBit, installing an community maintained operating system that has no subscription fees and handles health inference through transparent algorithms that can be contributed to by academics around the world.
# No "right to repair" software as it's practically illegal
It's virtually illegal to repair software. Decompiling software and fixing it, even if it's end of life, can land you in court.
There are so many software projects out there that I would personally love to revive. Think of games like Heroes of Might and Magic 3
Anyway, I've been ranting too much on this topic but you get the idea. I wish governments would grant people protection to tinker/improve hardware AND software and compel corporations to provide sufficient documentation to practically enable that.
In the 2020s I can do both of those things from walled garden OSs with DRM. The option to use an offline OS is still here too, although I can't do those things with it. That's a step forward?
I feel like retaining control on a scale that effects the average person is basically impossible.
But even if they were implemented, laws and regulatory enforcement might not have the desired effect. Surveillance capitalism, adtech, and data brokers seem to be surprisingly GDPR-resistant, and California's CCPA seems to have had minimal effect. Right-to-repair is limited by miniaturization and component integration, and the result seems to be Apple's impractical and expensive repair kits. Antitrust seems to be ineffective (see IBM, Microsoft.) Even the DMA, carefully crafted to target Apple, Google, Meta et al. (and to extract billions from them for noncompliance) doesn't seem to be affecting the dominance of those companies just yet.
Until a lot of people start questioning their habits as consumers, companies won't change.
What is a coin in an arcade videogame if it's not a microtransaction?
Software as a service is just different, and it's not all bad. You have automated upgrades, a consistently funded developer that can better plan and deliver updates, if you only need the software for a short period of time it can be cheaper. Frankly the packaged software approach was a kludge due to the technical limitations of the time. Now if big releases make sense developers do that, if incremental updates over time provided as a service make sense, they can do that.
Most of the section on what we can do about all this is focused on stuff that didn't exist in the past. The internet and online services, social media. Going back to the past wouldn't be to do those in some ideal way that used to exist, it would mean not doing them at all. Sure.
There is no ideal past to go back to without pulling the online plug. However, that plug isn't going away, and we don't actually want it to. The "How we can reclaim control" bit at the end is mostly correct, but it's really about coming to grips with managing the new reality, not going back to a situation we've outgrown.
That is, some people might see it as a lesser of two evils situation where you're choosing the domestic monopolist rather than the foreign one. I think there are certain worldviews where this is worth enough to sacrifice privacy, competition, etc.
One must stop seeing things from a technical tradeoff perspective and start perceiving them from a political stable due to hostages perspective. The arguments make sense because they must.
Just as your iPhone or Android sends GB of data back and forth without your knowledge or approval, our dear friends in Microsoft want a piece of that pie.
Most people are already consumed by the 'machine'. Those who resist will stick to Win10 as much as possible (I got a 2014 laptop that runs Win10Pro and runs perfectly) and my gaming desktop doesn't need an upgrade for another 10 years.
All we need is selective updates, run privacy/blocking tools, change our hosts file, run a firewall (WindowsFirewallControl) on MediumFiltering, etc.
Unfortunately only few can do such fine-tuning to their PCs (a few thousand people in the 8bn people).
The rest will be consumed by the 'machine'. I've mentioned on another topic/comment. We are cattle. We push back very rarely and on very few topics.
I am a Gibson-kinda-guy. Take the $200. Give me the OS. Stay away.
While there are nice ideas in general, too much of it is looking at the past with rose colored glasses. And this makes the argument to go back to these ideals kinda icky. If we really want to do something, we should have a real critical look at why we're here in the first place IMHO, and this isn't it.
> For a while—in the ’80s, ’90s, and early 2000s—it felt like nerds were making the world a better place.
The nerds (dare I say "we" ?) made the world a different and more connected place, with clear evolutions in regarding finance, productivity and science.
Does it make the world a better place ? Did the productivity and finance improvements bring a better and more welcoming society for instance ?
It can be argued either way, but that question can't be glossed over as a given IMHO.
Then there is no reflection on how computing has become a commodity. It still needs more freedom and control, but these two ideals don't mean the same thing if you're a 30yo single DevOps engineer or a 50yo at home parent watching over 5 kids. Both need computing, but the purpose and intricate needs are completely different. Focusing only on one because it's easier kinda misses the point IMHO (and we're back to the role of technology and how exactly it makes the world better)
I remember taking my Atari 400 on family vacations, reading Compute magazine by the pool and learning a shitload about programming just by reading. Oh and yes, I did ride my bike and have friends and play baseball and go to the beach. Computers were just another fun thing to do. And eventually I put what I learned to use at Apple and several other big-name companies.
Today, the dominant platform (Windows) is an execrable, intolerable shitshow of anti-user arrogance and aggression and abuse. Apple's platforms are better, but I have little confidence in how long that'll last. In the end, I guess we're going back to "nerds" using real computers running Linux, and pigeons pecking at big colored buttons on touchscreens to get their reward pellets.
Not that there are ready made solutions that are being ignored, but if we are going to move beyond conceptual statements it will require some pretty potent medicines that can start fighting the cancer by taking it head on.
The enshittification is now in an advanced stage and the billions of addicted masses an enormous inertial weight. Witness e.g., the grotesque politics around the tiktok non-ban.
Imho a key ingredient is to ditch the focus on the "personal" and start thinking of "interpersonal computing" (just made that term up). Basically personal computing that is network-first, web-native. The owner-operator is empowered to join the matrix, find their way around without gatekeepers, connect with agency, exchange, filter, process with helpful and transparent algorithms and get on top of the informatiom firehose. Nothing radically new in terms of hardware or software, just rearranged furniture to serve citizens, not some digital oligarchy.
The huge success of social media is because it tapped into the immense sociability of our species. Somehow we need to reclaim that trait for the good side of technology, with devices and software that are actually desirable without being leeches that suck society dry.
But do young people really love this hyper-commercial internet these days? All the subscription services? The empty social media content?
I do see what they mean a bit because I'm pretty sceptical of AI, though I did set up my own server to experiment with it in a way where my stuff doesn't end up in the cloud.
The techie web isn't coming back and wishing it so won't do so. You can always just drop into an IRC network or Mastodon server with other nerds, but the days that everyone on the web was a techie nerd with general-purpose computing interests is long gone.
> The techie web isn't coming back and wishing it so won't do so. You can always just drop into an IRC network or Mastodon server with other nerds, but the days that everyone on the web was a techie nerd with general-purpose computing interests is long gone.
Well in that way it's still there. It lives on here on HN and the other places you mention. Probably as big as it was in those days. I don't think it's really gone. Just the internet grew around it with all the commercial BS and big tech companies viewing users as products.
Also, us techies manage to avoid the worst of that with adblockers, pay wall blockers, pirate video downloads, self hosted services etc. I probably see only a handful of ads every day. Even my phone blocks most of them. I also have custom scripts for the sites I frequent the most to make them more info dense like hacker news (and remove most of the big photos)
Yep, earlier eras of computing were characterized by more user control, less surveillance and fewer predatory business models. Yet it’s important not to overlook the progress we’ve made. Modern tech is vastly more powerful, accessible, and interconnected than what came before. And in a case of a tech world nostalgia should inspire action for improvement
> "We need comprehensive privacy legislation in the United States that enshrines individual privacy as a fundamental right. We need Right to Repair legislation that puts control of our devices back into our hands—and also DRM reform, especially repealing Section 1201 of the DMCA so we can control the goods we own and historians can preserve our cultural heritage without the need for piracy."
Laws alone won't be enough - there's a need for new design approaches for production devices and systems. For each of the above:
-Expanding high-speed internet to all regions of the country is a positive, but privacy is limited because metadata is visible, and if we assume all nations are tapping into the trunk of the internet and collecting everything that transits their systems, this means strong encryption should be the concept around which all communication systems are built, so that at least the content of the messages can't be read.
-Right-to-Repair should extend to device design goals in which maintenance and replacement of components is intended and user alterations and upgrades aren't actively blocked. Batteries should be relatively easy to replace, etc.
-For cultural history preservation, allow archivists to bypass DRM and store offline backups of materials. Also make it easy to become an archivist and build communities of archivists.
- Nintendo Switch games don't have microtransactions now
- VHS are unplayable now because people no longer have the machines. You can still buy anything on Blu-ray and own it forever but most people prefer the convenience of not needing a machine and disc collection.
- On Amazon now, nearly literally anything you could find in a box store is available, and you can have most of it in 1 day. Just buy from reputable brands or Amazon itself and you will be fine.
- There are so many benefits of a smartphone -- maps, internet browser for emergencies, music streaming, audiobooks, 2-factor. Flip phones are still around but no one uses them
- Google search is barely even needed now because of chatgpt, which also doesn't have ads and seo trash
- Ubuntu is better than Microsoft 95 and doesn't track you
- Social media is worse now, I'll concede here.
2. The article seemingly champions personal liberty and then has a section titled "How we can reclaim control". How about we let consumers decide what they want? If you don't like microtransactions don't buy games with microtransactions, eg.
3. It's ironic that the community run by the premier tech vc seems so against capitalism.
Wow, incredible progress in just 30 years!
The Californian Ideology is either pro-corporate or corporate-naive. Either technology itself is deterministically going to democratize things (like the Internet was an auto-democratic force, they thought—or now AI is going to help liberate everyone, hah) and/or you just don’t need to worry about private interests.
But private (corporate) interests didn’t just come out of nowhere. The Internet was created by the US state (federal) sector and then handed over for commercialization around the Clinton era. Should anyone be surprised about the turn of events?
Now the author, just as naive as the rest, talks about reigning in corprorate interests by enacting laws. And who is gonna make the politicians do that? The rich control the government. Many of them are the tech-rich.
Biden said in his farewell address that he was worried about a rising Tech Robber Baron era. Yikes. Someone should have done something about that. Like the departing president, perhaps?[1]
All of this was mainly done by the rich. Not by nerds (because not all nerds are rich). But the less wealthy California nerds who bought into the Californian Ideology helped it along.
[1] It’s not that he did nothing. It’s that he did a half-spirited job of it. If he really meant and was motivated by his own words he would have done more.
We may as well follow Steve Martin and get small again.
https://www.youtube.com/watch?v=w6Na0M-Ixm8
1. They are doing a little bit of revisionist history, as the industry was fiercly capitalist and proprietary at that time.
2. This topic really does feel rather beaten to death and I think the target audience is not getting any new information.
Speaking specifically about the revisionist history part:
> At its core, the PC movement was about a kind of tech liberty—–which I’ll define as the freedom to explore new ideas, control your own creative works, and make mistakes without punishment.
Was it? The PC has its roots in IBM, and it became the target product to clone because, since the project was something of a sidenote to IBM's main business, IBM was too cheap/lazy/wahtever to develop proprietary parts. They cobbled together a system that was easy to clone, perhaps entirely by accident.
The PC wasn't a universal compatible open standard because of tech liberty, it was a compatible standard because (among other reasons) Microsoft introduced a new OS business model where PC clones fighting each other over low margins benefitted Microsoft. Before Microsoft DOS, each PC was its own moat with its own hardware, its own operating system, and its own proprietary software. Microsoft made everything easy and wonderful as long as you kept using Windows.
Apple operated back with the OS/hardware/software moat back then and that's essentially how they continue to operate. They are the only company that survived after that era using that fully proprietary business model and still operates that way.
As another commenter pointed out, Nintendo was ruthless about hardware DRM and was a full blown monopoly in their heyday. That's why your parents always call it "Nintendo" instead of "video games," because there was no other vendor anywhere near as successful at that time.
Another example of a lack of tech liberty, "Don't Copy that Floppy" was all over the place, a phrase that I've heard injected into Computer Chronicles episodes. Companies were doing all kinds of things to try and prevent you from inspecting, modifying, and copying their software.
The Linux kernel didn't exist until 1991, and most UNIX flavors were proprietary.
The only reason that era didn't have invasive privacy and data extraction problems is because it wasn't feasible, not because it was an era and movement that had excellent tech liberty.
Compare that to today, and it's actually today that's much more of an era of personal computing freedom. I certainly wasn't using an open source web browser, open source IDE, open source server operating system, open source graphics driver, open source PDF editor/viewer, or much other open source software in the 90's. It would have been unthinkable back then to use an open source program to do something like 3D graphics rendering, that would have been reserved for 5-figure Silicon Graphics workstations. And good luck replacing Adobe with something open source.
Hosting a major commercial website for a fortune 100 company on an open source operating system? You would be laughed out of town.
Unfortunately, the mobile revolution didn't work that way. Regular folks don't care about open, flexible, and cheap. Only convenient and cheap. The gravity of those folks has led us here.