It takes five minutes to delete your TikTok, Meta, and Instagram accounts. Setting up forwarding rules from Gmail to Fastmail or another provider takes a bit longer; after three months of switching accounts over, hopefully all your email is arriving at the new address. These companies can’t manipulate you if you don’t use their products.
“We respect your privacy” banner, with a big green ok button and a “manage data collection” tiny print text that had consent for everything automatically approved
The point isn't us. You should know that. The point is the 99.8% who don't have our skills and are forced into these dark patterns by deception or psychological manipulation.
Sure, making Instagram as addictive as possible seems bad, but I disagree with the framing a bit. Dark patterns get users to do things they don't want; that's why they get super annoyed at the design or the process or the outcome. Addictive apps are a different thing to me.
I don't think it's that compelling to say "obviously no one wants to be on Instagram and they're getting manipulated into it." ...yeah they do! The question is can you make a compelling case that spending time on it is harmful.
Worked great if the plan was to swoop in with a promise to restore the drug supply… but only if the dealer sells through a group of close friends who donate heavily to you.
Well, seeing as we are all granted one single life, maybe we should be more upset at things that take away our valuable time and replace it with things that make us angry? Who's to say these things aren't worse than heroin? Lots of people would argue they are; I'm becoming one of them myself. Heroin only impacts one individual; social media impacts every connected person on the planet.
They are the wolf. The product is the user's attention, they are ad delivery networks disguised as "social media."
The entire revenue model is based on engagement and clicks, so the product is incentivized to maximize time spent on the service at any cost. Addiction is a core engineering requirement.
They know what they're doing. They've tried to bury the evidence, but their own internal studies have shown addiction and harmful psychological effects in children.
There are still supposedly serious people, who should know better, insisting that "dark patterns" are not real and are just a mechanism to attack tech companies. I don't know how anyone can honestly reach that conclusion these days. Some of these sites use strategies similar to those the old tobacco companies used; all of this stuff is already well known to marketers.
Look it's either this or we adopt an economic strategy that isn't basically "assume the market magically knows what is best"—i.e., communism, as I understand Americans to know the term.
While I agree with the premise, I do wonder how you can write a law that would stop the behavior we want to stop without hurting beneficial features or allowing the law to be too easily bypassed.
How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?
> How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?
For laws like this it always boils down to "I'll know it when I see it" which is such a shockingly poor way to write legislation that I'm flabbergasted it doesn't immediately fail any amount of rudimentary scrutiny. Not to mention the latitude it grants for selective enforcement. It's basically Washington asking (through the Economist) for a leash on platforms that host their critics that they can yank at any time the population gets too rowdy, with the convenient justification that the algorithm is too good and our attention spans are in danger or whatever.
Agree. My first thought is that most people in the early days didn’t even want to start using PCs for work to begin with. Businesses generally had to mandate it. I imagine many people are facing this today with AI.
Very simple: force companies into data interoperability. That would allow users to move to a competitor without any data loss. E.g., nobody actually cares that GitHub is constantly down, because you can move your repos to a different git provider or to your own server.
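For the git case, the migration really is lossless with stock git alone. A minimal local sketch (the `old-provider.git` / `new-provider.git` paths are stand-ins for real remote URLs; in practice you would use the providers' clone/push URLs instead):

```shell
#!/bin/sh
set -e
WORK=$(mktemp -d)
OLD="$WORK/old-provider.git"   # stand-in for the old host's remote
NEW="$WORK/new-provider.git"   # stand-in for the new host / your own server

git init -q --bare "$OLD"
git init -q --bare "$NEW"

# Seed the old provider with one commit so there is something to migrate.
git init -q "$WORK/seed"
(
  cd "$WORK/seed"
  git -c user.email=a@example.com -c user.name=a \
      commit -q --allow-empty -m "first"
  git push -q "$OLD" HEAD:refs/heads/main
)

# The actual migration: a mirror clone copies every branch and tag,
# and a mirror push replays all of them at the new provider.
git clone -q --mirror "$OLD" "$WORK/mirror"
git -C "$WORK/mirror" push -q --mirror "$NEW"

# Same commit on both sides: nothing was lost.
git --git-dir="$OLD" rev-parse main
git --git-dir="$NEW" rev-parse main
```

Git works because the data format is the interoperability standard; the argument is that other services would need the same property legislated.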
Well, you could look to the gambling market for inspiration and let people voluntarily sign up for a blacklist on that feature.
That would be a lot of extra work for the platforms, but I think the results would be interesting. It amounts to legislating that certain features have to be optional and configurable.
The irony is that in order to read this piece I had to pass a cookie wall, which gave me only ‘Accept all’ and ‘Manage’. Then I couldn’t read it anyway, because I had no subscription.
> An internal memo found that 12-year-olds were three times as likely as 32-year-olds to stay on Facebook for the long term, despite the platform nominally requiring users to be at least 13; the memo concluded that Facebook “should consider investing more heavily in bringing in larger volumes of tweens”.
100 years from now, the engineers who work at Big Tech will be looked upon by their descendants with the same shame that people nowadays feel toward ancestors who were involved in tobacco.
This is an outrageously dumb thing to say. Big Tobacco knowingly sold a product that physically addicted (the only real form of addiction) its users and killed them.
Facebook ran experiments on unknowing teenage girls to study how being shown negative content leads to negative mental health outcomes, which has led to suicide.
> Problem gambling (PG), also known as pathological gambling, gambling disorder, gambling addiction or ludomania, is repetitive gambling behavior despite harm and negative consequences. [0]
Addiction isn't just [chemical in blood stream] -> [addiction]. Addiction involves many steps, many of them in the brain, and many of those reactive to non-physical events.
Gambling is conventionally considered addictive, but the user isn't ingesting chemicals. I don't think a physical/non-physical binary really stands up under scrutiny. I mean, aren't all addictions physical insofar as they stimulate the body to produce neurotransmitters?
Plus, smoking doesn't kill people; its pathological outcomes do. Similarly, looking at a phone screen might hurt a user's eyes, but it won't kill them; however, the decisions that user makes over time, under the influence of the subject matter they interact with, may well put them at risk. And if aspects of that subject matter are deliberately amplified for their addictive properties, should platforms be regulated to control this?
Step by step I am slowly backing away from any technology that I don't like, sometimes going to ridiculous lengths to bypass certain imposed asymmetric requirements, up to and including abandonment.
Nothing in my house beeps.
My only online subscription is for web space.
At this point it has become fun: I have stopped reacting, and am experimenting and planning ahead, while figuring out ways to increase my income and reduce my personal spending.
I'm no defender of engagement algorithms and social media (including upvote-based algos, and this site too)... but this is a ridiculous argument.
Social media is not making you behave in ways you don't want. On the contrary, it's giving you EXACTLY what you want. People want to doomscroll social media instead of engage reality, because the real world requires action, effort and social risk...doomscrolling is pure passive consumption.
If we're going to give people autonomy and freedom to choose how they spend their time, at some point we have to draw the line and hold people accountable for their own actions. Or we have to acknowledge we'd rather stay in a permanent state of adolescence and give full control of our lives to big brother.
This constant push by the urban monoculture to turn everything into an "addiction" and turn everyone into a "victim" is a terrible set of ideas to put in people's heads, and is just as toxic as anything they claim smartphone apps are trivially doing with UI design.
Apps are not physically addictive like cigarettes or alcohol and never have been.
And if you're going to argue social media preys on reward systems in the brain, this is also true of everything humans do. Reward systems in the brain govern every single action we take, so anything we do can be turned into victimization by some addictive outside force.
Everything is an addiction. Nothing is an addiction.
Why do you get out of bed at all in the morning? What drives you to exist? Why are you sitting at your keyboard right now arguing with a random stranger on the internet?
Are you procrastinating something else you should be doing instead...and is that Hackernews' fault or yours?
> “We respect your privacy” banner, with a big green ok button and a “manage data collection” tiny print text that had consent for everything automatically approved
Before they existed, websites would just put stuff on your computer without asking. They’re literally a consumer protection.
Direct your outrage elsewhere.
I think you're being condescending though, and missing the point.
> I don't think it's that compelling to say "obviously no one wants to be on Instagram and they're getting manipulated into it." ...yeah they do! The question is can you make a compelling case that spending time on it is harmful.
I’ve been using the internet for longer than I care to admit, and I’ve never seen anything like it.
It was like 300 million junkies all lost their drug supplier at the same time.
Mass misery is still misery.
By that I mean: is the product addiction with a shroud of media, or is it media that just happens to be addictive?
Huh? Does anyone actually care any more? The kind of moralizing busybodies that spend their time shaming the tobacco industry are few and far between.
Facebook is not that.
[0] https://en.wikipedia.org/wiki/Problem_gambling
https://journals.sagepub.com/doi/10.1177/26318318221116042
snippet from the abstract
> Contrary to the earlier notion that addiction is predominantly a substance dependency, research now suggests that any source or experience capable of stimulating an individual has addictive potential. This has led to a paradigm shift in the psychiatric understanding of behavioural addictions.
dopamine, the little “hit” you get on social media sites or when you get a “ping”, has a massive role to play in behavioural addictions. and with behavioural addiction it basically causes the same stuff in the brain that cocaine etc does (very simplified explanation).
also, i’m a recovering drug addict. and i can tell you for sure from my lived experience that addiction is definitely not limited to physical stuff like drugs. xD
gonna need a citation on that one, dawg
https://www.npr.org/sections/shots-health-news/2025/06/18/nx...