At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] That is, in my opinion, a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.
> the WHO contradicted itself many times during the pandemic
Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.
it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.
misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.
IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.
As I recall from my school days, in Social Studies class, there were a set of Critical Thinking questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.
We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.
Silver lining in this is that the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in a crisis, and needing to pressure or ask more of private companies to do that. But I also like that we can reflect back and say: maybe that didn’t work like we wanted, or maybe it was heavy-handed.
In many governments, the government can do no wrong. There are no checks and balances.
The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.
But hopefully we will still have a system that can have room for critique in the years to come.
I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
> No one owes you distribution unless you have a contract saying otherwise.
Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations.
It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.
Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.
I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."
So you're saying that YouTube is a publisher and should not have section 230 protections?
They can't have it both ways. Sure, remove content that violates policies, but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.
The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online; the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning Tylenol-autism-sceptical accounts?
> the government and/or a big tech company shouldn't decide what people are "allowed" to say.
That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.
Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it so if we buy that, maybe it doesn't matter.
> What if they started banning Tylenol-autism-sceptical accounts?
What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.
It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The first amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the first amendment is not eventually eroded away to nothing.
Isn’t promoting/removing opinions you care about a form of speech?
If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.
If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.
If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?
Are you in favor of HN allowing advertisements, shilling, or spam in these threads? Because those things are free speech. Would you like to allow comments about generic ED pills?
I simply don't believe people who say they want to support a culture of free speech on a media or social media site. They haven't really thought about what that means.
The line should be what is illegal, which, at least in the US, is fairly permissive.
The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.
The thing is that people will tell you it wasn’t actually censorship, because for them it was only a busybody, nosey government telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.
No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.
As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.
This is just a reminder that we're both posting on one of the most heavily censored, big-tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.
What you are arguing for is a dissolution of HN and sites like it.
Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?
The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.
Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.
Glad to see this, was going to make a similar comment.
People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".
Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.
I think it made sense as a tactical choice in the moment, just like censorship during wartime. I don't think it should go on forever, because doing so is incompatible with a free society.
It didn't even make sense at the time. It cast a cloud over everything, suggesting that the official, accepted truth needed to suppress alternatives to win the battle for minds. It was disastrous, and it is astonishing to see people (not you, but in these comments) still trying to paint it as a good choice.
It massively amplified the nuts. It brought it to the mainstream.
I'm a bit amazed seeing people still justifying it after all we've learned.
COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.
But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive from the moment we knew the vaccines had a negligible effect on spread. When platforms run by well-intentioned people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.
And now we're living with the consequences, where we have a worm-addled halfwit directing medicine for his child-rapist pal.
>It massively amplified the nuts. It brought it to the mainstream.
>COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
In theory, I agree, kind of.
But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration.
The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed.
Most people I know who ultimately refused the vaccines made up their minds before Biden took office.
I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.
My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.
I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.
Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.
These policies were put in place because the anti-vax and election-skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.
The US military also promoted anti-vax propaganda in the Philippines [0].
A lot of the comments here raise good points about silencing well meaning people expressing their opinion.
But information warfare is a fundamental part of modern warfare. And it's effective.
An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.
So
> I think we have to realize silencing people doesn't work
it seems to have been reasonably effective at combating disinformation networks
> It just causes the ideas to metastasize
I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.
I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine.
It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content.
As an example, for 'curing' autism, the new grift is reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.
To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.
And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.
These people will claim they were 'silenced' regardless. Even as they appear with their published bestseller about being silenced on every podcast and news broadcast under the sun, they will speak of the 'conspiracy' working against them at every step. The actual facts at hand almost never matter.
Even at a press conference where the President is speaking on your behalf they'll speak of the 'groups' that are 'against' them, full of nefarious purpose.
There is no magical set of actions that changes the incentive they have to lie, or believe lies. (except regulation of snake oil, which is not going to happen any time soon)
In general, you can't argue or 'fact' people out of beliefs they were not argued into. The best you can do is give them a safe place to land when disconfirmation begins. Don't be too judgy, no one is immune to propaganda.
I agree. People today are far more anti-vaccine than they were a few years ago which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.
I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
I think the anti-vax thing is mostly because the average Western education level is just abysmal.
Add in a healthy dose of subconsciously racist beliefs about how advanced Western society is (plus ideas of how this means they must be smart too) and how catching diseases preventable by vaccines is only a brown people thing.
Basically, it's easy to be anti-vax when the disease isn't in your face and you have an out-group to blame even if it does end up in your face (a common excuse by anti-vaxxers I see when measles is in the news is that the immigrants are bringing it in and should be blamed instead of anti-vaxxers)
You don't see those, because it's in their faces, or more accurately in our faces. I live in such a country, and we would kill to have our kids vaccinated. We live with these diseases, so we aren't so stupid as to fall for misinformation.
The anti-vax thing is because every single comparative study of vaccinated and unvaccinated children found a greater rate of developmental disorders in vaccinated children. They're also the only products for which you're not allowed to sue the manufacturers for liability, and the justification given by the manufacturers for requesting this liability protection was literally that they'd be sued out of business otherwise. If they were as safe as other treatments they wouldn't need a blanket liability immunity.
Anthony R. Mawson et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi:10.15761/JTS.1000186.
Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.
Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.
Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.
James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.
James Lyons-Weiler, “Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them,” International Journal of Vaccine Theory, Practice and Research 2, no. 2 (2022): 603-618, doi:10.56098/ijvtpr.v2i2.59.
NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.
Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.
Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2 (2022): 670-686, doi:10.56098/ijvtpr.v2i2.40.
Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.
Mawson et al. 2017 (two papers) – internet survey of homeschoolers recruited from anti-vaccine groups; non-random, self-reported, unverified health outcomes. Retracted by the publisher after criticism.
Hooker & Miller 2020/2021 – analysis of “control group” data also from self-selected surveys; same methodological problems.
Lyons-Weiler & Thomas 2020, 2022 – data from a single pediatric practice run by one of the authors; serious selection bias.
Joy Garner / NVKP surveys – activist-run online surveys with no verification.
Enriquez et al. 2005 – a small cross-sectional study about allergy self-reports, not about overall neurodevelopment.
Large, well-controlled population studies (Denmark, Finland, the U.S. Vaccine Safety Datalink, etc.) comparing vaccinated vs. unvaccinated children show no increase in autism, neurodevelopmental disorders, or overall morbidity attributable to recommended vaccines.
I picked one at random (NVKP, "Diseases and Vaccines: NVKP Survey Results") and, while I needed to translate it to read it, it's clear (and loud!) about not actually being a scientific study.
"We fully realize that a survey like this, even on purely scientific grounds, is flawed on all counts. The sample of children studied is far too small and unrepresentative, we didn't use control groups, and so on."
Turns out the NVKP roughly translates to "Dutch Organization for those critical towards vaccines."
I understand being skeptical about vaccines, but the skepticism needs to go both ways.
"If they were as safe as other treatments they wouldn't need a blanket liability immunity." Citation very much needed for this inference.
Even if I granted every single paper's premise here. I'd still much rather have a living child with a slightly higher chance of allergies or asthma or <insert survivable condition here> than a dead child. How quickly we forget how bad things once were.
Do you dispute that vaccines also accounted for 40% of the decline in infant mortality over the last 50 years?
And before that, TB, Flu, and Smallpox killed uncountably many people. Vaccines are a public good and one of the best things we've ever created as a species.
Do you also have theories about autism you'd like to share with the class?
A very good point. These studies should be comparing QALYs (quality-adjusted life years, a measure of disease burden) instead of relative prevalence of a handful of negative outcomes, the latter of which is much more vulnerable to p-hacking.
Here’s where the “bad ideas out in the open get corrected” theory is now being tested. There are four really good refutations of your evidence here, and that's setting aside the unspoken argument: perhaps vaccines cause some measurable bad outcomes, but compare them to measles, and without herd immunity, vaccination isn't nearly as useful.
So the important question is: are you now going to say “well, I guess I got some bad data and I have to go back and review my beliefs,” or dig in?
The studies you cite are the typical ones circulated by antivaxers and are not considered credible by the medical community due to severe methodological flaws, undisclosed biases, retractions, etc.
To the contrary, high quality studies consistently show that vaccines are not linked to developmental disability or worse health outcomes.
> Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186
If you edit down your list to journal articles that you know to be valid and unretracted, I will reconsider looking through it. However, journal access in general is too expensive for me to bother reading retracted articles.
They're into folk medicine, but their anti-vax issues generally come from people who don't have any means of knowing better (i.e. never been to school, dropped out at a very early grade, isolated, not even literate). Typically just education and having a doctor or a local elder respectfully explain to them that the Polio shot will help prevent their child from being paralyzed for life is enough to convince them.
Meanwhile the 'educated' Westerner, to whom Polio is a third-world disease, will convince themselves that the doctor is lying for some reason, will choose to take the 75% chance of an asymptomatic infection because they don't truly appreciate how bad it can otherwise be, will use their access to a vast collection of humanity's information to cherry pick data that supports their position (most likely while also claiming to seek debate despite not intending to seriously consider opposing evidence), and if their gamble fails, will probably just blame immigrants, government or 'big pharma' for doing it.
>SEA and others are still better educated than us.
Honest question: is this true? What’s the data around this? If it is true, why are there so many people from SEA in American universities? Wouldn’t they stay in their home country or another in the area?
I’m truly trying to learn here and square this statement with what I’ve come to understand so far.
Anti-vax has never really been a thing though. I don't know what the data is these days, but it used to be like 1% of the population who were anti-vax.
We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.
And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.
No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe that peas do cause cancer, and then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat-Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on, funded by big agriculture? You can see how this type of thinking happens.
Anti-vax was enough of an issue that vaccine mandates were necessary for Covid.
It also isn't convincing to claim that racism isn't a big problem in the West, given all the discourse around H-1Bs and Indians (the Trump base has been pretty open on this one, with comments on JD Vance's wife, the flood of anti-Indian racism on social media, and recently the joy taken in attempts to interfere with Indians forced to fly back to the US in a hurry due to the lack of clarity on the H-1B thing), how ICE is identifying illegal immigrants, a senator openly questioning the citizenship of a brown mayoral candidate, and so on.
I agree that denying something is the easiest way to convince people of the opposite, but it's also understandable when social media companies decide to censor advice from well known individuals that people should do potentially harmful things like consume horse dewormer to deal with Covid. Basically, it's complicated, though I would prefer to lean towards not censoring such opinions.
>where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.
The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .
People have become more anti-vax because the Covid vaccines were, at best, ineffective and, as you said, anything contra-narrative is buried or ignored.
If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.
More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.
This is typical of Covid conspiracy theorists, or conspiracy theorists of any sort: one or two papers on one side prove something, but an overwhelming mountain of evidence on the other side does not prove something. The theorist makes no explanation as to how a planetful of scientists missed the obvious truth that some random dudes found; they just assert that it happened, or make some hand-waving explanation about how an inexplicable planet-wide force of censors is silencing the few unremarkable randos who somehow have the truth.
The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them.
The typical example of sampling-time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be caught by a screening, giving a correlation between screening and survivability. So you get a time effect where the fastest-acting cancers do not end up in the measurement, biasing the data. But in measurements where neither outcome changes the odds of that outcome being sampled, there can be no measurement-time effect, which is why it's a fairly uncommon thing to correct for. The authors do not explain, in the abstract or anywhere else in the paper, why measurement-time effects would have anything to do with detecting death rates, or why such an unconventional adjustment is necessary, because they are quacks seeking a preferred outcome. Nor do they explain why measurement methods immune to any such effect consistently yield the result that vaccines work.
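The screening example can be sketched as a toy simulation. All the probabilities below are invented purely for illustration: in this model screening does nothing causally, yet the screened group still shows much better survival, because slow-growing (more survivable) tumors are simply more likely to be caught by a screen.

```python
import random

random.seed(0)

def simulate(n=100_000):
    """Toy model of sampling-time bias in cancer screening."""
    screened_survived = screened_total = 0
    other_survived = other_total = 0
    for _ in range(n):
        slow = random.random() < 0.5                      # tumor type: slow or fast growing
        survives = random.random() < (0.9 if slow else 0.3)  # slow tumors are more survivable
        # Slow tumors linger longer, so a one-time screening catches them more often.
        # Note: screening has NO effect on survival in this model.
        caught_by_screen = random.random() < (0.8 if slow else 0.2)
        if caught_by_screen:
            screened_total += 1
            screened_survived += survives
        else:
            other_total += 1
            other_survived += survives
    return (screened_survived / screened_total,
            other_survived / other_total)

s, u = simulate()
print(f"survival if caught by screening: {s:.2f}")  # noticeably higher
print(f"survival if caught otherwise:    {u:.2f}")  # noticeably lower
```

The gap between the two survival rates comes entirely from which tumors each group over-samples, which is the spurious correlation the comment describes.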
> but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
Nah, the same grifters who stand to make a political profit of turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects, that go well beyond COVID vaccines.
As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much - or more - than someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)
I agree. Again the vast majority would have gotten the vaccine.
There's always going to be people for all kinds of reasons pushing out bad ideas. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.
> They've completely taken over public discourse on a wide range of subjects
Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point you are now holding a minority opinion you should consider whether "they" are right or wrong and why so many people believe what they do.
If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this, we'll converge toward truth. If you think talking and debate aren't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).
> one of the only things that actually worked to stop people dying was the roll out of effective vaccines
"A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)
"the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)
"The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)
"Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)
It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if some people didn’t try to shut them down so much.
no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"
Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.
Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.
As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.
>A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
Except many people don't roll their eyes at it, that's exactly the problem. QAnon went from a meme on 4chan to the dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. All of this in an environment of aggressive skepticism, arguing, debating and debunking. Sunlight is not disinfecting anything.
We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"
Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.
The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other
Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).
Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their “evidence” and truly believe they are right, no matter what the other person says.
I honestly don’t know. My libertarian foundation wants me to believe that any and all ideas should be able to be spread. But with the technological and societal changes in the past 10-15 years, we’ve seen how much of a danger this can be too. A lie or mistrust can spread faster than ever, to a wider audience than was previously possible.
I don’t have a solution, but what we have now is clearly not working.
How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.
Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.
Well, people literally died. So, I think we all know how it played out.
The same thing as since time immemorial will continue to occur: the educated and able will physically move themselves away from risk, and others will suffer, whether by their own volition, by association, or by lot.
Sure, let the right-wing propaganda machine churn lies and misinformation full-blast, maybe people will magically come to their senses and realize that, no, vaccines and paracetamol don't cause autism, or that the 2020 election wasn't stolen.
Look at twitter before and after Musk, and tell me again that deplatforming doesn't work.
Funny thing: several people who responded and disagreed got grayed out (i.e., downvoted into the negative... as in censored).
Reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.
The moment you are in the YouTube, TikTok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response of "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news, and over time people believe it.
That is the result of uncensored access, because most people do not have the time to really look up a scientific study. The number of negative channels massively outweighs the positive, fact-based channels, because the latter are "boring". It's the same reason why your evening news is 80% deaths, corruption, theft, politicians and taxes, or other negative world news: it has been proven that people take in negative news much more readily. Negative clickbait titles draw people in.
There is a reason why holocaust denial is illegal in some countries: the longer some people can spew it, the more people actually start to believe it.
Yes, I am going to get roasted for this, but people are easily influenced and they are not as smart as they think they are. We have platforms that cater to people's short attention spans with clips barely 1-3 minutes long. YouTube videos longer than 30 minutes are terrible for a youtuber's income, because people simply do not have the attention span to finish them.
Why do we have laws like seatbelt requirements, speed limits, and other "controls" over people? Because people, left to their own devices, can be extremely uncaring about their own family, others, even themselves.
Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill them, their family, or others.
Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work... yeah.
We only need to look at platforms like X when "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a cesspit extremely fast (well, a bigger cesspit).
Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax content did to my family. And even to this day, that damage is still present. A person who never had an issue with vaccinations, who never had a bad reaction beyond a sore arm for a day, turned skeptical of everything vaccination-related. All because those anti-vax channels got to her.
The anti-vax movement killed people. There is scientific study upon study showing how red states in the US ended up with higher death rates over the same time periods. And yet not a single person was ever charged for this... everyone simply accepted it and never looked back. As if it were a natural thing that people's grandparents and family members died who did not need to die.
The fact is that people have given up, and now accept letting those with (often financial) interests spew nonsense as much as they like. Well, it's "normal".
I weep for the human race because we are not going to make it.
I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?
Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?
What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?
> What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?
Time delay. No content based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change (in this case, the timer resets) their content.
I’d also argue for demonetising political content, but idk if that would fly.
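For what it's worth, that time-delay proposal is simple enough to model. Here's a minimal sketch (the 2-hour minimum, the `Post` class, and its method names are all my own invention for illustration, not anything a real platform exposes):

```python
import time

DELAY_SECONDS = 2 * 3600  # hypothetical 2-hour minimum from the proposal


class Post:
    """A post that only becomes visible after a fixed, content-blind delay.

    Editing the post resets the timer, as the proposal suggests; deleting
    it before the timer expires means it was never shown at all.
    """

    def __init__(self, body, now=None):
        self.body = body
        self.submitted_at = now if now is not None else time.time()

    def edit(self, new_body, now=None):
        # Changing the content resets the visibility timer.
        self.body = new_body
        self.submitted_at = now if now is not None else time.time()

    def is_visible(self, now=None):
        now = now if now is not None else time.time()
        return now - self.submitted_at >= DELAY_SECONDS
```

The key property is that the rule never inspects `body` at all, which is what makes it a content-neutral restriction rather than editorial judgement.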
Ok, but how does that get implemented? Not technically, but who makes it happen and enforces the rules? For all content or just “political”? Who decides what’s “political”? Information about the disease behind a worldwide pandemic isn’t inherently “political”, but somehow it became so.
Who decides what falls in this bucket? The government? That seems to go against the idea of not restricting speech and ideas.
Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)
> For all content or just “political”?
The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.
I’d borrow from the French: all content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected official by name, but then we’ll just get meme names and nobody needs that.)
Bonus: electeds get constituent pressure to consolidate elections.
Alternative: these platforms already track trending topics. So an easy fix is to slow down trending topics. It doesn’t even need to be by that much, what we want is for people to stop and think and have a chance to reflect on what they do, maybe take a step away from their device while at it.
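That "slow down trending topics" alternative can also be sketched in a few lines. This is a toy damping function under my own assumptions (the velocity cap, its units of engagements per minute, and the linear scaling are all invented for illustration):

```python
def damped_trending_score(raw_score, velocity, velocity_cap=100.0):
    """Scale down the trending score of topics that are spreading
    faster than `velocity_cap` engagements per minute.

    Slow-moving topics are untouched; a topic spreading twice as fast
    as the cap has its score halved. The content itself is never
    inspected, so no one has to decide what counts as "political".
    """
    if velocity <= velocity_cap:
        return raw_score
    return raw_score * (velocity_cap / velocity)
</antml>```

The point of the sketch is that friction can be applied purely as a function of spread velocity, which sidesteps the "who decides" problem raised above.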
If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?
This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.
"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".
> If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?
Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.
No. This perspective is wrong in both directions: (1) it is bad medicine and (2) the medicine doesn't treat the disease.

If we could successfully ban bad ideas (assuming that "we" could agree on what they are), then perhaps we should. If the damage incurred by the banning of ideas were sufficiently small, perhaps we should. But both of these premises are false. Banning does not work, and it brings harm. Note that the keepers of "correct speech" doing the banning today (e.g. in Biden's day) can quickly become the ones being banned another day (e.g. Trump's).

It's true that drowning out the truth through volume is a severe problem, especially in a populace that doesn't care to seek out truth, to find needles in haystacks. But again, banning doesn't resolve this problem. The real solution is to develop a populace that cares about, seeks out, and with some skill identifies the truth. That may not be an achievable solution, and in the best case it's not going to happen quickly. But it is the only solution. All of the supply-based solutions (controlling speech itself, rather than training good listeners) run afoul of the same problem: you cannot really limit the supply, and to the extent you can, so can your opponents.
What do you think about measures that stop short of banning? Like down-ranking, demonetizing, or even hell-banning that just isolates cohorts that consistently violate rules?
Not OP, but my opinion is that if a platform wants to do so, then I have zero issues with that, unless they hold a vast majority of market share for a certain medium and have no major competition.
Can we stop with the Nazi stuff? I don't know if they stopped teaching history, but there is nothing happening in the US that is within an order of magnitude of the evil the Nazis perpetrated. Being anti-vax is not comparable to genocide.
The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.
Many evil people weren't Nazis; some Nazis weren't necessarily evil. Evil is not part of the definition of Nazism. Promoting authoritarianism, exclusionary nationalism, institutional racism, autarky, anti-liberalism and anti-socialism are the hallmarks of Nazism. Anyone who holds the beliefs of the Nazis is a Nazi, regardless of what level of success they have to date achieved in carrying out their aims.
> The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.
Only because what they did in 1943 surpassed anything imaginable. In 1933 the Nazi party immediately banned all political parties, arrested thousands of political opponents, started forcing sterilization of anyone with hereditary illnesses, and forced abortions of anyone with hereditary illness. Evil is absolutely an identifying part of Nazis. The idea that Nazis are just anti-liberals is exactly why we cannot go around calling everyone we don't like Nazis. The Nazis were not some niche alt-right organization.
If you genuinely think there are Nazis controlling youtube or the government, and all you're doing is complaining about it on hackernews, you're just as complicit as you're claiming those people were.
You seem to misunderstand. You are not immune to being a Nazi because you are not evil, being a Nazi makes people evil. Further, we do not call people Nazis because we dislike them, we dislike them because they are Nazis. Most non-Nazis, when accused of being a Nazi, point out how their views differ from the Nazis. The people who argue they can't possibly be Nazis because Nazis are bad, and they are not, typically are.
We read the history, and a lot of it rhymes. Conservatives failed, and exchanged their values for a populist outsider to maintain power (see Franz von Papen). The outsider demeans immigrants and 'sexual deviants'. The outsider champions nationalism. He pardons the people who broke the law to support him. He condemns violence against the party while ignoring the more common violence coming from those aligned with the party. He encourages the language of enemies when discussing political opponents and protestors.
Nazi has a lot more connotations than genocide. I'm not sure it is worth nitpicking over. Even if you tone it down to Fascist or Authoritarian there will be push back.
You'd have to ban them from society outright without somehow devolving into an authoritarian hellhole in the process (impossible). Trump still primarily posts on a platform specifically created to be a right wing extremist echo chamber.
I didn't say violence. Whatever you read into that comment is a projection. I'm not even sure violence is effective, but something more muscular than op-eds is called for. For example, labor organizing and various forms of self-defense organizations, of which there are many kinds, not only militias. For example, anti-ICE organizing which protects vulnerable people from the gestapo.
It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:
Holocaust denial?
Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property?
Bomb or weapons-making tutorials?
Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children?
How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?
Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?
I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.
Refuting does not work... You can throw out scientific study upon study, doctor upon doctor... negatives run deeper than positives.
In the open, it becomes normalized and draws in more people. Would you rather have some crazies in a corner, or 50% of a population that believes something false because it became normalized?
The only people benefiting from those dark concepts are those with financial interests. They make money from it, and push the negatives to sell their products and cures. Those who fight against it do not gain from it, and it costs them time and money. That is why it is a losing battle.
How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?
And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.
Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.
> It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
1) They are public corporations, legal creations of the state, and benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.
2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.
3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.
> From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”
This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.
It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.
For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.
If you read those documents, you will see that the administration was telling them that those accounts were in violation of Twitter TOS. They simply said "hey, this user is violating your TOS, what are you gonna do about it?", and Twitter simply applied their rules.
The implication of saying they were "pressed" by the Biden admin is that Google was unwilling. I don't buy that. They were complicit and are now throwing the Biden admin under the bus because it is politically convenient. Just like how the Twitter files showed that Twitter was complicit in it.
> wasn't Google/Youtube banning so much as government ordering private companies to do so
No, it was not. It’s particularly silly to suggest this when we have a live example of such orders right now.
The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a “bully pulpit.” But there were no orders, no credible threats, and plenty of companies didn’t deplatform these folks.
> literally had access to JIRA at Twitter so they could file tickets against accounts
I’m not disputing that they coördinated. I’m challenging that they were coerced.
We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends as the “government ordering private companies” around. (Or, say, call Florida opening their criminal justice records to ICE the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.
It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.
The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.
Everybody here is strangers online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.
Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.
A lot of channels had to avoid even saying the word Covid. I only saw it return to use at the end of last year. A variety of channels were banned that shouldn't have been, such as some talking about Long Covid.
> content that falsely alleges that approved vaccines are dangerous and cause chronic health effects
The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.
> claims that vaccines do not reduce transmission or contraction of disease
Isn't that true of the covid vaccines? Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely, but later on, they changed the goal posts to "it will reduce your symptoms of covid".
> The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.
That's not what happened. Authorities received rare reports of a clotting disorder and paused it for 11 days to investigate. That pause was lifted but the panic caused a crash in demand and J&J withdrew it from the market. Source: https://arstechnica.com/health/2023/06/j-fda-revokes-authori...
It seems like you are implying that the pause was lifted because they found nothing. That's not quite right. J&J vaccine killed 9 people, and the FDA issued restrictions on who could get it, limitations on who should get it, and warnings about the side effects.
This highlights what’s so difficult with science communication.
Right here on what should be a technical minded forum, people don’t understand what science is or how it works. Or what risk is. And they don’t even challenge their own beliefs or are curious about how things actually work.
If the “smart” people can’t or won’t continuously incorporate new information, what are our chances?
> Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely
Some people don’t understand how vaccines work, so may have claimed that, but efficacy rates were very clearly communicated. Anyone who listened in high school biology should know that’s not how they work.
That policy catches and bans any scientist studying the negative health effects of vaccines who later turns out to be right.
1) YouTube doesn't know what is true. They would be relying on the sort of people they would ban to work out when the consensus is wrong. If I watched through a YouTube video of someone spreading "vaccine misinformation", there is a pretty good chance that the speakers have relevant PhDs or are from the medical profession - there is no way the YouTube censors are more qualified than that, and the odds are they're just random unqualified employees already working in the euphemistically named "Trust & Safety" team.
2) All vaccines are dangerous and can cause chronic health effects. That statement isn't controversial, the controversy is entirely over the magnitude. Every time I get a vaccine the standard advice is "you should probably hang around here for 5 minutes, these things are known to be dangerous in rare cases". I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.
> This would include content that falsely says that approved vaccines cause ... cancer ...
Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.
3) All policies have costs and benefits. People have to be able to discuss the overall cost-benefit of a policy in YouTube videos even if they get one of the costs or benefits completely wrong.
> Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.
I'm reminded of the Prop 65 signs everywhere in California warning "this might cause cancer"
> This seems like good banning to me. Anti-vaxxer propaganda isn't forbidden thoughts. It's bad science and lies and killing people.
Any subject important enough in any public forum is potentially going to have wrong opinions that are going to cause harm. While some people could be wrong, and could cause harm, the state itself being wrong is far more dangerous, especially with no dissident voices there to correct its course.
Edit: I see you're getting downvoted for simply stating your honest opinion. But as a matter of principle I'm going to upvote you.
Pfizer hid a lot of the damage done as did the others. A lot of people can die by the time books come out. [1] That's one of the many reasons I held off and glad I did.
Shouting "fire" in a crowded theater being illegal was used to make it illegal to oppose the draft (Schenck v. United States). So actually, since opposing the draft is legal, shouting "fire" in a crowded theater is legal too.
You would be charged with inciting a riot, reckless homicide, etc regardless of the actual words you shouted to cause the deaths, but I see your point.
"Shouting 'fire' in a crowded theater" being used as an excuse for censorship is the surest way to know you are talking to someone who hasn't even started doing the reading. Even worse, they often (over the past very few years) self-identify as socialists or anti-war, and the decision was in order to prosecute anti-war socialists for passing out pamphlets.
If somebody says it, they not only don't care about free speech, they don't even care about having a good faith conversation about free speech. They've probably been told this before, and didn't bother to look it up, just repeated it again. Wasting good people's time.
According to Google's censorship algorithm, Michael Osterholm's podcast was covid misinformation (he is a famous epidemiologist and was, at the time, a member of President Biden's own gold-star covid-19 advisory panel).
Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.
My wake up moment was when they not only took down a Covid debate with a very well qualified virologist, but also removed references to it in the Google search index, not just for the YouTube link.
It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.
The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.
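To make the "fix the algorithms" suggestion concrete, here is a toy diversity-capped reranker. The viewpoint labels, scores, and the per-viewpoint cap are all invented for illustration; nothing here corresponds to a real recommender's API:

```python
from collections import Counter


def rerank_with_diversity(candidates, max_per_viewpoint=2):
    """Rerank (score, viewpoint_label) candidates by score, but allow
    no single viewpoint to occupy more than `max_per_viewpoint` slots.

    An engagement-only ranker would happily fill every slot from one
    echo chamber; the cap forces other perspectives into the feed
    without anyone judging which viewpoint is "true".
    """
    seen = Counter()
    result = []
    for score, viewpoint in sorted(candidates, key=lambda c: -c[0]):
        if seen[viewpoint] < max_per_viewpoint:
            result.append((score, viewpoint))
            seen[viewpoint] += 1
    return result
```

For example, five high-scoring videos from one cluster plus two from another would yield a top list with at most two from the first cluster, which is the sense in which diversity constraints and channel removal are different levers.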
Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?
These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm, is a sign of how far off course this entire discourse has moved.
Yeah, there are two main things here that are being conflated.
First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.
Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.
The problem is that misinformation has now become information, and vice versa, so who was anyone to decide what was misinformation back then, or now, or ever.
I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.
I've argued this before, but the algorithms are not the core problem here.
For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.
My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.
So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to ask why people don't naturally want to listen to more perspectives. Personally, I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very significant minority.
I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search absolutely do matter.
The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.
I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.
The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.
Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!
"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?
The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people doing nothing wrong will be caught up in it.
This is why permabans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright system is anything to go by, this is going to hurt more than help.
There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.
The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.
No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.
It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.
I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.
We have mechanisms for combating the government through lawsuits. If the government comes out with lies that actively harm people, I hope lawsuits follow, or you know... people organize and vote for people who represent their interests.
Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong, and all those we banned off platforms (the actual experts) were right.
i'd like to think that if I were a YTer that got banned for saying something that I believed in that I would at least have the dignity not to take my value back to the group that squelched me.
All those words, and no mention of Section 230, which is what this is really all about. Google can see which way the wind is blowing and they know POTUS will -- for better or worse -- happily sign any anti-"Big Tech censorship" bill that gets to his desk. They hope to preempt this.
I could go on. Feel free if you want to see more. :)
(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)
If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.
On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.
(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)
Yeah but in practice this isn't actually the case, people flag all the time for people just having a dissenting opinion, fitting none of the categories you mentioned
They are mega-corporations. They always do whatever the hell they want, certainly absent your input. Did you really believe they don't do whatever they want? Because that's pretty damned naive.
Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.
The next Drain-o chug challenge "accident" is inevitable, at this rate.
Nice edit where you cut out all the “progressive” stuff to make the same lie that the shooter is right wing, because you want to do anything you can to avoid admitting that the extreme left even exists, let alone is capable of assassination.
Yes, those are Kimmel words, and when he said them, everyone already knew the facts about his gay lifestyle and trans boyfriend and that he murdered Kirk out of hate. It’s nice to see that you picked up his torch regardless of how little rational sense it makes.
>Somebody using violence for political means is exactly aligned with Charlie Kirk's spoken words.
Cite it. In whole context, though video is preferable. The guy had a decade of being in front of the camera; post any of the hateful videos, it should be easy! Show us the hate, if not the justification for his own murder of course.
Steelman argument is it's better to know what liars, bigots, and other naughty people are up to than push them entirely underground. And someday future moderators may think you're naughty/lying/a quack/etc.
IMO we should not let private platforms become near monopolies, and certainly not without regulation, since they become a defacto public square. But if we're going to let them eat the world, then hopefully they'll at least use good judgment and measures like de-ranking or even banning folks who encourage others to do harm. Making bans temporary is a safety valve in case of bad moderation.
That steelman is still a pretty bad argument, though. I don't see why giving liars, bigots and other naughty people a megaphone is required in order to know what they're saying.
I suppose the argument there is that it's not necessarily a megaphone for the fella with 24 followers. The concern comes from when someone amasses a following through "acceptable" means and then pivots. Not sure how to balance that.
What is Youtube a 'near monopoly' in? Online video.....? Do you have any idea how much video there is online that's not on Youtube? They don't meet the legal definition of a monopoly
Earlier in 2025, the video game Fortnite announced[1] that they were giving cheaters with lifetime bans a "second chance" and let them return to the game. Lo and behold, cheating in the game spiked up this year and has returned as a huge ongoing problem. Turns out, the vast majority of the bans were probably correct, and when you let people back into something who were banned for doing X, they're going to immediately start doing X again once they're back in.
Admittedly, Google was very heavy handed with Covid censorship. Sure, there was a lot of genuine misinformation that maybe deserved it, but they also tended to catch a lot of actual qualified scientists engaging in scientific debate (say, arguing in favor of masks and the transmission through air theory in the early days) or even some discussion that wasn't opposing the official stances.
Somewhat related, it's pretty insane how even to this day YouTubers have to avoid referring to by name a global multi-year situation that everyone who existed at the time went through. It's due to advertisers rather than government pressure, but still, insane.
Your point reminded me that around the time when the pandemic first started, I saw a YouTube video on physics titled something like "Corona and Arc Discharge" and it had the contextual note that is sometimes added to videos. I think the official name YouTube gives it is: "topical context in information panel". I thought it was a funny case where the automated system thought this physics video had something to do with COVID.
Yeah, at the time I got the impression they were banning dissent, not just egregious or dangerous content (whatever that even means). I thought most places came to their senses a long time ago and walked back that heavy-handedness, so I'm surprised this just happened.
Merriam Webster defines con man as "a person who tricks other people in order to get their money : con artist"
Even if people were straight up wrong about their COVID-19 theories, I don't think many of the banned people were trying to get viewers to send them money.
We both know that ads and sponsorships are a significant way influencers monetize their viewers.
All they have to do is lie to attract eyeballs and they make money. E-begging isn't necessary, the platforms allow you to extract value from viewers at an incredible scale.
Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.
If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and give them the opportunity to hide. Removing only the offending content and alerting authorities is the appropriate thing a website like Youtube should be doing.
Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.
> If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life
We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.
[1] https://www.bbc.com/news/technology-52388586
Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance.
But hopefully we will still have a system that can have room for critique in the years to come.
Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.
Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations.
Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.
I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."
That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.
Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.
> What if they started banning tylenol-autism sceptical accounts?
What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.
If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.
If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.
If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?
I simply don't believe people who say they want to support a culture of free speech on a media or social media site. They haven't really thought about what that means.
If those two private companies would host all legal content, this could be a thriving market.
Somehow big tech and payment processors get to censor most software.
The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.
This throws out spam and fraud filters, both of which are content-based moderation.
"Nobody moderates anything" unfortunately isn't a functional option, particularly if the company has to sell ads.
As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.
What you are arguing for is a dissolution of HN and sites like it.
The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.
People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".
The first amendment was written in the 1700s...
It massively amplified the nuts. It brought it to the mainstream.
I'm a bit amazed seeing people still justifying it after all we've learned.
COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.
But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew it had a negligible effect on spread. When platforms of "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.
And now we're living in the consequences. Where we have a worm-addled halfwit directed medicine for his child-rapist pal.
>COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
In theory, I agree, kind of.
But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration. The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed. Most people I know who ultimately refused the vaccines made up their minds before Biden took office.
My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.
I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.
It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.
- https://www.nature.com/articles/s41586-024-07524-8
- https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1...
- https://dl.acm.org/doi/abs/10.1145/3479525
- https://arxiv.org/pdf/2212.11864
Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.
The US military also promoted anti-vax propaganda in the Philippines [0].
A lot of the comments here raise good points about silencing well meaning people expressing their opinion.
But information warfare is a fundamental part of modern warfare. And it's effective.
An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.
So
> I think we have to realize silencing people doesn't work
it seems to have been reasonably effective at combating disinformation networks
> It just causes the ideas to metastasize
I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.
[0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...
In general, you can't argue or 'fact' people out of beliefs they were not argued into. The best you can do is give them a safe place to land when disconfirmation begins. Don't be too judgy, no one is immune to propaganda.
I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
Add in a healthy dose of subconsciously racist beliefs about how advanced Western society is (plus ideas of how this means they must be smart too) and how catching diseases preventable by vaccines is only a brown people thing.
Basically, it's easy to be anti-vax when the disease isn't in your face and you have an out-group to blame even if it does end up in your face (a common excuse by anti-vaxxers I see when measles is in the news is that the immigrants are bringing it in and should be blamed instead of anti-vaxxers)
Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186
Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.
Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.
Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.
James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.
James Lyons-Weiler, “Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them,” International Journal of Vaccine Theory, Practice and Research 2, no. 2 (2022): 603-618, doi:10.56098/ijvtpr.v2i2.59.
NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.
Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.
Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2, (2022): 670-686, doi: 10.56098/ijvtpr.v2i2.40.
Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.
Hooker & Miller 2020/2021 – analysis of “control group” data also from self-selected surveys; same methodological problems.
Lyons-Weiler & Thomas 2020, 2022 – data from a single pediatric practice run by one of the authors; serious selection bias.
Joy Garner / NVKP surveys – activist-run online surveys with no verification.
Enriquez et al. 2005 – a small cross-sectional study about allergy self-reports, not about overall neurodevelopment.
Large, well-controlled population studies (Denmark, Finland, the U.S. Vaccine Safety Datalink, etc.) comparing vaccinated vs. unvaccinated children show no increase in autism, neurodevelopmental disorders, or overall morbidity attributable to recommended vaccines.
"We fully realize that a survey like this, even on purely scientific grounds, is flawed on all counts. The sample of children studied is far too small and unrepresentative, we didn't use control groups, and so on."
Turns out the NVKP roughly translates to "Dutch Organization for those critical towards vaccines."
I understand being skeptical about vaccines, but the skepticism needs to go both ways
Even if I granted every single paper's premise here. I'd still much rather have a living child with a slightly higher chance of allergies or asthma or <insert survivable condition here> than a dead child. How quickly we forget how bad things once were. Do you dispute that vaccines also accounted for 40% of the decline in infant mortality over the last 50 years? And before that, TB, Flu, and Smallpox killed uncountably many people. Vaccines are a public good and one of the best things we've ever created as a species.
Do you also have theories about autism you'd like to share with the class?
So the important question is: Are you now going to say “well, I guess i got some bad data and i have to go back and review my beliefs” or dig in?
Other treatments aren’t applied preventatively to the entire population which is why the risk presumably is lower.
To the contrary, high quality studies consistently show that vaccines are not linked to developmental disability or worse health outcomes.
Retracted: https://retractionwatch.com/2017/05/08/retracted-vaccine-aut...
If you edit down your list to journal articles that you know to be valid and unretracted, I will reconsider looking through it. However, journal access in general is too expensive for me to bother reading retracted articles.
What does the West have to do with it? Non-westerners are even more into folk medicine and witch doctors.
Meanwhile the 'educated' Westerner, to whom Polio is a third-world disease, will convince themselves that the doctor is lying for some reason, will choose to take the 75% chance of an asymptomatic infection because they don't truly appreciate how bad it can otherwise be, will use their access to a vast collection of humanity's information to cherry pick data that supports their position (most likely while also claiming to seek debate despite not intending to seriously consider opposing evidence), and if their gamble fails, will probably just blame immigrants, government or 'big pharma' for doing it.
Honest question: is this true? What’s the data around this? If it is true, why are there so many people from SEA in American universities? Wouldn’t they stay in their home country or another in the area?
I’m truly trying to learn here and square this statement with what I’ve come to understand so far.
We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.
And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.
No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe that peas do cause cancer, and then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat-Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on funded by big agriculture? You can see how this type of thinking happens.
It also isn't convincing to claim that racism isn't a big problem in the West, given all the discourse around H-1Bs and Indians (the Trump base has been pretty open on this one, with comments on JD Vance's wife, the flood of anti-Indian racism on social media, and recently the joy taken in attempting to interfere with Indians forced to fly back to the US in a hurry due to the lack of clarity on the H-1B thing), how ICE is identifying illegals, a senator openly questioning the citizenship of a brown mayoral candidate, and so on.
I agree that denying something is the easiest way to convince people of the opposite, but it's also understandable when social media companies decide to censor advice from well known individuals that people should do potentially harmful things like consume horse dewormer to deal with Covid. Basically, it's complicated, though I would prefer to lean towards not censoring such opinions.
It's often a lot better to just let kooks speak freely.
There is nobody more confident in themselves than the middle-class.
The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .
People have become more anti-vax because the Covid vaccines were at best ineffective and, as you said, anything contra-narrative is buried or ignored.
If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.
More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.
The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them.
The typical example of sampling time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be caught by a screening, giving a correlation between screening and survivability. So you get a time effect where the fastest-acting cancers never end up in the measurement, biasing the data. But in measurements where neither outcome biases the odds of that outcome being sampled, there can be no measurement time effect, which is why it's a pretty uncommon thing to correct for. The authors do not explain, in the abstract or anywhere else in the paper, why measurement time effects would have anything to do with detecting or not detecting death rates, or why such an unconventional adjustment is necessary, because they are quacks seeking a preferred outcome. Nor do they explain why measurement methods immune to any possible such effect consistently yield the result that vaccines work.
I did not read the second paper.
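The "immortal time" mechanism being argued about above is easy to demonstrate with a toy simulation. This is an illustrative sketch under invented assumptions, not a model of any particular study: everyone here has identical mortality, yet naively classifying subjects by whether they ever got dosed makes the "vaccinated" group look protected, simply because dying early removes your chance of ever being dosed.

```python
import random

random.seed(42)

def simulate(n=100_000, follow_up=365, mean_survival=1_000):
    """Null model: vaccination has NO effect on mortality.

    Each subject gets a random death day and (for most) a random
    scheduled dose day. Anyone who dies before their dose is
    classified 'unvaccinated' -- the misclassification that
    immortal time bias exploits.
    """
    deaths = {True: 0, False: 0}
    totals = {True: 0, False: 0}
    for _ in range(n):
        death_day = random.expovariate(1 / mean_survival)
        # 20% of subjects never schedule a dose at all.
        dose_day = (random.uniform(0, follow_up)
                    if random.random() < 0.8 else float("inf"))
        vaccinated = death_day >= dose_day  # survived long enough to be dosed
        totals[vaccinated] += 1
        deaths[vaccinated] += death_day <= follow_up
    return deaths[True] / totals[True], deaths[False] / totals[False]

v_rate, u_rate = simulate()
print(f"'vaccinated' mortality:   {v_rate:.3f}")
print(f"'unvaccinated' mortality: {u_rate:.3f}")
```

Under this null model the "unvaccinated" death rate comes out far higher purely as a classification artifact, which is the kind of effect a properly designed cohort study has to rule out before attributing a difference in either direction.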
Nah, the same grifters who stand to make a political profit off turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects that go well beyond COVID vaccines.
As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much as - or more than - someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)
There are always going to be people pushing out bad ideas, for all kinds of reasons. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.
> They've completely taken over public discourse on a wide range of subjects
Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point that you now hold a minority opinion, you should consider whether "they" are right or wrong, and why so many people believe what they do.
If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this we'll converge towards truth. If you think talking and debate aren't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).
"A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)
"the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)
"The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)
"Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)
Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.
As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.
Except many people don't roll their eyes at it, and that's exactly the problem. QAnon went from a meme on 4chan to the dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. All of this in an environment of aggressive skepticism, arguing, debating and debunking. Sunlight is not disinfecting anything.
We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"
Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.
Is it? How does that work at scale?
Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube were still available in many places (including here).
Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their “evidence” and truly believe they are right, no matter what the other person says.
Trump thought so too.
Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.
The same thing that has happened since time immemorial will continue to occur: the educated and able will physically move themselves away from risk, and others will suffer, whether by their own volition, by association, or by lot.
Look at twitter before and after Musk, and tell me again that deplatforming doesn't work.
There's a reason you don't fan the flames of disinformation. Groups of people cannot be reasoned with like you can reason with an individual.
[1] https://systemicjustice.org/article/facebook-and-genocide-ho...
We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.
For some reason, that didn't work either.
What is going to work? And what is your plan for getting us to that point?
People can post all sorts of crazy stuff, but the algorithms do not need to promote it.
Countries can require Algorithmic Impact Assements and set standards of compliance to recommended guidelines.
The reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.
The moment you are in the YouTube, TikTok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response: "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news; over time, people believe it.
That is the result of uncensored access, because most people do not have the time to really look up a scientific study. The negative channels massively outweigh the positive, fact-based channels because the latter are "boring". It's the same reason why your evening news is 80% deaths, corruption, theft, politicians and taxes or other negative world news: it has been proven that people take in negative news much more. Clickbait titles that are negative draw people in.
There is a reason why Holocaust denial is illegal in some countries: the longer some people can spew it, the more people actually start to believe it.
Yes, I am going to get roasted for this, but people are easily influenced and they are not as smart as they think they are. We have platforms that cater to people's short attention span with barely 1-3 minute clips. YouTube videos longer than 30 minutes are horrible for a youtuber's income, because people simply do not have the attention span.
Why do we have laws like seatbelts, speed limits, and other "control" over people? Because people left to their own devices can be extremely uncaring about their own family, others, even themselves.
Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China), telling people information that may hurt or kill themselves, their family or others.
Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work... yeah.
We only need to look at platforms like X, where "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a soak pit extremely fast (well, a bigger soak pit).
Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. A person who never had an issue with vaccinations, who never had a bad reaction beyond a sore arm for a day, turned skeptical of everything vaccination-related. All because those anti-vax channels got to her.
The anti-vax movement killed people. There is scientific study upon study showing how red states in the US ended up with higher death rates over the same periods. And yet not a single person was ever charged for this; everyone simply accepted it and never looked back. As if it were natural that people's grandparents and family members died who did not need to die.
People have given up, and now accept letting those who often have financial interests spew nonsense as much as they like. Well, it's "normal".
I weep for the human race because we are not going to make it.
I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?
Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?
What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?
Time delay. No content-based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change their content (in which case the timer resets).
I’d also argue for demonetising political content, but idk if that would fly.
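The cooling-off delay proposed above is simple enough to sketch. This is a toy illustration, not any platform's actual mechanism; the `Post` class, the 2-hour window, and all names are hypothetical:

```python
import time
from dataclasses import dataclass, field

DELAY_SECONDS = 2 * 60 * 60  # hypothetical 2-hour cooling-off window


@dataclass
class Post:
    body: str
    submitted_at: float = field(default_factory=time.time)

    def edit(self, new_body: str) -> None:
        # Per the proposal, editing resets the visibility timer.
        self.body = new_body
        self.submitted_at = time.time()

    def is_visible(self, now=None) -> bool:
        # A post surfaces only after the delay has fully elapsed.
        now = time.time() if now is None else now
        return now - self.submitted_at >= DELAY_SECONDS


p = Post("hot take")
print(p.is_visible())  # hidden right after submission
print(p.is_visible(now=p.submitted_at + DELAY_SECONDS + 1))  # visible later
```

The appeal of this design is that it is content-neutral: nothing needs to classify speech, it only slows the outrage feedback loop down.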
Who decides what falls in this bucket? The government? That seems hard to square with the idea that government shouldn't restrict speech and ideas.
Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)
> For all content or just “political”?
The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.
I’d borrow from the French. All content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected by name, but then we’ll just get meme names and nobody needs that.)
Bonus: electeds get constituent pressure to consolidate elections.
Alternative: these platforms already track trending topics, so an easy fix is to slow trending topics down. It doesn’t even need to be by much; what we want is for people to stop and think, have a chance to reflect on what they do, maybe take a step away from their device while they’re at it.
This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.
"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".
Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.
It's taking a sword to the surgery room where no scalpel has been invented yet.
We need better tools to combat dis/mis-information.
I wish I knew what that tool was.
Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?
But the government should stay out of it.
Many evil people weren't Nazis; some Nazis weren't necessarily evil. Evil is not part of the definition of Nazism. Promoting authoritarianism, exclusionary nationalism, institutional racism, autarky, anti-liberalism and anti-socialism are the hallmarks of Nazism. Anyone who holds the beliefs of the Nazis is a Nazi, regardless of what level of success they have to date achieved in carrying out their aims.
Only because what they did in 1943 surpassed anything imaginable. In 1933 the Nazi party immediately banned all other political parties, arrested thousands of political opponents, and began forcing sterilization and abortion on anyone with hereditary illnesses. Evil is absolutely an identifying part of Nazism. The idea that Nazis are just anti-liberals is exactly why we cannot go around calling everyone we don't like Nazis. The Nazis were not some niche alt-right organization.
If you genuinely think there are Nazis controlling youtube or the government, and all you're doing is complaining about it on hackernews, you're just as complicit as you're claiming those people were.
Nazi has a lot more connotations than genocide. I'm not sure it is worth nitpicking over. Even if you tone it down to Fascist or Authoritarian there will be push back.
If 'silencing people' doesn't work, does that mean online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:
Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children? How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?
Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?
In the open, it becomes normalized and draws in more people. Would you rather have some crazies in the corner, or 50% of a population that believes something false because it became normalized?
The only people benefiting from those dark concepts are those with financial interests. They make money from it, and push the negatives to sell their products and cures. Those who fight against it do not gain from it, and it costs them time and money. That is why it is a losing battle.
In this case it wasn't a purely private decision.
How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?
And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.
Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.
1) They are public corporations, legal creations of the state, and benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.
2) By acting at the behest of the government, they were agents of the government for free-speech and censorship purposes.
3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.
This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.
For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.
The Twitter files showed direct communications from the administration asking them to ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...
Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...
It would be more surprising if they left Google alone.
Although if they got banned at the start of covid, during the Trump administration, then we're talking about 5 years.
No, it was not. It’s particularly silly to suggest this when we have live examples of such orders right now.
The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a “bully pulpit.” But there were no orders, no credible threats, and plenty of companies didn’t deplatform these folks.
I’m not disputing that they coördinated. I’m challenging that they were coerced.
We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends as “the government ordering private companies around.” (Or, say, Florida opening its criminal-justice records to ICE as the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.
It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.
The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.
https://blog.youtube/news-and-events/managing-harmful-vaccin...
From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.
The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.
> claims that vaccines do not reduce transmission or contraction of disease
Isn't that true of the covid vaccines? Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely, but later on, they changed the goal posts to "it will reduce your symptoms of covid".
That's not what happened. Authorities received rare reports of a clotting disorder and paused it for 11 days to investigate. That pause was lifted but the panic caused a crash in demand and J&J withdrew it from the market. Source: https://arstechnica.com/health/2023/06/j-fda-revokes-authori...
https://www.fda.gov/media/146304/download
I am not and my source covers this.
Right here on what should be a technical minded forum, people don’t understand what science is or how it works. Or what risk is. And they don’t even challenge their own beliefs or are curious about how things actually work.
If the “smart” people can’t or won’t continuously incorporate new information, what are our chances?
Some people don’t understand how vaccines work, so may have claimed that, but efficacy rates were very clearly communicated. Anyone who listened in high school biology should know that’s not how they work.
1) YouTube doesn't know what is true. They would be relying on the sort of people they ban to work out when the consensus is wrong. If I watched a YouTube video of someone spreading "vaccine misinformation", there is a pretty good chance that the speakers have relevant PhDs or are from the medical profession; there is no way the YouTube censors are more qualified than that, and the odds are they're just random unqualified employees already working in the euphemistically named "Trust & Safety" team.
2) All vaccines are dangerous and can cause chronic health effects. That statement isn't controversial, the controversy is entirely over the magnitude. Every time I get a vaccine the standard advice is "you should probably hang around here for 5 minutes, these things are known to be dangerous in rare cases". I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.
> This would include content that falsely says that approved vaccines cause ... cancer ...
Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.
3) All policies have costs and benefits. People have to be able to discuss the overall cost-benefit of a policy in YouTube videos even if they get one of the costs or benefits completely wrong.
I'm reminded of the Prop 65 signs everywhere in California warning "this might cause cancer"
Any subject important enough in any public forum is potentially going to have wrong opinions that are going to cause harm. While some people could be wrong, and could cause harm, the state itself being wrong is far more dangerous, especially with no dissident voices there to correct its course.
Edit: I see you're getting downvoted for simply stating your honest opinion. But as a matter of principle I'm going to upvote you.
[1] - https://www.amazon.com/Pfizer-Papers-Pfizers-Against-Humanit...
Allowing the debate to be shut down is undemocratic and unscientific (science without question is nothing more than religion).
Not allowing people to come to different conclusions from the same data is tyranny.
If somebody says it, they not only don't care about free speech, they don't even care about having a good faith conversation about free speech. They've probably been told this before, and didn't bother to look it up, just repeated it again. Wasting good people's time.
edit: here's a copy of fire in a crowded theater, https://postimg.cc/gallery/q4PJnPh
https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))
Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.
What I'm not comfortable with is preventing a private company from moderating their product.
The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.
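One hedged sketch of what "fixing the algorithms" could mean: a toy re-ranker that discounts candidate videos from topics already saturating a user's watch history, so a feed can't collapse into a single echo chamber. The function name, topic labels, scores, and the 0.5 penalty are all hypothetical, not any platform's real system:

```python
from collections import Counter


def rerank_with_diversity(candidates, history_topics, penalty=0.5):
    """Re-rank (score, topic, video_id) tuples, discounting topics that
    already dominate the user's recent history. A toy sketch only."""
    seen = Counter(history_topics)
    total = max(sum(seen.values()), 1)

    def adjusted(item):
        score, topic, _ = item
        # The larger a topic's share of the history, the bigger the discount.
        return score * (1 - penalty * seen[topic] / total)

    return sorted(candidates, key=adjusted, reverse=True)


candidates = [
    (0.90, "conspiracy", "vid1"),
    (0.85, "conspiracy", "vid2"),
    (0.80, "science", "vid3"),
]
history = ["conspiracy"] * 8 + ["science"] * 2
for score, topic, vid in rerank_with_diversity(candidates, history):
    print(vid, topic)
```

With this history, the science video surfaces first despite its lower raw engagement score, which is the whole point: the ranking objective changes from "more of the same" to "more of the same, minus a saturation tax."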
These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm, is a sign of how far off course this entire discourse has moved.
First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.
Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.
I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.
For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.
My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.
So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very significant minority.
This is why perma bans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright system is anything to go by, this is going to hurt more than help.
The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist, though, because you're allowed to sue for libel and slander. We know those are harmful, because people will believe lies about a person, damaging their reputation. It's not clear why this can't be generalized to things we have high confidence are true and where lying is actively harmful.
I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.
We have mechanisms for combatting the government through lawsuits. If the government comes out with lies that actively harm people, I hope lawsuits come through, or you know... people organize and vote for people who represent their interests.
..but i'm not a yter.
Actual letter: https://judiciary.house.gov/sites/evo-subsites/republicans-j...
Good editorial: https://www.businessinsider.com/google-meta-congress-letter-...
Yes, I know about the Charlie Kirk firings etc.
- https://www.engadget.com/big-tech/youtube-may-reinstate-chan...
- https://arstechnica.com/gadgets/2025/09/youtube-will-restore...
First of all, you can't separate a thing's content from the platform it's hosted on? Really?
Second of all, this is why
I'll just go do this again and if you flag me it's on you, you have no standing to do it (the internet is supposed to be democratic, remember?)
https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...
https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...
https://rumble.com/vt62y6-covid-19-a-second-opinion.html
https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...
I could go on. Feel free if you want to see more. :)
(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)
On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.
(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)
https://news.ycombinator.com/newsguidelines.html
Future tense?
The next Drain-o chug challenge "accident" is inevitable, at this rate.
Yes, those are Kimmel words, and when he said them, everyone already knew the facts about his gay lifestyle and trans boyfriend and that he murdered Kirk out of hate. It’s nice to see that you picked up his torch regardless of how little rational sense it makes.
>Somebody using violence for political means is exactly aligned with Charlie Kirk's spoken words.
Cite it. In whole context, though video is preferable. The guy had a decade of being in front of the camera; post any of the hateful videos, it should be easy! Show us the hate, if not the justification for his own murder of course.
IMO we should not let private platforms become near monopolies, and certainly not without regulation, since they become a defacto public square. But if we're going to let them eat the world, then hopefully they'll at least use good judgment and measures like de-ranking or even banning folks who encourage others to do harm. Making bans temporary is a safety valve in case of bad moderation.
You can leave it up to companies, but what happens when Trump allies like Elon Musk and Larry Ellison buy up major platforms like Twitter and TikTok?
Do we really trust those guys with that much power?
1: https://www.fortnite.com/news/fortnite-anti-cheat-update-feb...
Somewhat related, it's pretty insane how even to this day YouTubers have to avoid referring by name to a global multi-year situation that everyone alive at the time went through. It's due to advertisers rather than government pressure, but still: insane.
Even if people were straight up wrong about their COVID-19 theories, I don't think many of the banned people were trying to get viewers to send them money.
They were trying to get viewers, to get money. It's an important distinction.
All they have to do is lie to attract eyeballs and they make money. E-begging isn't necessary, the platforms allow you to extract value from viewers at an incredible scale.
I know that some services do this in addition to account ban.
Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.
We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.