26 comments

  • diego_sandoval 43 minutes ago
    At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.

    [1] https://www.bbc.com/news/technology-52388586

    • danparsonson 15 minutes ago
      > the WHO contradicted itself many times during the pandemic

      Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.

    • sterlind 8 minutes ago
      it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.

      misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.

      IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.

      • adiabatichottub 1 minute ago
        As I recall from my school days, in Social Studies class, there were a set of Critical Thinking questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.
    • hyperhopper 26 minutes ago
The United States also said not to buy masks and that they were ineffective during the pandemic.

      Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance

  • system7rocks 17 minutes ago
    We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.

The silver lining in this is that the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in a crisis - and needing to pressure or ask more of private companies to do that. But I also like that we can reflect back and go - maybe that didn’t work like we wanted, or maybe it was heavy-handed.

    In many governments, the government can do no wrong. There are no checks and balances.

    The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.

    But hopefully we will still have a system that can have room for critique in the years to come.

  • softwaredoug 2 hours ago
    I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
    • asadotzler 11 minutes ago
      My refusing to distribute your work is not "silencing." Silencing would be me preventing you from distributing it.

      Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.

      • Jensson 1 minute ago
        > No one owes you distribution unless you have a contract saying otherwise.

Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations.

      • sterlind 6 minutes ago
        I'd certainly consider an ISP refusing to route my packets as silencing. is YouTube so different? legally, sure, but practically?
      • timmg 1 minute ago
        It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.

        Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.

        I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."

      • justinhj 3 minutes ago
        So you're saying that YouTube is a publisher and should not have section 230 protections? They can't have it both ways. Sure remove content that violates policies but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.
      • hn_throw_250915 5 minutes ago
        [dead]
    • andy99 1 hour ago
      The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning tylenol-autism sceptical accounts?
      • mapontosevenths 1 hour ago
        > the government and/or a big tech company shouldn't decide what people are "allowed" to say.

        That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

        Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.

        > What if they started banning tylenol-autism sceptical accounts?

        What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.

        • MostlyStable 1 hour ago
          It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The first amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the first amendment is not eventually eroded away to nothing.
          • plantwallshoe 24 minutes ago
            Isn’t promoting/removing opinions you care about a form of speech?

            If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.

            If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.

            If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?

            • lmz 17 minutes ago
              Agreed. If I have a TV network and think these anti-government hosts on my network are bad for business, that is also freedom of speech.
              • rubyfan 5 minutes ago
                Maybe. If it is independent of government coercion.
          • lkey 27 minutes ago
            Or it might be the case that that 'culture' is eroding the thing it claims to be protecting. https://www.popehat.com/p/how-free-speech-culture-is-killing...
          • SantalBlush 4 minutes ago
            Are you in favor of HN allowing advertisements, shilling, or spam in these threads? Because those things are free speech. Would you like to allow comments about generic ED pills?

            I simply don't believe people who say they want to support a culture of free speech on a media or social media site. They haven't really thought about what that means.

          • asadotzler 13 minutes ago
            Will you criticize my book publishing company for not publishing and distributing your smut short story?
            • user34283 4 minutes ago
              No, but I will criticize Apple and Google for banning smut apps.

              If those two private companies would host all legal content, this could be a thriving market.

              Somehow big tech and payment processors get to censor most software.

        • briHass 29 minutes ago
          The line should be what is illegal, which, at least in the US, is fairly permissive.

          The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.

        • mc32 32 minutes ago
          The thing is that people will tell you it wasn’t actually censorship, because for them it was only a nosey, busybody government telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.
      • JumpCrisscross 47 minutes ago
        > the government and/or a big tech company shouldn't decide what people are "allowed" to say

        This throws out spam and fraud filters, both of which are content-based moderation.

        "Nobody moderates anything" unfortunately isn’t a functional option. Particularly if the company has to sell ads.

      • asadotzler 14 minutes ago
        No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.

        As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.

        • mitthrowaway2 12 minutes ago
          No, they ban your account and exclude you from the market commons if they don't like what you say.
      • mulmen 5 minutes ago
        [delayed]
      • heavyset_go 23 minutes ago
        This is just a reminder that we're both posting on one of the most heavily censored, big tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.

        What you are arguing for is a dissolution of HN and sites like it.

      • zetazzed 29 minutes ago
        Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?

        The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.

    • sazylusan 18 minutes ago
      Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.
      • hn_throwaway_99 10 minutes ago
        Glad to see this, was going to make a similar comment.

        People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".

      • sazylusan 14 minutes ago
        Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.

        The first amendment was written in the 1700s...

    • Aloha 1 hour ago
      I think it made sense as a tactical choice at the moment, just like censorship during wartime - I don't think it should go on forever, because doing so is incompatible with a free society.
      • llm_nerd 27 minutes ago
        It didn't even make sense at the time. It tainted everything under a cloud that the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.

        It massively amplified the nuts. It brought it to the mainstream.

        I'm a bit amazed seeing people still justifying it after all we've learned.

        COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

        And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.

        But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew it had a negligible effect on spread. When platforms of "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.

        And now we're living in the consequences. Where we have a worm-addled halfwit directed medicine for his child-rapist pal.

        • LeafItAlone 13 minutes ago
          >It massively amplified the nuts. It brought it to the mainstream.

          >COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

          In theory, I agree, kind of.

          But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration. The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed. Most people I know who ultimately refused the vaccines made up their minds before Biden took office.

      • ioteg 58 minutes ago
        [dead]
    • yojo 25 minutes ago
      I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.

      My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.

      I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.

    • braiamp 18 minutes ago
      > But I think we have to realize silencing people doesn't work

      It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.

      - https://www.nature.com/articles/s41586-024-07524-8
      - https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1...
      - https://dl.acm.org/doi/abs/10.1145/3479525
      - https://arxiv.org/pdf/2212.11864

      Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.

    • ants_everywhere 16 minutes ago
      These policies were put in place because the anti-vax and election skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.

      The US military also promoted anti-vax propaganda in the Philippines [0].

      A lot of the comments here raise good points about silencing well meaning people expressing their opinion.

      But information warfare is a fundamental part of modern warfare. And it's effective.

      An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.

      So

      > I think we have to realize silencing people doesn't work

      it seems to have been reasonably effective at combating disinformation networks

      > It just causes the ideas to metastasize

      I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.

      [0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...

    • lkey 44 minutes ago
      I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine. It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content. As an example, for 'curing' autism, the new grift is reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.
      • lkey 40 minutes ago
        To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.
      • mvdtnz 42 minutes ago
        And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.
        • lkey 32 minutes ago
          These people will claim they were 'silenced' regardless. Even as they appear with their published bestseller about being silenced on every podcast and news broadcast under the sun, they will speak of the 'conspiracy' working against them at every step. The actual facts at hand almost never matter. Even at a press conference where the President is speaking on your behalf they'll speak of the 'groups' that are 'against' them, full of nefarious purpose. There is no magical set of actions that changes the incentive they have to lie, or believe lies. (except regulation of snake oil, which is not going to happen any time soon)
          • mvdtnz 27 minutes ago
            And most people roll their eyes and don't believe it. Which is why it's a good idea not to make it true.
            • lkey 11 minutes ago
              Conspiratorial thinkers are more likely to believe that Osama Bin Laden was already dead and is still alive rather than the official narrative that he was killed on the day reported. https://www.researchgate.net/publication/235449075_Dead_and_...

              In general, you can't argue or 'fact' people out of beliefs they were not argued into. The best you can do is give them a safe place to land when disconfirmation begins. Don't be too judgy, no one is immune to propaganda.

    • kypro 1 hour ago
      I agree. People today are far more anti-vaccine than they were a few years ago, which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the rollout of effective vaccines.

      I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

      • dotnet00 1 hour ago
        I think the anti-vax thing is mostly because the average Western education level is just abysmal.

        Add in a healthy dose of subconsciously racist beliefs about how advanced Western society is (plus ideas of how this means they must be smart too) and how catching diseases preventable by vaccines is only a brown people thing.

        Basically, it's easy to be anti-vax when the disease isn't in your face and you have an out-group to blame even if it does end up in your face (a common excuse by anti-vaxxers I see when measles is in the news is that the immigrants are bringing it in and should be blamed instead of anti-vaxxers)

        • mrcwinn 32 minutes ago
          If that were the case, wouldn’t we see vaccine skepticism in poorly educated, racist non-Western nations?
          • braiamp 16 minutes ago
            You don't see that, because it's in their faces. Or more accurately, in our faces. I live in such a country, and we would kill to have our kids vaccinated. We live with these diseases, so we aren't stupid enough to fall for misinformation.
        • logicchains 1 hour ago
          The anti-vax thing is because every single comparative study of vaccinated and unvaccinated children found a greater rate of developmental disorders in vaccinated children. They're also the only products for which you're not allowed to sue the manufacturers for liability, and the justification given by the manufacturers for requesting this liability protection was literally that they'd be sued out of business otherwise. If they were as safe as other treatments they wouldn't need a blanket liability immunity.

          Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

          Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.

          Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.

          Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.

          James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.

          James Lyons-Weiler, "Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them," International Journal of Vaccine Theory, Practice and Research 2, no. 2 (2022): 603-618, doi:10.56098/ijvtpr.v2i2.59.

          NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.

          Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.

          Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2, (2022): 670-686, doi: 10.56098/ijvtpr.v2i2.40.

          Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.

          • jawarner 57 minutes ago
            Mawson et al. 2017 (two papers) – internet survey of homeschoolers recruited from anti-vaccine groups; non-random, self-reported, unverified health outcomes. Retracted by the publisher after criticism.

            Hooker & Miller 2020/2021 – analysis of “control group” data also from self-selected surveys; same methodological problems.

            Lyons-Weiler & Thomas 2020, 2022 – data from a single pediatric practice run by one of the authors; serious selection bias.

            Joy Garner / NVKP surveys – activist-run online surveys with no verification.

            Enriquez et al. 2005 – a small cross-sectional study about allergy self-reports, not about overall neurodevelopment.

            Large, well-controlled population studies (Denmark, Finland, the U.S. Vaccine Safety Datalink, etc.) comparing vaccinated vs. unvaccinated children show no increase in autism, neurodevelopmental disorders, or overall morbidity attributable to recommended vaccines.

          • MSM 56 minutes ago
            I picked one at random (NVKP, "Diseases and Vaccines: NVKP Survey Results") and, while I needed to translate it to read it, it's clear (and loud!) about not actually being a scientific study.

            "We fully realize that a survey like this, even on purely scientific grounds, is flawed on all counts. The sample of children studied is far too small and unrepresentative, we didn't use control groups, and so on."

            Turns out the NVKP roughly translates to "Dutch Organization for those critical towards vaccines."

            I understand being skeptical about vaccines, but the skepticism needs to go both ways

          • lkey 53 minutes ago
            "If they were as safe as other treatments they wouldn't need a blanket liability immunity." Citation very much needed for this inference.

            Even if I granted every single paper's premise here. I'd still much rather have a living child with a slightly higher chance of allergies or asthma or <insert survivable condition here> than a dead child. How quickly we forget how bad things once were. Do you dispute that vaccines also accounted for 40% of the decline in infant mortality over the last 50 years? And before that, TB, Flu, and Smallpox killed uncountably many people. Vaccines are a public good and one of the best things we've ever created as a species.

            Do you also have theories about autism you'd like to share with the class?

            • TimorousBestie 43 minutes ago
              A very good point. These studies should be comparing QALYs (quality-adjusted life years, a measure of disease burden) instead of relative prevalence of a handful of negative outcomes, the latter of which is much more vulnerable to p-hacking.
          • conception 36 minutes ago
            Here’s where the “bad ideas out in the open get corrected” theory is now tested. There are four really good refutations of your evidence here, aside from the unspoken argument: perhaps vaccines cause some measurable bad outcomes, but compare them to measles - and without herd immunity, vaccinations aren’t nearly as useful.

            So the important question is: are you now going to say “well, I guess I got some bad data and I have to go back and review my beliefs,” or dig in?

          • barbazoo 12 minutes ago
            > If they were as safe as other treatments they wouldn't need a blanket liability immunity.

            Other treatments aren’t applied preventatively to the entire population which is why the risk presumably is lower.

          • tnias23 29 minutes ago
            The studies you cite are the typical ones circulated by antivaxers and are not considered credible by the medical community due to severe methodological flaws, undisclosed biases, retractions, etc.

            To the contrary, high quality studies consistently show that vaccines are not linked to developmental disability or worse health outcomes.

          • TimorousBestie 54 minutes ago
            > Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

            Retracted: https://retractionwatch.com/2017/05/08/retracted-vaccine-aut...

            If you edit down your list to journal articles that you know to be valid and unretracted, I will reconsider looking through it. However, journal access in general is too expensive for me to bother reading retracted articles.

        • xdennis 1 hour ago
          > I think the anti-vax thing is mostly because the average Western education level is just abysmal.

          What does the West have to do with it? Non-westerners are even more into folk medicine and witch doctors.

          • dotnet00 59 minutes ago
            They're into folk medicine, but their anti-vax issues generally come from people who don't have any means of knowing better (i.e. never been to school, dropped out at a very early grade, isolated, not even literate). Typically just education and having a doctor or a local elder respectfully explain to them that the Polio shot will help prevent their child from being paralyzed for life is enough to convince them.

            Meanwhile the 'educated' Westerner, to whom Polio is a third-world disease, will convince themselves that the doctor is lying for some reason, will choose to take the 75% chance of an asymptomatic infection because they don't truly appreciate how bad it can otherwise be, will use their access to a vast collection of humanity's information to cherry pick data that supports their position (most likely while also claiming to seek debate despite not intending to seriously consider opposing evidence), and if their gamble fails, will probably just blame immigrants, government or 'big pharma' for doing it.

          • andrewmcwatters 1 hour ago
            And yet, SEA and others are still better educated than us.
            • LeafItAlone 40 minutes ago
              >SEA and others are still better educated than us.

              Honest question: is this true? What’s the data around this? If it is true, why are there so many people from SEA in American universities? Wouldn’t they stay in their home country or another in the area?

              I’m truly trying to learn here and square this statement with what I’ve come to understand so far.

        • kypro 1 hour ago
          Anti-vax has never really been a thing though. I don't know what the data is these days, but it used to be like 1% of the population who were anti-vax.

          We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.

          And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.

          No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe that peas do cause cancer, then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on funded by big agriculture? You can see how this type of thinking happens.

          • dotnet00 39 minutes ago
            Anti-vax was enough of an issue that vaccine mandates were necessary for Covid.

            It also isn't convincing to claim that racism isn't as big a problem in the West given all the discourse around H1Bs, Indians (the Trump base has been pretty open on this one, with comments on JD Vance's wife, the flood of anti-Indian racism on social media, and recently the joy taken in attempting to interfere with Indians forced to fly back to the US in a hurry due to the lack of clarity on the H1B thing), how ICE is identifying illegals, a senator openly questioning the citizenship of a brown mayoral candidate, and so on.

            I agree that denying something is the easiest way to convince people of the opposite, but it's also understandable when social media companies decide to censor advice from well known individuals that people should do potentially harmful things like consume horse dewormer to deal with Covid. Basically, it's complicated, though I would prefer to lean towards not censoring such opinions.

      • trollbridge 1 hour ago
        And the attempts at censorship have played a part in people drifting towards being more vaccine-hesitant or anti-vaccine.

        It's often a lot better to just let kooks speak freely.

        • vFunct 54 minutes ago
          It's less about censorship and more about more people becoming middle-class and therefore thinking they're smarter than researchers.

          There is nobody more confident in themselves than the middle-class.

          • khazhoux 43 minutes ago
            That’s a very confident statement presented without a hint of evidence.
      • logicchains 1 hour ago
        >where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.

        The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .

        • rpiguy 1 hour ago
          I appreciate you.

          People have become more anti-Vax because the Covid vaccines were at best ineffective and as you said anything contra-narrative is buried or ignored.

          If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.

          More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.

        • cynicalkane 21 minutes ago
          This is typical of Covid conspiracy theorists, or conspiracy theorists of any sort: one or two papers on one side prove something, but an overwhelming mountain of evidence on the other side does not prove something. The theorist offers no explanation as to how a planetful of scientists missed the obvious truth that some random dudes found; they just assert that it happened, or make some hand-waving explanation about how an inexplicable planet-wide force of censors is silencing the few unremarkable randos who somehow have the truth.

          The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them.

          The typical example of sampling time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be measured by a screening, giving a correlation between screening and survivability. So you get a time effect where more fast-acting cancers do not end up in the measurement, biasing the data. But in measurements such that one outcome or the other does not bias the odds of that outcome being sampled, there can be no measurement time effect, which is why it's a pretty uncommon thing to correct for. The authors do not explain why measurement time effects would have anything to do with detecting or not detecting death rates in the abstract, or anywhere else in the paper, and why such an unconventional adjustment is necessary, because they are quacks seeking a preferred outcome. Nor do they explain why measurement methods immune to any possible such effect consistently yield the result that vaccines work.

          I did not read the second paper.

      • vkou 1 hour ago
        > but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

        Nah, the same grifters who stand to make a political profit from turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects that go well beyond COVID vaccines.

        As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much - or more - than someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)

        • kypro 1 hour ago
          I agree. Again the vast majority would have gotten the vaccine.

          There's always going to be people for all kinds of reasons pushing out bad ideas. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.

          > They've completely taken over public discourse on a wide range of subjects

          Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point you are now holding a minority opinion you should consider whether "they" are right or wrong and why so many people believe what they do.

          If you disagree but can't understand their position, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this we'll converge towards truth. If you think talking and debate isn't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).

      • stefantalpalaru 1 hour ago
        > one of the only things that actually worked to stop people dying was the roll out of effective vaccines

        "A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)

        "the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)

        "The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)

        "Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)

    • dawnerd 20 minutes ago
      It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if some people didn’t try to shut them down so much.
    • deegles 44 minutes ago
      no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"
      • NullCascade 29 minutes ago
        Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.

        Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.

        As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.

    • krapp 4 minutes ago
      >A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

      Except many people don't roll their eyes at it; that's exactly the problem. QAnon went from a meme on 4chan to the dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. And all of this in an environment of aggressive skepticism, arguing, debating and debunking. Sunlight is not disinfecting anything.

      We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"

      Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.

    • tonfreed 1 hour ago
      The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other
      • LeafItAlone 46 minutes ago
        >The best disinfectant is sunlight.

        Is it? How does that work at scale?

        Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).

        Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their “evidence” and truly believe they are right, no matter what the other person says.

        • TeeMassive 39 minutes ago
          What's your alternative? The opposite is state dictated censorship and secrecy and those have turned very wrong every single time.
          • LeafItAlone 32 minutes ago
            I honestly don’t know. My libertarian foundation wants me to believe that any and all ideas should be able to be spread. But with the technological and societal changes in the past 10-15 years, we’ve seen how much of a danger this can be too. A lie or mistrust can be spread faster than ever to a wider audience than previously ever possible. I don’t have a solution, but what we have now is clearly not working.
      • slater- 1 hour ago
        >> The best disinfectant is sunlight.

        Trump thought so too.

      • thrance 1 hour ago
        How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.

        Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.

        • andrewmcwatters 1 hour ago
          Well, people literally died. So, I think we all know how it played out.

          The same thing since time eternal will continue to occur: the educated and able will physically move themselves from risk and others will suffer either by their own volition, or by association, or by lot.

    • thrance 1 hour ago
      Sure, let the right-wing propaganda machine churn lies and misinformation full-blast, maybe people will magically come to their senses and realize that, no, vaccines and paracetamol don't cause autism, or that the 2020 election wasn't stolen.

      Look at twitter before and after Musk, and tell me again that deplatforming doesn't work.

    • heavyset_go 1 hour ago
      When the pogroms[1] start, it will be a luxury to let it ride out so you can roll your eyes at it.

      There's a reason you don't fan the flames of disinformation. Groups of people cannot be reasoned with like you can reason with an individual.

      [1] https://systemicjustice.org/article/facebook-and-genocide-ho...

    • vkou 1 hour ago
      > But I think we have to realize silencing people doesn't work.

      We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.

      For some reason, that didn't work either.

      What is going to work? And what is your plan for getting us to that point?

      • _spduchamp 17 minutes ago
        Algorithmic Accountability.

        People can post all sorts of crazy stuff, but the algorithms do not need to promote it.

        Countries can require Algorithmic Impact Assessments and set standards of compliance to recommended guidelines.

    • benjiro 41 minutes ago
      Funny thing: several people who counter-responded and disagreed got grayed out (aka negatively downvoted ... as in censored).

      Reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.

      The moment you are in the YouTube, TikTok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response: "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news, and over time people believe it.

      That is the result of uncensored access, because most people do not have the time to really look up a scientific study. Negative channels massively outnumber positive / fact-based channels because the latter are "boring". It's the same reason why your evening news is 80% deaths, corruption, theft, politicians and taxes, or other negative world news: it has been proven that people take in negative news much more. Clickbait titles that are negative draw people in.

      There is a reason why holocaust denial is illegal in countries. Because the longer some people can spew that, the more people actually start to believe it.

      Yes, I am going to get roasted for this, but people are easily influenced and they are not as smart as they think they are. We have platforms that cater to people's short attention span with barely 1~3 min clips. YouTube videos longer than 30 minutes are horrible for a YouTuber's income, as people simply do not have the attention span, and the result is lost income.

      Why do we have laws like seatbelts, speed limits, and other "controls" over people? Because people, left to their own devices, can be extremely uncaring about their own family, others, even themselves.

      Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill them, their family or others.

      Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work, yeah ...

      We only need to look at platforms like X after "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a soak pit extremely fast (well, a bigger soak pit).

      Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. How a person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned so skeptical of everything vaccination-related. All because those anti-vax channels got to her.

      The anti-vax movement killed people. There is scientific study upon study showing how red states in the US ended up with more deaths over the same time periods. And yet not a single person was ever charged for this ... everyone simply accepted it and never looked back. Like it was a natural thing that people's grandparents and family members died who did not need to die.

      People have given up, and now accept letting those with (often financial) interests spew nonsense as much as they like. Well, it's "normal".

      I weep for the human race because we are not going to make it.

    • breadwinner 1 hour ago
      > silencing people doesn't work

      I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?

      Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?

      • JumpCrisscross 1 hour ago
        Slow down our algorithmic hell hole. Particularly around elections.
        • LeafItAlone 50 minutes ago
          >Slow down our algorithmic hell hole.

          What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

          • JumpCrisscross 48 minutes ago
            > What are your suggestions on accomplishing this while also bent compatible with the idea that government and big tech should not control ideas and speech?

            Time delay. No content based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change (in this case, the timer resets) their content.

            I’d also argue for demonetising political content, but idk if that would fly.

            • LeafItAlone 36 minutes ago
              Ok, but how does that get implemented? Not technically, but who makes it happen and enforces the rules? For all content or just “political”? Who decides what’s “political”? Information about the disease behind a worldwide pandemic isn’t inherently “political”, but somehow it became so.

              Who decides what falls in this bucket? The government? That seems to go against the idea that speech and ideas shouldn't be restricted.

              • JumpCrisscross 34 minutes ago
                > who makes it happen and enforces the rules?

                Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)

                > For all content or just “political”?

                The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.

                I’d borrow from the French. All content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected by name, but then we’ll just get meme names and nobody needs that.)

                Bonus: electeds get constituent pressure to consolidate elections.

                Alternative: these platforms already track trending topics. So an easy fix is to slow down trending topics. It doesn’t even need to be by that much, what we want is for people to stop and think and have a chance to reflect on what they do, maybe take a step away from their device while at it.

        • breadwinner 55 minutes ago
          If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

          This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.

          "We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".

          • JumpCrisscross 53 minutes ago
            > If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

            Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.

      • altruios 42 minutes ago
        Censorship is a tool to combat misinformation.

        It's taking a sword to the surgery room where no scalpel has been invented yet.

        We need better tools to combat dis/mis-information.

        I wish I knew what that tool was.

        Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?

      • TeeMassive 47 minutes ago
        Have you heard about TikTok? And you think governments' intelligence agencies are not inserting their agents in key positions at big tech companies?
    • felixgallo 1 hour ago
      [flagged]
      • putzdown 1 hour ago
        No. This perspective is wrong in both directions: (1) it is bad medicine, and (2) the medicine doesn't treat the disease. If we could successfully ban bad ideas (assuming that "we" could agree on what they are) then perhaps we should. If the damage incurred by the banning of ideas were sufficiently small, perhaps we should. But both of these are false. Banning does not work. And it brings harm. Note that the keepers of "correct speech" doing the banning today (eg in Biden's day) can quickly become the ones being banned another day (eg Trump's).

        It's true that drowning the truth through volume is a severe problem, especially in a populace that doesn't care to seek out truth, to find needles in haystacks. But again, banning doesn't resolve this problem. The real solution is to develop a populace that cares about, seeks out, and with some skill identifies the truth. That may not be an achievable solution, and in the best case it's not going to happen quickly. But it is the only solution. All of the supply-based solutions (controlling speech itself, rather than training good listeners) run afoul of this same problem: you cannot really limit the supply, and to the extent you can, so can your opponents.
        • paulryanrogers 1 hour ago
          What do you think about measures that stop short of banning? Like down-ranking, demonetizing, or even 'hellbanning' that just isolates cohorts that consistently violate rules?
          • rahidz 1 hour ago
            Not OP, but my opinion is that if a platform wants to do so, then I have zero issues with that, unless they hold a vast majority of market share for a certain medium and have no major competition.

            But the government should stay out of it.

      • unclad5968 1 hour ago
        Can we stop with the Nazi stuff? I don't know if they stopped teaching history, but there is nothing happening in the US that is within an order of magnitude of the evil the Nazis perpetrated. Being anti-vax is not comparable to genocide.
        • jjk166 1 hour ago
          The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.

          Many evil people weren't Nazis; some Nazis weren't necessarily evil. Evil is not part of the definition of Nazism. Promoting authoritarianism, exclusionary nationalism, institutional racism, autarky, anti-liberalism and anti-socialism are the hallmarks of Nazism. Anyone who holds the beliefs of the Nazis is a Nazi, regardless of what level of success they have to date achieved in carrying out their aims.

          • unclad5968 47 minutes ago
            > The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.

            Only because what they did in 1943 surpassed anything imaginable. In 1933 the Nazi party immediately banned all political parties, arrested thousands of political opponents, started forcing sterilization of anyone with hereditary illnesses, and forced abortions of anyone with hereditary illness. Evil is absolutely an identifying part of Nazis. The idea that Nazis are just anti-liberals is exactly why we cannot go around calling everyone we don't like Nazis. The Nazis were not some niche alt-right organization.

            If you genuinely think there are Nazis controlling youtube or the government, and all you're doing is complaining about it on hackernews, you're just as complicit as you're claiming those people were.

            • jjk166 25 minutes ago
              You seem to misunderstand. You are not immune to being a Nazi because you are not evil, being a Nazi makes people evil. Further, we do not call people Nazis because we dislike them, we dislike them because they are Nazis. Most non-Nazis, when accused of being a Nazi, point out how their views differ from the Nazis. The people who argue they can't possibly be Nazis because Nazis are bad, and they are not, typically are.
        • epakai 21 minutes ago
          We read the history, and a lot of it rhymes. Conservatives failed, and exchanged their values for a populist outsider to maintain power (see Franz von Papen). The outsider demeans immigrants and 'sexual deviants'. The outsider champions nationalism. He pardons the people who broke the law to support him. Condemns violence against the party while ignoring the more common violence coming from those aligned with the party. Encourages the language of enemies when discussing political opponents and protestors.

          Nazi has a lot more connotations than genocide. I'm not sure it is worth nitpicking over. Even if you tone it down to Fascist or Authoritarian there will be push back.

        • tehjoker 1 hour ago
          [flagged]
      • dotnet00 1 hour ago
        How can you say that banning Nazis has worked well considering everything so far this year?
        • felixgallo 46 minutes ago
          Europe is sliding, but has done ok so far. Crossing fingers.
        • miltonlost 1 hour ago
          Well it would if we would actually ban Nazis instead of platform them. They haven't been banned. That's the problem.
          • dotnet00 1 hour ago
            You'd have to ban them from society outright without somehow devolving into an authoritarian hellhole in the process (impossible). Trump still primarily posts on a platform specifically created to be a right wing extremist echo chamber.
          • cpursley 1 hour ago
            What is a Nazi?
            • indy 1 hour ago
              For a lot of people it's "anyone who I disagree with".
          • knifemaster 1 hour ago
            [flagged]
      • tehjoker 1 hour ago
        [flagged]
        • indy 1 hour ago
          Perhaps not the wisest comment to make in light of recent events
          • tehjoker 26 minutes ago
            I didn't say violence. Whatever you read into that comment is a projection. I'm not even sure violence is effective, but something more muscular than op-eds is called for. For example, labor organizing and various forms of self-defense organizations, of which there are many kinds, not only militias. For example, anti-ICE organizing which protects vulnerable people from the gestapo.
    • hash872 1 hour ago
      It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

      If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:

      Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children? How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?

      Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?

      • softwaredoug 1 hour ago
        I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.
        • andrewmcwatters 1 hour ago
          Better out in the open with refutations or warnings than in the dark where concepts become physical dangers.
          • benjiro 35 minutes ago
            Refuting does not work... You can throw scientific study upon study, doctor upon doctor, ... negatives run deeper than positives.

            In the open, it becomes normalized, and it draws in more people. Would you rather have some crazies in the corner, or 50% of a population that believes something false because it became normalized?

            The only people benefiting from those dark concepts are those with financial gains. They make money from it, and push the negatives to sell their products and cures. Those who fight against it gain nothing, and it costs them time and money. That is why it is a losing battle.

      • drak0n1c 1 hour ago
        Read the article, along with this one https://reclaimthenet.org/google-admits-biden-white-house-pr...

        In this case it wasn't a purely private decision.

      • rahidz 1 hour ago
        "Where's the limiting principle here?"

        How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?

        And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.

        Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.

      • TeeMassive 40 minutes ago
        > It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

        1) They are public corporations and are legal creation of the state and benefit from certain protections of the state. They also have privileged access to some public infrastructures that other private companies do not have.

        2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.

        3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.

  • cactusplant7374 2 hours ago
    > From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”

    This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.

    • softwaredoug 1 hour ago
      It's in their interests now to throw Biden under the bus. There may be truth to this, but I'm sure its exaggerated for effect.
    • HankStallone 1 hour ago
      It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.

      For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.

      • dotnet00 1 hour ago
        To be fair, even if they were being honest about Biden twisting their arm (I don't buy it), the timing makes it impossible to believe their claim.
        • CSMastermind 22 minutes ago
          Why wouldn't you buy it?

          The Twitter files showed direct communications from the administration asking them to ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...

          Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...

          It would be more surprising if they left Google alone.

          • braiamp 13 minutes ago
            If you read those documents, you will see that the administration was telling them that those accounts were in violation of Twitter's TOS. They said "hey, this user is violating your TOS, what are you gonna do about it?", and Twitter simply applied its rules.
          • dotnet00 9 minutes ago
            The implication of saying they were "pressed" by the Biden admin is that Google was unwilling. I don't buy that. They were complicit and are now throwing the Biden admin under the bus because it is politically convenient. Just like how the Twitter files showed that Twitter was complicit in it.
  • lesuorac 2 hours ago
    2 years is a pretty long ban for conduct that isn't even illegal.

    Although if they got banned during the start of covid during the Trump administration then we're talking about 5 years.

    • asadotzler 10 minutes ago
      No one owes them any distribution at all.
      • zug_zug 0 minutes ago
        Absolutely. Especially when those election deniers become insurrectionists.
    • Simulacra 1 hour ago
      They went against a government narrative. This wasn't Google/Youtube banning so much as government ordering private companies to do so.
      • LeafItAlone 1 hour ago
        And do you think the impetus behind this action happening now is any different? In both cases YouTube is just doing what the government wants.
      • JumpCrisscross 1 hour ago
        > wasn't Google/Youtube banning so much as government ordering private companies to do so

        No, it was not. It's particularly silly to suggest this when we have a live example of such orders right now.

        The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a "bully pulpit." But there were no orders, no credible threats, and plenty of companies didn't deplatform these folks.

        • spullara 55 minutes ago
          They literally had access to JIRA at Twitter so they could file tickets against accounts.
          • JumpCrisscross 51 minutes ago
            > literally had access to JIRA at Twitter so they could file tickets against accounts

            I’m not disputing that they coördinated. I’m challenging that they were coerced.

            We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends the “government ordering private companies” around. (Or, say, Florida opening their criminal justice records to ICE the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.

          • unethical_ban 23 minutes ago
            Do you think no nefarious nation state actors are on social media spinning disinformation?
        • starik36 56 minutes ago
          That was certainly the case with Twitter. It came out during the congressional hearings. FBI had a direct line to the decision makers.
          • JumpCrisscross 54 minutes ago
            > was certainly the case with Twitter

            It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.

            The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.

          • brokencode 54 minutes ago
            A direct line to threaten decision makers? Or to point out possible misinformation spreaders?
      • stronglikedan 1 hour ago
        [flagged]
        • 3cKU 1 hour ago
          [flagged]
    • jackmottatx 1 hour ago
      [dead]
  • reop2whiskey 2 minutes ago
    Is there any political censorship scheme at this large a scale in modern US history?
  • bluedino 51 minutes ago
    I'm banned from posting in a couple subreddits for not aligning with the COVID views of the moderators. Lame.
    • c-hendricks 47 minutes ago
      Whenever someone says "i was banned from ..." take what they say with a huge grain of salt.
      • pinkmuffinere 19 minutes ago
        Everybody here is strangers online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.
      • mvdtnz 31 minutes ago
        Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.
      • alex1138 19 minutes ago
        Stop excusing it. It's a very real, very serious problem with Reddit. They're very much abusive on this and many other topics
  • whycome 2 hours ago
    What exactly constituted a violation of a COVID policy?
    • PaulKeeble 2 hours ago
      A lot of channels had to avoid even saying the word Covid. I only saw it return recently to use at the end of last year. There were a variety of channels banned that shouldn't have been such as some talking about Long Covid.
    • carlosjobim 2 hours ago
      Every opinion different from the opinion of "authorities". They documented it here:

      https://blog.youtube/news-and-events/managing-harmful-vaccin...

      From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.

      • miltonlost 2 hours ago
        [flagged]
        • someuser2345 1 hour ago
          > content that falsely alleges that approved vaccines are dangerous and cause chronic health effects

          The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.

          > claims that vaccines do not reduce transmission or contraction of disease

          Isn't that true of the covid vaccines? Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely, but later on, they changed the goal posts to "it will reduce your symptoms of covid".

          • joecool1029 1 hour ago
            > The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.

            That's not what happened. Authorities received rare reports of a clotting disorder and paused it for 11 days to investigate. That pause was lifted but the panic caused a crash in demand and J&J withdrew it from the market. Source: https://arstechnica.com/health/2023/06/j-fda-revokes-authori...

            • WillPostForFood 57 minutes ago
              It seems like you are implying that the pause was lifted because they found nothing. That's not quite right. J&J vaccine killed 9 people, and the FDA issued restrictions on who could get it, limitations on who should get it, and warnings about the side effects.

              https://www.fda.gov/media/146304/download

              • joecool1029 19 minutes ago
                > It seems like you are implying that the pause was lifted because they found nothing. That's not quite right.

                I am not and my source covers this.

          • 2muchcoffeeman 1 hour ago
            This highlights what’s so difficult with science communication.

            Right here on what should be a technical minded forum, people don’t understand what science is or how it works. Or what risk is. And they don’t even challenge their own beliefs or are curious about how things actually work.

            If the “smart” people can’t or won’t continuously incorporate new information, what are our chances?

          • teamonkey 1 hour ago
            > Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely

            Some people don’t understand how vaccines work, so may have claimed that, but efficacy rates were very clearly communicated. Anyone who listened in high school biology should know that’s not how they work.

        • roenxi 1 hour ago
          That policy catches and bans any scientist studying the negative health effects of vaccines who later turns out to be right.

          1) YouTube doesn't know what is true. They will be relying on the sort of people they would ban to work out when the consensus is wrong. If I watched through a YouTube video of someone spreading "vaccine misinformation", there is a pretty good chance that the speakers have relevant PhDs or are from the medical profession - there is no way the YouTube censors are more qualified than that, and the odds are they're just random unqualified employees already working in the euphemistically named "Trust & Safety" team.

          2) All vaccines are dangerous and can cause chronic health effects. That statement isn't controversial, the controversy is entirely over the magnitude. Every time I get a vaccine the standard advice is "you should probably hang around here for 5 minutes, these things are known to be dangerous in rare cases". I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.

          > This would include content that falsely says that approved vaccines cause ... cancer ...

          Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.

          3) All policies have costs and benefits. People have to be able to discuss the overall cost-benefit of a policy in YouTube videos even if they get one of the costs or benefits completely wrong.

          • handoflixue 1 hour ago
            > Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.

            I'm reminded of the Prop 65 signs everywhere in California warning "this might cause cancer"

        • TeeMassive 52 minutes ago
          > This seems like good banning to me. Anti-vaxxer propaganda isn't forbidden thoughts. It's bad science and lies and killing people.

          Any subject important enough in any public forum is potentially going to have wrong opinions that are going to cause harm. While some people could be wrong, and could cause harm, the state itself being wrong is far more dangerous, especially with no dissident voices there to correct its course.

          Edit: I see you're getting downvoted for simply stating your honest opinion. But as a matter of principle I'm going to upvote you.

        • mapontosevenths 2 hours ago
          [flagged]
          • Bender 48 minutes ago
            Pfizer hid a lot of the damage done as did the others. A lot of people can die by the time books come out. [1] That's one of the many reasons I held off and glad I did.

            [1] - https://www.amazon.com/Pfizer-Papers-Pfizers-Against-Humanit...

          • rpiguy 1 hour ago
            People have the right to believe things that could get them killed and the right to share their beliefs with others.

            Allowing the debate to be shut down is undemocratic and unscientific (science without question is nothing more than religion).

            Not allowing people to come to different conclusions from the same data is tyranny.

          • immibis 1 hour ago
            Shouting "fire" in a crowded theater being illegal was used to make it illegal to oppose the draft (Schenck v. United States). So actually, since opposing the draft is legal, shouting "fire" in a crowded theater is legal too.
            • mapontosevenths 1 hour ago
              You would be charged with inciting a riot, reckless homicide, etc regardless of the actual words you shouted to cause the deaths, but I see your point.
            • trollbridge 1 hour ago
              Yep, and that's what Brandenburg v. Ohio enshrined.
            • jjk166 1 hour ago
              That's quite the legal theory.
            • pessimizer 1 hour ago
              "Shouting 'fire' in a crowded theater" being used as an excuse for censorship is the surest way to know you are talking to someone who hasn't even started doing the reading. Even worse, they often (over the past very few years) self-identify as socialists or anti-war, and the decision was in order to prosecute anti-war socialists for passing out pamphlets.

              If somebody says it, they not only don't care about free speech, they don't even care about having a good faith conversation about free speech. They've probably been told this before, and didn't bother to look it up, just repeated it again. Wasting good people's time.

              edit: here's a copy of fire in a crowded theater, https://postimg.cc/gallery/q4PJnPh

    • perihelions 2 hours ago
      According to Google's censorship algorithm, Michael Osterholm's podcast (Osterholm is a famous epidemiologist and was, at the time, a member of President Biden's own gold-star covid-19 advisory panel).

      https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))

      Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.

      • delichon 2 hours ago
        My wake up moment was when they not only took down a Covid debate with a very well qualified virologist, but also removed references to it in the Google search index, not just for the YouTube link.
        • miltonlost 2 hours ago
          [flagged]
          • delichon 1 hour ago
            I am not comfortable letting Google make that decision for me. You are?
            • theossuary 1 hour ago
              We lost that choice when google became a monopoly.

              What I'm not comfortable with is preventing a private company from moderating their product.

              • janalsncm 20 minutes ago
                Your line of reasoning is mixing "is" and "ought". The whole thread is about what ought to be. I doubt most people want Google to be a monopoly.
        • barbacoa 1 hour ago
          Google went so far as to scan people's private google drives for copies of the documentary 'plandemic' and delete them.
    • jimt1234 2 hours ago
      [flagged]
    • zobzu 2 hours ago
      [flagged]
    • potsandpans 1 hour ago
      Saying lab leak was true
  • woeirua 1 hour ago
    It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.

    The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.

    • asadotzler 6 minutes ago
      Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?

      These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm is a sign of how far off course this entire discourse has moved.

    • CobrastanJorji 28 minutes ago
      Yeah, there are two main things here that are being conflated.

      First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.

      Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.

    • stronglikedan 1 hour ago
      The problem is that misinformation has now become information, and vice versa, so who was anyone to decide what was misinformation back then, or now, or ever?

      I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.

      • 3cKU 35 minutes ago
        [dead]
    • kypro 1 hour ago
      I've argued this before, but the algorithms are not the core problem here.

      For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.

      My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.

      So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very significant minority.

      • woeirua 28 minutes ago
        I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search absolutely do matter.
    • theossuary 1 hour ago
      The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.
      • squigz 16 minutes ago
        I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.
      • hsbauauvhabzb 47 minutes ago
        Algorithms that reverse the damage by providing opposing opinions could be implemented.
    • terminalshort 1 hour ago
      The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.
      • woeirua 27 minutes ago
        Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject it will just show you more and more videos reinforcing that view point because you're likely to watch them!
      • TremendousJudge 1 hour ago
        "what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?
  • rustystump 1 hour ago
    The problem with any system like this is that due to scale it will be automated which means a large swath of people will be caught up in it doing nothing wrong.

    This is why perma bans are bad. I'd rather a strike system before a temp ban, to give some breathing room for people to navigate the inevitable incorrect automation. Even then, if the copyright system is anything to go by, this is going to hurt more than help.

  • pcdoodle 22 minutes ago
    So great to see the censorship apparatus in full swing on HN. Lots of great comments into the dust bin.
  • ironman1478 23 minutes ago
    There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.

    The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.

    • asadotzler 9 minutes ago
      No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.
    • reop2whiskey 17 minutes ago
      What if the government is the source of misinformation?
      • ironman1478 13 minutes ago
        It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.

        I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.

        We have mechanisms for combating the government through lawsuits. If the government comes out with lies that actively harm people, I hope lawsuits come through, or you know... people organize and vote for people who represent their interests.

    • alex1138 22 minutes ago
      Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong, all those that we banned off platforms (the actual experts) were right
  • serf 36 minutes ago
    i'd like to think that if I were a YTer that got banned for saying something that I believed in that I would at least have the dignity not to take my value back to the group that squelched me.

    ..but i'm not a yter.

    • TeMPOraL 30 minutes ago
      It's showbiz. For those making actual money there, sacrificing dignity is the price of entry.
  • alex1138 24 minutes ago
    So the other day, I linked to something on Rumble right here on Hacker News and was told to find a better source

    First of all, you can't separate a thing's content from the platform it's hosted on? Really?

    Second of all, this is why

    I'll just go do this again and if you flag me it's on you, you have no standing to do it (the internet is supposed to be democratic, remember?)

    https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...

    https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...

    https://rumble.com/vt62y6-covid-19-a-second-opinion.html

    https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...

    I could go on. Feel free if you want to see more. :)

    (Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)

  • guelo 51 minutes ago
    The amount of flagged hidden comments here by the supposed anti censorship side is almost funny.
    • dang 41 minutes ago
      If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.

      On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.

      (oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)

      https://news.ycombinator.com/newsguidelines.html

      • alex1138 16 minutes ago
        Yeah, but in practice this isn't actually the case; people flag all the time just because someone has a dissenting opinion, fitting none of the categories you mentioned.
        • dang 13 minutes ago
          As mentioned, I haven't seen cases of that in the current thread. If there are any, I'd appreciate links. We don't see everything.
  • valentinammm 34 minutes ago
    [dead]
  • cindyllm 6 minutes ago
    [dead]
  • cbradford 1 hour ago
    So absolutely no one involved will have any repercussions. So they will all do it over again at the next opportunity
    • asadotzler 4 minutes ago
      They are mega-corporations. They always do whatever the hell they want, certainly absent your input. Did you really believe they don't do whatever they want? Because that's pretty damned naive.
    • JumpCrisscross 53 minutes ago
      > they will all do it over again at the next opportunity

      Future tense?

    • lazyeye 53 minutes ago
      What should the punishment be for having opinions the govt disagrees with?
      • th0ma5 52 minutes ago
        Notoriety
        • lazyeye 49 minutes ago
          Yep..and fame, admiration, contempt, loathing, indifference etc
      • Supermancho 48 minutes ago
        Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.

        The next Drain-o chug challenge "accident" is inevitable, at this rate.

    • johnnyanmac 53 minutes ago
      yeah, 2025 in a nutshell. The year of letting all the grifts thrive.
  • oldpersonintx2 2 hours ago
    [dead]
  • jimt1234 2 hours ago
    [flagged]
  • najarvg 2 hours ago
    [flagged]
    • ch4s3 2 hours ago
      Far too many people are free speech hypocrites.
    • eschulz 2 hours ago
      who doesn't get free speech?
      • apercu 2 hours ago
        [flagged]
      • jimt1234 2 hours ago
        [flagged]
        • cptnapalm 2 hours ago
          [flagged]
        • SV_BubbleTime 2 hours ago
          [flagged]
          • lesuorac 2 hours ago
            [flagged]
            • SV_BubbleTime 1 hour ago
              Nice edit where you cut out all the "progressive" stuff to make the same lie that the shooter is right wing, because you want to do anything you can to avoid admitting that the extreme left even exists, let alone is capable of assassination.

              Yes, those are Kimmel's words, and when he said them, everyone already knew the facts about his gay lifestyle and trans boyfriend and that he murdered Kirk out of hate. It's nice to see that you picked up his torch regardless of how little rational sense it makes.

              >Somebody using violence for political means is exactly aligned with Charlie Kirk's spoken words.

              Cite it. In whole context, but video is preferable. The guy had a decade of being in front of the camera, post any the hateful video, it should be easy! Show us the hate, if not the justification for his own murder of course.

  • guelo 2 hours ago
    [flagged]
    • paulryanrogers 2 hours ago
      Steelman argument is it's better to know what liars, bigots, and other naughty people are up to than push them entirely underground. And someday future moderators may think you're naughty/lying/a quack/etc.

      IMO we should not let private platforms become near monopolies, and certainly not without regulation, since they become a de facto public square. But if we're going to let them eat the world, then hopefully they'll at least use good judgment and measures like de-ranking or even banning folks who encourage others to do harm. Making bans temporary is a safety valve in case of bad moderation.

      • immibis 1 hour ago
        That steelman is still a pretty bad argument, though. I don't see why giving liars, bigots and other naughty people a megaphone is required in order to know what they're saying.
        • brokencode 49 minutes ago
          Who gets to decide who’s naughty? One day it’s the Biden admin, and the next it’s the Trump admin. That’s the tough part about censorship.

          You can leave it up to companies, but what happens when Trump allies like Elon Musk and Larry Ellison buy up major platforms like Twitter and TikTok?

          Do we really trust those guys with that much power?

        • paulryanrogers 1 hour ago
          Yeah, I personally still see a place for permanent bans. But I can see the other side.
        • jimmygrapes 1 hour ago
          I suppose the argument there is that it's not necessarily a megaphone for the fella with 24 followers. The concern comes from when someone amasses a following through "acceptable" means and then pivots. Not sure how to balance that.
          • nickthegreek 1 hour ago
            new algos will gladly give people with 24 followers millions of views if the content pushes the right metrics
      • hash872 2 hours ago
        What is YouTube a 'near monopoly' in? Online video? Do you have any idea how much video there is online that's not on YouTube? They don't meet the legal definition of a monopoly.
    • bawolff 2 hours ago
      People change/make mistakes. Permanent bans are rarely a good idea.
      • ryandrake 56 minutes ago
        Earlier in 2025, the video game Fortnite announced[1] that they were giving cheaters with lifetime bans a "second chance" and let them return to the game. Lo and behold, cheating in the game spiked up this year and has returned as a huge ongoing problem. Turns out, the vast majority of the bans were probably correct, and when you let people back into something who were banned for doing X, they're going to immediately start doing X again once they're back in.

        1: https://www.fortnite.com/news/fortnite-anti-cheat-update-feb...

      • stefantalpalaru 1 hour ago
        [dead]
    • dotnet00 2 hours ago
      Admittedly, Google was very heavy handed with Covid censorship. Sure, there was a lot of genuine misinformation that maybe deserved it, but they also tended to catch a lot of actual qualified scientists engaging in scientific debate (say, arguing in favor of masks and the transmission through air theory in the early days) or even some discussion that wasn't opposing the official stances.

      Somewhat related, it's pretty insane how even to this day YouTubers have to avoid referring to by name a global multi-year situation that everyone who existed at the time went through. It's due to advertisers rather than government pressure, but still, insane.

      • layman51 1 hour ago
        Your point reminded me that around the time when the pandemic first started, I saw a YouTube video on physics titled something like "Corona and Arc Discharge" and it had the contextual note that is sometimes added to videos. I think the official name YouTube gives it is: "topical context in information panel". I thought it was a funny case where the automated system thought this physics video had something to do with COVID.
      • andy99 1 hour ago
        Yeah, at the time I got the impression they were banning dissent, not just egregious or dangerous content (whatever that even means). I thought most places came to their senses a long time ago and walked back that heavy-handedness; I'm surprised this just happened.
    • IncreasePosts 2 hours ago
      Merriam-Webster defines con man as "a person who tricks other people in order to get their money : con artist"

      Even if people were straight up wrong about their COVID-19 theories, I don't think many of the banned people were trying to get viewers to send them money.

      • mapontosevenths 1 hour ago
        > trying to get viewers to send them money.

        They were trying to get viewers, to get money. It's an important distinction.

      • heavyset_go 1 hour ago
        We both know that ads and sponsorships are a significant way influencers monetize their viewers.

        All they have to do is lie to attract eyeballs and they make money. E-begging isn't necessary, the platforms allow you to extract value from viewers at an incredible scale.

  • apercu 2 hours ago
    [flagged]
  • moomoo11 2 hours ago
    I think hardware and IP-level bans should be banned.

    I know that some services do this in addition to account ban.

    • ocdtrekkie 1 hour ago
      Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.
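        For what it's worth, the usual first defense short of outright IP bans is per-IP rate limiting of account creation. A minimal token-bucket sketch (the capacity and refill numbers here are illustrative, not any real service's policy):

        ```python
        import time
        from collections import defaultdict

        # Illustrative limits, not any real platform's values.
        CAPACITY = 3               # allow a burst of 3 sign-ups per IP
        REFILL_PER_SEC = 1 / 3600  # then one new sign-up token per hour

        class SignupLimiter:
            """Per-IP token bucket gating account creation."""

            def __init__(self):
                # ip -> [tokens remaining, timestamp of last update]
                self.buckets = defaultdict(lambda: [CAPACITY, time.monotonic()])

            def allow(self, ip: str) -> bool:
                tokens, last = self.buckets[ip]
                now = time.monotonic()
                # Refill proportionally to elapsed time, capped at CAPACITY.
                tokens = min(CAPACITY, tokens + (now - last) * REFILL_PER_SEC)
                if tokens >= 1:
                    self.buckets[ip] = [tokens - 1, now]
                    return True
                self.buckets[ip] = [tokens, now]
                return False
        ```

        This throttles bulk account creation from one address without permanently banning the IP, though shared NATs and VPN exits make any per-IP scheme a blunt instrument.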
      • jjk166 1 hour ago
        If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and give them the opportunity to hide. Removing only the offending content and alerting authorities is the appropriate thing a website like Youtube should be doing.

        Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.

        • JumpCrisscross 43 minutes ago
          > If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life

          We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.

        • ocdtrekkie 58 minutes ago
          Yes, we should let people "self-incriminate" with Tor and disposable email services...