886 comments

  • softwaredoug a day ago

    I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

    • andy99 a day ago

      The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning tylenol-autism sceptical accounts?

      • mapontosevenths a day ago

        > the government and/or a big tech company shouldn't decide what people are "allowed" to say.

        That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

        Then again, Alphabet is now claiming they did want to host it and that mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.

        > What if they started banning tylenol-autism sceptical accounts?

        What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.

        • int_19h a day ago

          > There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

          It really depends. I remember after the Christchurch mosque shootings, there was a scramble to block the distribution of the shooter's manifesto. In some countries, the government could declare the content illegal directly, but in others, such as Australia, they didn't have pre-existing laws sufficiently wide to cover that, and so what happened in practice is that ISPs "proactively" formed a voluntary censorship cartel, acting in lockstep to block access to all copies of the manifesto, while the government was working on the new laws. If the practical end result is the same - a complete country block on some content - does it really matter whether it's dressed up as public or private censorship?

          And with large tech companies like Alphabet and Meta, it is a particularly pointed question given how much the market is monopolized.

          • onecommentman a day ago

            I wonder, in the case of mass violence events that were used as advertisement for the (assumed) murderer’s POV, whether there should be an equivalent of a House of Lords for the exceptional situation of censoring what in any other context would be breaking news. You don’t want or need (or would even be able) to censor a manifesto for all time, but you would want to prevent the (assumed) murderers from gaining any momentum from their heinous acts. So a ninety-day (but only 90 day) embargo on public speech from bad actors, with the teeth of governmental enforcement, sounds pretty reasonable to me. Even cleverer to salt the ether with “leaks” that would actively suppress any political momentum for the (presumed) murderers during the embargo period, but with the true light of day shining after three months.

            • int_19h 21 hours ago

              It doesn't sound reasonable to me tbh. If anything, reading those manifestos is a good way to learn just how nutty those people are in the first place. At the same time, having it accessible prevents speculation about motives, which can lead to false justification for politically oppressive measures.

              OTOH if the goal is to prevent copycats then I don't see the point of a 90-day embargo. People who are likely to take that kind of content seriously enough to emulate it are still going to do so. Tarrant, for example, specifically referenced Anders Breivik.

        • MostlyStable a day ago

          It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The first amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the first amendment is not eventually eroded away to nothing.

          • plantwallshoe a day ago

            Isn’t promoting/removing opinions you care about a form of speech?

            If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.

            If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.

            If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?

            • AfterHIA a day ago

              The crux of this is a shift in context (φρόνησις) wherein entities like marketing companies or video hosting platforms are treated like moral agents which act in the same manner as individuals. We can overcome this dilemma by clarifying that generally, "individuals with the power to direct or control the speech of others run the risk of gross oppression by being more liberal with a right to control or stifle rather than erring on the side of propagating a culture of free expression, whether this power is derived from legitimate political ascension or the concentration of capital."

              In short-- no. Your right is to positively assert, "Trump sign" not, "excludes all other signs as a comparative right" even though this is a practical consequence of supporting one candidate and not others. "Owning a marketing company" means that you must hold to industrial and business ethics in order to do business in a common economic space. Being the CEO of any company that serves the democratic public means that one's ethical obligations must reflect the democratic sentiment of the public. It used to be that, "capitalism" or, "economic liberalism" meant that the dollars and eyeballs would go elsewhere as a basic bottom line for the realization of the ethical sentiment of the nation-state. This becomes less likely under conditions of monopoly and autocracy. The truth is that Section 230 created a nightmare. If internet platforms are now ubiquitous and well-developed, aren't the protections realized under S230 now obsolete?

              It would be neat if somebody did, "you can put any sign in my yard to promote any political cause unless it is specifically X/Trump/whatever." That would constitute a unique form of exclusionary free speech.

              • plantwallshoe a day ago

                > Being the CEO of any company that serves the democratic public means that one's ethical obligations must reflect the democratic sentiment of the public.

                How does one determine the democratic sentiment of the public, especially a public that is pretty evenly ideologically split? Seems fraught with personal interpretation (which is arguably another form of free speech.)

                • AfterHIA a day ago

                  Let's think pragmatically and think of, "democracy" as a way of living which seeks to maximize human felicity and minimize human cruelty. In a fair society there would be/is a consensus that at a basic level our social contract is legitimized by commitments to those ends. The issue stems from splitting hairs about what human felicity constitutes. This can be resolved by recognizing that some dignified splitting of these hairs is a necessary component of that felicity. This presents in our society as the public discourse and the contingent but distinct values of communities in their efforts to realize themselves.

                  I'm reminded of that old line by Tolstoy-- something like, "happy families are all happy for precisely the same reasons; every unhappy family is unhappy in its own way." The point from an Adam Smith perspective is that healthy societies might all end up tending toward the same end by widely different means: Chinese communists might achieve superior cooperation and the realization of their values as, "the good life" by means dissimilar to the Quaker or the African tribesperson. The trick is seeing that the plurality of living forms and their competing values is not a hindrance to cooperation and mutual well-being but an opportunity for extended and renewed discourses about, "what we would like to be as creatures."

                  Worth mentioning:

                  https://sites.pitt.edu/~rbrandom/Courses/Antirepresentationa...

            • lmz a day ago

              Agreed. If I have a TV network and think these anti-government hosts on my network are bad for business, that is also freedom of speech.

              • rubyfan a day ago

                Maybe. If it is independent of government coercion.

                • Jensson a day ago

                  But Youtube did this after government coercion, so what is the difference?

                  • rubyfan 15 hours ago

                    Maybe it’s ok if it was an independent business decision but I’m not saying Youtube’s was or wasn’t.

                    It’s a problem especially if there is a direct or implied threat to use the powers of the government to impact a business if the government is acting counter to the first amendment. This is essentially the government causing the outcome, not a business using its free speech after an independent business decision.

                    One could argue a business might come to a decision to pull content the government doesn’t like independently, without coercion, if they had an antitrust case pending with the DOJ. There’s probably a line here: the government needs to act in a specific, threatening way for it to count as coercion. Maybe the line was crossed in YT’s case?

                    On all of these cases I come to the conclusion that there needs to be separation of powers on some of these executive branch actions. I’m not sure how to do it, but something is needed to protect individual rights from executive overreach (regardless of which party is in power).

                  • alphabettsy a day ago

                    I think you should look up the definition of coercion.

                    • Jensson 19 hours ago

                      Have you seen the emails the Biden Administration sent to YouTube? Here is a verbatim quote from one of them:

                      > we want to be sure that you have a handle on vaccine hesitancy generally and are working toward making the problem better. This is a concern that is shared at the highest (and I mean highest) levels of the White House

                      Saying you want to make sure they will censor these videos is a threat, and then they said that Biden was behind this to add legitimacy to the threat.

                      If it was just a friendly greeting, why would they threaten YouTube with Biden's name? If YouTube did this willingly, there would be no need to write such a threatening message saying they want to make sure YouTube censors these videos.

                      You can read the whole report here if you wanna see more: https://judiciary.house.gov/sites/evo-subsites/republicans-j...

                      And if you don't see that as a threat, imagine someone in the Trump administration had sent it. Do you still think it's not a threat? Of course it's a threat; it makes no sense to write that way otherwise. You would just say you wanted to hear how things were going, not say you want to make sure they do this specific thing while threatening them with the president's powers.

                      • mapontosevenths 14 hours ago

                        > And if you don't see that as a threat, imagine someone in the Trump administration had sent it. Do you still think it's not a threat?

                        We don't need to imagine anything. The chair of the FCC publicly threatened ABC over Kimmel. This morning Trump posted a direct threat of government reprisals if they didn't fire a comedian over a joke he doesn't like.

                        Nothing vague or implied about it. Just the government of the United States directly threatening free speech.

                        I won't link to Truth Social. You can Google it.

                      • alphabettsy 13 hours ago

                        Thank you for providing this report that had a conclusion before the investigation even started.

                        Fortunately, the Trump administration has given us an example of what a threat and coercion actually looks like. They declared exactly the action they would take if they did not get their preferred outcome and it’s clearly politically motivated.

                        That’s quite a bit different than “we’re concerned about this misinformation and would like you to do something about it.”

                        I think a reasonable and nuanced debate can be had on whether or not that was appropriate, but there is a difference.

                        • mapontosevenths 10 hours ago

                          > I think a reasonable and nuanced debate can be had on whether or not that was appropriate, but there is a difference.

                          I very much agree. It's reasonable to ask if the Biden admin overstepped their boundaries by politely asking if Youtube would help them stop people from murdering each other with disinformation and trying to overthrow the government.

                          I think the current situation is much less debatable. The government is now issuing ultimatums and very publicly threatening corporations to stifle free speech.

              • crtasm a day ago

                I hope to see the anti-government hosts before they're let go. The channels I've tried so far only seem to have boring old anti-corruption, anti-abuse of power and anti-treating groups of people as less than human hosts.

              • AfterHIA a day ago

                You use terms (others as well) like, "own, is the CEO of, and the owner of" and this speaks to the ironically illiberal shift we've seen in contemporary politics. Historically one needed to justify, "why" some person is put into a position of authority or power-- now as a result of the Randroid Neoliberal Assault™ it's taken for granted that if, "John Galt assumed a position of power then he has a right to exercise his personal will even at the expense of those he serves or at the expense of ethics" as an extension of, "the rights of the individual."

                I want to recapitulate this sentiment as often and as widely as possible-- Rand and her cronies know as much about virtue, freedom, and Aristotle as they do about fornicating; not much.

                • mapontosevenths 10 hours ago

                  > Rand and her cronies know as much about virtue, freedom, and Aristotle as they do about fornicating; not much.

                  Even if I disagreed with you I would upvote for this gem. I'll be chuckling at this one randomly for weeks.

                  • HankStallone 10 hours ago

                    It'd be a good zinger, except isn't it commonly known that Rand had an affair with her lead follower, and basically announced to her husband and her lover's wife that they were in open marriages from then on? It seems like fornicating was one thing she did know about.

                    • mapontosevenths 7 hours ago

                      I have no idea why, but that somehow makes it even funnier. :)

          • AfterHIA a day ago

            Bingo. This is Adam Smith's whole point in the second half of, "Wealth Of Nations" that nobody bothers to read in lieu of the sentiments of the Cato Institute and the various Adam Smith societies. Nations produce, "kinds of people" that based on their experience of a common liberty and prosperity will err against tyranny. Economics and autocracy in our country are destroying our culture of, "talk and liberality." Discourse has become, "let's take turns attacking each other and each other's positions."

            The American civilization has deep flaws but has historically worked toward, "doing what was right."

            https://www.adamsmithworks.org/documents/book-v-of-the-reven...

          • lkey a day ago

            Or it might be the case that that 'culture' is eroding the thing it claims to be protecting. https://www.popehat.com/p/how-free-speech-culture-is-killing...

            • AfterHIA a day ago

              This. Even if we have concrete protections in our society, it takes a society of people committed to a common democratic cause and a common functional prosperity to prevent abuses of the right to speak and so on (...) This isn't complicated and this wasn't always controversial.

              I've already described above that even in this thread there's a sentiment which is that, "as long as somebody has gained coercive power legitimately then it is within their right to coerce." I see terms thrown around like, "if somebody owns" or, "if somebody is the CEO of..." which speaks to the growing air of illiberality and liberal authoritarianism which is a direct result of the neoliberal assault founding and funding thousands of Cato Institutes, Adam Smith Societies, and Heritage Foundations since the neoliberal turn in the late 1960s. We've legitimized domination ethics as an extension of the hungry rights of pseudotyrants and at the expense of people in general.

              I wonder what people in general might one day do about this? I wonder if there's a historical precedent for what happens when people face oppression and the degradation of common cultural projects?

              https://en.wikipedia.org/wiki/Russian_Revolution#October_Rev...

              https://en.wikipedia.org/wiki/Reign_of_Terror

          • SantalBlush a day ago

            Are you in favor of HN allowing advertisements, shilling, or spam in these threads? Because those things are free speech. Would you like to allow comments about generic ED pills?

            I simply don't believe people who say they want to support a culture of free speech on a media or social media site. They haven't really thought about what that means.

            • AfterHIA a day ago

              Without being crude, I think they stopped, "thinking about what that means" in any positive sense a long time ago. Cultures of discourse and criticism are never good for the powerful. The goal is to create a culture where anyone can say anything but with no meaningful social consequences, negative or positive. I can call Trump a pedophile all day on my computer interface and maybe somebody else will see it, but the Google and Meta machine just treat it as another engagement dollar. These dollars are now literally flowing to the White House in the form of investment commitments by acting Tech Czar Zuckerberg.

              While I'm with my dudes in computer space-- it all starts with the passing of the Mansfield Amendment. You want to know why tech sucks and we haven't made any foundational breakthroughs for decades? The privatization of technology innovation.

              https://en.wikipedia.org/wiki/Pirates_of_Silicon_Valley

              https://www.nsf.gov/about/history/narrative#chapter-iv-tumul...

          • asadotzler a day ago

            Will you criticize my book publishing company for not publishing and distributing your smut short story?

            • AfterHIA a day ago

              Perhaps, and if you have some kind of monopoly then definitely. Things being, "yours" isn't some fundamental part of the human condition. CEOs serve their employees and shareholders and the ethics of the business space they operate in. Owners are ethically obligated to engage in fair business practices. I'm sick up to my neck of this sentiment that if John Galt is holding a gun he necessarily has the right to shoot it at somebody.

              Modern democracies aren't founded on realist ethics or absolute commitments to economic liberalism as totalizing-- they're founded on an ethical balance between the real needs of people, the real potential for capital expansion, and superior sentiments about the possibilities of the human condition. As someone who supported Ron Paul's bid for the Republican nomination as a 16-year-old, I can't help but feel that libertarian politics has ruined generations of people by getting them to accept autocracy as, "one ethical outcome of a free society." It isn't.

              The irony in me posting this will be lost on most: https://www.uschamber.com/

            • user34283 a day ago

              No, but I will criticize Apple and Google for banning smut apps.

              If those two private companies would host all legal content, this could be a thriving market.

              Somehow big tech and payment processors get to censor most software.

        • briHass a day ago

          The line should be what is illegal, which, at least in the US, is fairly permissive.

          The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.

          • a day ago
            [deleted]
        • mitthrowaway2 a day ago

          The middle ground is when a company becomes a utility. The power company can't simply disconnect your electricity because they don't feel like offering it to you, even though they own the power lines. The phone company can't disconnect your call because they disagree with what you're saying, even though they own the transmission equipment.

        • AfterHIA a day ago

          There's a literal world of literature both contemporary and classical which points to the idea that concentrations of power in politics and concentrations of wealth and power in industry aren't dissimilar. I think there are limits to this as recent commentaries by guys like Zizek seem to suggest that the, "strong Nation-State" is a positive legacy of the European enlightenment. I think this is true, "when it is."

          Power is power. Wealth is power. Political power is power. The powerful should not control the lives or destinies of the less powerful. This is the most basic description of contemporary democracy but becomes controversial when the Randroids and Commies alike start to split hairs about how the Lenins and John Galts of the world have a right to use power to further their respective political objectives.

          https://www.gutenberg.org/files/3207/3207-h/3207-h.htm (Leviathan by Hobbes)

          https://www.gutenberg.org/ebooks/50922 (Perpetual Peace by Kant)

          https://www.heritage-history.com/site/hclass/secret_societie...

        • mc32 a day ago

          The thing is that people will tell you it wasn’t actually censorship, because for them it was only a busybody, nosey government telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.

          • AfterHIA a day ago

            Great post mc32 (I hope you're a Wayne Kramer fan!)

            This is the private-public tyranny that's going on right now. The FCC can't directly tell Kimmel, "you can't say that" but they can say, "you may have violated this or this technical rule which..." This is how Project 2025 will play out in terms of people's real experience. You occupy all posts with ideologically sympathetic players and the liberality people are used to becomes ruinous as, "the watchers" are now, "watching for you." The irony is that most conservatives believe this is just, "what the left was doing in the 2010's in reverse" and I don't have a counterargument for this other than, "it doesn't matter; it's always bad and unethical." Real differences between Colbert and Tate are taken for granted.

            • mc32 14 hours ago

              All sides, and I mean all sides with one tiny sliver of an exception, will be hypocrites about freedom of speech. I’m not an absolutist, as I think there are things we know produce harm in people, especially susceptible young populations, but strictly political speech should definitely be protected and allowed. How can we not have debates about efficacy of medical products or about trustworthiness of the data?

              • mapontosevenths 8 hours ago

                > How can we not have debates about efficacy of medical products or about trustworthiness of the data?

                We must be able to have those debates, but we must also guard against grifters selling 5G-proof underwear to the functionally illiterate, or serving up vaccine skepticism based more on a desire to score political points than on science.

                The devil will always be in the details, and it's a tough balancing act. I personally suspect that the way we do this is by checking credentials.

                Doctors and scientists should be allowed to offer alternative theories. Meth-addicts posting from labs hidden in the Ozarks, software developers, and unqualified politicians, maybe not.

      • JumpCrisscross a day ago

        > the government and/or a big tech company shouldn't decide what people are "allowed" to say

        This throws out spam and fraud filters, both of which are content-based moderation.

        “Nobody moderates anything” unfortunately isn’t a functional option. Particularly if the company has to sell ads.

      • ncallaway a day ago

        As with others, I think your "and/or" between government and "big tech" is problematic.

        I think government censorship should be strictly prohibited. I think "company" censorship is just the application of the first amendment.

        Where I think the problem lies with things like YouTube is the fact that we have _monopolies_, so there is no "free market" of platforms.

        I think we should be addressing "big tech" censorship not by requiring tech companies to behave like a government, but rather by preventing any companies from having so much individual power that we _need_ them to behave like a government.

        We should have aggressive anti-trust laws, and interoperability requirements for large platforms, such that it doesn't matter if YouTube decides to be censorious, because there are 15 other platforms that people can viably use instead.

      • singleshot_ 2 hours ago

        > a big tech company shouldn't decide what people are "allowed" to say

        On their platform, that’s exactly what they are entitled to do. When you type into the box in the Facebook app, that’s your speech. But unless the platform wants to add your contribution to their coherent speech product, they have every right to reject it.

        Otherwise, the government is deciding what people can say, and you’d be against that, right?

        Further, if I wanted to start a social media platform called thinkingtylenolcausesautismisstupid.com, wouldn’t restricting my right to craft my product defeat the whole point of my business?

        Giving platforms the ability to moderate their output to craft a coherent speech product is the only reason we have multiple social networks with different rules, instead of one first-mover social network with no rules where everyone is locked in by network effects.

      • AfterHIA a day ago

        Another way of articulating this: "concentrations of power and wealth should not determine the speech or political sentiments of the many."

        My fear is that this is incredibly uncontroversial until it's not-- when push comes to shove we start having debates about what are, "legitimate" concentrations of power (wealth) and how that legitimacy in itself lets us, "tolerate what we would generally condemn as intolerable." I feel we need to take a cue from the Chomskys of the world and decree:

        "all unjustified concentrations of power and wealth are necessarily interested in control and as such we should aggressively and purposefully refuse to tolerate them at all as a basic condition of democratic living..."

        This used to be, "social democracy" whereas these days the Democratic Party in the United States' motto is more, "let us make deals with the devil because reasons and things." People have the power. We are the people. Hare fucking Krsna.

      • heavyset_go a day ago

        This is just a reminder that we're both posting on one of the most heavily censored, big tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.

        What you are arguing for is a dissolution of HN and sites like it.

      • asadotzler a day ago

        No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.

        As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.

        • mitthrowaway2 a day ago

          No, they ban your account and exclude you from the market commons if they don't like what you say.

          • mulmen a day ago

            Yes that’s how free markets work. Your idea has to be free to die in obscurity.

            Compelled speech is not free speech. You have no right to an audience. The existence of a wide distribution platform does not grant you a right to it.

            These arguments fall completely flat because it’s always about the right to distribute misinformation. It’s never about posting porn or war crimes or spam. That kind of curation isn’t contentious.

            Google didn’t suddenly see the light and become free speech absolutists. They caved to political pressure and are selectively allowing the preferred misinformation of the current administration.

            • int_19h a day ago

              A market that has companies with the size - or rather, the market dominance - of the likes of Google is not meaningfully a free market. The fundamental problem isn't whether Google censors or not, nor what it censors, but the very fact that its decision on this matter is so impactful.

              • mulmen a day ago

                If you want to debate antitrust and regulation then let’s do it. Google’s dominance is bad for our society, culture, and our economy, but it’s not a reason to erode our fundamental rights. Compelling speech will do nothing to erode Google’s market share or encourage competition. In fact it will further entrench Google’s dominance.

                • int_19h 21 hours ago

                  You're right, but freedom of speech is also a valid angle from which to debate antitrust and regulation. Indeed, I don't want Google to be compelled to platform others - I want platforms that large to not exist in the first place. But pointing out that censorship by big tech megacorps has very real and very negative effects that can be comparable to outright government censorship in some cases is a part of that fight.

                  • mulmen 19 hours ago

                    > You're right, but freedom of speech is also a valid angle from which to debate antitrust and regulation.

                    The effect of YouTube’s content moderation, at its scale, on speech is a symptom of weak antitrust policy, not a free-expression problem. So sure, mention the effect on speech if you want, but don’t ignore the solution.

                • Dylan16807 a day ago ago

                  How is compelling google to censor less going to entrench their dominance? If it's purely by making them suck less, I'm okay with that risk.

                  And I don't think it erodes any fundamental rights to put restrictions on huge monopolies.

                  • mulmen a day ago ago

                    > How is compelling google to censor less going to entrench their dominance?

                    If you force Google alone to amplify certain speech then what competitive advantage does a less censorious service provide?

                    > If it's purely by making them suck less, I'm okay with that risk.

                    Define “suck less”. Now ask yourself if you are comfortable with someone you completely disagree with defining what sucks less.

                    > And I don't think it erodes any fundamental rights to put restrictions on huge monopolies.

                    You’re talking about antitrust, not free expression.

                    Compelled speech is an erosion of the first amendment. You may think that erosion is acceptable but you can’t deny it exists.

                    • Dylan16807 a day ago ago

                      > If you force Google alone to amplify certain speech then what competitive advantage does a less censorious service provide?

                      If that's the only "advantage" another service has, I don't care if it has no competitive advantage. If it offers anything else then that's the advantage.

                      Seriously this idea is super weird to me. There are plenty of reasons to avoid too much regulation. But "don't force company X to make their users happier because happy users won't leave" is a terrible reason.

                      >Define “suck less”. Now ask yourself if you are comfortable with someone you completely disagree with defining what sucks less.

                      A big part of the "if" is that people are making their own evaluations.

                      > You’re talking about antitrust

                      I am not talking about antitrust. I'm saying that the bigger and more powerful a corporation gets the further it is from a human and human rights.

                      > Compelled speech is an erosion of the first amendment. You may think that erosion is acceptable but you can’t deny it exists.

                      In this case, barely at all, and it's the same one we already have for common carriers.

                      • mulmen 21 hours ago ago

                        > If that's the only "advantage" another service has, I don't care if it has no competitive advantage. If it offers anything else then that's the advantage.

                        The value proposition of a less censorious YouTube alternative is exactly that it is less censorious. You’re seemingly arguing against free markets.

                        > Seriously this idea is super weird to me. There are plenty of reasons to avoid too much regulation. But "don't force company X to make their users happier because happy users won't leave" is a terrible reason.

                        The problem with compelled speech is that the government should not be in the business of deciding what kind of speech makes people happy.

                        > A big part of the "if" is that people are making their own evaluations.

                        People should have the freedom to choose the media they consume. Compelled speech takes that choice away from them by putting the government in the position of making that decision for the people. This distorts the marketplace of ideas.

                        I don’t have time to read every comment or email or watch every video. Private content moderation is a value add and a form of expression. We need competition in that space, not government restriction.

                        > I am not talking about antitrust. I'm saying that the bigger and more powerful a corporation gets the further it is from a human and human rights.

                        If your problem with Google is how much influence they have then yes, you are talking about antitrust. That’s the regulatory mechanism by which excessive corporate influence can be restricted.

                        > In this case, barely at all, and it's the same one we already have for common carriers.

                        “A little” is still more than nothing, which was your previous assertion. You may be comfortable with the rising temperature of our shared pot of water but I say it is a cause for concern.

                        • Dylan16807 16 hours ago ago

                          > You’re seemingly arguing against free markets.

                          You're only talking about the people that like a feature. Why do you need a free market for that if every company can do it?

                          Not everything has to be a free market. There are reasons to use competition but not this reason.

                          > the government should not be in the business of deciding what kind of speech makes people happy

                          I did not say or intentionally imply they should.

                          > People should have the freedom to choose the media they consume. Compelled speech takes that choice away

                          Not if the compelling is just that they can't ban content. That only adds choice.

                          > If your problem with Google is how much influence they have then yes, you are talking about antitrust. That’s the regulatory mechanism by which excessive corporate influence can be restricted.

                          There can be other mechanisms, and more importantly my argument there isn't about mechanisms. They are barely barely humanlike, so human rights are barely barely relevant.

                          > “A little” is still more than nothing which was your previous assertion. You may be comfortable with the rising temperature of our shared pot of water but I say it is a cause for concern.

                          It's barely any increase because we already have common carrier rules.

                          And I stand by the statement that it doesn't erode fundamental rights. The right of giant corporations to have free speech is at the edge, not fundamental. And a rule like that increases the free speech of so many actual humans.

                      • 21 hours ago ago
                        [deleted]
            • themaninthedark 21 hours ago ago

              Just to split hairs here, as I do not think that a company should be forced to host content.

              Hosting content is not giving someone an audience.

              If I take my stool into the main square, stand on it, and give a speech about the evils of canned spinach, and people pass by but no one stops and listens (or not for long), I did not have an audience.

              If I record the same thing, put it up on Youtube, and the same reaction happens (I only get 5~10 views), Youtube is not giving me an audience. They are hosting the video, just like they do for many other videos that are uploaded every day.

              If Youtube suddenly starts pushing my video onto everyone's "Home", "Recommended " or whatever; then that would be them giving me an audience.

              If the Big Spinach Canners find my video and ask Youtube to take it down, that is censorship.

              • mulmen 20 hours ago ago

                > Hosting content is not giving someone an audience.

                Yes, it is.

                > If I take my stool into the main square and stand on it, giving a speech about the evils of canned spinach. People pass by but no-one stops and listens(or not for long), I did not have an audience.

                Well, yes, you did. They are free to cheer, boo, or leave. YouTube is more like an open mic night. I reject the idea that it is a public space like a main square.

                > If I record the same thing and put it up on Youtube and the same reaction happens. I only get 5~10 views, Youtube is not giving me an audience. They are hosting the video, just like they do for many other videos that are uploaded everyday.

                I am lucky to have never worked in content moderation but I’m certain YouTube refuses or removes submissions every day. So while your spinach speech may survive there are many other videos that don’t.

                > If Youtube suddenly starts pushing my video onto everyone's "Home", "Recommended " or whatever; then that would be them giving me an audience.

                Being on YouTube at all is YouTube giving you an audience. Their recommendation algorithm is the value proposition of their product to consumers whose attention is the product sold to advertisers.

                > If the Big Spinach Canners find my video and ask Youtube to take it down, that is censorship.

                Perhaps in the strictest dictionary sense it is censorship but it is not censorship in a first amendment sense. This is a private business decision. You’re free to submit your video as an ad and pay Google directly for eyeballs. And they can still say no.

                The only problem here is the size of YouTube relative to competitors. The fix there is antitrust, not erosion of civil liberties.

                Consider the landscape that evolves in a post-YouTube environment with an eroded first amendment and without section 230 protections. Those protections are critical for innovation and free expression.

        • AfterHIA a day ago ago

          If the furry smut people became the dominant force in literature and your company was driven out of business fairly for not producing enough furry smut, would that too constitute censorship?

          I want to see how steep this hill you're willing to die on is. What's that old saying-- that thing about the shoe being on the other foot?

      • zetazzed a day ago ago

        Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?

        The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.

        • AfterHIA a day ago ago

          Let's say that in the future the dominant form of entertainment is X-rated animal snuff films, for whatever reason. Would a lack of alternative content constitute an attack on your right to choose freely or to speak? Given your ethical framework I'd have to say "no", but even as your discursive opponent I would have to admit that if you as a person are averse to "X-rated furry smut", I would sympathize with you as the oppressed if it meant your ability to live and communicate had been stifled or called into question. Oppression has many forms and many names. The Johnny Conservatarians want to reserve certain categories of cruelty as "necessary" or "permissible" by creating frameworks like "everything is permitted just as long as some social condition is met..."

          At the crux of things the libertarians and the non-psychos are just having a debate on when it's fair game to be unethical or cruel to others in the name of extending human freedom and human dignity. We've fallen so far from the tree.

      • ben_w 17 hours ago ago

        > There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think

          I've seen stupidity on the internet you wouldn't believe.
          Time Cube rants — four simultaneous days in one rotation — burning across static-filled CRTs.
          Ponzi pyramids stretching into forever, needing ten billion souls to stand beneath one.
          And a man proclaiming he brought peace in wars that never were, while swearing the President was born on foreign soil.
          All those moments… lost… in a rain of tweets.
        
        But even that dumb stuff aside, there are two ways for a government to silence the truth: censorship and propaganda.

        We've got LLMs now, letting interested parties (government or not) overwhelm everyone with an endless barrage of the worst, cheapest, lowest-quality AI slop, the kind that makes even AI proponents like me go "ah, I see what you mean about it being autocomplete". Even the worst of it by quality can bury a bad news story just as effectively as any censorship. Too much noise and not enough signal is already why I'm consuming far less YouTube these days, and why I gave up on Twitter when it was still called that.

        And we have AI that's a lot better at holding a conversation than just the worst, cheapest, lowest quality AI slop. We've already seen LLMs are able to induce psychosis in some people just by talking to them, and that was, so far as we can tell, accidental. How long will it be before a developer chooses to do this on purpose, and towards a goal of their choice? Even if it's just those who are susceptible, there's a lot of people.

        What's important is the freedom to share truth, no matter how uncomfortable, and especially when it's uncomfortable for those with power. Unfortunately, what we humans actually share the most is gossip, which is already a poor proxy for truth and is basically how all the witch hunts, genocides, and other moral-panic-induced horrors of history happened.

        It is all a mess; it is all hard. Don't mistake the proxy (free speech in general) for the territory (speak truth to power, I think?). Censorship is simultaneously bad and the only word I know for any act which may block propaganda, which is also bad.

      • mulmen a day ago ago

        I have some ideas I want to post on your personal webpage but you have not given me access. Why are you censoring me?

        • matthewrobertso an hour ago ago

          The government told me to.

        • AfterHIA a day ago ago

          I have a consortium of other website owners who refuse to crosslink your materials unless you put our banner on your site. Is this oppression? Oppression goes both ways, has many names, and takes many forms. Its most insidious form being the Oxford Comma.

        • mitthrowaway2 a day ago ago

          Is andy99's personal webpage a de-facto commons where the public congregates to share and exchange ideas?

          • AfterHIA a day ago ago

            I know that your post is rhetorical but I'll extend your thinking into real life: has andy99's personal webpage been created because they're an elected official representing others? Would this still give andy99 the right to distribute hate speech on their personal webpage? I think we can harmonize around "unfortunately so", and that's why I think the way forward is concentrating on the "unfortunately" and not the "so."

            We have the right to do a potentially limitless amount of unbecoming, cruel, and oppressive things to our fellow man. We also have the potential for forming and proliferating societies. We invented religion and agriculture out of dirt and need. Let us choose Nazarenes, Jeffersons, and Socrates' over Neros, Alexanders, and Napoleons. This didn't use to be politically controversial!

          • mulmen a day ago ago

            It would be if they’d stop censoring me!

    • asadotzler a day ago ago

      My refusing to distribute your work is not "silencing." Silencing would be me preventing you from distributing it.

      Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.

      • jhbadger a day ago ago

        It's not that simple. For example, when libraries remove books for political reasons they often claim it isn't "censorship" because you could buy the book at a bookstore if you wanted. But if it really would have no effect on availability they wouldn't bother to remove the book, would they?

        • amanaplanacanal a day ago ago

          Libraries are typically run by the government. Governments aren't supposed to censor speech. Private platforms are a different matter by law.

          • 4 hours ago ago
            [deleted]
        • a day ago ago
          [deleted]
      • sterlind a day ago ago

        I'd certainly consider an ISP refusing to route my packets as silencing. Is YouTube so different? Legally, sure, but practically?

        • michaelt a day ago ago

          If we were still in the age of personal blogs and phpbb forums, where there were thousands of different venues - the fact the chess forum would ban you for discussing checkers was no problem at all.

          But these days, when you can count the forums on one hand even if you're missing a few fingers, and they all have extremely similar (American-style) censorship policies? To me it's less clear than it once was.

        • jabwd a day ago ago

          yes... coz youtube is not your ISP. A literal massive difference. RE: net neutrality.

        • scarface_74 a day ago ago

          No, because you are perfectly capable, technically, of setting up your own servers in a colo and distributing your video.

      • pfannkuchen a day ago ago

        I think the feeling of silencing comes from it being a blacklist and not a whitelist.

        If you take proposals from whoever and then only approve ones you specifically like, for whatever reason, then I don’t think anyone would feel silenced by that.

        If you take anything from anyone, and a huge volume of it, on any topic and you don’t care what, except for a few politically controversial areas, that feels more like silencing. Especially when there is no alternative service available due to network effects and subsidies from arguably monopolistic practices.

        • mock-possum a day ago ago

          Also allowing it to be posted initially for a period of time before being taken down feels worse than simply preventing it from ever being published on your platform to begin with.

          Of course they would never check things before allowing them to be posted because there isn’t any profit in that.

      • ultrarunner a day ago ago

        At some level these platforms are the public square and facilitate public discussion. In fact, Google has explicitly deprioritized public forum sites (e.g. PHPbb) in preference to forums like YouTube. Surely there is a difference between declining to host and distribute adult material and enforcing a preferred viewpoint on a current topic.

        Sure, Google doesn't need to host anything they don't want to; make it all Nazi apologia if they think it serves their shareholders. But doing so and silencing all other viewpoints in that particular medium is surely not a net benefit for society, independent of how it affects Google.

        • Scoundreller a day ago ago

          “Covid” related search results were definitely hard-coded or given a hand-tuned boost. Wikipedia was landing on the 2nd or 3rd page which never happens for a general search term on Google.

          I’d even search for “coronavirus” and primarily get “official” sites about Covid-19 even tho that’s just one of many coronaviruses. At least Wikipedia makes the front page again, with the Covid-19 page outranking the coronavirus page…

      • unyttigfjelltol a day ago ago

        > My refusing to distribute your work is not "silencing."

        That distinction is a relic of a world of truly public spaces used for communication— a literal town square. Then it became the malls and shopping centers, then the Internet— which runs on private pipes— and now it’s technological walled gardens. Being excluded from a walled garden now is effectively being “silenced” the same way being excluded from the town square was when whatever case law you’re thinking of was decided.

      • Jensson a day ago ago

        > No one owes you distribution unless you have a contract saying otherwise.

        The common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations. They shouldn't have the power to decide elections on a whim, etc.

        • AfterHIA a day ago ago

          I 100% agree with your sentiment here Jensson, but in Googling "common carrier law" what I get are the sets of laws governing transportation services liability:

          https://en.wikipedia.org/wiki/Common_carrier

          Is there perhaps another name for what you're describing? It piques my interest.

          • Jensson a day ago ago

            Common carrier also applies to phones and electricity and so on, it is what prevents your phone service provider from deciding who you can call or what you can say. Imagine a world where your phone service provider could beep out all your swear words, or if they prevented you from calling certain people, that is what common carrier prevents.

            So Google banning anyone talking about Covid is the equivalent of a phone service provider ending service for anyone mentioning Covid on their phones. Nobody but the most extreme authoritarians thinks phone providers should be allowed to do that, so why not apply this to Google as well?

            • amanaplanacanal a day ago ago

              This is essentially the free speech maximalist position: allow any legal content.

              If they did that, people would leave the service in droves for a competitor with reasonable moderation. Nobody wants to use a site that is overrun with spam and porn.

              • throwmeaway222 3 hours ago ago

                So what's the solution, just continue to build two Americas that will eventually go to war?

              • nradov a day ago ago

                Perhaps. But another approach would be to give users better filtering features so that they wouldn't see content they consider objectionable, even if it's not censored and still readily available to other users.

              • Jensson a day ago ago

                > If they did that, people would leave the service in droves for a competitor with reasonable moderation.

                Did people leave Google in droves in favor of a competitor that censors out all porn from search results? No, people had no issue that you can find porn on Google, they still used it. Youtube providing porn to those who want it does not cause problems for anyone, just like it doesn't for Google search, and Google even run both so they can easily apply this same feature on Youtube.

                > Nobody wants to use a site that is overrun with spam and porn.

                The internet is overrun by spam and porn yet people still use it, so you are clearly wrong. Google already runs a search engine over the internet that is capable of not showing you porn when you don't search for it, while still letting you find it if you do, so Google has already solved that problem and could do the same on Youtube.

                • amanaplanacanal a day ago ago

                  Note that we are having this conversation on a site with heavy moderation. I doubt removing this moderation would in any way make the site better.

                  You might ask yourself why you are here, instead of another website with less or no moderation.

                  • Jensson 19 hours ago ago

                    The only reason we need moderation is that we have discussions, and youtube videos don't have that feature: you can't attach a video to another person's video, but you can attach a comment here to another person's comment. I am all for moderating youtube comments for that very reason, but not youtube videos.

                    I would prefer if discord / reddit and similar became common carriers of forums, not messages. So discord and reddit can't control what a subreddit does and what its moderators do, but the moderators can control what the people posting there can do.

                    By having a common carrier forum provider, anyone could easily make their own forum with their own rules and compete on an open market without needing any technical skills, and without the forum provider being able to veto everything they say and do on that forum. That is where we want to be: in such an environment HN wouldn't need to depend on ycombinator; you could have many independently moderated forums and pick the best one.

                    Discord and reddit today aren't that; both ban things they don't like, and it would be much better if we removed that power from them. Both reddit and discord admins allow porn and spam, so their censorship adds zero value to the platform. The only thing it does is kick some political factions off the platform, which doesn't add any value either, as I wouldn't visit those discords / subreddits anyway, so they don't hurt me.

                    So it isn't hard to imagine how to draft such laws where all our favorite use cases are still allowed while also adding much more freedom for users, and making life easier for these content platforms since they are no longer targeted by takedown request spam. It is a win-win for everyone except those who want to censor.

                    • amanaplanacanal 18 hours ago ago

                      You are welcome to set up such a forum provider today. You probably won't be able to get sponsors for it though. Reddit used to be much more lightly moderated, but they wanted to be able to run ads/make money. 4chan is much more lightly moderated than the big platforms.

                      Unless you make a law preventing all moderation, the users and advertisers are going to migrate to the moderated forums.

                      • Jensson 18 hours ago ago

                        > You are welcome to set up such a forum provider today. You probably won't be able to get sponsors for it though. Reddit used to be much more lightly moderated, but they wanted to be able to run ads/make money.

                        Thanks for answering why the law is needed: as you explained, a private solution cannot solve this. Advertisers wouldn't be able to push reddit to ban things if reddit weren't allowed to ban them, so you would still be able to run ads with such a law; it just reduces the power those ad companies have over you.

                        And no, the ad companies don't really care if you show porn or terrorist propaganda on your site. You can both watch porn and read terrorist propaganda on Google without leaving the site, yet every advertiser I know is happily spending a massive amount of money on Google ads. If they actually cared they would leave Google; instead they just bully those who will comply. If they know the target won't budge due to a law, they will just continue to advertise, like they do with Google.

                        These kinds of regulations are needed when the free market produces oppressive results. There are many such cases where regulations do a good job, and I don't see why these internet companies should be an exception.

      • a day ago ago
        [deleted]
      • typeofhuman a day ago ago

        Not OP, but we did learn the US federal government was instructing social media sites like Twitter to remove content it found displeasing. This is known as jawboning and is against the law.

        SCOTUS, in Bantam Books, Inc. v. Sullivan, held that governments cannot coerce private entities into censoring speech they disfavor, even if they do not issue direct legal orders.

        This was a publicly announced motivation for Elon Musk buying Twitter. Because of which we know the extent of this illegal behavior.

        Mark Zuckerberg has also publicly stated Meta was asked to remove content by the US government.

        • brookst a day ago ago

          Crazy how fast we got from “please remove health misinformation during a pandemic” (bad) to “FCC chair says government will revoke broadcast licenses for showing comedians mocking the president” (arguably considerably worse).

          • typeofhuman a day ago ago

            If you're referring to Jimmy Kimmel, you should probably consider that while the FCC member made that comment, Sinclair (the largest ABC affiliate group) and others had been demanding ABC cancel his show for its horrible ratings and awful rhetoric, which inhibited them from selling advertising. His show was bad for business. It's worth suspecting ABC let no good opportunity go to waste: save Kimmel's reputation and scapegoat the termination as political.

            More here: https://sbgi.net/sinclair-says-kimmel-suspension-is-not-enou...

            • alphabettsy a day ago ago
            • brookst a day ago ago

              I can’t figure out what you’re trying to say. It’s no big deal that the head of the FCC says they’ll pull licenses for media outlets that mock the president, because one media outlet says that would be the right commercial decision anyway?

              That can’t be your point, but I also can’t think of a more charitable interpretation.

              • typeofhuman 15 hours ago ago

                It wasn't for mocking the President. It was knowingly lying about a catastrophic event, which is a violation of FCC rules. IIUC it was because Kimmel said, in more words, that the coward that murdered Charlie Kirk was MAGA. Which is false and Kimmel knew it.

          • themaninthedark a day ago ago

            >On July 20, White House Communications Director Kate Bedingfield appeared on MSNBC. Host Mika Brzezinski asked Bedingfield about Biden's efforts to counter vaccine misinformation; apparently dissatisfied with Bedingfield's response that Biden would continue to "call it out," Brzezinski raised the specter of amending Section 230—the federal statute that shields tech platforms from liability—in order to punish social media companies explicitly.

            >In April 2021, White House advisers met with Twitter content moderators. The moderators believed the meeting had gone well, but noted in a private Slack discussion that they had fielded "one really tough question about why Alex Berenson hasn't been kicked off from the platform."

            Is there a difference between the White House stating they are looking at Section 230 and asking why this one guy has not been banned?

            • slater a day ago ago

              from your paste, it looks like Mika B. brought up the section 230 thing?

              Also, spreading disinformation about covid has real-world implications.

              Orange man getting his feelings hurt because a comedian said something isn't even in the same ballpark

              • themaninthedark a day ago ago

                Sorry, I only grabbed part of the quote. Here it is paraphrased, as the names are not that familiar to me.

                "Shouldn't they(Facebook and Twitter) be liable for publishing that information and then open to lawsuits?" - MSNBC "Certainly, they should be held accountable, You've heard the president speak very aggressively about this. He understands this is an important piece of the ecosystem." - White House Communications Director Kate Bedingfield

                Source: https://reason.com/2023/01/19/how-the-cdc-became-the-speech-...

                So yes, MSNBC brought up Section 230 and the White House Communications Director says "Yes, we are looking to hold social media accountable."

                >Also from the same source: The Twitter moderators believed the meeting had gone well, but noted in a private Slack discussion that they had fielded "one really tough question about why Alex Berenson hasn't been kicked off from the platform."

                >Throughout 2020 and 2021, Berenson had remained in contact with Twitter executives and received assurances from them that the platform respected public debate. These conversations gave Berenson no reason to think his account was at risk. But four hours after Biden accused social media companies of killing people, Twitter suspended Berenson's account.

                I don't care about Trump's feelings but if we want to be able to speak truth to power, we have to be willing to let people talk shit as well. Yes, COVID has real world implications. Almost everything does.

                People on the left say "Think about the children and implications with regard to this." People on the right say "Think about the children and implications with regard to that."

                Notice how none of them seem to be saying "Let's lay out the facts and let you think about it."

              • tbrownaw a day ago ago

                Preventing people from disputing claims of fact makes it harder to find out if those claims are actually solid. Same for arguments. https://www.goodreads.com/quotes/66643-he-who-knows-only-his...

                Preventing people from having a platform for content-free asshattery doesn't have that problem.

                (A fun implication of this line of reasoning is that the claim that Kimmel's comments were "lies" makes the jawboning against him more morally bad rather than less bad.)

              • typeofhuman a day ago ago

                > Also, spreading disinformation about covid has real-world implications.

                Your logic can be used to censor anything that goes against the narratives of the arbiters of disinformation.

                > Orange man getting his feelings hurt because comedian said something isn't even in the same ballpark

                Pejorative. Lack of evidence. Ignoring contradictory evidence. Sounds like you are locked in.

      • timmg a day ago ago

        It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.

        Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.

        I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."

        • a day ago ago
          [deleted]
      • Ekaros 21 hours ago ago

        If you refuse to distribute some information, you are making an editorial decision. Clearly you are reviewing all of the content. So you should be fully liable for all content that remains, including things like libel or copyright violation.

        To me that sounds like a fair trade. You editorialize content; you are liable for all content. In every possible way.

      • joannanewsom a day ago ago

        Jimmy Kimmel wasn't being silenced. He doesn't have a right to a late night talk show. Disney is free to end that agreement within the bounds of their contract. Being fired for social media posts isn't being silenced. Employment is for the most part at will. Getting deported for protesting the Gaza war isn't being silenced. Visas come with limitations, and the US government has the authority to revoke your visa if you break those rules. /s

        You seem to think there's a bright line of "silenced" vs "not silenced". In reality there's many ways of limiting and restricting people's expressions. Some are generally considered acceptable and some are not. When huge swaths of communication are controlled by a handful of companies, their decisions have a huge impact on what speech gets suppressed. We should interrogate whether that serves the public interest.

        • amanaplanacanal a day ago ago

          The US has pretty much given up on antitrust enforcement. That's the big problem.

        • scarface_74 a day ago ago

          The federal government was literally pressuring ABC to take Kimmel off the air. Even Ted Cruz and other prominent Republicans said that was a bridge too far.

          • joannanewsom 18 hours ago ago

            The federal government was literally pressuring YouTube to remove certain COVID content that did not violate its policies. It's said explicitly in the story.

            What I'm trying to get at is it's possible to stifle people's freedom of expression without literally blocking them from every platform. Threatening their livelihood. Threatening their home. Kicking them off these core social media networks. All of these things are "silencing". And we should be wary of doing that for things we simply disagree about.

            • rendall 16 hours ago ago

              This is such an important idea. I'm afraid that most people do not think beyond "bad opinions should be legally suppressed" and unfortunately that includes many of the purported guardians of our social morals.

            • scarface_74 12 hours ago ago

              Did they say they were going to put YouTube out of business? The FCC threatened to take away ABC's broadcast license.

              • joannanewsom 9 hours ago ago

                Is that a meaningful distinction? If they had offered instead to show strong favoritism to Disney for suspending Jimmy Kimmel would that have made it okay? I think it's wrong for the federal government to pressure Disney in this way regardless of the means.

                • raw_anon_1111 9 hours ago ago

                  There is a huge difference between "soft power" using the bully pulpit and bringing the full legal force of the government to take away a broadcast license. This was a bridge too far for conservative politicians.

                  The other distinction you seem to be ignoring is that the Biden administration was doing it because of public health concerns. Trump and the FCC were doing it because a comedian said mean things about him and a devout racist.

                  What favoritism did the Biden administration show? They still went after Google for being a monopoly.

                  Unlike Trump, who only had his administration approve the Paramount deal after accepting a $15 million bribe in public.

      • hn_throw_250915 a day ago ago

        [dead]

      • justinhj a day ago ago

        So you're saying that YouTube is a publisher and should not have Section 230 protections? They can't have it both ways. Sure, remove content that violates policies, but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.

        • tzs a day ago ago

          Section 230 does not work like you think it does. In fact it is almost opposite of what you probably think it does. The whole point was to allow them to have it both ways.

          It makes sites not count as the publisher or speaker of third party content posted to their site, even if they remove or moderate that third party content.

        • bee_rider a day ago ago

          YouTube’s business model probably wouldn’t work if they were made to be responsible for all the content they broadcasted. It would be really interesting to see a world where social media companies were treated as publishers.

          Might be a boon for federated services—smaller servers, finer-grained units of responsibility…

        • krapp a day ago ago
          • justinhj a day ago ago

            Thank you. I was completely wrong about section 230.

    • sazylusan a day ago ago

      Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.

      • cptnapalm a day ago ago

        As I understand it, Twitter has something called Community Notes. So people can write things, but it can potentially have an attached refutation.

        • prisenco a day ago ago

          Community notes is better than nothing, but they only relate to a single tweet. So if one tweet with misinformation gets 100k likes, then a community note might show up correcting it.

          But if 100 tweets each get 1000 likes, they're never singularly important enough to community note.

          • cptnapalm a day ago ago

            Fair enough on that. The problem I've seen (and don't have a good idea for how to fix) is on Reddit where the most terminally online are the worst offenders and they simply drown out everything else until non-crazy people just leave. It doesn't help that the subreddit mods are disproportionately also the terminally online.

      • hn_throwaway_99 a day ago ago

        Glad to see this, was going to make a similar comment.

        People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".

        • squigz a day ago ago

          Online, sure. But online doesn't mean YouTube or Facebook.

      • AfterHIA a day ago ago

        I feel that this is the right approach: the liability and toxicity of the platforms isn't due to them being communication platforms; it's because in most practical or technical ways they are not. They are deliberate behavior modification schemes wherein companies are willfully inflaming their customers' political and social sentiments for profit in exchange for access to the addictive platform. It's like free digital weed, but the catch is that it makes you angry and politically divisive.

        In this sense platforms like X need to be regulated more like gambling. In some ways X is a big roulette wheel that's being spun which will help stochastically determine where the next major school shooting will take place.

        • prisenco a day ago ago

          Right, engagement algorithms are like giving bad takes a rocket ship.

          The words of world-renowned epidemiologists, who were, to be frank, boring and unentertaining, could never possibly compete with crunchymom44628 yelling about how Chinese food causes covid.

          Bad takes have the advantage of the engagement of both the people who vehemently agree and the people who vehemently disagree. Everyone is incentivized to be a shock jock. And the shock jocks are then molded by the algorithm to be ever more shock jockish.

          Especially at a time when we were all thrown off the streets and into our homes and online.

          And here I'll end this by suggesting everyone watch Eddington.

          • AfterHIA 12 hours ago ago

            Just wiki'd Eddington and I'm adding it to my watch list. Thanks for the recommend prisenco.

            One of the sentiments I've flirted with in posts below/above is the idea that while bad takes and their amplification are indeed a kind of societal evil, in a society that was more effectively mediated, bad takes might serve a vital purpose in the discourse. Societies committed to their own felicity might treat disagreements as an opportunity to extend the public discourse. This seems to be the crux of the thing: we can talk all day about checks and balances, but unless a society is truly at some level committed to its own preservation and expansion, those checks and balances will end up becoming tools for domination and exploitation, as we see in the United States.

            I don't care how well you can bake; you can't make apple pie with rotten apples. No amount of sugar will correct the rot. The trick is growing healthy apples.

      • sazylusan a day ago ago

        Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.

        The first amendment was written in the 1700s...

    • yongjik a day ago ago

      I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas spread unchecked.

      If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.

      • atmavatar a day ago ago

        I wonder how much of that is giving a platform to conspiracy theorists and how much of it is the social media algorithms' manipulation making the conspiracy theories significantly more visible and persuasive.

        • prawn a day ago ago

          Is there any consideration of this with regard to Section 230? e.g., you're a passive conduit if you allow something to go online, but you're an active publisher if you actively employ any form of algorithm to publish and promote?

        • thrance 15 hours ago ago

          Look at 4chan and its derivatives: minimal algorithms, and they're the shitholes of ideas on the internet.

      • mac-attack a day ago ago

        It's poorly thought out logic. Everyone sees how messy the process of getting to a truth backed by data and science can be, and how many mistakes get made along the way, so they somehow conclude that allowing misinformation to flourish will solve the problem instead of leading to a slow decline of morality/civilization.

        Very analogous to people who don't like how inefficient governments function and somehow conclude that the solution is to put people in power with zero experience managing government.

        • mitthrowaway2 a day ago ago

          There's a journey that every hypothesis makes on the route to becoming "information", and that journey doesn't start at top-down official recognition. Ideas have to circulate, get evaluated and rejected and accepted by different groups, and eventually grasp their way towards consensus.

          I don't believe Trump's or Kennedy's ideas about COVID and medicine are the ones that deserve to win out, but I do think that top-down suppression of ideas can be very harmful to truth seeking and was harmful during the pandemic. In North America I believe this led to a delayed (and ultimately minimal) social adoption of masks, a late acceptance of the aerosol-spread vector, an over-emphasis on hand washing, and a far-too-late restriction on international travel and mass public events, well past the point when it could have contributed to containing the disease (vs Taiwan's much more effective management, for example).

          Of course there's no guarantee that those ideas would have been accepted in time to matter had there been a freer market for views, and of course it would have opened the door to more incorrect ideas as well, but I'm of the view that it would have helped.

          More importantly I think those heavy restrictions on pre-consensus ideas (as many of them would later become consensus) helped lead to a broader undermining of trust in institutions, the fallout of which we are observing today.

          • thrance 15 hours ago ago

            That journey from "fringe hypothesis" to "actual fact" doesn't start out in right-wing Facebook groups, nor has it ever. There are more efficient channels for this. Write a paper, submit it for review, tell the press maybe. But social media can't play a part in establishing the truths we hold for granted, lest we be ruled by absolute buffoons who would make vaccines and paracetamol illegal on pseudo-scientific grounds.

          • mac-attack a day ago ago

            The issues you are bringing up don't highlight that they stuck with the wrong decision, but rather that they didn't pivot to the right decision as fast as you'd like... yet your solution is bottom-up decision-making that will undoubtedly take much much longer to reach a consensus? How do you square that circle?

            Experts can study and learn from their prior mistakes. Continually doing bottom-up when we have experts is inefficient and short-sighted, no? Surely you would streamline part of the process and end up in the pre-Trump framework yet again?

            Also, I'm curious why you have such a rosy picture of the bottom-up alternatives? Are you forgetting about the ivermectin overdoses? 17,000 deaths related to hydroxychloroquine? The US president suggesting people drink bleach? It is easy to cherry-pick the mistakes that science makes while overlooking the noise and misinformation that worms its way into less-informed/less-educated thinkers when non-experts are given the reins.

            • mitthrowaway2 a day ago ago

              No, I'm not criticizing the officials for failing to reach the correct decision or adopt the correct viewpoints faster than they did. Institutions are large and risk-averse, data was incomplete, and people make mistakes.

              I'm criticizing them for suppressing the dissemination of ideas that did later turn out to be correct. I hope the distinction is clear.

              If you're going to impose a ban on the dissemination of ideas, you'd better be ten thousand percent sure that nothing covered by that ban later turns out to be the truth. Not a single one, not even if every other idea that got banned was correctly identified as a falsehood. Otherwise, the whole apparatus falls apart and institutions lose trust.

              I'm not forgetting ivermectin overdoses. I don't believe my picture is rosy. I'm aware of all the garbage ideas out there, which is why the measles is back and all the other madness. But I'm firmly of the opinion that trying to suppress these bad ideas has only redoubled their strength in the backlash, and caused a rejection of expert knowledge altogether.

    • Aloha a day ago ago

      I think it made sense as a tactical choice at the moment, just like censorship during wartime - I dont think it should go on forever, because doing so is incompatible with a free society.

      • llm_nerd a day ago ago

        It didn't even make sense at the time. It tainted everything with the suspicion that the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.

        It massively amplified the nuts. It brought it to the mainstream.

        I'm a bit amazed seeing people still justifying it after all we've learned.

        COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

        And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.

        But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew the vaccines had a negligible effect on spread. When platforms run by "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.

        And now we're living in the consequences. Where we have a worm-addled halfwit directed medicine for his child-rapist pal.

        • LeafItAlone a day ago ago

          >It massively amplified the nuts. It brought it to the mainstream.

          >COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

          In theory, I agree, kind of.

          But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration. The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed. Most people I know who ultimately refused the vaccines made up their minds before Biden took office.

          • jasonlotito a day ago ago

            > But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be.

            Google makes it very clear that these were choices they made, and were independent of whatever the government was asking. Suggesting these policies are anything other than Google's is lying.

          • a day ago ago
            [deleted]
          • llm_nerd a day ago ago

            Sure, but I'm not remotely blaming Biden[1]. A lot of tech companies took this on themselves, seeing themselves as arbiters of speech for a better world. Some admin (Trump admin) people might have given them suggestions, but they didn't have to do the strong-arm stuff, and the results weren't remotely helpful.

            We already had a pretty strong undercurrent of contrarianism regarding public health -- it's absolutely endemic on here, for instance, and was long before COVID -- but COVID mainstreamed it. Before COVID I had a neighbour who would always tell me in hushed tones that he knows what's really going on because he's been learning about it on YouTube, etc. It was sad, but he was incredibly rare. Now that's like every other dude.

            And over 80% of the US public got the vaccine! If we were to do COVID again, I doubt you'd hit even 40% in the US now. The problem is dramatically worse.

            [1] That infamous Zuck interview with Rogan, where Zuck licked Trump's anus to ingratiate himself with the new admin, was amazing in that he kept blaming Biden for things Meta did long before Biden's admin took office or even took shape. Things he did at the urging of the Trump admin pt 1. I still marvel that he could be so astonishingly deceptive and people don't spit in his lying face for it.

        • a day ago ago
          [deleted]
      • ioteg a day ago ago

        [dead]

    • Zanfa 21 hours ago ago

      IMO free speech requires moderation, but the "how" is an unsolved problem. In a completely unmoderated environment, free speech will be drowned out by propaganda from your adversaries. The decades of experience and the industrial scale that Russian (or similar) troll factories can manufacture grassroots content or fund influencers is not something that can be combated at an individual level.

      It would be a mistake to think such operations care too much about specific talking points; the goal is to drown out moderate discussion and replace it with flamewars. It's a numbers game, so they'll push in hundreds of different directions until they find something that sticks, often backing both sides of the same conflict.

    • theshrike79 17 hours ago ago

      The problem is the algorithm.

      Content that makes people angry (extreme views) brings views.

      Algorithms optimise for views -> people get recommended extreme views.

      You can test this with a fresh account, it doesn't take many swipes on Youtube Shorts to get some pretty heinous shit if you pretend to be a young male to the algorithm.

    • kypro a day ago ago

      I agree. People today are far more anti-vaccine than they were a few years ago, which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the rollout of effective vaccines.

      I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

      • gm678 a day ago ago

        That didn't happen in a vacuum; there was also a _lot_ of money going into pushing anti vaccine propaganda, both for mundane scam reasons and for political reasons: https://x.com/robert_zubrin/status/1863572439084699918?lang=...

      • trollbridge a day ago ago

        And the attempts at censorship have played a part in people drifting towards being more vaccine-hesitant or anti-vaccine.

        It's often a lot better to just let kooks speak freely.

        • hypeatei 15 hours ago ago

          > It's often a lot better to just let kooks speak freely.

          They have always been able to speak freely. I still see vaccine conspiracies on HN to this day. It was rampant during COVID as well.

        • vFunct a day ago ago

          It's less about censorship and more about more people becoming middle-class and therefore thinking they're smarter than researchers.

          There is nobody more confident in themselves than the middle-class.

          • khazhoux a day ago ago

            That’s a very confident statement presented without a hint of evidence.

            • vFunct a day ago ago

              You know that there are studies addressing this, right? I didn't just make it up.

              Here's an overview study that reviewed other studies: https://jphe.amegroups.org/article/view/9493/html

              "Pre-COVID-19 interviews with a high-income vaccine hesitant sample in Perth, Australia found that vaccine hesitancy was based on an inflated sense of agency in making medical decisions without doctors or public health officials, and a preference for “natural” methods of healthcare (30)."

              "A similar study in the United States reported on interviews from 25 White mothers in a wealthy community who refused vaccination for their children (31). These participants reported high levels of perceived personal efficacy in making health decisions for their children and higher confidence in preventing illness through individual “natural” measures such as eating organic food and exercising. Additionally, these participants report lower perceived risk of infection or disease, which is contrasted with their high perceived risk of vaccination."

              "Vaccine hesitancy among those with privilege may be more than just a product of resource access. There is evidence that individuals with high socioeconomic status perceive themselves to be more capable, hardworking, important, and deserving of resources and privileges than others (32,33)"

              • khazhoux a day ago ago

                You said the middle class is the most unreasonably confident group of people. I don't see anything to that effect in what you posted. Yes, I think it's just your made-up dismissive generalization.

                • vFunct 11 hours ago ago

                  I don't see anything in your response that refutes that. Sounds like you have no argument.

      • someNameIG a day ago ago

        It's more that people in general* connect to personal stories far more than impersonal factual data. It's easier to connect to people saying they had adverse reactions to a vaccine than to statistical data showing it's safer to get vaccinated than not. It's also easier to believe conspiracies: it's easier to think bad things happen due to the intent of bad people than that the world is a complex, hard-to-understand place with no intent behind things happening.

        These are just things that some of the population will be more attracted to, I don't think it has anything to do with censorship, lockdowns, or mandates. At most the blame can be at institutions for lacking in their ability to do effective scientific communication.

        *And this skews more to less educated and intelligent.

      • nxm a day ago ago

        Issue is when we weren't/aren't even allowed to question the efficacy or long-term side effects of any vaccine.

      • logicchains a day ago ago

        >where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.

        The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .

        • wvenable a day ago ago

          It isn't hard to find that randomized controlled trials and large meta-analyses show that COVID vaccines are highly effective. No need to rely on media. You can point to one or two observational re-analyses that show otherwise but overall they are not particularly convincing given the large body of easily accessible other evidence.

          • lisbbb a day ago ago

            I don't think a meta analysis is worth anything at all, to be totally honest with you. I also don't think those gene therapy shots were at all effective, given how many people contracted covid after receiving the full course of shots. I think basic herd immunity ended covid and the hysteria lasted far beyond the timeframe in which there was truly a problem.

            Furthermore, I think those shots are the cause of many cancers, including my wife's. The mechanism? The shots "programmed" the immune system to produce antibodies against covid to the detriment of all other functions, including producing the killer T-cells that destroy cells in the early stages of becoming cancerous. That's why so many different cancers are happening, as well as other weird issues like the nasty and deadly clotting people had. I have no idea about myocarditis, but that's fine because it is a well documented side effect that has injured a lot of people. So cancer and pulmonary issues are the result of those poorly tested drugs that were given out to millions of people without informed consent and with no basic ethical controls on the whole massive experiment.

            And before you gaslight me, please understand that my wife, age 49, was diagnosed with a very unusual cancer for someone of her sex and age, and it's been a terrible fight since June of 2024 to try and save her life, which has nearly been lost 3x already! Of course I have no proof that the Pfizer shots caused any of this, but damn, it sure could have been that. Also, her cousin, age 41, was diagnosed with breast cancer that same year. So tell me, how incredibly low probability is it that two people in the same family got cancer in the same year? It's got to be 1 in 10 million or something like that. Just don't gaslight me--we can agree to disagree. I'm living the worst case scenario post covid and I only hope my daughter, who also got the damn shots, never comes down with cancer.

            • wvenable 21 hours ago ago

              I am sorry to hear what you and your wife are going through. Nothing I say here is meant to dismiss your experience.

              That said, I think it's important to separate personal experiences from what the larger body of evidence shows. Many vaccinated people still got COVID, especially once Omicron came along. The vaccines were never perfect at preventing infection. But the strongest data we have from randomized trials and real-world results show that vaccinated people were far less likely to end up in the ICU or die from COVID. That's what the vaccines were designed to do and that's where they consistently worked.

              As for cancer, I understand why you'd connect your wife's diagnosis to the vaccine -- it's natural to search for causes -- our brains are wired to look for patterns especially when big events happen close together. But cancer registries and monitoring systems around the world haven't found an increase in cancer rates linked to COVID vaccines. The vaccines give a short-lived immune stimulus; they don't reprogram the immune system or permanently shut down T-cells. My family has a long history of cancer going back generations. Literally every other member of my family has had cancer long before COVID. The idea that there is a low probability of two people in the same family getting cancer in the same year is unfortunately not as unlikely as you want to believe. That is perhaps a cold comfort but doctors and scientists aren't seeing the pattern you're worried about.

              That isn't to say there aren't side effects to the vaccine. Myocarditis and clotting problems are well documented but rare side-effects. In fact, someone I know about indirectly had a heart attack immediately after the COVID vaccine -- his family is genetically predisposed to this kind of heart attack but it was directly triggered by the shot (he survived). It's good to acknowledge those risks. But when you look at the big picture, health agencies estimate that the vaccines prevented millions of deaths. I sadly know of a few people who died from COVID prior to vaccine availability and have family members with permanent lung issues. They're currently struggling to get another COVID shot because they don't think they can survive getting it unprotected again.

        • rpiguy a day ago ago

          I appreciate you.

          People have become more anti-vax because the Covid vaccines were at best ineffective and, as you said, anything contra-narrative is buried or ignored.

          If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.

          More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.

        • cynicalkane a day ago ago

          This is typical of Covid conspiracy theorists, or conspiracy theorists of any sort: one or two papers on one side prove something, but an overwhelming mountain of evidence on the other side does not. The theorist offers no explanation of how a planetful of scientists missed the obvious truth that some random dudes found; they just assert that it happened, or hand-wave about an inexplicable planet-wide force of censors silencing the few unremarkable randos who somehow have the truth.

          The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them. The typical example of sampling-time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be caught by a screening, producing a correlation between screening and survivability. So you get a time effect where the fastest-acting cancers never end up in the measurement, biasing the data.

          But in measurements where neither outcome changes the odds of being sampled, there can be no measurement-time effect, which is why it's not corrected for in studies like this. The authors do not explain why measurement-time effects would have anything to do with detecting or not detecting death rates, in the abstract or anywhere else in the paper, because they are quacks who apply arbitrary math to get the outcome they want.
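          A toy simulation can make the mechanism concrete. This is purely illustrative and not drawn from either paper: the "vaccine" below has zero effect, but because subjects only count as vaccinated if they survive to their scheduled shot, a naive comparison shows a huge apparent benefit.

```python
import random

random.seed(0)
N = 100_000
FOLLOW_UP = 365   # days of observation
VAX_DAY = 90      # everyone is scheduled for a shot on day 90

vax_n = vax_deaths = unvax_n = unvax_deaths = 0
for _ in range(N):
    death_day = random.expovariate(1 / 1000)  # the "vaccine" has NO effect
    vaccinated = death_day > VAX_DAY          # must survive to get the shot
    died = death_day <= FOLLOW_UP
    if vaccinated:
        vax_n += 1
        vax_deaths += died
    else:
        unvax_n += 1
        unvax_deaths += died

# Naive death rates: the "unvaccinated" group is exactly the people who
# died before day 90, so its rate is 100% even though the shot did nothing.
print(vax_deaths / vax_n, unvax_deaths / unvax_n)
```

          That guaranteed survival to day 90 is the "immortal time" the name refers to. Whether a given study actually needs that correction is exactly the point of contention above: it only applies when the outcome influences who gets sampled as exposed.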

          As another commenter pointed out, randomized controlled trials -- which cannot possibly have this made-up time effect -- often clearly show a strongly positive effect for vaccination.

          I did not read the second paper.

          • lisbbb a day ago ago

            There is no conspiracy, the studies were all crap! They raced through them, failed at basic double-blind experimental design, and gave control groups live shots afterwards, thus eliminating any retrospective studies. There was never any positive effect. It didn't exist. It's disgusting what happened, and how so many professionals that we rely on to stand up and tell the truth knuckled under to the pressure of the moment and lied or turned their backs.

      • vkou a day ago ago

        > but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

        Nah, the same grifters who stand to make a political profit of turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects, that go well beyond COVID vaccines.

        As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much - or more - than someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)

        • kypro a day ago ago

          I agree. Again the vast majority would have gotten the vaccine.

          There are always going to be people pushing out bad ideas, for all kinds of reasons. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.

          > They've completely taken over public discourse on a wide range of subjects

          Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point you are now holding a minority opinion you should consider whether "they" are right or wrong and why so many people believe what they do.

          If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this we'll converge towards truth. If you think talking and debate aren't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).

          • vel0city a day ago ago

            While I do agree "most people are not anti-vax", the rates of opting out of vaccines or doing delayed schedules or being very selective have gone way up.

            Some of these public school districts in Texas have >10% of students objecting to vaccines. My kids are effectively surrounded by unvaccinated kids whenever they go out in public. There's a 1 in 10 chance that kid on the playground has never had a vaccine, and that rate is increasing.

            A lot of the families I know actively having kids are pretty crunchy and are at least vaccine hesitant if not outright anti-vax.

            https://www.dshs.texas.gov/sites/default/files/LIDS-Immuniza...

      • stefantalpalaru a day ago ago

        > one of the only things that actually worked to stop people dying was the roll out of effective vaccines

        "A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)

        "the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)

        "The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)

        "Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)
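        For readers decoding the numbers quoted above: an odds ratio like "1.85 (95% CI: 1.33-2.57)" comes from a 2x2 table of exposure vs. outcome. A minimal sketch of the arithmetic, using made-up counts rather than any of the cited studies' data:

```python
import math

# Hypothetical counts (NOT from the cited studies):
#                 infected   not infected
# vaccinated         120         380
# unvaccinated        70         343
a, b, c, d = 120, 380, 70, 343

# Odds of infection in each group, then their ratio
odds_ratio = (a / b) / (c / d)

# Wald 95% confidence interval, computed on the log-odds scale
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

        An interval that excludes 1.0 is what drives the small p-values quoted above, but in an observational design the ratio says nothing by itself about causation: people who opt for five doses differ systematically from people who opt for none.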

      • dotnet00 a day ago ago

        [flagged]

        • mrcwinn a day ago ago

          If that were the case, wouldn’t we see vaccine skepticism in poorly educated, racist non-Western nations?

          • dotnet00 a day ago ago

            As the other reply mentions, that's where the "in your face" part comes in. Many of the diseases that can be prevented by vaccines are in living memory for those countries.

            On top of that, 'poorly educated' in those countries often means never having been to a proper school, never having finished basic schooling, being illiterate, or lacking access to information (be it the internet or social programs). That kind of skepticism is easier to help, because it stems from a place of actual ignorance, rather than believing oneself to be smarter than everyone else.

          • Jensson a day ago ago

            You do see a lot of vaccine skepticism in such countries, this study found about half of Africans view vaccines negatively.

            https://pmc.ncbi.nlm.nih.gov/articles/PMC9903367/

          • braiamp a day ago ago

            You don't see those, because it's in their faces. Or more accurately, in our faces. I live in such a country, and we would kill to have our kids vaccinated. We live with these diseases, so we aren't so stupid as to fall for misinformation.

        • a day ago ago
          [deleted]
        • xdennis a day ago ago

          > I think the anti-vax thing is mostly because the average Western education level is just abysmal.

          What does the West have to do with it? Non-westerners are even more into folk medicine and witch doctors.

          • dotnet00 a day ago ago

            They're into folk medicine, but their anti-vax issues generally come from people who don't have any means of knowing better (i.e. never been to school, dropped out at a very early grade, isolated, not even literate). Typically just education and having a doctor or a local elder respectfully explain to them that the Polio shot will help prevent their child from being paralyzed for life is enough to convince them.

            Meanwhile the 'educated' Westerner, to whom Polio is a third-world disease, will convince themselves that the doctor is lying for some reason, will choose to take the 75% chance of an asymptomatic infection because they don't truly appreciate how bad it can otherwise be, will use their access to a vast collection of humanity's information to cherry pick data that supports their position (most likely while also claiming to seek debate despite not intending to seriously consider opposing evidence), and if their gamble fails, will probably just blame immigrants, government or 'big pharma' for doing it.

          • andrewmcwatters a day ago ago

            And yet, SEA and others are still better educated than us.

            • LeafItAlone a day ago ago

              >SEA and others are still better educated than us.

              Honest question: is this true? What’s the data around this? If it is true, why are there so many people from SEA in American universities? Wouldn’t they stay in their home country or another in the area?

              I’m truly trying to learn here and square this statement with what I’ve come to understand so far.

        • kypro a day ago ago

          Anti-vax has never really been a thing though. I don't know what the data is these days, but it used to be like 1% of the population who were anti-vax.

          We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.

          And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.

          No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe peas cause cancer, and then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat-Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on, funded by big agriculture? You can see how this type of thinking happens.

          • dotnet00 a day ago ago

            Anti-vax was enough of an issue that vaccine mandates were necessary for Covid.

            It also isn't convincing to be claiming that racism isn't as big in the West given all the discourse around H1Bs, Indians (the Trump base has been pretty open on this one, with comments on JD Vance's wife, the flood of anti-Indian racism on social media, and recently the joy taken in attempting to interfere with Indians forced to fly back to the US in a hurry due to the lack of clarity on the H1B thing), how ICE is identifying illegals, a senator openly questioning the citizenship of a brown mayoral candidate and so on.

            I agree that denying something is the easiest way to convince people of the opposite, but it's also understandable when social media companies decide to censor advice from well known individuals that people should do potentially harmful things like consume horse dewormer to deal with Covid. Basically, it's complicated, though I would prefer to lean towards not censoring such opinions.

        • logicchains a day ago ago

          The anti-vax thing is because every single comparative study of vaccinated and unvaccinated children found a greater rate of developmental disorders in vaccinated children. They're also the only products for which you're not allowed to sue the manufacturers for liability, and the justification given by the manufacturers for requesting this liability protection was literally that they'd be sued out of business otherwise. If they were as safe as other treatments they wouldn't need a blanket liability immunity.

          Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

          Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.

          Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.

          Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.

          James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.

          James Lyons-Weiler, "Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them" September 2022 International Journal of Vaccine Theory Practice and Research 2(2):603-618 DOI:10.56098/ijvtpr.v2i2.59

          NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.

          Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.

          Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2, (2022): 670-686, doi: 10.56098/ijvtpr.v2i2.40.

          Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.

          • jawarner a day ago ago

            Mawson et al. 2017 (two papers) – internet survey of homeschoolers recruited from anti-vaccine groups; non-random, self-reported, unverified health outcomes. Retracted by the publisher after criticism.

            Hooker & Miller 2020/2021 – analysis of “control group” data also from self-selected surveys; same methodological problems.

            Lyons-Weiler & Thomas 2020, 2022 – data from a single pediatric practice run by one of the authors; serious selection bias.

            Joy Garner / NVKP surveys – activist-run online surveys with no verification.

            Enriquez et al. 2005 – a small cross-sectional study about allergy self-reports, not about overall neurodevelopment.

            Large, well-controlled population studies (Denmark, Finland, the U.S. Vaccine Safety Datalink, etc.) comparing vaccinated vs. unvaccinated children show no increase in autism, neurodevelopmental disorders, or overall morbidity attributable to recommended vaccines.

          • MSM a day ago ago

            I picked one at random (NVKP, "Diseases and Vaccines: NVKP Survey Results") and, while I needed to translate it to read it, it's clear (and loud!) about not actually being a scientific study.

            "We fully realize that a survey like this, even on purely scientific grounds, is flawed on all counts. The sample of children studied is far too small and unrepresentative, we didn't use control groups, and so on."

            Turns out the NVKP roughly translates to "Dutch Organization for those critical towards vaccines."

            I understand being skeptical about vaccines, but the skepticism needs to go both ways

          • lkey a day ago ago

            "If they were as safe as other treatments they wouldn't need a blanket liability immunity." Citation very much needed for this inference.

            Even if I granted every single paper's premise here, I'd still much rather have a living child with a slightly higher chance of allergies or asthma or <insert survivable condition here> than a dead child. How quickly we forget how bad things once were. Do you dispute that vaccines also accounted for 40% of the decline in infant mortality over the last 50 years? And before that, TB, flu, and smallpox killed uncountably many people. Vaccines are a public good and one of the best things we've ever created as a species.

            Do you also have theories about autism you'd like to share with the class?

            • TimorousBestie a day ago ago

              A very good point. These studies should be comparing QALYs (quality-adjusted life years, a measure of disease burden) instead of relative prevalence of a handful of negative outcomes, the latter of which is much more vulnerable to p-hacking.
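              As a toy illustration of the QALY framing (the weights below are invented for the example, not taken from any study): each outcome is scored as years lived times a 0-to-1 utility weight, which puts a survivable side effect and a fatal childhood disease on one comparable scale.

```python
# Toy QALY comparison; utility weights here are invented for illustration.
def qalys(years_lived, utility_weight):
    """Quality-adjusted life years: duration scaled by quality (0..1)."""
    return years_lived * utility_weight

# Vaccinated child who develops mild asthma, lives to 80 at utility 0.9
with_side_effect = qalys(80, 0.9)
# Unvaccinated child who dies of measles complications at age 5
fatal_disease = qalys(5, 1.0)
print(with_side_effect, fatal_disease)
```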

          • conception a day ago ago

            Here’s where “bad ideas out in the open get corrected” now gets tested. There are four really good refutations of your evidence, quite apart from the unspoken argument that “perhaps vaccines cause some measurable bad outcomes, but compare them to measles, and without herd-immunity-level uptake vaccinations aren’t nearly as useful.”

            So the important question is: Are you now going to say “well, I guess i got some bad data and i have to go back and review my beliefs” or dig in?

          • barbazoo a day ago ago

            > If they were as safe as other treatments they wouldn't need a blanket liability immunity.

            Other treatments aren’t applied preventatively to the entire healthy population, which is presumably why the tolerated risk for vaccines has to be so much lower.

          • tnias23 a day ago ago

            The studies you cite are the typical ones circulated by antivaxers and are not considered credible by the medical community due to severe methodological flaws, undisclosed biases, retractions, etc.

            To the contrary, high quality studies consistently show that vaccines are not linked to developmental disability or worse health outcomes.

          • TimorousBestie a day ago ago

            > Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

            Retracted: https://retractionwatch.com/2017/05/08/retracted-vaccine-aut...

            If you edit your list down to journal articles that you know to be valid and unretracted, I will reconsider looking through it. However, journal access in general is too expensive for me to bother reading retracted articles.

      • boxerab a day ago ago

        Yes! This MUST be why the VAERS adverse event tracker went through the roof right after the rollout began, and why excess death remains sky high in many countries to this day - because a product that didn't stop you from catching or spreading the virus was one of the only things preventing deaths. Couldn't have been our, you know, immune system or anything like that, or that the average age at death was 80 along with several co-morbidities.

    • dakial1 13 hours ago ago

      I used to think like you, believing that, on average, society would purge the craziness, but the last decade and the effect of social media and echo chambers made me see that I was completely wrong.

    • yojo a day ago ago

      I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.

      My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.

      I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.

    • electriclove a day ago ago

      I agree, and I’m pro-vaccine, but I want the choice of if/when to vaccinate my kids. I believe there were election discrepancies but I'm not sure it was stolen. I felt the ZeroHedge article about the lab leak was a reasonable possibility. All these things were shut down by the powers that be (and this was not Trump’s fault). The people shutting down discourse are the problem.

      • amanaplanacanal a day ago ago

        You pretty much have the choice about vaccinating your kids. You might not be able to send them to public school without vaccinations though, depending on your local laws.

        • electriclove 21 hours ago ago

          In California, it is required for public schools and many private schools also require it, so effectively it isn't much of a choice.

    • trinsic2 a day ago ago

      They don't silence people to stop narratives. People are silenced to cause divisions and to exert control over the population. When people stop using tech they don't control and stop supporting people or systems that do not have their best interests at heart, only then will we see real change.

      • brookst a day ago ago

        There is no conspiracy. It’s all emergent behavior by large groups of uncoordinated dunces who can’t keep even the most basic of secrets.

        • trinsic2 10 hours ago ago

          It's a known strategy, used all the time by corrupt individuals in governments around the world. It's so pervasive, I'm not even going to bother posting links.

    • hash872 a day ago ago

      It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

      If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:

      Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb- or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children?

      How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?

      Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?

      • softwaredoug a day ago ago

        I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.

        • andrewmcwatters a day ago ago

          Better out in the open with refutations or warnings than in the dark where concepts become physical dangers.

          • benjiro a day ago ago

            Refuting does not work... You can throw scientific study upon study, doctor upon doctor at it; negatives run deeper than positives.

            In the open, it becomes normalized, and it draws in more people. Would you rather have some crazies in the corner, or 50% of a population that believes something false because it became normalized?

            The only people benefiting from those dark concepts are those with financial gains. They make money from it, and push the negatives to sell their products and cures. Those who fight against it do not gain from it, and it costs them time and money. That is why it is a losing battle.

            • int_19h a day ago ago

              Whether something is normalized or not is mostly down to public opinion, not to censorship (whether by the government or by private parties).

              Few countries have more restrictions on Nazi speech than Germany. And yet not only is the AfD a thing, it keeps growing.

      • drak0n1c a day ago ago

        Read the article, along with this one https://reclaimthenet.org/google-admits-biden-white-house-pr...

        In this case it wasn't a purely private decision.

      • rahidz a day ago ago

        "Where's the limiting principle here?"

        How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?

        And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.

        Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.

      • TeeMassive a day ago ago

        > It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

        1) They are public corporations, legal creations of the state, and benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.

        2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.

        3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.

        • hash872 a day ago ago

          Re: 1- one certain protection of the state that they benefit from is the US Constitution, which as interpreted so far forbids the government from impairing their free speech rights. Making a private actor host content they personally disagree with violates their right of free speech! That's what the 1st Amendment is all about

          2. This has already been adjudicated and this argument lost https://en.wikipedia.org/wiki/Murthy_v._Missouri

          3. What market is Youtube a monopoly in?

          • themaninthedark 20 hours ago ago

            2. This has already been adjudicated and this argument lost https://en.wikipedia.org/wiki/Murthy_v._Missouri

            The 6–3 majority determined that neither the states nor other respondents had standing under Article III, reversing the Fifth Circuit decision.

            In law, standing or locus standi is a condition that a party seeking a legal remedy must show they have, by demonstrating to the court, sufficient connection to and harm from the law or action challenged to support that party's participation in the case.

            Justice Amy Coney Barrett wrote the opinion, stating: "To establish standing, the plaintiffs must demonstrate a substantial risk that, in the near future, they will suffer an injury that is traceable to a government defendant and redressable by the injunction they seek. Because no plaintiff has carried that burden, none has standing to seek a preliminary injunction."

            The Supreme Court did not say that what was done was legal, they only said that the people who were asking for the injunction and bringing the lawsuit could not show how they were being or going to be hurt.

          • TeeMassive 9 hours ago ago

            1. By accepting unique protections and benefits of and from the state, they are no longer entirely private. 2. See comments below; it doesn't say what you think it says. 3. Google has a quasi-monopoly (a monopoly doesn't require full control of a market) and abused it to favor YouTube in its search results, even if YouTube entirely by itself isn't one.

    • braiamp a day ago ago

      > But I think we have to realize silencing people doesn't work

      It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.

      - https://www.nature.com/articles/s41586-024-07524-8 - https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1... - https://dl.acm.org/doi/abs/10.1145/3479525 - https://arxiv.org/pdf/2212.11864

      Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.

    • wvenable a day ago ago

      > But I think we have to realize silencing people doesn't work.

      Doesn't it though? I've seen this repeated like it's fact, but I don't think it's true. If you disallowed some randomly chosen conspiracy theory on YouTube and other mainstream platforms, I think it would stop being part of the larger public consciousness pretty quickly.

      Many of these things arrived out of nothing and can disappear just as easily.

      It's basic human nature that simply hearing things repeated over and over embeds it into your consciousness. If you're not careful and aware of what you're consuming then that becomes a part of your world view. The most effective way to bring people back from conspiratorial thinking (like QAnon) is to unplug them from that source of information.

    • lkey a day ago ago

      I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine. It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content. As an example, for 'curing' autism, the new grift is reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.

      • lkey a day ago ago

        To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.

      • mvdtnz a day ago ago

        And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.

        • lkey a day ago ago

          These people will claim they were 'silenced' regardless. Even as they appear with their published bestseller about being silenced on every podcast and news broadcast under the sun, they will speak of the 'conspiracy' working against them at every step. The actual facts at hand almost never matter. Even at a press conference where the President is speaking on your behalf they'll speak of the 'groups' that are 'against' them, full of nefarious purpose. There is no magical set of actions that changes the incentive they have to lie, or believe lies. (except regulation of snake oil, which is not going to happen any time soon)

          • mvdtnz a day ago ago

            And most people roll their eyes and don't believe it. Which is why it's a good idea not to make it true.

            • lkey a day ago ago

              Conspiratorial thinkers are more likely to believe that Osama Bin Laden was already dead and is still alive rather than the official narrative that he was killed on the day reported. https://www.researchgate.net/publication/235449075_Dead_and_...

              In general, you can't argue or 'fact' people out of beliefs they were not argued into. The best you can do is give them a safe place to land when disconfirmation begins. Don't be too judgy, no one is immune to propaganda.

    • ants_everywhere a day ago ago

      These policies were put in place because the anti-vax and election skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.

      The US military also promoted anti-vax propaganda in the Philippines [0].

      A lot of the comments here raise good points about silencing well meaning people expressing their opinion.

      But information warfare is a fundamental part of modern warfare. And it's effective.

      An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.

      So

      > I think we have to realize silencing people doesn't work

      it seems to have been reasonably effective at combating disinformation networks

      > It just causes the ideas to metastasize

      I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.

      [0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...

    • fullshark a day ago ago

      It works 99% of the time and you are overindexing on the 1% of the time it doesn’t to draw your conclusion.

    • tonfreed a day ago ago

      The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other

      • LeafItAlone a day ago ago

        >The best disinfectant is sunlight.

        Is it? How does that work at scale?

        Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).

        Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their “evidence” and truly believe they are right, no matter what the other person says.

        • TeeMassive a day ago ago

          What's your alternative? The opposite is state dictated censorship and secrecy and those have turned very wrong every single time.

          • LeafItAlone a day ago ago

            I honestly don’t know. My libertarian foundation wants me to believe that any and all ideas should be able to spread. But with the technological and societal changes of the past 10-15 years, we’ve seen how much of a danger this can be too. A lie or mistrust can be spread faster than ever, to a wider audience than previously possible. I don’t have a solution, but what we have now is clearly not working.

            • api a day ago ago

              The root problem is that people don’t trust authorities. Why? Because they burned that trust.

              People don’t believe the scientific consensus on vaccines because there were no WMDs in Iraq, to give one of many huge examples.

              “But those were different experts!”

              No they weren’t. Not to the average person. They were “the authorities,” and “the authorities” lied us into a trillion dollar war. Why should anyone trust “the authorities” now?

              Tangentially… as bad as I think Trump is, he’s still not as bad as George W Bush in terms of lasting damage done. Bush II was easily the worst president of the last 100 years, or maybe longer. He is why we have a president Trump.

      • tzs a day ago ago

        > The best disinfectant is sunlight

        Have you actually tried to shine sunlight on online misinformation? If you do you will quickly find it doesn't really work.

        The problem is simple. It is slower to produce factually correct content. A lot slower. And when you do produce something the people producing the misinformation can quickly change their arguments.

        Also, by the time you get your argument out many of the people who saw the piece you are refuting and believed it won't even see your argument. They've moved on to other topics and aren't going to revisit that old one unless it is a topic they are particularly interested in. A large number will have noted the original misinformation, such as some totally unsafe quack cure for some illness that they don't currently have, accepted it as true, and then if they ever find themselves with that illness apply the quack cure without any further thought.

        The debunkers used to have a chance. The scammers and bullshitters always had the speed advantage when it came to producing content but widespread distribution used to be slow and expensive. If say a quack medical cure was spreading the mainstream press could ask the CDC or FDA about it, talk to researchers, and talk to doctors dealing with people showing up in emergency rooms from trying the quack cure, and they had the distribution networks to spread this information out much faster than the scammers and bullshitters.

        Now everyone has fast and cheap distribution through social media, and a large number of people only get their information from social media and so the bullshitters and scammers now have all the advantages.

      • DangitBobby a day ago ago

        And not letting the disease spread to begin with is better than any disinfectant.

      • slater- a day ago ago

        >> The best disinfectant is sunlight.

        Trump thought so too.

      • thrance a day ago ago

        How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.

        Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.

        • andrewmcwatters a day ago ago

          Well, people literally died. So, I think we all know how it played out.

          The same thing as since time immemorial will continue to occur: the educated and able will physically move themselves away from risk, and others will suffer, whether by their own volition, by association, or by lot.

    • deegles a day ago ago

      no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"

      • NullCascade a day ago ago

        Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.

        Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.

        As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.

    • dyauspitr 19 hours ago ago

      Silencing people is the only thing that works is what I’ve learned on the internet.

    • vkou a day ago ago

      > But I think we have to realize silencing people doesn't work.

      We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.

      For some reason, that didn't work either.

      What is going to work? And what is your plan for getting us to that point?

      • _spduchamp a day ago ago

        Algorithmic Accountability.

        People can post all sorts of crazy stuff, but the algorithms do not need to promote it.

        Countries can require Algorithmic Impact Assessments and set standards of compliance with recommended guidelines.

        • amanaplanacanal a day ago ago

          This seems unlikely to be constitutional in the US.

    • aesthethiccs a day ago ago

      Yes we should be allowed to bully idiots into the ground.

    • krapp a day ago ago

      >A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

      Except many people don't roll their eyes at it, and that's exactly the problem. QAnon went from a meme on 4chan to a dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. And all of this in an environment of aggressive skepticism, arguing, debating and debunking. All of the sunlight is not disinfecting anything.

      We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"

      Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.

    • bencorman a day ago ago

      I wish someone could have seen the eye roll I just performed reading this comment.

      Silencing absolutely works! How do you think disinformation metastasized!?

    • benjiro a day ago ago

      Funny thing: several people who responded here and disagreed got grayed out (i.e., downvoted into the gray... in other words, censored).

      Reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.

      The moment you are in the YouTube, TikTok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response of "they already said those studies are made up"... How do you fight that? Propaganda works by flooding the news; over time, people believe it.

      That is the result of uncensored access, because most people do not have the time to really look up a scientific study. Negative channels massively outweigh positive, fact-based channels because the latter are "boring". It's the same reason your evening news is 80% deaths, corruption, theft, politicians and taxes, or other negative world news: it has been proven that people take in negative news much more readily. Clickbait titles that are negative draw people in.

      There is a reason why Holocaust denial is illegal in some countries: the longer people can spew it, the more people actually start to believe it.

      Yes, I am going to get roasted for this, but people are easily influenced and not as smart as they think they are. We have platforms that cater to people's short attention spans with barely 1-3 minute clips. YouTube videos longer than 30 minutes are horrible for a YouTuber's income, because people simply do not have the attention span, and the lost viewership means lost income.

      Why do we have laws like seatbelts, speed limits, and other "control" over people? Because people left to their own devices can be extremely uncaring about their own family, others, even themselves.

      Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China), telling people things that may hurt or kill themselves, their family or others.

      Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work, yeah...

      We only need to look at platforms like X when "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a cesspit extremely fast (well, a bigger cesspit).

      Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. A person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned skeptical of everything vaccination-related. All because those anti-vax channels got to her.

      The anti-vax movement killed people. There is study upon study showing how red states in the US ended up with higher death rates over the relevant time periods. And yet not a single person was ever charged for this; everyone simply accepted it and never looked back. Like it was a natural thing that people's grandparents and family members died who did not need to die.

      The fact that people have given up, and now accept letting those with financial interests spew as much nonsense as they like... well, it's "normal".

      I weep for the human race because we are not going to make it.

    • breadwinner a day ago ago

      > silencing people doesn't work

      I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?

      Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?

      • JumpCrisscross a day ago ago

        Slow down our algorithmic hell hole. Particularly around elections.

        • LeafItAlone a day ago ago

          >Slow down our algorithmic hell hole.

          What are your suggestions for accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

          • JumpCrisscross a day ago ago

            > What are your suggestions on accomplishing this while also bent compatible with the idea that government and big tech should not control ideas and speech?

            Time delay. No content based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change (in this case, the timer resets) their content.

            I’d also argue for demonetising political content, but idk if that would fly.

            • LeafItAlone a day ago ago

              Ok, but how does that get implemented? Not technically, but who makes it happen and enforces the rules? For all content or just “political”? Who decides what’s “political”? Information about the disease behind a worldwide pandemic isn’t inherently “political”, but somehow it became so.

              Who decides what falls in this bucket? The government? That seems to go against the idea that government shouldn't restrict speech and ideas.

              • JumpCrisscross a day ago ago

                > who makes it happen and enforces the rules?

                Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)

                > For all content or just “political”?

                The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.

                I’d borrow from the French: restrict all content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected official by name, but then we’d just get meme names, and nobody needs that.)

                Bonus: electeds get constituent pressure to consolidate elections.

                Alternative: these platforms already track trending topics. So an easy fix is to slow down trending topics. It doesn’t even need to be by that much, what we want is for people to stop and think and have a chance to reflect on what they do, maybe take a step away from their device while at it.

          • breadwinner a day ago ago

            Easy solution: Repeal Section 230.

            Allow citizens to sue social media companies for the harm caused to them by misinformation and disinformation. The government can stay out of this.

            • JumpCrisscross a day ago ago

              > Easy solution: Repeal Section 230

              May I suggest only repealing it for companies that generate more than a certain amount of revenue from advertising, or who have more than N users and have algorithmic content elevation?

              • breadwinner a day ago ago

                That seems like a reasonable middle ground.

        • breadwinner a day ago ago

          If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

          This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.

          "We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".

          • JumpCrisscross a day ago ago

            > If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

            Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.

            • breadwinner a day ago ago

              It is social media that is the root problem.

              On the internet everything can appear equally legitimate. Breitbart looks as legit as the BBC. Sacha Baron Cohen https://www.youtube.com/watch?v=ymaWq5yZIYM

              Excerpts:

              Voltaire was right when he said "Those who can make you believe absurdities can make you commit atrocities." And social media lets authoritarians push absurdities to millions of people.

              Freedom of speech is not freedom of reach. Sadly, there will always be racists, misogynists, anti-Semites, and child abusers. We should not be giving bigots and pedophiles a free platform to amplify their views and target their victims.

              Zuckerberg says people should decide what's credible, not tech companies. When 2/3rds of millennials have not heard of Auschwitz how are they supposed to know what's true? There is such a thing as objective truth. Facts do exist.

      • altruios a day ago ago

        Censorship is a tool to combat misinformation.

        It's taking a sword to the surgery room where no scalpel has been invented yet.

        We need better tools to combat dis/mis-information.

        I wish I knew what that tool was.

        Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?

        • breadwinner a day ago ago

          Easy solution: Repeal Section 230.

          Social media platforms in the United States rely heavily on Section 230 of the Communications Decency Act, which provides them immunity from liability for most user-generated content.

          • DangitBobby a day ago ago

            This would cause widespread censorship of anything remotely controversial, including the truth. We'd be in a "censor first, ask questions later" society. Somehow that doesn't seem healthy either.

            • breadwinner a day ago ago

              Have you visited nytimes.com in recent months? Just this morning the top headline was about the lies Trump told at the UN. That's pretty controversial - the newspaper of record calling the sitting president a liar. That's not allowed in many or most countries, but it is allowed in the US. And Trump is suing New York Times for $15 billion, for defamation. That didn't intimidate NYT. They are willing to stand behind the articles they publish. If you can't stand behind what you publish, don't publish them.

              • DangitBobby a day ago ago

                Publishing your own story is not the same thing.

      • a day ago ago
        [deleted]
      • TeeMassive a day ago ago

        Have you heard about TikTok? And you think governments' intelligence agencies are not inserting their agents into key positions at big tech companies?

    • homeonthemtn a day ago ago

      You are on a platform that polices speech. It is evidence that policing speech helps establish civility and culture. There's nothing wrong with policing speech, but it can certainly be abused.

      If you were on the early Internet, you were self policing with the help of admins all the time. The difference was you had niche populations that had a stake in keeping the peace and culture of a given board

      We broke those boundaries down though and now pit strangers versus strangers for clicks and views, resulting in daily stochastic terrorism.

      Police the damn speech.

      • softwaredoug a day ago ago

        For inciting violence. Sure. Free speech isn’t absolute.

        But along with fringe Covid ideas, we limited actual speech on legitimate areas of public discourse around Covid. Like school reopening or questioning masks and social distancing.

        We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.

        (And I believe those experts actually did about as best they could given the circumstances)

        • scuff3d 19 hours ago ago

          Try to post a meme here, see how long it stays up.

          More seriously, it's just not this simple man. I know people really want it to be, but it's not.

          I watched my dad get sucked down a rabbit hole of qanon, Alex Jones, anti-vax nonsense and God knows what other conspiracy theories. I showed him point blank evidence that qanon was bullshit, and he just flat out refuses to believe it. He's representative of a not insignificant part of the population. And you can say it doesn't do any damage, but those people vote, and I think we can see clearly it's done serious damage.

          When bonkers ass fringe nonsense with no basis in reality gets platformed, and people end up in that echo chamber, it does significant damage to the public discourse. And a lot of it is geared specifically to funnel people in.

          In more mainstream media, climate change is a perfect example. The overwhelming majority of the scientific community has known for a long time that it's an issue. There was disagreement over cause or severity, but not over whether it was a problem. The media elevated dissenting opinions and gave the impression that it was somehow an even split, that the people who disagreed with climate change were as numerous and as well informed, which they most certainly weren't, not by a long shot. And that's done irreparable damage to society.

          Obviously these are very fine lines to be walked, but even throughout US history, a country where free speech is probably more valued than anywhere else on the planet, we have accepted certain limitations for the public good.

        • homeonthemtn a day ago ago

          If I were trying to govern during a generational, world-stopping epochal event, I would also not waste time picking through the trash to hear opinions.

          I would put my trust in the people I knew were trained for this and adjust from there.

          I suspect many of these opinions are born from hindsight.

          • xboxnolifes a day ago ago

            Letting fringe theories exist on YouTube does not stop you from accessing the WHO or CDC website.

            • jdiff 7 hours ago ago

              Fringe theories existing and being allowed to persist has led to wide takedowns of information from government sites including the CDC.

            • fzeroracer 18 hours ago ago

              Those fringe theories have now embedded themselves into the government itself and directly have contributed to the rot of our public health institutions. So in many ways yes, they do.

              • xboxnolifes 5 hours ago ago

                That happened even while YouTube was disallowing it.

          • themaninthedark a day ago ago

            Luckily, it is possible for you to just listen to those you trust. No need for you go pick through other people's opinions.

            I don't see how that turns into you needing to mandate what I read and who's opinions I hear.

            • scuff3d 19 hours ago ago

              There has been a massive uptick in anti-vax rhetoric over the last decade. As a result some Americans have decided to not vaccinate, and we are seeing a resurgence in diseases that should be eradicated.

      I have a three-month-old son. At the time he was being born, in my city, there was an outbreak of one of those diseases, and it killed more than one kid. Don't tell me this stuff doesn't have a direct impact on people.

          • zmgsabst a day ago ago

            Really?

            Experts have a worse track record than open debate and the COVID censorship was directed at even experts who didn’t adhere to political choices — so to my eyes, you’re saying that you’d give in to authoritarian impulses and do worse.

            • judahmeek a day ago ago

              The problem with debate is that it hinders organized action.

              At some point in any emergency, organized action has to be prioritized over debate.

              Maybe that is still authoritarian, but they do say to have moderation in all things!

              • Gud 21 hours ago ago

                No it doesn't. It allows for correct action to be taken.

              • zmgsabst a day ago ago

                That’s not at all how you’re taught to handle emergencies.

                From health emergencies to shootings to computer system crashes to pandemics, doing things without a reason to believe they'll improve the situation is dangerous. You can make things worse, and many have. And ignoring experts shouting "wait, no!" is a recipe for disaster.

                When we were responding to COVID, we had plenty of time to have that debate in a candid way. We just went down an authoritarian path instead.

              • SV_BubbleTime a day ago ago

                > The problem with debate is that it hinders organized action.

                Ah… so… ”we must do something! Even if it’s the wrong thing”

                Hot take.

              • aianus a day ago ago

                God forbid someone hinder some retarded organized action before enough peoples’ lives are ruined that our majestic rulers notice and gracefully decide to stop.

        • epistasis a day ago ago

          Really, discussion was limited? Or blatant lies were rightly excluded from discourse?

          There's a big difference, and in any healthy public discourse there are severe reputational penalties for lies.

          If school reopening couldn't be discussed, could you point to that?

          It's very odd how as time goes on my recollection differs so much from others, and I'm not sure if it's because of actual different experiences or because of the fog of memory.

          • mixmastamyk a day ago ago

            Blatant truths were excluded as well, and that's the main problem. See replies to: https://news.ycombinator.com/item?id=45353884

            • epistasis a day ago ago

              That's a really long thread and I'm not sure where blatant truths were excluded.

              • mixmastamyk 11 hours ago ago

                Each top level reply responds with an example of a truth suppressed.

                • epistasis 10 hours ago ago

                  The top comment is emphatically not that:

                  > As super low hanging fruit:

                  > June 8, 2020: WHO: Data suggests it's "very rare" for coronavirus to spread through asymptomatics [0]

                  > June 9, 2020: WHO expert backtracks after saying asymptomatic transmission 'very rare' [1]

                  > 0: https://www.axios.com/2020/06/08/who-coronavirus-asymptomati... 1: https://www.theguardian.com/world/2020/jun/09/who-expert-bac...

                  > Of course, if we just take the most recent thing they said as "revised guidance", I guess it's impossible for them to contradict themselves. Just rapidly re-re-re-revised guidance.

                  My hypotheses for our discrepant viewpoints were 1) my aging memory, or 2) different experiences, but it's actually 3) not using words to have the same meaning!

                  Citing this as "blatant truth suppression" weakens my view of any other evidence or argument you put forward, because I no longer trust that we can use words in ways that are compatible with each other.

        • McGlockenshire a day ago ago

          The "debate" ended up doing nothing but spreading misinformation.

          Society as a whole has a responsibility to not do that kind of shit. We shouldn't be encouraging the spread of lies.

        • fzeroracer 18 hours ago ago

          > We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.

          We've had these debates for decades. The end result is stuff like Florida removing all vaccine mandates. You can't debate a conspiracy or illogical thinking into going away, you can only debate it into validity.

      • jader201 a day ago ago

        > Police the damn speech.

        What happens when the “police” disagrees with and silences what you believe is true? Or when they allow the propagation of what you believe to be lies?

        Who gets to decide what’s the truth vs. lies? The “police”?

        • palmfacehn a day ago ago

          >Who gets to decide what’s the truth vs. lies? The “police”?

          This keeps coming up on this site. It seems like a basic premise for a nuanced and compassionate worldview. Humility is required. Even if we assume the best intentions, the fallible nature of man places limits on what we can do.

          Yet we keep seeing posters appealing to Scientism and "objective truth". I'm not sure it is possible to have a reasonable discussion where basic premises diverge. It is clear how these themes have been used in history to support some of the worst atrocities.

      • frollogaston a day ago ago

        Depends who is doing the policing. In this case, White House was telling Google who to ban.

        • aeternum 21 hours ago ago

          I think it was even slightly worse. The White House was effectively delegating the decision of who to ban/police to the NIH/NIAID, an organization that was funding novel coronavirus research in Wuhan.

          It's easy to see how at minimum there could be a conflict of interest.

          • frollogaston 6 hours ago ago

            Did I miss somewhere in the article or Google's statement that the NIH was involved?

            • aeternum 5 hours ago ago

              Both administrations (Trump 2016 + Biden) adopted the guidance of Fauci and others at the NIH/NIAID more or less directly. So the guidance came through the administrations but originated with the NIH/NIAID.

              You had direct statements like this from scientific experts, those experts turned out to be the middleman group that was funding Wuhan via NIH grants.

              Peter Daszak, a zoologist and president of the EcoHealth Alliance, who has been among the most vocal critics of the idea of a lab leak, wrote, “I just wanted to say a personal thank you on behalf of our staff and collaborators, for publicly standing up and stating that the scientific evidence supports a natural origin for COVID-19 from a bat-to-human spillover, not a lab release from the Wuhan Institute of Virology.”

              Given an expert statement like that, YouTube can and did take down lab-leak videos because they were misinformation (contrary to the information provided by the experts).

      • StanislavPetrov a day ago ago

        Policing speech for civility or spam is very different than policing speech for content that you disagree with. I was on the early internet, and on the vast majority of forums policing someone's speech for content rather than vulgarity or spam was almost universally opposed and frowned upon.

      • nostromo a day ago ago

        You've missed the point entirely.

        It’s not whether Google can decide what content they want on YouTube.

        The issue here is that the Biden White House was pressuring private companies to remove speech that they otherwise would host.

        That's a clear violation of the First Amendment. And we now know that the previous White House got people banned from all the major platforms: Twitter, YouTube, Facebook, etc.

        • dotnet00 a day ago ago

          They claim that the Biden admin pressured them to do it, except that they had been voluntarily doing it even during Trump's initial presidency.

          The current administration has been openly threatening companies over anything and everything they don't like, it isn't surprising all of the tech companies are claiming they actually support the first amendment and were forced by one of the current administration's favorite scapegoats to censor things.

          • 20 hours ago ago
            [deleted]
        • a day ago ago
          [deleted]
        • homeonthemtn a day ago ago

          [flagged]

          • nostromo a day ago ago

            Thankfully the constitution explicitly forbids that in the US.

            • homeonthemtn a day ago ago

              [flagged]

              • themaninthedark a day ago ago

                Huh, last I heard was that Jimmy Kimmel is back on air.

                If the Trump administration had decided to follow through with their threats, ABC could have sued and won.

                Lastly, Jimmy Kimmel could have (and still possibly might be able to) sue for tortious interference.

                • abracadaniel a day ago ago

                  Nexstar and Sinclair are still blocking their stations from airing him, which accounts for a quarter of the US.

                  • frollogaston a day ago ago

                    They're private companies. If the reason they're doing this is govt pressure (FCC licenses?), that's not ok though.

              • mensetmanusman a day ago ago

                That was abc, and they just put him back.

              • SV_BubbleTime a day ago ago

                [flagged]

                • 18 hours ago ago
                  [deleted]
      • zmgsabst a day ago ago

        [flagged]

        • z0r a day ago ago

          There is no mass Marxist movement in the USA. There is a left wing crippled by worse than useless identity politics.

        • homeonthemtn a day ago ago

          [flagged]

    • TacticalCoder a day ago ago

      [dead]

    • thrance a day ago ago

      [flagged]

    • heavyset_go a day ago ago

      [flagged]

    • felixgallo a day ago ago

      [flagged]

      • putzdown a day ago ago

        No. This perspective is wrong in both directions: (1) it is bad medicine and (2) the medicine doesn't treat the disease. If we could successfully ban bad ideas (assuming that "we" could agree on what they are) then perhaps we should. If the damage incurred by the banning of ideas were sufficiently small, perhaps we should. But both of these are false. Banning does not work. And it brings harm.

        Note that the keepers of "correct speech" doing the banning today (eg in Biden's day) can quickly become the ones being banned another day (eg Trump's).

        It's true that drowning the truth through volume is a severe problem, especially in a populace that doesn't care to seek out truth, to find needles in haystacks. But again, banning doesn't resolve this problem. The real solution is to develop a populace that cares about, seeks out, and with some skill identifies the truth. That may not be an achievable solution, and in the best case it's not going to happen quickly. But it is the only solution. All of the supply-based solutions (controlling speech itself, rather than training good listeners) run afoul of this same problem: you cannot really limit the supply, and to the extent you can, so can your opponents.

        • paulryanrogers a day ago ago

          What do you think about measures that stop short of banning? Like down ranking, demonetizing, or even hell 'banning' that just isolates cohorts that consistently violate rules?

          • rahidz a day ago ago

            Not OP, but my opinion is that if a platform wants to do so, then I have zero issues with that, unless they hold a vast majority of market share for a certain medium and have no major competition.

            But the government should stay out of it.

        • felixgallo a day ago ago

          No. You are objectively wrong. It's great medicine that works -- for example, in Germany, and in the US, and elsewhere, it has stemmed the flow of violent extremism historically to stop the KKK and the Nazis. You can't even become a citizen if you have been a Nazi. Even on the small scale, like reddit, banning /r/fatpeoplehate was originally subject to much handwringing and weeping by the so-called free speech absolutists, but guess what -- it all went away, and the edgelords and bullies went back to 4chan to sulk, resulting in the bullshit not being normalized and made part of polite society.

          If you want to live in a society where absolutely anything goes at all times, then could I recommend Somalia?

      • unclad5968 a day ago ago

        Can we stop with the Nazi stuff. I don't know if they stopped teaching history, but there is nothing happening in the US that is within an order of magnitude of the evil the Nazis perpetrated. Being anti-vax is not comparable to genocide.

        • jjk166 a day ago ago

          The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.

          Many evil people weren't Nazis; some Nazis weren't necessarily evil. Evil is not part of the definition of Nazism. Promoting authoritarianism, exclusionary nationalism, institutional racism, autarky, anti-liberalism and anti-socialism are the hallmarks of Nazism. Anyone who holds the beliefs of the Nazis is a Nazi, regardless of what level of success they have to date achieved in carrying out their aims.

          • unclad5968 a day ago ago

            > The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.

            Only because what they did in 1943 surpassed anything imaginable. In 1933 the Nazi party immediately banned all other political parties, arrested thousands of political opponents, and began forcing sterilization and abortion on anyone with hereditary illnesses. Evil is absolutely an identifying part of Nazis. The idea that Nazis are just anti-liberals is exactly why we cannot go around calling everyone we don't like Nazis. The Nazis were not some niche alt-right organization.

            If you genuinely think there are Nazis controlling youtube or the government, and all you're doing is complaining about it on hackernews, you're just as complicit as you're claiming those people were.

            • jjk166 a day ago ago

              One is not immune to being a Nazi because they are not evil, being a Nazi makes people evil. Much of the horror of the Nazis was that seemingly normal, reasonable people committed those atrocities; many without even considering that what they were doing was wrong until after the fact.

              We do not call people Nazis because we dislike them, we dislike them because they are Nazis. Most non-Nazis, when accused of being a Nazi, point out how their views differ from the Nazis. I won't claim it's always the case, but the people who argue they can't possibly be Nazis because Nazis are bad, and they are not, typically are.

              The Nazis very much were an alt-right, anti-liberal group. They were more than that; I gave a whole list of core tenets to their beliefs. Overlapping some tiny amount doesn't make someone a Nazi. Hitler being a vegetarian is not an indictment of vegetarians. But if a person were to go through the list of those 6 things the Nazis championed and find themselves championing 4 or 5 of them, it should be cause for alarm.

              • unclad5968 a day ago ago

                Listing "anti-liberalism" as one of the worst characteristics of a group that committed genocide, eugenics, enslaved minority groups, and attempted racial extermination is the issue. The idea that being anti-liberal is what makes you a Nazi and not the other stuff is ignorant at best, which is my original point about education.

                • Dylan16807 a day ago ago

                  That wasn't a list of the worst characteristics. It was a list of useful identifying characteristics. And Nazis were Nazis before they did any genocide.

        • epakai a day ago ago

          We read the history, and a lot of it rhymes. Conservatives failed, and exchanged their values for a populist outsider to maintain power (see Franz von Papen). The outsider demeans immigrants and 'sexual deviants'. The outsider champions nationalism. He pardons the people who broke the law to support him. Condemns violence against the party while ignoring the more common violence coming out of those aligned with the party. Encourages the language of enemies when discussing political opponents and protestors.

          Nazi has a lot more connotations than genocide. I'm not sure it is worth nitpicking over. Even if you tone it down to Fascist or Authoritarian there will be push back.

        • tehjoker a day ago ago

          [flagged]

      • dotnet00 a day ago ago

        How can you say that banning Nazis has worked well considering everything so far this year?

        • felixgallo a day ago ago

          Europe is sliding, but has done ok so far. Crossing fingers.

        • miltonlost a day ago ago

          Well it would if we would actually ban Nazis instead of platform them. They haven't been banned. That's the problem.

          • dotnet00 a day ago ago

            You'd have to ban them from society outright without somehow devolving into an authoritarian hellhole in the process (impossible). Trump still primarily posts on a platform specifically created to be a right wing extremist echo chamber.

            • felixgallo a day ago ago

                The choice is not 'devolving into an authoritarian hellhole' or 'giving the Nazis the ability to do whatever they want'. There is a middle ground that we have lived in for many decades effectively, until recently.

              • dotnet00 21 hours ago ago

                What changed recently? Until the latest admins, most platforms didn't change their stance on what is and is not allowed all too much. I'm not trying to imply that dichotomy, just saying that simply banning nazis is not effective, because they just retreat into their echo chambers and fester until they can trick enough people into giving them power. I don't know what the ideal solution is, but simply banning them doesn't seem to be it, and neither do the two extremes mentioned seem reasonable.

          • cpursley a day ago ago

            What is a Nazi?

            • indy a day ago ago

              For a lot of people it's "anyone who I disagree with".

          • knifemaster a day ago ago

            [flagged]

      • tehjoker a day ago ago

        [flagged]

        • indy a day ago ago

          Perhaps not the wisest comment to make in light of recent events

          • tehjoker a day ago ago

            I didn't say violence. Whatever you read into that comment is a projection. I'm not even sure violence is effective, but something more muscular than op-eds is called for. For example, labor organizing and various forms of self-defense organizations, of which there are many kinds, not only militias. For example, anti-ICE organizing which protects vulnerable people from the gestapo.

    • breadwinner a day ago ago

      The government created this problem when they enacted Section 230. This is at the root of the misinformation and disinformation... social media companies are not responsible for the harm.

      The simple solution is to repeal Section 230. When information can be transmitted instantly on a massive scale, somebody needs to be responsible for the information. The government should not police information, but citizens should be allowed to sue social media companies for the harm caused to them.

      • int_19h a day ago ago

        The practical end result of repealing Section 230 is that companies will crack down on any even remotely controversial speech because that's the only way to avoid lawsuits.

        • breadwinner a day ago ago

          The New York Times has published plenty of stories you could call controversial. Just this morning the top headline was that Trump lied at the UN. Trump has sued the Times for defamation, yet the paper stands by its reporting. That’s how publishing works: if you can’t defend what you publish, don’t publish it. The Section 230 debate is about whether large online platforms such as Facebook should bear similar accountability for the content they distribute. I think they should. That's the only way we can control misinformation and disinformation.

    • dawnerd a day ago ago

      It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if some people didn’t try to shut them down so much.

    • stinkbeetle a day ago ago

      For that matter why is it even such a crazy wild idea for anybody to dare to question medicines and motives from pharmaceutical companies? Or question elections?

      Both have always been massively shady. I'm old enough to remember the big stink around the Al Gore election loss, or the robust questioning of the 2016 election for that matter. So ridiculous for self-proclaimed defenders of democracy to want to ban the discussion and disagreement about the facts around elections. Democratic processes and institutions should be open to doubt, questioning, and discussion.

      The response to covid vaccines was actually extremely rational. They were highly taken up by the elderly who were shown to have the greatest risk, despite that demographic skewing more conservative (and arguably could be most at risk of "misinformation" from social media). And they were not able to stop transmission or provide much benefit to children and younger people, so they didn't get taken up so much among those groups. So there was really no need for this massive sustained psychological campaign of fearmongering, divisiveness, censorship, and mandates. They could have just presented the data and the facts as they came to hand, and be done with it.

      • dotnet00 a day ago ago

        With medicine there's pushback because the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage done to all sorts of towers), Ivermectin, Hydroxychloroquine and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.

        With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate? This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result without any substantiated evidence.

        • stinkbeetle 21 hours ago ago

          > With medicine there's pushback because the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage done to all sorts of towers), Ivermectin, Hydroxychloroquine and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.

          Pushback on what? There's always been new age hippy garbage, Chinese medicine, curing cancer with berries, and that kind of thing around. I don't see that causing much damage and certainly not enough to warrant censorship. People can easily see through it and in the end they believe what they want to believe.

          Far, far more danger, and the cause of real damage that I have seen, comes from the pharmaceutical industry and their captured regulators: bribing medical professionals, unconscionable public advertising practices, conspiring to push opioids on the population, lying about the cost to produce medications, and on and on. There's like, a massive list of the disasters these greedy corporations and their spineless co-conspirators among government regulators have caused.

          Good thing we can question them, their motives, their products.

          > With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate?

          I don't understand your question. Can you explain why you think Jan 6 would be a pretty good indication that discussion and disagreement about elections should be censored?

          > This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result without any substantiated evidence.

          I never quite followed exactly what the legal issues around that election were. Trump was alleged to have tried to illegally influence some election process and/or obstructed legal transfer of power. Additionally there was a riot of people who thought Trump won, and some broke into congress and tried to intimidate lawmakers.

          I mean taking the worst possible scenario, Trump knew he lost and was scheming a plan to seize power and was secretly transmitting instructions to this mob to enter the building and take lawmakers hostage or something like that. Or any other scenario you like, let your imagination go wild.

          I still fail to see how that could possibly justify censorship of the people and prohibiting them from questioning the government or its democratic processes. In fact the opposite, a government official went rogue and committed a bunch of crimes so therefore... the people should not be permitted to question or discuss the government and its actions?

          There are presumably laws against those actions of rioting, insurrection, etc. Why, if the guilty could be prosecuted with those crimes, should the innocent pay with the destruction of their human rights, in a way that wouldn't even solve the problem and could easily enable worse atrocities be committed by the government in future?

          Should people who question the 2024 election be censored? Should people who have concerns with the messages from the government's foremost immigration and deportation "experts" be prohibited from discussing their views or protesting the government's actions?

          • dotnet00 21 hours ago ago

            Robbery is a crime, so why should people take any measures to protect their things from being stolen? Murder is a crime, so why care about death threats?

            New age medicine has been around forever, yes. But the effects are only known to be negligible outside of pandemics. We know from history that people did many irrational things during past pandemics due to fear and social contagion.

            It's a tough problem, everyone believes themselves an expert on everything, plus trolls and disinformation campaigns. There's also a significant information asymmetry.

            It's funny you mention opioids as I just recently came across a tweet claiming that Indians were responsible for getting Americans addicted to them via prescription. In one of the buried reply chains, the poster admits they have no evidence and are just repeating a claim someone made to them sometime. But how many people will read that initial post and reinforce their racist beliefs vs see that the claim was unsubstantiated? And when that leads to drastic action by a madman, who's going to be the target of the blame? The responsibility is too diffused to target any specific person, the government obviously won't, madmen don't act in a vacuum and so the blame falls on the platform.

            Yes, no one should have the power to determine what ideas are and are not allowed to propagate. But on the other hand, you could still go to other platforms and are not entitled to the reach of the major platforms; then again, these platforms are extremely influential. At the same time there's also the problem that people in part view the platforms as responsible when they spread bad ideas; the platform operators also feel some level of social responsibility, while the platform owners don't want legal responsibility.

            • stinkbeetle 15 hours ago ago

              > Robbery is a crime, so why should people take any measures to protect their things from being stolen? Murder is a crime, so why care about death threats?

              I don't understand how your question relates to the discussion. Perhaps you could answer my earlier questions first and it might clear things up for me.

              Letting the robber censor discussion of the burglary is no more a measure to protect your belongings than giving the government the power to suppress free speech about how it carries out democratic processes is a measure to protect democracy from abuse by government officials.

              Should Trump have had the power to censor news and discussion of the 2016 election when there were a lot of election deniers and conspiracy theorists concerned about the legitimacy of the election and conspiracies with Russia? Absolutely not.

              > New age medicine has been around forever, yes. But the effects are only known to be negligible outside of pandemics. We know from history that people did many irrational things during past pandemics due to fear and social contagion.

              There are unfounded claims about how much damage was caused by people exercising their right to speak about covid, and they all come from authoritarians who sound like they have a ravenous thirst to gain the power to silence their critics and the population at large. So I consider them totally unreliable handwaving at best, and more likely fraudulent fabrications. I actually don't think there's anything wrong with letting them use social media platforms like anybody else. It's fine if those companies decide whose messages to amplify or set their own terms of use, but having governments pressure corporations to carry out this censorship is a crazy overreach and violation of human rights by the government.

              The response, policies and messaging and communication by governments and bureaucrats has caused far more damage to society, to public health, to trust in institutions and trust in vaccines and medical science than common people talking about it.

  • cactusplant7374 a day ago ago

    > From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”

    This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.

    • softwaredoug a day ago ago

      It's in their interests now to throw Biden under the bus. There may be truth to this, but I'm sure its exaggerated for effect.

    • HankStallone a day ago ago

      It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.

      For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.

      • dotnet00 a day ago ago

        To be fair, even if they were being honest about Biden twisting their arm (I don't buy it), the timing makes it impossible to believe their claim.

        • CSMastermind a day ago ago

          Why wouldn't you buy it?

          The Twitter files showed direct communications from the administration asking them to ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...

          Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...

          It would be more surprising if they left Google alone.

          • dotnet00 a day ago ago

            The implication of saying they were "pressed" by the Biden admin (as they claim in the letter) is that Google was unwilling. I don't buy that. They were complicit and are now throwing the Biden admin under the bus because it is politically convenient. Just like how the Twitter files showed that Twitter was complicit in it.

            • db48x a day ago ago

              Well of course they’re going to say that they resisted doing the bad thing, even though they still did the bad thing. All it took to get them to do the bad thing was for someone to ask them to do it, but they really resisted as hard as they could, honest.

              • braiamp a day ago ago

                Note that in their letter they carefully avoided mentioning what happened during the Trump 1.0 administration. Their policies started before Biden was president, so this is 100% throwing the Biden admin under the bus.

                • db48x a day ago ago

                  Do you think that they should omit those facts? That they should fail to mention that the Biden administration used them to censor Americans?

                  • dotnet00 a day ago ago

                    The language they're using implies the Biden admin pressured them to censor (which, as pointed out, doesn't make sense because they were doing it before Biden too), rather than just admitting that they were complicit with the Biden admin to do it.

                    • db48x a day ago ago

                      Yea, but we can see through their self-serving language. The fact is they decided on a policy of banning “misinformation” that the Biden administration turned into a censorship machine. One is misguided, the other is a crime.

                      • dotnet00 a day ago ago

                        The 1st amendment doesn't prevent the government from making suggestions to private companies. They aren't allowed to coerce them into censoring things. So it still isn't a crime.

                        What the Biden admin did was not acceptable, and even at the time I got plenty of heat from HN for thinking that it was a sketchy loophole for the government to use, that it was against the spirit of the law.

                        I'm trying to emphasize the distinction because the companies' self-serving language is going to be abused to claim that the current admin - which has just threatened to sue a TV channel for bringing back a show they'd tried to threaten the channel into getting rid of - is actually a defender of free speech.

          • braiamp a day ago ago

            If you read those documents, you will see that the administration was telling them that those accounts were in violation of Twitter TOS. They simply said "hey, this user is violating your TOS, what are you gonna do about it?", and Twitter simply applied their rules.

            • db48x a day ago ago

              That was after they had changed the TOS to make it against the rules to talk about certain topics, such as gain of function research at Chinese labs that was funded by researchers that were themselves funded by the US government.

              • braiamp a day ago ago

                Which is still a debunked theory. Nobody created SARS-CoV-2 for nefarious purposes. The best theory we have is that there was a failure in containment. But people pushing for that theory wanted a conspiracy instead, when plain human failure explains everything.

                • db48x a day ago ago

                  I never said that it was created for nefarious purposes. That was you projecting or creating a straw man to attack.

                  • braiamp a day ago ago

                    What part of "people pushing for that theory wanted a conspiracy instead" was missed? I don't care what you think happened, I just don't want to hear more conspiracy theories. I'm tired of that. We are humans, therefore we are stupidly imperfect creatures. There isn't anything to learn from the event, other than humans gonna human.

                    • db48x a day ago ago

                      Except that a lot of what was banned were _not_ conspiracy theories. The truth is that the NIH _did_ fund gain of function research and that research _was_ conducted at the Wuhan Institute of Virology. Those are the facts that the government worked so hard to suppress our knowledge of. And they were able to use Google’s policies of suppressing “misinformation” to do it for several years.

                    • tbrownaw a day ago ago

                      > There isn't anything to learn about the event, other than humans gonna human.

                      US money wasn't supposed to be used to fund that kind of research. So people violated policy and evaded detection until the leak happened. How? Who? Would different audit controls have helped?

                      There was a cover-up after the fact. Again, how did it work and who was involved? What could have made it less effective?

                      The lab accident itself is the least interesting part, it's all the bureaucratic stuff that really matters. For boring generic bureaucratic-effectiveness reasons, not any "someone tried to do a bioweapon" silliness.

      • tstrimple 9 hours ago ago

        They don't need a paper trail. Conservatives will believe anything damning they see about liberals. Just vague accusations or outright lies work plenty well to keep conservatives foaming at the mouth over imagined issues.

    • frollogaston a day ago ago

      It's been known for years that the White House was pressuring Google on this. One court ordered them to cease temporarily. I wanted to link the article, but it's hard to find because of the breaking news.

  • diego_sandoval a day ago ago

    At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.

    [1] https://www.bbc.com/news/technology-52388586

    • danparsonson a day ago ago

      > the WHO contradicted itself many times during the pandemic

      Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.

      • rogerrogerr a day ago ago

        As super low hanging fruit:

        June 8, 2020: WHO: Data suggests it's "very rare" for coronavirus to spread through asymptomatics [0]

        June 9, 2020: WHO expert backtracks after saying asymptomatic transmission 'very rare' [1]

        0: https://www.axios.com/2020/06/08/who-coronavirus-asymptomati... 1: https://www.theguardian.com/world/2020/jun/09/who-expert-bac...

        Of course, if we just take the most recent thing they said as "revised guidance", I guess it's impossible for them to contradict themselves. Just rapidly re-re-re-revised guidance.

        • margalabargala a day ago ago

          The difference between a contradiction and a revision is the difference between parallel and serial.

          I'm not aware that the WHO ever claimed simultaneously contradictory things.

          Obviously, rapid revisions during a period of emerging data makes YouTube's policy hard to enforce fairly. Do you remove things that were in line with the WHO when they were published? When they were made? Etc

          • brailsafe a day ago ago

            > The difference between a contradiction and a revision is the difference between parallel and serial.

            Eh, ya kind of, but it seems more like the distinction between parallel and concurrent in this case. She doesn't appear to be wrong in that instance while at the same time the models might have indicated otherwise, being an apparent contradiction and apparently both true within the real scope of what could be said about it at that time.

          • zmgsabst a day ago ago

            You’re removing people who were correct before the WHO revised their position.

            • margalabargala a day ago ago

              That is the problem I discuss in my third paragraph, yes.

            • noncoml 15 hours ago ago

              Were they correct because they studied the data or because it just happened that their propaganda for once aligned with the truth?

              A broken clock is right twice a day; that doesn't mean much.

              ps: I don't support the bans, but your argument seems flawed to me.

          • naasking a day ago ago

            A censorship policy that changes daily is a shitty policy. If people on June 8th criticized that official position before they reversed the next day, do you think it was right or a good idea for them to be censored?

            • xracy a day ago ago

              That's a nice hypothetical. Do you have any examples of people getting censored for WHO changing their stance?

              Like, we're getting pretty nuanced here pretty fast, it would be nice to discuss this against an actual example of how this was enforced rather than being upset about a hypothetical situation where we have no idea how it was enforced.

            • margalabargala a day ago ago

              > A censorship policy that changes daily is a shitty policy.

              Yes.

              > If people on June 8th criticized that official position before they reversed the next day, do you think it was right or a good idea for them to be censored?

              Obviously not. Like I pointed out to the other commenter, if you were to read the comment of mine you replied to, I have a whole paragraph discussing that. Not sure why you're asking again.

              • gjsman-1000 a day ago ago

                Screw that; HN needs a place to frame the most incredible takes so we never forget.

                • margalabargala a day ago ago

                  The person I replied to edited their comment after I replied making it look like I was saying the opposite of what I was. Is that what you were referring to?

            • IIAOPSW 20 hours ago ago

              There are only two ways one could have contradicted information from the WHO that was later revised, prior to them revising it. Either:

              1. They really did have some insight or insider knowledge which the WHO missed and they spoke out in contradiction of officialdom in a nuanced and coherent way that we can all judge for ourselves.

              2. They in fact had no idea what they were talking about at the time, still don't, and lucked into some of it being correct later on.

              I refer to Harry Frankfurt's famous essay "On Bullshit". His thesis is that bullshit is neither a lie nor the truth but something different: an indifference to the factuality of one's statements altogether. A bullshit statement is one designed to "sound right" for the context it is used in, but is actually just "the right thing to say" to convince people and/or win something, irrespective of whether it is true or false.

              A bullshit statement is more dangerous than a lie, because the truth coming to light doesn't always expose a bullshitter the way it always exposes a lie. A lie is always false in some way, but bullshit is uncorrelated with truth and can often turn out right. Indeed a bullshitter can get a lucky streak and persist a very long time before anyone notices they are just acting confident about things they don't actually know.

              So in response.

              It is still a good idea to censor the people in category two. Even if the hypothetical person in your example turned out to get something right that the WHO initially got wrong, they were still spreading false information in the sense that they didn't actually know the WHO was wrong at the time when they said it. They were bullshitting. Having a bunch of people spreading a message of "the opposite of what public health officials tell you" is still dangerous and bad, even if sometimes in retrospect that advice turns out good.

              People in category one were few and far between and rarely if ever censored.

              • naasking 14 hours ago ago

                > It is still a good idea to censor the people in category two.

                I disagree on numerous levels with this position, not just on ethical grounds, but also on empirical grounds. People are simply not as gullible as you think they are, but I don't have time to delve into this, so I'll just leave it at that.

                > People in category one were few and far between and rarely if ever censored.

                According to whom? The stated policy makes no such distinction, it says anyone who contradicts WHO positions ought to be censored. There is no nuance, and how exactly is YouTube going to judge who belongs in each category? If they could reliably judge who was bullshitting, they wouldn't need the WHO policy to begin with. The policy is a "cover my ass" blanket so they don't have to deal with the nuance.

                • mrbombastic 6 hours ago ago

                  "People are simply not as gullible as you think they are, but I don't have time to delve into this, so I'll just leave it at that." Well, I for one don't believe you :).

              • Ray20 8 hours ago ago

                > It is still a good idea to censor the people in category two.

                I mean how can you censor the WHO?

                > WHO initially got wrong

                But they didn't get something wrong; they, as you put it, were "bullshitting", and it was obvious to any person with a three-digit IQ.

          • natch a day ago ago

            They would not utter the word Taiwan. That's a huge red flag that they are captured and corrupt. Are you claiming this has changed?

            • margalabargala a day ago ago

              Did you reply to the wrong comment? We're discussing whether the WHO put out simultaneously contradictory information. Whether the WHO's politics matches your preferred politics for southeast Asia doesn't seem topical?

          • dazilcher 20 hours ago ago

            > I'm not aware that the WHO ever claimed simultaneously contradictory things.

            Whether they did or not is almost irrelevant: information doesn't reach humans instantaneously, it takes time to propagate through channels with varying latency, it gets amplified/muted depending on media bias, people generally have things going on in life other than staying glued to new sources, etc.

            If you take a cross sample you're guaranteed to observe contradictory "parallel" information even if the source is serially consistent.

        • danparsonson a day ago ago

          OK and if you said something that you later realised to be wrong, would you be contradicting yourself by correcting it? What should they have done in this situation? People do make mistakes, speak out of turn, say the wrong thing sometimes; I don't think we should criticise someone in that position who subsequently fixes their error. And within a couple of days in this case! That's a good thing. They screwed up and then fixed it. What am I missing here?

          • stinkbeetle a day ago ago

            When you're a global organization who is pushing for the censorship of any dissent or questioning of your proclamations, it's really on you not to say one thing one day then the opposite the next day, isn't it? They could have taken some care to make sure their data and analysis was sound before making these kinds of statements.

            If you posted to YouTube that it is very rare for asymptomatics to spread the disease, would you be banned? What if you posted it on the 9th in the hours between checking their latest guidance and their guidance changing? What if you posted it on the 8th but failed to remove it by the 10th?

            What if you disagreed with their guidance they gave on the 8th and posted something explaining your stance? Would you still get banned if your heresy went unnoticed by YouTube's censors until the 10th at which time it now aligns with WHO's new position? Banned not for spreading misinformation, but for daring to question the secular high priests?

            • danparsonson 20 hours ago ago

              Good lord, refer to my original comment. The person I was replying to claimed the WHO contradicted themselves, I asserted that they did not. All the rest of this is your own addition.

              • stinkbeetle 15 hours ago ago

                They did contradict themselves. The assertion about "new data" isn't credible, they fucked up.

            • pests a day ago ago

              Did the WHO push for censorship or was it YouTube/Google/others?

              It was a novel time and things were changing daily. Care needs to be taken yes, but it’s also weighed against clear and open communication. People were very scared. Thinking they would die. I don’t mind having up-to-date information even if it were changing daily.

              • stinkbeetle a day ago ago

                > Did the WHO push for censorship or was it YouTube/Google/others?

                Quite likely the WHO directly or by proxy with members who are also part of bureaucracy and governments in member states.

                There is no question the WHO loves censorship and take an authoritarian approach to their "authority".

                https://healthpolicy-watch.news/the-world-health-organizatio...

                https://www.theguardian.com/world/2020/nov/13/who-drops-cens...

                If corporations start adopting policies that censor anything contradicting WHO, there would be a larger onus on a claim that they were not involved in that censorship action, in my opinion.

                If it wasn't them, and it was all Google's idea to censor this without any influence from governments or these organizations -- which is quite laughable to think, but let's entertain the idea -- the WHO still should not have responded as it did with these knee-jerk reactions, and it should have been up to Google to ensure they did not use as their "source of truth" an organization that behaved in that way.

                > It was a novel time

                It wasn't really that novel since there have been centuries to study pandemics and transmissible diseases of all kinds, and there have even been many others of slightly less scale happen.

                > and things were changing daily.

                Things always change daily. Covid was not particularly "fast moving" at the time. It's not like new data was coming in that suddenly changed things day to day. It just progressed over the course of months and years. It appeared to be wild and fast moving and ever changing mainly because of the headless-chicken response from organizations like this.

                > Care needs to be taken yes, but it’s also weighed against clear and open communication. People were very scared. Thinking they would die.

                People were very scared because of the fear campaign, and the imbecilic and contradictory responses from these organizations.

                Not that it was nothing to be afraid of, but people should have calmly been given data and advice and that's it. Automobiles, heart attacks, and cancer kill lots of people too, and should be taken very seriously and measures taken to reduce risk but even so it would be stupid to start screaming about them and cause panic.

                > I don’t mind having up-to-date information even if it were changing daily.

                It's not having data that is the problem, it is jumping the gun with analysis and findings and recommendations based on that data, then having to retract it immediately and say the opposite.

                • Jensson 19 hours ago ago

                  > If it wasn't them and it was all Google's idea to censor this without any influence from governments or these organizations

                  We actually have the emails the Biden administration sent to YouTube; here is a quote from one:

                    "we want to be sure that you have a handle on vaccine hesitancy generally and are working toward making the problem better. This is a concern that is shared at the highest (and I mean highest) levels of the White House"
                  
                  That is a very clear threat: "we want to be sure that you ...", and then saying this concern is held at the highest levels of the White House, so better get working on what we want.

                  There are hundreds of such emails detailed in this report if you want to read what they sent to the different tech companies to make them so scared that they banned anything related to Covid: https://judiciary.house.gov/media/press-releases/weaponizati...

          • f33d5173 a day ago ago

            Them correcting themselves isn't a bad thing. The point is that it would be absolutely retarded to require that people never disagree with the WHO. Please try and follow the thread of the conversation and not take it down these pointless tangents.

            • danparsonson 20 hours ago ago

              No, the point (and my original reply) is that correcting themselves is not the same as contradicting themselves. I didn't say anything about never disagreeing with them, and it's not a tangent, I'm replying to replies to my original comment.

              • f33d5173 10 hours ago ago

                You're arguing over language, which is arbitrary. We can choose to define the words however we like. Utterly pointless

        • brookst a day ago ago

          Is there a difference between an expert opinion in the midst of a pandemic and an organizational recommendation?

          • rogerrogerr a day ago ago

            Sure seemed like you'd get kicked off YouTube equally fast for questioning either one.

            • thinkingtoilet a day ago ago

              Oh stop it. There was rampant misinformation on youtube all through out the pandemic.

              • rogerrogerr a day ago ago

                Like that the novel coronavirus first seen in Wuhan may have come from the Wuhan Novel Coronavirus Lab?

                Yeah, that was banished to the dark corners of Reddit until Jon Stewart said the obvious, and he was considered too big to censor.

                • Dylan16807 a day ago ago

                  No. People were talking about it all over.

                • thinkingtoilet 15 hours ago ago

                  Put your money where your mouth is. How much do you want to bet I can find 20 videos today talking about the lab theory released in 2020 and 2021? I will bet you literally any amount of money. People like you are so insufferable. You know what you are saying is factually wrong.

              • mensetmanusman a day ago ago

                And unfortunately much of it was spread by official institutions like the WHO.

                • pylotlight a day ago ago

                  and the governments, all of which were bought and paid for by...... big pharma. This comment was brought to you by Pfizer

        • 1oooqooq a day ago ago

          they also changed the symptom definitions, so ...

          • danparsonson a day ago ago

            So as researchers learned more about COVID the WHO should've just ignored any new findings and stuck to their initial guidance? This is absurd.

          • a day ago ago
            [deleted]
      • wdr1 a day ago ago

        > Did they?

        They said it was a fact that COVID is NOT airborne. (It is.)

        Not they believed it wasn't airborne.

        Not that data was early but indicated it wasn't airborne.

        That it was fact.

        In fact, they published fact checks on social media asserting that position. Here is one example on the official WHO Facebook page:

        https://www.facebook.com/WHO/posts/3019704278074935/?locale=...

        • danparsonson 20 hours ago ago

          None of that argues that they contradicted themselves. You and several others have just hijacked this thread to pile on the WHO.

          Argue that they were incompetent in their handling of it, sure, whatever. That's not the comment you're replying to.

      • Manuel_D 11 hours ago ago

        Some WHO reports were suggesting that lockdowns do more harm than good as early as late 2020.

    • kevin_thibedeau a day ago ago

      Don't forget that they ban-hammered anyone who advanced the lab leak theory because a global entity was pulling the strings at the WHO. I first heard about Wuhan in January of 2020 from multiple Chinese nationals who were talking about the leak story they were seeing in uncensored Chinese media and adamant that the state media story was BS. As soon as it blew up by March, Western media was manipulated into playing the bigotry angle to suppress any discussion of what may have happened.

      • zeven7 a day ago ago

        I believe having Trump as president exacerbated many, many things during that time, and this is one example. He was quick to start blaming the "Chinese", he tried to turn it into a reason to dislike China and Chinese people, because he doesn't like China, and he's always thinking in terms of who he likes and dislikes. This made it hard to talk about the lab leak hypothesis without sounding like you were following Trump in that. If we had had a more normal president, I don't think this and other issues would have been as polarized, and taking nuanced stances would have been more affordable.

      • 9 hours ago ago
        [deleted]
      • amanaplanacanal a day ago ago

        My memory is that the "lab leak" stuff I saw back then was all conspiracy theories about how it was a Chinese bioweapon.

        Eventually I started seeing some serious discussion about how it might have been accidentally created through gain of function research.

        • api a day ago ago

          I’m undecided on the issue, but… if I were trying to cover up an accidental lab leak I’d spread a story that it was a giant conspiracy to create a bio weapon. For extra eye rolls I’d throw in some classic foil hat tropes like the New World Order or the International Bankers.

          If it was a lab leak, by far the most likely explanation is that someone pricked themselves or caught a whiff of something.

          A friend of mine who lived in China for a while and is familiar with the hustle culture there had his own hypothesis. Some low level techs who were being given these bats and other lab animals to euthanize and incinerate were like “wait… we could get some money for these over at the wet market!”

        • naasking a day ago ago

          > My memory is that the "lab leak" stuff I saw back then was all conspiracy theories about how it was a Chinese bioweapon.

          No, that was just the straw man circulated in your echo chamber to dismiss discussion. To be clear, there were absolutely people who believed that, but the decision to elevate the nonsense over the serious discussion is how partisan echo chambers work.

          • gusgus01 a day ago ago

            That was one of the main arguments by some of my coworkers and friends when COVID came up socially. I specifically remember a coworker at a FAANG saying something along the lines of "It's a bioweapon, so it's basically an act of war".

      • potsandpans a day ago ago

        I called this out in this thread and was immediately downvoted

      • McGlockenshire a day ago ago

        > because a global entity was pulling the strings at the WHO

        excuse me I'm sorry what?

      • cmilton a day ago ago

        Because that is a bold claim to make. There is no proof of a lab leak and evidence leads to the wet market as the source. There is a debate out there for 100k to prove this. Check it out.

        • pton_xd a day ago ago

          > Because that is a bold claim to make. There is no proof of a lab leak and evidence leads to the wet market as the source.

          A novel coronavirus outbreak happens at the exact location as a lab performing gain of function research on coronaviruses... but yeah, suggesting a lab leak is outlandish, offensive even, and you should be censored for even mentioning that as a possibility. Got it.

          This line of thinking didn't make sense then and still doesn't make sense now.

          • j_w 14 hours ago ago

            But the first cases were all linked to a wet market far enough from the lab that it would be highly improbable for the cases to come from the lab itself.

            • kevin_thibedeau 11 hours ago ago

              Their containment protocols are known to be lax. A staffer could have been a vector for the initial transmission. Remember they tried to pin it on US military personnel early on. The CCP wants the market to be the focus of attention and we'll never get believable evidence that suggests otherwise.

          • aeternum 21 hours ago ago

            Yes, Jon Stewart really nailed this point, it's a great clip and worth the re-watch.

          • cmilton 14 hours ago ago

            Correlation does not equal causation, my friend.

            Plenty of people were able to talk about lab leak conspiracies. That is why we are still debating it today.

        • mayama a day ago ago

          > There is no proof of a lab leak and evidence leads to the wet market as the source

          Because the WHO worked with the CPC to bury evidence and give a clean chit to the Wuhan lab. There was some pressure building then for international teams to visit the Wuhan lab and examine data transparently. But with the thorough ban of the lab leak theory, the WHO visited China and gave a clean chit without even visiting the Wuhan lab or having access to lab records. The only place that could prove this definitively buried all records.

        • themaninthedark a day ago ago

          It is not as cut and dried as you think. Also, it is rather hard to get any evidence when you aren't allowed to visit the "scene of the crime", so to speak, and all data is being withheld.

          https://www.nytimes.com/interactive/2024/06/03/opinion/covid...

          Even Dr Fauci said in 2021 he was "not convinced" the virus originated naturally. That was a shift from a year earlier, when he thought it most likely Covid had spread from animals to humans.

          https://www.deseret.com/coronavirus/2021/5/24/22451233/coron...

          (..February 2023..) The Department of Energy, which oversees a network of 17 U.S. laboratories, concluded with “low confidence” that SARS-CoV-2 most likely arose from a laboratory incident. The Federal Bureau of Investigation said it favored the laboratory theory with “moderate” confidence. Four other agencies, along with a national intelligence panel, still judge that SARS-CoV-2 emerged from natural zoonotic spillover, while two remain undecided.

          https://www.nejm.org/doi/full/10.1056/NEJMp2305081

          WHO says that "While most available and accessible published scientific evidence supports hypothesis #1, zoonotic transmission from animals, possibly from bats or an intermediate host to humans, SAGO is not currently able to conclude exactly when, where and how SARS-CoV-2 first entered the human population."

          However "Without information to fully assess the nature of the work on coronaviruses in Wuhan laboratories, nor information about the conditions under which this work was done, it is not possible for SAGO to assess whether the first human infection(s) may have resulted due to a research related event or breach in laboratory biosafety."

          https://www.who.int/news/item/27-06-2025-who-scientific-advi...

          WHO paraphrased: We have no data at all about the Wuhan Laboratory so we can not make a conclusion on that hypothesis. Since we have data relating to natural transmission from animals we can say that situation was possible.

        • mensetmanusman a day ago ago

          It’s not a bold claim. The Fauci emails showed he and others were discussing this as a reasonable possibility.

        • natch a day ago ago

          But there is no proof of any real wet lab connection and evidence points to the lab as a source.

        • potsandpans a day ago ago

          The topic at hand is not whether it's a bold claim to make. The question is: should organizations that control a large portion of the world's communication channels have the ability to unilaterally define the tone and timbre of a dialog surrounding current events?

          To the people zealously downvoting all of these replies: defend yourselves. What about this is not worthy of conversation?

          I'm not saying that I support lab leak. The observation is that anyone that discussed the lab leak hypothesis on social media had content removed and potentially were banned. I am fundamentally against that.

          If the more general claim is that sentiments should be censored when they can risk people's lives by influencing the decisions they make, then let me ask you this:

          Should Charlie Kirk have been censored? If he were, he wouldn't have been assassinated.

          • blooalien 20 hours ago ago

            > "Should Charlie Kirk have been censored? If he were, he wouldn't have been assassinated."

            On the other hand, if he were, then whoever censored him might have just as easily become the target of some other crazy, because that appears to be the world we live in now. Something's gotta change. This whole "us vs them" situation is just agitating the most extreme folks right over the edge of sanity into "Crazy Town". Wish we could get back to bein' that whole "One Nation Under God" "Great Melting Pot" "United States" they used to blather on about in grade-school back in the day, but that ship appears to have done sailed and then promptly sunk to the bottom... :(

        • naasking a day ago ago

          > Because that is a bold claim to make. There is no proof of a lab leak and evidence leads to the wet market as the source.

          It was not a bold claim at the time. Not only was there no evidence that it was the wet market at the time, the joint probability of a bat coronavirus outbreak where there were few bat caves but where they were doing research on bat coronaviruses is pretty damning. Suppressing discussion of this very reasonable observation was beyond dumb.

          • tbrownaw a day ago ago

            > Suppressing discussion of this very reasonable observation was beyond dumb.

            I thought it wasn't so much an error as a conflict of interest.

    • hyperhopper a day ago ago

      The United States also said not to buy masks and that they were ineffective during the pandemic.

      Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance

      • anonymousiam a day ago ago

        Fauci was trying to prevent a run on masks, which he believed were needed by health care workers. So he probably justified the lie to himself because it was for the "greater good" ("the ends justify the means" is not my view, BTW).

        It turns out that masks ARE largely ineffective at preventing CoViD infection. It's amazing how many studies have come up with vastly different results.

        https://egc.yale.edu/research/largest-study-masks-and-covid-...

        (Before you tell me that the story I cited above says the opposite, look at the effectiveness percentages they claim for each case.)

        There's also this: https://x.com/RandPaul/status/1970565993169588579

        • iyn 19 hours ago ago

          Actual (N95/FFP2/FFP3) masks DO work; your comment is misleading. The study you linked says:

          > Colored masks of various construction were handed out free of charge, accompanied by a range of mask-wearing promotional activities inspired by marketing research

          "of various construction" is... not very specific.

          If you just try to cover your face with a piece of cloth, it won't work well. But if you use a good mask (N95/FFP2/FFP3) with a proper fit [0], then you can decrease the chance of being infected (see e.g. [1]).

          [0] https://www.mpg.de/17916867/coronavirus-masks-risk-protectio...

          [1] https://www.cam.ac.uk/research/news/upgrading-ppe-for-staff-...

        • dotnet00 a day ago ago

          They claim a 5% reduction in spread with cloth masks and a 12% reduction with surgical masks. I think 1 less case out of every 10 or 20 is pretty acceptable?

          Especially at the time when many countries were having their healthcare systems overloaded by cases.

        • pixxel 20 hours ago ago

          [dead]

        • lisbbb a day ago ago

          I didn't want to be the one to have to say it, but neither masks nor social distancing had any scientific backing at all. It was all made up, completely made up. The saddest thing I see all the time is the poor souls STILL wearing masks in 2025 for no reason. I don't care how immunocompromised they are, the mask isn't doing anything to prevent viral infection at all. They might help against pollen. I also can't believe how many doctors and nurses at my wife's cancer clinic wear masks all the damn time even though they are not in a surgical environment. It's all been foisted upon them by the management of those clinics, and the management is completely insane, and nobody speaks up about it because it's their job if they do, so the insanity just keeps rolling on and on and it is utterly dehumanizing and demoralizing. If a cancer patient wants to wear a mask because it affords them some tiny comfort, then fine, but that is purely psychological. I've seen it over and over and over because I've been at numerous hospitals this past year trying to help my wife survive a cancer that I think Pfizer may be to blame for.

          • jbm a day ago ago

            I'm sorry about your wife.

            There was a scientific basis for N95 and similar masks. If you are talking about cloth and paper masks, I mostly agree. Even then, there were tests of surgical masks fitted with 3D-printed frames; I remember this as one example of people following that line of thinking.

            https://www.concordia.ca/news/stories/2021/07/26/surgical-ma...

            As for dehumanization, I used to live in Tokyo and spent years riding the train. I think blaming masks for dehumanization when we have entire systems ragebaiting us on a daily basis is like blaming the LED light for your electric bill.

            Social Distancing having "no scientific backing" is very difficult to respond to. Do you mean in terms of long term reduction of spread, or as a temporary measure to prevent overwhelming the hospitals (which is what the concern was at the time)?

            I do agree that it was fundamentally dishonest to block people from going to church and then telling other people it was OK to protest (because somehow these protests were "socially distanced" and outdoors). They could have applied the same logic to Church groups and helped them find places to congregate, but it was clearly a case of having sympathy for the in-group vs the out-group.

          • D-Machine a day ago ago

            Basically, yes. However, if we make a distinction between respirators (e.g. N95 mask) and masks (including "surgical" masks, which don't really have a meaningfully better FFE than cloth masks), then at least respirators offer some protection to the wearer, provided they also still minimize contact. But, in keeping with this distinction, yes, masks were never seriously scientifically supported. It is incredibly disheartening to see mask mandates still in cancer wards, despite these being mandates for (objectively useless) cloth/surgical masks.

          • iyn 18 hours ago ago

            > I didn't want to be the one to have to say it, but neither masks nor social distancing had any scientific backing at all.

            This is false. Even a quick search shows multiple papers from pre-COVID times that show masks being effective [0][1]. There are many more post-COVID studies showing that N95/FFP2/FFP3 masks actually work if you wear them correctly (most people don't know how to do this). Educate yourself before sharing lies.

            [0] https://pubmed.ncbi.nlm.nih.gov/21477136/

            [1] https://pubmed.ncbi.nlm.nih.gov/19652172/

      • amanaplanacanal a day ago ago

        Yeah they burned a lot of trust with that, for sure.

        • lisbbb a day ago ago

          They burned it beyond down to the ground and below. And many of you on here willfully continue to trust them and argue vehemently against people who try to tell you the actual truth of the matter. RFK Jr. is a flawed human being, but he's doing some good work in unwinding some of the web of lies we live under right now.

          • aeternum 21 hours ago ago

            It's good RFK is more willing to question things but he seems just as guilty when it comes to spinning webs of lies.

            If we think Tylenol might cause autism, why doesn't he run/fund a nice, clean, and large randomized controlled trial? Instead he spreads conjecture based on papers with extremely weak evidence.

          • alphabettsy a day ago ago

            He’s just bringing different lies with new sponsors.

      • dakial1 13 hours ago ago

        I think the problem is that some people apparently discovered there is a profitable business model in spreading misinformation, so a trusted (even if not always right), non-malicious reference source of information might be needed.

        But who watches the watchmen?

      • a day ago ago
        [deleted]
    • sterlind a day ago ago

      it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.

      misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.

      IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.

      • YZF a day ago ago

        This just seems incredibly difficult. Even between people who are highly intelligent, educated, and consider themselves to be critical thinkers there can be a huge divergence of what "truth" is on many topics. Most people have no tools to evaluate various claims and it's not something you can just "teach kids". Not saying education can't move the needle but the forces we're fighting need a lot more than that.

        I think some accountability for platforms is an important part of this. Platforms right now have the wrong incentives, we need to fix this. It's not just about "truth" but it's also about stealing our attention and time. It's a drug and we should regulate it like the drug it is.

      • adiabatichottub a day ago ago

        As I recall from my school days, in Social Studies class there were a set of "Critical Thinking" questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.

        • tbrownaw a day ago ago

          I'd expect questions with that label to have the sort of answers that are a pain to grade.

        • cultofmetatron a day ago ago

          [flagged]

          • Jensson a day ago ago

            You do realize Arabs also massacred a lot of Jews at the same time? Both sides were absolutely abhorrent at the time; it was a war between quickly assembled militias and civilians fighting for survival, and that is never going to end well.

            Example of Arabs lynching Jews, they started killing each other before the partition happened, so everyone knew it would be all out war after the British left:

            > Arab workers stormed the refinery armed with tools and metal rods, beating 39[d] Jewish workers to death and wounding 49.

            https://en.wikipedia.org/wiki/Haifa_Oil_Refinery_massacre

            • margalabargala a day ago ago

              If you're going to take any conflict in the Middle East up to and including present day, and go back and forth in time to see which group "started it", you'll run out of written record first. There's always an earlier counterexample atrocity that the other side did.

            • SamBam a day ago ago

              > Both sides were absolutely abhorrent

              ...and I think OP's point was that they were only showing one side.

              • a day ago ago
                [deleted]
              • cultofmetatron a day ago ago

                > I think OP's point was that they were only showing one side.

                exactly my point

                • Jensson a day ago ago

                  Did they really bring up how the Arabs massacred the Israelis during the partitioning when they talked about the Six-Day War? If that was what you meant you should have said so, but you didn't, so I don't believe it; to me it looks like you just wanted them to talk about Israeli atrocities and not what the Arabs did to them before.

                  If you meant they should have brought up the preceding conflict when talking about the Six-Day War, you wouldn't have mentioned only what Israel did there.

                  So the only way I can read your post is that you wanted the coverage to be one-sided. But maybe you were just unclear and meant "I wanted to hear about how the conflicts escalated, and Arabs massacred Jews, which led to Jews massacring Arabs, and then repeat in an ever-increasing spiral of violence"; if so, can you please clarify that that is what you meant and say you should have been clearer in your original comment?

                  • cultofmetatron a day ago ago

                    My point is that the coverage I received in high school civics WAS one-sided, in which Jews were shown as heroes fighting to create a country against all odds, and all the countries around it wanted to destroy them because they were Jews.

                    The very institutions we created to educate us against propaganda have themselves been used to instill propaganda. In this case, the very real Islamophobia and dehumanization of Arabs that runs at the core of American culture.

                    • Jensson 19 hours ago ago

                      OK, you could have made that clearer. I didn't even consider the possibility that you didn't know the Jews did bad things, so it sounded to me like you just wanted them to say bad things about Jews in this lecture, rather than you feeling deceived.

                      I grew up in Sweden in the 90s, and the debate was already raging here then due to significant Muslim immigration, so there were always people who brought up all the bad things the Jews did, and I couldn't imagine an adult who cared about the topic not knowing.

                      • cultofmetatron 9 hours ago ago

                        >I didn't even consider the possibility that you didn't know the Jews did bad things when you heard that

                        So, to give you context (and I went to a good school):

                        In my American education, we spent 2 years covering in great detail the horrors of the Holocaust, along with meeting survivors.

                        We spent exactly 0 days studying or learning about the Khmer Rouge, the Vietnam War, the Korean War, the Congolese genocide, the Rwandan genocide, or the Armenian genocide.

                        I literally didn't learn about any of these until I went to university and was told about this stuff by peers who were more educated on these things. These topics were just not taught. And the idea that Jews might be the aggressors and not innocent victims? I'm pretty sure you would have been politically targeted for simply suggesting the idea.

                        So, to drive the point home: most Americans really have no idea.

      • blooalien 20 hours ago ago

        > "IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons."

        You just described a perfectly normal "Civics & Current Events" class in early grade-school back when / where I grew up. We were also taught how to "follow the facts back to the actual sources" and other such proper research skills. This was way back when you had to go to an actual library and look up archived newspapers on microfiche, and encyclopedias were large collections of paper books. Y'know... When dinosaurs still roamed the streets... ;)

      • Aurornis a day ago ago

        > IMO we need to teach kids how to identify misinformation in school.

        This is extremely difficult. Many of the people who thrive on disinformation are drawn to it because they are contrarian. They distrust anything from the establishment and automatically trust anything that appears anti-establishment. If you tell them not to trust certain sources that’s actually a cue to them to explore those sources more and assume they’re holding some valuable information that “they” don’t want you to know.

        The dynamics of this are very strange. A cluster of younger guys I know can list a dozen different times medical guidance was wrong in history from memory (Thalidomide, etc), but when you fact check Joe Rogan they laugh at you because he’s a comedian so you can’t expect him to be right about everything. “Do your own research” is the key phrase, which is a dog whistle to mean find some info to discount the professionals but then take sources like Joe Rogan and his guests at face value because they’re not the establishment.

      • tjpnz a day ago ago

        Some of the worst examples of viral misinformation I've encountered were image posts on social media. They'll often include a graph, a bit of text, and links to dense articles from medical journals. Most people will give up at that point and assume it's legit because the citations point to the BMJ et al. You actually need to type those URLs into a browser by hand and, assuming they go anywhere, apply knowledge taught in university-level stats courses.

        I spent several hours on one of these only to discover the author of the post had found a subtle way to misrepresent the findings and had done things to the graph to skew it further. You cannot expect a kid (let alone most adults) to come to the same conclusion through lessons on critical thinking.

  • lesuorac a day ago ago

    2 years is a pretty long ban for conduct that isn't even illegal.

    Although if they got banned during the start of covid during the Trump administration then we're talking about 5 years.

    • asadotzler a day ago ago

      No one owes them any distribution at all.

      • zug_zug a day ago ago

        Absolutely. Especially when those election deniers become insurrectionists.

      • beeflet a day ago ago

        that is a two-way street

    • jackmottatx a day ago ago

      [dead]

    • Simulacra a day ago ago

      They went against a government narrative. This wasn't so much Google/YouTube banning as the government ordering private companies to do so.

      • JumpCrisscross a day ago ago

        > wasn't Google/Youtube banning so much as government ordering private companies to do so

        No, it was not. It’s particularly silly to suggest this when we have a live example of such orders right now.

        The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a “bully pulpit.” But there were no orders, no credible threats, and plenty of companies didn’t deplatform these folks.

        • EasyMark a day ago ago

          That's what I told my MAGA friends. Biden recommended stuff, Trump threatens stuff. So far only one of them has followed through with action. Trump has threatened business deals and prosecution, and is currently sending government after his opponents with the DoJ. Yet those same people are as quiet as mice now on "government bullying"

        • spullara a day ago ago

          They literally had access to JIRA at Twitter so they could file tickets against accounts.

          • JumpCrisscross a day ago ago

            > literally had access to JIRA at Twitter so they could file tickets against accounts

            I’m not disputing that they coördinated. I’m challenging that they were coerced.

            We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends the “government ordering private companies” around. (Or, say, Florida opening their criminal justice records to ICE the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.

          • unethical_ban a day ago ago

            Do you think no nefarious nation state actors are on social media spinning disinformation?

            • EasyMark a day ago ago

              It's extremely obvious on Twitter: blue-check accounts that post every few minutes, 24/7, with profiles that say stuff like "true believer, wife, lover, seeker-of-truth. Don't DM me, I don't answer". They are on there in the hundreds of thousands.

        • nailer 6 hours ago ago

          Zuckerberg mentioned that Meta was getting calls from government employees who were absolutely furious, and that when Meta didn’t take down legal speech the administration did not approve of, an investigation was immediately launched into Meta that he considers retaliatory.

          https://apnews.com/article/meta-platforms-mark-zuckerberg-bi...

          https://open.spotify.com/episode/3kDr0LcmqOHOz3mBHMdDuV?si=j...

        • starik36 a day ago ago

          That was certainly the case with Twitter. It came out during the congressional hearings. FBI had a direct line to the decision makers.

          • JumpCrisscross a day ago ago

            > was certainly the case with Twitter

            It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.

            The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.

          • brokencode a day ago ago

            A direct line to threaten decision makers? Or to point out possible misinformation spreaders?

            • starik36 a day ago ago

              Threaten. Because of the implication.

              • brokencode 21 hours ago ago

                Do you think Elon would ever shut up about it if he was getting threatened by Biden and the FBI?

                • starik36 9 hours ago ago

                  It was before Elon.

              • JumpCrisscross a day ago ago

                > Because of the implication

                What was the implication? Twitter had no business in front of the federal government. They were wilfully complying.

                That doesn't make it okay. But it's a total retconning of actual history to suggest this was government censorship in any form.

              • dotnet00 a day ago ago

                Musk owned Twitter for years of the Biden admin; at least one of those years he spent openly simping for Trump.

                So... what sort of threat was this, that suddenly disappeared when Musk bought it? How credible was the threat if Musk was able to release the Twitter Files without repercussions from the Biden admin?

                • starik36 8 hours ago ago

                  The Twitter files happened before Elon. He stopped it once he got in.

                  And the implied repercussions were for the employees in charge of removing content, not for the head honchos.

                  • dotnet00 7 hours ago ago

                    What were those repercussions? Besides just being fired for not doing what their employer wanted (indicating complicity from the company rather than pressure from the government)?

      • LeafItAlone a day ago ago

        And do you think the impetus behind this action happening now is any different? In both cases YouTube is just doing what the government wants.

      • stronglikedan a day ago ago

        [flagged]

        • 3cKU a day ago ago

          [flagged]

  • system7rocks a day ago ago

    We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.

    Silver lining in this is the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in crisis - and needing to pressure or ask more of private companies to do that. But also like that we can reflect back and go - maybe that didn’t work like what we wanted or maybe it was heavy-handed.

    In many governments, the government can do no wrong. There are no checks and balances.

    The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.

    But hopefully we will still have a system that can have room for critique in the years to come.

    • electriclove a day ago ago

      It is scary how close we were to not being able to continue the conversation.

      • doom2 an hour ago ago

        If anything, I think we're even closer. It feels like the current administration is stifling speech more than ever. It's open season on people who don't proudly wave the flag or correctly mourn Charlie Kirk. People who dare speak against Israel are being doxxed and in some cases hounded out of their jobs. Books are being taken off library shelves on the whim of a very few community members with objections. And all of it is getting a giant stamp of approval from the White House.

    • type0 a day ago ago

      > Is our current White House administration a champion of free speech? Hardly.

      So after January 22, 2026, the US leaves the WHO and YouTube users will be able to contradict WHO recommendations.

  • whinvik 21 hours ago ago

    It's odd. People on HN routinely complain about how Stripe or PayPal or some other entity banned them unfairly, and the overwhelming sentiment is that it was indeed unfair.

    But when it comes to this thread, the sentiment is mostly that banning is good and that we should trust Google to make the right choice.

    • 21 hours ago ago
      [deleted]
    • squigz 17 hours ago ago

      Like the other commenter says, HN isn't a hive mind and doesn't always agree on things.

      More than that... different situations usually require different conclusions.

    • seivan 19 hours ago ago

      [dead]

  • breadwinner a day ago ago

    I think it would be wise to listen to Nobel Prize-winning journalist Maria Ressa of The Philippines, regarding unchecked social media.

    "You and I, if we say a lie we are held responsible for it, so people can trust us. Well, Facebook made a system where the lies repeated so often that people can't tell."

    "Both United Nations and Meta came to the same conclusion, which is that this platform Facebook actually enabled genocide that happened in Myanmar. Think about it as, when you say it a million times... it is not just the lie but also it is laced with fear, anger and hate. This is what was prioritized in the design and the distribution on Facebook. It keeps us scrolling, but in countries like Myanmar, in countries like Philippines, in countries where institutions are weak, you saw that online violence became real world violence."

    "Fear, anger, hate, lies, salaciousness, this is the worst of human nature... and I think that's what Big Tech has been able to do through social media... the incentive structure is for the worst of who we are because you keep scrolling, and the longer you keep scrolling the more money the platform makes."

    "Without a shared reality, without facts, how can you have a democracy that works?"

    https://www.cnn.com/2025/01/12/us/video/gps0112-meta-scraps-...

    • themaninthedark a day ago ago

      "Beware of he who would deny you access to information for in his heart he dreams himself your master." - Commissioner Pravin Lal, U.N. Declaration of Rights

      • ethbr1 a day ago ago

        Full quote: "As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he deems himself your master."

        (Alpha Centauri, 1999, https://civilization.fandom.com/wiki/The_Planetary_Datalinks... )

        • HocusLocus a day ago ago

          "I sit here in my cubicle, here on the motherworld. When I die, they will put my body in a box and dispose of it in the cold ground. And in the million ages to come, I will never breathe, or laugh, or twitch again. So won't you run and play with me here among the teeming mass of humanity? The universe has spared us this moment."

          ~Anonymous, Datalinks.

        • 01HNNWZ0MV43FF 21 hours ago ago

          Anyway this video about Biden drinking the blood of Christian children is brought to you by Alpha Testerone 2 Supplements, now FDA-approved

          • Yeul 18 hours ago ago

            Haha this is why I stopped using YT and Twitter.

            I'm just not interested in the batshit insane ramblings of Americans. The US can spiral downwards into its own Christ fascist dictatorship but there's no reason for me to join them.

            • kelvinjps 14 hours ago ago

              You can watch YouTube without watching any channels from an American person, what do you mean?

              • fakedang 5 hours ago ago

                Weird American ads from crazy American Christians convinced about the rapture.

          • dzhiurgis 17 hours ago ago

            That's why you buy a $20,000 GPU for local inference for your AI ad-blocker, geez.

            Orrrrr you pay $20 per month to either left or right wing one on the cloud.

      • tensor a day ago ago

        There is a difference between free flow of information and propaganda. Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.

        I think you could make a reasonable argument that the algorithms that distort social media feeds actually impede the free flow of information.

        • AnthonyMouse a day ago ago

          > Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.

          The fundamental problem here is exactly that.

          We could have social media that no central entity controls, i.e. it works like the web and RSS instead of like Facebook. There are a billion feeds, every single account is a feed, but you subscribe to thousands of them at most. And then, most importantly, those feeds you subscribe to get sorted on the client.

          Which means there are no ads, because nobody really wants ads, so their user agent doesn't show them any. Ads are the source of the existing incentive for the monopolist in control of the feed to fill it with rage bait, so that incentive goes away too.

          The cost is that you either need a P2P system that actually works or people who want to post a normal amount of stuff to social media need to pay $5 for hosting (compare this to what people currently pay for phone service). But maybe that's worth it.

          • nobody9999 10 hours ago ago

            >We could have social media that no central entity controls, i.e. it works like the web and RSS instead of like Facebook. There are a billion feeds, every single account is a feed, but you subscribe to thousands of them at most. And then, most importantly, those feeds you subscribe to get sorted on the client.

            The Fediverse[1] with ActivityPub[0]?

            [0] https://activitypub.rocks/

            [1] https://fediverse.party/

        • nradov a day ago ago

          There is no generally accepted definition of propaganda. One person's propaganda is another person's accurate information. I don't trust politicians or social media employees to make that distinction.

          • kelvinjps 13 hours ago ago

            There are definitely videos that are propaganda.

            Like those low-quality AI videos about Trump or Biden saying things that never happened. Anyone with critical thinking knows those are either propaganda or engagement farming.

            • lupusreal 13 hours ago ago

              Or they're just humorous videos meant to entertain and not be taken seriously. Or they are meant to poke fun of the politician, e.g. clearly politically motivated speech, literally propaganda, but aren't meant to be taken as authentic recordings and deception isn't the intent.

              Sometimes it's clearly one and not the other, but it isn't always clear.

              • _DeadFred_ 8 hours ago ago

                'I'm just a comedian guys' interviewing presidential candidates, spouting how we shouldn't be in Ukraine, then the second they get any pushback 'I'm just a comedian'. It's total bullshit. They are trying to influence, not get a laugh.

                • _DeadFred_ 5 hours ago ago

                  Downvoted... yet here is the Vice President claiming that the FCC Commissioner saying 'we can do this the hard way or the easy way' regarding censoring Jimmy Kimmel was 'just telling a joke':

                  https://bsky.app/profile/atrupar.com/post/3lzm3z3byos2d

                  You 'it's just comedy' guys are so full of it. The FCC Head attacking free media in the United States isn't 'just telling jokes'.

          • tensor a day ago ago

            What you think is propaganda is irrelevant. When you let people unnaturally amplify information by paying to have it forced into someone’s feed that is distorting the free flow of information.

            Employees choose what you see every day you use most social media.

            • msandford a day ago ago

              Congrats! You are 99% of the way to understanding it. Now you just have to realize that "whoever is in charge" might or might not have your best interests at heart, government or private.

              Anyone who has the power to deny you information absolutely has more power than those who can swamp out good information with bad. It's a subtle difference yes, but it's real.

              • tensor a day ago ago

                Banning algorithms and paid amplification is not denying you information. You can still decide for yourself who to follow, or actively look for information, actively listen to people. The difference is that it becomes your choice.

                • vintermann a day ago ago

                  Well, this is about bringing back creators banned for (in YouTube's eyes) unwarranted beliefs stemming from distrust of political or medical authorities, and promoting such distrust. They weren't banned because of paid amplification.

                  I don't quite understand how the Ressa quote at the beginning of this thread justifies banning dissent for being too extreme. The algorithms are surely on YouTube's and Facebook's (and Ressa's!) side here; I'm sure they tried to downrank distrust-promoting content as much as they dared and were able to, limited by e.g. local language capabilities and their users' active attempts to avoid automatic suppression, something everyone does these days.

                • mensetmanusman 8 hours ago ago

                  Just regulate the algorithm market. Let people see, decide, share, compare

                  • nradov 5 hours ago ago

                    What is the "algorithm market"? Where can I buy one algorithm?

            • vintermann 17 hours ago ago

              OK, but that's an argument against advertising, and maybe against dishonest manipulation of ranking systems.

              It's not an argument for banning doctors from YouTube for having the wrong opinions on public health policy.

            • dzhiurgis 17 hours ago ago

              > distorting the free flow of information

              There is no free flow of information. There never was. YouTube and FB and Google saying "oh, it's the algorithm" is complete BS. It was always manipulated, boosting whomever they saw fit.

          • refurb a day ago ago

            And propaganda by definition isn’t false information. Propaganda can be factual as well.

          • fellowniusmonk a day ago ago

            So many people have just given up on the very idea of coherent reality? Of correspondence? Of grounding?

            Why? No one actually lives like that when you watch their behavior in the real world.

            It's not even post modernism, it's straight up nihilism masquerading as whatever is trendy to say online.

            These people accuse everyone of bias while ignoring that their own position comes from a place of such extreme bias that it irrationally, presuppositionally rejects the possibility of true facts in their chosen, arbitrary cut-outs. It's special pleading as a lifestyle.

            It's very easy to observe, model, simulate, any node based computer networks that allow for coherent and well formed data with high correspondence, and very easy to see networks destroyed by noise and data drift.

            We have this empirically observed in real networks; it's pragmatic, and it's why the internet and other complex systems run. People rely on real network systems and the observed facts of how they succeed or fail, then try to undercut those hard-won truths from a place of utter ignorance. While relying on them! It's absurd ideological parasitism: they deny the value of the things they demonstrably value just by posting! Just the silliest form of performative contradiction.
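            As a purely illustrative toy (a made-up character-flip noise model, not any real protocol), the claim that noise destroys correspondence can be simulated in a few lines:

```python
import random

random.seed(0)  # deterministic for the example

ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def relay(message, hops, noise):
    """Pass a message through a chain of nodes; at each hop, every
    character is replaced by a random one with probability `noise`."""
    for _ in range(hops):
        message = "".join(
            random.choice(ALPHABET) if random.random() < noise else c
            for c in message
        )
    return message

original = "the log is the log"
for noise in (0.0, 0.05, 0.5):
    received = relay(original, hops=10, noise=noise)
    fidelity = sum(a == b for a, b in zip(original, received)) / len(original)
    print(f"noise={noise:.2f} fidelity={fidelity:.2f}")
```

            With zero noise the message survives every hop; past a modest per-hop noise level, fidelity collapses toward the random baseline, which is "runaway network failure" in miniature.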

            I don't get it. Facts are facts. A thing can be objectively true in what for us is a linear global frame. The log is the log.

            Wikipedia and federated text content should never be banned (logs, timelines, data, etc.), but memes and other primarily emotive media are case by case; I don't see their value. I don't see the value in allowing people to present unprovable or demonstrably false data with a dogmatic, confidently "true" narrative.

            I mean, present whatever you want, but mark it as interpretation or low confidence versus multiple verified sources with a paper trail.

            Data quality, grounding and correspondence can be measured. It takes time though for validation to occur, it's far easier to ignore those traits and just generate infinite untruth and ungrounded data.

            Why do people prop up infinite noise generation as if it was a virtue? As if noise and signal epistemically can't be distinguished ever? I always see these arguments online by people who don't live that way at all in any pragmatic sense. Whether it's flat earthers or any other group who rejects the possibility of grounded facts.

            Interpretation is different, but so is the intentional destruction of a shared meaning space by turning every little word into a shibboleth.

            People are intentionally destroying the ability to even negotiate connections to establish communication channels.

            Infinite noise leads to runaway network failure and, in human systems, to the inevitability of violence. I for one don't like to see people die because the system has destroyed message passing via an attentional DDoS.

            • nradov a day ago ago

              Fortunately your biased opinion about what information has value is utterly worthless and will have zero impact on public policy. Idealized mathematical models of computer networks have no relevance to politics or freedom of expression in the real world.

          • ruszki 20 hours ago ago

            There isn't. Yet everybody knows what I mean by "propaganda against immigration" (some would discredit it, some would fight for it), and nobody claims that the Hungarian government's "information campaign" about migrants is not fascist propaganda (except the government, obviously, but not even their followers deny it). So yes, the edges are blurred, yet we can clearly identify some propaganda.

            Also, accurate information (like "here are 10 videos of blacks killing whites") paired with distorted statistics (when there is twice as much white-on-black murder) is still propaganda. But these are difficult to identify, since they clearly affect almost the whole population. Not many people have even tried to fight against it, especially because the propaganda's message is completed by you. // The example is fiction, but the direction exists; just look at Kirk's Twitter, for example. I have no idea about the exact numbers off the top of my head.

        • ASalazarMX 7 hours ago ago

          Propaganda wouldn't be such a problem if content wasn't dictated by a handful of corporations, and us people weren't so unbelievably gullible.

        • boltzmann-brain a day ago ago

          indeed, didn't YT ban a bunch of RT employees for undisclosed ties? I bet those will be coming back.

        • vintermann 18 hours ago ago

          Oh, but can you make an argument that the government, pressuring megacorporations with information monopolies to ban things they deem misinformation, is a good thing and makes things better?

          Because that's the argument you need to be making here.

          • potato3732842 16 hours ago ago

            You don't even need to make the argument. Go copy-paste some top HN comments on this issue from around the time the actions YouTube is now reversing were taken.

            • vintermann 13 hours ago ago

              I think those arguments sound especially bad today, actually. They got the suppression they wanted, but it did not give the outcome they wanted.

          • estearum 14 hours ago ago

            Not really. You can argue that the government should have the right to request content moderation from private platforms and that private platforms should have the right to decline those requests. There are countless good reasons for both sides of that.

            In fact, this is the reality we have always had, even under Biden. This stuff went to court. They found no evidence of threats against the platforms, the platforms didn't claim they were threatened, and no platform said anything other than they maintained independent discretion for their decisions. Even Twitter's lawyers testified under oath that the government never coerced action from them.

            Even in the actual letter from YouTube, they affirm again that they made their decisions independently: "While the Company continued to develop and enforce its policies independently, Biden Administration officials continued to press the company to remove non-violative user-generated content."

            So where does "to press" land on the spectrum between requesting action and coercion? Well, one key variable would be the presence of some type of threat. Not a single platform has argued they were threatened either implicitly or explicitly. Courts haven't found evidence of threats. Many requests were declined and none produced any sort of retaliation.

            Here's a threat the government might use to coerce a platform's behavior: a constant stream of subpoenas! Well, wouldn't you know it, that's exactly what produced the memo FTA.[1]

            Why hasn't Jim Jordan just released the evidence of Google being coerced into these decisions? He has dozens if not hundreds of hours of filmed testimony from decision-makers at these companies he refuses to release. Presumably because, like in every other case that has actually gone to court, the evidence doesn't exist!

            [1] https://www.politico.com/live-updates/2025/03/06/congress/ji...

            • ethbr1 11 hours ago ago

              The key problem with the government "requesting" a company do something is that the government has nigh infinite unrelated decisions that can be used to apply pressure to that company.

              It's unreasonable to expect some portion of the executive branch to reliably act counter to the President's stated goals, even if they would otherwise have.

              And that opportunity for perversion of good governance (read: making decisions objectively) is exactly why the government shouldn't request companies censor or speak in certain ways, ever.

              If there are extenuating circumstances (e.g. a public health crisis), then there need to be EXTREMELY high firewalls built between the part of the government "requesting" and everyone else (and the President should stay out of it).

              • estearum 9 hours ago ago

                The government has a well-established right to request companies to do things, and there are good reasons to keep it.

                For example, the government has immense resources to detect fraud, CSAM, foreign intelligence attacks, and so on.

                It is good, actually, that the government can notify employers that one of their employees is a suspected foreign asset and request they do not work on sensitive technologies.

                It is good, actually, that the government can notify a social media platform that there are terrorist cells spreading graphic beheading videos and request they get taken down.

                It's also good that in the vast majority cases, the platforms are literally allowed to reply with "go fuck yourself!"

                The high firewall is already present, it's called the First Amendment and the platforms' unquestioned right to say "nope," as they do literally hundreds of times per day.

                • ethbr1 4 hours ago ago

                  How does any of that prevent the government from de facto tying unrelated decisions to compliance by companies? E.g. FCC merger approval?

                  • estearum 17 minutes ago ago

                    None of it de facto prevents anything, but if a corporation feels they're being bullied in this way they can sue.

                    In the Biden admin, multiple lawsuits (interestingly none launched by the allegedly coerced parties) revealed no evidence of such mechanics at play.

                    In the Trump admin, the FCC Commissioner and POTUS have pretty much explicitly tied content moderation decisions to unrelated enforcement decisions.

                    Definitely there's possibility for an admin to land in the middle (actually coercive, but not stupid enough to do it on Truth Social), and in those scenarios we rely on the companies to defend themselves.

                    The idea that government should be categorically disallowed from communicating and expressing preferences is functionally absurd.

      • yongjik a day ago ago

        That sounds great in the context of a game, but in the years since its release, we have also learned that those who style themselves as champions of free speech also dream themselves our master.

        They are usually even more brazen in their ambitions than the censors, but somehow get a free pass because, hey, he's just fighting for the oppressed.

        • ethbr1 11 hours ago ago

          I'd say free speech absolutism (read: early-pandemic Zuckerberg, not thumb-on-the-scales Musk) has always aged better than the alternatives.

          The trick is that there's a fine line between honest free speech absolutism and "pro the free speech I believe in, silent about the freedom of speech I don't." It usually blurs when ego and power get involved (see: Trump, Musk).

          To which, props to folks like Ted Cruz for vocally addressing the dissonance and opposing FCC speech policing.

        • potato3732842 16 hours ago ago

          Anything that people uncritically regard as good attracts the evil and the illegitimate, because they cannot build power on their own and so must co-opt things people see as good.

      • soganess a day ago ago

        Not in the original statement, but as referenced here, the word 'information' is doing absolutely ludicrous amounts of lifting. Hopefully it bent at the knees, because in my book it broke.

        You can't call the phrase "the sky is mint chocolate chip pink with pulsating alien clouds" information.

        • arevno a day ago ago

          While this is true, It's also important to realize that during the great disinformation hysteria, perfectly reasonable statements like "This may have originated from a lab", "These vaccines are non-sterilizing", or "There were some anomalies of Benford's Law in this specific precinct and here's the data" were lumped into the exact same bucket as "The CCP built this virus to kill us all", "The vaccine will give you blood clots and myocarditis", or "The DNC rigged the election".

          The "disinformation" bucket was overly large.

          There was no nuance. No critical analysis of actual statements made. If it smelled even slightly off-script, it was branded and filed.
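          For the curious, the Benford's Law check mentioned above just compares observed first-digit frequencies against the expected proportion log10(1 + 1/d). A rough sketch with invented vote totals (and note that whether precinct-level counts should follow Benford at all is itself statistically contested):

```python
import math
from collections import Counter

def first_digit(n):
    """Leading decimal digit of an integer."""
    return int(str(abs(int(n)))[0])

def benford_deviation(digit_counts, total):
    """Chi-square-style statistic: observed first-digit counts versus
    Benford's expected proportion log10(1 + 1/d) for d in 1..9."""
    stat = 0.0
    for d in range(1, 10):
        expected = total * math.log10(1 + 1 / d)
        observed = digit_counts.get(d, 0)
        stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical precinct totals, purely to show the mechanics.
votes = [1012, 1873, 1240, 3105, 1450, 2210, 1987, 1133, 2904, 1776]
counts = Counter(first_digit(v) for v in votes)
print(counts, benford_deviation(counts, len(votes)))
```

          A large statistic only flags a dataset as unusual under the Benford model; it is not by itself evidence of fraud.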

          • nradov 11 hours ago ago

            The mRNA-based COVID-19 vaccines literally did cause myocarditis as a side effect in a small subset of patients. We can argue about the prevalence and severity, or the risk trade-offs versus possible viral myocarditis, but the basic statement about possible myocarditis should never have been lumped into the disinformation bucket.

            https://www.cdc.gov/vaccines/covid-19/clinical-consideration...

            • arevno 11 hours ago ago

              Doesn't detract from my point. "These vaccines are correlated with an N% increased risk of myocarditis" is a different statement from "These vaccines will give you myocarditis".

              BOTH of them were targeted by the misinformation squad, as if equivalent.

          • BrenBarn a day ago ago

            But it is because of the deluge that that happens. We can only process so much information. If the amount of "content" coming through is orders of magnitude larger, it makes sense to just reject everything that looks even slightly like nonsense, because there will still be more than enough left over.

            • themaninthedark 21 hours ago ago

              So does that justify the situation with Jimmy Kimmel? After all, there was a deluge of information and a lot of unknowns about the shooter, but the word choice he used was very similar to the already-debunked theory that it was celebratory gunfire from a supporter.

              Of course not.

              • netsharc 21 hours ago ago

                That sentence from Kimmel was IMO factually incorrect, and he was foolish to make the claim, but how is it offensive towards the dead, and why is it worth a suspension?

                But as we know, MAGA are snowflakes and look for anything so they can pull out their Victim Card and yell around...

                • ethbr1 11 hours ago ago

                  MAGA are badasses when they're out of power, yet apparently so threatened by an escalator stopping as to call for terrorism charges.

                  The doublethink is real.

            • Slava_Propanei a day ago ago

              [dead]

        • ayntkilove a day ago ago

          You can call it data and have sufficient respect for others that they may process it into information. Too many have too little faith in others. If anything, we need to be deluged in data, and we will probably work it out ourselves eventually.

          • protocolture a day ago ago

            Facebook does its utmost to subject me to Tartarian, Flat Earth and Creationist content.

            Yes, I block it routinely. No, the algo doesn't let up.

            I don't need "faith" when I can see that a decent chunk of people disbelieve modern history, and aggressively disbelieve science.

            More data doesnt help.

      • intended 19 hours ago ago

        This is a fear of an earlier time.

        We are not controlling people by reducing information.

        We are controlling people by overwhelming them in it.

        And when we think of a solution, our natural inclination to “do the opposite” smacks straight into our instinct against controlling or reducing access to information.

        The closest I have come to any form of light at the end of the tunnel is Taiwan’s efforts to create digital consultations for policy, and the idea that facts may not compete on short time horizon, but they surely win on longer time horizons.

        • ethbr1 10 hours ago ago

          The problem is that in our collective hurry to build and support social networks, we never stopped to think about what other functions might be needed alongside them to promote a good, factual society.

          People should be able to say whatever the hell they want, wherever the hell they want, whenever the hell they want. (Subject only to the imminent danger test)

          But! We should also be funding robust journalism to exist in parallel with that.

          Can you imagine how different today would look if the US had leveraged a 5% tax on social media platforms above a certain size, with the proceeds used to fund journalism?

          That was a thing we could have done. We didn't. And now we're here.

      • probably_wrong 20 hours ago ago

        Beware of those who quote videogames and yet attribute them to "U.N. Declaration of Rights".

        • Starman_Jones 13 hours ago ago

          They're not wrong; the attribution is part of the quote. In-game, the source of the quote is usually important, and is always read aloud (unlike in Civ).

          • probably_wrong 12 hours ago ago

            I would argue that they are, if not wrong, at least misleading.

            If you've never played Alpha Centauri (like me) you are guaranteed to believe this to be a real quote by a UN diplomat. It also doesn't help that searching for "U.N. Declaration of Rights" takes me (wrongly) to the (real) Universal Declaration of Human Rights. I only noticed after reading ethbr1's comment [1], and I bet I'm not the only one.

            [1] https://news.ycombinator.com/item?id=45355441

            • ethbr1 10 hours ago ago

              Hence my reply.

              Also, you missed a great game.

      • BrenBarn a day ago ago

        The thing is that burying information in a firehose of nonsense is just another way of denying access to it. A great way to hide a sharp needle is to dump a bunch of blunt ones on top of it.

      • Cheer2171 a day ago ago

        Beware he who would tell you that any effort at cleaning up the post-apocalyptic wasteland that is social media is automatically tyranny, for in his heart he is a pedophile murderer fraudster, and you can call him that without proof, and when the moderators say your unfounded claim shouldn't be on the platform you just cry CENSORSHIP.

      • a day ago ago
        [deleted]
      • rixed 21 hours ago ago

        Is your point that any message is information?

        Without truth there is no information.

      • jancsika a day ago ago

        That seems to be exactly her point, no?

        Imagine an interface that reveals the engagement mechanism by, say, having an additional iframe. In this iframe an LLM clicks through its own set of recommendations picked to minimize negative emotions at the expense of engagement.

        After a few days you're clearly going to notice the LLM spending less time than you clicking on and consuming content. At the same time, you'll also notice its choices are part of what seems to you a more pleasurable experience than you're having in your own iframe.

        Social media companies deny you the ability to inspect, understand, and remix how their recommendation algos work. They deny you the ability to remix an interface that does what I describe.
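        The two-iframe comparison boils down to ranking the same candidate items under two different objectives. A toy sketch with invented titles and scores (nothing here reflects any real recommender):

```python
# Each candidate item carries two hypothetical scores.
items = [
    # (title, engagement_score, negative_emotion_score)
    ("outrage headline", 0.95, 0.90),
    ("cute animal clip", 0.70, 0.05),
    ("long explainer",   0.40, 0.10),
]

def rank(items, key):
    """Return titles ordered best-first under the given objective."""
    return [title for title, *_ in sorted(items, key=key, reverse=True)]

# What the platform's engagement objective picks first:
by_engagement = rank(items, key=lambda it: it[1])
# What a user-controlled agent penalizing negative emotion picks first:
by_wellbeing = rank(items, key=lambda it: it[1] - it[2])

print(by_engagement)  # → ['outrage headline', 'cute animal clip', 'long explainer']
print(by_wellbeing)   # → ['cute animal clip', 'long explainer', 'outrage headline']
```

        The content pool is identical in both rankings; only the objective changes, and that objective is exactly the knob users aren't allowed to see or remix.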

        In short, your quote surely applies to social media companies, but I don't know if this is what you originally meant.

      • totetsu a day ago ago

        Raising the noise floor of disinformation to drown out information is a way of denying access to information too.

      • N_Lens a day ago ago

        We must dissent.

      • idiotsecant a day ago ago

        Sure, great. Now suppose that a very effective campaign of social destabilisation propaganda exists that poses an existential risk to your society.

        What do you do?

        It's easy to rely on absolutes and pithy quotes that don't solve any actual problems. What would you, specifically, with all your wisdom, do?

        • nradov a day ago ago

          Let's not waste time on idle hypotheticals and fear mongering. No propaganda campaign has ever posed an existential threat to the USA. Let us know when one arrives.

          • CJefferson a day ago ago

            Have you seen the US recently? Just in the last couple of days, the president is standing up and broadcasting clear medical lies about autism, while a large chunk of the media goes along with him.

            • nradov a day ago ago

              I have seen the US recently. I'm not going to attempt to defend the President but regardless of whether he is right or wrong about autism this is hardly an existential threat to the Republic. Presidents have been wrong about many things before and that is not a valid justification for censorship. In a few years we'll have another president and he or she will be wrong about a whole different set of issues.

              • CJefferson a day ago ago

                I hope I'm wrong, but I think America is fundamentally done, because the whole "checks and balances" system turned out to be trivial for a president to steamroll, and future presidents will know that now.

                By done I don't mean it won't continue to be the world's biggest and most important country, but I don't expect any other country to trust America more than they have to for 100 years or so.

                • nradov 21 hours ago ago

                  A lot of people thought that America was fundamentally done in 1861, and yet here we are. The recent fracturing of certain established institutional norms is a matter of some concern. But whether other countries trust us or not is of little consequence. US foreign policy has always been volatile, subject to the whims of each new administration. Our traditional allies will continue to cooperate regardless of trust (or lack thereof) because mutual interests are still broadly aligned and they have no credible alternative.

                  • defrost 20 hours ago ago

                    > whether other countries trust us or not is of

                    some consequence. Not all-consuming, but significant.

                    > Our traditional allies will continue to cooperate regardless of

                    whether they continue to include the US within that circle to the same degree, or indeed at all.

                    Trump's tariffs have been a boon for China's global trade connections: they continue to buy soybeans, but from new partners, whereas before they sourced mainly from the US.

                • ethbr1 10 hours ago ago

                  > turned out to be trivial to steamroll as president, and future presidents will know that now

                  ... when the Presidency, House, and Senate are also controlled by one unified party, and the Supreme Court chooses not to push back aggressively.

                  That rarely happens.

              • estearum 14 hours ago ago

                "You cannot trust basic statements of fact coming from POTUS, HHS, FDA, CDC, DOD" is absolutely an existential risk.

                • nradov 11 hours ago ago

                  I won't attempt to defend the current administration's incompetent and chaotic approach to public health (or communications in general) but it's hardly an existential crisis. The country literally existed for over a century before HHS was even created.

                  • dragonwriter 11 hours ago ago

                    Among other major problems, the logic in your comment implicitly assumes that the worst a badly-run (incompetent, malevolent, or some combination) central authority can be is equal to the effect of no central authority.

                    Another important error is the implicit assumption that public health risks are constant, and do not vary with changing time and conditions, so that the public health risk profile today is essentially the same as in the first century of the US’s existence.

              • _DeadFred_ 19 hours ago ago

                They are spreading this nonsense in part to distract from the fact that they refuse to release the Epstein files, something that appears to implicate quite a lot of high-profile, high-importance officials in potentially very bad things.

                It's called flooding the zone, and it is a current Republican strategy: misinform, sow defeatism in their political opposition, and overload and break all of the existing systems for handling politics, with the final goal of manipulating the next election. And they publicized this, yet people like you claim to think it's a non-issue.

              • Yeul 18 hours ago ago

                [flagged]

          • rixed 21 hours ago ago

            It doesn't have to be a national threat. Social media can be used by small organisations or even sufficiently motivated individuals to easily spread lies and slander against individuals or groups, and it's close to impossible to prevent (I've been fighting some trolls threatening a group of friends on Facebook lately, and I can attest how much the algorithm favors hate speech over reason).

            • nradov 20 hours ago ago

              That's a non sequitur. Your personal troubles are irrelevant when it comes to public policy, social media, and the fundamental human right of free expression. While I deplore hate speech, its existence doesn't justify censorship.

              • rixed 15 hours ago ago

                It is of course subjective. For you hate speech does not justify censorship but for me it does. Probably because we make different risk assessments: you might expect hate speech to have no consequences in general and censorship to lead to authoritarianism, whereas I expect hate speech to have actual consequences on people life that are worse and more likely than authoritarianism. When I think about censorship and authoritarianism, I think about having to hide, but when I think about hate speech I picture war propaganda and genocides.

        • Steltek 14 hours ago ago

          There are twin goals: total freedom of speech and holding society together (limit polarization). I would say you need non-anonymous speech, reputation systems, trace-able moderation (who did you upvote), etc. You can say whatever you want but be ready to stand by it.

          One could say the problem with freedom of speech was that there weren't enough "consequences" for antisocial behavior. The malicious actors stirred the pot with lies, the gullible and angry encouraged the hyperbole, and the whole US became polarized and divided.

          And yes, this system chills speech, as one would be reluctant to voice extreme opinions. But you would still have the freedom to say it; the additional controls just exert a pull back toward the average.

      • 2OEH8eoCRo0 12 hours ago ago

        Facebook speaks through what it chooses to promote or suppress and they are not liable for that speech because of Section 230.

        • Manuel_D 11 hours ago ago

          Not quite: prior to the Communications Decency Act of 1996 (which contained Section 230), companies were also not liable for the speech of their users, but lost that protection if they engaged in any moderation. The two important cases at hand are Stratton Oakmont, Inc. v. Prodigy Services Co. and Cubby, Inc. v. CompuServe Inc.

          The former moderated content and was thus held liable for posted content. The latter did not moderate content and was determined not to be liable for user generated content they hosted.

          Part of the motivation of section 230 was to encourage sites to engage in more moderation. If section 230 were to be removed, web platforms would probably choose to go the route of not moderating content in order to avoid liability. Removing section 230 is a great move if one wants misinformation and hateful speech to run unchecked.

    • potato3732842 16 hours ago ago

      There's a special irony in this being the top comment on a site where everyone has a rightthink score and people routinely and flagrantly engage in "probably bad faith, but there's plausible deniability so you can't pin it on them" communication to crap on whatever the wrongthink on an issue is.

      As bad as Facebook and its opaque algorithms that favor rage bait are, the kind of stuff you get by keeping score is worse.

    • vachina a day ago ago

      This is why China bans western social media.

      • yupyupyups a day ago ago

        Say what you will about the CCP, it's naive to let a foreign nation have this much impact on your subjects. The amount of poison and political manipulation imported from these platforms is astronomical.

        • ethbr1 a day ago ago

          Instead of implementing government information control, why not invest those resources in educating and empowering ones citizenry to recognize disinformation?

          • BrenBarn a day ago ago

            To me this is sort of like saying why do we need seat belts when we could just have people go to the gym so they're strong enough to push back an oncoming car. Well, you can't get that strong, and you also can't really educate people well enough to reliably deal with the full force of the information firehose. Even people who are good at doing it do so largely by relying on sources they've identified as trustworthy, thus offloading some of the work to those. I don't think there's anyone alive who could actually distinguish fact from fiction if they had to, say, view every Facebook/Twitter/Reddit/everything post separately in isolation (i.e., without relying on pre-screening of some sort).

            And once you know you need pre-screening, the question becomes why not just provide it instead of making people hunt it down?

            • belorn 14 hours ago ago

              With modern safety design and human factors, we do both and more. A car can have an automated braking system, and a manual brake, and an information system that informs the driver about the surroundings so the driver is better informed. We don't remove any of those in the false belief that one of them should be enough.

              Applying that to information and propaganda, users should have some automated defenses (like ad blockers), but also manual control of what should or should not be blocked, and also education and tools to be better informed when taking manual control.

              In neither system should we remove manual control, education, or automated help. They all act in unison to make people safer.

              • ethbr1 11 hours ago ago

                Perhaps a better analogy from recent HN discussion would be auto-lock-on-drive doors.

                Some people die (often children) by opening doors while a vehicle is moving or before it is safe to do so.

                However, this also impedes the ability of rescuers to extract people from crashed vehicles (especially with fail-dangerous electric car locks).

                Is it safer to protect citizens from themselves or empower them to protect themselves?

                In my perfect US, both would be done:

                "Dealing with disinformation" as a semester-long required high-school level course and federally mandating the types of tools that citizens could use to better (read: requiring all the transparency data X and Meta stopped reporting, from any democracy-critical large platform).

                While also mandating measures to limit disinformation where technically possible and ethically appropriate (read: not making ham-fisted government regulations when mandating data and letting journalists / citizens act is a better solution).

          • rixed 21 hours ago ago

            Instead of investing resources in education, why not let people discover by themselves the virtues of education?

            Sarcasm aside, we tend to focus too much on the means and too little on the outcomes.

          • beepboopboop a day ago ago

            That’s hundreds of millions of people in the US, of varying ages and mostly out of school already. Seems like a good thing to try but I’d imagine it doesn’t make a tangible impact for decades.

          • CJefferson a day ago ago

            Because no one person can fight against a trillion-dollar industry that has decided misinformation makes the biggest profit.

            How am I supposed to learn what’s going on outside my home town without trusting the media?

          • rgavuliak 17 hours ago ago

            Because it doesn't seem to work?

          • yupyupyups 11 hours ago ago

            I never defended the authoritarianism of the CCP. I only said it makes sense to block foreign platforms, regardless of whether the state is a tyranny or not. Framing it as if it's some kind of tactic to help keep the populace indoctrinated is a very simplistic take.

            Take Reddit, for example. It's filled with blatant propaganda, from corporations and politicians. It's a disgustingly astroturfed platform run by people of questionable moral character. What's more, it also has porn. All you need is an account to access 18+ "communities". Not exactly "enlightening material" that frees the mind from tyranny.

          • mns 13 hours ago ago

            Because in that case you wouldn't be able to use disinformation yourself.

          • idiotsecant a day ago ago

            Because you want to use it yourself. You can't vaccinate if you rely on the disease to maintain power. You can't tell people not to be afraid of people different than themselves if your whole party platform is being afraid of people different than yourself.

          • xracy a day ago ago

            'An ounce of prevention is worth a pound of the cure.'

            It's so much easier to stop one source than it is to (checks notes) educate the entire populace?!? Gosh, did you really say that with a straight face? As if education isn't also under attack?

          • Broken_Hippo a day ago ago

            Because it isn't that simple.

            If we could just educate people and make sure they don't fall for scams, we'd do it. Same for disinformation.

            But you just can't give that sort of broad education. If you aren't educated in medicine and can't personally verify someone's qualifications, you are going to be at a disadvantage when trying to tell whether health information is sound. And if you are a doctor, it doesn't mean you know about infrastructure, or have the contacts to know what is actually happening in the next state or country over.

            It's the same with products, actually. I can't tell if an extension cord is up to code. The best that I can realistically do is hope the one I buy isn't a fake and meets all of the necessary safety requirements. A lot of things are like this.

            Education isn't enough. You can't escape misinformation and none of us have the mental energy to always know these things. We really do have to work the other way as well.

            • yupyupyups 10 hours ago ago

              Why is this being downvoted?

          • erxam a day ago ago

            Sorry, 'recognizing disinformation'? You must have meant 'indoctrination'.

            (They don't necessarily exclude each other. You need both positive preemptive and negative repressive actions to keep things working. Liberty is cheap talk when you've got a war on your hands.)

        • scarface_74 a day ago ago

          Well when the local media bends a knee and outright bribes the President (Paramount, Disney, Twitter, Facebook), why should we trust the domestic media?

          • nxm a day ago ago

            Like when the Biden administration pressured social media to take down information/accounts that went against its narrative

            • alphabettsy a day ago ago

              Is there a meaningful difference between pressuring and taking or threatening regulatory action? I think so.

            • Eisenstein a day ago ago

              Wait, are you saying that the person you are replying to is a hypocrite, or are you saying that the Biden admin set the standard for responsible government handling of media relations, or are you saying that if one administration does something bad it is ok for any other administration to do something bad, like a tit-for-tat tally system of bad things you get for free after the inauguration?

            • bediger4000 a day ago ago

              Biden admin's bad behavior certainly allows Trump to act the same way.

              If it was bad for Biden admin, it's much worse for Trump admin - he campaigned against it.

            • scarface_74 12 hours ago ago

              You don’t see a difference between that and outright bribery?

            • RickJWagner 14 hours ago ago

              Note that your statement is true and relevant to the conversation, yet downvoted.

              It’s shameful that this happens. Is it bot voting? Partisan cheering over productive conversation? It’s troubling.

      • nradov a day ago ago

        China reflexively bans anything that could potentially challenge Chairman Xi's unchecked authority and control over the information flow.

    • gchamonlive a day ago ago

      That's the evil genius behind the general movement in the world to discredit democratic institutions and deflate the government.

      Who would hold Meta accountable for the lies it helps spread and capitalizes upon, if not the government?

      So by crippling democratic institutions and dwarfing the government to the point of virtual non-existence -- all in the name of preserving freedom of speech and liberalism, and in the process subverting both concepts -- elected leaders have managed to neutralize the only check standing in the way of big corps ramping up the misinformation machine that social networks have become.

    • stinkbeetle a day ago ago

      I think it would be even wiser to start by holding to account the politicians, corporations, and government institutions regarding their unchecked lies, corruption, and fraud.

      But no, yet again the blame is all piled on to the little people. Yes, it's us plebs lying on the internet who are the cause of all these problems and therefore we must be censored. For the greater good.

      I have an alternative idea: let's first imprison or execute (with due process) politicians, CEOs, generals, heads of intelligence and other agencies and regulators -- those found to have engaged in corrupt behavior, lied to the public, committed fraud, traded on inside information, fabricated evidence to support invading other countries, waged undeclared wars, ordered extrajudicial executions, colluded with foreign governments to hack elections, evaded taxes, etc. Then after we try that out for a while, and if it has not improved things, we could try ratcheting up the censorship of plebs. Now one might argue that taking such measures would violate those people's rights, but that is a sacrifice I'm willing to make. Since We Are All In This Together™, they would be willing to make that sacrifice too. And really, if they have nothing to hide then they have nothing to fear.

      When you get people like Zuckerberg lying to congress, it's pretty difficult to swallow the propaganda claiming that it's Joe Smith the unemployed plumber from West Virginia sharing "dangerous memes" with his 12 friends on Facebook that is one of the most pressing concerns.

      • n4r9 14 hours ago ago

        I don't think "breadwinner" is blaming the little people.

        • stinkbeetle 3 hours ago ago

          No, the ruling class is. breadwinner, I guess, has bought into the propaganda but hasn't made the connection that it basically puts all the blame on the little people and all the burden of "fixing" things onto them, with measures that will absolutely not fix anything except handing more power to the ruling class.

    • _dain_ a day ago ago

      >unchecked social media

      Passive voice. Who exactly is supposed to do the "checking" and why should we trust them?

      • breadwinner a day ago ago

        Citizens. Through lawsuits. Currently we can't because of Section 230.

        • nradov a day ago ago

          Nonsense. If social media users engage in fraud, slander, or libel then you can still hold them accountable through a civil lawsuit. Section 230 doesn't prevent this.

          • breadwinner 10 hours ago ago

            Will/can Facebook tell you the real identity of the user? If not, then Facebook has to take responsibility for the fraud/slander/libel. Currently, Section 230 means they can't be held responsible.

            • nradov 9 hours ago ago

              Yes. A plaintiff can file a civil lawsuit in a US court against a "John Doe" defendant and ask the court to order Facebook (or any online service) to turn over any data they have on the user's real identity. If the court agrees and issues the order then Facebook will comply: this is quite routine and happens all the time. The plaintiff can then amend the lawsuit to name specific defendants.

              Section 230 is largely irrelevant to this process so I don't know why you'd bring it up. Have you ever even read the Communications Decency Act of 1996?

        • trhway a day ago ago

          The "editorializing" may possibly be applied i think (not a lawyer) when the platform's manipulation of what a user sees is based on content. And the Youtube's banning of specific Covid and election content may be such an "editorializing", and thus Youtube may not have Section 230 protection at least in those cases.

          • nradov a day ago ago

            Have you even read Section 230? Editorializing is irrelevant.

    • StanislavPetrov a day ago ago

      >"You and I, if we say a lie we are held responsible for it, so people can trust us."

      I don't know how it works in The Philippines, but in the USA the suggestion that media outlets are held responsible for the lies that they tell is one of the most absurd statements one could possibly make.

      • lfpeb8b45ez a day ago ago

        How about InfoWars?

        • StanislavPetrov 20 hours ago ago

          I was referring more to established media that people consider credible, like NBC, CBS, The Guardian, The New York Times, The Wall Street Journal, The Atlantic, etc. The fact that the only person in "media" who has been severely punished for their lies is a roundly despised figure (without any credibility among established media or the ruling class) is not a ringing endorsement of the system. While the lies of Jones no doubt caused untold hardship for the families of the victims, they pale in comparison to the much more consequential lies told by major media outlets with far greater influence.

          When corporate media figures tell lies that are useful to the establishment, they are promoted, not called to account.

          In 2018 Luke Harding at The Guardian lied and published a story that "Manafort held secret talks with Assange in Ecuadorian embassy" (headline later amended with "sources say" after the fake story was debunked) in order to bolster the Russiagate narrative. It was proven beyond a shadow of a doubt that Manafort never went to the embassy or had any contact at all with Assange (who was under blanket surveillance) at any time. However, to this day this provably fake story remains on The Guardian website, without any sort of editor's note that it is false or that it was all a pack of lies!(1) No retraction was ever issued. Luke Harding remains an esteemed foreign correspondent for The Guardian.

          In 2002, Jeffrey Goldberg told numerous lies in a completely false article in The New Yorker, called "The Great Terror", that sought to establish a connection between the 9/11 attacks and Saddam Hussein.(2) This article was cited repeatedly during the run-up to the war as justification for the subsequent invasion, and greatly helped create an environment where a majority of Americans thought that Iraq was linked to Bin Laden and the 9/11 attackers. More than a million people were killed, in no small part because of his lies. And Goldberg? He was promoted to editor-in-chief of The Atlantic, perhaps the most prestigious and influential journal in the country. He remains in that position today.

          There are hundreds, if not thousands, of similar examples. The idea suggested in the original OP that corporate/established media is somehow more credible or held to a higher standard than independent media is simply not true. Unfortunately there are a ton of lies, falsehoods and propaganda out there, and it is up to all of us to be necessarily skeptical no matter where we get our information and do our due diligence.

          1. https://www.theguardian.com/us-news/2018/nov/27/manafort-hel...

          2. https://www.newyorker.com/magazine/2002/03/25/the-great-terr...

        • anonymousiam a day ago ago

          A sympathetic jury can be an enemy of justice.

          I'm not an Alex Jones fan, but I don't understand how a conspiracy theory about the mass shooting could be construed as defamation against the parents of the victims. And the $1.3B judgement does seem excessive to me.

          • AlexandrB a day ago ago

            You should read up on some details. The defamation claim is because Alex Jones accused the parents of being actors who are part of staging the false flag. The huge judgement is partly because Alex Jones failed to comply[1][2] with basic court procedure like discovery in a timely way so a default judgement was entered.

            Despite his resources, Alex Jones completely failed to get competent legal representation and screwed himself. He then portrayed himself as the victim of an unjust legal system.

            [1] https://www.npr.org/2021/11/15/1055864452/alex-jones-found-l...

            > Connecticut Superior Court Judge Barbara Bellis cited the defendants' "willful noncompliance" with the discovery process as the reasoning behind the ruling. Bellis noted that defendants failed to turn over financial and analytics data that were requested multiple times by the Sandy Hook family plaintiffs.

            [2] https://lawandcrime.com/high-profile/judge-rips-alex-jones-c...

            > Bellis reportedly said Jones' attorneys "failure to produce critical material information that the plaintiffs needed to prove their claims" was a "callous disregard of their obligation," the Hartford Courant reported.

            • tbrownaw a day ago ago

              > The huge judgement is partly because Alex Jones failed to comply with basic court procedure like discovery in a timely way so a default judgement was entered.

              Yeah. Refusing to cooperate with the court has to always be at least as bad as losing your case would have been.

          • protocolture a day ago ago

            The specific conspiracy theory implied fraud and cover up on behalf of the parents. Lmao.

      • thrance 15 hours ago ago

        Ever watched Fox News?

    • vintermann a day ago ago

      Exactly what are you trying to say about unbanning YouTubers here?

      • afavour a day ago ago

        That it could be dangerous to readmit people who broadcast disinformation? The connection seemed pretty clear to me.

        • vintermann 21 hours ago ago

          I certainly guessed that was what you wanted to say. Funny how polarization makes everything predictable.

          But what I just realized is that you don't explicitly say it, and certainly make no real argument for it. Ressa laments algorithmic promotion of inflammatory material, but didn't say "keep out anti-government subversives who spread dangerous misinformation" - which is good, because

          1. We can all see how well the deplatforming worked - Trump is president again, and Kennedy is health secretary.

          2. In the eyes of her government, she was very much such a person herself, so it would have been a pretty bizarre thing for her to say.

          Ironically, your post is very much an online "go my team!" call, and a good one too (top of the thread!). We all understand what you want and most of us, it seems, agree. But you're not actually arguing for the deplatforming you want, just holding up Ressa as a symbol for it.

          • n4r9 13 hours ago ago

            > We can all see how well the deplatforming worked - Trump is president again

            Not a compelling argument...

            Jan 2021 - Twitter bans Trump (for clear policy violations)

            Apr 2022 - Musk buys Twitter

            Nov 2022 - Twitter reinstates Trump's account

            Nov 2024 - Trump re-elected, gives Musk cabinet position

            • vintermann 13 hours ago ago

              So deplatforming works, unless people become so unhinged at the efforts to shape them that they do crazy stuff like buy major media platforms? Guess what, they do!

              But at least the Covid dissenter deplatforming worked, right? Or was the problem Musk there again?

              One of my mantras is that powerful people believe all the crazy things regular people believe in, they just act differently on them. I think both Musk and Kennedy are great examples that you'd appreciate, as are Xi and Putin with their open mic life extension fantasies.

              It's not long ago that Musk, and even Trump himself, were aligned with your competent technocrats wielding the "suppression of irresponsible speech" powers.

              • n4r9 13 hours ago ago

                I'm saying that Trump's re-election is not a compelling counter-example to the general argument for banning disinformation, because he was "re-"platformed for over a year by the time of the election.

                • Dig1t 9 hours ago ago

                  You underestimate how much seeing a sitting president be deplatformed affected the voting public. It wasn’t just Musk; all this talk of “deplatforming” people on the right was a clear erosion of free speech that pushed many moderates like myself rightward.

                  It wasn’t just banning Trump either, tbh one of the biggest ones was the banning of the Babylon Bee for a pretty tame joke. There’s a long list of other right-leaning accounts which were banned during that time as well.

          • afavour 14 hours ago ago

            You realise I didn’t make the original post, right?

            • vintermann 13 hours ago ago

              No, it looked as if you did. Whatever.

    • refurb a day ago ago

      The problem is not the content, the problem is people believing things blindly.

      The idea that we need to protect people from “bad information” is a dark path to go down.

      • BrenBarn a day ago ago

        I don't see it so much as protecting people from bad information as protecting people from bad actors, among whom entities like Facebook are prominent. If people want to disseminate quackery they can do it like in the old days by standing on a street corner and ranting. The point is that the mechanisms of content delivery amplify the bad stuff.

        • refurb 20 hours ago ago

          It’s a terrible idea and creates more problems than it solves.

          You eliminate the good and the bad ideas. You eliminate the good ideas that are simply “bad” because they upset people with power. You eliminate the good ideas that are “bad” simply because they are deemed too far outside the Overton window.

          And worst of all, it requires some benevolent force to make the call between good and bad, which attracts all sorts of psychopaths hungry for power.

          • thrance 15 hours ago ago

            Have you been living under a rock these past few years? The "bad" ideas outnumber the "good" ones ten to one. The current secretary of health lets internet conspiracies dictate his policies, such that vaccines are getting banned, important research is getting defunded, and now they're even going after paracetamol (!!). People will die.

            Cue the quote that says it takes 30 minutes to debunk 30 seconds of lying.

            • refurb 13 hours ago ago

              You chose to ignore all the points I made.

              Why?

    • trhway a day ago ago

      Censorship works both ways. When I tried speaking against the violence and genocide perpetrated by Russia in Ukraine, I was shut down on LinkedIn.

      Even here on HN, I was almost banned when I spoke about the abduction of children by Russia https://news.ycombinator.com/item?id=33005062 - the crime for which, half a year later, the ICC issued an arrest warrant against Putin.

      • breadwinner a day ago ago

        You know how this used to work in the old days? Instead of publishing allegations yourself, you would take your story to a newspaper reporter. The reporter would then investigate and, if there was solid evidence, the story would be published in the newspaper. At that point the newspaper company was standing behind the story, and citizens knew the standing of the newspaper in their community, and how much credence to give the story based on that. Social media destroyed this process: now anyone can spread allegations at lightning speed, on a massive scale, without any evidence to back them up. This has to stop. We should return to the old way; it wasn't perfect, but it worked for hundreds of years. Repealing Section 230 would accomplish this.

        • themaninthedark a day ago ago

          I remember a story that was investigated and then published...it was spread far and wide. The current president of the US stole the election and our biggest adversary has videos of him in compromising positions. Then debunked. (Steele dossier) https://www.thenation.com/article/politics/trump-russiagate-...

          I remember a story that was investigated and then published...for some reason it was blocked everywhere and we were not allowed to discuss the story or even link to the news article. It "has the hallmarks of a Russian intelligence operation."(Hunter Biden Laptop) Only to come out that it was true: https://www.msn.com/en-us/news/politics/fbi-spent-a-year-pre...

          I would rather not outsource my thinking or my ability to get information to approved sources. I have had enough experience with Gell-Mann amnesia to realize they often have little to no understanding of the situation either. I may not be an expert in all domains, but while I am still free, at least I can do my best to learn.

          • tanjtanjtanj 11 hours ago ago

            > Russiagate

            It was never “debunked”, that is far too strong a word. Is it true? Who knows! Should we operate as if it was true without it being proven? Definitely not.

            > Hunter’s laptop

            In what way was that story buried or hidden? It was a major news story on every news and social network for over half a year. There was only consternation about how the laptop was acquired and who or what helped with that endeavor. The “quieting” of the story is BS and only came about a long time after the fact. Biden’s people sought (unsuccessfully) to have images removed from platforms but there was never an effort to make it seem like the allegations that stemmed from the laptop were misinformation.

            • nradov 10 hours ago ago

              You are spreading misinformation. According to Mark Zuckerberg, Facebook actively buried and hid posts related to the Hunter Biden laptop story in 2020. We can argue about whether Facebook did the right thing based on the information they had at the time but let's be clear about the facts: the CEO literally stated that they did it, so it's not BS.

              https://www.bbc.com/news/world-us-canada-62688532

              Twitter's Vijaya Gadde also admitted that they blocked users from sharing the story. That's not BS either.

              https://www.bbc.com/news/technology-54568785

              • tanjtanjtanj 4 hours ago ago

                I stand by what I said and I think you are interpreting my words uncharitably in order to “win” some argument I’m not a part of.

                I am not going into a semantic argument with you over whether my exact wordings match whatever you think I said.

                I will however say that both theses put forth by the comment I replied to are false. If you read either article you linked they actually support my point, the Hunter Biden news was extremely widely shared on Facebook and only throttled due to suspicions on Facebook’s part that it may have been inorganic. A particular article (but not the news) was blocked on Twitter based on an existing policy, discussion was still allowed and it was definitely widely discussed and shared.

          • ModernMech 12 hours ago ago

            w.r.t. the Steele Dossier, it was always from the beginning purported to be a "raw intelligence product", which is understood by everyone involved in that process to mean it is not 100% true -- the intelligence is weighted at different levels of confidence. Steele has said he believed his sources were credible, but he did not claim the dossier was 100% accurate. He weighed it at 50/50, and expected that investigators would use it as leads to verify information, not as proof in itself.

            And on that point, the FBI investigation didn't even start on the basis of the Steele Dossier; it started on the basis of an Australian diplomat, Alexander Downer, who, during a meeting with top Trump campaign foreign policy advisor George Papadopoulos, became alarmed when Papadopoulos mentioned that the Russian government had "dirt" on Hillary Clinton and might release it to assist the Trump campaign. Downer alerted the Australian government, which informed the FBI. The Steele Dossier was immaterial to the investigation's genesis.

            So any claim that the dossier as a whole has been "debunked" is not remarkable. Of course parts of it have been debunked, because it wasn't even purported to be 100% true by the author himself. It's not surprising things in it were proven false.

            Moreover, that also doesn't mean everything in it was untrue. The central claims of the dossier -- that Donald Trump and his campaign had extensive ties to Russia, and that Russia sought to influence the 2016 U.S. election in Trump’s favor -- were proven true by the Mueller Report Vols. I and II, and the Senate Select Intel Committee Report on Russian Active Measures Campaigns and Interference in the 2016 Election, Vols. I-VI.

            > The current president of the US stole the election

            Not a claim made in the dossier.

            > and our biggest adversary has videos of him in compromising positions.

            This hasn't been debunked. The claim in the dossier was that Russia has videos of Trump with prostitutes peeing on a bed Obama slept in, not peeing on Trump himself. The idea that it was golden showers is a figment of the internet. Whether or not the scenario where people peed on a bed Obama slept in happened as laid out in the dossier is still unverified, but not "debunked".

          • scarface_74 a day ago ago

            [flagged]

            • themaninthedark a day ago ago

              Forest for the trees.

              Don't take my comment as a declaration for Trump and all he stands for.

              My parent had posted "You know how this used to work in the old days? Instead of publishing allegations yourself, you would take your story to a newspaper reporter. The reporter will then do the investigations then, if there is solid evidence, the story will be published in the newspaper. At that point the newspaper company is standing behind the story, and citizens know the standing of the newspaper in their community, and how much credence to give to the story, based on that."

              Rather than call it an argument from authority, which it is very close to, I decided to highlight two cases where this authority that we are supposed to defer to was wrong.

              Perhaps a better and more direct argument would be to point out that during the COVID pandemic, YouTube, Facebook, and Twitter were all banning and removing posts from people who had heterodox opinions, with those leading the charge crying "Trust the Science".

              This runs contrary to what science and the scientific process are. Carl Sagan said it better than I: "One of the great commandments of science is, 'Mistrust arguments from authority.' ... Too many such arguments have proved too painfully wrong. Authorities must prove their contentions like everybody else."

              Now that I have quoted a famous scientist in a post to help prove my point about how arguments from authority are invalid, I shall wait for the collapse of the universe upon itself.

        • nradov a day ago ago

          It never worked. Newspapers in the old days frequently printed lies and fake news. They usually got away with it because no one held them accountable.

          • itbeho a day ago ago

            William Randolph Hearst and the Spanish-American war come to mind.

        • pkphilip 21 hours ago ago

          What happens when the press refuses to publish anything which doesn't align with their financial or political interest?

        • trhway a day ago ago

          >At that point the newspaper company is standing behind the story

          the newspaper company is the bottleneck that the censors can easily tighten like it was say in USSR. Or like even FCC today with the media companies like in the case of Kimmel.

          Social media is our best tool so far against censorship. Even with all the censorship that we do have in social media, information still finds a way due to the sheer scale of the Internet. That wasn't the case in the old days, when, for example, each typewriter could be identified by the unique micro-details of the shape of its characters.

          >Social media destroyed this process, now anyone can spread allegations at lightning speed on a massive scale without any evidence to back it up.

          Why believe anything not accompanied by evidence? The problem here is with the news consumer. We teach children not to stick their fingers into an electrical wall socket. If a child sticks their fingers in anyway, are you going to hold the electric utility company responsible?

          >This has to stop. We should return to the old way, it wasn't perfect, but it worked for 100s of years.

          The same can be said about the modern high density of human population, transport connections, and the spread of infectious disease. What you suggest is to decrease the population and confine the rest, preventing any travel like in the "old days" (interesting that it took the Black Death some years to spread instead of the days it would take today, yet it still spread around all the known world). We just saw how that works in our times (and if you say it worked then, why aren't we still doing it today?). You can't put the genie back in the bottle and stop progress.

          >Repealing Section 230 will accomplish this.

          Yes, good thing people didn't decide back then to charge the actual printing houses with the lies present in the newspapers they printed.

          • nextaccountic 14 hours ago ago

            Social media is also a bottleneck. In places like India, Facebook will comply with censorship or they will get blocked

        • petermcneeley a day ago ago

          > We should return to the old way, it wasn't perfect, but it worked for 100s of years

          At this stage you are clearly just trolling. Are you even aware of the last 100s of years? From Luther to Marx? You are not acting in good faith. I want nothing to do with your ahistorical worldview.

        • mensetmanusman a day ago ago

          There is no way to go back to this. It’s about as feasible as getting rid of vehicles.

          • breadwinner a day ago ago

            I am not saying we should go back to physical newspapers printed on paper. News can be published online... but whoever is publishing it has to stand behind it, and be prepared to face lawsuits from citizens harmed by false stories. This is feasible, and it is the only solution to the current mess.

            • nradov a day ago ago

              It's horrifying that anyone would believe that censorship and control over news would be a solution to anything. The naivety of your comment is in itself an indictment of our collective failure to properly educate the polity in civics.

            • knome a day ago ago

              A determined instigator could easily continue pushing modern yellow journalism with little problem under the system you propose.

              They simply need to choose which negative stories they print, which opinions they run. How do you frame misrepresentation vs. a differing point of view? How do you call out mere emphasis in which true stories are run? Truths are still truths, right?

              It's not infrequent today to see political opinions washed through language to provide plausible deniability to those using it.

              Hell, it's not infrequent to see racism, bigotry and hate wrapped up to avoid the key phrases of yesteryear, instead smuggling their foulness through carefully considered phrases, used specifically to shield those repeating them from being called out.

              'No no no. Of course it doesn't mean _that_, you're imagining things and making false accusations.'

      • EB-Barrington a day ago ago

        [dead]

      • King-Aaron a day ago ago

        I can think of another hot-potato country that will get posts nerfed from HN and many others

    • Slava_Propanei a day ago ago

      [dead]

  • whycome a day ago ago

    What exactly constituted a violation of a COVID policy?

    • PaulKeeble a day ago ago

      A lot of channels had to avoid even saying the word Covid. I only saw it return to use at the end of last year. A variety of channels were banned that shouldn't have been, such as some talking about Long Covid.

      • doom2 a day ago ago

        Now you see channels avoiding saying "Gaza" or "genocide". I haven't seen proof that platforms are censoring content related to Israel, but I wouldn't be surprised.

    • perihelions a day ago ago

      According to Google's censorship algorithm, it was Michael Osterholm's podcast (Osterholm is a famous epidemiologist who was, at the time, a member of President Biden's own gold-star covid-19 advisory panel).

      https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))

      Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.

      • delichon a day ago ago

        My wake up moment was when they not only took down a Covid debate with a very well qualified virologist, but also removed references to it in the Google search index, not just for the YouTube link.

        • miltonlost a day ago ago

          [flagged]

          • delichon a day ago ago

            I am not comfortable letting Google make that decision for me. You are?

            • theossuary a day ago ago

              We lost that choice when google became a monopoly.

              What I'm not comfortable with is preventing a private company from moderating their product.

              • janalsncm a day ago ago

                Your line of reasoning is mixing "is" and "ought". The whole thread is about what ought to be. I doubt most people want Google to be a monopoly.

        • barbacoa a day ago ago

          Google went so far as to scan people's private google drives for copies of the documentary 'plandemic' and delete them.

    • potsandpans a day ago ago

      Saying lab leak was true

    • a day ago ago
      [deleted]
    • carlosjobim a day ago ago

      Every opinion different from the opinion of "authorities". They documented it here:

      https://blog.youtube/news-and-events/managing-harmful-vaccin...

      From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.

      • miltonlost a day ago ago

        [flagged]

        • someuser2345 a day ago ago

          > content that falsely alleges that approved vaccines are dangerous and cause chronic health effects

          The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.

          > claims that vaccines do not reduce transmission or contraction of disease

          Isn't that true of the covid vaccines? Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely, but later on, they changed the goal posts to "it will reduce your symptoms of covid".

          • joecool1029 a day ago ago

            > The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.

            That's not what happened. Authorities received rare reports of a clotting disorder and paused it for 11 days to investigate. That pause was lifted but the panic caused a crash in demand and J&J withdrew it from the market. Source: https://arstechnica.com/health/2023/06/j-fda-revokes-authori...

            • WillPostForFood a day ago ago

              It seems like you are implying that the pause was lifted because they found nothing. That's not quite right. The J&J vaccine killed 9 people, and the FDA issued restrictions on who could get it, limitations on who should get it, and warnings about the side effects.

              https://www.fda.gov/media/146304/download

              • joecool1029 a day ago ago

                > It seems like you are implying that the pause was lifted because they found nothing. That's not quite right.

                I am not and my source covers this.

                • WillPostForFood a day ago ago

                  Your source doesn't mention the nine deaths or the blood clotting side effect. It doesn't convey that there were legitimate, validated reasons for the pause and downgrade. "Rare reports" and "Nine Deaths" reads differently.

          • 2muchcoffeeman a day ago ago

            This highlights what’s so difficult with science communication.

            Right here, on what should be a technically minded forum, people don't understand what science is or how it works, or what risk is. And they don't even challenge their own beliefs or show curiosity about how things actually work.

            If the “smart” people can’t or won’t continuously incorporate new information, what are our chances?

          • teamonkey a day ago ago

            > Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely

            Some people don’t understand how vaccines work, so may have claimed that, but efficacy rates were very clearly communicated. Anyone who listened in high school biology should know that’s not how they work.

        • roenxi a day ago ago

          That policy catches and bans any scientist studying the negative health effects of vaccines who later turns out to be right.

          1) YouTube doesn't know what is true. They will be relying on the sort of people they would ban to work out when the consensus is wrong. If I watched a YouTube video of someone spreading "vaccine misinformation" all the way through, there is a pretty good chance the speakers have relevant PhDs or are from the medical profession - there is no way the YouTube censors are more qualified than that, and the odds are they're just random unqualified employees already working in the euphemistically named "Trust & Safety" team.

          2) All vaccines are dangerous and can cause chronic health effects. That statement isn't controversial, the controversy is entirely over the magnitude. Every time I get a vaccine the standard advice is "you should probably hang around here for 5 minutes, these things are known to be dangerous in rare cases". I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.

          > This would include content that falsely says that approved vaccines cause ... cancer ...

          Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.

          3) All policies have costs and benefits. People have to be able to discuss the overall cost-benefit of a policy in YouTube videos even if they get one of the costs or benefits completely wrong.

          • gus_massa a day ago ago

            > I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.

            In https://en.wikipedia.org/wiki/Polio_eradication#2025 I count 2 countries with the wild type and 17 with the vaccine-derived type (and like 160 without polio!)

            There are two vaccines: the oral one, which has attenuated ("live") virus, and the injectable one, which has inactivated ("dead") virus.

            * The oral version is not dangerous for the person who receives it [1], but the virus can pass to other people and, after a few hops, mutate into the dangerous version. The advantage is that the immunity is stronger and it also stops transmission.

            * The injectable version is also not dangerous [1]; it doesn't stop transmission, but it also can't mutate because the virus is totally dead.

            Most first-world countries, and many other countries with no recent cases, use only the injectable version. (Here in Argentina, we switched to injectable-only about 5 years ago :) .)

            Countries with recent cases or other problems use a mix, to reduce transmission. (I think the injectable one is also cheaper and easier to store.) (Also, a few years ago one of the strains was dropped globally from the oral vaccine, because that strain is eradicated. The injectable one keeps that strain just in case, but it can't escape.)

            [1] except potential allergic reactions, which are rare, but I also remember big signs with instructions for the nurse explaining what to do in case of an emergency, what to inject, whom to call ... The risk is not 0, but very low. I wonder if the trip to the hospital to get the vaccine is more dangerous.

          • handoflixue a day ago ago

            > Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.

            I'm reminded of the Prop 65 signs everywhere in California warning "this might cause cancer"

        • TeeMassive a day ago ago

          > This seems like good banning to me. Anti-vaxxer propaganda isn't forbidden thoughts. It's bad science and lies and killing people.

          Any subject important enough in any public forum is potentially going to have wrong opinions that are going to cause harm. While some people could be wrong, and could cause harm, the state itself being wrong is far more dangerous, especially with no dissident voices there to correct its course.

          Edit: I see you're getting downvoted for simply stating your honest opinion. But as a matter of principle I'm going to upvote you.

        • mapontosevenths a day ago ago

          [flagged]

          • rpiguy a day ago ago

            People have the right to believe things that could get them killed and the right to share their beliefs with others.

            Allowing the debate to be shut down is undemocratic and unscientific (science without question is nothing more than religion).

            Not allowing people to come to different conclusions from the same data is tyranny.

            • mapontosevenths a day ago ago

              > People have the right to believe things that could get them killed and the right to share their beliefs with others.

              You're allowed to believe whatever you like. Selling horse paste and 5g shields to mental defectives on the internet and getting THEM killed is wrong.

            • a day ago ago
              [deleted]
          • Bender a day ago ago

            Pfizer hid a lot of the damage done, as did the others. A lot of people can die by the time books come out. [1] That's one of the many reasons I held off, and I'm glad I did.

            [1] - https://www.amazon.com/Pfizer-Papers-Pfizers-Against-Humanit...

            • mapontosevenths a day ago ago

               Not every book that gets published is accurate, especially print-on-demand Amazon books with forewords by men like Bannon.

               You know how many excess deaths there have been among the vaccinated? Now compare that to the unvaccinated for the same period. Make the same comparison with disability if you'd like.

              That's all the evidence you need.

          • immibis a day ago ago

            Shouting "fire" in a crowded theater being illegal was used to make it illegal to oppose the draft (Schenck v. United States). So actually, since opposing the draft is legal, shouting "fire" in a crowded theater is legal too.

            • mapontosevenths a day ago ago

              You would be charged with inciting a riot, reckless homicide, etc regardless of the actual words you shouted to cause the deaths, but I see your point.

            • a day ago ago
              [deleted]
            • trollbridge a day ago ago

              Yep, and that's what Brandenburg v. Ohio enshrined.

              • mapontosevenths 7 hours ago ago

                This is inaccurate. Brandenburg v. Ohio explicitly states the opposite.

                Justice Douglas specifically talks about the "fire in a crowded theater" issue being one of the few types of language that is specifically illegal beginning on the bottom of page 456 below. It's a PDF file of the original decision.

                "The example usually given by those who would punish speech is the case of one who falsely shouts fire in a crowded theatre.

                "This is, however, a classic case where speech is brigaded with action. See Speiser v. Randall, 357 U. S. 513, 536-537 (DOUGLAS, J., concurring). They are indeed inseparable and a prosecution can be launched for the overt acts actually caused. Apart from rare instances of that kind, speech is, I think, immune from prosecution."

                https://tile.loc.gov/storage-services/service/ll/usrep/usrep...

            • jjk166 a day ago ago

              That's quite the legal theory.

            • pessimizer a day ago ago

              "Shouting 'fire' in a crowded theater" being used as an excuse for censorship is the surest way to know you are talking to someone who hasn't even started doing the reading. Even worse, they often (over the past very few years) self-identify as socialists or anti-war, and the decision was in order to prosecute anti-war socialists for passing out pamphlets.

              If somebody says it, they not only don't care about free speech, they don't even care about having a good faith conversation about free speech. They've probably been told this before, and didn't bother to look it up, just repeated it again. Wasting good people's time.

              edit: here's a copy of fire in a crowded theater, https://postimg.cc/gallery/q4PJnPh

              • mapontosevenths 7 hours ago ago

                See my other comments here. You are wrong.

                The current law is dictated by the Brandenburg v. Ohio decision, and it's explicit that shouting fire in a crowded theater is very much one of the only kinds of speech that IS restricted. They literally use that example in the decision.

                So.... I think you might be the one who didn't do the research. Thank you for attending my TED talk.

              • mapontosevenths a day ago ago

                Brother, I'm on the spectrum, so it's possible I'm the one missing the point here, but I think this time it's the other way around.

                To me and most folks that I know, it's a figure of speech, not a reference to the actual 1919 Supreme Court case.

                • dragonwriter a day ago ago

                  > To me and most folks that I know its a figure of speech, not a reference to the actual 1919 supreme court case.

                  A figure of speech meaning what? Most people who use it, AFAICT, use it as a widely perceived authoritative example of something specific that is accepted to be outside the protection of free speech, a use that derives from the Schenck v. U.S. decision (it is sometimes explicitly described as something the Supreme Court has declared outside the protection of the 1st Amendment, which clearly derives from that origin).

                  Of course, reliance on it for that purpose has problems because (1) it was dicta, not part of the ruling, in Schenck, and (2) Schenck is a notoriously bad decision impinging on core political speech in its specific application, and whose general rule is also no longer valid.

                  I have no idea what it would communicate as “a figure of speech”, and if it is actually used by some people as a figure of speech meaning something other than what it literally says being an example of unprotected speech (including, though I can see how this use would have some logic, as a figure of speech meaning “a persistently popular, despite being notoriously wrong, understanding of a legal rule”) it is one that impedes rather than promotes communication.

                  • mapontosevenths 7 hours ago ago

                    > it is one that impedes rather than promotes communication.

                    Maybe it does, but the simple truth is that the vast majority of people who use the phrase have never even heard of Oliver Wendell Holmes Jr. or his fabulous mustache and probably have no idea that it ever even went to the supreme court.

                    > A figure of speech meaning what?

                    People typically use it as a stand-in for any language that might be dangerous enough to others to be curtailed, i.e., inciting a riot or instigating a stampede that gets others killed. To use the legal phrasing from Brandenburg v. Ohio, they use it to describe language that would incite "imminent lawless action", which WOULD still be curtailed under current law.

                    > Schenck is a notoriously bad decision impinging on core political speech in its specific application, and whose general rule is also no longer valid.

                    Schenck was mostly overturned by Brandenburg v. Ohio, and in that decision Justice Douglas actually specifically talks about falsely shouting fire in a crowded theater as "probably the only sort of case in which a person could be prosecuted for speech."

                    So Schenck or no Schenck... it's still illegal.

                    See the bottom of page 456 (PDF Warning): https://tile.loc.gov/storage-services/service/ll/usrep/usrep...

    • jimt1234 a day ago ago

      [flagged]

    • zobzu a day ago ago

      [flagged]

  • woeirua a day ago ago

    It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.

    The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.

    • CobrastanJorji a day ago ago

      Yeah, there are two main things here that are being conflated.

      First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.

      Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.

    • asadotzler a day ago ago

      Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?

      These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm is a sign of how far off course this entire discourse has moved.

    • stronglikedan a day ago ago

      The problem is that misinformation has now become information, and vice versa; so who is anyone to decide what was misinformation back then, or now, or ever?

      I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.

      • 3cKU a day ago ago

        [dead]

    • kypro a day ago ago

      I've argued this before, but the algorithms are not the core problem here.

      For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.

      My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.

      So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally, I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very significant minority.

      • woeirua a day ago ago

        I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search absolutely do matter.

    • theossuary a day ago ago

      The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.

      • hsbauauvhabzb a day ago ago

        Algorithms that reverse the damage by providing opposing opinions could be implemented.

        • amanaplanacanal a day ago ago

          Why would Google ever do that? People are likely to leave YouTube for some other entertainment, and then they won't see more ads.

          • hsbauauvhabzb 17 hours ago ago

            I agree. My point was that it is possible. Google would never do it without being forced.

      • squigz a day ago ago

        I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.

        • int_19h a day ago ago

          If anything, these people see the removal of their "favorite" videos as validation - if a video is removed, it must be because it was especially truthful and THEY didn't like that...

        • theossuary a day ago ago

          It's actually been shown many times that deplatforming significantly reduces the number of followers an influencer has. Many watch out of habit or convenience, but won't follow when they move to a platform with less moderation.

    • terminalshort a day ago ago

      The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.

      • woeirua a day ago ago

        Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!

        • terminalshort 21 hours ago ago

          Why should Youtube try to tell me what it thinks I should want to watch instead of what I actually want to watch? I'm not particularly interested in their opinion on that matter.

          • woeirua 12 hours ago ago

            Because search fundamentally requires curation. An algorithm has to "decide" which videos are most relevant to you. Otherwise, you'd be flooded with irrelevant results every time you make a search query.

      • TremendousJudge a day ago ago

        "what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?

        • terminalshort 21 hours ago ago

          Given the number of people that describes, it's pretty clear that people do want that. It's not exactly new or surprising that people want things that are bad for them.

  • throwmeaway222 3 hours ago ago

    I'm shocked at how often people flip-flop their arguments when discussing private entities censoring speech. It's frustrating because it feels like the only speech allowed today is right-wing commentary. When Democrats were in power, it seemed like only left-wing commentary was permitted. It's baffling that, despite our education, we're missing the point and stuck in this polarized mess.

  • ggm a day ago ago

    Without overdoing it, as a non-American not resident in the USA, it is very tempting to say "a problem of your making" - but in truth, we all have a slice of this, because the tendency to conduct state policy via mistruths in the media is all-pervasive.

    So yes. This is a problem rooted in the USA. But it is still a problem, and it's a problem for everyone, everywhere, all the time.

  • petermcneeley a day ago ago

    Arguing online about the merits of free speech is as paradoxical as having discussions about free will.

    • thrance 12 hours ago ago

      I think you have a shallow understanding of both free speech and free will if you think this is the gotcha you seem to think it is. Why couldn't people have discussions about free will in a deterministic universe? They could be woven by the laws of physics into having them.

      As for free speech online, do you think there should be no limit to what can be said or shared online? What about pedophilia or cannibalism? Or, more relevantly, what about election denialism, insurrectionism, or dangerous health disinformation that is bound to make people act dangerously for themselves and society as a whole? The point is, free speech is never absolute, and where the line is drawn is an important conversation that must be had. There is no easy, objective solution to it.

  • st-keller 20 hours ago ago

    More speech! The signal-to-noise ratio shifts, so access to information will become more difficult. More disinformation and outright nonsense will make it harder to get to the valuable stuff. OK, let's see how that works!

  • frollogaston a day ago ago

    Still curious if the White House made them pin those vaccine videos on the homepage, then disable dislikes.

  • rustystump a day ago ago

    The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people will be caught up in it while doing nothing wrong.

    This is why perma bans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright situation is anything to go by, this is going to hurt more than help.
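    The escalation I have in mind could be sketched roughly like this (all names and thresholds here are hypothetical, just to show the warn-then-temp-ban shape rather than any platform's actual policy):

    ```python
    from dataclasses import dataclass

    @dataclass
    class ModerationRecord:
        strikes: int = 0
        banned_until: int | None = None  # Unix timestamp; None means not banned

    STRIKE_LIMIT = 3                  # hypothetical: strikes before a temp ban
    TEMP_BAN_SECONDS = 7 * 24 * 3600  # hypothetical: one-week cooldown

    def apply_violation(record: ModerationRecord, now: int) -> str:
        """Escalate gradually: warn on early strikes, temp-ban at the limit,
        then reset strikes so people can recover from bad automation."""
        record.strikes += 1
        if record.strikes < STRIKE_LIMIT:
            return f"strike {record.strikes}/{STRIKE_LIMIT}: warning only"
        record.banned_until = now + TEMP_BAN_SECONDS
        record.strikes = 0
        return f"temp ban until {record.banned_until}"
    ```

    The point of the reset after a served ban is exactly the breathing room above: a wrongly flagged account pays a bounded cost instead of being permanently destroyed by a false positive.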

  • pessimizer a day ago ago
  • bluedino a day ago ago

    I'm banned from posting in a couple subreddits for not aligning with the COVID views of the moderators. Lame.

    • c-hendricks a day ago ago

      Whenever someone says "I was banned from ...", take what they say with a huge grain of salt.

      • int_19h a day ago ago

        On Reddit, you can get banned from some subreddits simply because you have posted in another completely different sub (regardless of the content of the post).

        It's not even always politics, although that's certainly a major driving force. But then you have really stupid fights like two subs about the same topic banning each others' members.

      • pinkmuffinere a day ago ago

        Everybody here is strangers online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.

        • c-hendricks a day ago ago

          Oof, I'm outside my edit window and didn't make my correct point. It's when people say "I was banned from _____ for _____". When people say "for _____" I take their word with a huge grain of salt.

          Not even much to do with Reddit, it's something I picked up from playing video games: https://speculosity.wordpress.com/2014/07/28/the-lyte-smite/

          • pinkmuffinere a day ago ago

            Ah I see, you’re saying it’s very hard/impossible to verify the reason for the ban, so the given reason is especially low-signal. That actually does make sense to me, thanks for clarifying

      • qingcharles a day ago ago

        The problem (?) with Reddit is that the users themselves have a lot more control over bans than on other social media, where it is the platform itself that does the banning. This makes bans much more arbitrary, even more so than on Facebook et al.

      • EasyMark a day ago ago

        I was banned because I was simply in a covid sub debating with the covid-deniers. The "powers-that-be" mods literally banned anyone on that particular sub from popular subs, some of which I hadn't even been in, ever. There was (is?) a cabal of mods on there that run the most popular subs like pics/memes/etc that definitely are power hungry basement dwellers that must not have a life.

      • mvdtnz a day ago ago

        Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.

      • incomingpain 14 hours ago ago

        That's the funny thing about Reddit. You can get banned trivially on the whim of a mod. I've been banned from multiple subreddits that I've never been to, simply because I posted on another subreddit that the mod found detestable.

        My favourite: I'm trans/autistic. I was posting on r/autism being helpful. OP never mentioned their pronouns, just that they have an OB-GYN and feminine problems. I replied being helpful, but I misgendered them and they flipped out. They permabanned me from r/asktransgender, even though I had never posted there, then left me a pretty hateful reply on r/autism. Reddit admins gave me a warning for hate toward trans people, despite me never doing any such thing and being trans myself.

        Right around the same time, r/askreddit had a thread about it being hard not to misgender trans people. So I linked this thread, with an imgur of the Reddit admin warning. It went to like 30,000 upvotes. The r/autism mods had to reply saying they didn't see any hate in my post and that people should stop reporting it.

      • Loocid a day ago ago

        Eh, I was banned from several major subreddits for simply posting in a conservative subreddit, even though my post was against the conservative sentiment.

        • c-hendricks a day ago ago

          Same, happened to me after replying to a comment in the JRE sub, I think I was calling something / someone dumb. Coincidentally, that sub is openly against him now.

          Tried clarifying this in another comment, my point was more that people who say "I was banned from X for doing something innocuous" are often not telling the whole truth.

          • tbrownaw a day ago ago

            > my point was more that people who say "I was banned from X for doing something innocuous" are often not telling the whole truth.

            ... Except when the X in question is Reddit.

      • alex1138 a day ago ago

        Stop excusing it. It's a very real, very serious problem with Reddit. They're very much abusive on this and many other topics

        • frollogaston a day ago ago

          The answer is to leave Reddit and let them have their echo chamber. There's no point in posting there anyway.

    • croes a day ago ago

      I was banned because a moderator misunderstood my single word answer to another post.

      Reddit bans aren't an indicator of anything.

  • bromuro a day ago ago

    YouTube is like old-school television at a different scale: they have to answer to politics and society. Our videos are their lineup.

  • cavisne a day ago ago

    They should bring back the content too. When the history books are written, the current state of things will be misleading.

  • keeda a day ago ago

    In other news (unrelated, I'm sure):

    "DOJ aims to break up Google’s ad business as antitrust case resumes"

    https://arstechnica.com/gadgets/2025/09/google-back-in-court...

  • EasyMark a day ago ago

    I'm not sure why they would, it's kind of a dumb move. They aren't violating anyone's freedom of speech by banning disinformation and lies. It's a public service, those people can head on over to one of the many outlets for that stuff. This is definitely a black mark on YouTube.

  • flohofwoe 19 hours ago ago

    Even more misinformation, Russian propaganda and bots to sift through in the recommendations and comments, got it!

  • a day ago ago
    [deleted]
  • lupusreal a day ago ago

    Prediction, nobody will be unbanned because they'll all be found to have committed other bannable offenses. Youtube gives Trump a fake win while actually doing nothing.

  • ironman1478 a day ago ago

    There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.

    The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.

    • asadotzler a day ago ago

      No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.

      • ironman1478 a day ago ago

        In the article it mentions that Google felt pressured by the government to take the content down. Implying that they wouldn't have if it wasn't for the government. I wasn't accusing Google of anything, but rather the government.

        Maybe it's not banning, but it doesn't feel right. Google shouldn't have been forced to do that, and really what should've happened is that the people who spread genuinely harmful disinformation, like injecting bleach, the ivermectin stuff, or the anti-vax stuff, should've faced legal punishment.

    • alex1138 a day ago ago

      Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong, all those that we banned off platforms (the actual experts) were right

    • reop2whiskey a day ago ago

      What if the government is the source of misinformation?

      • ironman1478 a day ago ago

        It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.

        I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.

        We have mechanisms for combatting the government through lawsuits. If the government came out with lies that actively harm people, I hope lawsuits come through or you know... people organize and vote for people who represent their interests.

      • EasyMark a day ago ago

        It certainly happens; we're currently flooded with it from the current regime:

        - Tylenol causes autism

        - Vaccines cause autism

        - Vaccines explode kids hearts

        - Climate change is a hoax by Big Green

        - "Windmill Farms" are more dangerous for the environment than coal

        - I could go on but I won't

  • jameslk a day ago ago

    Misinformation, disinformation, terrorism, cancel culture, think of the children, fake news, national security, support our troops, and on and on. These will be used to justify censorship. Those who support it today may find out it's used against them tomorrow.

  • incomingpain 15 hours ago ago

    Canada has a tyrannical style government that has been censoring speech. I had a discussion recently with a liberal who was arguing that it's a good thing the government is censoring the speech of their political opponents. That free speech comes with consequences.

    My argument, free speech is a limit on the government. Give them as much consequences you please but not with government power.

    That's the problem here: Democrats were using government power to censor their political opponents, but they wouldn't have been able to do it without government power.

  • serf a day ago ago

    i'd like to think that if I were a YTer that got banned for saying something that I believed in that I would at least have the dignity not to take my value back to the group that squelched me.

    ..but i'm not a yter.

    • TeMPOraL a day ago ago

      It's showbiz. For those making actual money there, sacrificing dignity is the price of entry.

  • saubeidl a day ago ago

    The world is going backwards rapidly. The worst people are once again welcomed into our now-crumbling society.

  • dev1ycan a day ago ago

    Social media and a lack of scientific-research literacy are eventually going to prove fatal for modern society. Even with this Tylenol thing, on one side I have people who believe a study blindly without reading that it doesn't take into consideration several important variables and that more studies are needed; on the other, I have people who didn't read the study at all, saying it's impossible Tylenol could be causing anything because it is the only pain med pregnant women can take... a clear lack of understanding of how controlled trials work.

    Same thing with the UFO "alien" video that was "shot down" by a Hellfire missile (most likely a balloon): people just automatically assume that because it was said in Congress it has to be true, with zero analysis of the footage or any desire to seek out an expert. Nope, it must be an alien.

    There is so much misinformation, so much lack of understanding, and so many people, from every side that just have complete and utter lack of understanding of how seemingly basic things work, I am afraid for the future.

    But yeah, let's unban unscientific sources, oh, and people who are okay with a literal coup against a democracy.

  • TwoNineFive a day ago ago

    They have a desperate need for false-victimhood.

    Without their claim to victimization, they can't justify their hatred.

  • guelo a day ago ago

    The amount of flagged, hidden comments here by the supposed anti-censorship side is almost funny.

    • dang a day ago ago

      If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.

      On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.

      (oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)

      https://news.ycombinator.com/newsguidelines.html

      • croes a day ago ago

        Flagging isn't the worst that can happen; you can also be rate limited, which prevents you from answering in a discussion because of "you are posting too fast".

        I know what I'm talking about.

        • dang a day ago ago

          Yes, when accounts have a pattern of posting too many unsubstantive and/or flamewar comments, we sometimes rate limit them.

          We're happy to take the rate limit off once the account has built up a track record of using HN as intended.

          • croes 21 hours ago ago

            What exactly is meant by track record?

      • alex1138 a day ago ago

        Yeah, but in practice this isn't actually the case; people flag all the time just for a dissenting opinion, fitting none of the categories you mentioned.

        • dang a day ago ago

          As mentioned, I haven't seen cases of that in the current thread. If there are any, I'd appreciate links. We don't see everything.

          • michtzik 11 hours ago ago

            Looking at the flagged comments throughout this thread, perhaps https://news.ycombinator.com/item?id=45354759 is the closest to being flag-killed for a different opinion.

            • dang 8 hours ago ago

              That one is arguably borderline—on the one hand, it makes large statements about a divisive topic without adding much information, but on the other hand it isn't snarky, name-calling, etc. I've turned off the flags.

          • 3cKU a day ago ago

            [dead]

        • braiamp a day ago ago

          There's one comment literally spreading misinformation and it isn't flagged, but instead got pushback by others, critically pointing the weakness of their arguments.

        • 3cKU a day ago ago

          [dead]

  • valentinammm a day ago ago

    [dead]

  • a day ago ago
    [deleted]
  • EverydayBalloon 21 hours ago ago

    [dead]

  • cindyllm a day ago ago

    [dead]

  • cbradford a day ago ago

    So absolutely no one involved will have any repercussions. So they will all do it over again at the next opportunity

    • JumpCrisscross a day ago ago

      > they will all do it over again at the next opportunity

      Future tense?

    • asadotzler a day ago ago

      They are mega-corporations. They always do whatever the hell they want, certainly absent your input. Did you really believe they don't do whatever they want? Because that's pretty damned naive.

    • johnnyanmac a day ago ago

      yeah, 2025 in a nutshell. The year of letting all the grifts thrive.

    • lazyeye a day ago ago

      What should the punishment be for having opinions the govt disagrees with?

      • Supermancho a day ago ago

        Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.

        The next Drain-o chug challenge "accident" is inevitable, at this rate.

        • cubefox 13 hours ago ago

          What is considered "misinformation" depends on whatever the censoring authority in question (e.g. Facebook or YouTube or some news website) _believes_ to be misinformation.

          For example, in 2020, the WHO(!) Twitter account literally tweeted that masks don't work. That same statement would have been considered medical misinformation by a different authority.

          Another example: the theory that Covid leaked from a lab in Wuhan which was known to do gain of function experiments on coronaviruses was painted as a wacky conspiracy theory by most of the mainstream media, despite the fact that many respectable sources (e.g. the CIA) later concluded that it has a significant amount of plausibility versus the alternative Wuhan wet market hypothesis which required that the virus somehow arrived there from a bat cave more than a thousand kilometres away.

        • lazyeye a day ago ago

          That sounds great in theory. In practice, "misinformation" ends up being defined as anything the govt finds inconvenient. Or it is selectively applied so that when misinformation comes from all sides of the political spectrum, only people the govt doesnt like (in the more general sense) get kicked off platforms.

      • th0ma5 a day ago ago

        Notoriety

        • lazyeye a day ago ago

          Yep..and fame, admiration, contempt, loathing, indifference etc

  • oldpersonintx2 a day ago ago

    [dead]

    • a day ago ago
      [deleted]
  • jimt1234 a day ago ago

    [flagged]

  • apercu a day ago ago

    [flagged]

  • boxerab a day ago ago

    tl;dr The Biden Administration has been caught using the government to force Twitter, YouTube and Facebook to censor its political enemies.

    • EasyMark a day ago ago

      They never forced them, and they certainly never said "that's a nice merger you got there, it would be awful if something were to happen to it" per the current policies of the US government.

  • najarvg a day ago ago

    [flagged]

    • ch4s3 a day ago ago

      Far too many people are free speech hypocrites.

    • eschulz a day ago ago

      who doesn't get free speech?

      • jimt1234 a day ago ago

        [flagged]

        • a day ago ago
          [deleted]
        • cptnapalm a day ago ago

          [flagged]

        • SV_BubbleTime a day ago ago

          [flagged]

          • lesuorac a day ago ago

            [flagged]

            • SV_BubbleTime a day ago ago

              Nice edit where you cut out all the "progressive" stuff to make the same lie that the shooter is right-wing, because you want to do anything you can to avoid admitting that the extreme left even exists, let alone is capable of assassination.

              Yes, those are Kimmel words, and when he said them, everyone already knew the facts about his gay lifestyle and trans boyfriend and that he murdered Kirk out of hate. It’s nice to see that you picked up his torch regardless of how little rational sense it makes.

              >Somebody using violence for political means is exactly aligned with Charlie Kirk's spoken words.

              Cite it. In whole context, but video is preferable. The guy had a decade of being in front of the camera; post any of the hateful videos, it should be easy! Show us the hate, if not the justification for his own murder, of course.

      • apercu a day ago ago

        [flagged]

  • guelo a day ago ago

    [flagged]

    • paulryanrogers a day ago ago

      Steelman argument is it's better to know what liars, bigots, and other naughty people are up to than push them entirely underground. And someday future moderators may think you're naughty/lying/a quack/etc.

      IMO we should not let private platforms become near monopolies, and certainly not without regulation, since they become a defacto public square. But if we're going to let them eat the world, then hopefully they'll at least use good judgment and measures like de-ranking or even banning folks who encourage others to do harm. Making bans temporary is a safety valve in case of bad moderation.

      • immibis a day ago ago

        That steelman is still a pretty bad argument, though. I don't see why giving liars, bigots and other naughty people a megaphone is required in order to know what they're saying.

        • paulryanrogers a day ago ago

          Yeah, I personally still see a place for permanent bans. But I can see the other side.

        • a day ago ago
          [deleted]
        • jimmygrapes a day ago ago

          I suppose the argument there is that it's not necessarily a megaphone for the fella with 24 followers. The concern comes from when someone amasses a following through "acceptable" means and then pivots. Not sure how to balance that.

          • nickthegreek a day ago ago

            new algos will gladly give people with 24 followers millions of views if the content pushes the right metrics

        • brokencode a day ago ago

          Who gets to decide who’s naughty? One day it’s the Biden admin, and the next it’s the Trump admin. That’s the tough part about censorship.

          You can leave it up to companies, but what happens when Trump allies like Elon Musk and Larry Ellison buy up major platforms like Twitter and TikTok?

          Do we really trust those guys with that much power?

          • immibis 19 hours ago ago

            We can use objective metrics to tell if someone is a liar, a bigot, and a paid disinformant working for a foreign state.

            ... or we could have, 5-10 years ago. No matter what we call one of these people, no matter how true our findings are, if you accuse a bad person of a bad thing these days, they'll just accuse you of the same bad thing (no matter how little sense it makes) and they have so many stupid followers who will believe it, that being correct is no longer a winning strategy.

            • brokencode 10 hours ago ago

              That’s just the thing. Your definition of an “objective metric” can be totally different than somebody else’s.

              Biden’s definition of the objective truth is very different than Trump’s. It doesn’t matter what the actual truth is. Whoever’s in charge gets to decide what gets censored.

      • hash872 a day ago ago

        What is YouTube a "near monopoly" in? Online video? Do you have any idea how much video there is online that's not on YouTube? They don't meet the legal definition of a monopoly.

    • bawolff a day ago ago

      People change/make mistakes. Permanent bans are rarely a good idea.

      • ryandrake a day ago ago

        Earlier in 2025, the video game Fortnite announced[1] that they were giving cheaters with lifetime bans a "second chance" and let them return to the game. Lo and behold, cheating in the game spiked up this year and has returned as a huge ongoing problem. Turns out, the vast majority of the bans were probably correct, and when you let people back into something who were banned for doing X, they're going to immediately start doing X again once they're back in.

        1: https://www.fortnite.com/news/fortnite-anti-cheat-update-feb...

        • bawolff a day ago ago

          Both these things can be true.

          People deserve second chances every now and then. Many people squander their second chances. Some people don't.

      • stefantalpalaru a day ago ago

        [dead]

    • dotnet00 a day ago ago

      Admittedly, Google was very heavy handed with Covid censorship. Sure, there was a lot of genuine misinformation that maybe deserved it, but they also tended to catch a lot of actual qualified scientists engaging in scientific debate (say, arguing in favor of masks and the transmission through air theory in the early days) or even some discussion that wasn't opposing the official stances.

      Somewhat related, it's pretty insane how even to this day YouTubers have to avoid referring to by name a global multi-year situation that everyone who existed at the time went through. It's due to advertisers rather than government pressure, but still, insane.

      • andy99 a day ago ago

        Yeah, at the time I got the impression they were banning dissent, not just egregious or dangerous content (whatever that even means). I thought most places came to their senses a long time ago and walked back that heavy-handedness; I'm surprised this just happened.

      • layman51 a day ago ago

        Your point reminded me that around the time when the pandemic first started, I saw a YouTube video on physics titled something like "Corona and Arc Discharge" and it had the contextual note that is sometimes added to videos. I think the official name YouTube gives it is: "topical context in information panel". I thought it was a funny case where the automated system thought this physics video had something to do with COVID.

    • IncreasePosts a day ago ago

      Merriam Webster defines con man as "a person who tricks other people in order to get their money : con artist"

      Even if people were straight up wrong about their COVID-19 theories, I don't think many of the banned people were trying to get viewers to send them money.

      • mapontosevenths a day ago ago

        > trying to get viewers to send them money.

        They were trying to use viewers to get money. It's an important distinction.

      • heavyset_go a day ago ago

        We both know that ads and sponsorships are a significant way influencers monetize their viewers.

        All they have to do is lie to attract eyeballs and they make money. E-begging isn't necessary, the platforms allow you to extract value from viewers at an incredible scale.

  • jmyeet a day ago ago

    First, let's dispense with the idea that anybody is a free speech absolutist. Nobody is. No site is. Not even 4chan is (ie CSAM is against 4chan ToS and is policed).

    Second, some ideas just aren't worth distributing or debating. There's a refrain "there's no point debating a Nazi". What that means is there is a lot of lore involved with being a modern Nazi, a labyrinth of conspiracy theories. To effectively debate a Nazi means learning all that lore so you can dismantle it. There's no point. In reality, all you end up doing is platforming those ideas.

    I'm actually shocked at how ostensibly educated people fall into the anti-vax conspiracy trap. Covid definitely made this worse but it existed well before then. Certain schools in San Francisco had some of the lowest child vaccination rates in the country.

    As a reminder, the whole vaccine-autism "theory" originated from one person: Andrew Wakefield. He was a doctor in the UK who was trying to sell a vaccine. The MMR vaccine was a direct competitor, so he just completely made up the MMR link to autism. He lost his medical license because of it. But of course he found a receptive audience in the US. He is and always was a complete charlatan.

    Likewise, the Covid anti-vax movement was based on believing random YouTube videos from laymen and, in many cases, an intentional ignorance in the most esteemed traditions of American anti-intellectualism. People who are confidently wrong about provably wrong things and had no interest in educating themselves. Some were grifters. Some were stupid. Many were both.

    We had people who didn't understand what VAERS was (and is). We had more than 10 million people die of Covid, yet people considered the vaccine "dangerous" without any evidence of side effects, let alone death. As one example, you had people yelling "J'accuse!" at hints of myocardial inflammation from the vaccine. But you know what else causes myocardial inflammation? Getting Covid.

    If you're excited by this move, it just further highlights that you have no idea what's going on and zero interest in the truth. What's happening here is big tech companies capitulating to the fringe political views of the administration, a clear First Amendment violation, to curry favor, get their mergers approved, get government contracts, and so on.

    Regardless of your views on this or any other issue, you should care about capitulation by social media sites in this way.

    The comments on this post are just a graveyard of sadness.

    • int_19h a day ago ago

      The problem with those ideas that "just aren't worth debating" is the usual one: who decides?

      In my country of origin, you get called a Nazi simply for being opposed to the war of aggression it is currently engaged in. In the US, we have a long history of "terrorist" and "extremist" being similarly abused.

      • jmyeet a day ago ago

        Do you think it's a good idea that this administration gets to decide what is and isn't acceptable speech? That's one of my points. So regardless of your positions on Covid and the 2020 election, you shouldn't celebrate this move, because the government shouldn't have this kind of influence.

        • int_19h 21 hours ago ago

          Oh, absolutely, I don't think this move by Google has anything to do with them being some kind of staunch free speech supporters. It's an obvious and rather pathetic attempt to suck up to the Trump administration, which itself is cancer when it comes to rights and freedoms. I'm no COVID denialist either.

          I just don't think that "there's no point debating a Nazi" is, in general, a good argument in favor of censorship, whether public or private. It's one of those things that have a good ring to it and make some superficial sense, like "fire in the crowded theater", and then you look at how it works in the real world...

  • alex1138 a day ago ago

    So the other day, I linked to something on Rumble right here on Hacker News and was told to find a better source

    First of all, you can't separate a thing's content from the platform it's hosted on? Really?

    Second of all, this is why

    I'll just go do this again and if you flag me it's on you, you have no standing to do it (the internet is supposed to be democratic, remember?)

    https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...

    https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...

    https://rumble.com/vt62y6-covid-19-a-second-opinion.html

    https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...

    I could go on. Feel free if you want to see more. :)

    (Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)

    • braiamp a day ago ago

      The reason why you are asked for a better source is because, and let me say this slowly, anyone can post any crap on the internet without repercussions. Let's start with the one that references "Sasha Latypova". If I search her credentials, she earned a Master of Business Administration, which she used to co-found two companies, neither of them even adjacent to pharmacology, yet she is a "global PHARMA regulation expert". I'm sure the other people there won't have those issues, right?

      • The_President 16 hours ago ago

        “And let me say this slowly”: no point in typing this out; it is condescending to the parent poster.

        • 1121redblackgo 12 hours ago ago

          Equal and opposite reactions: if the parent poster is falsely that confident, then it's fair to meet them with strong condescension from the other side, is it not?

    • 1121redblackgo a day ago ago

      Boo

  • pcdoodle a day ago ago

    So great to see the censorship apparatus in full swing on HN. Lots of great comments into the dust bin.

  • moomoo11 a day ago ago

    I think hardware and IP-level bans should themselves be banned.

    I know that some services do this in addition to account ban.

    • ocdtrekkie a day ago ago

      Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.

      • jjk166 a day ago ago

        If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and give them the opportunity to hide. Removing only the offending content and alerting authorities is the appropriate thing a website like Youtube should be doing.

        Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.

        • JumpCrisscross a day ago ago

          > If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life

          We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.

          • jjk166 a day ago ago

            Well work on making those resources available instead of, again, informing CSAM creators how to better hide their activities. I fail to see how repeatedly removing CSAM from a single IP address is more of a boon to CSAM distributors than playing whackamole with multiple IP addresses. Wasting law enforcement resources on other things while CSAM producers are free to operate is a separate, and in my opinion much more pressing issue.

            • JumpCrisscross a day ago ago

              > informing CSAM creators how to better hide their activities

              This adds to their risks and costs. That tips the economic balance at the margin. Actually going after all creators would require an international law-enforcement effort for which, frankly, there isn't political capital.

              • jjk166 a day ago ago

                > This adds to their risks and costs. That tips the economic balance at the margin.

                Charging would-be bank robbers a fee to do practice runs of breaking into a vault adds to their costs; somehow that doesn't seem like an effective security measure.

                > Actually going after all creators would require an international law-enforcement effort for which, frankly, there isn't political capital.

                I'm not talking about going after all creators, just the ones you have the identifying information for which are so continuously pumping out such quantities of CSAM that it is impossible to stop the firehose by removing the content.

                If you don't have the political capital to go after them, again you have bigger issues to deal with.

                • JumpCrisscross a day ago ago

                  > somehow that doesn't seem like an effective security measure

                  …this is literally how we police bank theft. Most bank thieves are never caught because they can do it online from an unresponsive jurisdiction.

                  > just the ones you have the identifying information for

                  Sure. You’re still going to have a firehose of CSAM, and worse, newly-incentivised producers, if you turn off moderation.

            • squigz a day ago ago

              > Wasting law enforcement resources on other things while CSAM producers are free to operate is a separate

              It's been a long time since I had anything remotely to do with this (thankfully) but... I'm pretty sure there are lots of resources devoted to this, including the major (and even small) platforms working with various authorities to catch these people? Certainly to say they're "free to operate" requires some convincing.

              • jjk166 a day ago ago

                Pick a lane. Either we have the resources to go after CSAM producers, in which case we should be using them; or we don't, in which case we should be getting those resources. In either scenario, banning IPs is a counterproductive strategy to combat CSAM and it is a terrible justification for permitting IP bans.

                • JumpCrisscross a day ago ago

                  > Either we have the resources to go after CSAM producers, in which case we should be using them; or we don't, in which case we should be getting those resources

                  We don’t have the resources and we don’t want to divert them.

                  > banning IPs is a counterproductive strategy to combat CSAM and it is a terrible justification for permitting IP bans

                  The simple reason for banning Russian and Chinese IPs is the same as the reason I block texts from Vietnam. I don’t have any legitimate business there and they keep spamming me.

                • squigz a day ago ago

                  I'm not the one you were arguing with initially, I just wanted to address the idea that child abusers are just free to do whatever they want, and we're not doing anything about it.

        • ocdtrekkie a day ago ago

          Yes, we should let people "self-incriminate" with Tor and disposable email services...

          • jjk166 a day ago ago

            We're talking about websites like Youtube implementing hardware and IP bans. If your argument is that these are easily circumventable by CSAM distributors, that seems like all the more reason not to use them to combat CSAM.

  • reop2whiskey a day ago ago

    Is there any political censorship scheme at this large a scale in modern US history?

    • rimbo789 a day ago ago

      Yes: the way the US government, big business, and the media (specifically Hollywood) colluded during the Cold War.

  • rob74 20 hours ago ago

    > Google's move to reinstate previously banned channels comes just over a year after Meta CEO Mark Zuckerberg said [...] that the Biden administration had repeatedly pressured Meta in 2021 to remove content related to COVID-19. "I believe the government pressure was wrong, and I regret that we were not more outspoken about it," Zuckerberg wrote in the August 2024 letter.

    I'm sure Zuckerberg will say the same thing in 2029 too if the ruling party changes again. Until then, removing fact-checking and letting conspiracy theorists have their freedom of speech while suppressing voices critical of the current administration will make that change less likely...