• dubteedub@beehaw.org
    1 year ago

    Any writers still on Substack need to immediately look at alternative options and shift their audiences to other platforms. Sticking around on the site when the founder straight-up condones neo-Nazis, and not only gives them a platform but profit-shares with them and their Nazi subscribers, is insane.

  • alyaza [they/she]@beehaw.orgM
    1 year ago

    techno-libertarianism strikes again! it’s every few years with these guys where they have to learn the same lesson over again that letting the worst scum in politics make use of your website will just ensure all the cool people evaporate off your website–and Substack really does not have that many cool people or that good of a reputation to begin with.

  • AutoTL;DR@lemmings.worldB
    1 year ago

    🤖 I’m a bot that provides automatic summaries for articles:


    While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation.

    In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions.

    “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said.

    In a 2020 letter from Substack leaders, including Best and McKenzie, the company wrote, “We just disagree with those who would seek to tightly constrain the bounds of acceptable discourse.”

    The Atlantic also pointed out an episode of McKenzie’s podcast with a guest, Richard Hanania, who has published racist views under a pseudonym.

    McKenzie does, however, cite another Substack author who describes its approach to extremism as one that is “working the best.” What it’s being compared to, or by what measure, is left up to the reader’s interpretation.


    Saved 57% of original text.

  • ursakhiin@beehaw.org
    1 year ago

    Not gonna lie. I’ve never heard of Substack but I appreciate their stance of publicly announcing why I would continue to avoid them.

  • Omega_Haxors@lemmy.ml
    1 year ago

    Translation: “We support Nazis and would like to offer them passive protection. If you have a problem with them, we will ban you”

  • janguv@lemmy.dbzer0.com
    1 year ago

    There are a lot of empirical claims surrounding this topic, and I’m unaware who really has good evidence for them. The Substack guy, e.g., is claiming that banning or demonetising would not “solve the problem” – how do we really know? At the very least, you’d think that demonetising helps to some extent, because if it’s not profitable to spread certain racist ideas, that’s simply less of an incentive.

    On the other hand, plenty of people in this thread are suggesting it does help address the problem, pointing to Reddit and other cases – but I don’t think anyone really has a grip on the empirical relationship between banning/demonetising, shifting ideologues to darker corners of the internet, and what impact their ideas ultimately have. And you’d think the relationship wouldn’t be straightforward either – there might be some general patterns, but it could vary according to so many contingent and contextual factors.

  • katy ✨@lemmy.blahaj.zone
    1 year ago

    if you say nazi and white supremacist content is just a “different point of view”, you support nazi and white supremacist content. period.

    and it’s not surprising, given lulu meservey’s post on twitter during the whole situation with elon basically abandoning moderation:

    “Substack is hiring! If you’re a Twitter employee who’s considering resigning because you’re worried about Elon Musk pushing for less regulated speech… please do not come work here.”

    https://www.inverse.com/input/culture/substack-hiring-elon-musk-tweet

    • Drewski@lemmy.sdf.org
      1 year ago

      The problem is that some people are quick to call things Nazi and white supremacist, when it’s actually just something they disagree with.

      • Powerpoint@lemmy.ca
        1 year ago

        That’s not the problem at all. If you support fascists, then you support Nazis and white supremacy.

  • Plume (she/her)@beehaw.org
    1 year ago

    So they are complicit in it, plain and simple. If you are complicit with Nazis, to me, you’re a Nazi. I don’t give a shit. What’s the saying that the Germans have? There are six guys at a table in a bar and one of them is a Nazi, therefore there are six Nazis at the table? Yeah, that.

  • PotentiallyAnApricot@beehaw.org
    1 year ago

    I really struggle to take seriously what these tech people say about ‘not wanting to censor’. They made a business calculation, and maybe an ideological one, and decided “we want that nazi money, it’s worth it to us.” which really tells you everything about a company and how it is likely to approach other issues, too.

  • some_guy@lemmy.sdf.org
    1 year ago

    Reading about this at work the other day, I announced to my coworkers that Substack is officially bad. Profiting off of nazi propaganda is bad. Fuck Substack.

    I had recently subscribed to the RSS feed for The Friendly Atheist and was considering monetary support. They accept via Substack or Patreon. I would have opted for Patreon anyway, because that’s where I already have subscriptions. But after learning about this, I’ll never support anything, no matter what, via Substack. Eat my ass, shitheads.

  • frog 🐸@beehaw.org
    1 year ago

    It is true that removing and demonetising Nazi content wouldn’t make the problem of Nazis go away. It would just be moved to dark corners of the internet where the majority of people would never find it, and its presence on dodgy-looking websites combined with its absence on major platforms would contribute to a general sense that being a Nazi isn’t something that’s accepted in wider society. Even without entirely making the problem go away, the problem is substantially reduced when it isn’t normalised.

    • alyaza [they/she]@beehaw.orgM
      1 year ago

      the weirdest thing to me is these guys always ignore that banning the freaks worked on Reddit–which is stereotypically the most cringe techno-libertarian platform of the lot–without ruining the right to say goofy shit on the platform. they banned a bunch of the reactionary subs and, spoiler, issues with those communities have been much lessened since that happened, while still allowing people to say patently wild, unpopular shit

      • Auzy@beehaw.org
        1 year ago

        They took way too long unfortunately, but totally agree. thedonald, femaledatingstrategy and fatpeoplehate should have been banned a lot quicker

        It feels like they’ve let it degrade again now, too. Last time I was on it, lots of subs had gone really toxic and weird

      • frog 🐸@beehaw.org
        1 year ago

        Yep! Reddit is still pretty awful in many respects (and I only even bother with it for specific communities for which I haven’t found a suitable active equivalent on Lemmy - more frogs and bugs on Lemmy please), but it did get notably less unpleasant when the majority of the truly terrible subs were banned. So it does make a difference.

        I feel like “don’t let perfect be the enemy of good” is apt when it comes to reactionaries and fascists. Completely eliminating hateful ideologies would be perfect, but limiting their reach is still good, and saying “removing their content doesn’t make the problem go away” makes it sound like any effort to limit the harm they do is rendered meaningless because the outcome is merely good rather than perfect.

      • jarfil@beehaw.org
        1 year ago

        I’d argue that it still broke Reddit.

        Back in the day, I might say something out of tone in some subreddit, get the comment flagged, discuss it with a mod, and either agree to edit it or get it removed. No problem.

        Then Reddit started banning reactionary subs, subs started using bots to ban people for even commenting on other blacklisted subs, subs started abusing automod to ban people left and right, even quoting someone to criticize them started counting as using the same “forbidden words”, conversations with mods to clear stuff up pretty much disappeared, and applying the modern ToS retroactively to 10-year-old content became a thing… until I got permabanned from the whole site after trying to appeal a ban, with zero human interaction. Some months later, while already banned sitewide, they also banned me from some more subs.

        Recently Reddit revealed a “hidden karma” feature to let automod pre-moderate potentially disruptive users.

        Issues with the communities may have lessened, but there is definitely no longer the ability to say goofy, wild, or unpopular stuff… or in some cases, even to criticize them. There have also been an unknown number of “collateral damage” bans that Reddit doesn’t care about anymore.

          • jarfil@beehaw.org
            1 year ago

            The only time I got banned for bigoted stuff was precisely for quoting someone’s n-word and calling them out on it. Automod didn’t care about the context; no human did either. I also got banned for getting carried away and making a joke in a “no jokes” (zero tolerance) sub. Several years of following the rules didn’t grant me even a second chance. Then there was the funny time when someone made me a mod of a something-CCP sub, and automatically several other subs banned me.

            There is a lot more going on on Reddit than meets the eye, and they like to keep it out of sight.

            • Vodulas [they/them]@beehaw.org
              1 year ago

              The only time I got banned for bigoted stuff, was precisely for quoting someone’s n-word and calling them out on it. Automod didn’t care about the context, no human did either.

              It sounds like the right call was made (as long as both you and the OP were banned). As a white person, there is no reason for you to use the n-word. In that situation, simply changing it to “n-word” is the very least that could have been done.

              I’m not really sure how that provides an example of stuff going on in the background that someone wants to keep out of sight.

      • jasory@programming.dev
        1 year ago

        You’re literally on a platform that was created to harbor extremist groups. Look at who Dessalines (aka u/parentis-shotgun) is, and their self-proclaimed motivation for writing LemmyNet. When you ban people from a website, they just move to another place; they are not stupid, and it’s pretty easy to create websites. It’s purely optical: you’re not saving civilisation from harmful ideas, just preventing yourself from seeing them.

        • alyaza [they/she]@beehaw.orgM
          1 year ago

          When you ban people from a website, they just move to another place; they are not stupid, and it’s pretty easy to create websites. It’s purely optical.

          you are literally describing an event that induces the sort of entropy we’re talking about here. necessarily, when you ban a community of Nazis or something and they have to go somewhere else, not everybody moves to the next place (and those people diffuse back into the general population), which has a deradicalizing effect on them overall, because they’re no longer stewing in a cauldron of other people who reinforce their beliefs