I saw talk of them being federated with instances like “exploding heads” (literal neo-nazis) and “posting lolis rocks”, along with seeing interactions from users there. Is that true? idk how to check for that; I can only just barely stumble around the normal forum interface.
If so, why? They’ve been going hard on purging communists user-by-user and comment-by-comment, which seems like a secondary concern next to nazis and pedos, especially when something as simple as blocking those instances would be a start. What’s going on over there?
What I mentioned, if it’s correct, might not be exhaustive; I’m just repeating what I heard.
Edit: Correction, exploding heads are closer to being latter-day blue checks, complete garbage but not as pressingly bad as the other site.
Isn’t there a way for individual users to completely hide an instance from their feed? Because I would just do that if it bothers you.
I’m not saying that some posts/instances aren’t unethical, objectively incorrect, and/or dangerous, but widespread censorship is a slippery slope. If you want what you’re able to see to be heavily controlled, there are probably other instances with that purpose. An instance that describes itself as general-purpose and for everyone probably isn’t the place for that, though.
Edit: To be clear, I don’t have a problem with an instance blocking another instance. My problem is with an instance blocking another instance while implying that they aren’t.
I wouldn’t call blocking gore, nazis, and pedos “widespread”. For anyone who hasn’t heard the nazi bar story (source: @IamRageSparkle on Twitter), the short version: a bartender throws out a polite customer in nazi regalia on sight, because if you let one stay he brings friends, and before long it’s a nazi bar.
I think that’s a somewhat reasonable argument when such people are actually congregating in your space, and their presence necessarily impacts everyone else in that space.
In your example, telling those people to leave would necessarily take away what they believe to be their space and hinder group members’ ability to talk with each other. In the context of federated communities, blocking an instance (if and when such action becomes necessary) doesn’t do that, because they’ll still have both of those things.
In your example, all interactions would constantly be influenced by the presence of that group, and non-group-members would understandably associate that group with that space. In the context of a federated community, anyone can easily and instantly remove all of that group’s influence on them. Non-group-members wouldn’t make such associations, because that group’s influence on them would be minimal by default.
That’s not how Nazis and other disgusting groups operate, though. Once they’re part of a community, they start trying to get unsuspecting people interested in their content: they create communities in disguise, recruit whoever they can, and make sure anyone who feels like an outsider, like they don’t belong, feels welcome there, which in turn only strengthens those types of communities.
4chan used to be a pretty diverse community; while it was definitely always pretty intense and insensitive, it had a bit of everything. But after the 2016 American elections, /pol/ started spreading to other boards until the /pol/ folk became a very sizeable group everywhere else.
I think that happened during gamergate
Censoring literal, open Nazis and pedos is perfectly fine by me, thank you very much. I don’t think there’s any “slippery slope” involved other than the risks of allowing such people free rein in open spaces.
Keyword literal.
But nowadays people like to call anyone who doesn’t agree with them a Nazi, and call weebs who like high-school anime characters pedos.
So I agree with you if anything illegal is going on.
Are the Nazis advocating for violence and genocide, and the pedos distributing CP? Sure, block them and report them to the authorities.
But I don’t care about fictional characters that don’t even resemble real-life humans.
Hot take I know.
To be clear, the “jailbait” forums that Reddit tolerated for years were not officially hosting material that would be clearly illegal by US standards. If they had been, Reddit would have shut them down out of self-protection. Possessing outright child pornography is a strict-liability offense.
They were, however, posting “upskirt” photos showing children’s underwear. They were posting pictures of children at the beach in swimsuits. They were, in short, posting photos of actual children, where the photos had been selected for being sexually appealing to pedophiles.
Not drawings of fictional characters; real kids, selected for hotness to pedophiles.
I think that’s something about which site owners can very reasonably say, “Even if it is not a criminal offense, we think it’s wrong and we don’t want it on our site” without anyone needing to be concerned about censorship of ideas.
Nazism, by its very nature, advocates for violence and bigotry. There are a lot of abhorrent viewpoints that are despicable before they become illegal.
Fuck them and fuck their claims of wanting free speech. That’s not even something they actually believe in.
There are some absurd people on this board with bad political analysis, but you can see that I edited the OP. Hopefully I didn’t offend your identitarian tendencies or whatever that toe I stepped on was.
The name of the second instance, let me remind you, is “posting.lolicon.rocks”. It seems to be run by an aggressively racist (loves to use the hard-r gamer word) pedo. Anyone who cheers for something they themselves call lolicon is usually going to be some sort of 4chan-style sicko who probably pitched a fit at the Card Captor Sakura reboot having fewer upskirts.
Well I don’t care for lolicon but you are right with this one, literal nazi pedo LMAO.
When you get a chance could you do an AMA? I’ve never met a talking ostrich before.
Blocking and unblocking should be normal, expected, easily discoverable, and openly discussed. There are a lot of people on the Internet; a nonzero number of them are frothing assholes; and frothing assholes are quite capable of running servers.
The whole system we’re on here is still new and in rapid flux. Expect change. This isn’t Reddit with admins saying for years that hosting /r/jailbait is essential to free speech. It takes time to develop agreeable responses to kinds of trouble this system hasn’t yet seen.
smh this is just the kind of NSFL content we’re talking about
The thread was already tagged :)
hah touché!
Giant images don’t really contribute to discussions. :(
I liked them, but I think it requires more context than can be assumed. The first picture is titled “frothing fash” and the second one is an illustration of a frozen peach (“freeze peach” -> “free speech”).
The problem is not that some (or many) communities may block access to some (or many) other communities, but rather that such blocking is not immediately obvious and may give an appearance that those blocked communities don’t exist (kind of like lying by omission). This is especially true if the description of your community implies a lack of enforced blocking. If someone manages a community where it’s very clear that outside access is moderately to extremely controlled, I completely support that.
The point of blocking is to cause things to be invisible to the default view.
If the blocked material is put in users’ faces so they know it’s blocked, that misses the point of blocking it.
Mod logs and published block lists are a great way of allowing users who are concerned about overblocking to investigate … without failing at the whole endeavor by sending every user a copy of all the horse porn & Nazi spam that got blocked.
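For anyone who wants to check a published block list without stumbling through the UI: if the instance runs Lemmy and exposes the usual federation endpoint, a few lines of scripting will do it. A rough sketch, with the instance name as a placeholder and with the caveat that the response shape has changed across Lemmy versions:

```python
# Minimal sketch: list the instances a Lemmy server reports as blocked.
# Assumes the server runs Lemmy and serves the standard
# /api/v3/federated_instances endpoint; "lemmy.example" is a placeholder.
import requests

INSTANCE = "lemmy.example"

resp = requests.get(f"https://{INSTANCE}/api/v3/federated_instances", timeout=10)
resp.raise_for_status()
federated = resp.json().get("federated_instances") or {}

for entry in federated.get("blocked", []):
    # Older Lemmy versions return plain domain strings, newer ones return
    # objects with a "domain" field; handle both just in case.
    print(entry["domain"] if isinstance(entry, dict) else entry)
```

If the endpoint 404s or the list comes back empty, the instance either blocks nothing or simply doesn’t publish what it blocks.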
When I ran spam filtering for an institutional email server about 20 years ago, I made the “mod logs” (or rather, SMTP envelope data of blocked messages) available to users; but they had to go to a web page to see what messages had been blocked; and the content was not visible (since the mail server had never accepted it). I wrote that code so that my users could tell me if the spam blocking I’d configured was mistakenly blocking mail they wanted to receive.
(The users were scientists & engineers. They could read email headers. If they wanted to.)
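In case the shape of that is useful to anyone building something similar today, here’s a minimal sketch of the idea, with every path and column name hypothetical: the report is built purely from envelope data logged at reject time, so the blocked content itself can never leak into it.

```python
# Sketch of a "what got blocked" report built only from envelope data.
# The log path and its columns are hypothetical; the actual system logged
# SMTP envelope data at reject time and never accepted the message body.
import csv
import html

LOG_PATH = "/var/log/mail/blocked-envelopes.csv"  # timestamp, sender, recipient, reason

def render_block_report(log_path: str) -> str:
    rows = []
    with open(log_path, newline="") as f:
        for timestamp, sender, recipient, reason in csv.reader(f):
            cells = "".join(
                f"<td>{html.escape(field)}</td>"
                for field in (timestamp, sender, recipient, reason)
            )
            rows.append(f"<tr>{cells}</tr>")
    header = "<tr><th>Time</th><th>From</th><th>To</th><th>Reason</th></tr>"
    return f"<table>{header}{''.join(rows)}</table>"

if __name__ == "__main__":
    print(render_block_report(LOG_PATH))
```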
But the point of blocking the horse porn spam and Nazi froth is lost if the users have to see it anyway so they know it’s blocked.
Horse porn is not a made-up example, by the way. There was an email spammer named Jeremy Jaynes, who was in the habit of sending spam promoting bestiality porn. When he was arrested, suddenly the users on the email server I was running stopped complaining to me about horse porn.
Gonna just hammer this home: this slippery slope has literally never been true. Reddit getting rid of neo-nazis, the fatpeoplehate lynch mob, and the pedophile subreddits only ever made the site better. This fear of censorship has ruined platforms far more than any so-called censorship has. I would rather go back to Something Awful and their capricious mods than deal with another laissez-faire administration.