![](https://lemmy.dbzer0.com/pictrs/image/e0720ce4-a77d-4219-9e06-0711e33f6d9c.png)
![](https://lemmy.ca/pictrs/image/7b0211f0-7266-4e13-9d26-8c3e6126af62.png)
Most of the people who shill for DRM are such sad and pathetic trolls that they usually get banned from most sensible communities and platforms. There are still a good number on Reddit, but even there they often get buried in downvotes.
“Let Chaos storm, let cloud shapes swarm; I wait for form”
I think it’s just very messed up. Ultimately it doesn’t work against the really nasty people Reddit claims to be going up against, because those people have bot armies that monitor their astroturf accounts; they know when the shadowbans happen, dump the account, and move on to the next one. No, this system disproportionately affects the people who aren’t expecting it and probably don’t even deserve it.
Also, against braindead spammers it’s actually a terrible strategy, because a spammer’s purpose is both to annoy users and to chew through your resources. Even if they’re shadowbanned and uploading multiple gigabytes of white noise, they aren’t annoying anyone, but they are still chewing through bandwidth and CDN storage. IMO that’s not feasible long term, and wouldn’t even be initially feasible for most Fediverse services, which is why most basically just don’t do it.
Don’t forget about shadowbans that attempt to make it seem like you aren’t banned when your entire account is hidden without your knowledge.
Stopping viewing on a per-account basis doesn’t make sense to me, since people don’t need accounts to view any content on Lemmy; it’s trivial to bypass by logging out, or by fetching the discussion from a custom frontend without logging in. What would be better is simply stopping them from interacting, just like what happens with bans: they can still view, but all interactions are simply dropped or disallowed.
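The idea above can be sketched roughly like this (hypothetical function and variable names, not Lemmy’s actual code): viewing is always allowed because the content is public anyway, while interactions from a blocked user are dropped the same way a ban drops them.

```python
# Hypothetical sketch: reads stay open to everyone (content is public),
# but interactions from users the author has blocked are dropped.
blocks = {("alice", "troll")}  # (blocker, blocked) pairs


def can_view(viewer, author):
    # Viewing is never restricted: logged-out users can see everything anyway,
    # so per-account view blocking would be trivially bypassed.
    return True


def handle_interaction(actor, target_author):
    # Return True if the reply/vote/mention is accepted,
    # False if it is silently dropped, like a per-user ban.
    return (target_author, actor) not in blocks
```

So `handle_interaction("troll", "alice")` is rejected while `can_view("troll", "alice")` still succeeds, which matches how community bans already behave.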
You should report them as trolls in any case, most instances on Lemmy have rules against trolling in general, further violations beyond that just add to the initial violation of trolling.
I meant the kind that goes around the body even when in use, not just a carrying case, they’re different kinds of cases.
The guy who runs the growyourownservices network, despite seeming very professional, is extremely emotionally biased and hates Lemmy as software (and seemingly any instance that chooses to run that software, regardless of its affiliation with the developers).
So he basically refuses to acknowledge the existence of Lemmy and by extension a large portion of the threaded fediverse.
My god, get a case for that thing, and a screen protector too (they make matte screen protectors for anti-glare; I have one and it’s awesome, better than the 512 any day). I was holding off because I wanted to get the Killswitch, but it’s expensive 😖. I’m definitely doing it now though.
It’s not nearly the same as following communities or groups; it’s just a collection of posts grouped by tag, as opposed to a space where people discuss or post about a broader topic. Communities and groups also typically invite more interaction than simply tagging posts, by virtue of being a place people post to rather than just a tag category.
I should note that there are groups on Mastodon (not really in Mastodon itself, but federated Group actors from other services show up there), though they are less intuitive and thus usually overlooked by most Mastodon users.
Maybe one of the forks or backend replacements could implement an option using it to make it compliant. I wouldn’t go with the OP’s solution, since privacy is non-existent on Lemmy, but just blocking interaction seems like it would be enough to make it compliant and prevent the harassment issues mentioned. I made an issue addressing this on the Lemmy GitHub; it proposes a new feature rather than changing the existing blocks, because it’s good to have both mutes and blocks at the same time.
Unfortunately Lemmy isn’t like that and does not follow the ActivityPub spec in that regard. In its current form the block doesn’t seem to do that at all; it simply hides the blocked user from the blocking user, as if the blocked user didn’t exist. There are no checks on interactions.
Also, if you’re wondering how it works with Mastodon: Lemmy basically ignores Mastodon’s blocking system and freely allows interaction with Mastodon accounts in the thread, even if they have blocked the user replying, and the community actor as well.
Yup, this happened a lot on Reddit. As much as people complain about Reddit’s newer two-way blocking system, this type of harassment disappeared basically overnight when it rolled out. It was largely a good thing: for every user who was legitimately abused by it, there were many more who benefited from it because it stopped harassment from others.
Now, Lemmy can implement anything it wants, but nothing could ever prevent a blocked/muted user from creating another account to continue the harassment.
Not a great argument because the same could also apply to community and site bans.
I think that having more tools to fight harassment is ultimately a good thing, are these tools perfect? Of course not, but they are still better than having nothing.
I think the only way to prevent such an issue would be a system that requires users to prove their identity in some way in order to create a single account. But that is completely against the openness of a federated network.
Indeed it is, plus it doesn’t stop those malicious enough to commit a felony just to harass someone but that is neither here nor there, this discussion is about protective measures that can be done before ban evasion.
A good two-way blocking system should mainly focus on preventing interaction from the blocked user (the ActivityPub spec mentions interaction, as opposed to viewing). Even if the content isn’t hidden, the interactions from the blocked user wouldn’t be federated, just as a banned user’s wouldn’t be.
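As a rough illustration of that federation side (hypothetical names and a simplified activity shape, not any real implementation): an activity from a blocked actor that targets the blocking user’s content simply isn’t federated onward, mirroring how a banned user’s activities are handled.

```python
# Hypothetical delivery filter: activities from a blocked actor aimed at
# the blocking user's objects are dropped before federation.
block_lists = {"alice": {"troll"}}  # blocker -> set of blocked actors


def should_federate(activity):
    # 'object_owner' here stands in for the author of the post/comment
    # that the activity (reply, vote, mention) is interacting with.
    actor = activity["actor"]
    owner = activity["object_owner"]
    return actor not in block_lists.get(owner, set())
```

With this, a reply like `{"actor": "troll", "object_owner": "alice", "type": "Create"}` would be dropped, while the same reply from an unblocked actor goes through; viewing is untouched either way.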
I don’t think that blocking people from seeing posts really makes sense, since they can just log out to see them; it’s all public. So hiding content from them isn’t very useful when they can view it without an account. What is really needed is a way to restrict users from interacting. I made an issue addressing this: it would be treated as a kind of profile ban, similar to community bans but for individual users, except that even when banned, people can still view and read, like they always could before.
The simple fact of the matter is, the Fediverse is public. It’s a space specifically built on sharing.
You’re thinking about it wrong: a good blocking system doesn’t need to hide the content, but rather block interaction from the offending user, like a softer form of ban scoped to that one user. They can still see all the content, but they just aren’t allowed to interact anymore. Could they bypass it with alt accounts? Yes, but they can bypass bans using that same method too, so it’s not a good argument against something like this.
@db0@lemmy.dbzer0.com Hey do you think this would be a good idea, a bot to waste scammers’ time (possibly with AI)?
Anyone seeing this might want to consider combat training and counter-custody training while it’s easy, just in case this is the future we’re headed for. (Also remember they’re not real people, they’re bureaucrats; they’re the same as robots, don’t hold back.)
The first one might suck, but it can be solved either by working under the table or with underhanded tactics; not super easy, but doable.
For the second one, it’s much easier, !shoplifting@slrpnk.net can help you.
Also use tracking protection in the browser to prevent reading of browsing history and such. Security and privacy practices are absolutely paramount if you’re planning on visiting services like that. Of course the best thing is to not visit them at all, but some people feel they need to see it for themselves; if they so choose, they should be prepared and keep themselves safe.