
Anthony Accioly

5d ago

Thanks for your service, Fishcake. Moderating bigger Blossom and NIP-96 instances is already a lot of work. Doing that plus dealing with notes from real people who are willing to attack your service (and you personally) every time moderation doesn’t go their way is almost impossible. I’ve seen so many instance mods put in serious effort to keep Mastodon instances alive, only to eventually give up because of constant attacks, doxxing, smear campaigns, threats to their families, and all sorts of nonsense.

I have no illusions that some owner/admin moderation is needed at both the relay and media server levels, but when Nostr grows past the "manageable" 20K daily active users and heads toward 200K or even 2M+ users, decentralised moderation (beyond just reports) is going to need some real attention.

I think we need something like Stack Exchange, where a certain number of reports triggers a media/post review. That review could then be handled either by consensus among a few "reputable" users or by a single "highly reputable" user (where reputation is defined by a mix of relay/media server trust and npub metrics). Maybe even add some perks for voluntary moderation to sweeten the deal, e.g., some sats, bigger upload limits, Nostr badges, etc. Gamifying moderation has its own issues, but at scale, it tends to work pretty well.
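A rough sketch of that flow in Python, just to make the idea concrete. All the names, thresholds, and reputation scores here are hypothetical illustrations, not part of any Nostr NIP or existing implementation:

```python
from dataclasses import dataclass

@dataclass
class Report:
    reporter_npub: str
    reputation: float  # hypothetical 0..1 score mixing relay/media server trust and npub metrics

REVIEW_THRESHOLD = 3.0  # hypothetical: total weighted reports needed to trigger a review

def needs_review(reports: list[Report]) -> bool:
    """Trigger a review once reputation-weighted reports cross the threshold,
    so a swarm of throwaway npubs counts for less than a few trusted ones."""
    return sum(r.reputation for r in reports) >= REVIEW_THRESHOLD

def review_outcome(votes: list[tuple[float, bool]], quorum: int = 3,
                   single_reviewer_min: float = 0.9):
    """Resolve a flagged item either by one 'highly reputable' reviewer or by
    weighted consensus among a few 'reputable' ones.

    votes: (reputation, keep) pairs. Returns True (keep), False (remove),
    or None if no decision can be made yet.
    """
    # A single highly reputable user can decide on their own.
    if any(rep >= single_reviewer_min for rep, _ in votes):
        _, keep = max(votes, key=lambda v: v[0])
        return keep
    # Otherwise require a quorum and compare reputation-weighted totals.
    if len(votes) >= quorum:
        keep_weight = sum(rep for rep, keep in votes if keep)
        remove_weight = sum(rep for rep, keep in votes if not keep)
        return keep_weight > remove_weight
    return None  # not enough reviewers yet
```

The weighting is the important part: raw report counts are trivially gameable, while reputation-weighted counts at least force an attacker to build up trusted npubs first.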





Replies


@The Fishcake🐶🐾


5d ago

All good ideas. Fortunately we have very clear-cut rules now, and AI assistance to pass harmless media. With greater scale we will definitely need to evolve our ways of approaching the problem. Thanks for all your insights.
