MVP Proposal: User interaction moderation

This will need to be fleshed out quite a bit in terms of the types of flags and how the actions are performed. In SE, there is a (very low) minimum reputation needed to flag - I think this is helpful to prevent a new/casual user (or, worse, a sock puppet) from flagging.

Also consider things like “Vote to Close” - a bit stronger than a flag, as ‘n’ close votes can close a question without a moderator doing anything (as I understand it - if I’m wrong, please clarify).

Also consider (I don’t know if/how SE does this) having two types of moderators - one type specific to a topical site (like the elected moderators in SE) and another type (especially important at the beginning) at the full-system level - i.e., able to act as a moderator on all topical sites.

1 Like

If we are going to use anything like the community election model that SE uses, I think there should be term limits, or a duration on the position that expires and requires the person to seek re-election. Otherwise, dispense with the democracy theatre altogether. Lifetime elections do not foster accountability or respond to changing needs.

I was thinking a bit about whether the diamond moderators really are necessary. I could see a model where the trusted user moderation tools continue to scale upward, maybe as a function of both time and reputation, so that those trusted users eventually gain access to the “diamond” tools combined with peer sign-off in the fashion of close votes. It doesn’t seem that simple reputation checks are adequate barriers to such tools, though. A better metric might involve consideration of time on the site and frequency of visiting, along with accuracy in handling lower level moderation tools.

3 Likes

I think you have a very important point about limiting the duration of a term. There are sites on Stack Exchange that last elected their mods 9 years ago. However, the users of a site change over time: some of the old users leave, some of them stay, and new users join as the site grows. This means that after a couple of years, many of the currently active users won’t be the users who elected the moderators years ago. Regular elections (for example, every 1 or 2 years, with the possibility of re-election) would ensure that a significant part of the current user base has had the opportunity to take part in an election. That way the current user base knows from its own experience that it can influence the site, which strengthens its connection to the site.

Going years and years without elections, and thus having most of the community never have participated in one, is problematic. So is making incumbent mods campaign to keep their jobs; it seems to create the wrong incentives, particularly when most moderation is not visible to the community. I don’t know what the solution is, but it should take both of these competing tensions into account.

A challenge with either more cycling or making moderation privileges just another privilege is that you increase the number of people who have access to the private information that is necessary to perform moderation tasks. I’m not sure what the answer is there either, but it’s not to clamp down on more people being able to become mods.

3 Likes

This post on meta.se would be my solution to the question of whether or not to have elections/term limits - basically, a yearly check-in asking if an election is needed.

Could we get a yearly check-in meta post by the Community user asking if the site would like to have a mod election?

On some sites, a mod election may be necessary, while on others the community may be perfectly happy with the mod team. In either case I don’t see any harm in asking, and in the second case it would help the moderators know that the community is happy with what they are doing.

The reason for having the Community user post it on a schedule is that otherwise the mods may feel like the asker of the meta post is irritated with them for some reason - because if they were happy, there would be no reason for an election, right? If the Community user posts it, then it’s just a neutral question that gets asked every so often.

2 Likes

Good point about the private information. I think access to private information should not come just from being active on a site. Someone can be good at the site topic and at the same time untrustworthy with private data. Thus there should definitely be some involvement of the community in deciding who gets that access.

As to the problem of term length: maybe there could be two phases. In the first phase, people simply vote on whether or not they want a change in the moderation. If the majority is happy with the moderation, nothing further happens. Only if the happy users are in the minority would a full-fledged moderator re-election process be started.
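A minimal sketch of how that two-phase check might work (the function name and the tie-breaking rule are my own assumptions, nothing that has been decided):

```python
def needs_election(happy_votes: int, unhappy_votes: int) -> bool:
    """Phase 1: a simple retention poll.

    If the majority of voters are happy with current moderation,
    nothing further happens; only when the happy voters are in the
    minority does the full re-election process (phase 2) start.
    """
    if happy_votes + unhappy_votes == 0:
        return False  # no participation -> no change
    return unhappy_votes > happy_votes
```

A tie counts as “no change” here, but that is a policy choice the community would have to make.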

1 Like

I think the assumption SE made was that communities would grow over time and so there would be a natural need to keep adding to the team. (Plus, sometimes people step down.) In practice, though, some communities are small and can get by with three mods for years and years.

Here’s another way to mitigate the privacy problem while enabling more community participation: what if we separated mod powers into two buckets? Think about the things mods can do – most of them don’t require private information, just the community’s trust. I’m thinking (from SE) of unilateral close/open, locking/unlocking, cleaning up comments, seeing deleted content, and, sometimes, taking normal user actions with the “moral weight” of that diamond next to your name.

What that leaves out: user-level actions like suspensions, seeing annotations, investigating suspicious voting, and stuff like that. Basically (for SE mods), the “mod dashboard” tab on the user profile is only for this higher level of moderator.

If we did this, we could have elections more often for the first level, which is probably 80% of moderation anyway, and that could have term limits or just be subject to retention votes or whatever the community wants. These people are the primary moderators. And then there are a small number of higher-tier mods who also have access to the user stuff and private information. Assume some workable communication channel that all the mods have access to (like the private chat rooms on SE), and it’s not hard for the community mods to get an assist from the mods with private-info access when needed.

In a way, it’s kind of how moderators and community managers operated 5-8 years ago on SE, when mods had way fewer tools. (We always had some PII, though.) But the CM-like role here is filled by community members, not Codidact admins.

I’m brainstorming here; please take this as a starting point for discussion, not a concrete proposal.
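Purely as an illustration of the two buckets described above (the action names and the exact split are hypothetical, loosely taken from the SE examples listed), the tiers could be modeled as permission sets:

```python
from enum import Enum, auto

class Action(Enum):
    CLOSE_POST = auto()         # unilateral close/open
    LOCK_POST = auto()          # locking/unlocking
    DELETE_COMMENT = auto()     # cleaning up comments
    VIEW_DELETED = auto()       # seeing deleted content
    SUSPEND_USER = auto()       # user-level action
    VIEW_ANNOTATIONS = auto()   # private notes on a user
    INVESTIGATE_VOTING = auto() # suspicious-voting tools

# First tier: community moderators -- content actions, no private info.
COMMUNITY_MOD = {
    Action.CLOSE_POST, Action.LOCK_POST,
    Action.DELETE_COMMENT, Action.VIEW_DELETED,
}

# Second tier: a small number of mods who also get the user-level
# tools and access to private information.
USER_MOD = COMMUNITY_MOD | {
    Action.SUSPEND_USER, Action.VIEW_ANNOTATIONS, Action.INVESTIGATE_VOTING,
}

def allowed(perms: set, action: Action) -> bool:
    return action in perms
```

Modeling the second tier as a strict superset keeps the “community mods can ask the higher tier for an assist” workflow simple, since there is nothing the first tier can do that the second cannot.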

7 Likes

I propose that we have multiple levels of moderators, with powers distributed among them: the higher the level, the greater the powers and the more trust required.

Sensitive information should be displayed according to level. At the lower levels, we can have moderators who take care of thread locking/unlocking, along with comment and post-deletion powers, while the higher-level moderators have access to more data.

With levels, data is made available according to how sensitive it is. We can have many people at the lower level, fewer at the next level up, and so on.

With more people at the lower level, we don’t have to worry about sensitive information spreading, and the moderation of content can continue.

2 Likes

I think separating the moderation tasks that need access to private information from those that don’t is a good idea. Indeed, the private-information-access moderators could generally act network-wide (with network-wide elections). That way fewer of them would be needed. Note that acting network-wide doesn’t mean they could not specialize in specific sites by informal mutual agreement; the point is that there would not need to be one (or even several) per site. Big sites would probably need the exclusive attention of one or more PIA moderators, but several smaller sites might be collectively policed by the same PIA moderators.

BTW, does anyone know if there are legal regulations to take into account, given that private data is involved?

4 Likes

Re legal regulations: on SE, moderators have to accept the moderator agreement, which says “no sharing or outside storage of private info”. SE takes care of the compliance issues of having that info at all, which amounts to barring underage users.

1 Like

I am not sure whether multiple moderator types are MVP. I am not generally opposed to having network-wide moderators (i.e., an elected community team); however, I don’t think it’s MVP.

What needs to be decided, in my opinion, is which of these are MVP:

  • having some moderator-role (I think consensus is: Yes.)
  • user deletion (by moderators, by account owner)
  • user suspension (temporarily limiting a user’s ability to interact with the site; a simple solution would be to disallow login during suspension; probably MVP)
  • user private messaging (to warn users, to explain suspension reasons, and to appeal warnings/suspensions; probably not MVP, but then a different avenue of appeal is needed)
  • granular bans (e.g. editing, commenting, posting; probably all not MVP)
2 Likes

My proposed refinement of your list:

MVP:

  • having some moderator-role (I think consensus is: Yes.)
  • user (soft) self-deletion (a person should always be able to abandon a site)
  • user suspension (mechanism TBD)
  • private messages to users (I think if we allow suspension we have to allow both warnings and communication about the suspension)
  • some way for a user to appeal a suspension

Not MVP:

  • user deletion by moderators (since mods can suspend, there’s another way to keep someone from causing damage)
  • granular bans (e.g. editing, commenting, posting)
4 Likes

We can soft-delete user profiles for now and hard delete them later if necessary. I’m still for a no-restoration policy, unless it was clearly a moderator’s error.

A problem related to deletion of a user is what to do with votes/rep/etc.

If there is a soft delete - which will almost always be the case; even if there is a “hard delete” of a user due to legal issues, the master record for that user could stick around as a placeholder with the name deleted, no email, etc. - then it is actually not a big deal. Posts would be marked as deleted, and votes (and any effects of those votes) could be kept as-is - i.e., Alice still gets the 17 upvotes from Bob; they just don’t say Bob any more, now they say “Deleted User”.

Of course, if the deletion is due to a sock puppet, voting ring, spam user, etc., then the votes would be deleted as well. The only time votes etc. should be kept is if it was a “real” user deleted for reasons unrelated to the original activity (e.g., user request, or a “user gone bad”).
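A rough sketch of that soft-delete logic (the field names are assumptions for illustration, not an actual schema):

```python
def soft_delete_user(user: dict, abusive: bool = False) -> dict:
    """Keep the master record as a placeholder, but scrub the PII.

    For a "real" user deleted for reasons unrelated to their activity,
    votes (and their effects) are kept; for sock puppets, voting
    rings, spam users, etc., the votes are invalidated too.
    """
    user["display_name"] = "Deleted User"
    user["email"] = None
    user["deleted"] = True
    if abusive:
        for vote in user.get("votes", []):
            vote["invalidated"] = True  # rep effects would be reversed
    return user
```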

1 Like

Handling votes from deleted users is something we will definitely have to address. I propose that we do not need to address it in the MVP. For now we can just let votes continue to exist, and when we have a way to review for possible abuse (like sockpuppets) we can set matters right then.

For targeted voting we will (later, not MVP) need a way to invalidate votes separate from user deletion. We’ll need that sometime after we have the ability to identify targeted voting.

3 Likes

We can soft-delete profiles in the sense that it’s still visible that one account made a particular set of contributions. But we must be able to scrub all personal information (including the display name). It’s not just the law, it’s a good idea.

Unfortunately, human nature being what it is, I fear that account deletion with vote deletion needs to be in the MVP. It doesn’t need to be a tool with a nice UI, but we do need to properly get rid of sockpuppets.

I’ve never been happy that moderators have access to PII. The fewer people have access the better, and I am not comfortable with >500 people knowing my IP addresses on Stack Exchange. (And formerly an email address that identified me IRL — I used to be young(er) and foolish(er).) But it is extremely useful to have some PII to resolve sockpuppet cases quickly. Because most cheaters are idiots who think that nobody will ever have a clue that [email protected] and [email protected] are the same person.
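This is roughly why even coarse PII is so useful: a trivial normalization catches the lazy cases. A sketch, assuming Gmail-style rules (dots ignored, “+tag” stripped) purely as an illustration - real providers vary:

```python
def normalize_email(addr: str) -> str:
    """Canonicalize an address for duplicate-account detection."""
    local, _, domain = addr.lower().partition("@")
    local = local.split("+", 1)[0]          # drop "+tag" suffixes
    if domain in ("gmail.com", "googlemail.com"):
        local = local.replace(".", "")      # Gmail ignores dots
    return f"{local}@{domain}"
```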

I fully approve the idea of separating what SE calls diamond mods into two tiers, with only a small number having PII access. PII is not needed to lock and delete posts, to merge tags, to suspend users, etc.

5 Likes

I agree that access should be limited as much as practical. But in any system (free or paid, volunteer or commercial) someone is going to have the information, hopefully on a need to know basis.

I have dealt with real-world situations - e.g., someone who, due to legitimate work concerns, doesn’t want any PII to make it into published organization newsletters, because those newsletters are posted on the organization’s web site (for good reasons, so the rest of the organization and the public can legitimately see them). Any time he has what for virtually anyone else would be a “normal” mention in the newsletter, he has me carefully word things per his request to minimize PII. But those cases are few & far between - and it is the user’s duty to make their wishes known.

In our situation (and IMHO with most publicly accessible free-to-sign-up web sites) that means:

  • If you are worried about your IP being used to track you, use a proxy that effectively hides it
  • If you are worried about your name, make up something fake & unrelated
  • If you are worried about your email address, get a disposable/single-purpose email address

And obviously, in any of these cases, put nothing of consequence in your public profile.

So we limit what access we can, but we don’t promise perfection (it will never happen), we vet moderators as best we can, and we spell out in the terms of service what we do & don’t collect and what we plan to do with the information.

1 Like

The only issue I have here is a possible lack of transparency. Could we ever be certain that a site’s moderators are giving all of the links, or all possible information, that they can?

No, we can’t.

That’s why we need a fair and balanced appeal procedure. However, I think this is not really within the scope of this topic, as this topic is about tools rather than policies. I made a proposal concerning that here: