Last modified: 2013-01-12 22:24:52 UTC
The number of spam accounts on the beta.wmflabs.org wikis is staggering, partly because CheckUser is not available to root out their IP addresses as it is on the main WMF wikis. If for some reason CheckUser cannot be enabled, account creation should either go through the ConfirmAccount extension or otherwise require approval from existing users, because this spam is too much to deal with.
CheckUser cannot be enabled because it would disclose users' IP addresses to typically non-privileged users.
A request page on the main Meta can be made for those who want accounts to request one; that'll definitely stop all the spammers.
Creating such a requirement would make the site almost unusable; we should instead think of some advanced extension that would effectively block these. Although this is annoying, production has the same problem; the only difference is that there are far more people there to deal with it. If we made a system that effectively blocks spammers, we could eventually use it on production as well. Having the RC feed is a first step; I will try to set up a relay to freenode today so that we can be easily notified of spammers. On the other hand, shell access is restricted right now on beta: most users who have it are either WMF employees or identified to the Foundation. So we could eventually enable CheckUser for some time if you believe this is the only reason why fighting spammers is harder on beta than on production, but this needs to be discussed with Ryan.
Without CheckUser, there's little the stewards can do.
The stewards asked for checkuser to be removed...
My point is that it's not a lack of patrolling users that leads to out-of-control spam, it's the fact that we can't block the underlying IPs except by autoblock; it's also because extensions like AbuseFilter and Titleblacklist don't appear to be functional.
Indeed. It's non-ideal. For sure, if AbuseFilter and Titleblacklist aren't working, bugs should be entered for that.
I don't really think it was the intention to have them work - after all, everything really is just copied from production wikis, right?
(In reply to comment #8) > I don't really think it was the intention to have them work - after all, > everything really is just copied from production wikis, right? What's the point of having a test environment if we don't test tools such as spam prevention etc? Only CheckUser has been specifically disabled for a reason
https://bugzilla.wikimedia.org/show_bug.cgi?id=38433 filed.
I've changed this bug's summary to "Disable anonymous account creation on beta.wmflabs.org" from "Disable anonymous account creation". Please update the summary if I've misunderstood the request.
I propose a reverse method of spam cleanup for the test/dev wikis (in addition to all of the normal vandal-fighting tools): any article/revision not tagged "NOTSPAM" will be deleted/reverted by a bot within 24 hours of its creation. Does this seem like a legitimate way to handle spam, seeing as these are test/dev wikis and the content doesn't hold much value?
(In reply to comment #12) > I propose a reverse method of spam cleanup for the test/dev wikis (in addition > to all of the normal vandal fighting tools): > > Any article/revision not tagged "NOTSPAM" will be deleted/reverted within 24 > hours of its creation by a bot. > > Does this seem like a legitimate way to handle SPAM seeing as that these are > test/dev wikis and the content doesn't hold much value? Seems reasonable to me, though I'd suggest __NOTSPAM__ so that you can track the pages easily using the built-in magic-word tracking in the page_props table. And rather than a bot, the wiki could just delete the pages itself: you could put logic in the code so that viewing a page older than 24 hours deletes it. Extension:SelfDestruct or whatever.
(In reply to comment #13) > Seems reasonable to me, though I'd suggest __NOTSPAM__ so that you can track > the pages easily using the built-in magic word tracking page_props table. > > And rather than a bot, the wiki could just delete the pages itself. You could > just put logic in the code that viewing the page when it's older than 24 hours > deletes it. Extension:SelfDestruct or whatever. Well, we don't want to change existing page content or how MediaWiki functions (since this is supposed to be used for testing MediaWiki as close to production as possible), so a bot is likely safer, and using tagging keeps the page content clean.
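The tag-plus-bot approach above boils down to one decision rule per page. A minimal sketch of that rule, assuming a 24-hour grace period and a __NOTSPAM__ marker in the wikitext (the function and constant names here are illustrative, not from any actual bot repository):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical core logic of the cleanup bot: a page survives only if its
# wikitext carries the __NOTSPAM__ marker, or if it is still within the
# 24-hour grace period after creation.
GRACE_PERIOD = timedelta(hours=24)
MARKER = "__NOTSPAM__"

def should_delete(wikitext: str, created: datetime, now: datetime) -> bool:
    """Return True if the page should be deleted as presumed spam."""
    if MARKER in wikitext:
        return False  # explicitly tagged as legitimate content
    return now - created >= GRACE_PERIOD

# Example: an untagged page created 30 hours ago is eligible for deletion,
# while a tagged page of the same age is kept.
now = datetime(2012, 7, 20, 12, 0, tzinfo=timezone.utc)
old = now - timedelta(hours=30)
print(should_delete("buy cheap watches", old, now))       # True
print(should_delete("__NOTSPAM__ test page", old, now))   # False
```

An actual bot would fetch candidate pages via the MediaWiki API (the page_props table makes the tagged pages cheap to enumerate) and apply this rule to everything else.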
I don't believe a bot would be productive; it wouldn't stop the mass creation of spam accounts that pollute the user lists (not all spam accounts match the regex I tried to use in my abuse filter and title blacklist entries). I also don't see how fully turning off account creation by anonymous users would make the wiki harder to use, especially if the deployment wiki alone was also given the ConfirmAccount extension. Existing global sysops/stewards on the cluster would not have to be subject to any of this; we could also create a global account creator group if someone needs to test account creation.
The concept of this project is that the wikis are configured as close to production as possible; if they aren't, it undermines testing. I'm not sure I see the problem in having polluted user listings on a wiki that isn't actually used for content. I do see the problem with spam in the content, though, as it could lead to users clicking on bad things. That's why I was suggesting a bot that reverts non-positively-patrolled content.
This is a great idea. I started work on this bot; it will be stored in Git once Chad makes a repo for me.
I have enabled a SORBS-based auto-blocker with Gerrit change #15768. That should automatically block any users listed in the SORBS list of HTTP open proxies.
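For context, a SORBS-style check is a standard DNSBL lookup: reverse the IPv4 octets and resolve them as a subdomain of the blocklist zone; if the name resolves, the address is listed. A sketch of that mechanism (the exact zone name used by the auto-blocker is an assumption here):

```python
import socket

# Assumed SORBS zone for the HTTP open-proxy list; the deployed
# auto-blocker may query a different zone or aggregate list.
DNSBL_ZONE = "http.dnsbl.sorbs.net"

def dnsbl_query_name(ip: str, zone: str = DNSBL_ZONE) -> str:
    """Build the DNSBL lookup hostname for a dotted-quad IPv4 address."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected a dotted-quad IPv4 address")
    # DNSBLs are queried with the octets in reverse order.
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str) -> bool:
    """True if the address resolves in the blocklist, i.e. is listed."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip))
        return True
    except socket.gaierror:
        return False

print(dnsbl_query_name("203.0.113.7"))  # 7.113.0.203.http.dnsbl.sorbs.net
```

On the wiki side, MediaWiki's $wgEnableDnsBlacklist / $wgDnsBlacklistUrls settings perform essentially this lookup at edit/registration time.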
I just completed the setup and enabled the captcha on wmflabs. That should solve this issue.
Rephrased bug summary.