Basically, this is a request to undo r34769 (bug 1505). Allowing blacklisted links to remain in the page is bad for a few reasons:
* Allows duplication of blacklisted links (bug 14114).
* Whitelisting is the correct way to use blacklisted links, per bug 1505 comment 2.
* Shifts the burden of removing blacklisted links onto a small team of users. Previously, the load was distributed, since anyone trying to save a page containing the link couldn't do so.
Concerns about reverting vandalism (bug 1505 comment 1) are valid and should be addressed; see bug 15450 to make *rollback* (only) exempt.
One issue with this is that if page A transcludes page B, and B trips the spam filter, then saving A will fail. To make this less painful, the spam filter check should be moved deeper into the parser code, so that it runs against B as soon as B is parsed. That would allow a more detailed error message: that page A cannot be saved because page B trips the spam filter with a particular link.
(In reply to comment #1)
> One issue with this is that if page A transcludes page B, and B trips the
> spam filter, then saving A will fail. To make this less painful, the spam
> filter check should be moved deeper into the parser code, so that it runs
> against B as soon as B is parsed. That would allow a more detailed error
> message: that page A cannot be saved because page B trips the spam filter
> with a particular link.

That would require a more thorough rewrite, I think, and should therefore be requested separately. It might conceivably be done at the same time as bug 4459.
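To illustrate the flow comment #1 proposes, here is a minimal sketch in Python (MediaWiki itself is PHP, and this is not its actual parser code; the page store, blacklist patterns, and helper names are all hypothetical, invented for this example). The point is that the blacklist runs per page inside the parser, so a failure can name the transcluded page and the offending link instead of just rejecting the save of the outer page:

    import re

    # Hypothetical blacklist and page store, for illustration only.
    BLACKLIST = [re.compile(r"spam-example\.com")]
    PAGES = {
        "A": "Intro text {{B}} more text",
        "B": "See http://spam-example.com/offer for details",
    }

    class SpamBlacklistError(Exception):
        def __init__(self, page, link):
            super().__init__(
                "page '%s' trips the spam filter with link %s" % (page, link))
            self.page, self.link = page, link

    def check_links(page, wikitext):
        # Run the blacklist against one page's text as soon as it is parsed.
        for url in re.findall(r"https?://\S+", wikitext):
            for pattern in BLACKLIST:
                if pattern.search(url):
                    raise SpamBlacklistError(page, url)

    def parse(page):
        # Parse a page, expanding {{...}} transclusions recursively.
        # Because the check runs per page, a failure identifies the
        # transcluded page (B), not just the page being saved (A).
        text = PAGES[page]
        check_links(page, text)
        for target in re.findall(r"\{\{(\w+)\}\}", text):
            parse(target)
        return text

    try:
        parse("A")
    except SpamBlacklistError as e:
        print("Cannot save 'A': %s" % e)

Saving A then fails with an error naming B and the specific blacklisted URL, which is the more detailed error message comment #1 asks for.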
Isn't this how it works now? I'm sure I've seen issues with bots on en.wiki failing when there are blacklisted links on the page.