Last modified: 2013-03-15 16:35:06 UTC
When undoing a revision that removed some or all of the page content, and that content contained external links, users are prompted to solve a captcha because they introduced "new external links", even though they are merely restoring the page to a previous state. While it may take a little more work on the back end, correcting this is worth it for the improvement in user experience. Although such a change would allow a user to revert the removal of unwanted external links without solving a captcha, whoever introduced those links in the first place would already have had to solve one, so this would not make things any easier for spambots.
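The check described above amounts to comparing the links added by an edit against the links present in earlier revisions of the page, and only counting as "new" the ones that have never appeared before. A minimal sketch of that idea (the function and data shapes here are illustrative, not actual MediaWiki code):

```python
# Hypothetical sketch: an undo that merely restores links already seen
# in an earlier revision should not trigger the "new external links"
# captcha. Only links never present in any prior revision count as new.

def links_needing_captcha(added_links, revision_history):
    """Return only those added links never seen in any prior revision.

    revision_history is a list of sets, one set of external links per
    past revision of the page (illustrative representation).
    """
    previously_seen = set()
    for revision_links in revision_history:
        previously_seen.update(revision_links)
    return [link for link in added_links if link not in previously_seen]

history = [
    {"http://example.org/a"},   # original revision contained the link
    set(),                      # a vandal then blanked the page
]
restored = ["http://example.org/a", "http://spam.example/new"]
# Restoring example.org/a is not "new"; only the second link would
# prompt a captcha.
print(links_needing_captcha(restored, history))  # → ['http://spam.example/new']
```

The same comparison covers the plain undo case: an undo adds only links that were present in the reverted-to revision, so the returned list is empty and no captcha is shown.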
Spammers solve captchas either by machine (if the captcha has been broken) or by outsourcing the work to companies whose poorly paid human workers solve them in bulk. For the spammer, solving captchas is an ongoing cost. A one-time captcha can therefore fail as the sole prevention mechanism: allowing the same spam links to be restored time after time at no further cost to the spammer would be far from ideal. What may be more appropriate is some concept of link trust: links that have been kept around for a long time, or have been confirmed as good by other editors, shouldn't trigger a new captcha every time, while links that didn't stay long in the first place, or were marked as suspicious, should be more likely to trigger a check.
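The link-trust idea above could be sketched as a simple per-link decision rule. Everything here is an assumption for illustration: the record fields, thresholds, and function names are not part of any existing MediaWiki mechanism.

```python
# Hypothetical sketch of "link trust": a link that survived in the page
# for a long time, or was confirmed good by other editors, skips the
# captcha; a short-lived or flagged link triggers one. Field names and
# thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class LinkRecord:
    days_survived: int       # how long the link stayed in the page
    confirmations: int       # editors who marked the link as good
    flagged_suspicious: bool # e.g. previously reverted as spam

def needs_captcha(record, min_days=30, min_confirmations=1):
    if record.flagged_suspicious:
        return True          # suspicious links always get a check
    if record.days_survived >= min_days:
        return False         # long-lived link: trusted
    if record.confirmations >= min_confirmations:
        return False         # vouched for by other editors: trusted
    return True              # unknown or short-lived: ask for a captcha

print(needs_captcha(LinkRecord(365, 0, False)))  # → False (long-lived)
print(needs_captcha(LinkRecord(2, 0, False)))    # → True (short-lived)
print(needs_captcha(LinkRecord(2, 3, True)))     # → True (flagged)
```

One design point worth noting: flagging overrides trust, so a link that was once marked suspicious keeps triggering checks even after accumulating confirmations, which matches the "links that were marked as suspicious should be more likely to trigger a check" intent above.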