Last modified: 2012-12-27 01:10:53 UTC
When someone edits a very high-use template (one with millions of transclusions), the edit does not finish cleanly: instead of the template page reloading on save, the user is presented with a read timeout error message. This read timeout behavior has existed for a few years now.
According to Tim Starling in #wikimedia-tech just now:

> [20-Jun-2012 02:28:44] Fatal error: Maximum execution time of 180 seconds exceeded at /usr/local/apache/common-local/php-1.20wmf5/includes/db/DatabaseMysql.php on line 285
> that's me
> and it fails in BacklinkCache->partition, very nice

This was the result of this edit: <https://commons.wikimedia.org/w/index.php?diff=prev&oldid=72965783>. The template has 2,783,343 transclusions according to the Toolserver's commonswiki_p right now.
Updated summary to reflect cause.
Workaround method to queue jobs from the command line: <https://commons.wikimedia.org/wiki/User:Tim_Starling/fixing_link_tables>
https://gerrit.wikimedia.org/r/#/c/32488/
(In reply to comment #4)
> https://gerrit.wikimedia.org/r/#/c/32488/

Status: Merged
https://gerrit.wikimedia.org/r/#/c/32488/ added a limit to BacklinkCache::getNumLinks, but some related jobs are still failing in BacklinkCache::getLinks like this:

Wed Dec 26 23:40:42 UTC 2012  mw14  commonswiki  BacklinkCache::getLinks  10.0.6.61  2008  MySQL client ran out of memory (10.0.6.61)
SELECT /*! STRAIGHT_JOIN */ page_namespace,page_title,page_id FROM `templatelinks`,`page` WHERE tl_namespace = '10' AND tl_title = 'Date' AND (page_id=tl_from) ORDER BY tl_from

That query returns 12,384,915 rows and would have to be batched.
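For illustration, the batching could use keyset pagination on tl_from (the column the query already orders by), fetching a bounded number of rows per round trip and resuming after the last seen value. This is only a sketch: it uses an in-memory SQLite stand-in for the real MySQL tables, and the batch size, helper name, and sample data are all made up; table and column names mirror MediaWiki's schema.

```python
# Hypothetical sketch of batching the templatelinks query with keyset
# pagination, using sqlite3 as a stand-in for MySQL.
import sqlite3

BATCH_SIZE = 3  # tiny for the demo; real code would use thousands of rows

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE templatelinks (tl_from INTEGER, tl_namespace INTEGER, tl_title TEXT);
CREATE TABLE page (page_id INTEGER, page_namespace INTEGER, page_title TEXT);
""")
# Populate ten pages that all transclude Template:Date.
for i in range(1, 11):
    db.execute("INSERT INTO page VALUES (?, 0, ?)", (i, "Page_%d" % i))
    db.execute("INSERT INTO templatelinks VALUES (?, 10, 'Date')", (i,))

def backlinks_in_batches(db, title, batch_size=BATCH_SIZE):
    """Yield (page_namespace, page_title, page_id) rows in bounded batches,
    resuming from the last seen tl_from instead of reading all rows at once."""
    last = 0
    while True:
        rows = db.execute(
            "SELECT page_namespace, page_title, page_id"
            " FROM templatelinks, page"
            " WHERE tl_namespace = 10 AND tl_title = ?"
            "   AND page_id = tl_from AND tl_from > ?"
            " ORDER BY tl_from LIMIT ?",
            (title, last, batch_size)).fetchall()
        if not rows:
            break
        for row in rows:
            yield row
        last = rows[-1][2]  # page_id equals tl_from here, so it serves as the cursor

total = sum(1 for _ in backlinks_in_batches(db, "Date"))
print(total)  # all 10 backlinks, fetched 3 at a time
```

Because each batch is bounded by LIMIT and the cursor condition `tl_from > ?` reuses the existing sort order, the client never holds more than one batch in memory, which is the failure mode in the log above.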
(In reply to comment #6)
> https://gerrit.wikimedia.org/r/#/c/32488/ added a limit to
> BacklinkCache::getNumLinks but some related jobs are still failing in
> BacklinkCache::getLinks like this:
>
> Wed Dec 26 23:40:42 UTC 2012  mw14  commonswiki  BacklinkCache::getLinks  10.0.6.61  2008  MySQL client ran out of memory (10.0.6.61)
> SELECT /*! STRAIGHT_JOIN */ page_namespace,page_title,page_id FROM `templatelinks`,`page` WHERE tl_namespace = '10' AND tl_title = 'Date' AND (page_id=tl_from) ORDER BY tl_from
>
> That query returns 12384915 rows and would have to be batched.

Moved to bug 43452, since this bug is about users getting timeouts.