Last modified: 2013-06-26 12:58:32 UTC
Server: mw1011 URL: http://[unknown-host]
Backtrace:
#0 /usr/local/apache/common-local/php-1.21wmf12/extensions/Wikibase/client/includes/ChangeHandler.php(400): array_merge(Array, Array)
#1 /usr/local/apache/common-local/php-1.21wmf12/extensions/Wikibase/client/includes/ChangeHandler.php(447): Wikibase\ChangeHandler->coalesceChanges(Array)
#2 /usr/local/apache/common-local/php-1.21wmf12/extensions/Wikibase/lib/includes/ChangeNotificationJob.php(151): Wikibase\ChangeHandler->handleChanges(Array)
#3 /usr/local/apache/common-local/php-1.21wmf12/maintenance/runJobs.php(98): Wikibase\ChangeNotificationJob->run()
#4 /usr/local/apache/common-local/php-1.21wmf12/maintenance/doMaintenance.php(110): RunJobs->execute()
#5 /usr/local/apache/common-local/php-1.21wmf12/maintenance/runJobs.php(148): require_once('/usr/local/apac...')
#6 /usr/local/apache/common-local/multiversion/MWScript.php(96): require_once('/usr/local/apac...')
#7 {main}
[01-Apr-2013 06:53:09] Fatal error: Allowed memory size of 157286400 bytes exhausted (tried to allocate 71 bytes) at /usr/local/apache/common-local/php-1.21wmf12/extensions/Wikibase/client/includes/ChangeHandler.php on line 400
The incidence of these errors is quite high; they're starting to spam the logs.
https://gerrit.wikimedia.org/r/#/c/58753/ - try reducing the batch size
There are fewer OOMs, but there is still a good number of them. For example:

Server: mw1001 URL: http://[unknown-host]
Backtrace:
#0 /usr/local/apache/common-local/php-1.22wmf1/includes/Revision.php(1199): gzinflate('?V?r?0?????#??l...')
#1 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/lib/includes/store/sql/WikiPageEntityLookup.php(293): Revision::getRevisionText(Object(stdClass), 'old_', 'wikidatawiki')
#2 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/lib/includes/store/sql/WikiPageEntityLookup.php(215): Wikibase\WikiPageEntityLookup->loadEntity('item', Object(stdClass))
#3 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/lib/includes/store/sql/CachingEntityLoader.php(88): Wikibase\WikiPageEntityLookup->getEntity(Object(Wikibase\EntityId), 25587853)
#4 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/client/includes/ChangeHandler.php(258): Wikibase\CachingEntityLoader->getEntity(Object(Wikibase\EntityId), 25587853)
#5 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/client/includes/ChangeHandler.php(373): Wikibase\ChangeHandler->mergeChanges(Array)
#6 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/client/includes/ChangeHandler.php(399): Wikibase\ChangeHandler->coalesceRuns(Array)
#7 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/client/includes/ChangeHandler.php(447): Wikibase\ChangeHandler->coalesceChanges(Array)
#8 /usr/local/apache/common-local/php-1.22wmf1/extensions/Wikibase/lib/includes/ChangeNotificationJob.php(151): Wikibase\ChangeHandler->handleChanges(Array)
#9 /usr/local/apache/common-local/php-1.22wmf1/maintenance/runJobs.php(98): Wikibase\ChangeNotificationJob->run()
#10 /usr/local/apache/common-local/php-1.22wmf1/maintenance/doMaintenance.php(110): RunJobs->execute()
#11 /usr/local/apache/common-local/php-1.22wmf1/maintenance/runJobs.php(148): require_once('/usr/local/apac...')
#12 /usr/local/apache/common-local/multiversion/MWScript.php(97): require_once('/usr/local/apac...')
#13 {main}
On my test wiki, I got this out-of-memory error:

PHP Stack trace:
PHP  1. {main}() /var/www/common/multiversion/multiversion/MWScript.php:0
PHP  2. require_once() /var/www/common/multiversion/multiversion/MWScript.php:86
PHP  3. require_once() /var/www/common/php-master/maintenance/runJobs.php:149
PHP  4. RunJobs->execute() /var/www/common/php-master/maintenance/doMaintenance.php:110
PHP  5. Wikibase\ChangeNotificationJob->run() /var/www/common/php-master/maintenance/runJobs.php:99
PHP  6. Wikibase\ChangeHandler->handleChanges() /var/www/common/php-master/extensions/Wikibase/lib/includes/ChangeNotificationJob.php:151
PHP  7. Wikibase\ChangeHandler->coalesceChanges() /var/www/common/php-master/extensions/Wikibase/client/includes/ChangeHandler.php:462
PHP  8. Wikibase\ChangeHandler->coalesceRuns() /var/www/common/php-master/extensions/Wikibase/client/includes/ChangeHandler.php:411
PHP  9. Wikibase\ChangeHandler->mergeChanges() /var/www/common/php-master/extensions/Wikibase/client/includes/ChangeHandler.php:383
PHP 10. Wikibase\EntityChange::newFromUpdate() /var/www/common/php-master/extensions/Wikibase/client/includes/ChangeHandler.php:279
PHP 11. Wikibase\Entity->getDiff() /var/www/common/php-master/extensions/Wikibase/lib/includes/changes/EntityChange.php:316
PHP 12. Wikibase\Item->entityToDiffArray() /var/www/common/php-master/extensions/Wikibase/DataModel/DataModel/Entity/Entity.php:818
PHP 13. Wikibase\Item->getSiteLinks() /var/www/common/php-master/extensions/Wikibase/DataModel/DataModel/Entity/Item.php:244
PHP 14. Wikibase\SiteLink::newFromText() /var/www/common/php-master/extensions/Wikibase/DataModel/DataModel/Entity/Item.php:128
PHP 15. SiteSQLStore->getSite() /var/www/common/php-master/extensions/Wikibase/DataModel/DataModel/SiteLink.php:59
PHP 16. SiteSQLStore->getSites() /var/www/common/php-master/includes/site/SiteSQLStore.php:244
Note that runJobs.php uses a 150 MB memory limit by default (the 157286400 bytes seen in the fatal error above).
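As a stopgap while the memory use itself is investigated, MediaWiki maintenance scripts accept a generic --memory-limit option, so the job runner can be given more headroom than the 150 MB default. A hedged example; the 300M and --maxjobs values below are illustrative, not a recommendation from this bug:

```shell
# Run the job queue with a raised memory limit (illustrative values).
# --memory-limit is a generic MediaWiki maintenance-script option;
# --maxjobs bounds how many jobs one invocation processes.
php maintenance/runJobs.php --memory-limit 300M --maxjobs 100
```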
Aude's patch in comment 3 got merged; removing the keyword. Is this still an issue? If so, any ideas or proposals for how to proceed?
Closing, since the practical problem is gone.
Verified in Wikidata demo sprint 22-9