Wikimedia Bugzilla is closed!

Wikimedia has migrated from Bugzilla to Phabricator; bug reports are now handled in Wikimedia Phabricator. This static website is read-only and kept for historical purposes: it is not possible to log in, and links other than those displaying bug reports and their history may be broken. See T33627, the corresponding Phabricator task, for complete and up-to-date information on this bug report.
Bug 31627 - too many links on one page break the parser - Allowed memory size exhausted
Status: RESOLVED FIXED
Product: MediaWiki
Classification: Unclassified
Component: Parser
Version: 1.18.x
Hardware: All
OS: All
Importance: Normal minor
Target Milestone: 1.18.0 release
Assigned To: Nobody - You can work on this!
Depends on:
Blocks:

Reported: 2011-10-11 22:49 UTC by Saibo
Modified: 2011-11-04 21:49 UTC
CC: 4 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---

Attachments

Description Saibo 2011-10-11 22:49:40 UTC
https://commons.wikimedia.org/wiki/MediaWiki_talk:Titleblacklist/Archive_1 

"PHP fatal error in /usr/local/apache/common-local/php-1.18/extensions/WikiEditor/WikiEditor.i18n.php line 17997:        Allowed memory size of 125829120 bytes exhausted (tried to allocate 7864320 bytes)"

Occurs while logged in, and the same over plain http.
The *.php file named in the error changes between reloads.

The page is only 95,920 bytes.

User:hoo commented something out, and now it works:
https://commons.wikimedia.org/w/index.php?title=MediaWiki_talk:Titleblacklist/Archive_1&action=history

306 links shouldn't break the parser.
Comment 1 Marius Hoch 2011-10-11 22:57:21 UTC
I played around with it for a while; the 306 [[MediaWiki:Senselessimagename/X|X]] links alone caused the error (in the preview). On my test wiki running 1.17, though, I was able to paste the links 12 times over and it rendered fine (MW 1.17.0, PHP 5.3.1, memory_limit 128M, no extensions). I know that isn't really comparable, because the extensions eat a lot of memory, but 12 times...
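
For reference, a minimal sketch (plain PHP, outside MediaWiki) of how such a test page can be generated. The page name Senselessimagename is the placeholder from this comment, and only a handful of language codes are listed; the real page had ~306:

    <?php
    // Sketch: build wikitext with one [[MediaWiki:<name>/<code>|<code>]]
    // link per language code, mimicking the links on the Commons page.
    $codes = [ 'de', 'en', 'fr', 'ja', 'ru', 'zh' /* ... ~300 more */ ];
    $lines = [];
    foreach ( $codes as $code ) {
        $lines[] = "[[MediaWiki:Senselessimagename/$code|$code]]";
    }
    file_put_contents( 'repro.wikitext', implode( "\n", $lines ) . "\n" );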
Comment 2 Marius Hoch 2011-10-11 23:16:09 UTC
OK, with some more testing I can now say for sure that it's a 1.18 bug. I've tried to submit just the 306 MediaWiki links on test.wikipedia.org but failed (both through the interface and via the API; see the sketch after this comment), so by the time the bot on Commons submitted the links it must still have worked fine, or it did the impossible ;)

Btw, I was now able to paste the links 20 times on my test wiki without any problems.
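
For context, a hedged sketch of what submitting "using the API" means here: an action=edit POST to api.php. The cookie file, the Sandbox title, and $editToken are assumptions (a real bot would log in and fetch an edit token first); error handling is omitted:

    <?php
    // Sketch: submit the generated wikitext via the MediaWiki API
    // (action=edit). Assumes a logged-in session stored in cookies.txt
    // and a previously fetched edit token (hypothetical placeholder).
    $editToken = '...';
    $ch = curl_init( 'https://test.wikipedia.org/w/api.php' );
    curl_setopt_array( $ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_COOKIEFILE     => 'cookies.txt',
        CURLOPT_POSTFIELDS     => [
            'action' => 'edit',
            'title'  => 'Sandbox',
            'text'   => file_get_contents( 'repro.wikitext' ),
            'token'  => $editToken,
            'format' => 'json',
        ],
    ] );
    echo curl_exec( $ch );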
Comment 3 Brion Vibber 2011-10-12 00:25:12 UTC
Possibly rendering all those links is triggering localization initialization: 306 languages is a lot of strings to load, and those are going to add up.
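
A rough illustration of this hypothesis, to be run from a MediaWiki maintenance context such as eval.php. The message key is the placeholder from comment 1, and this is a sketch of the suspected cost, not the actual 1.18 parser code path:

    // For each language code, force the message to be looked up in that
    // language; each new language can pull another localisation bundle
    // into memory, so usage grows with the number of languages touched.
    foreach ( [ 'de', 'en', 'fr', 'ja', 'ru', 'zh' ] as $code ) {
        wfMessage( 'senselessimagename' )->inLanguage( $code )->text();
        echo "$code: " . memory_get_usage( true ) . " bytes\n";
    }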
Comment 4 Marius Hoch 2011-10-12 10:14:56 UTC
Yes, that's pretty likely; the error always occurs in i18n files. But I still wonder how the bot could submit the page without an error...
Comment 5 Niklas Laxström 2011-11-04 21:19:52 UTC
I believe this has been fixed by r97815.
Comment 6 Aaron Schulz 2011-11-04 21:49:05 UTC
(In reply to comment #5)
> I believe this has been fixed by r97815.

Deployed. Works now.


