Last modified: 2014-05-05 11:33:13 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and for historical purposes. It is not possible to log in and except for displaying bug reports and their history, links might be broken. See T47900, the corresponding Phabricator task for complete and up-to-date bug report information.
Bug 45900 - Does MediaWiki use more memory for something than it did before?
Status: NEW
Product: MediaWiki
Classification: Unclassified
Component: General/Unknown (Other open bugs)
Version: 1.20.x
Hardware: All
OS: All
Priority: Low
Severity: normal
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Keywords: parser, performance
Depends on:
Blocks:
Reported: 2013-03-08 16:10 UTC by Edward Chernenko
Modified: 2014-05-05 11:33 UTC
CC: 3 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments
An article that can't be rendered with 128M memory_limit (XML, for Special:Import) (302.98 KB, text/plain)
2013-03-14 18:33 UTC, Edward Chernenko
Details
The MediaWiki debug log (1.10 MB, text/plain)
2013-03-14 18:33 UTC, Edward Chernenko
Details
Nginx error log (with PHP message "Allowed memory size of ... bytes exhausted") (436 bytes, text/plain)
2013-03-14 18:35 UTC, Edward Chernenko
Details
The MediaWiki debug log for another page (which parsed successfully but took almost 64M) (68.08 KB, text/plain)
2013-03-14 19:04 UTC, Edward Chernenko
Details

Description Edward Chernenko 2013-03-08 16:10:04 UTC
Hello,
I've recently upgraded from MediaWiki 1.19 to MediaWiki 1.20.3.

I'm getting PHP errors "Allowed memory size of ... bytes exhausted" on some pages with many templates (in includes/parser/Preprocessor_DOM.php on line 1029).

I understand that this can be normal; however, with MediaWiki 1.19 my memory_limit was set to 64M and it worked perfectly. I've increased the value to 128M, but some pages still fail to render (the same pages that rendered fine with only 64M before the upgrade).
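For reference, raising the limit can be done from MediaWiki's own configuration rather than php.ini: $wgMemoryLimit sets the per-request limit MediaWiki applies on top of PHP's memory_limit. A minimal sketch for LocalSettings.php (the value shown is illustrative, not a recommendation):

```php
# LocalSettings.php — sketch only; $wgMemoryLimit is MediaWiki's
# per-request memory cap, applied in addition to PHP's memory_limit.
# The value below is illustrative.
$wgMemoryLimit = "128M";
```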

Were there changes to the parser or preprocessor that could account for this? Perhaps a memory leak somewhere.

The pages in question are about 200 KB of wiki code and produce 1-2 MB of HTML; however, they contain some heavy templates with more than 10 parameters each (multiple invocations of en.wikipedia.org/wiki/Template:Chess_diagram).

I have a feeling that every template invocation is kept in memory even when this is not really needed. If that is the issue, it should be optimized.
Comment 1 Andre Klapper 2013-03-08 16:20:51 UTC
Please provide more info about exact pages and the setup (database version etc). Also see http://www.mediawiki.org/wiki/Manual:How_to_debug
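The debug log requested here can be produced with the settings described on Manual:How_to_debug. A minimal sketch for LocalSettings.php (the log path is illustrative; the file must be writable by the web server and should not be web-accessible):

```php
# LocalSettings.php — debug settings per Manual:How_to_debug (sketch).
# The path below is illustrative.
$wgDebugLogFile = "/tmp/mediawiki-debug.log";
$wgShowExceptionDetails = true;
```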
Comment 2 Edward Chernenko 2013-03-14 18:33:24 UTC
Created attachment 11929 [details]
An article that can't be rendered with 128M memory_limit (XML, for Special:Import)
Comment 3 Edward Chernenko 2013-03-14 18:33:56 UTC
Created attachment 11930 [details]
The MediaWiki debug log
Comment 4 Edward Chernenko 2013-03-14 18:35:16 UTC
Created attachment 11931 [details]
Nginx error log (with PHP message "Allowed memory size of ... bytes exhausted")
Comment 5 Edward Chernenko 2013-03-14 18:37:14 UTC
The database is MySQL 5.5.

$wgMainCacheType = CACHE_ACCEL;
$wgParserCacheType = CACHE_DBA; # db4
Comment 6 Edward Chernenko 2013-03-14 18:40:46 UTC
Note the unusual number of image operations in mediawiki.log.

The template includes only two images, and the article includes this template, say, 200 times. It looks like MediaWiki does something with the images 400 times (instead of 2 times).
Comment 7 Edward Chernenko 2013-03-14 19:04:05 UTC
Created attachment 11932 [details]
The MediaWiki debug log for another page (which parsed successfully but took almost 64M)

Note the amount of memory used in Parser::braceSubstitution and PPFrame_DOM::expand.
Comment 8 Edward Chernenko 2013-03-14 19:11:56 UTC
(a strategic suggestion)

Even if we assume that the memory usage is what it should be:

why should one increase the PHP memory_limit just because it is not enough to parse 10-20 pages on the wiki, when all other pages require much less memory?

MediaWiki should anticipate cases of high memory usage and handle them (by creating a temporary file, for example) instead of letting PHP crash.
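As a stopgap under the current behavior, MediaWiki already exposes parser limits that bound preprocessor work before PHP hits memory_limit, so expensive pages fail with a parser error rather than a fatal. A hedged sketch for LocalSettings.php (the values shown are roughly the 1.20-era defaults and are illustrative only):

```php
# LocalSettings.php — parser/preprocessor limits (sketch; values are
# approximately the 1.20-era defaults, shown for illustration).
$wgMaxArticleSize   = 2048;    # max wikitext size, in KB
$wgMaxPPNodeCount   = 1000000; # max preprocessor nodes per parse
$wgMaxPPExpandDepth = 40;      # max template expansion nesting depth
```

Lowering these trades page complexity for predictability: a page that exceeds a limit renders with an error message instead of exhausting memory mid-parse.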


