Last modified: 2014-08-06 23:25:21 UTC
Looking in our Apache logs I can see tonnes of requests like the following:

    "GET /wiki/load.php?debug=false&lang=en&modules=ext.wikiEditor%7Cext.wikiEditor.dialogs%2Ctoolbar%7Cext.wikiEditor.toolbar.hideSig%7Cjquery.wikiEditor%7Cjquery.wikiEditor.dialogs%2Ctoolbar%7Cjquery.wikiEditor.dialogs.config%7Cjquery.wikiEditor.toolbar.config%2Ci18n&skin=vector&version=20140806T052205Z&* HTTP/1.0"

If I manually download one of those pages, I see it has nice "cache me!" headers on it (basically: cache this page for a month):

    Date: Wed, 06 Aug 2014 22:54:18 GMT
    Server: Apache
    X-Powered-By: PHP/5.3.3
    X-Content-Type-Options: nosniff
    Last-Modified: Wed, 06 Aug 2014 22:54:18 GMT
    Cache-Control: public, max-age=2592000, s-maxage=2592000
    Expires: Fri, 05 Sep 2014 22:54:18 GMT
    Connection: close
    Content-Type: text/javascript; charset=utf-8

That sounds great, but the URL is never downloaded again: the next time a user comes back, they get a different URL variant and download it all over again. I can see that every time users edit pages, this (effectively identical) URL is re-fetched because the version string has changed.

Surely that version variable at the end is at fault? Shouldn't it contain the version of WikiEditor instead, so that everyone downloads the same URL every time (until there's an actual code change, of course)?

For background: we're running MediaWiki as an HTTPS-based intranet server, and I'm noticing it's quite slow because large numbers of staff are countries away from the server, so these big repeated downloads are really noticeable due to latency. I'm trying to work out what could be cached to improve responsiveness.

Thanks!