Last modified: 2014-05-15 13:37:59 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and apart from displaying bug reports and their history, links might be broken. See T62003, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 60003 - HTTP 503 error when requesting linked data for large entities
Status: VERIFIED FIXED
Product: MediaWiki extensions
Classification: Unclassified
Component: WikidataRepo
Version: unspecified
Hardware/OS: All / All
Priority: High
Severity: major
Target Milestone: ---
Assigned To: Wikidata bugs
Whiteboard: varnish u=dev c=infrastructure p=0
Keywords: performance
Depends on:
Blocks:

Reported: 2014-01-13 15:52 UTC by James Clarke
Modified: 2014-05-15 13:37 UTC
CC: 10 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description James Clarke 2014-01-13 15:52:11 UTC
For some entities, Wikidata's Special:EntityData linked data URIs return HTTP 503.

$ curl -I https://www.wikidata.org/wiki/Special:EntityData/Q30.json
HTTP/1.1 503 Service Unavailable

$ curl -I https://www.wikidata.org/wiki/Special:EntityData/Q30.nt
HTTP/1.1 503 Service Unavailable

$ curl -I https://www.wikidata.org/wiki/Special:EntityData/Q30.rdf
HTTP/1.1 503 Service Unavailable

Access via api.php works fine:

$ curl -I "https://www.wikidata.org/w/api.php?action=wbgetentities&ids=q30&format=json"
HTTP/1.1 200 OK

Other examples:

 * Q30
 * Q148
 * Q145
Comment 1 Daniel Kinzler 2014-01-14 11:33:48 UTC
Hypothesis: large entities trigger an Out of Memory error, because SpecialEntityData uses output buffering on the JSON, effectively doubling the memory footprint of the serialized entity.
Comment 2 Daniel Kinzler 2014-02-06 17:25:53 UTC
So, according to Chris & Chad, since this error doesn't show in fatal.log, it's not an OutOfMemory error. And it's not a timeout either. We'll need assistance from someone with shell access to identify the problem.
Comment 3 Aude 2014-02-06 17:58:49 UTC
the error is:

Request: GET http://www.wikidata.org/wiki/Special:EntityData/Q30.json, from 10.64.0.102 via cp1065 cp1065 ([10.64.0.102]:3128), Varnish XID 2665166859
Forwarded for: 87.138.110.76, 91.198.174.103, 208.80.154.77, 10.64.0.102
Error: 503, Service Unavailable at Thu, 06 Feb 2014 17:30:26 GMT

Seems to come from Varnish. Maybe we are hitting some memory or other limit?
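
One way to confirm that the error page is produced by the cache layer rather than the MediaWiki backend is to inspect the response headers; the header names below (Server, Via, X-Cache) are the ones Wikimedia's Varnish caches typically set and are an assumption, not taken from this report:

$ curl -sI https://www.wikidata.org/wiki/Special:EntityData/Q30.json | grep -iE "^(server|via|x-cache)"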
Comment 4 Daniel Kinzler 2014-02-06 18:30:25 UTC
According to Katie, it's a problem with Varnish:

  [18:31] <aude> DanielK_WMDE: it says "Varnish XID 2665166859"
  [18:32] <aude> tells me it probably is varnish
  [18:33] <aude> it's a text varnish cache
  [18:33] <aude> cp1065.eqiad.wmnet

So I suppose we'll have to ask Mark about it.
Comment 5 Marius Hoch 2014-05-04 19:38:36 UTC
This is indeed a Varnish problem; fetching the data directly from one of the Apaches works fine (and fast).
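
For reference, bypassing Varnish means sending the request straight to a backend application server with the public Host header set; the host name below is purely hypothetical, and such a request only works from inside Wikimedia's network:

$ curl -I -H "Host: www.wikidata.org" http://mw1234.eqiad.wmnet/wiki/Special:EntityData/Q30.json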
Comment 6 Mark Bergsma 2014-05-06 17:20:22 UTC
Varnish 503s with this error message:

  170 FetchError   c straight insufficient bytes

which I think is a problem with compression that we've seen before. Does this vary depending on whether or not you send an Accept-Encoding header of gzip/deflate?
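
The same curl pattern used in the description can check this; the second command below is just an example of an explicit gzip/deflate request, and comparing the two status lines shows whether the 503 depends on compression negotiation:

$ curl -I https://www.wikidata.org/wiki/Special:EntityData/Q30.json
$ curl -I -H "Accept-Encoding: gzip,deflate" https://www.wikidata.org/wiki/Special:EntityData/Q30.json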
Comment 7 Gerrit Notification Bot 2014-05-06 17:49:49 UTC
Change 131746 had a related patch set uploaded by Hoo man:
Don't set Content-Length in EntityDataRequestHandler

https://gerrit.wikimedia.org/r/131746
Comment 8 Gerrit Notification Bot 2014-05-06 21:04:38 UTC
Change 131746 merged by jenkins-bot:
Don't set Content-Length in EntityDataRequestHandler

https://gerrit.wikimedia.org/r/131746
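
Assuming the merged change simply stops Special:EntityData from emitting a Content-Length header (as its title suggests), the originally failing request can be re-checked and its relevant headers inspected like this (a sketch):

$ curl -sI https://www.wikidata.org/wiki/Special:EntityData/Q30.json | grep -iE "^(HTTP|Content-Length)"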
