Last modified: 2014-04-29 14:16:46 UTC
As an API user it is confusing if I have to mix API payload with request-handling (retry) parameters, e.g.:

HEADER: User-Agent: ... test@example.org ...
HEADER: Accept-Encoding: gzip
HEADER: Accept: application/json
REQUEST: api.php?action=query&meta=siteinfo&maxlag=5

says: My client is written by "test@example.org", it will accept gzipped JSON responses, and it will send you a GET request meaning: give me the siteinfo of the wiki, please (and by the way: my client supports Wikipedia's maxlag request throttling).

What do you think about:

HEADER: X-Accept-MediaWiki: maxlag

says: My client ... will accept correct retry handling of responses with status code 200 or 503 carrying the headers:

MediaWiki-API-Error: maxlag
(optional) Retry-After: [0-9]+
X-Database-Lag: [0-9]+

See also: http://en.wikipedia.org/wiki/Content_negotiation
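A client that opted in via the proposed header would then need the retry handling described above. A minimal sketch (in Python; the header names `X-Accept-MediaWiki`, `MediaWiki-API-Error`, `X-Database-Lag` are taken from this proposal, not an existing MediaWiki feature) of deciding whether and how long to wait before retrying:

```python
def maxlag_retry_delay(status, headers):
    """Return the number of seconds to wait before retrying,
    or None if the response is not a maxlag rejection.

    `headers` is a dict of response headers. Per the proposal above,
    a maxlag rejection can arrive with status 200 or 503, so the
    decision is keyed on MediaWiki-API-Error rather than the status.
    """
    if headers.get("MediaWiki-API-Error") != "maxlag":
        return None
    # Prefer an explicit Retry-After; fall back to the reported DB lag.
    if "Retry-After" in headers:
        return int(headers["Retry-After"])
    if "X-Database-Lag" in headers:
        return int(float(headers["X-Database-Lag"]))
    return 5  # conservative default wait when no hint is given


# Example: a 503 maxlag rejection asking the client to wait 5 seconds.
print(maxlag_retry_delay(503, {
    "MediaWiki-API-Error": "maxlag",
    "Retry-After": "5",
    "X-Database-Lag": "6.2",
}))
```

The point of the sketch is that this logic lives entirely at the client/transport level; the API request itself never has to mention maxlag.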
I'm inclined to WONTFIX this. What is the advantage to specifying the maxlag as a header rather than a parameter? How is it confusing to have the maxlag error be reported as an API error rather than a HTTP error? Regarding the latter, you may want to review bug 38716 and the discussion thread beginning at http://lists.wikimedia.org/pipermail/mediawiki-api/2009-July/001227.html.
I see no problem with HTTP status codes or API responses. I think the advantage of separating headers from request parameters is that there is no mix between client and API abilities. My separation assumes that the main goal of the MediaWiki API is to work with _wiki contents_, and less to handle server performance in high-traffic environments.

Application level: do some magic with e.g. categories and their members
API level: map application data from/to something like JSON (uses params); select the endpoint URLs for API requests
Client level: send the data to MW; handle gzip compression, throttling (uses headers), encoding; apply user-agent, cookies, caching, ...

If you have an existing application with these layers and you want to add the maxlag ability, I see the following possibilities:

* API level
** (3) change endpoint selection (create URLs with maxlag)
** (2,3) handle maxlag responses from the client level
* Client level
** (2) mutate endpoint URLs from the API level
** (1) add a request header
** (1,2) handle maxlag

The solution with the header field (1) wins for me, because it doesn't mutate the original request and breaks no layers between server performance and wiki data. Sorry, it's only an idea, so if you think it's too exotic to give bots this additional possibility to separate server performance from working with content, feel free to close this with WONTFIX.

```
function maxLag( $params, $headers ) {
	if ( isset( $params['maxlag'] ) ) {
		return $params['maxlag'];
	} elseif ( isset( $headers['X-Accept-MediaWiki']['maxlag'] ) ) {
		return $headers['X-Accept-MediaWiki']['maxlag'];
	}
	return null;
}
```
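To illustrate option (1) from the list above: the client level can annotate the request without touching the URL the API level produced. A hedged sketch in Python (the header name `X-Accept-MediaWiki` and its `maxlag=N` value format are from this proposal, not an existing MediaWiki feature):

```python
def add_maxlag_header(url, headers, maxlag=5):
    """Client-level opt-in: advertise maxlag support via the proposed
    X-Accept-MediaWiki header instead of rewriting the query string.

    Returns the untouched API-level URL and a new headers dict; the
    original headers dict is not modified either.
    """
    new_headers = dict(headers)
    new_headers["X-Accept-MediaWiki"] = "maxlag=%d" % maxlag
    return url, new_headers


url, headers = add_maxlag_header(
    "api.php?action=query&meta=siteinfo",
    {"Accept": "application/json"},
)
print(url)                                # URL unchanged by the client level
print(headers["X-Accept-MediaWiki"])      # maxlag=5
```

This is the layering argument in code: server-performance concerns ride in the headers, while the request for wiki content stays exactly as the API level built it.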
I don't think we need two different ways to do maxlag.