Last modified: 2014-04-29 14:16:46 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and for historical purposes. It is not possible to log in and except for displaying bug reports and their history, links might be broken. See T66508, the corresponding Phabricator task for complete and up-to-date bug report information.
Bug 64508 - Add a Maxlag header
Status: RESOLVED WONTFIX
Product: MediaWiki
Classification: Unclassified
Component: API (Other open bugs)
Version: 1.24rc
Hardware: All
OS: All
Importance: Low enhancement (vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks:
Reported: 2014-04-27 14:14 UTC by Thomas
Modified: 2014-04-29 14:16 UTC
CC: 4 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Thomas 2014-04-27 14:14:34 UTC
As an API user it is confusing if I have to mix the API payload with request-handling (retry) parameters.

E.g.:
HEADER:  User-Agent:... test@example.org ...
HEADER:  Accept-Encoding: gzip
HEADER:  Accept: application/json
REQUEST: api.php?action=query&meta=siteinfo&maxlag=5

This says: My client is written by "test@example.org", it will accept gzipped JSON responses, and it will send you a GET request: give me the siteinfo of the wiki, please (and by the way: my client supports Wikipedia's maxlag request throttling).
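A minimal sketch of this parameter-based flow. The helper and client names here are hypothetical; only the maxlag parameter and the headers shown above come from the actual API:

```python
from urllib.parse import urlencode

# Hypothetical client helper: the retry knob (maxlag) has to be mixed
# into the API payload, while client abilities go into the headers.
def build_request(endpoint, payload, maxlag=None):
    params = dict(payload)
    if maxlag is not None:
        params["maxlag"] = maxlag  # request handling, not wiki content
    headers = {
        "User-Agent": "example-client/1.0 (test@example.org)",
        "Accept-Encoding": "gzip",
        "Accept": "application/json",
    }
    return endpoint + "?" + urlencode(params), headers

url, headers = build_request(
    "https://en.wikipedia.org/w/api.php",
    {"action": "query", "meta": "siteinfo"},
    maxlag=5,
)
```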

...

What do you think about:
HEADER:  X-Accept-MediaWiki:maxlag

This says: My client ... will accept correct retry handling of responses with status code 200 or 503 and the headers:
  MediaWiki-API-Error: maxlag (optional)
  Retry-After: [0-9]+
  X-Database-Lag: [0-9]+
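A rough sketch of what a client honoring these response headers might do. The function and the `retry` callback are illustrative; the header names are the ones listed above:

```python
# Illustrative client-side handler for the maxlag-related response
# headers listed above; `retry` is a caller-supplied callback that
# waits the given number of seconds before resending the request.
def handle_maxlag(headers, retry):
    if headers.get("MediaWiki-API-Error") == "maxlag":
        # Server is lagged: wait the advertised time, then resend.
        wait = int(headers.get("Retry-After", "5"))
        retry(wait)
        return True   # request should be retried
    return False      # normal response, proceed

waits = []
retried = handle_maxlag(
    {"MediaWiki-API-Error": "maxlag", "Retry-After": "7", "X-Database-Lag": "7"},
    waits.append,
)
```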

See Also: http://en.wikipedia.org/wiki/Content_negotiation
Comment 1 Brad Jorsch 2014-04-28 17:10:36 UTC
I'm inclined to WONTFIX this. What is the advantage of specifying the maxlag as a header rather than a parameter? How is it confusing to have the maxlag error reported as an API error rather than an HTTP error?

Regarding the latter, you may want to review bug 38716 and the discussion thread beginning at http://lists.wikimedia.org/pipermail/mediawiki-api/2009-July/001227.html.
Comment 2 Thomas 2014-04-28 19:41:52 UTC
I see no problem with the HTTP status code or the API responses.

I think the advantage of separating header and request parameters is that there is no mix between client and API abilities.

My separation assumes that the main goal of the MediaWiki API is to work with _wiki contents_, and less to handle server performance in high-traffic environments.

  Application Level : Do some magic with e.g. categories and its members
  API Level         : map application data from/to something like json
  (uses params)       select the endpoint urls for api requests
  Client Level      : send the data to MW, handle gz compression, throttling
  (uses headers)      encoding, applies user-agent, cookies, caching, ...

If you have an existing application with these layers and you want to add the maxlag ability, I see the following possibilities:

* API Level
** (3)   change endpoint selection (create URLs with maxlag)
** (2,3) handle maxlag responses from client level

* Client Level
** (2)   mutate endpoint urls from api level
** (1)   add request header
** (1,2) handle maxlag
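Option (1) could look roughly like this. The class and method names are made up for illustration; X-Accept-MediaWiki is the header proposed in this report:

```python
# Hypothetical client level from the layering above: it owns the
# headers and passes the API-level URL through untouched.
class ClientLevel:
    def __init__(self, accept_maxlag=False):
        self.accept_maxlag = accept_maxlag

    def prepare(self, url, headers=None):
        headers = dict(headers or {})
        if self.accept_maxlag:
            # Header proposed in this report (never implemented).
            headers["X-Accept-MediaWiki"] = "maxlag"
        return url, headers  # original request URL is not mutated

url, headers = ClientLevel(accept_maxlag=True).prepare(
    "api.php?action=query&meta=siteinfo"
)
```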

The solution with the header field (1) wins for me, because it doesn't mutate the original request and breaks no layers between server performance and wiki data.

Sorry, it's only an idea, so if you think it's too exotic to give bots this additional possibility to separate server performance from working with content, feel free to close this as WONTFIX.

```
  // Sketch: the request parameter takes precedence over the
  // proposed X-Accept-MediaWiki header.
  function maxLag( array $params, array $headers ) {
    if ( isset( $params['maxlag'] ) ) {
      return $params['maxlag'];
    } elseif ( isset( $headers['X-Accept-MediaWiki']['maxlag'] ) ) {
      return $headers['X-Accept-MediaWiki']['maxlag'];
    }
    return null;
  }
```
Comment 3 Brad Jorsch 2014-04-29 14:16:46 UTC
I don't think we need two different ways to do maxlag.
