Last modified: 2014-09-17 11:25:30 UTC
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1447/
Reported by: Anonymous user
Created on: 2012-05-19 16:38:59
Subject: PostData() datas are not gzip compressed
Assigned to: xqt

Original description:
Using the 'pagegenerators', the data transfer is not compressed. That data can amount to several megabytes, so the transfer is very time consuming. I therefore think it would be a good idea to remove that line. Thank you for considering this.
It is still less time consuming than loading each page separately, e.g. 60 times. What is the intention of this request?
I agree that not spending more time is good. gzip typically compresses text data to about one third of its size, so the transfer time would be roughly one third as well. However, the data transfer in the generator is not compressed; this proposal is one way to solve that problem.
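The "about one third" figure can be sanity-checked with Python's standard `gzip` module. A standalone sketch (the sample wikitext is made up; a repetitive sample like this compresses even better than typical article prose):

```python
import gzip

# Hypothetical wikitext-like sample, repeated to reach a realistic size.
text = ("== Heading ==\nSome wikitext with [[links]] and {{templates}}.\n" * 200).encode("utf-8")

compressed = gzip.compress(text)
ratio = len(compressed) / len(text)
print(f"original: {len(text)} bytes, gzipped: {len(compressed)} bytes, ratio: {ratio:.2f}")
```

For a multi-megabyte API response, even a conservative ratio translates into a proportional saving in transfer time.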
This has nothing to do with the preloading generator. HTTP requests are done by Site.PostData() or Site.getUrl() with gzip compression enabled by default.
- **assigned_to**: nobody --> xqt
I understand; the header is present in the request. However, the response data is not compressed. This may be a bug in MediaWiki.
I don't know whether they are compressed or not. How did you determine that?
I confirmed it with Wireshark, a packet analysis tool.
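Besides a packet sniffer, the raw response bytes can also be checked in code: every gzip stream begins with the magic bytes 0x1f 0x8b. A small standalone sketch (the sample response bodies are made up for illustration):

```python
import gzip

def looks_gzipped(payload: bytes) -> bool:
    """Return True if the payload starts with the gzip magic bytes 0x1f 0x8b."""
    return payload[:2] == b"\x1f\x8b"

# Hypothetical API response bodies, one plain and one gzip-compressed.
plain_body = b'{"query": {"pages": {}}}'
gzipped_body = gzip.compress(plain_body)

print(looks_gzipped(plain_body))    # False
print(looks_gzipped(gzipped_body))  # True
```

If the body on the wire fails this check while the request sent `Accept-Encoding: gzip`, the server did not compress the response.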
Maybe the header is wrong. I found 'Accept-encoding' as the header parameter, but at https://www.mediawiki.org/wiki/API:Client_Code the header is documented as 'Accept-Encoding' (capital 'Encoding').
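As a side note, HTTP header field names are case-insensitive per RFC 2616, so a compliant server should treat both spellings identically (though matching the documented capitalization costs nothing). Python's own header container, `email.message.Message`, which urllib uses for HTTP response headers, also looks names up case-insensitively:

```python
from email.message import Message

# HTTP header field names are case-insensitive (RFC 2616, section 4.2);
# email.message.Message honors that in lookups.
headers = Message()
headers["Accept-encoding"] = "gzip"

print(headers.get("Accept-Encoding"))  # gzip
print(headers.get("ACCEPT-ENCODING"))  # gzip
```

So the capitalization difference alone is unlikely to explain an uncompressed response, but it is cheap to rule out.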
- **summary**: Do not use preloading in featured.py --> PostData() datas are not gzip compressed