Last modified: 2014-01-03 15:57:27 UTC
This issue was converted from https://jira.toolserver.org/browse/DBQ-115.
Summary: what
Issue type: Task - A task that needs to be done.
Priority: Major
Status: Done
Assignee: Hoo man <hoo@online.de>
-------------------------------------------------------------------------------
From: keerthi <vasan82@gmail.com>
Date: Wed, 29 Dec 2010 19:15:52
-------------------------------------------------------------------------------
-------------------------------------------------------------------------------
From: EdoDodo <dodo.wikipedia@gmail.com>
Date: Wed, 29 Dec 2010 19:33:02
-------------------------------------------------------------------------------
Please describe the task you would like done.
-------------------------------------------------------------------------------
From: keerthi <vasan82@gmail.com>
Date: Thu, 30 Dec 2010 05:14:38
-------------------------------------------------------------------------------
I need a complete dump of the pages in the following link, in SQL format, so that I can use the contents on my site: http://en.wikipedia.org/wiki/Category:Cooking. It is OK if the images are not available. Please help me with this. You can also point me to any other place or link where I can get dumps of cooking content alone.
-------------------------------------------------------------------------------
From: keerthi <vasan82@gmail.com>
Date: Fri, 31 Dec 2010 10:40:23
-------------------------------------------------------------------------------
Please resolve this issue for me.
-------------------------------------------------------------------------------
From: Hoo man <hoo@online.de>
Date: Fri, 31 Dec 2010 11:05:26
-------------------------------------------------------------------------------
We are not doing any custom dumps. If you want an enwikibooks dump, you have to use a full dump from the WMF. Those can be found here: http://download.wikimedia.org/enwikibooks/ (pages-articles.xml.bz2 should be the right one for your needs).
-------------------------------------------------------------------------------
From: keerthi <vasan82@gmail.com>
Date: Sat, 01 Jan 2011 06:24:27
-------------------------------------------------------------------------------
Is it possible to get the SQL dumps? Can you please provide the link for the SQL dumps? Thanks for your help.
-------------------------------------------------------------------------------
From: keerthi <vasan82@gmail.com>
Date: Sat, 01 Jan 2011 06:26:05
-------------------------------------------------------------------------------
Can you please provide the link for the SQL dumps?
-------------------------------------------------------------------------------
From: Hoo man <hoo@online.de>
Date: Sat, 01 Jan 2011 14:56:29
-------------------------------------------------------------------------------
This is the XML dump you need: http://download.wikimedia.org/enwikibooks/20101031/enwikibooks-20101031-pages-articles.xml.bz2 It can be imported as explained here: http://meta.wikimedia.org/wiki/Data_dumps#importDump.php (or, if you have set upload_max_filesize in your php.ini to a very high value, it might also work through Special:Import).
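For reference, a minimal command-line sketch of the importDump.php route described above. This assumes a working MediaWiki installation with shell access and that the commands are run from the MediaWiki root directory; importDump.php and rebuildrecentchanges.php are standard MediaWiki maintenance scripts, but exact paths depend on your setup.

```shell
# Sketch only: assumes a working MediaWiki install and shell access,
# run from the MediaWiki root directory.

# 1. Download the compressed XML dump named in the comment above.
wget http://download.wikimedia.org/enwikibooks/20101031/enwikibooks-20101031-pages-articles.xml.bz2

# 2. Stream the dump into importDump.php without decompressing
#    the whole file to disk first.
bzcat enwikibooks-20101031-pages-articles.xml.bz2 | php maintenance/importDump.php

# 3. Rebuild derived tables (e.g. recent changes) after the import.
php maintenance/rebuildrecentchanges.php
```

Note that importDump.php populates the wiki's own SQL database as it reads the XML, so this route also answers the request for SQL-format data without needing a separate SQL dump.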
This bug was imported as RESOLVED. The original assignee has therefore not been set, and the original reporters/responders have not been added as CC, to prevent bugspam. If you re-open this bug, please consider adding these people to the CC list:
Original assignee: hoo@online.de
CC list: dodo.wikipedia@gmail.com, hoo@online.de