Last modified: 2014-09-18 12:59:38 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and apart from displaying bug reports and their history, links may be broken. See T57029, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 55029 - When run in -async mode, the log output sequence is jumbled
Status: NEW
Product: Pywikibot
Classification: Unclassified
Component: General (Other open bugs)
Version: unspecified
Hardware: All All
Importance: Unprioritized enhancement
Target Milestone: ---
Assigned To: Pywikipedia bugs
Depends on:
Blocks:
Reported: 2013-10-05 04:06 UTC by Kunal Mehta (Legoktm)
Modified: 2014-09-18 12:59 UTC
CC: 2 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Kunal Mehta (Legoktm) 2013-10-05 04:06:37 UTC
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/315/
Reported by: yfdyh000
Created on: 2012-07-15 10:53:38
Subject: When run in -async mode, the log output sequence is jumbled
Original description:
In asynchronous mode, it would be better to wait until all of the update details for one page have been fully output before inserting other messages.
"Updating links on page [[xx:XXX]]" should not be inserted into the middle of another page's continuous log output.
The "Changes to be made:" lines for the same page should be output as one continuous block.
Comment 1 Kunal Mehta (Legoktm) 2013-10-05 04:06:40 UTC
That does not make any sense to me. If we have to wait for the update so that the output from the script and from the async save routine ends up in the same place, the entire advantage of async saving is gone!
Comment 2 Kunal Mehta (Legoktm) 2013-10-05 04:06:42 UTC
Oh. Is the following idea feasible?

Suppose a situation:
# max_queue_size = 10
...
# The put queue currently holds 9 waiting items
Getting 60 pages from wikipedia:en...
# 10+ pages are added to the put queue at the same time
# The put queue now holds 19+ waiting items
======Post-processing [[en:xxx]]======
"Updating links on page [[en:xxx]]." 10+
...
"Updating page [[en:xxx]] via API" 10+
"Getting 60 pages from wikipedia:de..."
......
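The scenario above can be sketched with a bounded put queue serviced by a background saver thread. This is a minimal illustration, not pywikibot's actual code; the names async_saver and the page titles are made up. Because the saver runs in its own thread, its log lines naturally interleave with the fetcher's output, which is exactly the jumbling this report describes.

```python
# Illustrative sketch (not pywikibot's implementation): a bounded "put queue"
# drained by a background worker, mirroring the max_queue_size scenario.
import queue
import threading

put_queue = queue.Queue(maxsize=10)   # cf. max_queue_size = 10
saved = []                            # records completed "saves"

def async_saver():
    # Background thread: drains the queue and "saves" pages one by one.
    while True:
        page = put_queue.get()
        if page is None:              # sentinel: stop the worker
            break
        saved.append(page)            # a real bot would call the API here,
                                      # printing "Updating page ... via API"
        put_queue.task_done()

worker = threading.Thread(target=async_saver)
worker.start()

# The main thread keeps fetching and queueing pages; put() blocks once
# 10 items are pending, but otherwise both threads print concurrently,
# so their log lines interleave.
for i in range(15):
    put_queue.put(f"en:Page_{i}")

put_queue.put(None)                   # tell the worker to finish
worker.join()
```

With a single FIFO queue and one consumer, the pages are still saved in submission order; only the console output from the two threads is interleaved.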
Comment 3 Kunal Mehta (Legoktm) 2013-10-05 04:06:43 UTC
What you want is not really clear to me, but I suppose you want to let the bot stall when there are 10 pages waiting to be saved?

What would be the use case for that?
Comment 4 Kunal Mehta (Legoktm) 2013-10-05 04:06:45 UTC
I mean that a page and all of its associated pages should be treated as one unit for output and updates, rather than treating each page as its own unit.
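One way to achieve the per-page grouping requested here would be to buffer every log line belonging to a page and then flush the whole block atomically. The sketch below is hypothetical (PageLogBuffer is not a pywikibot API; the titles and messages are invented), shown only to illustrate the idea:

```python
# Hypothetical helper: collect all log lines for one page, then print them
# as a single uninterrupted block, so "Updating links on page [[en:xxx]]"
# cannot land in the middle of another page's "Changes to be made:" output.
import io
import threading

_print_lock = threading.Lock()        # only one page's block prints at a time

class PageLogBuffer:
    """Buffers log lines for one page; flush() emits them atomically."""

    def __init__(self, title):
        self.title = title
        self._buf = io.StringIO()

    def log(self, line):
        self._buf.write(line + "\n")  # held back until flush()

    def flush(self):
        with _print_lock:
            print(f"======Post-processing [[{self.title}]]======")
            print(self._buf.getvalue(), end="")

buf = PageLogBuffer("en:Example")
buf.log("Changes to be made: +2 links")
buf.log("Updating page [[en:Example]] via API")
buf.flush()                           # the two lines appear together
```

The async worker could call flush() once per page instead of printing each message immediately, which keeps the throughput benefit of async saving while grouping the output.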
Comment 5 John Mark Vandenberg 2014-09-18 12:59:38 UTC
Is this an enhancement for core or compat?


