Last modified: 2014-09-30 13:48:59 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and except for displaying bug reports and their history, links might be broken. See T73086, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 71086 - ApiQueryAllUsers with auactiveusers fails on duplicate querycachetwo rows
Status: RESOLVED FIXED
Product: MediaWiki
Classification: Unclassified
Component: API (Other open bugs)
Version: unspecified
Hardware: All
OS: All
Priority: High
Severity: normal
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks:

Reported: 2014-09-20 14:13 UTC by Marius Hoch
Modified: 2014-09-30 13:48 UTC
CC: 5 users

See Also:
Web browser: ---
Mobile Platform: ---
Huggle Beta Tester: ---


Attachments

Description Marius Hoch 2014-09-20 14:13:21 UTC
Rows in querycachetwo aren't enforced to be unique; that can lead to the following error:


2014-09-20 13:11:57 mw1194 commonswiki: [526b05b6] /w/api.php?format=json&list=allusers&aufrom=AnnaDydo&action=query&auactiveusers=1&aulimit=500   Exception from line 1821 of /srv/mediawiki/php-1.24wmf21/includes/api/ApiBase.php: Internal error in ApiQueryAllUsers::execute: MediaWiki configuration error: The database contains more user groups than known to User::getAllGroups() function
#0 /srv/mediawiki/php-1.24wmf21/includes/api/ApiQueryAllUsers.php(286): ApiBase::dieDebug('ApiQueryAllUser...', 'MediaWiki confi...')
#1 /srv/mediawiki/php-1.24wmf21/includes/api/ApiQuery.php(282): ApiQueryAllUsers->execute()
#2 /srv/mediawiki/php-1.24wmf21/includes/api/ApiMain.php(930): ApiQuery->execute()
#3 /srv/mediawiki/php-1.24wmf21/includes/api/ApiMain.php(364): ApiMain->executeAction()
#4 /srv/mediawiki/php-1.24wmf21/includes/api/ApiMain.php(335): ApiMain->executeActionWithErrorHandling()
#5 /srv/mediawiki/php-1.24wmf21/api.php(85): ApiMain->execute()
#6 /srv/mediawiki/w/api.php(3): require('/srv/mediawiki/...')
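
For context, a minimal diagnostic sketch runnable from eval.php (table and column names are taken from the report; the absent unique index is the report's premise):

$dbr = wfGetDB( DB_SLAVE );
// With a unique index on (qcc_type, qcc_namespace, qcc_title) this query
// could never return rows; each row it does return is a duplicated entry.
$res = $dbr->query(
	"SELECT qcc_title, COUNT(*) AS copies
	FROM querycachetwo
	WHERE qcc_type = 'activeusers' AND qcc_namespace = 2
	GROUP BY qcc_title
	HAVING COUNT(*) > 1"
);
foreach ( $res as $row ) {
	print "{$row->qcc_title}: {$row->copies}\n";
}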
Comment 1 Gerrit Notification Bot 2014-09-20 14:15:14 UTC
Change 161664 had a related patch set uploaded by Hoo man:
Don't INNER JOIN querycachetwo in ApiQueryAllUsers

https://gerrit.wikimedia.org/r/161664
Comment 2 Brad Jorsch 2014-09-22 20:34:50 UTC
But the rows there *should* be unique: SpecialActiveUsers::doQueryCacheUpdate() takes a lock to make sure only one process is updating at a time, and with the lock held it uniquifies the list of names to be inserted against what's already in the database.

Aaron, any ideas?
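
For reference, a hedged sketch of the pattern Brad describes ($names as a name-keyed array and the lock key are illustrative, not the exact SpecialActiveUsers source):

$dbw = wfGetDB( DB_MASTER );
if ( !$dbw->lock( 'activeusers-updates', __METHOD__, 10 ) ) {
	return; // another process is already updating the cache
}
// With the lock held, uniquify: drop names that already have a cached row...
$res = $dbw->select(
	'querycachetwo',
	'qcc_title',
	array( 'qcc_type' => 'activeusers', 'qcc_title' => array_keys( $names ) ),
	__METHOD__
);
foreach ( $res as $row ) {
	unset( $names[$row->qcc_title] );
}
// ...insert only what remains, then release the lock.
$dbw->unlock( 'activeusers-updates', __METHOD__ );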
Comment 3 Aaron Schulz 2014-09-23 00:21:42 UTC
(In reply to Brad Jorsch from comment #2)
> But the rows there *should* be unique:
> SpecialActiveUsers::doQueryCacheUpdate() takes a lock to make sure only one
> process is updating at a time, and with the lock held it uniquifies the list
> of names to be inserted against what's already in the database.
> 
> Aaron, any ideas?

The updating method should make sure it doesn't happen in a transaction unless that transaction started after the lock() call. Alternatively, the query that gets the existing names (doing <<'qcc_title' => array_keys( $names )>>) could use LOCK IN SHARE MODE.

Both of these get around stale snapshots. For the first option, the i18n staleness message could use tweaking in case DB writes are already pending from other work and the update gets deferred until after the commit; the user might not see any results the first time the cache is filled.

As for the duplicates already there, I guess the cache can just be cleared/rebuilt.
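
A hedged sketch of the two remedies Aaron outlines ($dbw, $names, and the lock key are illustrative; the LOCK IN SHARE MODE option to Database::select() is real):

// Option 1: make sure no transaction predates the lock, so the later
// read doesn't see a stale REPEATABLE READ snapshot.
if ( $dbw->trxLevel() ) {
	$dbw->commit( __METHOD__, 'flush' ); // flush any pending writes first
}
$dbw->lock( 'activeusers-updates', __METHOD__, 10 );

// Option 2: turn the existence check into a locking read instead.
$res = $dbw->select(
	'querycachetwo',
	'qcc_title',
	array( 'qcc_type' => 'activeusers', 'qcc_title' => array_keys( $names ) ),
	__METHOD__,
	array( 'LOCK IN SHARE MODE' )
);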
Comment 4 Aaron Schulz 2014-09-23 16:28:32 UTC
It could also be the connection dropping and reconnecting in CLI mode when run for $wgSpecialPageCacheUpdates. The DB layer will reconnect, but the lock will be gone.
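
Named MySQL locks (GET_LOCK()) are per-connection, so an automatic reconnect silently releases them. A hedged sketch of one defensive check (an illustration, not necessarily what the merged change 162287 does):

// If the named lock shows as free, the connection was likely reset and
// we no longer hold it; bail out rather than risk duplicate inserts.
if ( $dbw->lockIsFree( 'activeusers-updates', __METHOD__ ) ) {
	return;
}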
Comment 5 Gerrit Notification Bot 2014-09-23 16:48:28 UTC
Change 162287 had a related patch set uploaded by Aaron Schulz:
Fixes to prevent duplicate rows in ActiveUser cache

https://gerrit.wikimedia.org/r/162287
Comment 6 Gerrit Notification Bot 2014-09-24 14:49:41 UTC
Change 162287 merged by jenkins-bot:
Fixes to prevent duplicate rows in ActiveUser cache

https://gerrit.wikimedia.org/r/162287
Comment 7 Brad Jorsch 2014-09-24 15:06:38 UTC
(In reply to Aaron Schulz from comment #3)
> As for the duplicates already there, I guess the cache can just be
> cleared/rebuilt.

Do we have a script for that, or is it more along the lines of "use sql.php to delete all the rows then maintenance/updateSpecialPages.php with --only to rebuild it"?
Comment 8 Aaron Schulz 2014-09-29 20:52:10 UTC
(In reply to Brad Jorsch from comment #7)
> (In reply to Aaron Schulz from comment #3)
> > As for the duplicates already there, I guess the cache can just be
> > cleared/rebuilt.
> 
> Do we have a script for that, or is it more along the lines of "use sql.php
> to delete all the rows then maintenance/updateSpecialPages.php with --only
> to rebuild it"?

That's it. I just deleted the excess rows on enwiki manually in eval.php:

$dbw = wfGetDB( DB_MASTER );
// Find the user titles that have duplicate 'activeusers' rows (NS 2 = User).
$res = $dbw->query( "SELECT qcc_title FROM querycachetwo WHERE qcc_type='activeusers' AND qcc_namespace = 2 GROUP BY qcc_title HAVING COUNT(*) > 1" );
$titles = array();
foreach ( $res as $row ) {
	$titles[] = $row->qcc_title;
}
// Delete every row for those titles; the next cache update re-adds each once.
$dbw->delete( 'querycachetwo', array( 'qcc_type' => 'activeusers', 'qcc_title' => $titles ) );
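
For the rebuild half of comment 7, the corresponding maintenance run would presumably be something like the following (the --only value is an assumption and must match the key registered in $wgSpecialPageCacheUpdates):

php maintenance/updateSpecialPages.php --wiki=enwiki --only=activeusers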
Comment 9 Brad Jorsch 2014-09-29 21:17:30 UTC
Ok. We know commonswiki still needs it, and may as well do it for all wikis. If you don't get to it this afternoon (your time) I'll do it tomorrow morning (my time).
Comment 10 Gerrit Notification Bot 2014-09-29 21:59:14 UTC
Change 161664 abandoned by Hoo man:
Don't INNER JOIN querycachetwo in ApiQueryAllUsers

Reason:
I guess someone came up with some other solution

https://gerrit.wikimedia.org/r/161664
Comment 11 Aaron Schulz 2014-09-29 22:20:43 UTC
(In reply to Brad Jorsch from comment #9)
> Ok. We know commonswiki still needs it, and may as well do it for all wikis.
> If you don't get to it this afternoon (your time) I'll do it tomorrow
> morning (my time).

I ran it on all wikis just now.
