
Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and apart from displaying bug reports and their history, links might be broken. See T49063, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 47063 - Jenkins: Investigate using concurrent builds of the same job
Status: RESOLVED FIXED
Product: Wikimedia
Classification: Unclassified
Component: Continuous integration
Version: wmf-deployment
Hardware: All
OS: All
Importance: Normal enhancement
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on: 58094
Blocks:
Reported: 2013-04-09 23:51 UTC by Antoine "hashar" Musso (WMF)
Modified: 2014-03-10 10:24 UTC
CC List: 4 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Antoine "hashar" Musso (WMF) 2013-04-09 23:51:07 UTC
Jenkins can run several builds of the same job at once by isolating the code in different workspaces. I tried it out by creating a job that sleeps for 30 seconds and triggering it several times. The disk shows:

$ ls -1 -d workspace*
workspace/
workspace@2/
workspace@3/
workspace@4/
$

That does not work with custom workspaces, though; we will have to handle those ourselves.
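
A minimal sketch of what handling it ourselves could look like, assuming the job's build step is a shell script and relying on the JOB_NAME and EXECUTOR_NUMBER environment variables that Jenkins exports to builds (the base path below is made up for illustration, not taken from this bug):

# Give each executor its own directory so concurrent builds of the same
# job do not share a custom workspace (hypothetical layout).
BASE=/srv/jenkins/custom-workspaces/$JOB_NAME
DIR="$BASE/executor-$EXECUTOR_NUMBER"
mkdir -p "$DIR"
cd "$DIR"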

A use case would be the MediaWiki parser tests; the risk is that we could end up with all the Jenkins executors busy running parser tests, which would delay the other, faster tests.
Comment 1 Krinkle 2013-04-10 05:50:45 UTC
Concurrently running different jobs or concurrent builds of the same job?

As far as I know we already run concurrent builds of different jobs. E.g. on patchset creation, mediawiki-core-phpunit-misc already runs simultaneously with mediawiki-core-phpunit-api.

Running concurrent builds of the same job (if that is even possible, is it?) will likely get rather tricky.

For one, it would no longer make our gate an effective gate, as it would no longer be testing the result as it will be when merged: conflicting commits could make it into the repository separately from each other, breaking the end result with no warning *at all* anywhere to be found (until the next unrelated commit comes in and fails, with nobody having a clue why).

Secondly, we have various jobs (such as in Parsoid) that have certain hacks in place to run twice in a row to use last-run build artefacts. Not sure how that goes with concurrency.

Except for this last category (Parsoid double-runs), concurrency would be fine as long as the gate pipeline does not use it (for the aforementioned reasons). In the check and test pipelines for patchset creation it should be fine.
Comment 2 Antoine "hashar" Musso (WMF) 2014-01-06 10:44:40 UTC
I made a lot of jobs concurrent in December. The mediawiki-core-phpunit-parsertests job had a race condition (bug 58094) which is apparently fixed.
Comment 3 Antoine "hashar" Musso (WMF) 2014-01-06 10:45:10 UTC
https://gerrit.wikimedia.org/r/#/c/102149/ fixed a race condition with .sqlite files.
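
As a rough sketch of the kind of isolation that avoids such a race (assumed layout only; the actual change is the Gerrit link above), the idea is to keep the .sqlite files inside the per-build workspace rather than in a shared path:

# Each concurrent build gets its own workspace (workspace, workspace@2, ...),
# so .sqlite files created under $WORKSPACE cannot collide across builds.
# The data/ subdirectory name is hypothetical.
SQLITE_DATA_DIR="$WORKSPACE/data"
mkdir -p "$SQLITE_DATA_DIR"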
Comment 4 Gerrit Notification Bot 2014-01-06 10:51:28 UTC
Change 102152 had a related patch set uploaded by Hashar:
(WIP) mediawiki-core-phpunit-* made concurrent (WIP)

https://gerrit.wikimedia.org/r/102152
Comment 5 Gerrit Notification Bot 2014-01-06 12:43:28 UTC
Change 102152 merged by jenkins-bot:
mediawiki-core-phpunit-* made concurrent

https://gerrit.wikimedia.org/r/102152
Comment 6 Antoine "hashar" Musso (WMF) 2014-01-08 16:37:05 UTC
The Wikibase / Wikidata jobs need to be made concurrent as well.
Comment 7 Antoine "hashar" Musso (WMF) 2014-03-10 10:24:21 UTC
Wikibase / Wikidata got moved to another CI system.

For mw/core, the last non-concurrent job was the qunit one, which Timo modified last Friday with https://gerrit.wikimedia.org/r/#/c/117603/


