Last modified: 2012-04-23 02:20:09 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and preserved for historical purposes. It is not possible to log in; the site only displays bug reports and their history, and other links may be broken. See T37315, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 35315 - XSS in CharInsert, forged strip markers
Status: RESOLVED FIXED
Product: MediaWiki
Classification: Unclassified
Component: Parser
Version: unspecified
Hardware: All
OS: All
Importance: High normal
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks: 35387
Reported: 2012-03-19 08:45 UTC by Tim Starling
Modified: 2012-04-23 02:20 UTC (History)
4 users (show)

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments
Patch for CharInsert (437 bytes, patch)
2012-03-19 11:11 UTC, Tim Starling
Details
Unstrip in CPF::tagObj() (903 bytes, patch)
2012-03-19 11:38 UTC, Tim Starling
Details
Limit the number of iterations of the unstrip loop to the total number of items to unstrip (1.20 KB, patch)
2012-03-21 01:16 UTC, Bawolff (Brian Wolff)
Details

Description Tim Starling 2012-03-19 08:45:00 UTC
-------- Original Message --------
Subject: 	XSS in &lt;charinsert&gt; when combined with faked strip item markers
Date: 	Sat, 17 Mar 2012 23:04:30 -0300
From: 	Bawolff Bawolff <bawolff@gmail.com>
To: 	security <security@wikimedia.org>

I saw r113981, and it made me wonder what sort of mischief someone
could get up to if they forged strip item markers. Anyhow, create a
page containing only the following:

<nowiki>','',''); alert("XSS",')</nowiki>

{{#tag:charinsert|{{padleft:|21|<nowiki/>}}-nowiki-00000002-QINU}}

This creates a link on the page which, when clicked, executes arbitrary
JS. Note this is slightly probabilistic, depending on how big a number
Parser::getRandomString() returns. Most of the time (I'd estimate ~75%)
it works. If it doesn't the first time, hit preview again. Using more
complex parser functions, one could probably make code that works 99.9%
of the time.
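The ~75% estimate can be sanity-checked. Assuming (this is an inference from the old code, not stated in this thread) that Parser::getRandomString() concatenated dechex(mt_rand(0, 0x7fffffff)) twice, the forged marker only lines up when both halves come out as full 8-digit hex values, because {{padleft:|21|<nowiki/>}} truncates the real marker prefix to exactly 21 characters ("\x7fUNIQ" plus 16 hex digits). A quick Python check of that assumption:

```python
# Assumption: each half of the old random marker suffix is
# dechex(mt_rand(0, 0x7fffffff)), which has 8 hex digits only
# when the drawn value is >= 0x10000000.
p_half = (0x7fffffff - 0x10000000 + 1) / 0x80000000  # P(one half has 8 digits)
p_both = p_half ** 2                                 # both halves, independent draws
print(p_both)  # 0.765625, i.e. roughly the ~75% success rate estimated above
```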

The link is ugly such that most users would probably not click on it,
but I imagine you could use css to make "clicking" on the link seem to
be a legitimate thing to do.

This particular issue could be fixed by calling $data =
$parser->mStripState->unstripBoth( $data ); from the CharInsert
extension. However, I'm concerned there may be similar issues with
other extensions.

Thanks,
Bawolff
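The suggested fix can be modelled in a few lines. This is an illustrative Python sketch (MediaWiki itself is PHP, and the real StripState API differs): the point is that markers are resolved back to their stored text *before* the extension builds its JavaScript, so nothing can be substituted in after escaping has happened.

```python
# Hypothetical model of a strip state: opaque markers map back to the
# original <nowiki> content that was hidden from the parser.
strip_state = {
    "\x7fUNIQ1234abcd-nowiki-00000001-QINU\x7f": "&lt;safe literal&gt;",
}

def unstrip_both(text, state):
    # Resolve every known marker; a single pass suffices here because the
    # stored values in this toy example contain no further markers.
    for marker, stored in state.items():
        text = text.replace(marker, stored)
    return text

data = "chars: \x7fUNIQ1234abcd-nowiki-00000001-QINU\x7f"
print(unstrip_both(data, strip_state))  # chars: &lt;safe literal&gt;
```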
Comment 1 Tim Starling 2012-03-19 10:58:41 UTC
I think it would be sensible to import ExtParserFunctions::killMarkers() into CoreParserFunctions and to run it on pretty much everything. It's surprising that there aren't a hundred bugs reporting "strip marker exposed in CoreParserFunction X". That might be disruptive, so maybe it can wait until 1.20 if an audit of popular tag hook extensions comes up clear.

Obviously CharInsert should be patched along the lines suggested by Bawolff. We don't usually announce extension vulnerabilities, but since it's a popular extension, maybe we could include a note about it in the mail to mediawiki-announce for 1.18.2.
Comment 2 Tim Starling 2012-03-19 11:11:21 UTC
Created attachment 10269 [details]
Patch for CharInsert

Issue reproduced, fix tested. I'm not sure whether to commit this now with a coy commit message or to wait until closer to 1.18.2.
Comment 3 Tim Starling 2012-03-19 11:38:38 UTC
Created attachment 10270 [details]
Unstrip in CPF::tagObj()

I think part of the blame lies with #tag, since before #tag was introduced, the arguments to tag hook functions could not include strip markers. Calling unstrip on the #tag inputs could be said to preserve interface backwards compatibility.

Patch proposed for 1.18.2. Ran parser tests.
Comment 4 Bawolff (Brian Wolff) 2012-03-19 19:32:54 UTC
(In reply to comment #3)
> Created attachment 10270 [details]
> Unstrip in CPF::tagObj()
> 
> I think part of the blame lies with #tag, since before #tag was introduced, the
> arguments to tag hook functions could not include strip markers. Calling
> unstrip on the #tag inputs could be said to preserve interface backwards
> compatibility.
> 
> Patch proposed for 1.18.2. Ran parser tests.

Hmm, that would break Wikipedia's fancy <ref> inside a <ref> template (when they do things like {{#tag:ref|Some note<ref>citation for note</ref>}}). Otherwise it seems like that would fix a lot of things.

Maybe have some way for tag extensions to specify that they do not want things unstripped, in order to make receiving strip markers opt-in (kind of reminds me of the /have a method to denote certain tag extensions should have PST be run on them/ bug).


Maybe removing U+007F in things like Xml::escapeTagsOnly and
MediaWiki's various other escaping functions would be a good idea. It
wouldn't fix the CharInsert issue, but it could prevent future problems
of a similar nature (I haven't actually looked to see if there's any
code that assumes strip markers can get through the escaping functions
safely, so it might not be that good an idea). Intuitively though, if I
pass something into an escaping function, that's usually the absolute
last step before output, and I would expect at that point that all
strip markers have already been unstripped and that there is no way for unescaped text to slip back in.
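The idea above can be sketched as follows. This is an illustrative Python version (the real functions live in MediaWiki's PHP Xml/Sanitizer code, and the helper name here is hypothetical): an escaping function that also drops U+007F breaks any UNIQ...QINU marker, so it can never be resolved back into raw text later.

```python
import html

def escape_and_kill_markers(text):
    # html.escape handles <, >, &, and quotes; stripping \x7f afterwards
    # destroys the marker delimiters, so a forged or leftover strip marker
    # that survives into "escaped" output can no longer be unstripped.
    return html.escape(text, quote=True).replace("\x7f", "")

print(escape_and_kill_markers("<b>\x7fUNIQ-x\x7f</b>"))  # &lt;b&gt;UNIQ-x&lt;/b&gt;
```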


----


Also of interest, the following causes the parser to go into an infinite loop:

<pre>test</pre>

{{#tag:pre|{{padleft:|21|<nowiki/>}}-pre-00000004-QINU}}
Comment 5 Tim Starling 2012-03-19 23:23:01 UTC
(In reply to comment #4)
> Hmm, that would break Wikipedia's fancy <ref> inside a <ref> template (when
> they do things like {{#tag:ref|Some note<ref>citation for note</ref>}}).
> Otherwise it seems like that would fix a lot of things.

Good point. And there already is a parser test that checks for that, I'm not sure how I missed it.
Comment 6 Tim Starling 2012-03-20 04:39:59 UTC
Committed killMarkers/markerStripCallback patch in r114231.
Comment 7 Tim Starling 2012-03-21 00:43:09 UTC
Suggested release announcement text:

A long-standing bug in the wikitext parser (bug 22555) was discovered to have security implications. In the presence of the popular CharInsert extension, it leads to cross-site scripting (XSS). XSS may be possible with other extensions or perhaps even the MediaWiki core alone, although this is not confirmed at this time. A denial-of-service attack (infinite loop) is also possible regardless of configuration.
Comment 8 Bawolff (Brian Wolff) 2012-03-21 01:16:14 UTC
Created attachment 10295 [details]
Limit the number of iterations of the unstrip loop to the total number of items to unstrip

Sounds good.


Also, might I suggest limiting the unstrip loop so that it is impossible to cause an infinite loop? (Hopefully the other change will prevent people from mucking with strip markers, but just in case, not having the potential for an infinite loop would be good too.)
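The safeguard in the attached patch can be sketched like this, as an illustrative Python model rather than the actual PHP patch: cap the replacement loop at the total number of items in the strip state, so even a pathological state where markers expand into each other must terminate.

```python
def unstrip_limited(text, state):
    # At most len(state) passes: each pass can only make progress if some
    # marker is still present, and a cycle of forged markers cannot run
    # longer than the number of stored items.
    for _ in range(len(state)):
        replaced = False
        for marker, stored in state.items():
            if marker in text:
                text = text.replace(marker, stored)
                replaced = True
        if not replaced:
            break  # nothing left to unstrip
    return text

# Two markers that expand into each other would loop forever in an
# unbounded version; here the loop stops after len(state) == 2 passes.
m1, m2 = "\x7fM1\x7f", "\x7fM2\x7f"
unstrip_limited(m1, {m1: m2, m2: m1})  # terminates
```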
Comment 9 Sam Reed (reedy) 2012-03-21 22:20:46 UTC
(In reply to comment #8)
> Created attachment 10295 [details]
> Limit the number of iterations of the unstrip loop to the total number of items
> to unstrip
> 
> Sounds good.
> 
> 
> Also might I suggest limiting the unstrip loop so that it is impossible to
> cause an indef loop (Hopefully the other change would prevent people from
> mucking with strip markers, but just in case, not having the potential for
> indef loop would be good to)

When I asked Tim about this, he didn't think it needed to be backported, but it's fine to go into trunk after the other commits have been made.
Comment 10 Tim Starling 2012-04-16 10:20:41 UTC
For XSS, this test case seems to work just as well:

{{#tag:charinsert|<nowiki>','',''); alert("XSS",')</nowiki>}}

It doesn't need forged strip markers. But the infinite loop in comment 4 does need forged strip markers.
Comment 11 Tim Starling 2012-04-23 02:20:09 UTC
(In reply to comment #8)
> Created attachment 10295 [details]
> Limit the number of iterations of the unstrip loop to the total number of items
> to unstrip

This still left quite a bit of scope for algorithmic DoS. The iterative algorithm that you're hacking was an inefficient hack to start with, since it scans the entire expanded text at every recursion depth, giving a running time proportional to the depth multiplied by the size. I had a go at doing a proper job of it at https://gerrit.wikimedia.org/r/#change,5596
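The Gerrit change itself isn't reproduced here, but the rescanning problem described above can be sketched (illustratively, in Python rather than PHP, and not claiming this matches change 5596): instead of rescanning the whole expanded text once per nesting level, resolve each marker recursively as it is encountered, with a depth cap as a cycle guard, so each stored item is scanned once per occurrence.

```python
import re

# Hypothetical marker shape, modelled on "\x7fUNIQ<hex>-<tag>-<seq>-QINU\x7f".
MARKER_RE = re.compile(r"\x7fUNIQ[0-9a-f]+-\w+-[0-9A-F]+-QINU\x7f")

def unstrip(text, state, depth=0):
    if depth > 16:  # hard depth cap guards against forged marker cycles
        return text
    return MARKER_RE.sub(
        # Known markers recurse into their stored value; unknown (forged)
        # markers are left untouched instead of looping.
        lambda m: unstrip(state[m.group(0)], state, depth + 1)
        if m.group(0) in state else m.group(0),
        text,
    )

m1 = "\x7fUNIQ1234abcd-nowiki-00000001-QINU\x7f"
m2 = "\x7fUNIQ1234abcd-ref-00000002-QINU\x7f"
print(unstrip("x " + m2 + " y", {m1: "inner", m2: "note " + m1}))  # x note inner y
```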
