Last modified: 2014-11-04 22:00:15 UTC
While batch-fixing broken Ogg videos in https://commons.wikimedia.org/wiki/Category:Wikimania_2009_presentation_videos I found that a few files were too large to do in one POST (>100M) and I had to do them via chunked upload. I was just going to do them manually since there weren't many, but found I could not:

* Special:Upload has only single-POST support
* UploadWizard barfed with a JS error (and I'm pretty sure it won't let me upload over existing files anyway)

so I ended up tweaking my CLI batch script to do chunked uploads via the API. Probably should have done that anyway, but hey.

Note that Special:Upload is used directly when clicking the 'upload a new version of this file' link on existing media -- it would be nice if it worked for large files! :)
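For reference, the chunked-upload flow such a script follows can be sketched like this. The `iter_chunks` helper and the demo sizes are illustrative (not from the script mentioned above); the API parameters named in the comments (`stash`, `offset`, `filekey`, `ignorewarnings`) are from MediaWiki's chunked-upload API:

```python
# Sketch of splitting a file into chunks for MediaWiki's chunked-upload API.
# The actual upload requests need a live wiki session and an edit token, so
# they are shown only as comments below.

def iter_chunks(data: bytes, chunk_size: int = 1 * 1024 * 1024):
    """Yield (offset, chunk) pairs covering `data` in order."""
    for offset in range(0, len(data), chunk_size):
        yield offset, data[offset:offset + chunk_size]

# The API sequence per file then looks roughly like:
#   1. First chunk:  action=upload, stash=1, filename=..., filesize=...,
#                    offset=0, chunk=<bytes>            -> returns a filekey
#   2. Later chunks: action=upload, stash=1, filekey=..., offset=...,
#                    chunk=<bytes>                      -> same filekey
#   3. Finish:       action=upload, filekey=..., filename=..., comment=...
#                    (ignorewarnings=1 to overwrite an existing file)

if __name__ == "__main__":
    fake_file = bytes(2_500_000)  # 2.5 MB of zeros standing in for a video
    chunks = list(iter_chunks(fake_file))
    print(len(chunks))                 # 3 chunks at the 1 MB default
    print([off for off, _ in chunks])  # [0, 1048576, 2097152]
```

Each POST stays well under the per-request limit regardless of total file size, which is exactly what Special:Upload's single-POST model can't do.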
I agree this would be nice. Many people are currently using a user script to get around this limitation: https://commons.wikimedia.org/wiki/User:Rillke/bigChunkedUpload.js