Last modified: 2014-04-24 18:14:14 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and except for displaying bug reports and their history, links may be broken. See T66053, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 64053 - Add an Amazon AWS domain to wgCopyUploadsDomains whitelist
Status: RESOLVED DUPLICATE of bug 64372
Product: Wikimedia
Classification: Unclassified
Component: Site requests (Other open bugs)
Version: wmf-deployment
Hardware: All
OS: All
Importance: Normal enhancement (vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Keywords: shell
Depends on:
Blocks:
  Show dependency treegraph
 
Reported: 2014-04-17 13:27 UTC by johanmmuller
Modified: 2014-04-24 18:14 UTC (History)
7 users (show)

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description johanmmuller 2014-04-17 13:27:26 UTC
Please add the following domain to the wgCopyUploadsDomains whitelist: http://aws.amazon.com/s3/

I would like to use this cloud storage service as a source for images to upload to Wikimedia Commons using the GlamWikiToolset, which requires a URL as source. I am a Wikipedian in Residence, presently at the Peace Palace Library in The Hague, tasked with uploading 4500 images to Wikimedia Commons. The IT staff there suggested the Amazon site.

Thanks!
Comment 1 Marius Hoch 2014-04-19 15:43:37 UTC
I don't think we really want to whitelist amazon's s3 as everyone can upload content to it and there are many non-free images on it (probably).
Comment 2 jeremyb 2014-04-19 17:11:26 UTC
(In reply to Marius Hoch from comment #1)
> I don't think we really want to whitelist amazon's s3 as everyone can upload
> content to it and there are many non-free images on it (probably).

I had similar concerns. In any case, I think that's the wrong endpoint for s3?

Please have your staff set up a CNAME from a more specific domain to a specific s3 bucket. Or else we can patch to allow filtering on path prefix in addition to domain name. But we'd still need to limit to a single bucket.

(The current examples at http://mediawiki.org/wiki/Manual:$wgCopyUploadsDomains don't filter on path)
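To illustrate the difference between the two requests: $wgCopyUploadsDomains is a MediaWiki setting holding a list of hostnames from which URL-based uploads are permitted. A sketch of the two options under discussion, assuming hypothetical values (neither entry reflects an actual deployed configuration, and the CNAME alias `uploads.peacepalacelibrary.nl` is invented for illustration):

```php
<?php
// LocalSettings.php sketch (hypothetical values, not a deployed config).

// Too broad: whitelisting the generic S3 endpoint would allow copy-uploads
// from any bucket, i.e. from content uploaded by anyone.
// $wgCopyUploadsDomains = [ 's3.amazonaws.com' ];

// Narrower, as suggested above: a CNAME under the institution's own domain
// pointing at one specific S3 bucket, so only that bucket is reachable.
$wgCopyUploadsDomains = [
    'uploads.peacepalacelibrary.nl', // hypothetical CNAME -> one S3 bucket
];
```

Since the setting matches on hostname only (not on URL path), the bucket-per-hostname approach is what limits uploads to a single bucket without patching MediaWiki.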
Comment 3 johanmmuller 2014-04-22 15:19:36 UTC
(In reply to jeremyb from comment #2)
> (In reply to Marius Hoch from comment #1)
> > I don't think we really want to whitelist amazon's s3 as everyone can upload
> > content to it and there are many non-free images on it (probably).
> 
> I had similar concerns. In any case, I think that's the wrong endpoint for s3?
> 
> Please have your staff set up a CNAME from a more specific domain to a
> specific s3 bucket. Or else we can patch to allow filtering on path prefix
> in addition to domain name. But we'd still need to limit to a single bucket.
> 
> (The current examples at
> http://mediawiki.org/wiki/Manual:$wgCopyUploadsDomains don't filter on path)

Would this suffice: wikipedian-in-residence.s3-website-eu-west-1.amazonaws.com

Test link:

http://wikipedian-in-residence.s3-website-eu-west-1.amazonaws.com/Wikipedian-in-Residence%20Sessie%202014-01-16/Tydsgn%20213%20051504510%20_MG_8631%20kleurenkaart.tif 

Kind regards, hans muller
Comment 4 johanmmuller 2014-04-23 08:26:46 UTC
(In reply to jeremyb from comment #2)
> (In reply to Marius Hoch from comment #1)
> > I don't think we really want to whitelist amazon's s3 as everyone can upload
> > content to it and there are many non-free images on it (probably).
> 
> I had similar concerns. In any case, I think that's the wrong endpoint for s3?
> 
> Please have your staff set up a CNAME from a more specific domain to a
> specific s3 bucket. Or else we can patch to allow filtering on path prefix
> in addition to domain name. But we'd still need to limit to a single bucket.
> 
> (The current examples at
> http://mediawiki.org/wiki/Manual:$wgCopyUploadsDomains don't filter on path)

Dear bureaucrats, 

Something went wrong with my last post, so it appears to have been answered already (which is not the case).

Would the following domain suffice:
wikipedian-in-residence.s3-website-eu-west-1.amazonaws.com ?

Test link:

http://wikipedian-in-residence.s3-website-eu-west-1.amazonaws.com/Wikipedian-in-Residence%20Sessie%202014-01-16/Tydsgn%20213%20051504510%20_MG_8631%20kleurenkaart.tif 

Hope to hear from you, thank you, kind regards, 

hans muller, 
wikipedian in residence Peace Palace Library
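If the bucket-specific website endpoint proposed above were accepted, the corresponding whitelist entry would presumably take this shape (a sketch only; this was never deployed, as the request was superseded by bug 64372):

```php
<?php
// Sketch: whitelist only the single bucket's S3 website endpoint,
// rather than all of amazonaws.com.
$wgCopyUploadsDomains = [
    'wikipedian-in-residence.s3-website-eu-west-1.amazonaws.com',
];
```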
Comment 5 johanmmuller 2014-04-24 16:39:00 UTC
Today I requested whitelisting of my institution's domain, peacepalacelibrary.nl, in bug report 64372 (https://bugzilla.wikimedia.org/show_bug.cgi?id=64372).

Approval of that request would make the present request 64053 obsolete.
Kind regards, hans muller
Comment 6 Tomasz W. Kozlowski 2014-04-24 18:13:55 UTC

*** This bug has been marked as a duplicate of bug 64372 ***
Comment 7 Tomasz W. Kozlowski 2014-04-24 18:14:14 UTC
Yeah, let's do it as 64372 then.
