
Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and except for bug reports and their history, links may be broken. See T61118, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 59118 - "500 Internal Server Error" on all non-HTML pages
Status: RESOLVED FIXED
Product: Wikimedia Labs
Classification: Unclassified
Component: tools
Version: unspecified
Hardware: All
OS: All
Priority: High
Severity: blocker
Assigned To: Marc A. Pelletier
Reported: 2013-12-30 12:06 UTC by DrTrigon
Modified: 2014-02-10 11:07 UTC
CC: 4 users


Description DrTrigon 2013-12-30 12:06:57 UTC
> Internal Server Error
>
> The server encountered an internal error or misconfiguration and was unable to 
> complete your request.
> 
> Please contact the server administrator, mpelletier@wikimedia.org and inform 
> them of the time the error occurred, and anything you might have done that may 
> have caused the error.
> 
> More information about this error may be available in the server error log.
> 
> Additionally, a 500 Internal Server Error error was encountered while trying 
> to use an ErrorDocument to handle the request.

Very simple/basic cgi script: http://tools.wmflabs.org/saper/cgi-bin/simple
Essentially ALL cgi-bin scripts do not work anymore, see e.g.:
http://tools.wmflabs.org/drtrigonbot/cgi-bin/sum_cat_disc.py
http://tools.wmflabs.org/drtrigonbot/cgi-bin/panel.py
etc.

Only plain html works: http://tools.wmflabs.org/saper/test.html
Comment 1 Marcin Cieślak 2013-12-30 12:13:10 UTC
I checked this with a newly created "tool" from scratch, and my simplest CGIs do not work: http://tools.wmflabs.org/saper/cgi-bin/env (simplest CGI in /bin/sh) gives 500

Same for the PHP script stored in "public_html":

http://tools.wmflabs.org/saper/test.php
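
For reference (the contents of the "env" script itself are not preserved in this report), the simplest possible CGI in /bin/sh is a sketch along these lines:

| #!/bin/sh
| # A CGI response must start with a header line followed by a blank line.
| echo "Content-Type: text/plain"
| echo
| # Dump the CGI environment variables as the response body.
| env

If even a script this trivial returns 500, the failure is in the web server setup rather than in any individual tool's code.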
Comment 2 DrTrigon 2013-12-30 12:23:08 UTC
Is this related to the http://icinga.wmflabs.org/icinga/ outage?
Comment 3 Tim Landscheidt 2013-12-30 12:30:25 UTC
(In reply to comment #2)
> Is this related to the http://icinga.wmflabs.org/icinga/ outage?

No.  Icinga is hosted in the Labs project "Nagios" and isn't a dependency for Tools.
Comment 4 Tim Landscheidt 2013-12-30 12:31:47 UTC
Lots of requests from Baiduspider/2.0 on tools-webproxy.
Comment 5 Tim Landscheidt 2013-12-30 12:35:20 UTC
Set up robots.txt as a temporary measure to:

| User-agent: *
| Disallow: /

and will reinstate the spiders block in tools-webproxy's /etc/apache2/sites-available/webproxy in a jiffy.
Comment 6 Tim Landscheidt 2013-12-30 12:47:35 UTC
tools-webserver-01 is (very much :-)) out of memory.

| scfc@tools-webserver-01:~$ sudo -i
| -bash: fork: Cannot allocate memory
| -bash: fork: Cannot allocate memory
| scfc@tools-webserver-01:~$

Rebooting.
Comment 7 Tim Landscheidt 2013-12-30 13:00:04 UTC
And it's out again:

| [Mon Dec 30 12:55:41 2013] [error] [client 10.4.1.89] (12)Cannot allocate memory: couldn't create child process: /usr/lib/suphp/suphp for /data/project/ipp/public_html/npp_extern.php

Unfortunately, the bot hitting that page doesn't send a User-Agent, and before I exclude the whole world, I gotta read up on mod_rewrite (and then later re-start ipp as NewWeb).  Moment, please.
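
As a sketch of the kind of mod_rewrite rule being considered here (the rule actually deployed is not recorded in this report, and the /ipp/ URL path is only assumed from the file path above), clients that send no User-Agent could be turned away with:

| RewriteEngine On
| # An empty User-Agent header matches ^$; answer such requests with 403 Forbidden.
| RewriteCond %{HTTP_USER_AGENT} ^$
| RewriteRule ^/ipp/ - [F,L]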
Comment 8 Marc A. Pelletier 2013-12-30 13:05:29 UTC
Yeah, the OOM was a consequence of Baidu insanely spidering some of the tools that have links to themselves with expensive parameters.  Recursion for the loss.
Comment 9 Marc A. Pelletier 2013-12-30 13:15:39 UTC
I've blocked the spider at the network level.  With a bit of luck, things should settle back down.
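
A block "at the network level" usually means an iptables rule of roughly this shape; the address range actually blocked is not recorded in this report, so the one below is only a placeholder:

| # Drop all packets from the crawler's source range (placeholder range).
| sudo iptables -A INPUT -s 192.0.2.0/24 -j DROP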
Comment 10 Tim Landscheidt 2013-12-30 13:16:37 UTC
Executed as local-ipp "webservice start", removed the rewrite on tools-webproxy, and tools-webserver-01 is down again.  *Argl*.
Comment 11 Marc A. Pelletier 2013-12-30 13:19:15 UTC
Oh, even more fun.  We have Nigma.ru also crawling and disobeying robots.txt.
Comment 12 Tim Landscheidt 2013-12-30 13:25:36 UTC
(In reply to comment #11)
> Oh, even more fun.  We have Nigma.ru also crawling and disobeying robots.txt.

I think my rewrite rule may have prevented the named spiders from accessing robots.txt :-) (fixed now).  Oh, well, I should get my IRC client going again for some synchronous communication.

There seems to be a wiki at bookmarkmanagerv2?
Comment 13 Marc A. Pelletier 2013-12-30 13:27:39 UTC
Indeed.  Disabled (it had open registration and was infested by spambots).
Comment 14 Tim Landscheidt 2013-12-30 13:43:05 UTC
JFTR: Notified APPER about restart as NewWeb at [[de:Benutzer Diskussion:APPER#IP-Patrol auf Tools]].
Comment 15 DrTrigon 2013-12-30 15:12:03 UTC
Looks like the scripts work now. Thanks! But I have issues connecting to the DB:

<class '_mysql_exceptions.OperationalError'>: (2003, "Can't connect to MySQL server on 'dewiki.labsdb' (110)")
Comment 16 Tim Landscheidt 2013-12-30 15:38:04 UTC
(In reply to comment #15)
> Looks like the scripts work now. Thanks! But I have issues connecting to the
> DB:

> <class '_mysql_exceptions.OperationalError'>: (2003, "Can't connect to MySQL
> server on 'dewiki.labsdb' (110)")

That must be related to my reboot of tools-webserver-01.

Coren, where are the port forwards loaded?  I see identical "/etc/iptables.conf"s; on both -01 and -03, "sudo iptables -L" gives identical, yet "empty", output; however, -03 works, while -01 doesn't.
Comment 17 Marc A. Pelletier 2013-12-30 15:40:47 UTC
That's because -L shows the /filter/ table by default, not the nat table where those rules live.
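
To see the port forwards, the table has to be named explicitly, for example:

| # -L alone lists the filter table; the DNAT port forwards live in the nat table.
| sudo iptables -t nat -L -n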
Comment 18 Tim Landscheidt 2013-12-30 15:55:46 UTC
(In reply to comment #17)
> That's because -L shows the /filter/ table by default, not the nat table where
> those rules live.

D'oh!  DrTrigon, working for you now?
Comment 19 DrTrigon 2013-12-30 16:34:02 UTC
Yupp! Up and running again! Perfect, thanks to everybody involved!!! Greetings
Comment 20 Nemo 2014-02-10 11:07:56 UTC
(In reply to comment #5)
> Set up robots.txt as a temporary measure to:
> 
> | User-agent: *
> | Disallow: /

Temporary? Bug 61132.
