02-15-2003, 11:51 AM  
hel0
Registered User
 
Join Date: Feb 2003
Posts: 14
Hi,


First, stock Apache doesn't include a way to control runaway child processes. You could do it with mod_perl, but that's a gigantic pain in the ass.

Second, it could certainly be mod_gzip's fault here. Chances are the crawler doesn't support gzip encoding, or doesn't even understand the negotiation, so you could be stuck in a content negotiation loop with it. Give me a file. Do you support gzip? What? Give me a file. Do you support gzip? What? Give me a file. Do you support gzip? What? It's possible that because a request never actually completes, it never increments the request count in your logs.
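
If you want to test that theory, compare the response headers with and without the client advertising gzip. A quick check along these lines (www.example.com and the page name are just stand-ins for your server):

Code:
# Plain request, no gzip advertised; dump headers, throw away the body.
curl -s -D - -o /dev/null http://www.example.com/index.html
# Same request, this time claiming gzip support like a normal browser would.
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" http://www.example.com/index.html

If mod_gzip is behaving, the second response should carry a Content-Encoding: gzip header and the first should not.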

Strace the process the next time it happens, and you'll have the answer.
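
Something like this, assuming the stuck child's PID is 12345 (get the real one from ps or top):

Code:
# -p attaches to the running process, -f follows any forks,
# -o writes the trace to a file you can dig through afterwards.
strace -f -p 12345 -o /tmp/httpd-trace.txt

If it's looping, you'll see the same handful of syscalls repeating over and over in the trace file.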

If you have mod_perl, you can download Apache::GTopLimit or Apache::Watchdog::RunAway. Be warned, they are a major pain to install. You could also search Google for shell scripts that check for huge processes and kill them (something like the sketch below). Just run it from cron and you're good to go.
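
Here's a rough sketch of that kind of script. The 100 MB limit and the httpd process name are assumptions, so adjust both for your box:

Code:
#!/bin/sh
# Kill any httpd process whose resident size is over LIMIT kilobytes.
LIMIT=102400
ps -eo pid,rss,comm | awk -v limit=$LIMIT \
    '$3 == "httpd" && $2 > limit { print $1 }' | xargs -r kill

# Hypothetical crontab entry to run it every five minutes:
# */5 * * * * /usr/local/sbin/kill-fat-httpd.sh

The -r flag on xargs makes sure kill isn't run at all when nothing matches.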


Hel0
ICQ: 348407599