Quote:
Originally posted by Matthew
okay,
I'm the guy behind tmanager, and I'll try to explain what I think about it.
First, if done right, a 2nd Apache for thumbs helps a lot. Also, a little FreeBSD kernel tuning never hurt anyone.
Why? A correctly compiled and configured 2nd Apache will consume LESS memory, which is what this server needs!
But this won't solve the whole problem. The real problem is that the Perl script consumes a lot of memory while checking links. I think there is a memory leak somewhere in it, because it shouldn't take 100 MB of RAM to check links.
And of course it's good to get a faster server anyway; then you have room to grow.
Also, I'm surprised that people still suggest "just use a C++ script". You should check how much time launching a process via CGI actually takes, really. But don't start another flame war about PHP and C; that's not the point I'm trying to make.
The point is that in this particular case the load occurs because:
1) the server runs this Perl script, which consumes memory.
2) the server goes into swap, starting a chain reaction: more and more processes have to wait for memory/CPU time to become available.
3) over time, the server works through those requests and becomes responsive again, just until the next cron job.
Hmmm... thinking about what you said... Perl scripts don't manage memory directly like C or C++, so memory leaks are kind of a moot point, unless you simply mean it doesn't clear a buffer after using certain data, or doesn't close a file it has opened and stored in a handle.
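For what it's worth, the kind of "leak" you usually see in a Perl script isn't a leak in the C sense, it's data that simply never goes out of scope. A rough sketch of what I mean (the file name and the slurp are assumptions on my part, not taken from tmanager):

    use strict;
    use warnings;

    # Slurping the whole link list keeps every line in RAM at once
    # (links.txt is a made-up name for this example):
    open(my $fh, '<', 'links.txt') or die "links.txt: $!";
    my @all_links = <$fh>;    # entire file now lives in memory
    close($fh);               # the handle is closed, but @all_links
                              # still holds everything

    # Emptying the array once you are done hands the memory back to
    # perl's allocator (though not necessarily back to the OS):
    @all_links = ();

Closing the handle frees its read buffer, but the big array keeps its memory until it is emptied or falls out of scope, and even then perl tends to hold on to it for reuse rather than hand it straight back to the OS.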
If the script worked on smaller chunks of data (used a smaller buffer, or processed 2000 links at a time) it might go easier on system resources, but it would take significantly more time to run.
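Something along these lines is what I mean by working in chunks. Just a sketch: the file name, the 2000 batch size and the check_link() stub are placeholders, not the real tmanager code:

    use strict;
    use warnings;

    my $batch_size = 2000;    # hold at most 2000 links in memory at a time

    # Placeholder for the real link check (HTTP request, DB update, etc.)
    sub check_link {
        my ($url) = @_;
        print "would check $url\n";
    }

    open(my $fh, '<', 'links.txt') or die "links.txt: $!";
    my @batch;
    while (my $link = <$fh>) {
        chomp $link;
        push @batch, $link;
        if (@batch >= $batch_size) {
            check_link($_) for @batch;   # work through this chunk
            @batch = ();                 # release it before reading more
        }
    }
    check_link($_) for @batch;           # whatever is left in the final partial chunk
    close($fh);

The peak footprint stays around one batch instead of the whole list, which is exactly the memory-for-time trade described above.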