MySQL queries, traffic and load
If I have a page that connects to a MySQL db and prints results to the page every time the page is accessed, instead of running a crontab job that updates the page twice a day so the page becomes static HTML - how much traffic could this page handle before the MySQL/PHP printing becomes slow?
|
impossible to say exactly without knowing what queries you're doing and what the dataset looks like, but think of it this way: sending a static page costs practically 0 cpu. processing and sending a dynamic page costs X cpu. X * hits-per-second * 120-seconds is going to be a lot more than 0. :)
however, if you have a small site (20 hits per second or whatever) you might be OK. the biggest thing to watch out for there is concurrent queries. if you have a page that takes 1/5th of a CPU second to process, and it's being accessed 20 times per second, you're obviously going to have issues (that's 0.2 x 20 = 4 CPU-seconds of work every second - more than a single CPU can deliver).

realistically, any time you can change a dynamic page to a static page that is updated periodically, do it - any CPU you save now you can use later, and it's easier to make changes now than later when your site is busier.

btw: most programming languages come with methods to determine CPU usage (in seconds). in C and PHP check out getrusage(), in perl check out times(), in ASP check your head, you may have sprung a leak. :1orglaugh |
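to make the getrusage() tip concrete, here's a rough php sketch of measuring the cpu time a page burns while it's being built (the query/page-building part in the middle is just a placeholder):

<?php
// snapshot resource usage before building the page
$before = getrusage();

// ... run your mysql queries and print the page here ...

// snapshot again once the page is done
$after = getrusage();

// user and system cpu time consumed, in seconds
$user = ($after['ru_utime.tv_sec'] - $before['ru_utime.tv_sec'])
      + ($after['ru_utime.tv_usec'] - $before['ru_utime.tv_usec']) / 1000000;
$sys  = ($after['ru_stime.tv_sec'] - $before['ru_stime.tv_sec'])
      + ($after['ru_stime.tv_usec'] - $before['ru_stime.tv_usec']) / 1000000;

echo "cpu used: {$user}s user, {$sys}s system";
?>

multiply that per-hit number by your hits per second and you'll know how many cpu-seconds per second the page really costs.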
bah, ASP rocks your world. vbscript is the shit.
|
Anybody know where to find good free ASP scripts??? Thanx!
|
Database shit also gets dramatically slower with write queries. So if your table is written to on a frequent basis, don't try to use it for high-volume reads at the same time.
|
anyone know where i can find an overview of using crontab? i currently have a tgp which queries a mysql database through php on every hit and want to optimise! =)
btw, i'll shoot anyone who says google :BangBang: :BangBang: |
note that lynx is actually pretty crappy for use in cron. it's one of the worst text browsers, sadly, because it's just so bloated (if it had some cool features included with the bloat, that'd be one thing.. but i'm straying from topic). in cron, it's worse, because it gets confused and sometimes doesn't even request the page you asked for.
if you're on freebsd, check out its built-in 'fetch' command. no fuss, no muss. 'fetch -q -o /dev/null url'. on linux, perhaps 'wget' would work, though it's probably overkill :) (just not as much as lynx is.. ugh) |
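for reference, the crontab format for something like that is just five time fields followed by the command - an entry along these lines (the schedule, path and url are only placeholders; fetch lives in /usr/bin on freebsd) would hit a page twice a day:

# min  hour  day  month  weekday  command
0      0,12  *    *      *        /usr/bin/fetch -q -o /dev/null http://www.example.com/page.php

edit your entries with 'crontab -e' and list them with 'crontab -l'.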
I think he wants a static page generated.
If you want cron info, try: http://www.ualberta.ca/CNS/HELP/unix/crontab.1.html

If you just needed a daily index.html page generated, you could create a shell script like dailyfetch.sh, chmod 755 dailyfetch.sh, and move it into your /etc/cron.daily directory if you have one - maybe even /etc/cron.hourly or /etc/cron.half-hourly, etc.

-- dailyfetch.sh --

hope that turned out right. |
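a minimal sketch of what a dailyfetch.sh like that could contain (assuming wget is installed; the url and output path are just examples):

-- dailyfetch.sh --
#!/bin/sh
# grab the dynamic page and save it out as static html
# (example path and url - substitute your own)
wget -q -O /home/you/public_html/index.html http://www.example.com/index.php
-- end --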
yeah, that's good - just one last thing i'd suggest (i promise, i'm not trying to spam the thread. :) )
wget -O saved.html.new http://url.com/

then something like

if [ -s saved.html.new ]
then
    mv saved.html.new saved.html
fi

immediately afterwards. this isn't perfect, but what it'll do is two things:

1) it'll write to a second file instead of the primary one - this is important because if someone hits your page while the wget process is running, they'd potentially get an incomplete page, or worse, a blank page.. and..

2) the -s part will check if the page is more than zero bytes before moving it into place. if it's not, it'll just ignore it. you could probably have it echo an error if it isn't, if you wanted. last thing you'd want is an error popping up at 3AM while you're in bed, causing your pages to come up blank for who knows how many hours. :)

this is also one of the advantages to static pages. |
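put together with the dailyfetch.sh idea above, the whole script might look roughly like this (again, the url and paths are placeholders):

-- dailyfetch.sh --
#!/bin/sh
# fetch into a temp file so visitors never see a half-written page
wget -q -O /home/you/public_html/index.html.new http://www.example.com/index.php

# only move the new copy into place if wget actually got something back
if [ -s /home/you/public_html/index.html.new ]
then
    mv /home/you/public_html/index.html.new /home/you/public_html/index.html
fi
-- end --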
actually you could use lynx -dump $url > file for the crontab idea...
|
haha
|