Discuss what's fucking going on, and which programs are best and worst. One-time "program" announcements from "established" webmasters are allowed.
#1
Confirmed User
Join Date: Oct 2001
Location: Where the sun don't shine
Posts: 1,185

MySQL queries, traffic, and load

If I have a page that connects to a MySQL db and prints the results every time the page is accessed, instead of running a crontab job that updates the page twice a day so it becomes static HTML, how much traffic could the page handle before the MySQL/PHP printing becomes slow?
#2
Confirmed User
Join Date: Feb 2002
Location: Seattle
Posts: 1,070

Impossible to say exactly without knowing what queries you're doing and what the dataset looks like, but think of it this way: sending a static page costs practically 0 CPU. Processing and sending a dynamic page costs X CPU. X * hits-per-second * 120 seconds is going to be a lot more than 0.

However, if you have a small site (20 hits per second or whatever) you might be OK. The biggest thing to watch out for there is concurrent queries. If you have a page that takes 1/5th of a CPU second to process and it's being accessed 20 times per second, you're obviously going to have issues. Realistically, any time you can change a dynamic page to a static page that is updated periodically, do it. Any CPU you save now you can use later, and it's easier to make changes now than later when your site is busier.

btw: most programming languages come with methods to determine CPU usage (in seconds). In C and PHP check out getrusage(), in perl check out times(), in ASP check your head, you may have sprung a leak.
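A quick back-of-the-envelope version of the math above, as a shell sketch. The 1/5th-CPU-second and 20-hits-per-second figures come straight from this post; they're illustrative, not measurements:

```shell
#!/bin/sh
# CPU-seconds of work generated per wall-clock second at the quoted load.
cpu_ms_per_hit=200   # a page costing 1/5th of a CPU second, in milliseconds
hits_per_sec=20
echo $((cpu_ms_per_hit * hits_per_sec / 1000))
```

Anything over 1 means a single CPU can't keep up, which is exactly why the static version wins.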
#3
Registered User
Join Date: May 2002
Location: AU
Posts: 96

bah, ASP rocks your world. vbscript is the shit.
__________________
my weeds have sprouted
#4
Confirmed User
Join Date: May 2002
Posts: 264

Anybody know where to find good free ASP scripts? Thanx!
#5
♥♥♥ Likes Hugs ♥♥♥
Join Date: Nov 2001
Location: /home
Posts: 15,841

Quote:
#6
Confirmed User
Join Date: Oct 2001
Location: Where the sun don't shine
Posts: 1,185

Quote:
#7
Confirmed User
Join Date: Feb 2002
Location: Toronto, ON
Posts: 962

Database shit also gets dramatically slower with write queries, since a write locks out readers while it runs. So if your table is written to on a frequent basis, don't try to use it for high-volume reads at the same time.
#8
Beer Money Baron
Join Date: Jan 2001
Location: brujah / gmail
Posts: 22,157

Quote:
#9
Confirmed User
Join Date: Apr 2002
Posts: 3,387

Anyone know where I can find an overview of using crontab? I currently have a TGP which calls a PHP database on every hit and want to optimise! =)

btw, I'll shoot anyone who says google
#10
Confirmed User
Join Date: Oct 2001
Location: Where the sun don't shine
Posts: 1,185

Quote:
#11
Confirmed User
Join Date: Feb 2002
Location: Seattle
Posts: 1,070

Note that lynx is actually pretty crappy for use in cron. It's one of the worst text browsers, sadly, because it's just so bloated (if it had some cool features included with the bloat, that'd be one thing... but I'm straying from topic). In cron it's worse, because it gets confused and sometimes doesn't even request the page you asked for.

If you're on FreeBSD, check out its built-in 'fetch' command. No fuss, no muss: 'fetch -q -o /dev/null url'. On Linux, perhaps 'wget' would work, though it's probably overkill.
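Either command drops straight into a crontab entry. Here's a sketch that rebuilds a page twice a day; the path and URL are placeholders, not anything from the thread:

```shell
# crontab -e entry: minute hour day-of-month month day-of-week command
# Rebuild the static page at 06:00 and 18:00 via wget (use fetch on FreeBSD).
0 6,18 * * * /usr/bin/wget -q -O /home/you/www/index.html http://example.com/index.php
```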
#12
Beer Money Baron
Join Date: Jan 2001
Location: brujah / gmail
Posts: 22,157

I think he wants a static page generated.

If you want cron info, try: http://www.ualberta.ca/CNS/HELP/unix/crontab.1.html

If you just needed a daily index.html page generated, you could create a shell script like dailyfetch.sh, chmod 755 dailyfetch.sh, and move it into your /etc/cron.daily directory if you have one. Maybe even /etc/cron.hourly or /etc/cron.half-hourly, etc.

-- dailyfetch.sh --
Code:
#!/bin/sh
/usr/bin/wget -O saved.html http://urltosave.com/urltosave.php
#13
Confirmed User
Join Date: Feb 2002
Location: Seattle
Posts: 1,070

Yeah, that's good. Just one last thing I'd suggest (I promise, I'm not trying to spam the thread):

Code:
wget -O saved.html.new http://url.com/

then something like:

Code:
if [ -s saved.html.new ]
then
    mv saved.html.new saved.html
fi

immediately afterwards. This isn't perfect, but it'll do two things:

1) It'll write to a second file instead of the primary one. This is important because if someone hits your page while the wget process is running, they'd potentially get an incomplete page, or worse, a blank page.

2) The -s part will check if the page is more than zero bytes before moving it into place. If it's not, it'll just ignore it. You could probably have it echo an error if it isn't, if you wanted. Last thing you'd want is an error popping up at 3AM while you're in bed, causing your pages to come up blank for who knows how many hours.
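Putting both halves of that advice together, a full cron script looks roughly like this. To keep the sketch runnable offline, the wget line is replaced by a stand-in echo, and the filenames are placeholders:

```shell
#!/bin/sh
# Stand-in for: wget -q -O saved.html.new http://url.com/
echo '<html>fresh page</html>' > saved.html.new

# Only swap the new copy in if it is non-empty (-s). mv within one
# filesystem is a rename, so visitors never see a half-written file.
if [ -s saved.html.new ]
then
    mv saved.html.new saved.html
fi
cat saved.html
```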
#14
Macdaddy coder
Join Date: Feb 2002
Location: MacDaddy pimp coder
Posts: 2,806

Quote:
__________________
MacDaddy Coder.
#15
Confirmed User
Join Date: May 2002
Location: oregon.
Posts: 2,243

Quote:
__________________
php/mysql guru. hosting, coding, all that jazz.
#16
Confirmed User
Join Date: May 2002
Location: oregon.
Posts: 2,243

Actually you could use lynx -dump $url > file for the crontab idea...
__________________
php/mysql guru. hosting, coding, all that jazz.
#17
Registered User
Join Date: May 2002
Location: AU
Posts: 96

Quote:
__________________
my weeds have sprouted
#18
Confirmed User
Join Date: Apr 2002
Location: Mauritius
Posts: 1,118

haha