GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   MySQL vs. Plain text Logs - Help I am Ignorant. (https://gfy.com/showthread.php?t=439030)

Dusen 03-02-2005 12:49 PM

MySQL vs. Plain text Logs - Help I am Ignorant.
 
I'm logging surfers to plain text files, and these files are running to megabytes per day. I'm writing the IP, referrer, and time using PHP's fopen/fwrite/fclose.
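Roughly speaking, each hit runs something like this right now (simplified - the path and field layout are just for the example):

$fp = fopen('/path/to/logs/hits.log', 'a');   // append mode, one tab-separated line per hit
if ($fp) {
    $line = date('Y-m-d H:i:s') . "\t"
          . $_SERVER['REMOTE_ADDR'] . "\t"
          . (isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-')
          . "\n";
    fwrite($fp, $line);
    fclose($fp);
}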

Other than the obvious benefit of searching the logs with queries instead of parsing text files, I have a few questions for you SQL gurus.

How would the same info logged to MySQL database compare in size?

How would the speed compare?

What problems could I run into that I'm not considering?

Basically, is it overall worth my while to convert the logging to SQL? I'm totally ignorant about SQL databases when it comes to high-speed inserts/queries. All I have ever used them for is applications that aren't intensive in size or speed.

Thanks Guys

colpanic 03-02-2005 01:10 PM

What kind of logs, just http logs?

MySQL is good if you want to run any sort of reports in the long term. You can do it with plain text files, but it gets slow... you can't really do good dynamic reports off of them.

Obviously, SQL has more overhead than writing to a plain text file.

VideoJ 03-02-2005 01:12 PM

The data itself will be around the same size, assuming you have the same fields. The indexes, which are what make searches in a database much quicker, will add space to your database, making it larger than the text files. How much larger will depend on which columns you index.

The speed will depend on the searches and the indexes you set up. The worst case is that searches take roughly the same amount of time. The best case is where you have an index on a column and you search on that column; then it should be much quicker.
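For example, something like this (table and column names are just an example):

// one-time setup - index the columns you expect to search on
mysql_query("CREATE TABLE hits (
    hit_time DATETIME NOT NULL,
    ip       VARCHAR(15) NOT NULL,
    referrer VARCHAR(255),
    INDEX (ip),
    INDEX (hit_time)
)");

// this search can use the ip index instead of scanning every row
$result = mysql_query("SELECT COUNT(*) FROM hits WHERE ip = '1.2.3.4'");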

Dusen 03-02-2005 01:24 PM

Thanks for the replies guys, I'll add a few things I thought of after reading your replies:

My main concern is runtime execution - not report speed. The report generation speed is really only a bonus from my perspective - currently I just parse the daily text files (I have a cron job that renames them every night at midnight) and then use explode in PHP to pull specific data out of them. But I don't mind waiting extra time to see a report if the surfers/stats are getting through faster.
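The report parsing is basically just this (simplified):

// simplified version of the nightly report parsing
$lines = file('/path/to/logs/hits-2005-03-01.log');    // one hit per line
$hits_per_ip = array();
foreach ($lines as $line) {
    list($time, $ip, $referrer) = explode("\t", rtrim($line));
    if (!isset($hits_per_ip[$ip])) $hits_per_ip[$ip] = 0;
    $hits_per_ip[$ip]++;                                // e.g. count hits per IP
}
arsort($hits_per_ip);                                   // busiest IPs first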

As far as runtime goes - I'm currently running three PHP fopens and fcloses per hit, with fwrites in between sending different stats to each file. And yes, it's basically just httpd log stuff with a few extras I have. The benefit I see there is only one mysql_connect statement with multiple table writes. When the surfer visits I would be doing NO SELECT statements, only writes.

In fact, I just realized that I actually am already opening the mysql connection for an unrelated thing each time the surfer visits anyway - so this may certainly speed things up.
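So at hit time it would just be a couple of INSERTs on the connection that's already open - something like this (rough sketch, table and variable names made up):

// rough sketch - mysql_connect() has already happened for the unrelated stuff
$ip  = mysql_real_escape_string($_SERVER['REMOTE_ADDR']);
$ref = mysql_real_escape_string(isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '');
mysql_query("INSERT INTO hits (hit_time, ip, referrer) VALUES (NOW(), '$ip', '$ref')");
// $track_var = whatever extra tracking variable I'm carrying for that surfer
mysql_query("INSERT INTO tracking (hit_time, track_var) VALUES (NOW(), '"
    . mysql_real_escape_string($track_var) . "')");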

But another thing I was thinking is that perhaps MySQL could handle several simultaneous writes better than writing to a file? Does MySQL spool requests? I don't think PHP file writes do. That seems sloppy to me.

Is there a limit to MySQL database size? I can easily see it reaching gigabytes down the road.

Thanks for entertaining these questions.

NetRodent 03-02-2005 01:32 PM

Why are you using php to write the logs instead of using the raw webserver logs? If you're looking for write efficiency, not much will beat direct logging from the web server.

Direct logging to a database is probably not a good idea.

Dusen 03-02-2005 02:28 PM

It's not just webserver logs - I have other things that I'd like to be able to tie to surfers - tracking variables, etc. Otherwise I'd just use my Apache logs for sure!

borked 03-02-2005 03:05 PM

Log anything and everything you want directly to a flat text file in append mode, then when you want to process the logs, let PHP parse them and load whatever you need into a MySQL database - never log access-type hits directly to MySQL. You'll regret it once your site starts getting decent traffic.
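i.e. a cron script along these lines (rough idea - paths and table names made up):

// nightly cron (rough idea): rotate the live log, then load it into mysql
$live = '/path/to/logs/hits.log';
$done = '/path/to/logs/hits.import';
rename($live, $done);                        // new hits keep appending to a fresh file

mysql_connect('localhost', 'user', 'pass');
mysql_select_db('stats');

$fp = fopen($done, 'r');
while (($line = fgets($fp)) !== false) {
    list($time, $ip, $ref) = explode("\t", rtrim($line));
    mysql_query("INSERT INTO hits (hit_time, ip, referrer) VALUES ('"
        . mysql_real_escape_string($time) . "', '"
        . mysql_real_escape_string($ip) . "', '"
        . mysql_real_escape_string($ref) . "')");
}
fclose($fp);
unlink($done);

(LOAD DATA INFILE does the same thing in one shot if the file format lines up with the table.)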

MickeyG 03-02-2005 03:32 PM

better make sure to read up on file locking too.

borked 03-02-2005 03:45 PM

$fp = @fopen($logfile, 'a');          // $logfile = path to your log, opened in append mode
if ($fp)
{
    @flock($fp, LOCK_EX);             // exclusive lock so concurrent hits don't interleave
    @fwrite($fp, $output);            // $output = the log line you built
    @flock($fp, LOCK_UN);             // release the lock
    @fclose($fp);
}

But that's for wimpy sites - if you've got serious traffic, the log data needs caching in RAM and writing out every second or so by a separate process. No locking needed and far superior performance.
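Rough idea of the flusher process (the ramdisk path is just an example):

// separate process (rough idea): flush the ramdisk log to the real log every second
$ram  = '/dev/shm/hits.log';       // web hits append here - memory, not disk
$disk = '/path/to/logs/hits.log';

while (true) {
    clearstatcache();
    if (file_exists($ram) && filesize($ram) > 0) {
        $tmp = $ram . '.flush';
        rename($ram, $tmp);        // new hits start a fresh ramdisk file
        $data = file_get_contents($tmp);
        $fp = fopen($disk, 'a');
        fwrite($fp, $data);
        fclose($fp);
        unlink($tmp);
    }
    sleep(1);
}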

SicChild 03-02-2005 04:05 PM

If you know what you are doing sql would be a MUCH better solution.

You can archive past data and summarize the info so filesize is much smaller.
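e.g. roll the raw rows up into a daily summary and drop the detail (table names are just an example):

// archive: roll raw hits older than today into a per-day summary, then drop the detail
mysql_query("INSERT INTO hits_daily (day, ip, hits)
             SELECT DATE(hit_time), ip, COUNT(*)
             FROM hits
             WHERE hit_time < CURDATE()
             GROUP BY DATE(hit_time), ip");
mysql_query("DELETE FROM hits WHERE hit_time < CURDATE()");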

borked 03-02-2005 04:14 PM

Quote:

Originally Posted by SicChild
If you know what you are doing sql would be a MUCH better solution.

You can archive past data and summarize the info so filesize is much smaller.

You clearly DON'T know what you're doing to offer that advice. :2 cents:

Babaganoosh 03-02-2005 04:15 PM

Quote:

Originally Posted by borked
Log anything and everything you want directly to a flat text file in append mode, then when you want to process the logs, let PHP parse them and load whatever you need into a MySQL database - never log access-type hits directly to MySQL. You'll regret it once your site starts getting decent traffic.

This is exactly right. Maybe set up a script on a cron job that will import all of the data into mysql and then use mysql to search/generate stats. I definitely wouldn't try to insert data into mysql live. That would most likely cause you some serious problems.

MickeyG 03-02-2005 04:42 PM

Quote:

Originally Posted by borked
$fp = @fopen($logfile, 'a');          // $logfile = path to your log, opened in append mode
if ($fp)
{
    @flock($fp, LOCK_EX);             // exclusive lock so concurrent hits don't interleave
    @fwrite($fp, $output);            // $output = the log line you built
    @flock($fp, LOCK_UN);             // release the lock
    @fclose($fp);
}

But that's for wimpy sites - if you've got serious traffic, the log data needs caching in RAM and writing out every second or so by a separate process. No locking needed and far superior performance.


I love ramdisks, especially for my mail queues :)

just a punk 03-02-2005 04:55 PM

Text logs are much faster than MySQL.

Babaganoosh 03-02-2005 05:03 PM

Quote:

Originally Posted by just a punk
Text logs are much faster than MySQL.

Not always true. Generate detailed stats from a 300 MB text file and then do the same with a well-indexed mysql db. Mysql will win every time.

Dusen 03-02-2005 05:41 PM

That ramdisk/sql cron idea is fantastic. Fantabulous. I wish I had thought of it myself.

jwerd 03-02-2005 05:52 PM

I tend to use SQLite a bit more than MySQL when the choice is basically between a flat file and a database. SQLite has far less overhead (within reason) and allows a little better control if you're parsing a bit of data. Of course, like most people have already said, flat files are the way to go, most definitely... but before you even consider going with MySQL, consider SQLite first :) Just my :2 cents:
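e.g. with PHP's sqlite extension, the logging side looks roughly like this (rough sketch - assumes the hits table already exists, and the whole database is just one file on disk):

// rough sketch using the sqlite extension - no server, the db is a single file
$db = sqlite_open('/path/to/logs/hits.sqlite');
if ($db) {
    sqlite_query($db, "INSERT INTO hits (hit_time, ip, referrer) VALUES ('"
        . sqlite_escape_string(date('Y-m-d H:i:s')) . "', '"
        . sqlite_escape_string($_SERVER['REMOTE_ADDR']) . "', '"
        . sqlite_escape_string(isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '') . "')");
    sqlite_close($db);
}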

NetRodent 03-02-2005 06:13 PM

Quote:

Originally Posted by Babaganoosh
Not always true. Generate detailed stats from a 300 MB text file and then do the same with a well-indexed mysql db. Mysql will win every time.

If you only need a small portion of the data in the 300 MB, a well indexed db is better. If you need to process all 300 MB, plain text is probably faster.

skillfull 03-02-2005 06:44 PM

Quote:

Originally Posted by NetRodent
If you only need a small portion of the data in the 300 MB, a well indexed db is better. If you need to process all 300 MB, plain text is probably faster.

what he said :2 cents:

