GoFuckYourself.com - Adult Webmaster Forum

GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   When does a MySQL table become too large? (https://gfy.com/showthread.php?t=1014914)

Jakez 03-19-2011 10:33 AM

When does a MySQL table become too large?
 
Does a large database have more impact on page load times or on CPU load?

I have a table with about 1,000,000 entries so far. Every time an image on the page is clicked it inserts an entry, and on every page load it groups the table by 'img', sorts by the number of entries (clicks) for each img, and displays them in order on the page (about 50 thumbs).

I'm mostly concerned about how long the page takes to load. Does sorting a table this big make the page load a lot slower, or is it mostly unnoticeable?

The table (size is about 70mb):
(
`id` int(11) NOT NULL auto_increment,
`time` int(11) NOT NULL,
`img` varchar(255) NOT NULL,
`country` varchar(30) NOT NULL,
`c_ip` varchar(15) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=INNODB DEFAULT CHARSET=latin1;

woj 03-19-2011 10:52 AM

store country as CHAR(2) and the IP as an unsigned INT, will make the table way smaller... and since you are grouping by img, probably create an index on it...
:2 cents:
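A sketch of those two changes against the posted schema (the table name `clicks` is assumed, since the dump doesn't show one; any existing varchar IPs would need converting before the MODIFY):

```sql
-- Shrink the rows: 2-letter country code, IPv4 packed into 4 bytes.
ALTER TABLE `clicks`
  MODIFY `country` char(2) NOT NULL,
  MODIFY `c_ip` int unsigned NOT NULL;

-- Speed up the GROUP BY on img.
ALTER TABLE `clicks` ADD INDEX `idx_img` (`img`);

-- Write IPs with INET_ATON() and read them back with INET_NTOA():
-- INSERT INTO `clicks` (..., `c_ip`) VALUES (..., INET_ATON('1.2.3.4'));
-- SELECT INET_NTOA(`c_ip`) FROM `clicks`;
```

That takes `country` from up to 30 bytes down to 2, and `c_ip` from up to 15 bytes down to 4.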

mafia_man 03-19-2011 11:00 AM

Quote:

Originally Posted by Jakez (Post 17990603)
Does a large database have the most impact on page load times or cpu load?
[...]

Why not increment a counter and index that column? You could cache as well, I take it these clicks don't change the top 50 very often (if I understand your system correctly).
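The counter idea might look like this (all names hypothetical; `ON DUPLICATE KEY UPDATE` does the insert-or-increment in one statement):

```sql
-- One row per image instead of one row per click.
CREATE TABLE `img_clicks` (
  `img` varchar(255) NOT NULL,
  `clicks` int unsigned NOT NULL DEFAULT 0,
  PRIMARY KEY (`img`),
  KEY `idx_clicks` (`clicks`)
) ENGINE=InnoDB;

-- On each click: insert the row, or bump the counter if it already exists.
INSERT INTO `img_clicks` (`img`, `clicks`) VALUES ('foo.jpg', 1)
ON DUPLICATE KEY UPDATE `clicks` = `clicks` + 1;

-- The page load becomes a cheap index-ordered read instead of a full GROUP BY:
SELECT `img` FROM `img_clicks` ORDER BY `clicks` DESC LIMIT 50;
```

And since the top 50 changes slowly, that final query's result is a good candidate for caching too.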

mvc333 03-19-2011 11:04 AM

Quote:

Originally Posted by Jakez (Post 17990603)
Does a large database have the most impact on page load times or cpu load?
[...]

I'd be more concerned about your SQL statement than the table size. SQL databases can easily handle millions of rows per table, but what are your indexes like versus your queries? Care to post the SQL and index info?

borked 03-19-2011 11:22 AM

Quote:

Originally Posted by Jakez (Post 17990603)
Does a large database have the most impact on page load times or cpu load?
[...]


What woj said, but 2 main concerns -

1. if you're grouping by image, the img really should be an img_id - an integer type sized for however many images you have (e.g. MEDIUMINT UNSIGNED covers ~16 million). Far far faster to group an integer index than a varchar one, since the index entries will be fixed-width and a shit load smaller.

2. If this is the table that is getting hit with an insert every time an image is viewed, keep it InnoDB (as in your dump) rather than MyISAM; MyISAM's table-level locking would serialize those writes.
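One way to sketch that normalization (table and column names are made up; the click-log table, called `clicks` here, would gain an `img_id` column in place of the varchar):

```sql
-- Lookup table: one row per distinct image.
CREATE TABLE `images` (
  `img_id` mediumint unsigned NOT NULL AUTO_INCREMENT,
  `img` varchar(255) NOT NULL,
  PRIMARY KEY (`img_id`),
  UNIQUE KEY `uq_img` (`img`)
) ENGINE=InnoDB;

-- The per-click log then stores a 3-byte id instead of up to 255 bytes,
-- and the top-50 query groups on the integer:
SELECT i.`img`, COUNT(*) AS clicks
FROM `clicks` c
JOIN `images` i ON i.`img_id` = c.`img_id`
GROUP BY c.`img_id`, i.`img`
ORDER BY clicks DESC
LIMIT 50;
```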

fris 03-19-2011 04:39 PM

bad db design

blackmonsters 03-19-2011 04:47 PM

MySQL...



:throwup

DangerX !!! 03-19-2011 04:55 PM

Quote:

Originally Posted by blackmonsters (Post 17991276)
MySQL...



:throwup

If not MySQL then what? :)

blackmonsters 03-19-2011 07:27 PM

Quote:

Originally Posted by DangerX !!! (Post 17991284)
If not MySQL then what? :)

Knowing how to write an old school program works pretty well.

:pimp

Tempest 03-19-2011 07:48 PM

Quote:

Originally Posted by Jakez (Post 17990603)
Does a large database have the most impact on page load times or cpu load?
[...]

Simple answer... Depends on how beefy your server is... Regardless, the DB is designed poorly to do this sort of thing. Don't know why you'd want to track every single click to every image by every IP. If all you really want is to sort the images, that can be accomplished much easier.

Ron Bennett 03-19-2011 10:07 PM

DB reads, if the indexes and layout are decent, should be no problem - it's presumably the insertions that are killing your server's performance. Batch processing is likely the better way to do what you're seeking ...

More specifically, log the information to an append file with one entry per line (akin to a log file).

Then every hour or whatever (imho, just once a day would likely be sufficient for what you're using that data for), run a cron job that processes that file into aggregate data for each image clicked - then add those click totals / additional data (i.e. if also tracking IPs and/or countries, etc.) to the respective image totals / data already in the database.
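The database half of that cron job could be two statements (file path and table names invented for the example; `img_clicks` is an assumed per-image totals table keyed on `img`, and `click_log_staging` an empty scratch table):

```sql
-- 1. Bulk-load the hour's log file (one image name per line).
LOAD DATA INFILE '/var/log/site/clicks.log'
  INTO TABLE `click_log_staging` (`img`);

-- 2. Fold the batch into the running totals in one statement.
INSERT INTO `img_clicks` (`img`, `clicks`)
  SELECT `img`, COUNT(*) FROM `click_log_staging` GROUP BY `img`
ON DUPLICATE KEY UPDATE `clicks` = `clicks` + VALUES(`clicks`);

-- 3. Clear the staging table for the next run.
TRUNCATE TABLE `click_log_staging`;
```

This turns a million tiny transactions into one bulk load per interval, which is exactly where the insert cost goes away.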

Ron

Brujah 03-19-2011 10:15 PM

Only 70mb, might be a candidate for a well-managed memory table solution. You might also look at using INSERT DELAYED if you aren't.
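A rough sketch of both suggestions (names invented). One caveat: INSERT DELAYED only works with the MyISAM, MEMORY, and ARCHIVE engines, not InnoDB, and was deprecated in later MySQL versions:

```sql
-- A RAM-resident click log to absorb the write burst; a periodic job
-- must flush it to durable storage, since MEMORY data vanishes on restart.
CREATE TABLE `clicks_mem` (
  `time` int NOT NULL,
  `img` varchar(255) NOT NULL,
  `country` char(2) NOT NULL,
  `c_ip` int unsigned NOT NULL
) ENGINE=MEMORY;

-- DELAYED queues the row server-side and returns to the client immediately.
INSERT DELAYED INTO `clicks_mem` (`time`, `img`, `country`, `c_ip`)
VALUES (UNIX_TIMESTAMP(), 'foo.jpg', 'US', INET_ATON('1.2.3.4'));
```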

bl4h 03-19-2011 10:18 PM

why would you need to insert a row every click? just increment a number on one row

if you're doing logging, then log it to a database that's not used to sort the pics ....

Kiopa_Matt 03-19-2011 11:17 PM

Quote:

Originally Posted by bl4h (Post 17991616)
if you're doing logging, then log it to a database thats not used to sort the pics ....

That would be my suggestion. You're inserting a row every time an image is viewed by IP address, then displaying images based on that data every time a page is loaded? Don't do that. Separate the two processes.

cam_girls 03-19-2011 11:24 PM

If you search by indexes then the average time complexity is

O(log(tablesize))

which means a table with 1,000,000,000,000 rows would only take about twice as long to search as one with 1,000,000 rows, since log(10^12) = 2 × log(10^6)

lopez 03-20-2011 05:13 AM

normalization needed.

Due 03-20-2011 06:21 AM

Change the script so it actually builds the page ahead of time rather than making it on the fly.

Log clicks and impressions in a memory table and parse the clicks / impressions every few minutes.

The more load you remove from MySQL, the more complex sorting you'll be able to do (most popular pix per referring domain, most productive design per domain, etc.)

blackmonsters 03-20-2011 08:39 AM

:1orglaugh

mafia_man 03-20-2011 08:45 AM

Quote:

Originally Posted by lopez (Post 17991867)
normalization needed.

But that's not degrading performance, that's just wasting space.

bl4h 01-27-2013 03:14 AM

I'm going through my old posts and reading up. Sorry to necropost, but really....

A million rows is way too much just to sort 50 thumbnails. It doesn't even matter if your server can handle it; it's a complete and total waste. You create a table with all the IPs, you parse that table into a smaller table which the website will read from quickly, and you purge the big table as often as you wish. The website then reads from a simple table that has the number of unique clicks each image has. Simple.
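The parse-and-purge step being described might look like this (all names assumed: `clicks` is the raw per-IP log from the original post, `img_clicks` a small summary table keyed on `img` that the site reads):

```sql
-- Roll the raw log up into unique-click counts per image...
REPLACE INTO `img_clicks` (`img`, `clicks`)
  SELECT `img`, COUNT(DISTINCT `c_ip`)
  FROM `clicks`
  GROUP BY `img`;

-- ...then purge the raw rows on whatever schedule you like.
-- (Truncating resets the distinct-IP baseline, so purge sparingly.)
TRUNCATE TABLE `clicks`;
```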


All times are GMT -7. The time now is 07:09 PM.

Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.