Old 03-19-2011, 10:07 PM  
Ron Bennett
Confirmed User
 
Join Date: Oct 2003
Posts: 1,653
DB reads, if the indexes and layout are decent, should be no problem - it's presumably the insertions that are killing your server's performance. Batch processing is likely the better way to do what you're seeking ...

More specifically, log the information to an append-only file with one entry per line (akin to a log file).
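
A minimal sketch of that logging step, assuming Python and a hypothetical /var/log/clicks.log path (the real script would use whatever language and location your site already uses):

import time

LOG_PATH = "/var/log/clicks.log"  # hypothetical location

def log_click(image_id, ip=None):
    # One tab-separated record per click; no database work on the request path.
    line = f"{int(time.time())}\t{image_id}\t{ip or '-'}\n"
    # Append mode keeps each click cheap compared to a per-click DB insert.
    with open(LOG_PATH, "a") as f:
        f.write(line)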

Then every hour or whatever (imho, just once a day would likely be sufficient for what you're using that data for), run a cron job that processes that file into another file containing the aggregate data for each image clicked. Then add those click totals / additional data (i.e. if you're also tracking IPs and/or countries, etc.) to the respective image totals / data already in the database - a rough sketch of that step follows below.
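
A rough sketch of the cron side, again in Python, with a hypothetical image_stats table (image_id, clicks) and sqlite3 standing in for whatever database you actually run:

import os
import sqlite3
from collections import Counter

LOG_PATH = "/var/log/clicks.log"    # same hypothetical path as the logging sketch
WORK_PATH = "/var/log/clicks.work"
DB_PATH = "/var/data/stats.db"      # hypothetical; swap in your real DB connection

def aggregate_clicks():
    if not os.path.exists(LOG_PATH):
        return
    # Rename first so new clicks keep appending to a fresh log while we work.
    os.rename(LOG_PATH, WORK_PATH)

    totals = Counter()
    with open(WORK_PATH) as f:
        for line in f:
            ts, image_id, ip = line.rstrip("\n").split("\t")
            totals[image_id] += 1

    # One transaction with a few batched updates instead of one insert per click.
    conn = sqlite3.connect(DB_PATH)
    with conn:
        conn.executemany(
            "UPDATE image_stats SET clicks = clicks + ? WHERE image_id = ?",
            [(count, image_id) for image_id, count in totals.items()],
        )
    conn.close()
    os.remove(WORK_PATH)

if __name__ == "__main__":
    aggregate_clicks()

Tallying per-IP or per-country counts is the same idea with a richer key; the point is the database only sees a handful of writes per cron run instead of one insert per click.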

Ron
__________________
Domagon - Website Management and Domain Name Sales