DB reads, if the indexes and layout are decent, should be no problem - it's presumably the insertions that are killing your server's performance. Batch processing is likely the better way to do what you're seeking ...
More specifically, log the information to an append-only file with one entry per line (akin to a log file).
Then every hour or so (imho, just once a day would likely be sufficient for what you're using that data for), run a cron job that processes that file into another file containing the aggregate data for each image clicked. Then add those click totals / additional data (ie. if also tracking IPs and/or countries, etc.) to the respective image totals / data already in the database.
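A minimal sketch of that approach, assuming each click is appended as one "image_id, ip" line and the totals live in a hypothetical image_stats table (the file names, schema, and log format here are illustrative, not prescribed):

```python
import os
import sqlite3
from collections import Counter

LOG_FILE = "clicks.log"  # hypothetical append-only log


def log_click(image_id, ip):
    # Called on each request: one appended line, no DB work at all.
    with open(LOG_FILE, "a") as f:
        f.write(f"{image_id}\t{ip}\n")


def aggregate(db):
    # The cron job: rename the log so new clicks go to a fresh file,
    # then fold the old file's per-image counts into the database in
    # one UPDATE per image instead of one INSERT per click.
    work = LOG_FILE + ".work"
    os.rename(LOG_FILE, work)
    totals = Counter()
    with open(work) as f:
        for line in f:
            image_id, _ip = line.rstrip("\n").split("\t")
            totals[image_id] += 1
    for image_id, n in totals.items():
        db.execute(
            "INSERT INTO image_stats (image_id, clicks) VALUES (?, ?) "
            "ON CONFLICT(image_id) DO UPDATE SET clicks = clicks + ?",
            (image_id, n, n),
        )
    db.commit()
    os.remove(work)
    return totals


if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.execute(
        "CREATE TABLE image_stats (image_id TEXT PRIMARY KEY, clicks INTEGER)"
    )
    for _ in range(3):
        log_click("img42", "203.0.113.1")
    log_click("img7", "198.51.100.2")
    print(aggregate(db))
```

The win is that the hot path is a single appended line per click; the expensive database writes happen once per batch, and each image costs one statement no matter how many times it was clicked.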
Ron
__________________
Domagon - Website Management and Domain Name Sales