The only problem I remember around that time was some directory-splitting errors in Google's algo that caused the same thing to happen to a few of mine. Basically it came down to building sites with relative links instead of absolute links: if you had "vanity domains" like the .net and .org mirroring the main site, Google's algo would split the pages between the domains, which of course resulted in dupe penalties and a loss of backlinks to the main site. They've cleaned up that part of the algo a little to try to prevent it, but until they finish testing the scraper-prevention code they're working into the algo now, I won't know the full effect of the other fix.

Hope that helps - if not, give me a yell on ICQ 1276570340 - it always shows me as offline, but I hide back here.
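For anyone wondering why relative links are the culprit here, a quick sketch (the example.com / example.net domains are made up for illustration) using Python's `urllib.parse.urljoin` to show how the same link resolves on the main site versus a mirror:

```python
from urllib.parse import urljoin

# A relative link resolves against whichever domain served the page,
# so each mirrored vanity domain ends up with its own copy of every
# internal URL - that's what lets the algo split pages between them.
print(urljoin("https://example.com/blog/", "about.html"))
print(urljoin("https://example.net/blog/", "about.html"))

# An absolute link always points back at the main site, no matter
# which mirror the page was crawled on - no split, no dupes.
print(urljoin("https://example.net/blog/", "https://example.com/about.html"))
```

Same relative href, two different URLs depending on the domain it's crawled on - which is exactly how the dupe pages and split backlinks happen.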
