Quote:
Originally Posted by BigFurry
I have no stake in this fight, but oldgaga I don't think you understand what cloaking really means.
Real cloaking means serving a different page to the Search Engine and Visitors. If the /out/ links on TPD behave the same way for Googlebots as for normal visitors, it's not cloaking. Have you tested TPD with a Googlebot user agent?
There is nothing wrong with running your external links through an internal tracker. Facebook and Twitter do it too.
Hello BigFurry, thanks for your reply.

I understand perfectly that cloaking means serving a different page to the search engine than to visitors.
VISITORS: when a visitor clicks this kind of "hidden" link, he is first redirected to an internal page and finally arrives at an external site (the external site in this example hosts malware).
GOOGLEBOT: when Googlebot checks the same link, it detects it as an internal link. For Google there is no trace of the malware site where the visitor was sent by the redirect.
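By the way, to answer your question about testing with a Googlebot user agent: anyone can check this behavior by fetching the same /out/ link with a normal browser User-Agent and with Googlebot's, and comparing the responses. Here is a small Python sketch of such a test (the URL is just a placeholder, not a real link):

Code:
import requests

URL = "https://example.com/out/some-site"  # placeholder /out/ link, not a real one

USER_AGENTS = {
    "Browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    r = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=False)
    # A 30x status with a Location header means a redirect to an external
    # site; a plain 200 means an "internal" page was served instead.
    print(name, r.status_code, r.headers.get("Location"))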
Conclusion: as I showed above, Googlebot can't detect these "bad" external links because they are disguised as internal ones. In short, when Google analyzes such a link, it sees theporndude.com linking to a subpage of theporndude.com, not to the real external source that is hosting the malware. Theporndude applies this technique to a large number of sites, so he is serving Google an optimized, different version of his website. Cloaking!
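To make it concrete, here is a minimal sketch of how this kind of user-agent cloaking on an /out/ redirect could be implemented server-side. To be clear: the route name, the slug parameter, and the target table below are my own hypothetical examples for illustration, not the actual code of theporndude or any other site:

Code:
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical mapping of internal slugs to external destinations.
EXTERNAL_TARGETS = {
    "some-site": "https://external-site.example/",
}

@app.route("/out/<slug>")
def out(slug):
    ua = request.headers.get("User-Agent", "").lower()
    if "googlebot" in ua:
        # The crawler gets an ordinary internal page, so the external URL
        # never appears in what Google records for this link.
        return "<html><body><p>Internal page</p></body></html>"
    # A normal visitor gets a 302 redirect to the external site.
    return redirect(EXTERNAL_TARGETS.get(slug, "/"), code=302)

With code like this, Google would only ever see an internal page at /out/some-site, while every real visitor ends up on the external site.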
About Facebook and Twitter: they don't use script links like theporndude does, and they also show a message asking visitors to confirm/click before being redirected to an external site.