Old 02-19-2006, 12:35 PM  
fr0gman
Confirmed User
 
Join Date: Feb 2005
Posts: 2,093
My understanding of how Google views pages is that it looks for content that appears in a unique form, so if you have been updating an average of 4 times a week for 4 months you should have about 64 articles in your archive. If you replicate your original blog 5x and set up a random RSS include on each of the replicated blogs, you will have created 5 new blogs whose content does not exist anywhere else in that form. Each blog carries a unique assembly of the original content, so you have effectively created 5 new blogs, each with content that is unique to that blog.
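Just to make the "random RSS include" part concrete, here is a rough sketch (Python, plain standard library) of what each replicated blog could do: pull the home blog's RSS feed, grab a random handful of posts from the archive, and write them out as its own page. The feed URL, item count and output filename below are placeholders, not anything from an actual setup.

[code]
# Sketch of a "random RSS include": pull the home blog's feed, pick a random
# subset of archive posts, and write them out as a static page for one replica.
import random
import urllib.request
import xml.etree.ElementTree as ET

HOME_FEED = "http://homeblog.example.com/rss.xml"   # placeholder feed URL
ITEMS_PER_REPLICA = 10                               # how many archive posts each replica shows

def fetch_archive(feed_url):
    """Return (title, link, description) tuples for every item in the RSS feed."""
    with urllib.request.urlopen(feed_url) as resp:
        tree = ET.parse(resp)
    items = []
    for item in tree.findall(".//item"):   # RSS 2.0 keeps posts under channel/item
        items.append((
            item.findtext("title", default=""),
            item.findtext("link", default=""),
            item.findtext("description", default=""),
        ))
    return items

def build_replica(feed_url, out_path, count=ITEMS_PER_REPLICA):
    """Write a simple HTML page containing a random subset of the archive."""
    archive = fetch_archive(feed_url)
    picks = random.sample(archive, min(count, len(archive)))
    with open(out_path, "w", encoding="utf-8") as fh:
        fh.write("<html><body>\n")
        for title, link, desc in picks:
            fh.write('<h2><a href="%s">%s</a></h2>\n<p>%s</p>\n' % (link, title, desc))
        fh.write("</body></html>\n")

if __name__ == "__main__":
    # One call per replica blog; each run selects a different random assembly.
    build_replica(HOME_FEED, "replica1.html")
[/code]

Re-running that (from cron, say) re-pulls the feed each time, so anything new on the home blog automatically flows into the replica on the next rebuild.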

Now replicate the same set of blogs another 5 times on a different domain and you have created another 5 instances of original content, all assembled from the first blog's archive. Then, as you issue regular updates to the first blog, the changes propagate across the whole matrix.

Imagine spending one day a week replicating your blog 5x on a new domain and server, then spending the other 4 or 5 days a week adding something new to your "home blog". After two months you would have a self-supporting network of blog pages that are attractive to Google and feed traffic to each other.

Once you have a large archive of content on the home blog, you should be able to present pages to surfers that are unique in content, meaning that even if a surfer hits 3 or 5 or 10 of your replicated pages, he should be presented with a unique assembly of your content and therefore sees an original page each time he loads one of your pages.
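If you want every single page load to be a different assembly rather than rebuilding static pages now and then, the selection can just happen per request. Here is a minimal sketch of that idea (Python again, with a hard-coded archive standing in for the real RSS feed): a tiny page that re-shuffles the archive on every hit.

[code]
# Minimal sketch of serving a fresh random assembly on every page load, so a
# surfer who hits several replica pages never sees the same arrangement twice.
import random
from wsgiref.simple_server import make_server

# In a real setup this list would come from the home blog's RSS feed (see the
# fetch_archive() sketch above); hard-coded placeholders here to stay short.
ARCHIVE = [
    ("Post %d" % n, "http://homeblog.example.com/post-%d" % n, "Teaser for post %d" % n)
    for n in range(1, 65)   # roughly 64 articles after 4 posts/week for 4 months
]

def app(environ, start_response):
    """WSGI handler: pick 10 random archive posts per request and render them."""
    picks = random.sample(ARCHIVE, 10)
    body = "<html><body>\n"
    for title, link, desc in picks:
        body += '<h2><a href="%s">%s</a></h2>\n<p>%s</p>\n' % (link, title, desc)
    body += "</body></html>\n"
    data = body.encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8"),
                              ("Content-Length", str(len(data)))])
    return [data]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()   # browse to http://localhost:8000/
[/code]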