Google's new methodologies? Discussion of Google's alternative PR methods
Just some alternative methodologies. Google keeps its method secret, obviously, but based on the recent update the method has changed dramatically. Here are some alternatives that may explain the change, along with some thoughts on how webmasters can use these changes to their advantage.
Source: D. Leonhart

1. Google might start valuing inbound links within paragraphs much higher than links that stand on their own. (For all we know, Google is already doing this.) Such links are much less likely to be the product of a link exchange, and therefore more likely to be genuine "democratic" votes.

2. Google might look at the concentration of inbound links across a website. If most inbound links point to the home page, that is another possible indicator of a link exchange, or at least that the site's content is not important enough to draw inbound links (and it is content that Google wants to deliver to its searchers).

3. Google might take a sample of inbound links to a domain and check how many are reciprocated back to the linking domains. If a high percentage are reciprocated, Google might reduce the site's PageRank accordingly. Or it might set a cut-point, dropping from its index any website with too many of its inbound links reciprocated.

4. Google might start valuing outbound links more highly. Two pages with 100 inbound links are, in theory, valued equally, even if one has 20 outbound links and the other has none. But why should Google send its searchers down a dead-end street when the information highway is paved just as smoothly on a major thoroughfare?

5. Google might weigh a website's outbound link concentration. A website with most outbound links concentrated on just a few pages is more likely to be a "link-exchanger" than a site with links spread out across its pages.

Google might use a combination of these techniques and others not mentioned here. We cannot predict the exact algorithm, nor can we assume that it will remain constant. What we can do is prepare our websites to look and act as a website would on a "democratic" Web as Google would see it. For Google to hold its own against upstart search engines, it must deliver on its PageRank promise: that its results reflect the "democratic" nature of the Web.

Its algorithm must prod webmasters to give links on their own merit. That won't be easy, or even completely possible, and people will always find ways to turn Google's algorithm to their advantage. But the techniques above could take the Internet a long way back toward where Google promises it will be.
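Point 3 of the article (sampling inbound links and checking reciprocity) can be sketched as a toy heuristic. Everything here (the function names, the 50% cut-point, the linear penalty) is invented for illustration; nobody outside Google knows how, or whether, such a check is actually done.

```python
def reciprocity_ratio(inbound_domains, outbound_domains):
    """Fraction of domains linking in that the site also links back out to."""
    if not inbound_domains:
        return 0.0
    return len(inbound_domains & outbound_domains) / len(inbound_domains)

def adjusted_pagerank(pagerank, ratio, cut_point=0.5):
    """Toy penalty: scale PageRank down as reciprocity rises;
    drop the site entirely past the cut-point."""
    if ratio > cut_point:
        return 0.0  # "dropped from the index" in this sketch
    return pagerank * (1.0 - ratio)

inbound = {"a.com", "b.com", "c.com", "d.com"}   # sampled inbound linkers
outbound = {"b.com", "d.com", "e.com"}           # domains the site links to
r = reciprocity_ratio(inbound, outbound)
print(r)                          # 0.5  (b.com and d.com reciprocate)
print(adjusted_pagerank(4.0, r))  # 2.0
```

The same skeleton fits points 2 and 5: swap the reciprocity ratio for the share of inbound links hitting the home page, or the share of outbound links concentrated on a few pages.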
Where did you get it?
time to build googlewebs
The never-ending dance is on.
Expect nothing stable from Google any longer. Your lucky weeks are now just lucky days. Which is not all bad.
Very true, Chemical.
My point was that you will no longer be able to estimate the duration of placement like in the past. Or can you? I haven't seen one hold longer than 72 hours, but I am only doing cursory research; I am not in the thick of it.
I read that complete article somewhere recently... good speculation but only time will tell.
In the meantime, word on the SEO street is that the 'legit' techniques have been largely unaffected.
Unfortunately there's no such thing as "legit" techniques, as anything done to improve your ranking in the SERPs has always been frowned upon by the guys at Google. There's never been ethical SEO either; it's just a label people use to try to seem "morally higher" than cloakers and spammers.
Nice article, thanks for sharing.
The biggest factor nowadays is the filter. Certain keywords are now essentially impossible to rank for.
Quote:
cluck, if I used the XML feeds from Searchfeed on a site of mine, could I open relevant search results directly as a popup? Or do they have to be searched for from that site?
Or you can just use those Russian RSS pimping scripts floating around. Works better and your popups can make you MEEEEEEEEELIONS. LOL
I am new to all this, but I have learned a lot real fast.

1. The number of sites linking to you means very little. Everyone says this is most important; I say that is bullshit. Linkbacks mean very little as far as getting high PR.

2. I think the number of keywords on your site versus daily searches for those keywords has a major part in PR ranking. The more keywords you have, the better. I do not think they penalize you for too many.

3. I really think that Google's toolbar is working like Alexa's in some way. Has anyone broken the code apart and can confirm this?

I may be totally off on all this, but from what I see.......
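The keyword claim in point 2 can at least be made concrete as a crude keyword-density measure. Whether Google weighs anything like this is pure speculation (and other posters in this thread doubt that more keywords is always better); the function below is only a sketch, with made-up example text.

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` as a fraction of all words on the page."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = "cheap widgets here, the best widgets, buy widgets today"
print(round(keyword_density(page, "widgets"), 2))  # 0.33
```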
old news.
thanks
way to copy and paste a certain newsletter $5 submissions :Graucho
Read this: http://www.scroogle.org/fiasco.html
This is what I thought might happen (though they have done it on a larger scale). I thought they might start using keywords in domains, and the similarity between links: if every site linking to another site uses the words 'teen sex', it might set off a red flag and lead them to punish that site somewhat. It is a good read anyway.
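The "every inbound link uses the same anchor text" pattern described above is easy to measure. This is a hypothetical sketch of such a filter signal, with invented names and example data; it is not anything Google has confirmed.

```python
from collections import Counter

def dominant_anchor_share(anchor_texts):
    """Share of inbound links whose anchor text matches the single most
    common one. A value near 1.0 means nearly every linker uses identical
    wording, the pattern the linked article suggests could trip a filter."""
    if not anchor_texts:
        return 0.0
    counts = Counter(t.strip().lower() for t in anchor_texts)
    return counts.most_common(1)[0][1] / len(anchor_texts)

anchors = ["teen sex", "Teen Sex", "teen sex", "my friend's site"]
print(dominant_anchor_share(anchors))  # 0.75
```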
I don't think Google's algorithm has changed much, if at all; I think it is making some changes and will make more over a long period.
From what I can see, their first change was to reduce the value of keywords some *sites are using. (*Sites that are interlinking from the same IP address; if your box only runs from one IP, all the domains on that box have been devalued.) I could be way off, but this is my current thinking!
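The same-IP interlinking guess above can be sketched too. The mapping, function name, and devaluation rule are assumptions for illustration; whether Google actually discounts links between domains hosted on one IP is unconfirmed.

```python
def same_ip_links(domain_to_ip, links):
    """Return the links whose source and target domains share a hosting IP,
    i.e. the links this guess says would be devalued.

    domain_to_ip: domain -> IP mapping (assumed already resolved via DNS)
    links: (source_domain, target_domain) pairs
    """
    return [(src, dst) for src, dst in links
            if domain_to_ip.get(src) is not None
            and domain_to_ip.get(src) == domain_to_ip.get(dst)]

ips = {"a.com": "10.0.0.1", "b.com": "10.0.0.1", "c.com": "10.0.0.2"}
links = [("a.com", "b.com"), ("a.com", "c.com")]
print(same_ip_links(ips, links))  # [('a.com', 'b.com')]
```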
I believe the original post to be pretty accurate based on what I've seen over the last month. Good synopsis.
How long did it take for you to figure out a successful business model for your site? How often did this business model change while you were growing? Is it still changing?
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.
©2000-, AI Media Network Inc