Quote:
|
Originally Posted by SmokeyTheBear
You make some very good points, but I have to disagree with a few of those points, I think..
"The moment a Class C is identified as participating in a scheme to undermine and/or circumvent an SE algo, not only is it toast, but so is everyone touching it and to some degree those touched by it."
Can that degree really be very significant? I always tend to disagree with penalization extending beyond the site it's on (IP).
Or else you could make SE traps to penalize your competition..
|
Officially, other sites linking to yours are not supposed to be able to hurt your standing for the very reason that they tend to be out of your control and anyone would be able to sabotage your success. That's the official statement by G, but I'm not alone in suspecting (from personal observation) that this does not hold true as much as we would like to believe.
There is a concept known as a 'bad neighbourhood', and participating in one (linking to it, or being linked from it) can be penalized. The best example is the 'Link Farm' - an attempt to bolster SE ranks through a syndicated methodology that was really just an artificial means of tipping the 'link-back scales' in a site's favour. At first it worked, but once link farms were recognized as mostly counter-productive (to SE ranking) and, more importantly, *identifiable* (the syndication of a link farm leaves clear patterns of links, domains, IPs, etc.), they were targeted as a specific form of SE abuse, and measures were applied to flag anyone suspected of *participating* in a link farm and penalize them accordingly.
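To make that 'identifiable patterns' point concrete, here is a minimal sketch (in Python, with a made-up link graph and hosting data - an illustration of the kind of heuristic involved, not any SE's actual algo) that flags a tightly interlinked cluster of domains sharing an IP:
Code:
|
from itertools import combinations

# Hypothetical link graph: domain -> set of domains it links to.
links = {
    "a.com": {"b.com", "c.com", "d.com"},
    "b.com": {"a.com", "c.com", "d.com"},
    "c.com": {"a.com", "b.com", "d.com"},
    "d.com": {"a.com", "b.com", "c.com"},
    "blog.example": {"a.com"},
}

# Hypothetical hosting data: domain -> IP address.
ip_of = {
    "a.com": "10.0.0.5", "b.com": "10.0.0.5",
    "c.com": "10.0.0.5", "d.com": "10.0.0.6",
    "blog.example": "203.0.113.9",
}

def farm_score(group):
    """Fraction of possible reciprocal links present within a group,
    plus a bonus when domains share an IP - two of the classic
    link-farm fingerprints mentioned above."""
    pairs = list(combinations(group, 2))
    reciprocal = sum(
        1 for x, y in pairs
        if y in links.get(x, set()) and x in links.get(y, set())
    )
    density = reciprocal / len(pairs) if pairs else 0.0
    shared_ip = len({ip_of[d] for d in group}) < len(group)
    return density + (0.25 if shared_ip else 0.0)

suspects = ["a.com", "b.com", "c.com", "d.com"]
# Near-complete reciprocal interlinking on shared IPs -> high score.
print(f"farm score: {farm_score(suspects):.2f}")
|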
Now, with human review (yeah - which has been going on for a very long time now), sites being linked to can be evaluated as *participants* in a 'bad neighbourhood'.
There is no question that penalizing a site based on who links to it is treading on thin ice, but it does not take much analysis to flag obvious 'bad neighbourhood' players. That's one of the advantages of human review augmenting automated algos (as in Trust Rank et al).
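For a feel of how human review can seed an automated algo, here is a toy TrustRank-style propagation - the link graph, seed set, and damping value are all invented for illustration, not the real thing:
Code:
|
# Toy TrustRank-style propagation: human reviewers mark a small seed
# set as trusted, then trust flows along outlinks, decaying per hop.
links = {
    "seed.org":  ["good.com", "blog.net"],
    "good.com":  ["blog.net"],
    "blog.net":  [],
    "farm1.biz": ["farm2.biz"],
    "farm2.biz": ["farm1.biz"],
}
seeds = {"seed.org"}   # hand-reviewed, trusted pages
damping = 0.85         # illustrative decay factor per hop

trust = {page: (1.0 if page in seeds else 0.0) for page in links}
for _ in range(20):    # iterate until roughly stable
    new = {page: (1.0 if page in seeds else 0.0) for page in links}
    for page, outs in links.items():
        if not outs:
            continue
        share = damping * trust[page] / len(outs)
        for out in outs:
            new[out] += share
    trust = new

for page, t in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{page:12s} {t:.3f}")
# Pages unreachable from the seed set (the 'farm' cluster) end up
# with zero trust, no matter how densely they interlink.
|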
Because of the covert nature of these schemes (attempts to *artificially* manipulate SERPs) and the deep impact they have on messing up legitimate SERPs, the penalties are serious. If you get busted, you're toast.
Quote:
|
Originally Posted by SmokeyTheBear
SEO is not allowed at all, is it? I thought that was a Google rule. So "undermine and/or circumvent an SE algo" is basically anything to improve your rank that wasn't done for your site..
I can barely think of any sites that don't fit into that category.. Or you could argue the opposite: any optimization you do (shady or not) is for your own internal SE, so nobody fits that category..
|
Academically speaking, SEO does go against the spirit of the rules of SEs. But this is when you take the concept to an abstract level, and as with most considerations at that level, it is less practical and harder to apply to tangible scenarios. It makes for sensational statements, but it does not help someone trying to 'do well' when it comes to SE traffic.
Even so, with the most abstract notion of 'avoiding algo manipulation', there is absolutely nothing preventing one from building good sites with lots of good content which will win the legitimate attention of surfers looking for your topic through search engines.
If you take the time to create pages which, on their own, are something a surfer would want to visit, and which are distinct from other pages out there, you fall into the winning zone - without tripping any 'SE rule violations'.
So, exploring this further, we come to Meta Tags. I long ago stopped considering my efforts SEO (search engine optimizing) and instead think of them as SEF (search engine friendliness). Given that a web page is built from several components, and various search engines make use of those components, not using Meta Tags would be like not adding a title to your page. Most of the more sophisticated SEs have stopped looking at Meta Tags because of how they have been abused, and since their algos can ferret out what a page is about from its actual text content, Meta Tags matter less to them.
But this is not true for all SEs. Some SEs (which can bring convertible traffic) do rely on Meta Tags more, so it is prudent to make sure your web pages are 'readable' by as many SEs as possible. Adding Meta Tags does not necessarily 'manipulate' SE results.
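For instance, here is a rough sketch of how a simpler SE might read a page's title and Meta Tags (the page content is made up, and a real crawler is obviously far more involved):
Code:
|
from html.parser import HTMLParser

# A hypothetical 'SEF' page head: descriptive tags that match the
# visible content, rather than stuffed with unrelated keywords.
PAGE = """<html><head>
<title>Hand-Rolled Cigars: Care and Storage</title>
<meta name="description" content="How to store hand-rolled cigars at the right humidity.">
<meta name="keywords" content="cigars, humidor, storage">
</head><body>...</body></html>"""

class MetaReader(HTMLParser):
    """Minimal sketch of how a simpler SE might read a page's head."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

reader = MetaReader()
reader.feed(PAGE)
print(reader.title)  # Hand-Rolled Cigars: Care and Storage
print(reader.meta)   # {'description': ..., 'keywords': ...}
|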
Meta Tags can be (and are) abused - that sort of usage does fly in the face of the 'manipulation' guidelines.
The same applies to H1/H2/H3 tags and CSS. If you use CSS to make text invisible to surfers so you can stuff your pages with keywords without messing up your pitch, rather than using CSS to apply legitimate style to your site, you're on 'that side' of the fence again.
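Flagging the crude version of that trick is trivial. Here is a naive sketch that scans inline styles for the obvious hiding patterns - real engines go much further (rendered layout, external stylesheets, colour contrast), but the blatant fingerprints fall to a regex:
Code:
|
import re

# Inline-style patterns that hide text from surfers while leaving it
# visible to a crawler - the sort of CSS abuse described above.
HIDDEN_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{3,}px",   # text shoved far off-screen
]

def flags_hidden_text(html):
    """Return any inline styles that match a known hiding pattern."""
    styles = re.findall(r'style\s*=\s*"([^"]*)"', html, re.IGNORECASE)
    return [
        s for s in styles
        if any(re.search(p, s, re.IGNORECASE) for p in HIDDEN_PATTERNS)
    ]

sample = '<div style="display:none">cigars cigars cigars humidor cheap</div>'
print(flags_hidden_text(sample))  # ['display:none'] -> stuffing suspect
|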
SEO/SEF does delve into shades of gray, but just like on my 10-year-old monitor, some shades of gray show up as clearly black or white now. The same applies to the extreme edges of SEO.
Quote:
|
Originally Posted by SmokeyTheBear
Google does it; they were caught not long ago keyword stuffing to improve the rank of a page in its own SE (because it wasn't getting listed properly), as I recall.
|
That's a different issue altogether - now we're talking about the integrity of a biz, and with the financial stakes at play, it's not really a big surprise if/when a big player is caught breaking their own rules. Didn't MS get busted by the Linux guys for actually passing along 'private' info from their earlier IE browsers, despite their blatant denials of doing so?
G is no different when it comes to pushing the limits. They are huge and behave like a huge biz. Sometimes the corporate mandate of making profit for the shareholders can yield some unsavoury manifestations. And one of the benefits of being so large is that even being called on it requires serious mass on the part of the whistleblower. Changing anything would require the efforts of a peer titan (with whom they probably already have preemptive deals to keep the boat from rocking).
On this note, though: with the enormous influence some SEs now have over Internet commerce, don't be surprised to see more discussion of the FTC, anti-trust, and other questions about whether the practices and reach of SEs create conflicts that call for the same legal management MS faced, when it was deemed that its operating system and web browser could no longer shelter under the same roof without further compromising market fairness.
SEs which claim to offer surfers pointers to sites that are 'objectively' appropriate and relevant to their queries *AND* which run Ad Revenue programs tied to those same queries might get the two jumbled up (to put it nicely) - and not by accident. Hopefully, when the stakes are high enough from the market's perspective, they (the SEs) will be called on how they do this, and any potential trade conflicts will have to be examined more closely.
But don't hold your breath. When enterprises like G have the audacity to propose a program whereby they will scan/index and publish copyrighted books without the express permission of the copyright holders - and, moreover, propose that copyright holders should notify them if they do NOT want their properties published this way - it tells us a lot about who's who in the power game.
-Dino