GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   SEO: how important are different ip`s? (https://gfy.com/showthread.php?t=502716)

hmmwv 08-12-2005 08:07 AM

SEO: how important are different IPs?
 
Let's say I want to build 20 new sites.
How important for SEO is it that each domain has its own IP address?

EroticySteve 08-12-2005 08:08 AM

They are important if you want to do well with Google.

montel 08-12-2005 08:08 AM

Yeah, different servers are key!

Linkster 08-12-2005 08:12 AM

They don't make one iota of difference unless you are running hundreds of domains and interlinking the pages - and even then, unless you end up looking like a link farm, you'd probably still be way below their radar. Unfortunately, a lot of people who don't understand SEs use this as a "fix-all" when it is really such a small issue.
Think about all the free hosts you see listed in the top 10 - and there are bunches - where they have hundreds of sites on the same IP and the same domain. It doesn't seem to affect them :)

wedouglas 08-12-2005 08:41 AM

Not very important.

Monsieur 08-12-2005 08:44 AM

I'd say very important if you're building networks that link in circles, e.g. A --> B --> A, etc.
I'm now building all my portals on different IP ranges.

Doc911 08-12-2005 09:01 AM

I'd try to have at least 3-4 different servers with different IP ranges.

Juilan 08-12-2005 09:19 AM

Quote:

Originally Posted by Doc911
I'd try to have at least 3-4 different servers with different IP ranges.

Good advice from Doc :pimp

Linkster 08-12-2005 09:22 AM

Quote:

Originally Posted by Doc911
I'd try to have at least 3-4 different servers with different IP ranges.

Don't you think that's a little overkill for 20 domains? Especially without knowing if they are going to be interlinked at all?

hmmwv 08-12-2005 10:46 AM

keep the opinions coming... :thumbsup

mkx 08-12-2005 11:00 AM

Good question. I have hundreds of sites on the same server, and many interlink with each other for SEO. If I move the biggest sites to a server of their own, will Google still associate the sites with the old IP?

agaysex 08-12-2005 11:11 AM

No, I don't think it affects site ranking at the current time. Maybe a very big doorway network could be found this way - but only in a manual review by Google's experts.

Linkster 08-12-2005 11:44 AM

Quote:

Originally Posted by agaysex
No, I don't think it affects site ranking at the current time. Maybe a very big doorway network could be found this way - but only in a manual review by Google's experts.


Exactly right - and plenty of people have proven it. It's not until you show up on their radar that people start looking at you.
But it is a good little scare tactic to use if you're selling yourself as an SEO :) Along with all the other scam tactics they employ to scare people into thinking that you can't be your own SEO.

koreanbbque 08-12-2005 11:50 AM

Different IPs are very important - well, according to the Google patent article that I read. It appears they check for same IPs, same IP blocks, DNS registration information (such as date of registration and how long the domains are registered for), as well as DNS server information.

Pretty crazy, those Google guys.

Here's someone's take on the Google patent - you guys have probably already read something similar to this:

http://www.mediajunk.com/public/archives/000365.html
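
To make the "same IPs / same IP blocks" check concrete: here is a minimal sketch (my own reconstruction, not anything from the patent) of how a set of domains could be grouped by their /24 ("Class C") block. The domain names are hypothetical placeholders; the registration-date and nameserver checks would need WHOIS data and are not shown.

Code:

import socket
from collections import defaultdict

# Hypothetical domain list - placeholders, not real sites.
domains = ["site-a.example", "site-b.example", "site-c.example"]

by_block = defaultdict(list)
for name in domains:
    try:
        ip = socket.gethostbyname(name)
    except socket.gaierror:
        continue  # skip names that do not resolve
    block = ".".join(ip.split(".")[:3])  # first three octets = /24 ("Class C")
    by_block[block].append((name, ip))

# Any block holding more than one domain is a visible shared footprint.
for block, members in by_block.items():
    if len(members) > 1:
        print(block + ".0/24 ->", members)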

Troels 08-12-2005 12:56 PM

If you get outside links into your network, it doesn't mean a damn thing.
If you're doing dodgy stuff, I suppose there's a bigger risk of a total network ban, but you'd get that anyway if there were a larger manual effort from Google's end. There rarely is, though.

dcortez 08-12-2005 01:58 PM

There are conflicting schools of thought regarding different Class C IPs and their effect on SERPs.

I have operated a dozen or so domains on a single IP address, and they have worked OK as far as contributing PR to each other and bringing in SE traffic. Part of the reason for this is that my 'inter-realm linking' is thematic.

There are some who are offering (what I call) 'Class C IP Farms' - claiming that a single web operator can now enjoy the benefits of multiple Class C's for better SE performance. Just as 'Link Farms' have had their day and are now instant (SE) death to any site participating in them, I believe 'Class C IP Farms' will follow the same terminal path.

Here is why:

The spirit of SEs giving greater value to links from 'other' sites is based on the assumption that those 'other' sites are genuine votes of popularity from unrelated websites/operators.

SEs also assume (though they understand this is not always the case) that sites on the same Class C IP are often owned by the same operator.

When the 'other' sites are really *fake* 'other sites', this does not fall within the spirit of rewarding truly unbiased votes of popularity with PR and subsequent SE prominence. Once 'found out' (just as with 'Link Farms'), Class C farms hosting 'fake other sites' will be flagged as spamdexers (in my opinion), and the poison domino effect (being flagged as a 'bad neighbourhood') will be deadly.

For this reason, and to assert the logic proposed above, you will be hard pressed to find any published list of the Class C IPs being offered by the Class C IP Farms - they know that secrecy is the only way they can get away with their illusion of 'other sites' interlinking. They also know that the jig is up as soon as an IP network is reported as being a farm.

These 'farmers' should not underestimate the pattern-matching capabilities of the SEs. While having geographically and economically separated hosts may offer some 'distance' between the IPs, there are other indicators which can be matched and filtered. For example, using different Class Cs with different domains when the domain owner is recognizably the same (either explicitly or implicitly) defeats the whole purpose of getting secret different IPs - as the sketch below illustrates.
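
A toy illustration of that point (hand-filled data, assumed for the example - a real check would pull live WHOIS records): group domains by registrant and see whether one registrant spans several Class C blocks.

Code:

from collections import defaultdict

# Hypothetical WHOIS snapshots: domain -> (registrant email, resolved IP).
whois = {
    "site-a.example": ("owner@example.com", "203.0.113.10"),
    "site-b.example": ("owner@example.com", "198.51.100.20"),
    "site-c.example": ("someone-else@example.net", "192.0.2.30"),
}

blocks_by_registrant = defaultdict(set)
for domain, (email, ip) in whois.items():
    blocks_by_registrant[email].add(".".join(ip.split(".")[:3]))

# One registrant spread across many blocks is exactly the pattern
# that 'secret different IPs' are supposed to hide - and don't.
for email, blocks in blocks_by_registrant.items():
    if len(blocks) > 1:
        print(email, "spans", len(blocks), "Class C blocks:", sorted(blocks))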

If there is a benefit to being linked to from different Class Cs, it is intended for legitimate arm's-length owners. Cheating that rule may work for a while, but eventually it will go bust.

I would not invest in building sites on IPs spread across these 'fake other sites' - it's too hard to tell when that bubble will burst.

-Dino

Linkster 08-12-2005 02:31 PM

Dino - you have just explained perfectly why it is a waste of time to separate domains out over Class C's. Taking Google as an example: if they get a flag that you "might" be running a farm or group for your own benefit (which really doesn't happen much anymore), it all comes down to a manual review, supported by things like an automated review of the domain registrations (guess why Google recently became a certified registrar) and probability calculations that look at the distributions.
The point made about external links coming into the network is why most people who have hundreds of domains on a single Class C don't get looked at as link farms - they are distributing and receiving PR to and from different "owners", so no flag gets raised.
As far as ranking in the actual SE results (SERPs) goes, the more incoming and outgoing links you can build, the better - which, if you're playing with your own 20 domains, is easy enough to do in concert with other WMs - and still fly way under the SEs' flags.

hmmwv 08-12-2005 03:05 PM

thx guys, some good points in here!

Monsieur 08-12-2005 03:34 PM

Great post there Dino!!

mkx 08-12-2005 07:30 PM

I am interested to see what wiredguy has to say :)

baddog 08-12-2005 07:34 PM

Here are a couple of facts provided by Google:

* Near-Duplicate Content Filter = If multiple search results contain identical titles and snippets, then only one of the documents is returned.
* Host Crowding = If multiple results come from the same Web host, then only the first two are returned.

Now, you will find some people who will say, "well, that only applies if the filter is on." They are correct; however, the filter being on is the default setting, and I challenge you to figure out how to turn it off. I tried, and gave up after 15 or 20 minutes. I highly doubt any surfer is going to go through the effort, primarily because - why bother? They don't care.
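
As a rough reconstruction of how those two filters could combine (an outside guess at the behaviour, not Google's actual code), think of it as a post-processing pass over the raw result list:

Code:

# Assumed behaviour, reconstructed from the two rules above -
# not Google's actual implementation.
def filter_results(results, per_host_limit=2):
    seen_content = set()     # (title, snippet) pairs already shown
    shown_per_host = {}      # host -> number of results already shown
    filtered = []
    for r in results:        # each r: {"host": ..., "title": ..., "snippet": ...}
        key = (r["title"], r["snippet"])
        if key in seen_content:
            continue         # near-duplicate content filter
        if shown_per_host.get(r["host"], 0) >= per_host_limit:
            continue         # host crowding filter
        seen_content.add(key)
        shown_per_host[r["host"]] = shown_per_host.get(r["host"], 0) + 1
        filtered.append(r)
    return filtered

# With per_host_limit=2, a host with five matching pages shows only two.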

So, the question is how do I get around this host crowding filter?

You have two options. You can put each domain on a different host, meaning you will have multiple control panels, multiple bills and billing cycles, multiple personalities to deal with . . . kind of a PITA.

Or, you can use Got Web Host. We can give you multiple Class C IPs with multiple nameservers. We also offer free SEO consultations with our dedicated and virtual SEO plans.

The filter will only kick in if the sites are going for the same terms. But links between your sites will be more valuable if they're on separate C's. If they're on the same C, it doesn't hurt you (unless you're going for the same terms), but it doesn't help you either: Google will see them almost as the same site, treating the links as internal rather than external.

MrIzzz 08-12-2005 07:49 PM

there is no money in SEO :(

Linkster 08-12-2005 08:32 PM

baddog - that is very poorly written Google-speak. When they say they only return the first two results, they mean from the same domain (which in actual practice does not include subdomains - they will also include up to two results per subdomain on a particular search), not the same hosting company.

The way to see all of the results is to add &filter=0 to the end of the URL you get on a search - hope that helps.
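
For example (hypothetical query - the parameter is simply appended to the result URL):

Code:

# Hypothetical query showing the tip above: the same search, unfiltered.
from urllib.parse import urlencode

query = {"q": "widget reviews"}
print("http://www.google.com/search?" + urlencode(query))                      # filtered (default)
print("http://www.google.com/search?" + urlencode({**query, "filter": "0"}))  # all results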

mardigras 08-12-2005 09:00 PM

Personal experience: I recently moved 5 sites to a single hosting account (they were fairly small, and all combined only needed a fraction of the space in the account). There were a PR4, two PR3s, a PR2 and a PR1. They all dropped to no PR really quickly, and now all my search engine traffic to those domains comes from Yahoo. :2 cents:

baddog 08-12-2005 09:08 PM

Quote:

Originally Posted by Linkster

the way to see all of the results is to add &filter=0 to the end of the resultant URL that you get on a search - hope that helps


How many surfers do you think are adding that to their searches?

Your Mothers Secret 08-12-2005 09:19 PM

25 posts so far and 25 conflicting opinions. :1orglaugh

WebmasterWorld will provide you with an accurate answer, hmmwv :thumbsup

Don't ask those guys about porn, and obviously don't ask these guys about the inner workings of Google.

dcortez 08-12-2005 10:49 PM

Quote:

Originally Posted by baddog
* Near-Duplicate Content Filter = If multiple search results contain identical titles and snippets, then only one of the documents is returned.

And so it should be, because it is duplicate content.

The problem is in policing page jackers who scrape copyrighted content and/or trademarks and use them on their pages to win the traffic for those terms. It's one thing to use a snippet for a directory listing (regrettably, 'fair use' copyright law stretched to the limit almost permits this - with the exception of trademark abuse), but when your brand name is used on a page in a context which suggests it will lead to the brand source, but instead cloaks/refreshes to an entirely different page (usually a sponsor of the jacker), that is a violation with material damages.

Any jackers doing this better watch their asses - times are changing and they will quickly find themselves without any sponsors to send their stolen traffic to.

The good news (regarding scrapers/jackers) is that recently I have started successfully shutting sites down by contacting sponsors who benefit from the copyright/trademark violations. Most decent sponsors will close an affiliate account if it is engaged in illegal content activities.

The road for this was partly paved by the FTC going after sponsors whose affiliates violated SPAM laws - sponsors are starting to pay more attention to 'cheating by proxy'. It does not pay to risk their biz by associating with copyright criminals.

Quote:

Originally Posted by baddog
* Host Crowding = If multiple results come from the same Web host, then only the first two are returned. So, the question is how do I get around this host crowding filter?

This is an exaggeration. My experience has been to the contrary.

Specifically, if I do searches on terms for which I have won SE prominence (through original content), ALL of my domain results (SAME web host, SAME class C, SAME IP) come up in the SERPs for a given search.

I don't just get one of my domains showing up in the list. Often the first page is dominated by my domains - and yes, that includes G.

-Dino

dcortez 08-12-2005 11:01 PM

Quote:

Originally Posted by Linkster
baddog - that is very poorly written Google speak - when they say they only return the first two results they mean from the same domain

Exactly.

For brevity's sake (and variety), the first two results for EACH domain (even on the same IP) are listed, with a link to 'more results from this site'.

This whole notion of needing multiple Class C's through false representation (i.e., pretending that they are all different web businesses) is not really founded in fact - and even if it were, it is just one tiny (fragile) degree of separation from the very problem that those buying into the Class C IP farms are trying to avoid.

The moment a Class C is identified as participating in a scheme to undermine and/or circumvent an SE algo, not only is it toast, but so is everyone touching it and to some degree those touched by it.

Again, the big SEs are trying to value link-backs for what they are as far as votes of confidence. If you fake it, they will not come.

Can you spell 'permanent sandbox'?

If one is worried about only one of their domains winning the SE traffic for a particular term - why? You only need one stream to direct to your conversion machine. So if your A site shows up instead of your B site, that's fine - you still won the surfer's attention - now turn them into $$$.

-Dino

SmokeyTheBear 08-12-2005 11:25 PM

Quote:

Originally Posted by dcortez
The moment a Class C is identified as participating in a scheme to undermine and/or circumvent an SE algo, not only is it toast, but so is everyone touching it and to some degree those touched by it.

You make some very good points, but I have to disagree with a few of them, I think.

Can that degree really be very significant? I always tend to disagree with penalization extending beyond the site (IP) it's on.

Otherwise you could make SE traps to penalize your competition...

SEO is not allowed at all, is it? I thought that was a Google rule. So "undermine and/or circumvent an SE algo" is basically anything done to improve your rank that wasn't done for the site itself...

I can barely think of any sites that don't fit into that category. Or you could argue the opposite: any optimization you do (shady or not) is for your own internal SE, so nobody fits that category...

Google does it themselves - they were caught not long ago keyword stuffing to improve the rank of one of their own pages (because it wasn't getting listed properly), as I recall.

PbG 08-13-2005 02:59 AM

I think that the popularity and variety of the inbound links relative to the term searched outweigh the value of the IP, absent any flags to the contrary. Google is the only SE to mention host crowding, and I agree that it is likely limited to results from the same domain. Besides, SEO is not an ARIN justification for issuing a dedicated IP.

Linkster 08-13-2005 03:44 AM

Quote:

Originally Posted by baddog
How many surfers do you think are adding that to their searches?

I was just providing the method you asked about - the only time a surfer will use it is unknowingly: when you get to the end of the results, Google automatically does it for you with its link to repeat the search and see the "omitted results".

As far as moving domains goes - mardigras - that could have happened for numerous reasons and shouldn't automatically be attributed to changing hosts, other than that the new host may not have the server config set correctly (which happens all the time and screws up people's SE rankings).

YourMothersSecret - I would have to say that would be the last place I would send someone for advice unless they were able to tell the difference between the hype and the truth - too many people over there answer questions with what "they have heard". It's a good way to be misled quickly.

Linkster 08-13-2005 03:46 AM

Quote:

Originally Posted by PbG
Google is the only SE to mention host crowding, and I agree that it is likely limited to results from the same domain. Besides, SEO is not an ARIN justification for issuing a dedicated IP.

Very true - Google also already knows about its "mis-speaks" and has someone rewriting all of the help pages right now - they are supposed to be done with the rewrite at the end of the summer.

dcortez 08-13-2005 09:59 AM

Quote:

Originally Posted by SmokeyTheBear
You make some very good points, but I have to disagree with a few of them, I think.

"The moment a Class C is identified as participating in a scheme to undermine and/or circumvent an SE algo, not only is it toast, but so is everyone touching it and to some degree those touched by it."

Can that degree really be very significant? I always tend to disagree with penalization extending beyond the site (IP) it's on.

Otherwise you could make SE traps to penalize your competition...

Officially, other sites linking to yours are not supposed to be able to hurt your standing, for the very reason that they tend to be out of your control and anyone would otherwise be able to sabotage your success. That's the official statement from G, but I'm not alone in suspecting (from personal observation) that this does not hold true as much as we would like to believe.

There is a concept known as a 'bad neighbourhood', and participating (to/from) in 'bad neighbourhoods' can be penalized. The best examples of this are 'Link Farms': an attempt to bolster SE ranks through a syndicated methodology which was really an artificial means of tipping the 'link-back scales' in a site's favour. At first it worked, but once a link farm was recognized as being mostly counter-productive (to SE ranking) and *identifiable* (the syndication of a link farm has clear patterns of links, domains, IPs, etc.), 'link farms' were targeted as a specific form of SE abuse, and measures were applied to flag anyone suspected of *participating* in a link farm and penalize them accordingly.

Now, with human review (yes - it has been going on for a very long time), sites being linked to can be evaluated as *participants* in a 'bad neighbourhood'.

There is no question that penalizing a site based on who links to it is treading on thin ice, but it does not take much analysis to flag obvious 'bad neighbourhood' players. That's one of the advantages of human review augmenting automated algos (as in Trust Rank et al).

Because of the covert nature of these schemes (attempts to *artificially* manipulate SERPs) and the deep impact they have on messing up legitimate SERPs, the penalties are serious. If you get busted, you're toast.

Quote:

Originally Posted by SmokeyTheBear
SEO is not allowed at all, is it? I thought that was a Google rule. So "undermine and/or circumvent an SE algo" is basically anything done to improve your rank that wasn't done for the site itself...

I can barely think of any sites that don't fit into that category. Or you could argue the opposite: any optimization you do (shady or not) is for your own internal SE, so nobody fits that category...

Academically speaking, SEO does go against the spirit of the SEs' rules. But that is taking the concept to an abstract level, and as with most considerations at that level, it is less practical and harder to apply to tangible scenarios. It makes for sensational statements, but it does not help someone trying to 'do well' when it comes to SE traffic.

Even so, with the most abstract notion of 'avoiding algo manipulation', there is absolutely nothing preventing one from building good sites with lots of good content, which will win the legitimate attention of surfers looking for your topic through search engines.

If you take the time to create pages which are, on their own, something a surfer would want to visit, and which are distinct from other pages out there, you fall into the winning zone - without tripping any 'SE rule violations'.

So, exploring this further, we come to Meta Tags. I long ago stopped considering my efforts SEO (search engine optimizing) and instead think of them as SEF (search engine friendliness). Given that the structure of a web page involves different components, and various search engines make use of those components, not using Meta Tags would be like not adding a title to your page. Most of the more sophisticated SEs have stopped looking at Meta Tags because of how they have been abused; since their algos can ferret out what pages are about from the actual text content, Meta Tags are less relevant to them.

But this is not true for all SEs. Some SEs (which can bring convertible traffic) do rely on Meta Tags more, so it is prudent to make sure your web pages are 'readable' by as many SEs as possible. Adding Meta Tags does not necessarily 'manipulate' SE results.

Meta Tags can be (and are) abused - that sort of usage does fly in the face of the 'manipulation' guidelines.

The same applies to H1/H2/H3 tags and CSS. If you use CSS to make text invisible to surfers so you can stuff your pages with keywords without messing up your pitch, rather than to apply legitimate style to your site, you're on 'that side' of the fence again (see the toy checker below).
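
To make that concrete, here is a toy checker (my own sketch, not anything from the thread) that flags the crudest form of the abuse - inline display:none / visibility:hidden styles wrapping text. Real detection would also have to handle external stylesheets, off-screen positioning, and text coloured to match the background.

Code:

import re

# Matches an element whose inline style hides it, capturing the hidden text.
HIDDEN = re.compile(
    r'<(\w+)[^>]*style\s*=\s*"[^"]*(?:display\s*:\s*none|visibility\s*:\s*hidden)'
    r'[^"]*"[^>]*>(.*?)</\1>',
    re.IGNORECASE | re.DOTALL,
)

def hidden_text(html):
    return [m.group(2).strip() for m in HIDDEN.finditer(html) if m.group(2).strip()]

page = '<p>Real pitch here.</p><div style="display:none">casino casino casino</div>'
print(hidden_text(page))  # ['casino casino casino']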

SEO/SEF does delve into shades of gray, but just like on my 10-year-old monitor, some shades of gray show up as clearly black or white now. The same applies to the extreme edges of SEO.

Quote:

Originally Posted by SmokeyTheBear
Google does it themselves - they were caught not long ago keyword stuffing to improve the rank of one of their own pages (because it wasn't getting listed properly), as I recall.

That's a different issue altogether - now we're talking about the integrity of a business, and with the financial stakes at play, it's not really a big surprise if/when a big player is caught breaking its own rules. Didn't MS get busted by the Linux guys for actually passing up 'private' info from their earlier IE browsers, despite their blatant denials of doing so?

G is no different when it comes to pushing the limits. They are huge and behave like a huge business. Sometimes the corporate mandate of making profit for the shareholders can yield some unsavoury manifestations. And one of the benefits of being so large is that even being called on it requires serious mass (on the part of the whistle-blower). To change anything would require the efforts of a peer titan (with whom they probably already have preemptive deals to keep the boat from rocking).

On this note, though: with the enormous influence some SEs now have over Internet commerce, don't be surprised to see more discussion of the FTC, anti-trust, and other scrutiny evaluating whether the practices and reach of SEs conflict with market fairness - the same legal management MS faced when it was deemed that its operating system and web browser could no longer be sheltered under the same roof without further compromising the market.

SEs which claim to offer surfers pointers to sites that are 'objectively' appropriate and relevant to their queries *AND* engage in ad-revenue programs related to those same queries might get the two jumbled up (to put it nicely) - and not by accident. Hopefully, when the stakes are high enough from the market's perspective, SEs will be called on how they do this, and any potential trade conflicts will be examined more closely.

But don't hold your breath. When enterprises like G have the audacity to propose a program whereby they will scan, index, and publish copyrighted books without the express permission of the copyright holders - and moreover propose that the copyright holders should notify them if they do NOT want their properties published this way - it tells us a lot about who's who in the power game.

-Dino

Rui 08-13-2005 12:20 PM

Way overrated - especially given what people use them for (the projects, that is)...

