GoFuckYourself.com - Adult Webmaster Forum


$5 submissions 03-20-2010 02:05 PM

Recent Google SEO Interview: Hmmm no mention of page speed :)
 
Interesting to note what was deemed IMPORTANT ENOUGH to talk about and the stuff left off the table. http://www.stonetemple.com/articles/...s-012510.shtml

$5 submissions 03-20-2010 02:06 PM

The "crawl budget":

Quote:

There is also not a hard limit on our crawl. The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we'll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we'll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline.

Another way to think about it is that the low PageRank pages on your site are competing against a much larger pool of pages with the same or higher PageRank. There are a large number of pages on the web that have very little or close to zero PageRank. The pages that get linked to a lot tend to get discovered and crawled quite quickly. The lower PageRank pages are likely to be crawled not quite as often.
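The proportionality Cutts describes can be pictured with a toy allocator. This is purely illustrative: the page names and PageRank scores below are invented, and Google's real scheduler is obviously far more involved than a one-liner.

```python
# Toy sketch, NOT Google's actual scheduler: split a fetch budget across
# pages in proportion to PageRank, as the quote describes. The page
# names and scores below are invented for illustration.

def allocate_crawl(pagerank, total_fetches):
    """Give each page a share of the fetch budget proportional to its PageRank."""
    total_pr = sum(pagerank.values())
    return {page: round(total_fetches * pr / total_pr)
            for page, pr in pagerank.items()}

site = {
    "/": 0.50,               # root page: lots of incoming links
    "/category": 0.30,       # one level down
    "/category/item": 0.15,
    "/deep/old/page": 0.05,  # deep pages: PageRank tends to decline
}

print(allocate_crawl(site, total_fetches=100))
```

The deep, low-PageRank page ends up with a handful of fetches while the root gets half the budget, which matches the "lower PageRank pages are crawled not quite as often" point.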

I think the recent blogosphere babble re "page speed" is clarified by this:

Quote:

One thing that's interesting in terms of the notion of a crawl budget is that although there are no hard limits in the crawl itself, there is the concept of host load. The host load is essentially the maximum number of simultaneous connections that a particular web server can handle. Imagine you have a web server that can only have one bot at a time. This would only allow you to fetch one page at a time, and there would be a very, very low host load, whereas some sites like Facebook, or Twitter, might have a very high host load because they can take a lot of simultaneous connections.
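One way to picture host load is as a cap on simultaneous connections per server. A minimal sketch under my own assumptions (the class name and the limits are invented, not anything Google has published):

```python
# Toy sketch of "host load": cap how many fetches a crawler has in
# flight against one server at a time. Names and limits are made up.
import threading

class HostLoadLimiter:
    def __init__(self, max_connections):
        # The semaphore blocks the (max_connections + 1)th simultaneous fetch.
        self.sem = threading.Semaphore(max_connections)

    def fetch(self, url, do_fetch):
        with self.sem:
            return do_fetch(url)

# A weak shared box might only tolerate one bot connection at a time,
# while a Facebook-sized site could take hundreds.
small_host = HostLoadLimiter(max_connections=1)
big_host = HostLoadLimiter(max_connections=200)

print(small_host.fetch("/index.html", lambda url: "fetched " + url))
```

With `max_connections=1` the crawler can only pull one page at a time from that host, which is exactly the "very, very low host load" case in the quote.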

Kroy 03-20-2010 02:13 PM

Very informative article, thanks for posting that one!

$5 submissions 03-20-2010 03:43 PM

Quote:

Originally Posted by Kroy (Post 16963427)
Very informative article, thanks for posting that one!

yw, man :)

$5 submissions 03-20-2010 06:12 PM

Check out the newest update at http://www.highrevenue.com

baddog 03-20-2010 06:56 PM

Quote:

Your site could be on a virtual host with a lot of other web sites on the same IP address. In theory, you can run into limits on how hard we will crawl your site. If we can only take two pages from a site at any given time, and we are only crawling over a certain period of time, that can then set some sort of upper bound on how many pages we are able to fetch from that host.

Who needs dedicated IPs?
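If you picture the crawler's connection cap as per IP rather than per site, the shared-hosting squeeze in that quote looks roughly like this. A sketch only: the IPs, site names, and caps are all invented.

```python
# Toy sketch of the virtual-host point in the quote: if the crawler
# limits simultaneous connections per IP, every site on that IP splits
# one budget. All IPs, site names, and caps here are made up.

def per_site_cap(sites_by_ip, connections_per_ip):
    """Divide one per-IP connection cap among the sites sharing that IP."""
    caps = {}
    for ip, sites in sites_by_ip.items():
        for site in sites:
            caps[site] = max(1, connections_per_ip // len(sites))
    return caps

hosting = {
    "203.0.113.7": ["site-a.com", "site-b.com", "site-c.com", "site-d.com"],
    "203.0.113.8": ["dedicated.com"],  # dedicated IP keeps the whole cap
}
print(per_site_cap(hosting, connections_per_ip=8))
```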

$5 submissions 03-20-2010 09:06 PM

Quote:

Originally Posted by baddog (Post 16963962)
Who needs dedicated IPs?

Yep, that quote would make for great sales copy for a web host.

madawgz 03-21-2010 01:45 AM

Found this on Matt's page.

Interesting.


rowan 03-21-2010 01:52 AM

So how can the crawler of an unrelated third party determine "host load?" Back off once the response time hits 20 seconds? :Oh crap :helpme

rowan 03-21-2010 02:20 AM

I can't wait until Googlebot recognises the "X-Chill: Back the fuck off dude" header. :Graucho

TheDoc 03-21-2010 02:31 AM

Nice article....

I always took page speed/load to mean how fast everything works together: the server, web server, code, and the overall connection speed to the site. If the site is slow, the bot isn't going to be able to connect as much as it wants or needs to based on the incoming links and the size of the site.

So if you want a Facebook, besides creating something popular, you need the infrastructure in place to handle your own growth and Google's bots pounding your ass too if you want love from them.

$5 submissions 03-21-2010 06:39 AM

Good news for hosting companies :D

$5 submissions 03-21-2010 02:06 PM

Quote:

Originally Posted by rowan (Post 16964398)
I can't wait until Googlebot recognises the "X-Chill: Back the fuck off dude" header. :Graucho

:1orglaugh:thumbsup

gaffg 03-21-2010 02:30 PM

good read ty

fatfoo 03-21-2010 04:09 PM

The article talks about Google's SafeSearch, so I tried it myself to see how it works. It does filter the search results.

I turned the filtering to strict.

Here are the topics I got as results for search keyword "sex":

1) Sex Dolls
2) Sexworkers Rights = Human Rights


I also got this image as a search result from Google Images (with strict SafeSearch being used):

http://www.redstategraffix.com/Zawahiri_Sex_Change.jpg

rowan 03-21-2010 06:14 PM

Maybe it didn't recognise boobies in B&W

