Quote:
Originally Posted by AdultKing
Majestic & Ahrefs
They both index the entire web. Their results usually agree to within about a 5% tolerance, meaning one may deviate from the other by around 5% on average. Used together, you get a pretty complete and accurate picture.
Availability of scalable, powerful computing and network resources is better now than it was in 2008. It's possible to index the entire web quite cheaply (relatively speaking) using off-the-shelf compute resources from Amazon and other providers, feed that data to microservices that are scalable and cheap, and then analyse the data for pennies.
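As a toy illustration of what the quoted post describes: crawl data becomes a link graph you can analyse, and two independent indexes of the same links can be compared for deviation. Everything below is hypothetical sample data, not the real Majestic or Ahrefs APIs:

```python
# Toy sketch: invert a small link graph to count backlinks, then measure
# how much two hypothetical index snapshots deviate from each other.
from collections import Counter

# Pretend these pages were discovered by a crawl: page -> outbound links.
WEB = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
    "d.com": ["c.com", "b.com"],
}

def backlink_counts(web):
    """Invert the link graph: target page -> number of referring pages."""
    counts = Counter()
    for source, targets in web.items():
        for target in set(targets):
            counts[target] += 1
    return counts

def deviation(index_a, index_b):
    """Fraction of links seen by one index but not the other."""
    union = index_a | index_b
    diff = index_a ^ index_b  # symmetric difference
    return len(diff) / len(union) if union else 0.0

counts = backlink_counts(WEB)
print(counts.most_common())  # c.com has the most backlinks

# Two made-up index snapshots of the same site, as (source, target) edges.
index_a = {("a.com", "c.com"), ("b.com", "c.com"), ("d.com", "c.com")}
index_b = {("a.com", "c.com"), ("b.com", "c.com"), ("e.com", "c.com")}
print(f"deviation: {deviation(index_a, index_b):.0%}")
```

At web scale the same two steps just get sharded across cheap cloud instances, which is why this kind of tool only became affordable once providers like Amazon did.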
It was actually 2008 when I let my networks go and started focusing on PPC... but you're right. I think it was that year that SEOmoz started trying to index portions of the web, and about a year later they launched Open Site Explorer, which at first was really just pulling data from Yahoo and Google. It shouldn't be surprising that a few tools are now indexing significant chunks of the web to analyze link relationships, given cloud services and declining prices. I just hadn't really thought about it. Prior to that, there was only data from search engines, and it was sparse at best. That was when Amazon and others were coming online; we were only barely taking advantage of them, and it was still a little pricey.