Effective batch WHOIS program or script?
I'd like to find a program for looking up large numbers (anywhere from hundreds to thousands) of domain names. I have my own program that does this, but it is purely a command-line *nix program. I would really like to find something that will run server-side or even under Windows. I am not looking for anything super-fancy (I don't care if it supports non-standard TLDs, I don't need a billion fancy options), I just want a reliable "Yes" or "No" on availability.
If you know of a program or script that does effective batch WHOISing, I'd appreciate it if you'd point me toward it. Thanks, SpaceAce |
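For what it's worth, the core check being asked for here is simple: open a TCP connection to the registry's WHOIS server on port 43, send the domain name, and look for the "no match" marker in the reply. Here is a minimal sketch in Perl (it runs under Windows or server-side just as well as on *nix); the server name whois.verisign-grs.com and the "No match for" test apply to .com/.net, and other TLDs use different servers and different wording, so treat those two values as assumptions to adjust:
Code:
#!/usr/bin/perl
# Minimal batch availability check: reads one domain per line on STDIN,
# prints "domain<TAB>AVAILABLE|TAKEN|ERROR" on STDOUT.
use strict;
use warnings;
use IO::Socket::INET;

sub whois_query {
    my ($server, $domain) = @_;
    my $sock = IO::Socket::INET->new(
        PeerAddr => $server,
        PeerPort => 43,
        Proto    => 'tcp',
        Timeout  => 10,
    ) or return undef;
    print $sock "$domain\r\n";
    my $reply = do { local $/; <$sock> };   # slurp the whole response
    close $sock;                            # always close the connection
    return $reply;
}

while (my $domain = <STDIN>) {
    chomp $domain;
    next unless length $domain;
    # Assumes the .com/.net registry server; the "No match for" marker
    # is specific to it and varies by TLD.
    my $reply = whois_query('whois.verisign-grs.com', $domain);
    if    (!defined $reply)             { print "$domain\tERROR\n"; }
    elsif ($reply =~ /^No match for/mi) { print "$domain\tAVAILABLE\n"; }
    else                                { print "$domain\tTAKEN\n"; }
    sleep 1;   # crude rate limit so the registry doesn't cut you off
}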
analogx.com has one called whois ultra.. i've been using it forever.
will do single/lists/wildcards/etc.. |
Anyone who knows of another one, please post it anyway. Thanks, SpaceAce |
SpaceAce, I wrote a Windows client that does it based on a thread YOU wrote a couple of months ago.
It goes through any number of search engines, reads through a list of search terms, extracts all the matching domains on the top 5 or so pages, then runs a WHOIS on 'em and dumps the output into a text file or DB. I could polish it up for you, or just the WHOIS portion if you want. How much are you looking to pay? Graucho |
Get the analog-x one, it's fast as hell and free.
WG |
The only part I need is the WHOIS. I still use my search engine leeching script once in a while, and I actually wrote an advanced Perl module for spidering. I did the entire adult tree on a certain directory site using my new module. The problem is, I want something that doesn't run from a Linux command line. SpaceAce |
AnalogX does a nice job; be sure you use the Ultra Scan option. |
I am bumping this to see if there are any others. AnalogX is a decent program, but it seems to have some sort of problem that causes it to gradually bog down while running large projects. By the time it's checked a few thousand, it is only doing one search every few seconds, as opposed to completing several searches per second as it does in the beginning. It really bites the big one and crashes with a memory error around 10,000-12,000.
Nasty, I think I used DNA before. I will go download it again. Thanks for the link. SpaceAce |
1) It is pretty damn slow. I wish it had an "UltraSearch" mode of operation.
2) It won't let me import a list of words into the "generate domains" panel. It wants me to type them in one at a time, which is not going to fly. The padding feature is nice, but useless if I have to type in all the main keywords by hand.
It's almost what I want... SpaceAce |
Ever considered trying to split your input file into smaller batches for analog-x? I never tried batches that large, but you could just split it, couldn't you?
WG |
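A rough sketch of that splitting idea in Perl, in case anyone wants it: it just copies an input list into numbered chunk files of a fixed size. The 2,000-line default here is an arbitrary guess at a batch AnalogX can finish before it starts to bog down, and the ".partNNN" naming is made up for the example:
Code:
#!/usr/bin/perl
# Split a big domain list (one per line) into numbered chunk files.
use strict;
use warnings;

my ($infile, $chunk_size) = @ARGV;
die "usage: split_list.pl <file> [chunk_size]\n" unless $infile;
$chunk_size ||= 2000;                     # assumed "safe" batch size

open my $in, '<', $infile or die "Can't read $infile: $!";
my ($count, $chunk, $out) = (0, 0, undef);

while (my $line = <$in>) {
    if ($count % $chunk_size == 0) {      # start a new chunk file
        close $out if $out;
        $chunk++;
        open $out, '>', sprintf("%s.part%03d", $infile, $chunk)
            or die "Can't write chunk $chunk: $!";
    }
    print $out $line;
    $count++;
}
close $out if $out;
close $in;
print "Wrote $count lines across $chunk files\n";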
So, as you see, even using a small list of 200 words and a small list of words for padding purposes, it adds up very quickly. I would have to split my word lists up into chunks of about 5 words each, which is very time-consuming. I like the way AnalogX works, but that bog-down problem is terrible. It's probably something silly, too, like not closing down a thread after a connection, or a massively inefficient listbox control. SpaceAce |
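To put numbers on "it adds up very quickly": every keyword gets combined with every padding word, in front and behind, for every TLD, so 200 keywords x 30 padding words x 3 TLDs x 2 positions is already 36,000 domains. A rough illustration of that generation in Perl (the word lists here are made-up placeholders, not anyone's actual lists):
Code:
#!/usr/bin/perl
# Illustrates why keyword x padding lists explode: every keyword is
# combined with every padding word, as prefix and suffix, per TLD.
use strict;
use warnings;

my @keywords = qw(poker casino travel loans hosting);   # imagine ~200 of these
my @padding  = qw(best cheap free online top);           # and ~30 of these
my @tlds     = qw(com net org);

my $total = 0;
for my $kw (@keywords) {
    for my $pad (@padding) {
        for my $tld (@tlds) {
            print "$pad$kw.$tld\n";   # padding word in front
            print "$kw$pad.$tld\n";   # padding word behind
            $total += 2;
        }
    }
}
warn sprintf "%d keywords x %d padding words x %d TLDs x 2 positions = %d domains\n",
     scalar @keywords, scalar @padding, scalar @tlds, $total;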