SSH experts
What wget (or other) command line should I use to retrieve just one web page, without following links to other pages on the same domain (i.e. no recursion depth)?

curl? *shrug*

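A minimal curl sketch (URL and filename are placeholders; curl only fetches the one URL it is given, so no extra flags are needed to avoid recursion):

# save the response body to page.html
curl -o page.html http://single.page/a.html
# or follow redirects, if any, and redirect stdout yourself
curl -L http://single.page/a.html > page.html
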
fetch and wget
Most servers have one or the other installed... if not, get it installed.

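Assuming the poster means the BSD fetch(1) tool, a sketch with the same placeholder URL (fetch also grabs exactly one URL by default):

# saves to ./a.html, named after the last path component
fetch http://single.page/a.html
# or name the output file explicitly
fetch -o page.html http://single.page/a.html
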
wget http://single.page/a.html
or GET (from the libwww-perl Perl module) http://single.page/a.html
For mirroring: w3mir, or wget in a loop.

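GET prints the body to stdout, so redirect it to a file yourself; and the "wget in a loop" idea is just a shell loop over a URL list (urls.txt is hypothetical):

# single page via libwww-perl's GET
GET http://single.page/a.html > a.html
# fetch a list of individual pages, one wget call each, no recursion
while read -r url; do wget "$url"; done < urls.txt
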
wget is easiest.

wget gets you the HTML. You could also try lynx, downloading to a file with something like 'lynx -source URL > filename.ext'. Check 'man lynx' or 'man wget' for all the options.

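A quick sketch of lynx's two non-interactive modes (placeholder URL):

# raw HTML, equivalent to what wget saves
lynx -source http://single.page/a.html > a.html
# rendered plain text, with a numbered list of the page's links
lynx -dump http://single.page/a.html > a.txt
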
wget or lynx should work well.
[user@host ~]# wget http://primeoutsourcing.com/genstaff.html
I just tested it and it works :pimp

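To sum up the thread: wget's default behavior already answers the question, since it only recurses into linked pages when given -r. A sketch with a placeholder URL (-p and -k are optional extras):

# depth zero by default: fetches exactly this one URL
wget http://single.page/a.html
# optionally also grab the page's images/CSS and rewrite links for offline viewing
wget -p -k http://single.page/a.html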