Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.




 
Old 02-02-2006, 02:33 PM   #1
Rhesus
Confirmed User
 
Join Date: Aug 2004
Posts: 2,009
SSH experts

What wget (or other) command line do I use to retrieve only one web page, and not the other pages on the same domain that it links to (i.e. no depth)?
Old 02-02-2006, 02:35 PM   #2
psili
Confirmed User
 
Join Date: Apr 2003
Location: Loveland, CO
Posts: 5,526
curl ?

*shrug*
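For the single-page case, a curl sketch (the flags are standard curl options; the wrapper name is made up here):

```shell
# curl never follows links, so a bare fetch is already "depth zero".
# -s silences the progress bar, -o names the output file.
fetch_curl() { curl -s -o "$2" "$1"; }

# usage: fetch_curl http://example.com/page.html page.html
```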
__________________
Your post count means nothing.
Old 02-02-2006, 03:43 PM   #3
willysbirthday
Confirmed User
 
Join Date: Feb 2004
Location: balmer
Posts: 436
fetch and wget

most servers have one or the other enabled.. if not, get that shit installed
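A quick way to see which one a box actually has (plain POSIX sh; the candidate list is just the tools named in this thread):

```shell
# Print the first fetcher found on PATH, in order of preference.
find_fetcher() {
  for tool in wget fetch curl; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool"
      return 0
    fi
  done
  echo "none" >&2
  return 1
}
```

`find_fetcher` prints e.g. `wget` if it is installed, else falls through to the next candidate.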
__________________
oOooooooOOo
Old 02-02-2006, 03:56 PM   #4
x3guide
Confirmed User
 
Join Date: Dec 2001
Location: lake titicaca
Posts: 735
wget http://single.page/a.html

or

GET (perl module) http://single.page/a.html

for mirroring: w3mir or wget in a loop
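Both one-liners above are depth-zero fetches already; a small sketch wrapping them (GET is the lwp-request alias that ships with Perl's LWP; the wrapper names are made up):

```shell
# wget only recurses if you pass -r, so a bare call grabs just the one URL;
# -O names the saved file instead of deriving it from the URL.
grab_wget() { wget -O "$2" "$1"; }

# GET prints the body to stdout, so redirect it into a file yourself.
grab_lwp() { GET "$1" > "$2"; }

# usage:
#   grab_wget http://single.page/a.html a.html
#   grab_lwp  http://single.page/a.html a.html
```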
__________________
Shomer fuckin shabbas
Old 02-02-2006, 04:34 PM   #5
everestcash
Confirmed User
 
Join Date: Apr 2002
Posts: 2,194
wget is easiest
Old 02-02-2006, 05:12 PM   #6
keyDet79
Confirmed User
 
Join Date: Feb 2003
Location: Netherlands
Posts: 1,109
Wget gets you the html. You could also try lynx: download to a file with something like 'lynx -source URL > filename.ext' (plain 'lynx URL' just opens the interactive browser). Check 'man lynx' or 'man wget' for all the options.
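A sketch of the two lynx modes (standard lynx flags; the wrapper names are made up):

```shell
# -source writes the raw HTML; -dump writes the page as rendered text
# (with a numbered list of the links at the bottom).
save_html() { lynx -source "$1" > "$2"; }
save_text() { lynx -dump "$1" > "$2"; }

# usage: save_html http://example.com/ page.html
```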
__________________

Multihomed quality BW for less
ICQ 51034232 - MSN [email protected] - Email keydet(at)vibehosting.com
Old 02-02-2006, 08:05 PM   #7
prime
Confirmed User
 
Join Date: Feb 2005
Location: Manila
Posts: 400
wget or lynx should work well.

[user@host ~]# wget http://primeoutsourcing.com/genstaff.html

i just tested it and it works
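Worth adding: that bare wget stops at the one page because recursion is opt-in (-r). If you also want the images/CSS the page displays, -p ("page requisites") grabs them without following links to other pages; -p, -k, and -O are all standard GNU wget flags, and the wrapper names below are made up:

```shell
# just the HTML, with an explicit output name
fetch_page() { wget -O "$2" "$1"; }

# the page plus its images/stylesheets, with links rewritten to the
# local copies (-k / --convert-links) so it renders offline
fetch_full() { wget -p -k "$1"; }
```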
__________________

| offshore solutions | manual labor | staff leasing | and more!
Dedicated -Motivated-Managed Employees
icq.: 309570461 live chat
©2000-, AI Media Network Inc



Powered by vBulletin
Copyright © 2000- Jelsoft Enterprises Limited.