GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Curl or Wget (https://gfy.com/showthread.php?t=1091865)

Vapid - BANNED FOR LIFE 12-06-2012 06:46 AM

Curl or Wget
 
What do you prefer?

EddyTheDog 12-06-2012 06:48 AM

Curl....

Vapid - BANNED FOR LIFE 12-06-2012 06:49 AM

#1 curl ^

HomerSimpson 12-06-2012 08:07 AM

cURL :thumbsup

fris 12-06-2012 08:25 AM

curl from PHP/Ruby/Python, etc.

use curl and wget from the console
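
Roughly what that split looks like (example.com is just a placeholder): curl writes to stdout by default, so it's easy to capture inside a script, while wget saves straight to a file, which is what you usually want at the console.

Code:

# inside a script: curl's body goes to stdout, easy to capture and parse
title=$(curl -s https://example.com/ | grep -o '<title>.*</title>')
echo "page title tag: $title"

# at the console: wget saves straight to a local file
wget -q https://example.com/index.html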

bns666 12-06-2012 08:56 AM

wget :thumbsup, using it from shell only.

ottopottomouse 12-06-2012 08:57 AM

curl + php

V_RocKs 12-06-2012 09:02 AM

depends on what I am doing...

AndrewX 12-06-2012 09:10 AM

Quote:

Originally Posted by V_RocKs (Post 19354936)
depends on what I am doing...

From the console, wget is easier to use and is available even when curl is not installed (yet).
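
For what it's worth, the "easier from the console" part mostly comes down to defaults (the URL is just a placeholder):

Code:

# wget saves to a local file out of the box
wget http://example.com/archive.tar.gz

# curl dumps to stdout unless told otherwise
curl -O http://example.com/archive.tar.gz                  # -O keeps the remote filename
curl -o archive.tar.gz http://example.com/archive.tar.gz   # -o names the file yourself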

seeandsee 12-06-2012 09:13 AM

Curl, I guess :D

Stann 12-06-2012 09:14 AM

wget from shell

Killswitch 12-06-2012 09:14 AM

Quote:

Originally Posted by fris (Post 19354858)
curl from PHP/Ruby/Python, etc.

use curl and wget from the console

:thumbsup

JamesM 12-06-2012 09:16 AM

For downloading, wget from the shell.

For automation, curl and PHP.

shake 12-06-2012 11:29 AM

As most others have said, curl for programming; wget is a nice program to use on its own from the command line.

idolbucks 12-07-2012 03:46 PM

wget -i list.txt is nice for dl'ing a huge list of things
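
A minimal sketch of that pattern (list.txt and the URLs are made up):

Code:

# list.txt holds one URL per line
cat > list.txt <<'EOF'
http://example.com/files/one.pdf
http://example.com/files/two.pdf
http://example.com/files/three.pdf
EOF

# grab everything in the list, resuming partial files (-c)
# and waiting a second between requests (-w 1)
wget -i list.txt -c -w 1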

moeloubani 12-07-2012 03:51 PM

I'm actually in the early stages of planning something here and I'm thinking of using curl, but I'm still trying to figure out how to do what I need to do with it.

Basically I need to use a login form on one site to log into another site - easy.
Then I need to display what was on that other site on the new site - easy.

The tricky part is I want to change the links on the new site and on the page gathered from the old one, so that when someone clicks on them they stay on the new site instead of navigating out.

Any ideas? Maybe set up some sort of iframe after the initial login? I have access to both sites.
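
The curl part I'm picturing would be something like this rough sketch (the URLs, form field names, and the proxy.php idea are all just placeholders I made up; something on the new site would still need to serve the rewritten page):

Code:

#!/bin/bash

# 1. log in to the old site and keep the session cookie in a jar
#    (the "username"/"password" field names are guesses - check the real form)
curl -s -c cookies.txt -d "username=me" -d "password=secret" \
     http://oldsite.example/login.php > /dev/null

# 2. fetch the members page using that session cookie
curl -s -b cookies.txt http://oldsite.example/members/index.html > page.html

# 3. rewrite the old site's links so clicks stay on the new site
#    (pointing them at a hypothetical proxy.php on the new site)
sed 's|http://oldsite\.example|http://newsite.example/proxy.php?url=http://oldsite.example|g' \
    page.html > rewritten.html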

AutumnBH 12-07-2012 04:00 PM

Same as everyone else, curl + php or wget from the command line...

However, the other day I was scraping an ebook archive. I used curl + PHP to scrape the HTML pages and get the actual PDF URLs, then in the next stage I downloaded all the PDFs using a shell script. I found some hosts would return a 206 (Partial Content) response when using wget, and I ended up having to use curl from the command line to get the goods.
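
In case anyone hits the same thing: 206 is technically Partial Content rather than an error, and the command-line curl fallback can be as simple as something like this (the URL is a placeholder, not the actual archive):

Code:

# grab a PDF, following redirects (-L), keeping the remote filename (-O),
# and retrying transient failures a few times
curl -L -O --retry 3 http://example.com/books/some-ebook.pdf

# if a transfer does get cut short, -C - resumes from where it stopped
curl -L -O -C - http://example.com/books/some-ebook.pdf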

MrCain 12-07-2012 09:42 PM

Wget from the shell. CURL in PHP.

pornsprite 12-09-2012 11:22 AM

wget with Perl and the command line, curl with PHP

fris 12-09-2012 11:25 AM

Here is an example I did recently with wget:

Code:

#!/bin/bash

# grab images from fishki.net blog posts
# usage: ./script.sh "http://fishki.net/comment.php?id=xxxxx"

if [ $# -ne 1 ]
then
        echo "fishki.net url to grab images needed."
        exit 64
fi

if [[ "$1" == http://fishki.net/comment.php?id=* ]]
then
        echo "downloading..."
        # timestamped output directory, e.g. images_2012-12-09_11:25
        dir=$(date "+%Y-%m-%d_%H:%M")
        # recursive, quiet, no directory tree; ignore robots.txt, accept only *.jpg
        # (skipping tn.jpg thumbnails) and span to the ru.fishki.net image host
        wget -r -p -q -nd -e robots=off -P "images_$dir" -A '*.jpg' -R 'tn.jpg' -H -D 'ru.fishki.net' "$1"
        echo -n "images saved: " ; ls "images_$dir"/ | grep -c ".jpg"
else
        echo "invalid url: must contain http://fishki.net/comment.php?id=xxxxx"
        exit 64
fi

;)

mafia_man 12-09-2012 12:10 PM

http://aria2.sourceforge.net/

fris 12-09-2012 12:35 PM

Get the latest 20 popular images from 500px.com:

Code:

# pull the thumbnail URLs from the popular page, swap the size suffix
# (3.jpg -> 4.jpg) and strip the ?t... query string, then save each one
for line in $(wget --quiet -O- http://500px.com/popular \
        | grep "from=popular" \
        | sed -n 's/.*<img src="\([^"]*\)".*/\1/p' \
        | sed 's/3.jpg/4.jpg/' \
        | sed 's/?t.*$//'); do
        wget -O "$RANDOM.jpg" --quiet "$line"
done

VenusBlogger 12-09-2012 05:37 PM

I'm surprised so many people here are programmers...

So where are the people that have never used curl or wget?

Don't you wonder or ask why they use it? Are you not curious at all?

Maybe it's something that could be helpful in your webmaster tasks... but nobody who has never used it jumps into this thread. I'm very surprised... No curiosity? NOBODY?... That's how humans discover things, with curiosity...

I've personally used wget over SSH before, but only a few times.

Miguel T 12-09-2012 06:36 PM

cURL all the way!

CYF 12-09-2012 06:44 PM

wget for command line, curl for scripts and stuff.

edgeprod 12-09-2012 06:52 PM

I don't think I've ever used curl OUTSIDE of a script, or wget INSIDE of one, so I see them as environment-based tools. I've been coding cURL stuff all weekend, so I'd definitely say I use cURL a lot more FREQUENTLY, if that's the question.

The answer of "this or that" is usually "the right tool for the job" .. so what's the job?

sandman! 12-09-2012 06:58 PM

wget.....

