Linux Network Tools


GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.

Log in to a server using POST, then download the desired pages, which are presumably accessible only to authorized users.

Log in to the server. This needs to be done only once.

wget --save-cookies cookies.txt \
    --post-data 'user=foo&password=bar' \
    http://example.com/auth.php

Now grab the page or pages we care about.

wget --load-cookies cookies.txt \
    -p http://example.com/interesting/article.php

Note: If the server is using session cookies to track user authentication, the above will not work because --save-cookies will not save them (and neither will browsers) and the cookies.txt file will be empty. In that case use --keep-session-cookies along with --save-cookies to force saving of session cookies.
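For example, the login step above could be repeated with session cookies preserved (the URL and the form field names are placeholders):

```shell
wget --save-cookies cookies.txt \
    --keep-session-cookies \
    --post-data 'user=foo&password=bar' \
    http://example.com/auth.php
```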

download all the files that are necessary to properly display a given HTML page (-p fetches page requisites, -E saves files with an .html extension, -H spans hosts, -k converts links for local viewing, -K keeps the original files with a .orig suffix)

wget -E -H -k -K -p http://example.com/page.html

mirror site
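A minimal mirroring invocation might look like this (the URL is a placeholder; --mirror turns on recursion and timestamping with infinite depth):

```shell
wget --mirror -p -k http://example.com/
```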

debug wget with -d (and save the log with -o)

run wget with -d to see the HTTP request it sends

$ wget -d --load-cookies cookies.txt --content-disposition


---request begin---
HEAD /show_bug.cgi?id=214397 HTTP/1.0
User-Agent: Wget/1.12 (linux-gnu)
Accept: */*
Connection: Keep-Alive
Cookie: Bugzilla_login=2088; Bugzilla_logincookie=iehkBQOdds

---request end---

then, run wget again, passing the cookies directly in an HTTP header.

$ wget --no-cookies --header "Cookie: Bugzilla_login=2088; Bugzilla_logincookie=iehkBQOdds" \

with -o, wget saves its log output to a file instead of printing it to the terminal.
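Combining the two flags above (placeholder URL), the full debug trace lands in a file for later inspection:

```shell
wget -d -o wget.log http://example.com/
```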


curl auth

curl --user name:pass https://example.com/
curl -u name:pass https://example.com/

curl post data

for post data:

curl --request POST --data 'key1=val1&key2=val2' https://example.com/form

for file upload, use --form for a multipart/form-data upload (--data "fileupload=@filename.txt" would send that literal string as the POST body, not the file):

curl --form "fileupload=@filename.txt" https://example.com/upload

curl and cookies
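A sketch of cookie handling with curl (placeholder URLs): -c writes cookies received from the server into a jar file, and -b sends them back on later requests.

```shell
curl -c cookies.txt https://example.com/login
curl -b cookies.txt https://example.com/protected
```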