App to download an entire website in Ubuntu?

Bes
Soldato · OP
Joined: 18 Oct 2002
Posts: 7,318
Location: Melbourne
Augmented said:
wget.

But do try and play nice; people usually don't appreciate automated tools sucking up masses of bandwidth and resources ;).
Nah, it's for work purposes. I need to download an HTML library from our website as I will be without internet for a few days from Wednesday :)
 
Associate
Joined: 18 Oct 2002
Posts: 2,261
Location: Kidderminster
-r or -m

-r --recursive
Recursive web-suck. According to the protocol of the URL, this can mean two things. Recursive retrieval of a HTTP URL means that Wget will download the URL you want, parse it as an HTML document (if an HTML document it is), and retrieve the files this document is referring to, down to a certain depth (default 5; change it with -l). Wget will create a hierarchy of directories locally, corresponding to the one found on the HTTP server.
This option is ideal for presentations, where slow connections should be bypassed. The results will be especially good if relative links were used, since the pages will then work on the new location without change.
When using this option with an FTP URL, it will retrieve all the data from the given directory and subdirectories, similar to HTTP recursive retrieval.
You should be warned that invoking this option may cause grave overloading of your connection. The load can be minimized by lowering the maximal recursion level (see -l) and/or by lowering the number of retries (see -t).
-m --mirror
Turn on mirroring options. This will set recursion and time-stamping, combining -r and -N.
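
So for your case, something along these lines should do it (just a sketch; the URL is a placeholder, swap in the directory on your own site):

wget --mirror --convert-links --page-requisites --no-parent http://www.example.com/html-library/

--convert-links rewrites the links so the pages work offline, --page-requisites pulls in the CSS and images each page needs, and --no-parent stops it wandering up above the directory you start from.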

More info here:

http://www.cbi.pku.edu.cn/Doc/CS/wget/man.wget.html

It's a great tool :)
 