-r or -m
-r --recursive
Recursive web-suck. According to the protocol of the URL, this can mean two things. Recursive retrieval of a HTTP URL means that Wget will download the URL you want, parse it as an HTML document (if an HTML document it is), and retrieve the files this document is referring to, down to a certain depth (default 5; change it with -l). Wget will create a hierarchy of directories locally, corresponding to the one found on the HTTP server.
This option is well suited for making local copies of pages for presentations, so a slow connection does not get in the way. The results are especially good if the pages use relative links, since they will then work in the new location without changes.
When this option is used with an FTP URL, Wget retrieves all the data from the given directory and its subdirectories, similar to HTTP recursive retrieval.
Be warned that recursive retrieval can place a heavy load on your connection and on the remote server. The load can be reduced by lowering the maximum recursion depth (see -l) and/or the number of retries (see -t).
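As a sketch, a typical recursive invocation might look like the following (http://example.com/docs/ and ftp://example.com/pub/ are placeholder URLs, and the depth and retry values are just illustrations of the -l and -t flags described above):

```shell
# Recursively fetch a site, limiting depth to 2 (default is 5)
# and allowing at most 3 retries per file, to keep the load down.
wget -r -l 2 -t 3 http://example.com/docs/

# The same -r flag works with FTP URLs: fetch a directory
# and all of its subdirectories.
wget -r ftp://example.com/pub/
```

Wget recreates the server's directory hierarchy under the current directory, so the first command would leave the files under example.com/docs/ locally.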
-m --mirror
Turn on mirroring options. This sets recursion and time-stamping, combining -r and -N (in more recent Wget versions, -m is equivalent to -r -N -l inf --no-remove-listing, i.e. it also removes the depth limit and keeps FTP directory listings).
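A minimal mirroring example, again with example.com standing in as a placeholder host:

```shell
# Mirror a site: -m combines -r (recursion) and -N (time-stamping),
# so re-running the same command only fetches files that are newer
# than the local copies.
wget -m http://example.com/

# Roughly the same thing spelled out with the individual flags:
wget -r -N http://example.com/
```

Because of -N, this is the command to put in a cron job when you want to keep a local copy of a site up to date without re-downloading everything each time.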
More info here:
http://www.cbi.pku.edu.cn/Doc/CS/wget/man.wget.html
It's a great tool.
