I want some software that will download a whole website so I can view it offline and browse it as if I were online (i.e. rewrite all the links to point at the local copies).
I want to give it an address such as http://www.bbc.co.uk/sport and have it download everything that falls at that root or below... so it would not start to crawl http://www.bbc.co.uk itself, for instance.
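For reference, this is essentially what wget's --mirror --convert-links --no-parent options do. As a rough sketch of the same idea, here is a minimal path-restricted crawler in Python; the ROOT and OUT_DIR values and the local_path helper are just illustrative, it assumes the third-party requests and beautifulsoup4 packages, and it skips the images, CSS, politeness rules and error handling a real mirroring tool would need.

```python
import os
from collections import deque
from urllib.parse import urljoin, urldefrag

import requests
from bs4 import BeautifulSoup

ROOT = "http://www.bbc.co.uk/sport/"  # illustrative: only URLs at or below this are crawled
OUT_DIR = "mirror"                    # illustrative output directory

def local_path(url):
    # Map a URL under ROOT to a file on disk, e.g. .../sport/football -> mirror/football.html
    rel = url[len(ROOT):].strip("/") or "index"
    if not os.path.splitext(rel)[1]:
        rel += ".html"
    return os.path.join(OUT_DIR, rel)

def crawl():
    seen, queue = set(), deque([ROOT])
    while queue:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        resp = requests.get(url, timeout=10)
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # a real mirror would save images/CSS too; this sketch skips them
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            target, _ = urldefrag(urljoin(url, a["href"]))  # resolve relative links, drop #fragments
            if target.startswith(ROOT):  # the key restriction: never leave the root path
                queue.append(target)
                # rewrite the link so the saved page browses offline
                a["href"] = os.path.relpath(local_path(target),
                                            os.path.dirname(local_path(url)))
        path = local_path(url)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "w", encoding="utf-8") as f:
            f.write(str(soup))

if __name__ == "__main__":
    crawl()
```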