App to get webpage source from cmd-line-fed URL?

Whilst I am fine with shell scripting, I am not sure how to interface with the internet and fetch the source of a webpage from an address, which I can then manipulate in a script.

What I am looking for is an application you can feed an address from the command line and that returns the page source.

Does anyone know of such a program, or another way to do this on Linux? I would happily write the app in VB on Windows, but I do not know C and feel learning it just for this would take quite a bit of time.

Thanks
RB
 
Thanks Chris

wget did exactly what I needed, and I understand lynx can do much the same with its -dump (rendered text) and -source (raw HTML) flags.
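For anyone finding this later, a minimal sketch of both approaches (example.com is just a placeholder URL):

```shell
# Print a page's source to stdout with wget (-q quiet, -O - write to stdout)
wget -q -O - "https://example.com/"

# The lynx equivalents: -source prints the raw HTML, -dump the rendered text
lynx -source "https://example.com/"
lynx -dump "https://example.com/"
```

Either command's output can be piped straight into grep/sed/awk in a script.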

RB
 
Semi-hijack:

What's the difference between curl and wget? Both seem to do the same things. Is there a historical difference? Why choose one over another?
 

As the name suggests, wget provides just 'get' functionality, while curl can move content both ways - download and upload - and is much more advanced in what you can do with the protocols it supports.

I'd say wget is more of an end-user, user-friendly program (it's really just a download manager), while curl is more useful as a developer tool, where it can be scripted.
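A rough sketch of that difference (the URLs and filenames here are placeholders):

```shell
# Download, wget-style: -s silent, -o write the response body to a file
curl -s -o page.html "https://example.com/"

# Upload direction, which wget does not offer: -T sends a local file (FTP/HTTP PUT)
curl -s -T report.txt "ftp://example.com/uploads/"

# curl also speaks protocols beyond HTTP; it even reads file:// URLs
curl -s "file:///etc/hostname"
```

The file:// trick is handy in scripts for exercising a pipeline without touching the network.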
 