Danger's one night programming challenge

Soldato
Joined
15 Nov 2008
Posts
5,060
Location
In the ether
I'm clearly bored, and hence have decided to give myself a little challenge for the evening, and I thought I'd see if anyone wants to join in.

After "finally" unpacking my laser printer, I've always wanted to be able to switch on my computer in the morning (or, even better, have it wake itself from suspend), collate the day's major news stories (plus tech / sports stuff) and print all that info out on no more than two pages of A4 in a nice format with some (but few) images.

Possible in one evening? Probably not. Worth a go? I think so.

Anyone interested - post here with ideas etc. :)

Dstat

EDIT 1: 18:18.
Okay, so I'm going with Perl (scripting language) and curl for now, which is easy enough. I think I'll have some sort of array to store the URLs of the news sites I'm interested in, then cycle through them and grab each site's code with curl. Something like:

my @wanted_sites = ("news.bbc.co.uk".......);
my @wanted_sites_code;

my $counter = 0;
foreach my $site (@wanted_sites) {
    # backticks run curl and capture the page source it prints
    $wanted_sites_code[$counter] = `curl $site`;
    $counter++;
}

then write parsers for each site maybe :confused:
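A per-site parser could start out as something as crude as a regex over the fetched page source. A rough sketch in Python for illustration (the HTML snippet and the pattern are made up; real pages would want a proper HTML parser):

```python
import re

# Made-up sample of fetched page source (stands in for the curl output above).
html = """
<div class="story"><h3><a href="/news/1">Headline one</a></h3></div>
<div class="story"><h3><a href="/news/2">Headline two</a></h3></div>
"""

# Naive per-site "parser": pull the anchor text out of each story block.
headlines = re.findall(r'<a href="[^"]+">([^<]+)</a>', html)
print(headlines)
```

The catch with this approach is that every site needs its own pattern, and the pattern breaks whenever the site redesigns.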
 
Last edited:
Associate
Joined
14 Apr 2008
Posts
1,230
Location
Manchester
Why don't you just follow the RSS feeds and collect the stories that way?

Then you can parse each story in turn for keywords you like and get it to print those.
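RSS makes that much easier than scraping, since every story already comes as a structured title/link pair. A minimal sketch in Python using only the standard library (the feed content below is invented; in practice you'd fetch a real feed such as the BBC's over HTTP):

```python
import xml.etree.ElementTree as ET

# Invented minimal RSS 2.0 feed, standing in for a fetched feed.
rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item><title>Tech story</title><link>http://example.com/a</link></item>
    <item><title>Sports story</title><link>http://example.com/b</link></item>
  </channel>
</rss>"""

root = ET.fromstring(rss)
# Each <item> in the channel is one story: (title, link).
stories = [(i.findtext("title"), i.findtext("link"))
           for i in root.iter("item")]

# Keyword filter: keep only stories whose title mentions a term of interest
# ("Tech" here is just a stand-in keyword).
picked = [s for s in stories if "Tech" in s[0]]
print(picked)
```

From there the picked titles and links are ready to be formatted and sent to the printer.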

This sort of system as a commercial entity is called Media Monitoring...
 
Soldato
Joined
13 Jan 2003
Posts
21,462
With OS X you can clip areas of web pages of interest into Dashboard widgets. If the web page changes, the widget updates automatically.

Hit F12 and you have a complete summary from all your interesting locations.

So you want your own newspaper :D
 