Newznab - config help etc. Not an invites thread. :)

Posted this in the NZBMatrix thread, but figured it would probably be better off in its own thread.

Before we start, this thread is aimed at people running their own Newznab indexes at home. It's not a thread for people looking to go public; it's solely for home indexing folk.

Therefore, and as above, this is not the place to ask for invites to people's indexes. :D

-

Anyway, I'll start. I've got a NN index set up indexing around 40 groups, and I'm starting to backfill each group when I can spare the processing power and disk I/O on the main rig. When that's done, I'll move the VM back onto its permanent and slightly less-powered home. Things have gone well, and the docs out there are fairly helpful.

The problem I have is:

Has anyone managed to get Sickbeard talking to Newznab via HTTPS yet? I keep getting a bunch of auth errors, sadly.

I've noticed a bunch of homespun Newznab servers out there, but they're all HTTP only. I'd rather not back off to HTTP to get the API working, but until Sickbeard is updated to allow auth to Newznab servers I fear this is my only option. :(
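In case it helps anyone else poking at the same thing, this is roughly how I'd test the API outside of Sickbeard first, to separate a certificate problem from an auth problem. The hostname and API key below are placeholders for your own; t=caps is just the standard Newznab capabilities call.

    # Hit the Newznab API directly over HTTPS, ignoring the self-signed cert (-k)
    curl -kv "https://nn.example.home/api?t=caps&apikey=YOUR_API_KEY"

    # Same call, but verifying against your own cert, to see if trust is the issue
    curl -v --cacert /path/to/newznab.crt "https://nn.example.home/api?t=caps&apikey=YOUR_API_KEY"

If the first call returns the caps XML and the second one fails, it's a trust problem rather than an auth problem; if both come back with an error response, the API key or URL is wrong before SSL even enters into it.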

Any takers? Happy to offer advice/assistance to anyone with config issues (apart from the one above. :D)
 
Any MySQL nuggets you can impart, RB? I'd be interested to have a dig around the DB. In particular, I'm after something to purge and delete all groups from the command line instead of having to point and click my way through the lot in the GUI.
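Something along these lines is what I'm picturing; the table and column names are from memory of the newznab schema, so check them against your own DB before running anything like this (it leaves the releases table alone and only clears the unprocessed header data).

    # Deactivate every group and clear out the binary/part data.
    # Adjust the MySQL user and database name to suit your install.
    mysql -u newznab -p newznab -e "
        UPDATE groups SET active = 0;
        TRUNCATE TABLE parts;
        TRUNCATE TABLE binaries;
    "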

Have you trusted the self-signed certificate? I've not tried this, but issues with SSL and self-signed certs generally come down to needing to add the cert to the trusted users / publishers container for the computer account (on Windows at least).

Yep, it's trusted. I'll back it off to HTTP tonight and see if I can get it working. Other than that, I think it's going to be a case of getting the API URL crafted correctly.
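For anyone else trying the same thing, trusting the self-signed cert at the OS level is along these lines (Debian/Ubuntu paths assumed, Windows via certutil); whether the Python build Sickbeard runs under actually consults the system store is a separate question, so this may not be the whole fix.

    # Debian/Ubuntu: add the Newznab cert to the system CA bundle
    sudo cp newznab.crt /usr/local/share/ca-certificates/newznab.crt
    sudo update-ca-certificates

    # Windows (elevated prompt): add it to the machine's trusted root store
    certutil -addstore Root newznab.cer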

Finally, I've started backfilling my groups. I'm doing it in blocks of 30 days, but it's terribly slow. update_binaries used to thrash the DB, CPU and disk I/O, with the php/mysql processes being the top talkers; now I'm lucky if the backfill uses 100k/sec of bandwidth, and it's only making one connection per group to get the headers.

Does this sound about right to you all?
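For what it's worth, this is how I've been keeping an eye on how far back each group has got. Again, the column names are from memory of the groups table, so adjust if yours differ.

    # See how far back each active group's backfill has reached
    mysql -u newznab -p newznab -e "
        SELECT name, backfill_target, first_record_postdate, last_updated
        FROM groups
        WHERE active = 1
        ORDER BY first_record_postdate;
    "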
 
Oh, and another thing: the threaded scripts are very handy, but you've no idea what they're doing. I'm trying to diagnose why my backfill is running at 5kb/sec and I've really not got a lot to go on. :(
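In the meantime I've been falling back on generic tools to see what the threads are actually up to; the interface name and MySQL credentials below are obviously placeholders for your own.

    # Which update/backfill processes are actually running?
    ps aux | grep -E 'backfill|update_binaries'

    # Watch MySQL for the queries the threads are issuing
    # (password inline only so it works under watch)
    watch -n 2 'mysqladmin -u newznab -pYOURPASS processlist'

    # Watch the actual NNTP traffic to the news server (adjust the interface name)
    sudo iftop -i eth0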
 