I ended up doing what I suggested and it worked quite well. The biggest problems I encountered were from SELinux, which I have now come to hate (even though it's the mutt's nuts security-wise), because the easiest solution always seems to be to disable it.
Due to the way BackupPC works, it's quite difficult to just manually extract data: it preserves file structures and supports incremental backups, so it doesn't simply store everything in one massive tar file.
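Rather than digging through the compressed, hard-linked pool by hand, the sane route is to let BackupPC rebuild the backup as a normal tar stream. Here's a minimal sketch of that idea, assuming BackupPC's bundled BackupPC_tarCreate script is installed (the path below is a typical location, not guaranteed) and that it runs as the backuppc user; the host name, backup number and share are placeholders:

```python
#!/usr/bin/env python3
"""Sketch: pull one host's backup out of BackupPC as a plain tar stream.

Assumes BackupPC_tarCreate is at /usr/share/BackupPC/bin/ (distro-dependent)
and that this runs with enough privileges to read the BackupPC pool.
HOST, BACKUP_NUM, SHARE and OUT are hypothetical values for illustration.
"""

import subprocess

HOST = "fileserver01"   # client name as BackupPC knows it
BACKUP_NUM = -1         # -1 = most recent backup
SHARE = "/home"         # share/path as it was backed up
OUT = "/mnt/raid6/fileserver01-home.tar"

# BackupPC_tarCreate walks the full/incremental chain and writes a normal
# tar archive to stdout, which we redirect straight onto the new array.
with open(OUT, "wb") as out:
    subprocess.run(
        [
            "/usr/share/BackupPC/bin/BackupPC_tarCreate",
            "-h", HOST,
            "-n", str(BACKUP_NUM),
            "-s", SHARE,
            ".",        # everything under the share
        ],
        stdout=out,
        check=True,
    )
```

The resulting tar can then be unpacked anywhere with plain `tar -xf`, which is far easier than trying to reassemble files from the pool yourself.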
Extracting the data was actually quite fun; servers so often use only a fraction of their capabilities while you're actually watching them. On this occasion I was uncompressing 300 GB of data off one array onto a RAID 6 set of VelociRaptors, and I was surprised the CPU usage wasn't higher: only one core was used and it peaked at about 60%, yet the whole transfer took less than 10 minutes.
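For rough context, 300 GB in under ten minutes works out to a sustained rate of better than 500 MB/s, which is decent going for a single-threaded decompress-and-write job.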