Yeah, I know; I've searched around the internet and think I already know the answer, but here goes.
Is there such a thing as knowing what bitrate to use when encoding to, say, XviD? I know the bitrate is obviously directly related to the size of the final file, but looking around the internet there are loads of so-called 720p files. These are normally 1280 x 5xx with a bitrate of around 2500-3000 kbps. Now, that bitrate isn't really much higher than the same file at a res of 720 x 3xx.
So how the hell do the guys encoding these files work out what bitrate to use? Is there some sort of utility that would scan the source and give you low and high values, or is it simply a case of taking a best guess?
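The only method I can think of is working backwards from a target file size, the way the old bitrate calculators did. A rough sketch of that arithmetic (all the numbers below are just examples, and the 2% container overhead is a guess):

```python
# Rough "target size" bitrate calculation, the way classic XviD-era
# bitrate calculators worked. Numbers and overhead are illustrative.

def video_bitrate_kbps(target_mb, duration_s, audio_kbps=128.0, overhead=0.02):
    """Video bitrate (kbps) needed to hit a target file size.

    target_mb  -- desired file size in megabytes (binary MB)
    duration_s -- running time in seconds
    audio_kbps -- audio track bitrate in kbps
    overhead   -- fraction of the file eaten by container overhead (assumed)
    """
    total_kbits = target_mb * 8192 * (1.0 - overhead)   # 1 MB = 8192 kbit
    video_kbits = total_kbits - audio_kbps * duration_s  # subtract the audio
    return video_kbits / duration_s

# e.g. a 700 MB file for a 100-minute movie with 128 kbps MP3 audio
print(round(video_bitrate_kbps(700, 100 * 60, 128)))  # ~809 kbps
```

But that only tells you what bitrate fits a given size; it says nothing about whether that bitrate is actually enough for the resolution and content, which is what I'm really asking about.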