I have lost count of how many AVI files I have converted to DVD, yet I never really paid much attention to one aspect of the encoding process: the bitrate.
I have always just used 2-pass VBR (TMPGEnc) at the maximum bitrate possible, i.e. so that the output would come to around 4200 MB, and I have always been satisfied with the quality of the result.
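For reference, the "maximum bitrate for a 4200 MB target" figure falls out of simple arithmetic. Here's a rough sketch of that calculation (the 2-hour duration and 224 kbps audio bitrate below are just illustrative assumptions, and encoders may count MB slightly differently):

```python
def max_video_kbps(target_mb, duration_s, audio_kbps=224):
    """Rough maximum average video bitrate (kbps) that fits a target file size.

    Assumes 1 MB = 1024 * 1024 bytes; subtracts the audio stream's share.
    Muxing overhead is ignored, so treat the result as an upper bound.
    """
    total_kbits = target_mb * 1024 * 1024 * 8 / 1000  # total bit budget in kbits
    return total_kbits / duration_s - audio_kbps

# e.g. a 2-hour movie targeting ~4200 MB with 224 kbps audio
# comes out at roughly 4600-4700 kbps video, i.e. the "4000-ish" range.
print(round(max_video_kbps(4200, 2 * 3600)))
```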
The question I cannot seem to find definitively answered anywhere is this: if the video bitrate of my AVI is, say, 1000 kbps, is there any advantage to encoding at a bitrate higher than 1000 kbps? I know the source quality will not improve no matter how high the bitrate, but if I could get away with encoding at 1000 kbps instead of the usual 4000-ish, it would save a lot of CPU time. I vaguely remember being told a few years back that the output bitrate should be set at 3-4 times the source bitrate in order to maintain the same quality. Is that right?
Any input appreciated, thanks.