Hey guys,
a quick question:
When I render a 1080p video with Avidemux (AVC codec, constant rate factor 26), I get an overall bitrate of about 11.5 Mbps (according to MediaInfo). When I render the same video in the Vegas 11 Pro trial with the MainConcept AVC encoder, maximum bitrate set to 15,000 kbps and average set to 12,000 kbps, I get a video of horribly worse quality! The overall bitrate is higher, but I see a LOT of blockiness in the video... so many blocks! Bah!
I almost want to render to the Lagarith Lossless Codec in Vegas and then re-render the result with Avidemux, but sadly Avidemux does not support GPU acceleration, which I am a BIG FAN of.
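Just for reference, here is a minimal sketch of what that second pass would look like if it were scripted instead of clicked together in the Avidemux GUI. It assumes ffmpeg with libx264 is installed and that "lagarith_export.avi" is the lossless intermediate rendered out of Vegas; both the file names and the choice of ffmpeg are placeholders for illustration, not the exact workflow:

import subprocess

# Hypothetical second pass: re-encode the lossless Lagarith intermediate
# from Vegas with x264 in constant-rate-factor mode (CRF 26, like in Avidemux).
# File names and the use of ffmpeg/libx264 are assumptions for illustration.
subprocess.run([
    "ffmpeg",
    "-i", "lagarith_export.avi",   # lossless intermediate rendered in Vegas
    "-c:v", "libx264",             # x264 encoder
    "-crf", "26",                  # quality-based rate control instead of a fixed bitrate
    "-preset", "medium",           # speed/compression trade-off
    "-c:a", "aac", "-b:a", "192k", # re-encode the audio so it fits in an MP4
    "final_1080p.mp4",
], check=True)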
Any tips?
Is there a better way to use the x264 codec in Vegas? Maybe a way where I can actually use the great "constant rate factor"? I am all about quality, not about hitting an exact file size, if you know what I mean...
Thank you a lot, guys. This is my first post and I hope I am not making too many mistakes.
Have a nice one,
Mega