How can I encode DVDs using NVIDIA 6800 hardware?

  • leftiszi
    Junior Member
    • Aug 2004
    • 2

    How can I encode DVDs using NVIDIA 6800 hardware?

    Hello guys. As the title suggests, I don't understand how to encode DVDs into DivX, Xvid, etc. using the built-in hardware in the NVIDIA 6800. Can anyone help? Thanks.
  • setarip
    Retired
    • Dec 2001
    • 24955

    #2
    Read the excellent tutorials at:

    Digital Digest DivX Xvid Section - List of recommended, top 10 DivX, Xvid and AVI articles, guides and software, latest news and updates


    • leftiszi
      Junior Member
      • Aug 2004
      • 2

      #3
      Well, the first thing I did before posting here was look at the guides, but I didn't find anything. Could you send me a direct link to the guide where you saw info on how to use NVIDIA's hardware to do the encoding? Thanks.

      P.S. Strangely enough, Google doesn't return any useful links on NVIDIA 6800 DVD encoding.


      • BoF
        Moderator
        • Nov 2001
        • 954

        #4
        Re: How can I encode DVDs using NVIDIA 6800 hardware?

        Originally posted by leftiszi
        Hello guys. As the title suggests, I don't understand how to encode DVDs into DivX, Xvid, etc. using the built-in hardware in the NVIDIA 6800. Can anyone help? Thanks.
        The decoding of the MPEG-2 streams (DVD) in the encoding process (DVD -> Xvid/DivX) could use the video card's hardware capabilities, but I don't know of any program like VirtualDub that uses such acceleration.
        [www.scandiumrecords.com][Logan dataspirit]
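
        For what it's worth, the pipeline BoF describes (hardware-decode the MPEG-2, software-encode to Xvid) did eventually become possible, just not on the 6800. As a purely illustrative sketch, assuming a modern ffmpeg build with NVDEC/CUDA support and a placeholder VOB filename:

```shell
# Anachronistic sketch, NOT possible on a GeForce 6800: modern NVIDIA GPUs
# expose their video decoder (NVDEC) to applications, so ffmpeg can
# hardware-decode the DVD's MPEG-2 stream while the Xvid encode still
# runs in software on the CPU.
# "VTS_01_1.VOB" is a hypothetical input filename from a ripped DVD.
ffmpeg -hwaccel cuda \
       -i VTS_01_1.VOB \
       -c:v libxvid -qscale:v 4 \
       -c:a libmp3lame -b:a 128k \
       movie.avi
```

        The point of the example is exactly the split the thread is discussing: only the decode stage touches the GPU; the DivX/Xvid encode itself stays in software.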


        • ziadost
          Super Moderator
          • Mar 2004
          • 5525

          #5
          As far as I know, a graphics card's decoder is only used for video playback, or for output to another device, such as to a TV through S-Video.
          "What were the things in Gremlins called?" - Karl Pilkington


          • BoF
            Moderator
            • Nov 2001
            • 954

            #6
            Originally posted by ziadost
            As far as I know, a graphics card's decoder is only used for video playback, or for output to another device, such as to a TV through S-Video.
            That would mean programmers can't get uncompressed frames from the hardware decoder's output and have to use a software decoder...
            [www.scandiumrecords.com][Logan dataspirit]

