How good is the W-VHS analog format

  • Carlos Menem
    Platinum Member
    • Feb 2002
    • 164

    How good is the W-VHS analog format

    I read their runtime is higher, but I would also like to know about other aspects. For example, I heard they can record in 480i and 1080i resolutions... does that mean W-VHS can equal or even surpass DVD quality? In terms of price I would imagine DVD is the way to go, but I just bought an AG-1980 to transfer all my tapes without jitter and I noticed it can record in W-VHS, so is that better than recording directly on my PC with a card like a PCTV, for example?
    I like Coca-Cola with rum
  • RFBurns
    To Infinity And Byond
    • May 2006
    • 499

    #2
    That machine records a digital signal to the tape instead of an analog one, which gives it the ability to record those high resolutions. It can match standard DVD quality.

    And yes, it would be better than capturing to a PC via a capture card, unless you set your capture to no compression.
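
    For a sense of what "no compression" actually means for a capture card, here is a rough back-of-the-envelope sketch (the 720x480, 8-bit 4:2:2 figures are an assumption about a typical SD capture, not something stated above):

    ```python
    # Uncompressed SD capture data rate (assumed 720x480, 29.97 fps, 8-bit YUV 4:2:2)
    width, height = 720, 480
    fps = 29.97
    bytes_per_pixel = 2            # 4:2:2 averages 2 bytes per pixel

    rate = width * height * bytes_per_pixel * fps    # bytes per second
    print(rate / 1e6, "MB/s")                        # ~20.7 MB/s
    print(rate * 3600 / 1e9, "GB per hour of tape")  # ~75 GB/hour
    ```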

    It's not a bad concept as far as recording hi-res pictures goes. The drawback is that it is a tape-based, physical-contact medium: like any other VCR or device that runs tape across a spinning or stationary head assembly, it suffers wear and tear at a considerably higher rate than an optical disc, hard disk, or memory-stick system.

    The big argument is "what exactly is considered HI-DEF". Some would say that 1080I is true HI-DEF, while others would say no, 1080P and up is HI-DEF.

    A true HI-DEF picture will be in progressive scan mode, which is the entire intent of HD television. Replacing the gap in the scan with another scan line gives us more information about the image, which of course means more to see on the screen.

    A picture at 1080I has a gap every other scan line. That means we see less information on the screen. Half of it, to be exact. So if we only get half of the information scanned and displayed, does that mean it is HI-DEF??

    No.

    It might look impressive on a large HD screen. And the monitor/television may incorporate circuits to improve the image, maybe even increase the resolution to a higher number, e.g. 720I to 1080I upscaling, but it is still only half of the information that could be there to begin with.

    The same picture using progressive scanning puts all of the information that can be scanned into the displayed image. The gap is replaced with another scan line, and the horizontal scan rate is doubled. For example, 1080I at 60 fields per second registers a horizontal scan rate of roughly 33.75 kilocycles per second, or about 34 kHz. Its partner, 1080P at 60 full frames per second, registers roughly 67.5 kilocycles per second, or about 67 kHz, exactly twice the line rate. (Kilo = 1000)

    Since we have to replace the unscanned portions of the image with a scan, we must double the scan rate in order to maintain the proper frame rate for the number of lines being scanned per frame. Otherwise we would see "lag" from frame to frame; if the scanning were not doubled, we would miss out on bits and pieces of the information, meaning less resolution, and less resolution is not HD. The image may be 1080 lines in size, but it is not size that defines HD, it is the line rate per frame and the number of frames per second that defines HD.
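
    To put rough numbers on that doubling, here is a small sketch (it assumes the standard 1125-total-line HD raster and 60 Hz field/frame rates, which are not spelled out in the post above):

    ```python
    # Horizontal line rates for 1080i vs 1080p (a sketch, assuming the
    # usual 1125-total-line HD raster: 1080 active lines plus blanking).
    TOTAL_LINES = 1125

    # 1080i at 60 fields/s is 30 complete frames/s
    line_rate_1080i = TOTAL_LINES * 30    # 33,750 lines/s  (~33.75 kHz)

    # 1080p at 60 complete frames/s
    line_rate_1080p = TOTAL_LINES * 60    # 67,500 lines/s  (~67.5 kHz)

    print(line_rate_1080i, line_rate_1080p)
    print(line_rate_1080p / line_rate_1080i)   # 2.0 -- progressive doubles the line rate
    ```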

    So with the technical facts on I vs P, it is quite simple to see that 1080P has more information than 1080I, which translates to the crisp, clear imagery of High Definition.

    Upscalers can take a smaller size, i.e. 480 or 720, and bump it up to 1080. They may even "fill in" the gaps of the interlace and make it progressive. An upscaler taking an interlaced signal and converting it upward in size while also increasing the scan rate must get the information to fill in those gaps from somewhere. It does not magically come up with the missing information; it takes each line of the interlaced field and doubles it to fill in the gap. More circuits compensate and correct for scan lag or delay and produce a new image that "looks" higher resolution, but in fact is not.
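
    As a concrete illustration of that line-doubling trick, here is a minimal sketch (an illustration only, not how any particular upscaler chip actually works): a 540-line field is stretched back to 1080 lines simply by repeating each line, so the frame is full size but carries no new detail.

    ```python
    import numpy as np

    def line_double(field: np.ndarray) -> np.ndarray:
        """Stretch a single interlaced field back to full frame height
        by repeating every line. No missing information is recovered."""
        return np.repeat(field, 2, axis=0)

    field = np.random.randint(0, 256, size=(540, 1920), dtype=np.uint8)  # one 1080i field
    frame = line_double(field)
    print(frame.shape)   # (1080, 1920): full-size frame, still only 540 lines of real detail
    ```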

    This can be seen by comparing a Blu-ray disc with its standard DVD counterpart. The Blu-ray frame carries several times the picture information of the standard DVD frame, and if anyone has seen a side-by-side comparison of a standard DVD and a Blu-ray, you're seeing the difference between true Hi-Def and semi-Hi-Def, as well as seeing how upscaling has its limits.
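
    For a rough sense of that gap, compare the pixel counts per frame (NTSC DVD vs 1080-line Blu-ray; simple arithmetic, not a measurement from the discs themselves):

    ```python
    # Pixels per frame: standard NTSC DVD vs 1080-line Blu-ray
    dvd_pixels = 720 * 480        # 345,600
    bluray_pixels = 1920 * 1080   # 2,073,600
    print(bluray_pixels / dvd_pixels)   # 6.0 -- six times the picture information per frame
    ```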


    Welp I did not mean for this to get long winded. "whew, tired fingers!" Hope this tidbit of info helps.


    Here..I will fix it!

    Sony Digital Video and Still camera CCD imager service

    MCM Video Stabilizer
