Quote:
Originally Posted by woj
How can you measure something as subjective as quality in terms of percentages?
It's not that hard, really; it's just a representation of what you will see visually. There are no "hard numbers" you can really give. You can get into the tech as to WHY there is a visual difference, but it's easier to just give my opinion on it and explain how you can test it for yourself. When you've been dealing with video for as long as I have, it's really not that hard to gauge the difference your eyes can actually see.

To test, all you need is two identical monitors on the same computer (ideally the same video card with two outputs, one VGA and one DVI). Hook them up side by side and set up your video card to play the same video stream on both screens simultaneously. Have a look. You'll clearly see a big difference in quality. Sure, you can't get a 100% accurate representation of it as a percentage or anything like that, but there is a clear difference (and quite a large one at that) between VGA and DVI; I would say at *least* a 25% change in quality. The larger the screens you test with, the easier it is to tell the difference.
For reference, the DVDs I used were Hero and The Fifth Element.
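If you want to script the side-by-side playback part of the test, here's a minimal sketch in Python, assuming mpv is installed and the two monitors show up as separate screens; the screen numbers, the file name, and which connector drives which screen are placeholders for your own setup:

```python
# Minimal sketch: launch one fullscreen mpv instance per display so the same
# video plays simultaneously on both monitors for a side-by-side comparison.
# Assumes mpv is on PATH and screens 0/1 map to the VGA/DVI monitors (adjust).
import subprocess

VIDEO = "hero.mkv"  # placeholder: any film with bright, vibrant colors

procs = [
    subprocess.Popen(["mpv", "--fs", f"--fs-screen={screen}", VIDEO])
    for screen in (0, 1)  # 0 = VGA-connected monitor, 1 = DVI-connected (assumed)
]

# Wait for both players to close before exiting.
for p in procs:
    p.wait()
```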

Use just about any film with bright, vibrant colors and you will instantly see a difference on a 19" LCD monitor or larger. Also, the difference between DVI and VGA is huge, whereas between HDMI and DVI I can't tell the difference at all on 19" LCDs. I'm guessing you'll see a difference with monitors that are at least 21" and with super-high-quality video; I'm just not at that point yet, so I can't really test it accurately.