04-18-2007, 05:36 PM
Confirmed User
Join Date: Feb 2006
Location: Porn Valley
Posts: 937
Quote:
Originally Posted by ucv.karl
If someone tells me they have a 1280x720 at 4 Mbps, I know it's going to look good. Whether it's XviD, DivX, H.264, or WMV is (in my opinion) secondary information. The most important factors for assessing the general perceived quality are the bitrate and the frame size. For example, the movies sold from the Apple store are shit quality, and they use H.264. They look like shit because the bitrate is so low; it's barely DVD-quality bitrate. So when people say 720p = "HD", that's incorrect, or at least ill-defined. The Apple store movies use 720p and H.264 and still look lackluster on an HDTV. The bitrate and the frame size are the essential pieces of information when discussing the perceived quality of video.
Also, implying that 'QuickTime HD' (i.e., H.264) is better than WMV just because it's 'HD' isn't accurate. The H.264 codec is used on Blu-ray discs (hence it gets labeled 'HD') and it's a great form of compression. However, that doesn't mean H.264 is going to look better than WMV. At the same bitrate and frame size, WMV always seems to look better. H.264 is also a bit heavy on the processing side (i.e., the 'average' computer would choke on a high-bitrate H.264 stream but be fine with the WMV).
That's what I am talking about!!!
Terrific HD content 
Glad you like!
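For anyone curious, the bitrate-vs-frame-size argument in the quote can be sanity-checked with a quick bits-per-pixel calculation, a common rule of thumb for judging whether a bitrate is adequate for a given resolution. The 24 fps frame rate and the ~1.5 Mbps "low" figure below are illustrative assumptions, not measured numbers from any store.

```python
# Rough bits-per-pixel (bpp) comparison: average bits the encoder can
# spend per pixel per frame. Higher bpp generally means fewer visible
# compression artifacts, regardless of which codec is used.

def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# 1280x720 at 4 Mbps, 24 fps -- the "looks good" example from the quote
good = bits_per_pixel(4_000_000, 1280, 720, 24)

# Same 720p frame size at an assumed DVD-like ~1.5 Mbps
low = bits_per_pixel(1_500_000, 1280, 720, 24)

print(f"720p @ 4 Mbps:   {good:.3f} bpp")
print(f"720p @ 1.5 Mbps: {low:.3f} bpp")
```

Same frame size, same codec family, but less than half the bits per pixel in the second case, which is why "720p" alone says little about perceived quality.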