• Welcome to Milwaukee HDTV User Group.
 


TMJ4 Full HD?

Started by trev57, Wednesday Sep 08, 2010, 03:30:39 PM


trev57

Did I really just see a commercial advertising TMJ4 as "full high definition"? You have got to be kidding me. Do they think we're that stupid or do they not know what FULL HD is? C'mon man!

REVM1M

Well, that stupid Scott Steele claims his 7-day forecast is 3D HD. What do you expect from TMJ4?

WITI6fan

I think they're using "Full HD" to refer to 1080i as opposed to WITI's 720p. And the 3D forecast thing is referencing the "Live on every level" feature of the weather graphics computers. Basically, you can make elements appear in front of the talent that can then move behind them. Really, I don't think it's something they should bother promoting, because nobody cares that it's being generated in "real time" besides people who know about the system. In my opinion it's about as pointless as promoting what kind of switcher or character generator the station has.

In related news, I know WTMJ recently took delivery of at least one new live unit (a Sprinter model van), and could have sworn I've seen a couple HD live shots in the past few weeks, so maybe they're finally upgrading their remotes to HD too? They should be embarrassed that they're still doing 16:9 SD lives more than a year after their HD launch when the competitor has been doing HD lives since their HD debut.

SRW1000

Quote from: WITI6fan;56183I think they're using "Full HD" to refer to 1080i as opposed to WITI's 720p.
"Full HD" was a term adopted by manufacturers to indicate equipment that could handle a 1080p signal.  1080i is no more "Full HD" than is 720p, so it's a rather disingenuous claim on their part.

Quote from: WITI6fan;56183And the 3D forecast thing is referencing the "Live on every level" feature of the weather graphics computers. Basically you can make elements appear in front of the talent that can then move behind them. Really, I don't think it's something that they should bother promoting because nobody cares that it's being generated in "real time" besides people who know about the system. In my opinion it's about as pointless as promoting what kind of switcher or character generator the station has.
Agreed.  I keep picturing some uninformed viewers quickly grabbing their old red/blue glasses just to see the 3D graphics swirling around Scott Steele.

Quote from: WITI6fan;56183They should be embarrassed that they're still doing 16:9 SD lives more than a year after their HD launch when the competitor has been doing HD lives since their HD debut.
Agreed.  They should also feel ashamed leaving the HD bug on the screen during those obvious SD shots.

Scott

Xizer

Quote from: SRW1000;56185"Full HD" was a term adopted by manufacturers to indicate equipment that could handle a 1080p signal.  1080i is no more "Full HD" than is 720p, so it's a rather disingenuous claim on their part.

"Full HD" is a marketing buzzword; it has no set meaning. In my opinion, 1080 lines of resolution = full HD.

Jimboy

Really....what is "Full HD"?

http://www.nhk.or.jp/digital/en/superhivision/

SRW1000

Quote from: Xizer;56186"Full HD" is a marketing buzzword; it has no set meaning. In my opinion, 1080 lines of resolution = full HD.
1080i and 720p are roughly equivalent.  And while Full HD is a marketing buzzword, it does have a generally accepted meaning.

What Channel 4 is showing wouldn't qualify, and is disingenuous.

Scott

SRW1000

Quote from: Jimboy;56187Really....what is "Full HD"?

http://www.nhk.or.jp/digital/en/superhivision/
Now that would be nice.  Heck, I'll just settle for pure HD, unmarred by sub-channels at this point.  Half the stations in the market are abandoning  the ideal of OTA HD.

Scott

Xizer

Quote from: SRW1000;561941080i and 720p are roughly equivalent.  And while Full HD is a marketing buzzword, it does have a generally accepted meaning.

What Channel 4 is showing wouldn't qualify, and is disingenuous.

Scott

Uhh, what are you smoking? 1080i is as significant an improvement over 720p as 720p is over 480p. It all depends on the bitrate. Have you seen the 720p channels? Even the worst 1080i channels look better.

SRW1000

Quote from: Xizer;56196Uhh, what are you smoking? 1080i is as significant an improvement over 720p as 720p is over 480p. It all depends on the bitrate. Have you seen the 720p channels? Even the worst 1080i channels look better.
That's simply not true, and is dependent on content.  Many prefer 720p for sports over 1080i, for example, due to the smoother picture and lack of interlacing artifacts.  Considering temporal resolution, 1080i and 720p are almost equal.

1080p is an improvement over either 1080i or 720p, but low bitrates can make any format look bad.

Scott

Xizer

640x480 = 307,200 pixels
1280x720 = 921,600 pixels
1920x1080 = 2,073,600 pixels

720p is a 3x resolution increase over standard definition. 1080i/p is a 6x resolution increase over standard definition. It's over twice the resolution of 720p.
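For anyone who wants to sanity-check the arithmetic, here's a quick Python sketch of those per-frame pixel counts (the format labels are just for illustration):

```python
# Raw pixel counts per frame for SD, 720p, and 1080-line formats.
formats = {
    "480p (SD)": (640, 480),
    "720p": (1280, 720),
    "1080i/p": (1920, 1080),
}

sd_pixels = 640 * 480
for name, (w, h) in formats.items():
    count = w * h
    print(f"{name}: {count:,} pixels ({count / sd_pixels:.2f}x SD)")
```

Strictly speaking, 1920x1080 works out to 6.75x SD and 2.25x 720p per frame, which is where the "6x" and "over twice" figures come from.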

The majority of content looks significantly better at 1080i resolution than it does at 720p. Only bitrate-starved fast-action sequences look bad, but even then a good de-interlacer should still make it a bearable experience.

Milwaukee's NBC is an example of poor 1080i. Their bitrate-starved signal is 12.5 Mbps, and they must have a really shitty encoder, because although Chicago's NBC affiliate has a bitrate only 500 Kbps to 1 Mbps higher than Milwaukee's, its transmission of identical programming looks a lot clearer and less "blurry" than Milwaukee's. It's absolutely noticeable when switching back and forth between 4-1 and 5-1, and that is why I always opt for WMAQ over WTMJ whenever they're showing the same program.

Milwaukee's CBS is another example of poor 1080i; its bitrate is 9 Mbps. It's horrible. It's the lowest-bitrate HD channel in the Milwaukee or Chicago market (Live Well HD doesn't count).

The only station in the Milwaukee market that presents a good example of why 1080i is a massive improvement over 720p is the CW affiliate, which looks amazing. WVTV transmits at 17 Mbps. It looks even better than Chicago's best looking HD channel, WGN-TV, which is also a CW affiliate. The CW seems to have their act together when it comes to high definition.

So at the end of the day, whether 1080i makes a big difference comes down to the equipment the affiliates use and the bitrates they broadcast at.
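One way to put those bitrates on a common scale is bits per delivered pixel. A rough sketch, assuming each station delivers 1080i (1920x1080, 30 frames per second) and using the bitrates quoted above (WDJT is Milwaukee's CBS):

```python
# Bits per pixel at the bitrates mentioned in the thread, assuming 1080i
# delivers a full 1920x1080 picture 30 times per second.
pixels_per_sec = 1920 * 1080 * 30  # 62,208,000

stations = {"WVTV": 17e6, "WTMJ": 12.5e6, "WDJT": 9e6}
for call, bitrate in stations.items():
    print(f"{call}: {bitrate / pixels_per_sec:.3f} bits/pixel")
```

By that crude measure WVTV gets nearly twice the bits per pixel that WDJT does, which lines up with the visible difference described above.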

SRW1000

Quote from: Xizer;56204640x480 = 307,200 pixels
1280x720 = 921,600 pixels
1920x1080 = 2,073,600 pixels
You're dismissing resolution over time.  More accurately:

1280x720 = 921,600 pixels @ 60 frames per second = 55,296,000 pixels per second
1920x540 = 1,036,800 pixels @ 60 fields per second = 62,208,000 pixels per second
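Those per-second figures can be reproduced in a couple of lines, counting 720p as 60 full progressive frames and 1080i as 60 half-height (540-line) fields:

```python
# Spatial pixels delivered per second for each broadcast format.
p720 = 1280 * 720 * 60    # 60 progressive frames per second
i1080 = 1920 * 540 * 60   # 60 interlaced fields, each 540 lines tall

print(f"720p:  {p720:,} pixels/s")   # 55,296,000
print(f"1080i: {i1080:,} pixels/s")  # 62,208,000
print(f"ratio: {i1080 / p720:.3f}")  # 1.125
```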

Quote from: Xizer;56204720p is a 3x resolution increase over standard definition. 1080i/p is a 6x resolution increase over standard definition. It's over twice the resolution of 720p.
That's true, for static content.  As soon as the picture starts moving, it's no longer accurate.

Quote from: Xizer;56204The majority of content looks significantly better at 1080i resolution than it does at 720p. Only bitrate-starved fast-action sequences look bad, but even then a good de-interlacer should still make it a bearable experience.
Interlacing can help mitigate the problems, but 720p content can offer a significantly smoother experience, especially for sports.

Here's a good summary article that explains the differences.  

A more in-depth article can be found at this link, which discusses why the Department of Defense chose to go with 720p instead of 1080i.  Here are a few select quotes from their FAQs:
Quote: FAQ #1: "Does the 1080i format provide better picture resolution in which 1080 lines are scanned, which provides for two million pixels - double the number offered by the 720p format?"

The 1080i format does not provide better picture resolution, nor does it double the resolution of the 720p format. Of the two recognized commercial technology formats for high definition television (HDTV), 720p is the HDTV format that yields the best quality images with the fewest image artifacts. Progressive scan formats compress more efficiently thereby providing a higher quality image to the end user.

Furthermore, it is fundamentally the quality of the pixels that matter, not the simple raw pixel count. The process of interlace scan image interleaving introduces significant artifacts (clearly visible damage to the image). In the interlace picture illustration above you can see the damage that is done to any part of the image that has any motion. For critical DoD imagery functions, such artifacts are clearly discernable and undesirable. 720 progressive scan has virtually no distortion artifacts, whereas 540/60i has distortion artifacts throughout the image caused by the interlacing scan lines. These interlace distortion artifacts show up in object motion, but also cause still image distortion artifacts on imagery details like herringbone or striped patterns. These interlace distortion artifacts obscure important data and can give the appearance of false data. Lastly, what is called 1080 interlace does not appear to the human eye as 1080 lines, but rather as something more like 700 lines. This known phenomenon is the so-called "Kell Factor," which acknowledges degradation caused by receiver interline flicker and motion during the scanning process. Therefore, 720p is the optimum choice from both a temporal quality and distortion free spatial quality point of view.

FAQ #5: "How do I know I'm making the right decision in choosing a progressive format ?.. what about the standards ?"

Today 720p delivers the best image quality at the lowest practical bandwidth, with the fewest image artifacts, and thus delivers the best "bang for the buck" to the American taxpayer. When 1080p technology is commercially available and stable, DoD plans to actively pursue this imaging format.

The other factor that comes into play is screen size and seating distance.  For the human eye to start seeing the additional 1080p/i details on a 50" set, for example, one would have to sit less than 10' away from the screen.
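That distance figure is consistent with the common one-arcminute visual acuity rule of thumb. A rough sketch, with the caveat that the 1-arcminute threshold is a simplification rather than an exact model of human vision:

```python
import math

def full_detail_distance_ft(diagonal_in, h_pixels=1920, aspect=(16, 9)):
    """Distance at which one pixel subtends 1 arcminute (~limit of acuity)."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_pitch_in = width_in / h_pixels            # size of one pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12  # inches -> feet

# For a 50" 1080p set, full 1080-line detail resolves at about 6.5 feet.
print(f"{full_detail_distance_ft(50):.1f} ft")
```

Sit farther back than that and the eye can no longer resolve individual 1080-line pixels, which is why the extra detail goes unseen at typical living-room distances.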

Quote from: Xizer;56204Milwaukee's NBC is an example of poor 1080i. Their bitrate-starved signal is 12.5 Mbps, and they must have a really shitty encoder, because although Chicago's NBC affiliate has a bitrate only 500 Kbps to 1 Mbps higher than Milwaukee's, its transmission of identical programming looks a lot clearer and less "blurry" than Milwaukee's. It's absolutely noticeable when switching back and forth between 4-1 and 5-1, and that is why I always opt for WMAQ over WTMJ whenever they're showing the same program.

Milwaukee's CBS is another example of poor 1080i; its bitrate is 9 Mbps. It's horrible. It's the lowest-bitrate HD channel in the Milwaukee or Chicago market (Live Well HD doesn't count).
Agreed.  WTMJ and WDJT are both significantly degrading their programming.

Quote from: Xizer;56204The only station in the Milwaukee market that presents a good example of why 1080i is a massive improvement over 720p is the CW affiliate, which looks amazing. WVTV transmits at 17 Mbps. It looks even better than Chicago's best looking HD channel, WGN-TV, which is also a CW affiliate. The CW seems to have their act together when it comes to high definition.
I will agree that WVTV is doing a great job.  Let's hope they remain an example of high quality and forgo the lure of multicasting.

Quote from: Xizer;56204So at the end of the day, whether 1080i makes a big difference comes down to the equipment the affiliates use and the bitrates they broadcast at.
That's all true, but it also depends on the content.

Scott

Talos4

In the end, it all comes down to how it looks on my TV.

In this town (I don't live in any other), on my displays and with the programming I watch, 720p is superior to 1080i.

Quoting reams of technobabble may prove a point technically, but my eyes don't read. All they see are motion artifacts, and they don't like it.

ArgMeMatey

Quote from: Talos4;56208In the end, it all comes down to how it looks on my TV.

In this town (I don't live in any other), on my displays and with the programming I watch, 720p is superior to 1080i.

Quoting reams of technobabble may prove a point technically, but my eyes don't read. All they see are motion artifacts, and they don't like it.

Is there some calculation that uses 1080i/720p as well as bit rate to yield some ordinal or preferably interval "grade" that consumers could use to objectively assess the quality of broadcasters' images?  

Maybe the FCC and the FTC should require stations to use a minimum resolution and bit rate to call themselves HD.  Otherwise they could just call themselves D.  For example, "George Mallet, Today's TMJ4, D".  :)

SRW1000

Quote from: ArgMeMatey;56209Is there some calculation that uses 1080i/720p as well as bit rate to yield some ordinal or preferably interval "grade" that consumers could use to objectively assess the quality of broadcasters' images?
Not really.  It's too dependent on the content, the encoding, and the equipment and perception of the viewer.  There are people out there who either can't tell the difference or aren't bothered by the drop in quality.

Quote from: ArgMeMatey;56209Maybe the FCC and the FTC should require stations to use a minimum resolution and bit rate to call themselves HD.  Otherwise they could just call themselves D.  For example, "George Mallet, Today's TMJ4, D".  :)
That would be tough to do.  Although the ATSC set up different formats that stations could broadcast digitally, it has no control over how they're implemented.  The intent was for the available bandwidth to be used for one HD signal or up to four SD signals.  That was never an established rule, however, and it's completely up to the stations how they use their allotted bandwidth.  I don't have a problem with that, but at the same time it's frustrating to see what has happened to HD in the past couple of years on some of the local stations.

Even though they can use their bandwidth however they see fit, it doesn't seem quite right that they should still be able to call the result HD when visible artifacts bring the quality down to almost YouTube levels.  The "HD Lite" moniker was popular years ago, but now it's just accepted.  Back in the days of that battle, I wrote a few emails to the folks in charge of the ATSC looking for explanations and guidance, but never got any kind of reply.  Sadly, that war was lost, and the effects can be seen today.

One thing that I never understood is why the 1080i stations didn't broadcast their dramas and other less-active content in 1080p @ 24 or 30 frames per second, which is part of the ATSC spec.  A lot of that content was acquired at 24 or 30 fps anyway, so it's not like they would be discarding content.  Perhaps it's due to equipment compatibility reasons.

Scott