Milwaukee HDTV User Group

World's First 4K TV Channel Goes Live

Started by Tom Snyder, Monday Jan 14, 2013, 09:56:24 PM


KryptoNyte

Ralph, you do have to remember that 720p video frequently comes in at 60 frames per second (as opposed to 1080i at 29.97) with cable QAM. Because folks have so many different types of devices pushing the video, as well as massive variation in display devices, the user's experience obviously varies, so it makes sense that some folks can actually see 720p as a better image than 1080i. There are too many factors involved to make a cover-all statement that one is better than the other.

KryptoNyte

Quote from: PONIES;59164
That's what ignorant people say when they are presented with an overwhelming influx of information that they cannot comprehend.

Here's an example of what NBC's national distribution feeds to their affiliates (like our very own WTMJ) look like:

[screenshot of the NBC network feed; image not preserved]
ABC's feeds are basically identical to NBC's, tech-spec-wise. They use the same encoders and bitrates, except NBC is 1080i and ABC is 720p.

ABC looks considerably worse than NBC because of this.

..........

Ralph Kramden

KryptoNyte, I NEVER said 720p was better OR worse than 1080i. I simply said that 720p is "good". PONIES' claim that "there's no such thing as good 720p" is ridiculous and wrong.

PONIES

I think the problem here is that we simply have very different standards of what constitutes "good."

Here in the year 2013, with 4K right around the corner, a 921,600 pixel picture no longer constitutes "good" to me. 720p might have been "good" quality a decade ago, but now it is just mediocre at best. Standard definition is firmly in the realm of "terrible" and has been for a long time.

1080i on the other hand, when given a proper bitrate, still qualifies as "good" in my eyes as it is 2,073,600 pixels - over twice the resolution of 720p.

1080i/p used to be "great." Now 4K/Ultra HD is the new "great," coming in at over 8 million pixels.

So, to summarize:

480i/p - terrible
720p - mediocre
1080i/p - good
4K/Ultra HD - great
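For anyone who wants to sanity-check the pixel math behind those tiers, here's a minimal Python sketch. The frame sizes are the usual square-pixel ones, and SD is approximated as 720x480, so treat it as back-of-the-envelope only:

```python
# Rough pixel counts behind the quality tiers above.
formats = {
    "480i/p (720x480)":    720 * 480,
    "720p (1280x720)":     1280 * 720,
    "1080i/p (1920x1080)": 1920 * 1080,
    "4K/UHD (3840x2160)":  3840 * 2160,
}

base = formats["720p (1280x720)"]
for name, pixels in formats.items():
    # e.g. 1080 prints as 2,073,600 pixels (2.25x 720p), 4K as 8,294,400 (9.00x)
    print(f"{name:<22} {pixels:>9,} pixels  ({pixels / base:.2f}x 720p)")
```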

Hope this helps. :wave:

budda

When this forum started, HD sets were just coming out. OTA was the best picture, and it was spectacular. I remember PBS had a looping demo show of art and scenery that was quite the view. Then cable and satellite tried to offer more channels, save bandwidth, and suck people in, and the picture, while still clear, got worse and worse from the providers. One year (I think it was CBS), from one year to the next, I was asking myself, is there something wrong with my TV? It was transmission compression and bit rate.
In the next couple of years Ultra will be pushed, and you will see new set-top boxes, tuners, and services, as well as TVs. Once it's mainstream, it will have the life squeezed out of it as well. Not for quality, but for the bottom line. Peace

Ralph Kramden

I only use our OTA antenna for our HD programming. Do the TV stations in Milwaukee compress the signal also? Or is it just the cable and satellite companies?

KryptoNyte

Ralph, the source broadcasting standard in the United States is straight-up MPEG-2 video (like my cable ATSC service and your OTA service), so yes, there is some compression. The compression artifacting that you see in MPEG-2 is, well, simply put, typically not as bad as the highly compressed H.264 satellite feeds (DVB H.264/MPEG-4 video is the junk that Ponies keeps posting and tries to pass off as a feed directly from the network sources).

If Ponies were experiencing something other than the crap H.264 compression that his provider apparently uses, he would likely be seeing a much better 720p (and 1080i) image on his viewing devices.
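For what it's worth, rather than arguing about it, you can check what your provider is actually delivering by inspecting a captured transport stream with ffprobe. This is just a sketch: "capture.ts" is a hypothetical recording from your tuner, and it assumes ffmpeg/ffprobe is installed on the system.

```python
# Inspect the first video stream of a captured transport stream with ffprobe.
import subprocess

result = subprocess.run(
    [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,width,height,avg_frame_rate",
        "-of", "default=noprint_wrappers=1",
        "capture.ts",  # hypothetical capture file
    ],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
# An OTA/ATSC capture should report codec_name=mpeg2video at 1920x1080 or
# 1280x720; a heavily compressed satellite feed will typically show h264.
```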

PONIES

Quote from: KryptoNyte;59175
Ralph, the source broadcasting standard in the United States is straight-up MPEG-2 video (like my cable ATSC service and your OTA service), so yes, there is some compression. The compression artifacting that you see in MPEG-2 is, well, simply put, typically not as bad as the highly compressed H.264 satellite feeds (DVB H.264/MPEG-4 video is the junk that Ponies keeps posting and tries to pass off as a feed directly from the network sources).

If Ponies were experiencing something other than the crap H.264 compression that his provider apparently uses, he would likely be seeing a much better 720p (and 1080i) image on his viewing devices.

Oh sweet zombie Jesus.

Now you're definitely trolling.

There is no way you could read my posts and think I was talking about that DirecTV/Dish Network "pizza pan" satellite technology trash that is intended for the end consumer.

If you're not trolling, I suggest you read up on:

http://en.wikipedia.org/wiki/Backhaul_(broadcasting)
http://en.wikipedia.org/wiki/Television_receive-only

And then go re-read my posts. Carefully, this time.

LoadStar

To quote from the good book of Wikipedia:

"The ATSC A/53 standard used in the United States, uses MPEG-2 video at the Main Profile @ High Level (MP@HL), with additional restrictions such as the maximum bitrate of 19.4 Mbit/s for broadcast television and 38.8 Mbit/s for cable television, 4:2:0 chroma subsampling format, and mandatory colorimetry information."

There is a fairly new version of the ATSC standard, the A/72 Part 1:2008 standard, that does support H.264. No one uses it, though, since it would require all new tuners.
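Just to put a rough number on how tight that budget is, here's a back-of-the-envelope sketch. It assumes a single 1080i60 MPEG-2 service gets the entire 19.4 Mbit/s broadcast mux, which real stations rarely allow once audio, PSIP, and subchannels take their cut:

```python
# Approximate bit budget for one 1080i60 service at the ATSC A/53 broadcast ceiling.
mux_bits_per_sec = 19.4e6            # 19.4 Mbit/s, per the A/53 figure above
pixels_per_sec = 1920 * 1080 * 30    # 1080i60 amounts to ~30 full frames per second

print(f"~{mux_bits_per_sec / pixels_per_sec:.2f} bits per pixel")  # ~0.31
# Every subchannel carved out of the same mux shrinks this number further,
# which is where starved-signal macroblocking comes from.
```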

KryptoNyte

Quote from: PONIES;59177
Oh sweet zombie Jesus.

Now you're definitely trolling.

There is no way you could read my posts and think I was talking about that DirecTV/Dish Network "pizza pan" satellite technology trash that is intended for the end consumer.

If you're not trolling, I suggest you read up on:

http://en.wikipedia.org/wiki/Backhaul_(broadcasting)
http://en.wikipedia.org/wiki/Television_receive-only

And then go re-read my posts. Carefully, this time.

Let me know when you believe that you've finished editing [yet another] post of yours before I take any more of my personal time to respond.

SRW1000

Quote from: PONIES;59172
So, to summarize:

480i/p - terrible
720p - mediocre
1080i/p - good
4K/Ultra HD - great

Hope this helps. :wave:

I'll agree that 480i/p is bad, but 720p and 1080i are roughly equal in quality. Sure, 720p has only about half the resolution, but it doesn't suffer from any interlace artifacts. It also delivers almost the same amount of visual information per second, since the entire frame is refreshed every 1/60 of a second, instead of only half of a 1080i frame being refreshed in that same period.
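Back-of-the-envelope, the two formats really do move a similar number of raw pixels per second, which is the claim above. A quick sketch, ignoring compression, chroma subsampling, and source cadence:

```python
# Raw "pixels delivered per second" for 720p60 vs 1080i60.
p720_per_sec = 1280 * 720 * 60           # 60 full frames per second
i1080_per_sec = 1920 * (1080 // 2) * 60  # 60 half-height fields per second

print(f"720p60 : {p720_per_sec:,} pixels/s")   # 55,296,000
print(f"1080i60: {i1080_per_sec:,} pixels/s")  # 62,208,000
```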

The real enemy in this market is all of the multicasting, which reduces the amount of data provided for either standard.  Starving the signal is a sure way to reduce detail and/or introduce macroblocking or other distracting digital artifacts.

I would also agree that WISN's process of re-encoding ABC's 720p content to 1080i has reduced the quality.

On the other hand, 1080p is a big improvement over both 1080i and 720p, but this isn't being broadcast in any local markets, and probably never will be. For it to be practical, stations would need to switch to MPEG-4 or the upcoming H.265 encoding, and that's highly unlikely. (Remember all the uproar and hassle during the digital conversion?) Stations seem to be much more interested in sending us more content versus better content. Even if MPEG-4 were to be used, we'd likely see more subchannels, or stations could be tempted to take advantage of the lower required bit rate and return some of their spectrum to the FCC.

Your best source for 1080p content is Blu-ray.

While 4K would be nice, it's a long way off for average consumers.  Little content and really expensive displays will slow adoption, and you'll need a pretty big screen or sit really close in order to fully see the improvement.

Scott

PONIES

Quote from: SRW1000;59180
I'll agree that 480i/p is bad, but 720p and 1080i are roughly equal in quality. Sure, 720p has only about half the resolution, but it doesn't suffer from any interlace artifacts. It also delivers almost the same amount of visual information per second, since the entire frame is refreshed every 1/60 of a second, instead of only half of a 1080i frame being refreshed in that same period.

This is incorrect.

Very little content is actually 60 FPS. Most of it has a native framerate of either 23.976 frames per second for scripted high budget material or 29.970 frames per second for documentaries/news/reality shows/etc.

Therefore, for most content, 1080i has 1080p detail upon de-interlacing. 1080i60 video de-interlaces into 1080p video at up to 30 frames per second.

720p doesn't suffer from interlacing artifacts, but interlacing artifacts shouldn't be a problem for anyone with even remotely competent de-interlacing hardware.
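For anyone curious what "weave" de-interlacing actually does with those fields, here's a minimal numpy sketch. The field data here is toy random data standing in for a real decoder's output, not an actual 1080i capture:

```python
import numpy as np

# Two 1920x540 fields, nominally captured 1/60 s apart from a 1080i60 stream.
top_field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
bottom_field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)

# Weave de-interlacing: interleave the two fields into one 1920x1080 frame.
frame = np.empty((1080, 1920), dtype=np.uint8)
frame[0::2] = top_field      # even lines come from the top field
frame[1::2] = bottom_field   # odd lines come from the bottom field

print(frame.shape)  # (1080, 1920): full 1080p detail whenever both fields
                    # come from the same instant, e.g. telecined film content
```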

SRW1000

Quote from: PONIES;59181
This is incorrect.

Very little content is actually 60 FPS. Most of it has a native framerate of either 23.976 frames per second for scripted high budget material or 29.970 frames per second for documentaries/news/reality shows/etc.
How about sports?

Quote from: PONIES;59181
Therefore, for most content, 1080i has 1080p detail upon de-interlacing. 1080i60 video de-interlaces into 1080p video at up to 30 frames per second.

720p doesn't suffer from interlacing artifacts, but interlacing artifacts shouldn't be a problem for anyone with even remotely competent de-interlacing hardware.
Deinterlacing hardware and software can do a good job, but it also compromises the picture. The Wikipedia entry on this is actually pretty good.

Scott

PONIES

I don't know how many sporting events are actually at a native 60 FPS; however, the 1080i feeds from CBS, SportsNet, Pac-12, NBC, etc. always blow away the various 720p ESPN and Fox feeds I've viewed in the quality department. It's no contest. I don't really notice a temporal difference between 720p60 and 1080i60 content, but the resolution difference is abundantly clear.

And this is coming from a hardcore PC gamer who can tell 30 FPS console garbage from a game running at 60 FPS a mile away.