HD displays not handling 1080i properly?
-
akirby
- Major Contributor

- Posts: 819
- Joined: Mon Jul 09, 2007 2:52 pm
There is an article in this month's The Perfect Vision that says over half of today's HD displays don't properly display a 1080i signal. I'll have to get the article tonight for specifics, but basically it said that many displays take each of the 540-line fields and scale them individually, as opposed to weaving both fields together into a full frame and scaling that at once. The effect is called "bobbing" and can be seen with a test pattern.
Some manufacturers have this flaw on all their displays, some have it on none of their displays and some have it on some but not others. The mfrs did not disagree with the findings. In fact Faroudja admitted that only their newest chips do it correctly.
I assume the same thing would apply to external scalers?
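The bob-versus-weave distinction the article is describing can be sketched in a few lines of Python. This is my own toy illustration (not from the article): a "frame" is just a list of scanline values, and a single-line detail survives weaving but gets lost or doubled when each field is scaled on its own.

```python
# Toy sketch of why scaling each field separately ("bob") differs from
# weaving the two fields back into a full frame first. A one-line-high
# detail ends up missing from one bobbed frame and doubled in the other,
# so it jumps up and down on screen -- the "bobbing" artifact.

def split_fields(frame):
    """Split a frame (list of scanlines) into even and odd fields."""
    return frame[0::2], frame[1::2]

def weave(even, odd):
    """Reassemble the full frame by interleaving the two fields."""
    frame = []
    for e, o in zip(even, odd):
        frame.extend([e, o])
    return frame

def bob(field):
    """Line-double one field to full height (what the flawed displays do)."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

# A test pattern: black except for one bright scanline on an odd line.
frame = [0, 0, 0, 1, 0, 0, 0, 0]
even, odd = split_fields(frame)

print(weave(even, odd))  # [0, 0, 0, 1, 0, 0, 0, 0] -- detail preserved
print(bob(even))         # [0, 0, 0, 0, 0, 0, 0, 0] -- detail gone
print(bob(odd))          # [0, 0, 1, 1, 0, 0, 0, 0] -- detail doubled
```

Real displays scale rather than line-double, but the principle is the same: any vertical detail finer than one field cannot survive per-field processing.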
-
Richard
- SUPER VIP!
- Posts: 2578
- Joined: Wed Sep 08, 2004 1:28 pm
- Location: Atlanta, GA
- Contact:
There are two kinds of images, film and video, and each requires a unique process to properly deinterlace and then scale. The deinterlacing is the very tricky part and what we are typically trying to overcome with external scaling. Once an image is progressive, the scaling is relatively straightforward by comparison.
Film is actually easy using 2/3 pulldown; once the cadence is determined it locks into place, creating a progressive image.
Video is a royal pain and uses adaptive deinterlacing, which means the deinterlacer never really quite knows what it should be doing with two fields and is constantly sorting out which field goes with which to create a progressive image.
Interlaced images are typically vertically filtered, either by the scaler or by your display; that is another source of artifacts.
Regardless of what we think of 1080i, once you understand all the issues of interlacing it becomes quite clear that progressive needs to be our future and 1080i needs to be moved to the dust bin of analog history. Fortunately that is already happening with 480i via HD-DVD and Blu-ray.
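Richard's point about film being "easy" can be sketched with a little Python. This is my own simplified illustration (fields here are whole frame labels; real pulldown alternates top and bottom fields): 2/3 pulldown spreads 4 film frames across 10 video fields, and once a deinterlacer detects that cadence it can recover the original progressive frames with no guesswork.

```python
# Simplified sketch of 2/3 pulldown and its inverse. Four 24 fps film
# frames become ten 60i fields in a repeating 2-3 cadence; collapsing the
# repeats recovers the progressive frames exactly -- the "locks into
# place" behavior described above.

def pulldown_2_3(frames):
    """Spread film frames across interlaced fields in a 2-3-2-3 cadence."""
    fields, counts = [], [2, 3]
    for i, frame in enumerate(frames):
        fields.extend([frame] * counts[i % 2])
    return fields

def inverse_pulldown(fields):
    """Recover the film frames by collapsing each run of repeated fields."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

film = ["A", "B", "C", "D"]
fields = pulldown_2_3(film)
print(fields)                   # ['A','A','B','B','B','C','C','D','D','D']
print(inverse_pulldown(fields)) # ['A','B','C','D'] -- fully progressive
```

Video-originated material has no such cadence, which is why it needs the adaptive, guess-as-you-go approach instead.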
akirby wrote: many displays take each one of the 540 line frames and scale them individually as opposed to putting both 540 frames together and scaling them at once. The effect is called "bobbing" and can be seen with a test pattern.
That is a cheap way to get around the problems of adaptive deinterlacing, and it leaves the nasty artifact described. Now would it not be nice to know that you can simply use an external scaler to overcome that? This is why we want our 1080p displays to accept a 1080p signal!
-
donshan
- Major Contributor

- Posts: 103
- Joined: Fri Nov 12, 2004 1:23 am
Richard wrote: Regardless of what we think of 1080I once you understand all the issues of interlaced it becomes quite clear that progressive needs to be our future and 1080I needs to be moved to the dust bin of analog history. Fortunately that is happening with 480I via HD-DVD and Bluray.
As I remember the history, back in the 1990s the battle over the new digital standard between the advocates of progressive and the advocates of interlaced held up the final standard for years, and it was only resolved by including both as a compromise.
The problem was the broadcast industry understood analog/interlace and did not want those "computer geeks" taking over their industry with a "computer" screen progressive format.
Of course in the end progressive will win, because today's TVs are just big computers, but old dogs learn new tricks very slowly. It has taken a whole new generation of people in broadcasting to understand digital technology.
I saw a quote recently:
There are ten types of people; those who understand binary and those who don't!
-
Richard
- SUPER VIP!
- Posts: 2578
- Joined: Wed Sep 08, 2004 1:28 pm
- Location: Atlanta, GA
- Contact:
donshan wrote: As I remember the history back in the 1990s the battle over the new digital standard was between the advocates of progressive vs. the advocates of interlaced held up the final digital standard for years
As a format, 720p was a no-brainer but came with one problem: it is not cheap to build a CRT display that can do it. 1920x1080 as a progressive mode was impossible for mass production at low prices. Just as with 480i, 1080i was offered instead; it only required a CRT display to sweep 540 lines per cycle, slightly more than the 480 of 480p, which was possible and cost-effective. What we received were 1080i displays that could not really do 1080i correctly, but what the hey, they did work. If they had worked correctly, then 720p would have been possible as well, along with multiscan displays offering both.
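The CRT cost argument comes down to horizontal scan rate, and the arithmetic is easy to check (my own numbers, using the standard line counts: 1125 total lines per 1080i frame, so 562.5 per field, and 750 total lines per 720p frame):

```python
# Horizontal scan rates behind the CRT cost argument: a 1080i tube only
# sweeps 562.5 total lines per 1/60 s field, while 720p must sweep 750
# total lines every 1/60 s frame -- a substantially faster deflection
# system, which is what made 720p CRTs expensive.

def line_rate_khz(total_lines, sweeps_per_second):
    """Horizontal scan frequency in kHz for a given raster."""
    return total_lines * sweeps_per_second / 1000

print(line_rate_khz(562.5, 60))  # 1080i: 33.75 kHz
print(line_rate_khz(750, 60))    # 720p:  45.0 kHz
```

For comparison, 480i runs at roughly 15.7 kHz, which is why 1080i was the smallest step up for existing CRT designs.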
In the public this turned into a battle of resolution versus frame rate.
Fortunately those pioneers were forward looking by including 1080P 30 and 24 frames in the ATSC format for future use. Today, a 1080P display that does not break the bank is a reality! 1080P sources are soon to be introduced to replace the DVD!
-
lcaillo
- Member
- Posts: 30
- Joined: Thu Aug 25, 2005 7:18 pm
- Location: Gainesville, FL
The idea that interlace was chosen because it is expensive to produce CRTs at progressive resolution is fallacious. No one was making that argument in the early discussions of the standards. The real reason is that the interlaced format could be used to advantage in broadcast bandwidth. 720p was chosen with a lower horizontal resolution for the same reason. Computer monitors have been made for years with higher scan rates, and their cost is more related to economies of scale than to technological limitations. The decisions that were made with respect to ATSC resolutions and scan types were a set of trade-offs meant to bring together many factions in several industries.
Leonard
-
Richard
- SUPER VIP!
- Posts: 2578
- Joined: Wed Sep 08, 2004 1:28 pm
- Location: Atlanta, GA
- Contact:
lcaillo wrote: The reason is that the interlaced format could be used to advantage in broadcast bandwidth.
Not really, since 720p encodes more efficiently than 1080i merely because it is progressive rather than interlaced. This is why 720p stations can get away with a subchannel without shaving bits on the HD content while 1080i stations cannot.
As an example, a film-based source encoded at 1080i 30 frames will consume more bandwidth than the same content encoded at 1080p 30 frames, and the 24-frame version a bit less still.
Granted, 1080i was promoted as the higher-resolution HD format based purely on pixel count, but that disregards the common effects of interlaced processing, which typically removes about 25% of the vertical resolution due to vertical filtering circuits that keep twitter artifacts from being visible on screen.
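The resolution comparison is easy to put into numbers. This is my own back-of-the-envelope arithmetic, not from the posts: raw pixel rates of the two broadcast HD formats, plus what 1080i's vertical resolution looks like if vertical filtering really costs about 25% as claimed.

```python
# Raw (pre-compression) pixel rates of the two ATSC HD formats, and the
# effective vertical resolution of 1080i after a hypothetical 25% loss
# to the vertical filtering of interlaced material described above.

def pixel_rate(width, height, frames_per_second):
    """Raw luma samples per second for a format, before compression."""
    return width * height * frames_per_second

print(pixel_rate(1280, 720, 60))   # 720p60:  55296000 pixels/s
print(pixel_rate(1920, 1080, 30))  # 1080i30: 62208000 pixels/s

# 1080 lines minus ~25% lost to vertical filtering:
print(int(1080 * 0.75))            # ~810 effective lines, closer to 720
```

The raw rates are within about 12% of each other, so the real-world difference comes down to how well each format survives filtering and compression.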
Interlace has been and always will be a cheap form of analog compression to improve resolution, one that also allows the manufacture of less expensive displays at the cost of imaging artifacts.
With the death of the CRT around the corner, 1080i really should follow, since interlace will no longer serve any practical purpose.
-
lcaillo
- Member
- Posts: 30
- Joined: Thu Aug 25, 2005 7:18 pm
- Location: Gainesville, FL
Richard, I hate to pick nits, particularly when we agree on the most important principle, but you are contradicting yourself. You say that interlace is a compression method, which in effect is what I said it was chosen to do. In fact, it trades resolution in the vertical domain for resolution in the horizontal domain. My statement that it presented an advantage in bandwidth is correct, and that advantage was a primary consideration in its choice as part of ATSC.
720p encodes more efficiently for two reasons. First, and most significant, its horizontal resolution is far lower than 1080i's. Second, though it accounts for little in terms of compression, progressive material is easier to work with in any compression scheme.
I agree that interlace should have never been a choice for HD. The reason that it was is that it was familiar technology that added information in a proposed system that at the time was pushing to get as much across a limited bandwidth as possible. Also, many video people did not clearly understand the difficulties that it presented relative to progressive in the digital world of compression codecs. Many on the video side were paranoid about giving up too much to the computer people. Besides, Gates came out as a strong proponent of 720p, so it had to be bad...
Leon.
Leonard
-
Richard
- SUPER VIP!
- Posts: 2578
- Joined: Wed Sep 08, 2004 1:28 pm
- Location: Atlanta, GA
- Contact:
It appears we are debating semantics...?
1080I versus 720P
viewtopic.php?t=3381
If you read the complete thread, particularly the two posts from Mr. Cripps, you will see that it essentially agrees with the rest of your comments.
My point is that 1080i was as much, if not more, a manufacturing and distribution decision, from the source through the system to our displays. 1080i was a form of analog compression mistakenly applied to ATSC, since in the end interlace became a penalty in the digital domain of ATSC compression. Further, broadcasters could transmit 1080p 30 frames right now, as our receivers are supposed to be able to receive it, although the sets would have to interlace it for output since all are limited to 1080i.
Reading that thread, it seems the manufacturers underestimated their own success with fixed-pixel digital displays, as the CRT is basically dead in the HDTV market. That 20-year workhorse hypothesis went by in only 7. I bet CRT is history even in low-end categories within 5 years, replaced by LCD flat panels.
lcaillo wrote: You say that interlace is a compression method
Actually I said "analog compression" for good reason, as it relates to displaying images in the most cost-effective manner, which is not the same thing in the digital domain. No contradiction at all.
lcaillo wrote: My statement that it presented an advantage in bandwidth is correct and was a primary consideration in its choice as part of ATSC.
It certainly was part of the puzzle, and...
Mr. Cripps wrote: I have talked to most of the manufacturers over the years and those who sell CRT based displays say interlace is essential for cost purposes (lower scan rate). The manufacturers maintained for years that they wanted to retain interlace in order to reduce the cost of CRT based displays. The CRT will dominate the market for another 20 years, though certainly fixed pixel systems of a variety of matrixes will eat away at the CRT markets. To drive the CRT at the scan rate for 720p X 60 carries a cost penalty over that of 1080i X 1920 native display and that cost is not a trivial one.
Since the CRT based devices do a great job and will be around as the lower cost solution for a long time it was decided that interlace would be the preferred way for the CRT display. While it was recognized that production and display standards no longer need to be the same in the digital domain as they must be in analog systems, a certain "momentum" carried interlace forward.
By introducing 720p as the display standard this cost formula takes a reverse hit, at least in CRT-based displays. In this case of 720p the broadcaster gets the chief benefit--reduced bandwidth (costs)--and the consumer pays a higher cost for displaying it natively. The CRT display today is 95% of the market and so cannot be ignored in favor of a growing fixed pixel based approach.
This is from the "1080I versus 720P" thread linked above.
-
akirby
- Major Contributor

- Posts: 819
- Joined: Mon Jul 09, 2007 2:52 pm