
1080i versus 720p

Posted: Wed May 19, 2004 10:05 pm
by HDTV Forum
From Dale Cripps
August 16 2003


In the beginning of the production-standards discussions around the world, there was one standard offered up by the Japanese and ratified by the Americans: the 1125/60 worldwide candidate. At that time we all wanted a single production standard, for the practical reason that it was thought unwise to have to pay for the development of two or more. Anything that created commonality contributed to economies of scale. That held true for cameras, the solid-state devices (CCDs) that would eventually be used, the recording equipment, and so on. In those days the production standard and the distribution standard were tightly linked, though you could make conversions. The hope for a single production standard drew out the consumer electronics people. Those in Europe, where TVs are still made, feared Japan's technical superiority; they feared that if Japan ever got the jump on European manufacturers, those stodgy European companies, with their large welfare systems, would be destroyed by a runaway hit built on the Japanese standard.

The international technical community never viewed it that way. All they did was decide that to deliver the viewing experience science said was the target--a 30-degree field of view--at least one thousand scanning lines top to bottom were needed, with about 2000 samples on each horizontal line, making an aspect ratio near 2:1. The Japanese number, 1125/60, was chosen because it could be downconverted with nearly equal ease to both the 525/59.94 and 625/50 systems. I say nearly equal because the frame-rate conversion to 50 was actually proving to be more costly, so the European consumer manufacturers jumped on that and said an HDTV system closer to 625 had to be proposed as the single worldwide standard. They chose the numbers 1250/50, a doubling of 625 at the same frame rate. That was proving unacceptable to the 60 Hz world, and the world fell divided into the 1250/50 camp and the 1125/60 camp. To the thoughtful, that meant two whole new systems from stem to stern had to be developed. Was there any hope for any commonality?
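
Incidentally, the arithmetic behind that 30-degree target is easy to sketch in a few lines of Python, assuming the usual rule of thumb of roughly one arcminute of visual acuity (my own back-of-envelope assumption, not something the standards bodies stated in these terms):

    # Back-of-the-envelope check on the original target, assuming roughly one
    # sample per arcminute of visual acuity (my assumption, not the committees').
    acuity_arcmin = 1.0
    horizontal_fov_deg = 30.0
    samples_per_line = horizontal_fov_deg * 60 / acuity_arcmin   # 1800 samples
    aspect_ratio = 2.0                                           # "near 2:1"
    scanning_lines = samples_per_line / aspect_ratio             # 900 lines

    print(f"about {samples_per_line:.0f} samples per line, {scanning_lines:.0f} lines")
    # -> about 1800 samples per line, 900 lines: the same neighborhood as the
    #    "about 2000 samples" and "at least one thousand lines" quoted above.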

Common Image Format
After several years of haggling, the technical community settled on the number of active lines for the two respective systems. Both the 1250 and 1125 camps decided on a matrix of 1080 lines by 1920 samples across. For Europe the field rate of 50 remained, and for the rest of the world the field rate of 60 Hz remained, but a single chip could be built for 1080 x 1920, and the HDTV image would meet the requirements set forth originally by the Japanese research, i.e., the 30-degree field of vision without visual artifacts.


Things progressed from analog to digital, and data rates became a hot topic. Panasonic felt disadvantaged by Sony, who had worked longer and harder on the production-equipment side, and they broke ranks in Japan--much to the consternation of NHK, MITI, and the Ministry of Posts and Telecommunications--and produced a camera set to 720 / 1440 parameters in progressive scan.

The purpose of interlace is bandwidth saving. It was first used in our old TV system to give the illusion of 60 frames, but each of those was really two half-frames, or fields, with interleaved lines. To the eye, with its well-known persistence of vision, it all looked like 60 frames making up the moving image rather than the 30 frames per second that was more accurately the real case. The image in an interlaced system suffers some degradation for two reasons. The first is simply the accuracy of the components aiming the beam at its target on the screen and missing by whatever the component tolerances allow. The second is that the two fields making up a frame are offset in time. One could theoretically say that if a perfect cathode ray tube existed, interlace would suffer no degradation from capture to eye, at least for a still image; a moving image would still show some degradation, because that time offset between fields is captured along with the image and appears as a kind of smear.

These effects taken together were called the Kell factor--a term drawn from the experimental work of Dr. Kell years ago, when he measured the difference in image quality between a progressively scanned source and an interlaced one. What Dr. Kell determined was that an interlaced image was equal to only 70% of a progressive display with the same number of lines. If you apply that to a 1000-line interlaced system (drop the other 80 lines for this discussion), you come to an equivalent image of 700 lines progressive. A progressively scanned image does not need deinterlacing (a method of trying to match the field lines to eliminate this motion "smear") and so takes less signal processing. That makes it easier to encode in terms of data, so if a 700-line progressive image looks to the eye equal to a 1000-line interlaced image, why not make it progressive and save on processing during encoding? That was the decision made by Panasonic in an attempt to preempt their old rival Sony.
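
As a quick sanity check on that arithmetic, here is the Kell-factor math laid out in Python (the 0.7 figure is the round number quoted above, not a measured modern value):

    # Kell-factor arithmetic as described above (0.7 is the article's round
    # number from Dr. Kell's era, not a measured modern value).
    KELL_FACTOR = 0.7

    def progressive_equivalent(interlaced_lines, kell=KELL_FACTOR):
        """Lines of progressive scan judged equal to a given interlaced count."""
        return interlaced_lines * kell

    print(progressive_equivalent(1000))   # -> 700.0  (the 1000i ~= 700p claim)
    print(progressive_equivalent(1080))   # -> 756.0  (close to the 720 chosen)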

In subjective testing, however, the Kell factor proved to be a bit out of date, since component construction was superior to that of Dr. Kell's era. A film scan effectively uses two unchanging fields for every frame, so there is no line "smudge" in a filmed image on an interlaced system. So it was said, by many who worked hard on the subject, that an interlaced system imposed no penalty on image quality except where motion captured by a camera is carried onto the face of the screen; still images were, indeed, equal to 1000 lines progressive, and so interlace was the better system. The arguments were fierce in this nation as we approached making the transmission standard. Should a 720-line progressive system be allowed in, or could they shut the door with only 1080i as the HDTV standard?

Panasonic prevailed with the help of the Department of Defense, who wanted a progressive system available to them for accurate imaging in fast-moving combat situations. They did not want the "smudge" of interlace distorting some tactical decision. They backed Panasonic, and then ABC, roused to anger by CBS's insistence on having only interlaced 1080i in the ATSC standard, also backed the Panasonic initiative. With one network behind 720p, the Japanese government reluctantly stood aside to permit the first production of 720p cameras. That has now grown, and while the development cost has swollen, it is no longer an issue. Cameras and the chips in them are capable of both formats, as are recording and SFX devices. There is still more 1080i equipment around, so it is cheaper, but that difference is sure to end in time.

The problem between the two is that with a good display and a static image, 1080i usually gets the greater applause. With further refinement in manufacturing processes and fixed-pixel displays, 1080i is likely to overtake 720p in public appeal. Deinterlacing is motion adaptation, and that science is growing by leaps and bounds with faster processor speeds and cheaper memory. The fact that 1080i puts 2 million pixels before the eye while 720p puts something just less than one million there means there is more information on the screen with 1080i. To the eye this must mean something, and those welded to 720p may wish they had only used solder.
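
The pixel counts behind that last point are simple to lay out, using the full ATSC frame rasters (keep in mind that in 1080i only half of those rows are refreshed in any one field):

    # Active pixel counts for the two ATSC HD rasters.
    formats = {
        "1080i": (1920, 1080),   # full frame; each field carries 540 of the rows
        "720p":  (1280, 720),
    }

    for name, (width, height) in formats.items():
        print(f"{name}: {width * height:,} pixels per frame")

    # 1080i: 2,073,600 pixels per frame  (the "2 million" above)
    # 720p:    921,600 pixels per frame  (the "just less than one million")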

Posted: Wed May 19, 2004 10:05 pm
by HDTV Forum
More from Dale

----- HDTV Magazine Tips List -----

Here are a few things to consider. As far back as 1982 the debate broke out in manufacturing circles over whether the new television system should be progressive or interlaced at capture (camera and post production). The manufacturers maintained for years that they wanted to retain interlace in order to reduce the cost of CRT-based displays. The CRT will dominate the market for another 20 years, though certainly fixed-pixel systems of a variety of matrices will eat away at the CRT's markets. Driving a CRT at the scan rate required for 720p at 60 frames carries a cost penalty over a native 1080 x 1920 interlaced display, and that cost is not a trivial one. Any time you raise the scan rate by an increment, you quickly wind up needing far more power and much more wire in the deflection coils. The television business does not add cost to the basic device by choice. Once a picture-quality goal is established, the lowest-cost path to that goal is taken. Panasonic was the exception, led by a camera developer who had the "religion" that the next system should be progressive only. He was given encouragement by the DOD, who wanted to buy progressive HD cameras off the shelf for military purposes. They argued that an interlace artifact could make the difference in critical battle situations. Panasonic sought political favor by making the 720 cameras and, later, some displays.
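
The scan-rate penalty is easy to put a rough number on. Here is a sketch using the total raster line counts as I recall them from the SMPTE documents (1125 total lines for the 1080 formats, 750 for 720p); treat those figures as my assumptions rather than gospel:

    # Horizontal scan frequency a CRT must sustain for each format.
    # Total line counts (1125 and 750) are the SMPTE raster totals as I recall
    # them; treat them as assumptions.
    def line_rate_hz(total_lines, frames_per_second):
        return total_lines * frames_per_second

    rate_1080i = line_rate_hz(1125, 30)   # interlaced: 30 full frames per second
    rate_720p  = line_rate_hz(750, 60)    # progressive: 60 full frames per second

    print(f"1080i: {rate_1080i:,.0f} Hz")   # 33,750 Hz
    print(f"720p:  {rate_720p:,.0f} Hz")    # 45,000 Hz
    print(f"720p line rate is {rate_720p / rate_1080i:.0%} of 1080i's")
    # -> 133%: the deflection circuit has to run a third faster, which is where
    #    the extra cost in power and coil wire comes from.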

Since CRT-based devices do a great job and will be around as the lower-cost solution for a long time, it was decided that interlace would be the preferred way to drive the CRT display. While it was recognized that production and display standards no longer need to be the same in the digital domain, as they must be in analog systems, a certain "momentum" carried interlace forward. The production standard could be anything, as long as it met the criteria set for HDTV. Those criteria are more about what you see and don't see than about numbers, though numbers are certainly applied to them in order to arrive at the production and display standards.

Since the inception of broadcasting, the cost of television has been mostly borne at the source rather than by the end user. The idea was to get cost out of the end product, even if it meant more costly processing at the transmitter, so that sets could spread faster into the marketplace. By introducing 720p as the display standard, this cost formula takes a hit in reverse, at least in CRT-based displays: the broadcaster gets the chief benefit--reduced bandwidth (costs)--and the consumer pays a higher price to display it natively. The CRT display today is 95% of the market and so cannot be ignored in favor of a growing fixed-pixel-based approach.

We have learned from 1080p at 24 fps that interlace can be created artificially from a progressive system, and doing so favors the low cost at the display. We have learned over the years that compression of a progressive image is more efficient than compression of an interlace-captured image (with its attendant motion artifacts and aliasing). So, should we one day have a 1080p 60 Hz production system, it could be interlaced before or after transmission without introducing artifacts, and it would be the ideal, since it could be read out as two 540-line "fields" making up 1080i (from 1080p). With good components there would be no interline twitter nor image differences (smearing) between lines, at least no more than comes with progressive scanning.
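
A minimal sketch of that read-out, treating a frame as nothing more than an array of rows (my own illustration of the idea, not anyone's actual broadcast chain):

    # Splitting a progressive frame into the two interlaced fields described above.
    # The "frame" here is just a list of 1080 rows; real gear works on video
    # samples, but the slicing is the same idea.
    def to_fields(progressive_frame):
        """Return (top_field, bottom_field): the odd- and even-numbered rows."""
        top_field = progressive_frame[0::2]      # rows 0, 2, 4, ... (540 rows)
        bottom_field = progressive_frame[1::2]   # rows 1, 3, 5, ... (540 rows)
        return top_field, bottom_field

    frame = [f"row {n}" for n in range(1080)]
    top, bottom = to_fields(frame)
    print(len(top), len(bottom))   # -> 540 540
    # Because both fields come from the same instant in time, reassembling them
    # gives back the original frame with no motion smear between the lines.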

I have talked to most of the manufacturers over the years, and those who sell CRT-based displays say interlace is essential for cost purposes (a lower scan rate). All filmed material is basically artifact free, since one frame of the 24 fps film dwells in place long enough to be scanned by both fields. Unfortunately, most film sources are not as sharp as the potential of 1080i and so fall short of delivering snappy pictures, due in large part to the generation of film print used for the transfer. There are glorious exceptions, and we can only hope that those exceptions become the rule.

So, with film there is no artifact introduced by interlacing, nor any interline twitter from motion blurring. That defect is left entirely to live productions using video cameras at 30 frames made up of 60 fields. In the 1080p 24 fps system, the conversion to a 60-field display works like film, with fields grouped to make up each frame (all with 3/2 pulldown). With film there is no particular advantage to 720p except the lower amount of bandwidth needed to transmit a fully scanned image. There can be no argument that fewer pixels transmitted is less costly. Those endorsing 720p say the chief justification lies in the Kell factor--something defined more than 50 years ago, when components didn't have the precision of today--which says that 1000 lines of interlace are equivalent in image quality to 700 lines progressive. That Kell factor has been cited as the reason a 700-line progressive system is equivalent to a 1000-line interlaced system.
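
For anyone who has not seen 3/2 pulldown written out, here is the standard cadence in miniature (a sketch of the pattern only, nothing manufacturer-specific):

    # 3/2 pulldown: 24 film frames per second spread across 60 video fields.
    # Frames alternately contribute 3 fields and 2 fields (12*3 + 12*2 = 60).
    def pulldown_cadence(film_frames):
        fields = []
        for index, frame in enumerate(film_frames):
            repeats = 3 if index % 2 == 0 else 2
            fields.extend([frame] * repeats)
        return fields

    one_second_of_film = [f"F{n}" for n in range(24)]
    fields = pulldown_cadence(one_second_of_film)
    print(len(fields))   # -> 60 fields per second
    print(fields[:10])   # -> ['F0', 'F0', 'F0', 'F1', 'F1', 'F2', 'F2', 'F2', 'F3', 'F3']
    # Every field in a pair or triplet is scanned from the same still film frame,
    # which is why filmed material shows no interlace motion artifacts.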

Many people on this and other forums say, however, that 720p is not as sharp--doesn't have that standout "snap" that you can get from 1080i, especially from an almost still image. ABC chose to champion the "progressive scanning is better" religion, but like all religions so far on earth it struggles with proofs. Many things have changed since the days of Dr. Kell and yet no new formal studies have been made to determine what a modern day result would be when replicating Dr. Kell's science.

The fact that there are half as many pixels cannot be wholly discounted when weighing the image quality of the two. It could be said that those favoring 720p simply do not accept the interline twitter in 1080i caused by motion blurring in live sports. They think that is not a suitable trade-off for the reduced number of pixels. I think the next question is: do we see all of those 1080i pixels in the first place? The answer is entirely determined by the viewing distance. As you extend it, you need fewer pixels for an equivalent clarity of image. Is 720p a four- or five-picture-height (PH) system, or is it a three-PH system? I think you have to opt for a little more distance with 720p, and that is the only penalty you pay for having it, other than the cost of your native display.
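
One way to put numbers on that is the usual acuity rule of thumb: the distance at which a single scan line shrinks to about one arcminute of visual angle (again a back-of-envelope assumption of mine, not a formal study):

    # Viewing distance, in picture heights (PH), at which individual scan lines
    # just disappear, assuming ~1 arcminute of visual acuity (a rule of thumb).
    import math

    def picture_heights(active_lines, acuity_arcmin=1.0):
        """Distance/height ratio where one line subtends the acuity limit."""
        radians_per_line = math.radians(acuity_arcmin / 60.0)
        return 1.0 / (active_lines * math.tan(radians_per_line))

    print(f"720 lines:  {picture_heights(720):.1f} PH")    # ~4.8 PH
    print(f"1080 lines: {picture_heights(1080):.1f} PH")   # ~3.2 PH
    # So 720p is roughly a four-to-five PH system and 1080 roughly a three PH
    # system -- sit a bit farther back and the two look equally clean.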

Since CBS has only 20% of the sports market and Disney/ABC/ESPN have 80%, it is becoming next to impossible to buy a 1080i-based truck. Europe is also clamoring for 720p as its solution, and that makes 1080, whether i or p, a weaker format globally. This means we could eventually have 720p for all origination and transmission (due to the fewer pixels that need to be transmitted) and display it as we do now, with 540-line fields. This is a downward trend in quality in the thinking of many top executives who have opted for 1080i. Sports could drive the entire movement into 720p, and that would then reset the bar downward for displays. There would never be a need for a 1080 display if no available signal would ever exploit it. This is the present "threat" posed by a 720p trend.

As for 1080i displays showing 720p, they can do it quite well after a conversion to their native scan rate, unless there is some inferior engineering in the chips. I have such a situation here: on all of the edges of a 720p picture there are terrible black lines, spaced about four TV lines apart, that extend in from every edge by about a quarter of an inch. So for me 720p is inferior, while for others it may be meeting the old Kell-factor expectations.

I am sure this is just the tip of the iceberg on this subject, but I have had many conversations with the people who make decisions on such matters. I believe what I have said is correct, though it is from memory of research done more than eight years ago.

Dale