1080I versus 720P
Posted: Wed May 19, 2004 10:05 pm
From Dale Cripps
August 16 2003
In the beginning of the production standards discussions around the world there was one standard offered up by the Japanese and ratified by the Americans: the 1125/60 worldwide candidate. We all wanted a single production standard at that time, for the practical reason that it seemed unwise to pay for the development of two or more. Anything that created commonality contributed to economies of scale. That held true for cameras, for the solid-state devices that would eventually be used (CCDs), for the recording equipment, and so on. In those days the production standard and the distribution standard were tightly linked, though you could make conversions. The hope for a single production standard drew out the consumer electronics people, and those in Europe, where TVs are still made, feared Japan's technical superiority. They feared that if Japan ever got the jump on European manufacturers, those stodgy European companies, with their large welfare systems, would be destroyed by a runaway hit built on the Japanese standard.
The international technical community never viewed it that way. All they did was decide that to deliver the viewing experience science said was the target--a 30 degree field of view--at least one thousand scanning lines top to bottom were needed, with about 2000 samples on each horizontal line, making an aspect ratio near 2:1. The Japanese number, 1125/60, was chosen because it could be downconverted with nearly equal ease to both 525/59.94 and 625/50 systems. I say nearly equal because the frame-rate conversion to 50 was proving to be more costly, and so the European consumer manufacturers jumped on that and said that an HDTV system closer to 625 had to be proposed as the single worldwide standard. They chose the numbers 1250/50, a doubling of 625 at the same frame rate. That proved unacceptable to the 60 Hz world, and the world fell divided into the 1250/50 camp and the 1125/60 camp. To the thoughtful that meant two whole new systems, from stem to stern, had to be developed. Was there any hope for any commonality?
Common Image Format
After several years of haggling the technical community settled on the number of active lines for the two respective systems. Both the 1250 and 1125 camps agreed on a matrix of 1080 active lines by 1920 samples across. For Europe the field rate of 50 remained, and for the rest of the world the field rate of 60 Hz remained, but a single chip could be built for 1080 x 1920, and the HDTV image would meet the requirements set forth originally by the Japanese research, i.e., the 30 degree field of vision without visual artifacts.
Things progressed from analog to digital, and data rates became a hot topic. Panasonic felt themselves disadvantaged by Sony, who had worked longer and harder on the production equipment side, and they broke ranks in Japan--much to the consternation of NHK, MITI, and the Ministry of Posts and Telecommunications--and produced a camera set to the 720 / 1440 parameters in progressive scan.
The purpose of interlace is bandwidth saving. It was first used in our old TV system to give the illusion of 60 frames, though each "frame" was actually two half-frames, or fields, with interleaved lines. To the eye, with its well-known persistence of vision, it all looked like 60 frames per second making up the moving image, rather than the 30 frames per second that was more accurately the real case.
The image in an interlace system suffers some degradation, for two reasons. The first is simply the accuracy of the components aiming the beam at the target on the screen and missing slightly, as allowed by component tolerances. The second is that the two fields making up a frame are offset in time, so with a moving subject the offset captured in interlace is passed along to the image and appears as a kind of smear. One could theoretically say that if a perfect cathode ray tube existed, interlace would impose no degradation from capture to eye for a still image, but a moving image would still suffer from that time offset. These effects taken together were called the Kell factor--a term drawn from the experimental work of Dr. Kell years ago, when he measured the difference in image quality between a progressive-scanned source and an interlaced one. What Dr. Kell determined was that an interlaced image was equal to only about 70% of a progressive display with the same number of lines. Apply that to a 1000-line interlaced system (drop the other 80 lines for this discussion) and you arrive at an equivalent image of roughly 700 lines progressive. A progressive-scanned image also needs no deinterlacing (a method of trying to match the field lines to eliminate the smear that appears with motion), and so takes less signal processing. That makes it easier to encode in terms of data, and so if a 700-line progressive image looks to the eye like a 1000-line interlaced image, why not make it progressive and save on processing during encoding? That was the decision made by Panasonic in an attempt to preempt their old rival Sony.
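To make the arithmetic behind that argument concrete, here is a minimal sketch in Python (mine, not from the original discussions); the function name and the rounding of 1080 down to "about 1000" lines are illustrative choices that simply follow the reasoning above, with the roughly 0.7 figure attributed to Dr. Kell.

def effective_progressive_lines(interlaced_lines: int, kell_factor: float = 0.7) -> float:
    """Perceived progressive-equivalent line count of an interlaced raster,
    using the roughly 70% interlace penalty described in the text."""
    return interlaced_lines * kell_factor

# The article rounds the 1080 active lines down to "about 1000":
print(effective_progressive_lines(1000))   # 700.0 -> the basis for the 720p argument
# Applied to the full 1080-line raster instead:
print(effective_progressive_lines(1080))   # 756.0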
In subjective testing, however, the Kell factor proved to be a bit out of date, since component construction was superior to that of Dr. Kell's era. Film scans effectively use two unchanging fields for every frame, so there is no line "smudge" in a filmed image on an interlace system. So it was said, by many who worked hard on the subject, that an interlace system imposed no penalty on image quality except where motion at the camera is carried onto the face of the screen, and that still images were, indeed, equal to 1000 lines progressive--so interlace was the better system.
The arguments were fierce in this nation as we approached setting the transmission standard. Should a 720-line progressive system be allowed in, or could the door be shut with only 1080i as the HDTV standard? Panasonic prevailed with the help of the Department of Defense, which wanted a progressive system available to it for accurate imaging in fast-moving combat situations. They did not want the "smudge" of interlace to distort some tactical decision. They backed Panasonic, and then ABC, roused to anger by the insistence of CBS on having only interlaced 1080i in the ATSC standard, also backed the Panasonic initiative. With one network behind 720p, the Japanese government reluctantly stood aside to permit the first production of 720p cameras.
That production has now grown, and while the development cost has swollen, it is no longer an issue. Cameras and the chips in them are capable of both formats, as are recording and SFX devices. There is still more 1080i equipment around, and so it is cheaper, but that difference is sure to end in time. The problem in comparing the two is that with a good display and a static image, 1080i usually gets the greater applause. With further refinement in manufacturing processes and in fixed-pixel displays, 1080i is likely to overtake 720p in public appeal. Deinterlacing is motion adaptation, and that science is growing by leaps and bounds with faster processor speeds and cheaper memory. The fact that the eye sees about 2 million pixels in 1080i and just under one million pixels in 720p means there is more information on the screen with 1080i. To the eye this must mean something, and those welded to 720p may wish they had only used solder.
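Since the argument leans on deinterlacing as "motion adaptation," a minimal sketch of what such a deinterlacer does may help. This is my own illustrative NumPy code, not anything drawn from the ATSC work or a particular product; the function name, threshold, and field-ordering assumptions are hypothetical, and real deinterlacers are far more sophisticated.

import numpy as np

def deinterlace_motion_adaptive(prev_bottom, curr_top, next_bottom, thresh=10):
    """Minimal motion-adaptive deinterlacer (illustrative only).

    curr_top    : the top field being shown, shape (H/2, W); its lines go
                  straight into the even rows of the output frame.
    prev_bottom : the bottom field one field period earlier.
    next_bottom : the bottom field one field period later.
    Static pixels are "woven" (missing lines taken from the neighbouring
    bottom fields); moving pixels are "bobbed" (interpolated vertically
    within curr_top), trading resolution for freedom from the smear.
    """
    h2, w = curr_top.shape
    frame = np.empty((h2 * 2, w), dtype=curr_top.dtype)
    frame[0::2] = curr_top                            # the lines we actually have

    prev_i = prev_bottom.astype(np.int32)
    next_i = next_bottom.astype(np.int32)
    moving = np.abs(prev_i - next_i) > thresh         # per-pixel motion estimate

    weave = ((prev_i + next_i) // 2).astype(curr_top.dtype)

    # Bob: average the top-field lines just above and below each missing line.
    below = np.vstack([curr_top[1:], curr_top[-1:]])  # repeat last line at the edge
    bob = ((curr_top.astype(np.int32) + below.astype(np.int32)) // 2).astype(curr_top.dtype)

    frame[1::2] = np.where(moving, bob, weave)
    return frame

# For 1080i each field would be a 540 x 1920 array and the output 1080 x 1920,
# i.e. 1920 * 1080 = 2,073,600 pixels versus 1280 * 720 = 921,600 for 720p,
# which is where the "2 million versus just under one million" figure comes from.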