HDTV: Picture quality based on imaging science

Looking for the why of it all? Check here.

Post by HDTV Forum »

I agree with Richard that somebody changed something on ER. I viewed a short segment this week and the quality has improved. I hope we can look forward to the same with many of the other TV-produced shows.

Bill

Post by HDTV Forum »

April 27, 2003

Over the last 2 episodes of Smallville, it appears they are being mastered to a higher level for this final phase of the season. In the past this program has appeared somewhere between 480p and HD. The picture was regularly soft and lacking very fine detail. Well, no more. Now you can see every pore in the actors' faces and the very fine lines in Clark's plaid shirt. For the first time the people in this show look real, with all bodily and skin imperfections obvious. This reminds me of an article I just read about a full-blown HD production where the producer stated that since HD is so revealing the actors will have to play their age - you can . . .

Post by HDTV Forum »

My feelings are that virtually all film-based prime time shows that are broadcast in HD have improved dramatically in quality. NYPD Blue looked very grainy and "non-HD" last year, no better than FOX's abhorrent 480p widescreen. But this year it appears much, much better to me. The various Law and Order shows have gotten better, too; they now look as good as any film-based HD shows I've seen. I used to be disappointed in Showtime's HD offerings, too, but they also seem to have improved lately.

I have no proof of this, but I feel this improvement is due to the engineers learning how to do HD better than they had in the past. When HD was new, I'm sure the technicians were new to it, too, and had to go through a learning curve to determine how to make it look its best.

Stosh

Post by HDTV Forum »

I don't particularly care for ER, but my wife loves it and therefore I watch it with her. We flip over from locals on satellite to OTA to view it in HD. I had a full 16:9 image that was very crisp (as crisp as film can be). I think ER is filmed at relatively low light levels (for effect), and the crew uses wider apertures (smaller f-stop numbers) for a shallower depth of field (objects in the very close foreground and background are a bit fuzzy while the main focus point is clear) to put more emphasis on what they want the subject to be, while de-emphasizing the fore- and backgrounds without removing subject matter completely. For clarity, video has film beat hands-down, but film provides a different "feel" that can't be reproduced with video. Taking all this into account, I think ER was every bit as clear as other film shows (like Everybody Loves Raymond, Yes, Dear, Still Standing) and movies on HBO/HD.

BiggerDee

Post by HDTV Forum »

I talked with Greg the other night, our resident insider for film production, about a post on the TIPS list concerning the use of a 1080i scaler for DVD. The intent was to get the same type of picture the viewer was getting with HBO and Showtime HD on satellite. The big thing that came up was that a DVD is not the same thing that HBO or Showtime typically use for the upconversion. For a better understanding, remember that DVDs are still compressed. Check out or rent a Superbit DVD and the standard version of the same title and compare them.


Richard:
Greg,

Surely they are not just dropping the same DVD I get at the store into the system, are they? Isn't the source some commercial NTSC master?

Greg:
Hi,

No, they're probably not using the same DVDs we get . . . but sometimes
the masters may be the same.

As I understand it -- from folks I know who master high-end theatrical DVDs
-- the cable nets use the same standard definition masters that they do --
usually Digital Betacam, mainly because it's true component video -- and run
them through some insanely expensive box like a Teranex which does the
upconversion.


The masters may be standard def, but MOST of the film-to-tape transfers made
in the last six years or so are done on telecines that output TRUE component
RGB, and that makes the HUGE difference in the quality of the upconvert.

Now where it gets REALLY murky is if the film is old enough to be a COMPOSITE
master -- one-inch or D-2 -- then the upconversions tend to suck, I'm told.

HOWEVER, the studios are making so much dough from their films on cable
that they are tending to retransfer the older stuff. Hey, in a world where
HOGAN'S HEROES gets a new high-def transfer, what can I say?

Also many of the high-end telecine machines scan the film at a higher
resolution -- like 2K -- and then downconvert to NTSC SD. This sounds
counterintuitive if it's gonna then be UPCONVERTED, but the important thing
is the higher scanning resolution really helps to smooth out NTSC
artifacting better than just going straight NTSC.

Whatever the technology, it's still garbage in, garbage out. The better the
master, whatever the format, the better the upconvert.

And now with most new films being transferred to 1080/24p initially -- and
then downconverted to Digital Betacam for DVD mastering --
the net artifacts coming out of the end of the pipeline, whatever the format,
are much lower than just a few years ago.

The frustrating thing is that there are so many different standards -- every
studio does it a bit differently -- so it's hard to know PRECISELY what's
going on.


Hmmmmmmm, ya know, after all is said and done I bet they COULD take a DVD
and upconvert it for cable broadcast . . . of course the deal is that
they'd do it with a $100,000 box.

Hey, Richard, there's no simple answer . . . but it's all interesting.


Greg

Post by HDTV Forum »

From the TIPS List
November 8, 2003, 1:52 PM


>
>Even if I had one of the rare +$20,000 devices that display 1920 x
>1080, what signal could I feed it? No OTA, DBS or CATV signals exist
>at this data rate [James Snyder or others, please correct me if I'm
>wrong and someone has started to do it recently]. I think a handful of
>D-VHS tapes and the T2 Extreme edition DVD 2nd disc are the only
>prerecorded products that will do this.

This is not true.

There are plenty of programs available today that are true 1920 x
1080. CBS's prime time programs, for example, are delivered on HD-D5
tape, which, unlike Sony's HDCAM (1440 x 1080) and DVCPRO-HD (1280 x
1080 recorded resolution), can resolve true 1920 x 1080.

The data rate of broadcast DTV and cable HD doesn't really have an
effect on the resolution unless the programmer decides to downconvert
the program before MPEG-2 encoding. An exception is, of course, if
the programs were delivered on a broadcast-quality HD format that
does not resolve 1920 x 1080 (HDCAM and DVCPRO-HD). Another is if
the encode is pushed much below 12 Mbps, where 1920 x 1080 is still
encoded but the actual resolution within those pixels is nowhere near
1920 x 1080 because so much high-frequency information has been
discarded.

CBS, ABC, NBC and Fox all deliver their HD to affiliates via a 45
Mbps distribution system, meaning that as long as 1920 x 1080 is
coming off the tape, that is what makes it to the affiliate. The 45
Mbps data rate allows the affiliate to then encode at 19.39 Mbps for
DTV transmission without throwing too much resolution away.

D-VHS/D-Theatre tapes follow the same rule: as long as the source HD was
mastered onto tape at 1920 x 1080 and encoded that way, then D-VHS can
reproduce that resolution. Keep in mind, the data rate for HD on
D-Theatre D-VHS tapes is only 28 Mbps. Better than 19.39, but nowhere
near the 45 Mbps that networks use for distribution.
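
To put the three payload rates in this post side by side, here is a
small Python sketch (the rates are the ones quoted above; the ratios
are just arithmetic):

# Payload rates quoted in this post, as fractions of the 45 Mbps
# network distribution feed.
rates_mbps = {
    "ATSC broadcast":  19.39,
    "D-Theatre D-VHS": 28.0,
    "network feed":    45.0,
}
for name, mbps in rates_mbps.items():
    print(f"{name:16s} {mbps:5.2f} Mbps  ({mbps / 45.0:.0%} of the feed)")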

All this said, it is worth noting that Sony has been spreading HDCAM
throughout the broadcast and post-production industries by cutting
the price of their equipment to get it into the field. DVCPRO-HD is
very appealing because it can be recorded onto standard DVCPRO tapes
(at 4 times speed for 1/4 the record time, of course) and is very
light and easy to use in the field. That means that much of the
programming you see today is not being acquired on tape formats which
record full 1920 x 1080 resolution.

While it doesn't matter now, since only those who can buy Barco or
Ikegami broadcast-quality HD monitors can truly see the difference,
that will change in the not-too-distant future. The first true 1920
x 1080 plasma will be marketed soon, and while it will still be
hideously expensive to start out, it will show TRUE 1920 x 1080 on a
60" screen and will make obvious, to any person with decent eyes, the
difference between material mastered on HD-D5, versus the rump HDCAM,
and the even rumper DVCPRO-HD.

>I just want us to realize there's still better stuff coming than
>"Today's version of HDTV."

And, unfortunately, worse as well. Unless the viewing public lets
programmers know that they can tell the difference, and can see the
artifacts in their pictures, broadcasters and cable-casters alike
will be tempted to down-sample, lower the data rate, and decimate
bitstreams just as they do now in SD.

DirecTV transmits its HD channels (ALL OF THEM) in 1440 x 1080
resolution even though you will see 1920 x 1080 pixels actually coming
out of your box.

Need I say more?

James Snyder

Post by HDTV Forum »

>James,
>Thanks! Great info, but now it's my turn to be confused....
>
>My previous statements are based on the point of view of the home
>user, not a pro at a broadcast center. I think we all understand
>that 1080i distribution from NETWORKS to AFFILIATES is at full
>1920x1080 resolution and the full data rate of 45 Mbps.
>But I was under the impression, and have seen it said several times on
>both the Tips list and in other places, that the vast majority of Local
>Stations broadcast at much lower data rates and at 1440x1080
>resolution. Is the last statement true? If it IS true, then are you
>saying that all of our current HD STBs are capable of recreating the
>1920 resolution from the 1440 being broadcast locally? If it is not
>true, then when did this change and what % of local affiliates are
>doing full resolution today?

Keep in mind, there are a number of different relationships here:

DTV has a TOTAL data rate of 19.39 Mbps, of which video is a maximum
of about 18 Mbps. Stations can transmit a lower video data rate,
multiple video data rates (dubbed "multicasting") and even adjust the
data rates dynamically.

The ATSC standard permits full HD resolution 1920 x 1080 within the
19.39 Mbps of the DTV standard. If a station wishes to push the data
rate lower (perhaps to multicast an SD channel or a data service at the same time), the
way MPEG-2 encoding works is that it throws away the parts of the
picture that the human eye can't see well, which is usually the
highest frequencies under defined conditions. That is why a picture
looks soft when the compression level increases: the parts of the
picture that reproduce the sharp details are the highest frequencies
and usually the first to be discarded.
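
As a toy illustration of that idea, here is a small Python sketch
(numpy/scipy; this is not a real MPEG-2 encoder, just a single 8x8 DCT
block showing that you can keep every pixel while discarding the
high-frequency coefficients that carry the sharp detail):

# Toy illustration only: keep all 64 pixels of an 8x8 block but discard
# the high-frequency DCT coefficients, the way an aggressive encode might.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
block = rng.random((8, 8))            # stand-in for an 8x8 luma block

coeffs = dctn(block, norm="ortho")    # to the frequency domain
mask = np.add.outer(range(8), range(8)) < 6   # keep low frequencies only
rebuilt = idctn(coeffs * mask, norm="ortho")  # still 8x8, but softer

print("coefficients kept:", int(mask.sum()), "of 64")
print("max pixel error:  ", float(np.abs(block - rebuilt).max()))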

What a STATION transmits is different from the resolution they can
record. Keep in mind that even if they are encoding their HD at 18
Mbps and 1920 x 1080 for broadcasting, it doesn't mean that is what
their VTRs are recording. If a station uses Sony HDCAM, the format
records 1440 x 1080 even though the timing of the signal is 1920 x
1080. That may be a confusing concept, but think of it this way: the
VTR can record 1920 x 1080 pixels (that's the timing of the
horizontal and vertical lines), but if it doesn't record the highest
frequencies, the picture won't be as sharp as on a tape format that
can record them. Remember, the higher the frequencies
you record, the more detail you are reproducing. Sony HDCAM makes
the compromise by throwing away the highest HORIZONTAL frequencies
but recording ALL of the VERTICAL frequencies. That fools the human
eye (except for those of us who are cursed with being able to see the
difference) into THINKING it is seeing 1080 lines (which it is) and
1920 pixels (which it isn't, but can be easily fooled into thinking
it is if you don't know what to look for).

Same goes for DVCPRO-HD: it records 1920 x 1080 pixels (remember,
that's the TIMING of the lines and pixels, not what's contained in those
lines and pixels) but only samples horizontal frequencies up to the
equivalent of 1280 x 1080. Again, the video looks like it's full
resolution, but it will look oddly soft because the horizontal
resolution is not equivalent to the vertical resolution.
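
To summarize the recorded-versus-signaled numbers James gives (a
trivial Python sketch; the figures are the ones from this thread):

# Recorded vs. signaled horizontal luma resolution for the tape formats
# discussed above (nominal figures, as quoted in this thread).
FORMATS = {
    "HD-D5":     1920,   # records the full 1920 samples per line
    "HDCAM":     1440,   # horizontally subsampled
    "DVCPRO-HD": 1280,   # horizontally subsampled further
}
SIGNALED = 1920          # every format still outputs 1920 x 1080 timing

for name, recorded in FORMATS.items():
    print(f"{name:10s} {recorded} of {SIGNALED} "
          f"({recorded / SIGNALED:.0%} of the horizontal detail)")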

>
>If the local station is multicasting, doesn't this automatically
>reduce the resolution of the primary HD channel?

No. It reduces the BITRATE.

Unlike uncompressed analog and digital video and audio, in which the
higher the detail, the higher the required frequencies, bitrate and
resolution are NOT DIRECTLY RELATED in MPEG compression. That is one
of the prime attractions: you can still have high resolution without
high bitrates. This is accomplished through a wide range of tools
for bitrate reduction, of which high frequency decimation is only one.

So many different decisions are being made within an MPEG-2 encoder
from moment to moment and picture part to picture part that the same
resolution can be encoded even if the bitrate is lower. That's
because the thrown away bits may be thrown away in a part of the
picture the human eye can't easily resolve, or the extra resolution
isn't thrown away at all but shifted to another point in the
bitstream moments later and returned to its proper place when
decoded. It's all a balancing game based on how the encoding software
is written and how efficiently the encoder encodes.

I know it's a hard concept to get one's mind around, but that's the
beauty of MPEG encoding. Resolution is not directly tied to bitrate.
It is only one of many tools.

As for decoders, as long as their digital-to-analog converters can
recreate the 30 MHz analog bandwidth per channel (Y, Pr, Pb) at the
output of their D->A converter, then your consumer STB can give you
the full 1920 x 1080 resolution. If you have a box with a DVI
connection that can feed a digital imaging device that has full 1920
x 1080 resolution (which none can today without conversion since
there are no LCD or Plasma displays today which are NATIVE 1920 x
1080) then you would see genuine 1920 x 1080 at full glory.

Since screens today (that are not CRT) don't exist in native 1920 x
1080 resolutions in the consumer realm, format conversion must occur
to fit the incoming 1920 x 1080 to whatever the native screen
resolution is, say 1024 x 768. Needless to say, they won't look the
same as the original 1920 x 1080.

Both my RCA DTC-100 (purchased 12/1999) and my Samsung SIR-T165
produce very respectable HD at their analog RGB outputs, with very
little loss at the top frequencies (~27 MHz for Y, 13.5 MHz for Pr
and Pb).
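
For anyone wondering where that ~30 MHz figure comes from, here is a
quick sanity check (1080-line HD luma is sampled at 74.25 MHz under
SMPTE 274M; the theoretical ceiling is the Nyquist limit, and practical
D->A output filters pass somewhat less):

# Nyquist sanity check for the analog output bandwidths mentioned above.
luma_sample_rate_mhz = 74.25   # SMPTE 274M luma sampling for 1080-line HD
print(f"Nyquist ceiling: {luma_sample_rate_mhz / 2:.1f} MHz")  # ~37.1 MHz
# A measured ~27-30 MHz passband, as above, sits just under that ceiling.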

The rule of thumb is this: if your local station encodes 1920 x 1080
at above 16 Mbps and gets their HD from their network at 45 Mbps,
then you are getting genuine 1920 x 1080.

If your local station encodes below 16 Mbps, then it will look softer
because the higher frequencies (which result in the sharper image)
are being thrown away as the bitrate is being reduced.

If your local station is tape-delaying any HD on D5-HD, you are
getting full-resolution HD.

If your local station is tape delaying on HDCAM, DVCPRO-HD, or most
of the HD servers today (which use either HDCAM's or DVCPRO-HD's
compression schemes), then you are watching HD transmitted with 1920 x
1080 timing, but with a lower effective horizontal resolution.

There is no such thing as HD encoding at 1440 x 1080. That is not a
format which exists in any HD encoder today. If that resolution is
seen on the air, it's because of the tape format feeding the encoder.
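
Restated as a sketch (the thresholds and the tape-format list are
James's rule of thumb above; the Python function itself is only
illustrative):

# James's rule of thumb, as a function. Assumes the station receives its
# network feed at the full 45 Mbps, per the rule above.
def expected_hd_quality(encode_mbps: float, tape_format: str) -> str:
    if tape_format in ("HDCAM", "DVCPRO-HD"):
        return "1920 x 1080 timing, but reduced horizontal detail"
    if encode_mbps >= 16:
        return "genuine 1920 x 1080"
    return "softer picture: high frequencies discarded to cut bitrate"

print(expected_hd_quality(18, "HD-D5"))  # genuine 1920 x 1080
print(expected_hd_quality(12, "HD-D5"))  # softer picture
print(expected_hd_quality(18, "HDCAM"))  # reduced horizontal detail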

I know these concepts are a bit hard to grasp for some, since there
are so many factors that go into the HD that you see, but I hope this
has provided some useful information.

If anyone wants more specific information, feel free to email me
privately and I will point you in the right direction.

James Snyder

Post by HDTV Forum »

>It is ironic that now that display devices able to resolve full 1920
>are available at reasonable prices, there is an increase in the use and
>development of techniques to reduce the bit rate and resolution to satisfy
>the MSO quantity model, satellite and cable alike, and multiplexing/data
>casting in broadcasting is not an angel either.
>
>You did not mention (or perhaps did not want to) the growing tendency toward
>the use of controversial technologies that shave down the bandwidth of HD
>signals, such as the statistical multiplexing systems by Terayon, Motorola
>and the others that compress by taking video elements out, pixels THEY SAY
>are imperceptible to the human eye, and adjusting the bit rate by borrowing
>temporary bandwidth from static scenes. Stat "muxing".

First, you are confusing two different functions:

Variable bitrate multiplexing (VBR), also known as statistical
multiplexing or "stat muxing" is a function of the multiplexor in a
DTV encoder (the final stage in an encoding system) which allows more
than one program stream to be included in a DTV signal. The
multiplexor is the "traffic cop" that determines what data gets
inserted into the bitstream fed to the transmitter and when. It is
an amazingly complex balancing act controlled by software.

Stat muxing (or VBR) is where the encoding software controller uses
feedback from the multiplexor to instruct each of the encoders to
change how they are encoding so that as few bits as possible are
wasted in a bitstream per unit of time.

The concept is this: in a multiplexing system without stat muxing
(called constant bitrate multiplexing or CBR) each program stream (or
"subchannel" to those of us in the viewing world) is assigned a
specific bitrate. For the 19.39 Mbps bitrate of an ATSC channel, let's
say 15 Mbps is assigned to an HD stream and 4 Mbps to an SD stream. The
two channels cannot EXCEED 15 and 4 Mbps, but they frequently don't
actually USE 15 and 4 Mbps. This is because picture content changes
dynamically and frequently doesn't consume all of the bits assigned
to it. The bits not used are filled with null data, which is simply
bits thrown away on the decode side. This is because an ATSC data
rate cannot fall below or exceed 19.39 Mbps for the signal to decode
properly.

With stat muxing the multiplexor lets the encoding software know when
too many null bits have to be inserted, and the software then adjusts
the video data rates of each subchannel encoder to make sure the most
data is devoted to video encoding and the least to null data.

After all, null data is wasted data.
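
Here is a toy Python version of that CBR-versus-stat-mux trade-off,
using the 15 + 4 Mbps split from the example above (the instantaneous
demand figures are made up for illustration):

# Toy multiplexer: fixed 19.39 Mbps ATSC payload, one HD and one SD stream.
TOTAL = 19.39   # the payload is fixed; any shortfall is stuffed with nulls

def cbr(demands, caps):
    """Each stream is hard-capped; unused capacity becomes null data."""
    used = [min(d, c) for d, c in zip(demands, caps)]
    return used, round(TOTAL - sum(used), 2)

def stat_mux(demands):
    """Streams share the whole payload; nulls only if total demand is low."""
    scale = min(1.0, TOTAL / sum(demands))
    used = [d * scale for d in demands]
    return used, round(TOTAL - sum(used), 2)

demands = [17.0, 1.5]              # hard HD scene, nearly static SD stream
print(cbr(demands, caps=[15, 4]))  # HD starved at 15.0, 2.89 Mbps of nulls
print(stat_mux(demands))           # HD gets its 17.0, only 0.89 Mbps null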

>
>Stat "muxing" or rate shaping by the cable companies (and similar from the
>satellite companies) of fully resolved content is an illness they have for
>maximizing efficiency, and many say "we do not do that."
>Networks affirms that the video would not be of poorer quality with their
>technology, and Terayon is preaching that their CherryPicker 6400 that
>combines multiple HD streams into one channel is not noticeable, we are all
>in the hands of the MSO operators pushing for the evil technology no matter
>how much a content provider could defend the need to maintain HD quality on
>their program.

What you are talking about is bitrate reduction or rate shaping, the
processing of an already encoded signal to further decimate the
encoded signals and reduce the bitrate further. This tends to be
done at cable systems and others who need to shave bits to fit more
subchannels into a multiplex. Some DTV stations are doing this with
pre-recorded ATSC encoded streams being replayed but multiplexed with
streams that are being encoded live.

In many cases, this can be done seamlessly, without the viewer
noticing, assuming the shaper doesn't throw away too much data. You
can probably gain 5-10% in bitrate without significantly
impacting the video. This is quite a gain in cable systems or
satellite signals carrying hundreds of such digital channels.
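
Back-of-the-envelope, that looks like this (assuming a hypothetical
256-QAM cable channel of about 38.8 Mbps carrying ten SD streams; the
8% shave is from the range above):

# Hypothetical cable channel: ten ~3.88 Mbps SD streams, each shaved 8%.
channels, per_channel, shave = 10, 3.88, 0.08
freed = channels * per_channel * shave
extra = freed / (per_channel * (1 - shave))
print(f"freed {freed:.2f} Mbps = {extra:.2f} extra shaved stream(s)")

Nearly one extra stream per channel, which adds up fast across the
hundreds of channels a cable or satellite plant carries.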

The difference between the two: stat muxing adjusts at the encode
point; bitrate reduction happens at the point where a set of data
packets is cherry-picked out of a bitstream and placed into another
multiplex.

The "statistical" part is the operational term here: the statistical
part is the feedback control of the encoders by the encoding
software. Some people use the term interchangably with bitrate
reduction.

In some cases, it also applies when the bitrate reduction is
happening as the bits are placed into a multiplex with bitstreams
that are being encoded live. In that case, the correct term is
"bitrate reduction with statistical multiplexing." That is what many
cable systems and satellite services do.

>
>Unfortunately everything comes down to money: while we spend more money on
>our sets to get more quality, the MSOs reduce the quality to become richer
>with more quantity.

I can't disagree with you there.

It's up to us to make sure we stay informed about what our program
services are doing technically, and to make sure they know that we
know, and that we don't like it when we see it affecting the quality
of the signal.

James Snyder

Post by HDTV Forum »

More from James on Multicasting


>Hi All,
>
>This question has been asked in the past but I do not remember the
>answer.
>
>The local OTA CBS channel here in central Florida is transmitting on
>channel 58-01 and 58-02.
>
>Channel 58-02 has nothing but weather radar 24/7 and channel 58-01 has
>the CBS programming.
>
>When channel 58-01 is transmitting in HDTV then would not channel 58-02
>go off the air? I thought you needed all the bandwidth of the channel to
>transmit HDTV OTA.
>
>Thank you for your answers,
>PHIL
>N2FHP

HD transmissions do NOT require the entire channel to work, just the
entire channel to give you virtually artifact-free transmissions.
Any data rate less than about 18 Mb/s can be used, with
relatively few artifacts down to about 15 Mb/s except in
high-action/high-detail scenes.

In the case of weather radar, which changes very little from frame to
frame over time, it can be transmitted at the same time as an HD
transmission at a pretty low data rate, probably below 1 Mb/s
virtually all the time. Therefore, weather radar would work pretty
well simulcast with an HD channel.
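
The bit-budget arithmetic behind that (the payload and radar figures
are from this thread; the audio/PSIP overhead allowance is a rough
assumption of mine):

# ATSC bit budget for the HD + weather-radar multicast described above.
total_mbps = 19.39   # fixed ATSC payload
overhead   = 1.4     # assumed allowance for audio, PSIP and other tables
radar      = 1.0     # near-static radar loop, per the estimate above
hd = total_mbps - overhead - radar
print(f"HD stream still gets ~{hd:.1f} Mb/s")   # ~17.0, comfortably above 15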

The point where you see HD start to fall apart is if a station pushes
the HD data rate below about 14 Mb/s. At that data rate any sudden
movement in the picture will produce visible artifacting. Stations
who transmit HD and an active SD (like a simulcast of the SD channel)
will run into this problem.

It is worth saying at this point that not all HD is good-quality HD.
The new DirecTV transmissions use downsampling of the resolution,
which I can see, and it makes me wonder why I would pay extra for hobbled
HD. Some PBS and commercial stations are now multicasting with the
HD data rate below 15 Mb/s (some down to 12 Mb/s), and the HD looks
perfectly awful if the scene isn't static.

We who enjoy the quality of HD also need to be vigilant as to the
quality of the HD we receive. If we don't make it known to our HD
program vendors that the HD we want shouldn't be hobbled, we will
quickly see the quality of our HD degrade significantly. It is
already starting.

Hope this gives some insight.

James Snyder

Post by HDTV Forum »

I was fascinated to read this thread because Stanley Donen, who directed Funny Face in the late '50s with Audrey Hepburn, was upbraided by the studio for not using their new VistaVision process to full capacity. VistaVision film cameras ran the film horizontally, as opposed to the usual vertical transport, giving higher clarity.
However, Funny Face was about the fashion world, and Donen and advisor Richard Avedon had to fight the studio heads to make certain shots grainy, fuzzy, low in color saturation, etc. in order to reflect the look of then-current fashion photography. Lucky for cinephiles, the artistic directors won over the tech directors. This film is a wonder of the use of color, focus and "smear," which was accomplished by using a still-photo lens over the VistaVision camera lens. It seems that sharpness and clarity are all people want, not artistry. No wonder the Impressionists were called heretics.

Tom