Another Opinion - The Case for Outboard Video Processing

This forum provides a place for registered users to comment on and discuss Columns.
GuyOwen
Member
Posts: 13
Joined: Sat Jan 05, 2008 8:06 am

Post by GuyOwen »

Well, that's at least a start -- and good to know. I'm not arguing with the premise of your article in any way. I hope it did not sound like that. I enjoy your writing -- especially the idea you came up with for a website to share TV tweak settings with the average Owner.

What I'm frustrated about is the endless barrage of suggestions to bypass or replace the existing equipment that you get with your HDTV before you've even had time to plug it in. First, the Government is forcing the broadcast issues on us for our benefit -- then every expert says CRTs are still the best picture out there. Oops!! Next, the stores start playing their "You gotta add these extras to really enjoy it!" shtick. Then the magazines kick in with their Gary Merson-type reports about 81% of the sets all failing major tests. You aren't even getting what you paid for. The broadcasters aren't actually giving you true HD in many cases. No matter what your stereo setup might be, it's not good enough -- unless you rush out and buy an AVR, including HDMI 1.3a -- and you better have 4 inputs, at that. The Blu-Ray and HD-DVD camps can't even have polite conversations with one another. And the advice Forums are no haven from this madness because -- as we all know -- NONE of the sets on the market actually work well. They all have debilitating horrendous problems of one kind or another that are never mentioned in any professional Review -- they only arise AFTER you buy the set. And now I'm reading where NONE of the components or features on any set are worth a damn -- from the cables to the video processor.

Except God's TV -- the Pioneer Kuro.
It's the only perfect set out there... right?
You just will never see the picture in the daytime because of the horrendous glare.

When I ask the simple question of which processor is actually installed in the major models now available, nobody -- and I mean "nobody" -- can really answer that. They haven't even bothered to look. No Reviews list them, either. Nor compare them with after-market units.

The basic answers seem to be...
"Well, they all suck."
"Well, you didn't think you'd get a decent processor with that $3,000 TV -- did you??" -- (shudder the thought!)
"Well, no matter what processor it is, it has to be bad because I heard somewhere that some guy bought a stand alone video processor and he thinks the picture looked better when Mars was ascending."

Is this what this industry has come to this early in the game?
You say there are vastly varying degrees of video processing. I can agree and understand that. Can some of your associates start telling us which sets actually have the better ones -- with a comparison to, say, State-of-the-Art stand-alone video processors? As a Consumer I'm starting to feel that HD-anything is some big joke. And my empty wallet is the punchline.
rfowkes
Major Contributor
Posts: 77
Joined: Sat Sep 11, 2004 5:05 am

Post by rfowkes »

Guy,

Good question. One piece of advice I give to everyone serious about maximizing the flexibility of any display purchase is this: make sure that whatever display you are considering offers an input that matches the native resolution of the display. The reasoning is that if you choose to use an external VP to create that resolution, you are not tied to the display's internal processing. Because technology is always improving, this means you would always have the option of upgrading your video processing without having to purchase a new set. For a 1080p display, that would mean the ability to handle a 1080p input, etc.

Granted, this is sometimes a little harder with "720p" sets, which might actually be 768p sets, but that issue is disappearing as 1080p sets become more common. In the relatively early days of 1080p (circa 2005) it was not easy to find a 1080p set with 1080p input capability. Most models accommodated a maximum input resolution of 1080i. That meant you had to use the internal VP capabilities of the display at all times. With a 1080p input on a 1080p set, there was a good chance that a 1080p input signal would go directly to the display. (I say "good chance" because some early 1080p sets with 1080p inputs actually converted the signal to 1080i internally as an intermediate step.) When I purchased my first 1080p display (a 1080p HP MD5880n DLP RPM), it was one of the very few models in my price range with 1080p input capability, and it matched perfectly with my DVDO VP. (The others were a Brillian RPM and the Sony "Ruby" FP.) Even though I now also own a JVC D-ILA RS1 (also 1080p capable, including input), the HP still provides me with a remarkable picture.
GuyOwen
Member
Posts: 13
Joined: Sat Jan 05, 2008 8:06 am

Post by GuyOwen »

That's a great bit of advice. I like the idea of outboard VP if I can be shown (or read) some test that proves it adds a benefit. One thing I'm confused about is if you buy a new HDTV that has 4 HDMI Inputs...
a) Can I assume all the Inputs are equal? That they handle the same features? That all of them support the same flavors of Image and Sound?
b) Can I assume they're all 1080p if the Native Resolution is 1080p? You point out that early sets may not have been. How about since, say, 2007?
c) Is there any real advantage to simply plugging in my external HDMI devices directly to those Inputs?

I jump around a lot on my reading. I grab every resource I can find. I own a small 37" Sharp Aquos, but am looking to upgrade to a 50" Somethingorother. But the info sure seems to conflict a lot. Or, possibly, I just get confused and take things out of context. Then I think I find a "revelation"-type article -- and it further muddies the water.

For example, all these articles start declaring the weaknesses in the sets.
We're told we really need to go buy a decent AVR.
Then someone else jumps in and says "It doesn't matter what your upconverting DVD Player or AVR is trying to do -- your Set will convert everything to its Native Resolution. And in many cases you really can't turn that feature off on the TV. So you're wasting your money buying two devices that are trying to do the same thing."
Then the next article says "Yes, but one is better than the other. You just won't know which one until you test it." -- which begs the question "Isn't that a hugely expensive test?"
All of a sudden you need a Pre-Pro because a standard AVR just ain't worth the effort.
And an outboard Video Processor, but -- again -- you really won't know, for sure, that it's better-performing than the Processor in your TV until you buy one and try it.

Then one of the answers posted in this very discussion says the author is going to publish an interview with one of the creators of HDMI -- who is warning us that too many devices in the chain creates big handshake issues. Apparently, there's a delay here for that device, a delay there for that one, turn everything off in all those other devices you spent thousands of dollars on -- and hope the quality of the HDMI cables connecting everything together can actually hear that mysterious signal waaaaaayyyyyy down at the other end of this Rube Goldberg behemoth you created -- and respond properly. Like a whisper from a wishing well.

And I have to wonder why we shouldn't just plug a device that matches the Native Resolution into one of those four HDMI Inputs that we paid $3,000 for -- and be done with it? Surely, the bundled Remote for the TV can switch from Input to Input easier than we can figure out that $600 Universal Remote thingy we were told we just have to buy?

Except then you warn about the Input maybe not matching the Native Resolution. What about the Inputs on those AVRs and VPs? Don't they all need to match the final resolution of the target?

So, I will definitely watch out for that. And reconsider buying a VP instead of an AVR if I can find some Reviews telling me precisely how much this helps. The difficulty on the HDMI Inputs seems to be that those details may not be listed in the Specs. Or may be hidden in some carefully-worded phrase. Example -- but off your point a bit -- there is an upconverting HDMI-equipped Yamaha AVR that many people rave about. I keep seeing it recommended to people buying 1080p sets. How frustrated are they going to be when the words that declare "Upconverts to 1080p" are discovered to mean "But not over HDMI"? It only upconverts to 1080p over Analog. Minefields are everywhere in this industry.
akirby
Major Contributor
Posts: 819
Joined: Mon Jul 09, 2007 2:52 pm

Post by akirby »

My advice is to buy the display you like making sure it's 1080p and accepts 1080p input over HDMI. Then hook up your source device directly to the TV via HDMI and feed it 1080p and see what it looks like. If you're happy then stop there. If you're not happy and want better video performance then start worrying about external processors.

The difference between the built-in processors and stand alone or external processors is rapidly dwindling. Whether you'd be able to see the difference (or care) is something we can't answer.
terrypaullin
Member
Posts: 50
Joined: Thu Oct 14, 2004 6:22 pm

Post by terrypaullin »

GuyOwen wrote:Is this what this industry has come to this early in the game?
You say there are vastly varying degrees of video processing. I can agree and understand that. Can some of your associates start telling us which sets actually have the better ones -- with a comparison to, say, State-of-the-Art stand-alone video processors? As a Consumer I'm starting to feel that HD-anything is some big joke. And my empty wallet is the punchline.
Guy, I'm sorry you feel so "betrayed" by the Consumer Electronics Industry. I know at times it must feel like a giant conspiracy, but I promise you, it's not. "Conspiracy" implies some sort of organization, and even a casual observer can see that we are anything but organized. I don't argue with any of the points you made. Indeed, I suspect a reasonable percentage of this readership holds a similar view. While I don't claim to be a spokesman for the dark side, I AM "in the business" and I'd like to offer, well, another opinion.

I'll speak to your issues in the order of the original post.

Yes, CRTs are still the gold standard of visual display. Ironically, after 50+ years of perfecting the technology, we are now tossing them aside because they are big, clunky and ugly. The world has become enamored with flat screens.

Gary Merson is one of the few product reviewers whose advice you can take to the bank. He knows his stuff and he calls a spade "a friggin' shovel".

The technical spec that "officially" defines HD is quite flexible. It allows for several formats and resolutions. While all the content providers ARE giving us HD, not all HD is created equal. Depending on the codec used, HD can be compressed at various rates at different times and on different channels.

The high definition optical disc war is over. Blu-Ray won.

God's TV is not perfect. It STILL has to be ISF calibrated.

Stand-alone video processing and the processing built into any TV are an apple and an orange. VP vendors such as Silicon Optix and Anchor Bay sell "chip sets" to TV manufacturers, and each manufacturer implements them differently, so, yes, you can have two products that carry the same logo (processing technology) with quite different results.

Net-net, you have to depend on the product reviews, but the important corollary to that is, you have to know the reviewer and his/her credentials (more on that in a future column).

Yes, all products suck, but with varying degrees of suction. What I mean by that is, there is no such thing as a perfect TV. We know how to build them, but you (or any of us) could not afford one. ALL CE products are a compromise between performance perfection and affordability. We get to vote where on that scale we want to be with our wallets.

In defense of our relatively fledgling Home Theatre Industry: it certainly isn't perfect, but it's way better than what we had 10 or even 5 years ago. It's complex, extremely competitive and moves at lightspeed - not unlike the Home Computer business we have come to accept. HD is not a joke. Properly implemented, it is the most significant and best-received development in Consumer Electronics since Bonanza was broadcast in color in the late 50s.

Guy, it's easy to slip into information overload. Don't let it get you down. Keep reading voraciously. You will eventually find voices you trust and those you discard. Enjoy what you have and upgrade when you feel confident you have the right data.
rfowkes
Major Contributor
Posts: 77
Joined: Sat Sep 11, 2004 5:05 am

Post by rfowkes »

Guy,

Lots of information to cover in your recent post but I'll try to answer a couple of your questions with a few general comments. I'll start by using the same lettering scheme you had in your post:

(a) All HDMI inputs on a display (assuming multiple HDMI inputs) are probably the same. I say this because it is probably most cost-effective for manufacturers to use the same type of chips for each input (probably even sharing some of them, since only one input is generally active at a time, except for PIP).

(b) Whether the HDMI inputs on a 1080p display are native 1080p is a little harder to pin down. In fact, in the early days of 1080p sets the majority were not; they accepted only 1080i. However, I think the great majority of HDMI inputs on 1080p displays in today's market are 1080p capable. But to make sure, you have to dig through the specs. Don't trust word of mouth; look for the specs. Some manufacturers made it a practice to hide this information (or list it in obscure places) because their 1080p sets couldn't accept native 1080p input. And there's another variable here as well. As I stated earlier, even some sets that had 1080p inputs didn't pass that signal directly to the display but first converted it to 1080i and then back to 1080p. Some early Sonys did this, and while I don't recall the exact reason, I seem to recall it had to do with internal design issues. I'm told by those who look at this at the schematic level that more recent sets have modified their innards to accommodate 1080p throughput, so the need for 1080p -> 1080i -> 1080p conversion (and all the possible problems it might introduce) is no longer there. So no, it's not always clear whether 1080p sets have native 1080p inputs, although the situation is definitely getting better. Manufacturers are always competing for the "knowledgeable" market (at least to some extent), and having "pure" 1080p inputs is definitely a plus here, as are other things such as 1080p/24, etc. It's a race for "best in show" in these cases.

(c) Of course, the ideal situation would be to have 1080p source material connected directly to a native 1080p input on a 1080p display, with no intervening video processing. But it's not always that simple, because most people use their displays to watch an abundance of source material: SD DVDs, HD media, broadcast material (Dish, cable, OTA, etc.) and much else. The viewer is often at the mercy of the provider where the resolution of the source material is concerned, and there are no across-the-board standards. Add to this the very real possibility of some "hidden" VP going on, throw in all the differences between 60 and 24 fps material (video vs. film), and there is definitely a need for video processing somewhere.

As was pointed out, one cannot realistically expect even a good set to contain the same level of video processing performance as a dedicated video processor (standalone or part of a pre/pro/AVR). That just doesn't make any economic sense. Terry's remark that the cost of the VP circuitry in the average display is probably about the same as the cost of the packaging the unit came in is spot on, in my opinion. And let's not forget that if you rely solely on the VP in your display, you are married to that VP for the life of the set, even as VP circuitry improves (it always does). With access to the full resolution of the display via native inputs, you can always upgrade your VP and prolong the life of your display with an even better picture.

The people who state that all digital displays can only display the native resolution of the set are correct in that regard. All such displays must include video processing to change everything to the native resolution of the panel, or you would get no picture. However, when they venture into "so why bother with an external video processor?" territory, they are ignoring the fact that most display VPs are not state of the art, or, even if they are, you are locked into using them.
Yes, all video processors "do the same thing" but some do it much better than others.

I was the person who mentioned that multiple HDMI devices in a daisy chain (like a source to a pre/pro to a VP to a display) can compromise the HDMI signal in terms of handshaking, etc. Shane will soon be posting an article I wrote on this which goes into detail so I won't repeat it here. Look for the article for more clarification. My source for a lot of the information contained there was Jano Banks of Radiient Technologies, the co-inventor of HDMI.

Yes, there are many minefields in Home Theater, even more so when a new technology like HDMI, which is a bidirectional technology, is introduced. A lot of people aren't familiar with "handshaking," and they often make assumptions based on outdated analogies. While I've always been a fan of pre/pros instead of receivers (since good amps last a very, very long time, and you could potentially save some upgrade money using pre/pros instead of AVRs), the price of excellent AVRs has dropped so much in recent years that I can ignore the amplification stages of an AVR. I'm currently using a very nice performing AVR, the Denon 3808ci, in my system. It doesn't have all the VP bells and whistles of the "bigger" Denons, but I don't need them since I use a DVDO VP50. I can live with the fact that there are some receiver amps I'm bypassing (it also lets the pre/pro section perform better). I'm very impressed with the performance of the latest Denons. I actually replaced a Lexicon MC-8 pre/pro, which didn't offer HDMI capabilities, so I'm speaking from experience with both "boutique" brands and more mainstream units. As to future planning, I'm currently looking at the new Denon pre/pro as my possible next upgrade: no unnecessary amps, and video processing that rivals the DVDOs and Lumagens et al. out there. If I take the plunge (I'll wait until after CEDIA 2008), I can get rid of a lot of "boxes" and probably make HDMI perform even better in the process. But that's another subject for another time.

-RAF