HDMI 1.3 increases its single-link bandwidth from 165 MHz (4.95 gigabits per second) to 340 MHz (10.2 Gbps) to support the demands of future high-definition display devices, such as higher resolutions, Deep Color, and higher frame rates.
The foundation has also been set for even higher bandwidths over a single link. With dual-link, if implemented, the 10.2 Gbps could effectively be doubled, if ever needed. How does the possibility of several 1080p/60fps streams running simultaneously over HDMI sound for future applications?
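Those bandwidth figures follow directly from the TMDS clock: three data channels each carrying 10 bits per pixel clock (8b/10b encoding). A quick back-of-the-envelope sketch, with the function name being my own:

```python
def tmds_bandwidth_gbps(pixel_clock_mhz):
    """Aggregate single-link TMDS bandwidth in Gbps for a given pixel clock."""
    channels = 3          # three TMDS data channels
    bits_per_clock = 10   # 8b/10b encoding: 10 bits per channel per clock
    return pixel_clock_mhz * channels * bits_per_clock / 1000.0

print(tmds_bandwidth_gbps(165))  # HDMI 1.0-1.2 single link -> 4.95
print(tmds_bandwidth_gbps(340))  # HDMI 1.3 single link -> 10.2
```

Doubling the link count doubles the channel term, which is where the "virtually doubled" figure comes from.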
[url=http://www.hdtvmagazine.com/articles/2006/07/hdmi_part_3_-_h.php]Read the Full Article[/url]
HDMI Part 3 - HDMI Version 1.3, Digital Connectivity at its
-
Rodolfo
- Author
- Posts: 755
- Joined: Wed Sep 01, 2004 8:46 pm
- Location: Lansdowne VA
-
ericlhyman
- New Member
- Posts: 1
- Joined: Thu Sep 09, 2004 1:10 pm
-
Rodolfo
- Author
- Posts: 755
- Joined: Wed Sep 01, 2004 8:46 pm
- Location: Lansdowne VA
Eric,
Since part of your question is about a matter that would be better answered by HDMI itself, I contacted them (Leslie Chard) to provide you with the following response:
Your question: For the lip sync feature to work, is HDMI 1.3 capability needed on both the source device and the AV receiver?
HDMI response>> The 1.3 lip sync correction functionality is required on the device that creates the lip sync problem (typically a display
-
LesMoss
- New Member
- Posts: 2
- Joined: Fri Aug 11, 2006 1:51 pm
Rodolfo wrote: ... in the future this functionality will be in DVD players, and most other CE devices. The reports that we are getting from manufacturers indicate that this function is very popular and will be widely implemented.
Yes, maybe in high-end equipment. But not in $100 DVD players and cost-minimized cable STBs.
The promise of HDMI will not be realized until Joe SixPack can plug his equipment into his TV with a single wire and get a good result.
-
videoengr
- Member
- Posts: 5
- Joined: Sun Jun 25, 2006 6:03 pm
As I understand the lip sync part of the HDMI 1.3 capability, a DVD player or STB can send the amount of its video delay (relative to the audio) down the cable, and the TV can then delay the audio by that amount plus the amount of video delay in the TV's own processing. If all works well, the audio comes out in sync with the video. Does the TV constantly monitor the incoming delay amount, and if the source changes, does the TV adjust the audio delay accordingly? What happens if the viewer changes the source, for example by selecting another device or changing channels on the STB? What happens when the source changes the delay by itself, as STBs have a habit of doing? When the video delay changes, does the audio delay in the TV jump instantly, causing pauses and clicks in the audio?
-
Rodolfo
- Author
- Posts: 755
- Joined: Wed Sep 01, 2004 8:46 pm
- Location: Lansdowne VA
videoengr (actual name?),
I could not respond until now due to time constraints, but I thought I would share with you a response from Joe Lee, Director of Marketing at Silicon Image.
-------------------
Response to your questions:
With HDMI 1.3, it is sink or repeater devices (i.e. TV or AV receivers) that report their amount of audio and/or video latency, not the source devices. Source devices (such as DVD players or STBs) typically will output the audio & video in fairly good synchronization.
This is expected because the source is reading the content material directly, and thus there are no intermediate processing steps that would cause the source to lose the reference information about audio & video synchronization.
With HDMI, the audio & video latency information is stored in a ROM chip in the device called the EDID ROM. This ROM stores information about the device's capabilities, such as supported audio & video formats. This EDID ROM is always read when a source device first powers up.
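(Aside: those latency bytes live in the HDMI Vendor Specific Data Block of the EDID. As I read the spec, each byte encodes latency in 2 ms steps, with 0 meaning "unknown" and 255 meaning "output not supported" — a rough sketch, with the function name and encoding details being my own reading, so treat them as assumptions:)

```python
def decode_latency(raw_byte):
    """Decode an HDMI VSDB latency byte to milliseconds.

    Encoding as I understand the HDMI spec: 0 means the latency is
    unknown, 255 means this output is not supported, and values
    1..251 encode latency as (value - 1) * 2 ms.
    """
    if raw_byte == 0:
        return None          # latency unknown
    if raw_byte == 255:
        return None          # output not supported
    return (raw_byte - 1) * 2

# Example: a TV with 80 ms of video processing delay would advertise
# the byte value 41, since (41 - 1) * 2 == 80.
print(decode_latency(41))  # -> 80
```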
When a source is connected directly to a TV, there is usually no issue with audio & video sync, since the audio/video leaves the source synchronized, and the TV has its own audio-delay electronics to compensate for the video delay resulting from the TV's video processing.
Since the TV knows how much video delay its own processing will impart, it can compensate by delaying the audio accordingly. The problem typically occurs when a user has an AV receiver between the source and the TV. In this instance, the AV receiver extracts and plays the audio (with no significant delay), and then sends the video to the TV (which often does significantly delay the video because of its processing).
Since the TV is only getting the video, it obviously cannot perform the audio delay that it normally would. And since the AV receiver does not know how much video delay the TV has, it does not perform the delay either. Today, a user will usually manually set the AV receiver to delay the audio by a specific amount, but the user must "guess" how much delay to dial in.
With HDMI, there is no guessing as devices will be able to report their video or audio delay effects, and the source or AV receiver will be able to compensate automatically with just the right amount.
With this architecture, the audio delay is set once and does not result in any glitches when channels are changed, etc.
--------------------------
Best Regards,
Rodolfo La Maestra
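To put some numbers on the compensation Joe Lee describes: the AV receiver simply holds the audio for the difference between the sink's reported video and audio latencies. A minimal sketch (the function name is hypothetical):

```python
def required_audio_delay_ms(sink_video_latency_ms, sink_audio_latency_ms=0):
    """How long an AV receiver should hold the audio so it stays in
    sync with video that the downstream TV will delay further."""
    # Never delay by a negative amount; if the sink's audio path is
    # somehow slower than its video path, the receiver adds nothing.
    return max(0, sink_video_latency_ms - sink_audio_latency_ms)

print(required_audio_delay_ms(80))      # TV delays video 80 ms -> hold audio 80 ms
print(required_audio_delay_ms(80, 10))  # TV also delays its audio 10 ms -> hold 70 ms
```

Because the sink's latency is a fixed property read once from the EDID, the delay is set once and does not jump around as channels change, which matches the "no glitches" behavior described above.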