My computer (Windows XP) has a 64 MB video card (Jaton MX440) with S-video and composite outputs. My aim is to be able to view my desktop on the TV (Sony, 36", 4:3, HDTV, KD-36xs955). I'm willing to spring for a more capable video card if that's what it takes to get the best desktop image on the TV.
I haven't done this before, so I need some general guidance regarding what settings I should be paying attention to (frame rate, for example).
Also, if a better video card is necessary, I'd appreciate specific recommendations.
Thanks!
Al
Connecting a computer to an HD CRT
-
Richard
- SUPER VIP!
- Posts: 2578
- Joined: Wed Sep 08, 2004 1:28 pm
- Location: Atlanta, GA
27" CRT HDTV for PC and video applicationsMy aim is to be able to view my desktop on the TV (Sony, 36", 4:3, HDTV, KD-36xs955).
viewtopic.php?t=6653
Even though you have a far better-performing display, it will NEVER look like it does on a PC monitor, and you too will have overscan problems. Your CRT HDTV is nowhere near as accurate...
Now, if the purpose is gaming or videos rather than computer stuff, there might be some value, but for the most part you will be frustrated by the lack of clarity by comparison.
While you can't get it perfect, there are ways to improve things if you really want to pursue this, but expect to play with settings and such rather than simple plug-and-play.
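If it helps to put a rough number on the overscan, here's a quick back-of-the-envelope sketch; the ~5% per edge is just a typical assumption, not a measurement of your particular set:

```python
# Rough estimate of how much desktop survives CRT overscan.
# The 5% per-edge figure is an assumption for a typical CRT HDTV,
# not a measured value for the KD-36xs955.
def visible_area(width, height, overscan_per_edge=0.05):
    """Approximate visible pixel area after overscan cropping."""
    return (int(width * (1 - 2 * overscan_per_edge)),
            int(height * (1 - 2 * overscan_per_edge)))

print(visible_area(640, 480))    # (576, 432) of a 640x480 desktop remains visible
print(visible_area(1920, 1080))  # (1728, 972)
```

So plan on losing roughly the outer 5% of the picture on every edge, which is where things like the Windows taskbar tend to sit.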
-
alfonso
- Member
- Posts: 5
- Joined: Tue Dec 12, 2006 11:13 am
Thanks, Richard; playing .jpgs stored on my computer over the TV is what I had in mind, and I can probably live with some image cutoff if it's indeed necessary.
I did try connecting the TV to the computer via S-video, but the TV screen remained blank. I'm thinking I should be synchronizing the frame rates. I think the TV's frame rate is 30 fps (?), so I should set the video card to match that.
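If I have the arithmetic right, the "30 fps" figure comes from interlaced NTSC timing; here's the rough calculation, assuming standard NTSC values (noted mostly for my own reference):

```python
# Rough NTSC timing arithmetic, assuming the standard 60/1.001 field rate.
field_rate = 60 / 1.001      # ~59.94 interlaced fields per second
frame_rate = field_rate / 2  # two fields make one frame, ~29.97 fps
print(round(field_rate, 2), round(frame_rate, 2))  # 59.94 29.97
```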
Once I get something on the TV screen, I can try different resolutions to optimize things and maybe manipulate some other settings.
Am I approaching this right?
Thanks,
Al
-
Richard
- SUPER VIP!
- Posts: 2578
- Joined: Wed Sep 08, 2004 1:28 pm
- Location: Atlanta, GA
Got it!
alfonso wrote: playing .jpgs stored on my computer over the TV is what I had in mind
Check your card to make sure those video connections aren't inputs... if they are outputs, there may be a switch in your card's menu to turn it on.
Best performance would come from a card that can output component video and also force a generic 480p, 720p, or 1080i scan rate; you are looking for 480p or 1080i. If you happen to see a 540p option, select that instead of 1080i.
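To make those modes concrete, here's a quick sketch of how many lines each one draws per pass (standard timings assumed; your card's forced modes may label things slightly differently):

```python
# Lines drawn per pass for the scan rates mentioned above (standard timings assumed).
formats = {
    "480p":  {"active_lines": 480,  "interlaced": False},
    "720p":  {"active_lines": 720,  "interlaced": False},
    "1080i": {"active_lines": 1080, "interlaced": True},
}

for name, fmt in formats.items():
    per_pass = fmt["active_lines"] // (2 if fmt["interlaced"] else 1)
    print(f"{name}: {per_pass} lines per pass")
# 1080i draws 540 lines per field, which is why a "540p" mode, when a card
# offers one, is worth picking over 1080i here.
```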
This is likely far more card than you need but...
http://www.nvidia.com/page/8800_tech_specs.html
Integrated HDTV encoder provides analog TV-output (Component/Composite/S-Video) up to 1080i resolution
That's the feature you are looking for!
-
PenGun
- Member
- Posts: 7
- Joined: Tue Oct 31, 2006 12:49 am
Any of the Nvidia cards from 6200 up will do hardware decoding of mpeg2 streams, and also with "pure video" decoding of the H264 and standard AVC streams can be done.
-
alfonso
- Member
- Posts: 5
- Joined: Tue Dec 12, 2006 11:13 am
PenGun wrote: Any of the Nvidia cards from 6200 up will do hardware decoding of mpeg2 streams, and also with "pure video" decoding of the H264 and standard AVC streams can be done.
I guess I'm not understanding this: why does the information coming from the computer need to be decoded? Doesn't it have to be encoded for the (component) TV input?
Or were you talking about cards that can decode DTV signals?
Please pardon my confusion, I'm new to this video stuff.
Thanks!
Al
-
PenGun
- Member
- Posts: 7
- Joined: Tue Oct 31, 2006 12:49 am
Digital TV is just streaming MPEG-2 at the moment. As HD becomes more prevalent, the AVC (Advanced Video Coding) codecs, including H264, will be used more.
The Nvidia 6200 and up will decode this in "hardware" and unload your CPU.
For instance, if I play a 1920x1080 MPEG-2 file or stream at 20 Mb/s, about 95% of one core of my fairly powerful Opteron 165 at 2.4 GHz is needed to decode it. With my 6600 GT handling the decode it's under 40%.
The output of the video card in my case is either HDMI, which runs the TV as a digital frame buffer, or component, which presents an HDTV signal to the TV pretty much the same as a set-top box would. It's a wash for quality on my Sony, but either can be better depending on the setup.
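To put some rough numbers on why the decode is so expensive, here's a back-of-the-envelope sketch (4:2:0 chroma sampling and ~30 frames per second are assumed; the 20 Mb/s figure is the compressed stream from the example above):

```python
# Rough comparison of decoded (raw) video bandwidth vs. the compressed stream.
# Assumes 4:2:0 sampling (12 bits per pixel) and ~30 frames per second.
width, height, fps = 1920, 1080, 30
bits_per_pixel = 12                  # 8 bits luma + 4 bits chroma per pixel at 4:2:0
raw_bps = width * height * bits_per_pixel * fps
compressed_bps = 20_000_000          # the 20 Mb/s stream in the example

print(f"raw video:  {raw_bps / 1e6:.0f} Mb/s")            # ~746 Mb/s
print(f"compressed: {compressed_bps / 1e6:.0f} Mb/s")
print(f"ratio:      ~{raw_bps / compressed_bps:.0f}:1")    # ~37:1
```

The decoder has to expand that roughly 37:1 compression in real time, which is the work the card's "hardware" path takes off the CPU.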