VGA and RCA ?

Discussion in 'General Electronics Chat' started by Mathematics!, Jan 28, 2010.

  1. Mathematics!

    Thread Starter Senior Member

    Jul 21, 2008
Ok, I have made an S-video to RCA composite video cable (yellow plug).
The signals are both analog and mostly the same, and it works fine.
This thread talks about that if you're interested.

What I am wondering now is whether a cable can be made as simply as the S-video to RCA composite one I made, but for VGA to RCA component (green, blue, and red phono plugs).
From my understanding they're both analog video signals; VGA uses RGB, and RCA component uses something close to it: luminance (Y) plus the difference signals R - Y and B - Y?

I also plugged the composite RCA video (yellow plug) into the component input's green jack and got a black and white picture on the screen, but when I plugged the yellow plug into the red or blue component jacks I got no signal. I am wondering why composite video into the green component input gives a clear black and white image but the other component jacks won't.

Is this something to do with the yellow plug carrying both luminance and chroma info, while component breaks the signal apart into luminance Y and the difference signals
Pb = B - Y
Pr = R - Y

Either way, the main question is whether you can directly make a VGA to RCA component cord. And if you can do that, then you should also be able to make a VGA to composite cable that at least gives a black and white image.

I'm curious how to convert RGB to the component signal. I know how to do the conversion in software using a matrix (Wikipedia shows the math), but how to implement it in a circuit/hardware I don't know.

But how do you do the equivalent of the software matrix conversion in hardware? Either way, if I directly connected G, R, and B to Y, Pr, and Pb, I should see some type of image, because the only difference is the Pr = R - Y and Pb = B - Y subtraction. It seems to me I would get the wrong colors on the screen, but there should still be an image, maybe something like the opposite color space.

    When I get a chance I will verify this or do some testing.
    Wondering if anybody has done anything with this stuff before and knows the answers.

I guess the main issue in converting RGB into Y, R - Y, and B - Y is how to obtain green. Getting red and blue back is just Y + (R - Y) and Y + (B - Y), so hardware-wise that is just a combining circuit for each, but how to get green from Y I don't know.
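As a sanity check on the math, here is how that matrix conversion looks in software. The 0.299/0.587/0.114 luma weights are the standard analog (BT.601) ones, which I am assuming apply here; a hardware version would do the same arithmetic with resistor networks and op-amps.

```python
# RGB -> YPbPr (component) using the standard BT.601 luma weights.
# Inputs are normalized to 0..1; Pb and Pr come out in the range -0.5..0.5.
def rgb_to_ypbpr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b    # luminance is a weighted sum
    pb = 0.5 * (b - y) / (1.0 - 0.114)       # scaled B - Y difference
    pr = 0.5 * (r - y) / (1.0 - 0.299)       # scaled R - Y difference
    return y, pb, pr

# Pure white has full luminance and no color difference:
print(rgb_to_ypbpr(1.0, 1.0, 1.0))
```

Note that any grey input (r = g = b) gives Pb = Pr = 0, which fits the observation above: a mono signal on the Y (green) jack alone still produces a black and white picture.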
    Last edited: Jan 28, 2010
  2. BMorse

    AAC Fanatic!

    Sep 26, 2009
  3. Mathematics!

    Thread Starter Senior Member

    Jul 21, 2008
Well, I found this link as well; it looks pretty straightforward.
Don't know if it will work though.

Also, a question: if component is Y, R - Y, and B - Y,
and VGA is RGB, then would G, G - R, and G - B be close enough to get some picture, maybe with different colors?

I mean, how does the component signal know from Y how much green it needs? Obviously R = Y + (R - Y), etc., so all you would have to do is combine the Y and R - Y signals, like a summing mixer, to get the red/blue data back. But I still don't see how you get green. Is it just the leftovers, i.e. once you have red and blue, is green just Y - (R + B)?
If this is so then the circuit would be just a few additions and subtractions.

If that is the case then I am all set: to convert component into VGA, you combine Y with R - Y, combine Y with B - Y, and then use those two outputs as inputs to get the green.
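The mixer stage described above can be built as an ideal op-amp summing amplifier. A quick sketch of the arithmetic it performs (the resistor values here are illustrative, not from this thread): an inverting summer followed by a unity-gain inverter computes Y + (R - Y), which is R.

```python
# Ideal inverting summing amplifier followed by a unity-gain inverter.
# Stage 1: vx = -(rf/r1)*v1 - (rf/r2)*v2 ; stage 2: vout = -vx.
def summing_amp(v1, v2, rf=10e3, r1=10e3, r2=10e3):
    vx = -(rf / r1) * v1 - (rf / r2) * v2   # inverting summer
    return -vx                              # inverter restores the sign

# Recovering red from component signals: Y + (R - Y) = R
y, r_minus_y = 0.6, 0.1
print(summing_amp(y, r_minus_y))  # ~0.7, the red level
```

With all four resistors equal the stage is unity gain; unequal ratios would give the weighted sums needed elsewhere in the conversion.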

To go from VGA to component I am still debating it... I don't know how to generate the Y from RGB. Is Y just R + G + B? If so, then to convert from VGA to component I would form R + G + B, and once I have Y I can make the differences R - Y and B - Y.

If that is the case, then converting from VGA to component wouldn't be much harder than the reverse.
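On the Y question: in the analog standards Y is not a plain R + G + B sum but a weighted one, Y = 0.299R + 0.587G + 0.114B, and that weighting is exactly how green comes back out; it is "the leftovers" of the luma sum once red and blue are known. A decode sketch, assuming those BT.601 weights and the usual Pb/Pr scaling:

```python
# YPbPr (component) -> RGB, assuming Y = 0.299 R + 0.587 G + 0.114 B.
def ypbpr_to_rgb(y, pb, pr):
    r = y + pr * 2.0 * (1.0 - 0.299)         # undo the scaled R - Y difference
    b = y + pb * 2.0 * (1.0 - 0.114)         # undo the scaled B - Y difference
    g = (y - 0.299 * r - 0.114 * b) / 0.587  # green is the rest of the luma sum
    return r, g, b
```

So the hardware needs two adders plus one weighted subtract-and-scale stage for green; a simple Y - (R + B) would give the wrong answer because the three weights are unequal.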

Maybe there is more to the story.
However, I have converted S-video to composite, and I am wondering why the quality from the DVD player's S-video to my TV's RCA composite is so good, but when I hook up my PC's S-video and the RCA composite to the TV, the quality is noticeably worse. I don't understand how you can lose quality over the same cable length, plus or minus a few feet; that shouldn't make much difference. Is there some setting on the graphics card I have to set, like resolution, to get the equivalent quality and sharpness? I don't know.

But this doesn't make sense: same signal, same length should mean same quality. I would even go ahead and say the quality of my graphics card is far better than the DVD player's.

    Any help would be great in clearing up this issue.
But the main issue is a VGA to component (and vice versa) circuit without an IC. (I don't want to use any chips I have to send away for; I should be able to do this with simple circuits built from scratch and maybe some op-amps.)

    Thanks again
  4. bertus


    Apr 5, 2008
  5. BMorse

    AAC Fanatic!

    Sep 26, 2009
The link you show is for RGB component output from VGA. (You will have to have a component input on your TV; those would be separate red, green, and blue connectors.) This is different from VGA to composite (a single yellow connector on the device for input).
  6. Mathematics!

    Thread Starter Senior Member

    Jul 21, 2008
Ok, I will look more into the conversion between VGA and RCA component and vice versa.

But does anybody know why the quality of S-video to RCA composite from a PC is not as good as S-video to RCA composite from a DVD player or VHS player?

There must be something with the screen resolution I can set, because I would think that if I come close to matching my graphics card's resolution to the TV's resolution, I should see the same quality as when going from a DVD player to the TV.

I mean, the graphics card and the DVD player are both outputting NTSC.
The cord length was the same in both tests, so it is not the length.
So the only thing it could be is a mismatched resolution.

800x600 pixels is the lowest resolution the graphics card slider goes down to, but I think I can go into the advanced settings and set an even smaller resolution, or zoom in. Maybe I can use the DPI setting to compensate for not being able to go lower; my graphics card lets you increase or decrease the DPI (dots per inch).

Wondering if anybody knows what I have to set a computer video card to in order to get the display as crystal clear as the DVD player's. Maybe there is no way, but if there isn't, I would like somebody to give me a reason beyond a doubt.
There must be a standard TV resolution for those old CRT TVs.

    Last edited: Jan 28, 2010
  7. studiot

    AAC Fanatic!

    Nov 9, 2007
You have a whole raft of issues to overcome converting a computer graphics standard to a TV (video) standard.

Different frame rates
Different scan regimes
Different gamuts
Possibly a different field order

    It's not just a question of pixels.

    This task can only be accomplished well in studio scan convertors costing $$$$$$$$$$.
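To put rough numbers on those differences, here are the nominal scan figures for VGA 640x480 at 60 Hz versus NTSC (these values are my addition, not from the post above):

```python
# Nominal scan parameters: why a passive cable can't bridge VGA and NTSC.
vga = {"h_scan_khz": 31.469, "v_rate_hz": 59.94, "scan": "progressive"}
ntsc = {"h_scan_khz": 15.734, "v_rate_hz": 59.94, "scan": "interlaced"}

# The horizontal scan rate alone differs by a factor of two:
ratio = vga["h_scan_khz"] / ntsc["h_scan_khz"]
print(f"VGA scans lines {ratio:.3f}x faster than NTSC")
```

The factor of two is no accident (the 640x480 mode doubled the NTSC line rate), but the sync structure, blanking intervals, and interlace still differ, which is why a resistor-only cable cannot do this conversion.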
  8. Mathematics!

    Thread Starter Senior Member

    Jul 21, 2008
So are you telling me there is no way I can make it crystal clear just by manipulating the video settings on my graphics card?

Some of these settings I think I can set, but I don't know the equivalent settings that the DVD player's output uses.

If I am going to have to get $$$$$ of additional hardware then forget it.
I mean, the image is still good for video, but when I display a text document I can't read it very easily. I can keep changing back and forth to some ridiculously large font, but that gets to be a hassle. Still curious why text looks so bad.

I saw that RadioShack was carrying an RF modulator for audio/video signals that modulates them onto a TV carrier, either channel 3 or 4, but the output is a coaxial cable output. Would it be possible to plug an antenna directly into the output port of the RF modulator, put an antenna on my old analog CRT TV, and broadcast the video/audio to the TV wirelessly? Or is it just meant for cable output and not an antenna?

Theoretically this should work, but I don't know if it will get enough range; all I need is a room away, about 30 ft or less. If not, I am assuming I can add a power amplifier to the output of the RF modulator to boost the range.

    Thanks for your comments.
  9. rjenkins

    AAC Fanatic!

    Nov 6, 2005
    There is no good reason the S-Video output from a PC video card shouldn't give excellent quality on an analog TV.

    I use S-Video from one of my media center PCs to a JVC analog widescreen TV, with the PC video set up for primary display on the TV out and resolution set to 1024x768.

    As Studiot says, VGA scan rates are completely different to TV scan rates; converters do exist, but they are active devices.

    For the best possible connection, if you have an LCD or Plasma with HDMI input, you can connect directly to this from a PC DVI monitor output - the signals are compatible, all you need is an off-the-shelf DVI to HDMI cable (or DVI plug to HDMI Socket adapter block plus standard HDMI cable).

With that setup, you can run full HD 1080p video, if your PC VGA card is up to it.
  10. Mathematics!

    Thread Starter Senior Member

    Jul 21, 2008
I realize the DVI/HDMI option, but this TV is old and only has composite and coaxial.

I am doing the VGA-to-component project for a friend of mine whose TV has only coaxial, component, and composite inputs (and the composite input is broken on his TV).

His laptop only has VGA (no S-video, too bad :( )

And I don't want to try VGA to coaxial; I would think that would be harder to do than the component conversion.

As for the quality of S-video from a PC to RCA composite on an old CRT TV: could you tell me what your graphics card settings are? I have an NVIDIA card.

Also, does anybody have any opinion on this?
  11. studiot

    AAC Fanatic!

    Nov 9, 2007
    It's more than the scan rate.

All the 'lines' on a TV screen make up one 'frame'.


    The TV system actually only addresses half the lines on the screen each time it passes from top to bottom.

    The other half don't change.

    This is called a field.

    To address the rest of the lines a second field is needed.

    Two fields make one frame.

Thus the overall 'frame rate' in the US is 29.97 Hz.

    The system is known as interlacing.

Digital systems are fast enough to address all the screen lines each time and use what is known as progressive scan.

    Here is a good demonstration of the difference.
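The arithmetic behind those numbers, for the NTSC case (the 525-line count and the 60000/1001 field rate are the standard values, not stated above):

```python
# NTSC interlace arithmetic: two fields of alternating lines make one frame.
lines_per_frame = 525
field_rate_hz = 60000 / 1001           # ~59.94 fields per second (color NTSC)
frame_rate_hz = field_rate_hz / 2      # two fields per frame
lines_per_field = lines_per_frame / 2  # 262.5 lines, hence the half-line offset

print(round(frame_rate_hz, 2))  # ~29.97 Hz
```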
  12. Mathematics!

    Thread Starter Senior Member

    Jul 21, 2008
Yes, I understand how interlacing and progressive scan work, but I don't understand how this would hurt the quality of an almost-still screen image, like words on the screen.

When I have action, like a movie playing through the computer, the quality is just as good as a regular DVD player's.

So I don't get what you're getting at with the loss-of-quality issues.

Somebody previously said he could get equivalent quality; I am just wondering what he set his graphics card to in order to obtain this. Did he put it on the lowest resolution?
  13. studiot

    AAC Fanatic!

    Nov 9, 2007
All conversion is about a compromise of effort versus benefit. VGA/video conversion is a pretty onerous task, so circuits across the range of available complexity will do a more or less effective job; naturally, the more complex (expensive) the circuitry, the better the conversion.
Whilst modern fast PCs can take on much of the conversion burden, there are physical differences between the standards that can only be addressed in hardware, and some that cannot be fully addressed at all.

I should mention that, in general, it is impossible to get exactly the same picture on both a TV and a computer screen.
This is because the colour gamuts are different: the PC can call up and display colours that are not available on TV, and vice versa. So unless the picture is composed solely of colours common to both systems, there will be changes in translation.

Another issue to be addressed is pixel shape. PC pixels are square, whereas standard-definition video pixels are rectangular, so a conversion needs to be effected.

Much of this has been dealt with in the convergence of the technologies, but it needs to be considered.

So if you, or your friend, get some good picture conversions (to your eyes), that's great. You just have to accept that you will not get them all. Further, the settings that work best for one source material will be inappropriate for another.
  14. rjenkins

    AAC Fanatic!

    Nov 6, 2005
Pixel shape and scan format are not really relevant to analog TV, as the scan is not aligned to pixels. (As a point of note, PC resolutions are generally based on square-ratio pixels, e.g. 4:3 monitors at 640x480, 800x600, 1024x768, etc.)
Also, many older analog VGA PC monitors used interlaced scan at 1024x768 and above.

    I used Gigabyte GA-MA69GM-S2H boards in my media center machines.
    These have integral ATI Radeon X1250 graphics with VGA, DVI, HDMI, S-Video and Component video out. (Plus optical s/pdif).

    As mentioned, for the analog JVC TV, I have the video set to 1024x768 and the display size / alignment tweaked using the adjustments in the ATI display control panel.

The TV is not quite up to displaying that actual resolution, and small PC text is a bit fuzzy, but you can see the difference between a DVD and a Blu-ray disc.

    PC Video cards appear to be able to detect when an analog device is connected and the TV setup options are often disabled if a TV is not detected. Some drivers have a 'force detect' option.

    I've found you get the best results if the TV is set as the primary display and no other monitor is connected when the machine is booted. This seems to allow most flexibility in settings.

    The other option is use dual display mode and the option to clone the displays (rather than separate or expanded desktop). In this mode, most drivers have an option to auto maximise any video, so you can eg. play a video file in a small window on the PC monitor and it plays full screen on the TV.

I've never needed to do any colour adjustments or other signal 'tweaks'; it's purely been a matter of matching the display to the TV screen size, which is sometimes quite difficult. (Try going back a year or so on the driver version if you have problems.)