ADC->DVI worth doing?

Discussion in 'Video Hardware' started by Michael L Kankiewicz, Apr 5, 2008.

  1. Hi all. I have a Quicksilver with the dual port VGA & ADC card. I picked
    up a relatively inexpensive 19" Starlogic LCD that looks very average
    with the VGA connection - actually not too great. It does have a DVI
    input. If I get an ADC->DVI adapter to use the monitor's DVI input from
    the ADC port, will that improve the image quality, or would that just be
    a waste of time/money?

    Michael L Kankiewicz, Apr 5, 2008

  2. Michael L Kankiewicz

    David Empson Guest

    It will depend on your LCD screen. If your screen is good enough quality
    then the DVI input will produce a better image than the VGA input. Some
    cheap monitors aren't much better with DVI than with VGA. Do you have
    access to another computer with DVI output which you could use for a
    comparative test before buying the adapter?

    I have a Dell UltraSharp 24" widescreen LCD monitor, which has DVI and
    VGA inputs (along with lots of others). I'm primarily using it with a
    PowerMac G4 QuickSilver 2002, which has a video card with ADC and VGA
    outputs.

    I normally have the monitor hooked up to the ADC port via a DVI adapter (I
    got a cheap Taiwanese one).

    I occasionally plug the monitor into other computers, using either DVI
    or VGA according to convenience.

    There is a significant drop in video quality if I use the VGA input. DVI
    (or ADC via the DVI adapter) is much better quality. VGA is noticeably
    noisy and somewhat blurry. DVI is much sharper.
    David Empson, Apr 6, 2008

  3. Michael L Kankiewicz

    Dolores Park Guest

    I have a 22-inch 16:10 Samsung that's connected to my MacBook via DVI and
    to my PC via VGA. At first I was really unhappy with how bad the VGA
    input looked. Then, with a page of text covering most of the screen,
    I noticed that there were maybe 8 columns of alternating blurriness and
    clarity. When I covered the screen with a bitmap of alternating black
    and white pixel-sized dots, it became really obvious.

    This monitor has some image adjustments that it labels "coarse" and
    "fine". Using the coarse setting, going one way produced more & more
    alternating columns; going the other way reduced it to two columns,
    with the blurriness covering only the right-hand 5th of the screen.
    Adjusting the fine setting "just so" eliminated all of the blurriness.

    For my purposes, VGA now offers the same clarity as DVI. It appears
    that the problem wasn't really with the monitor, per se, just with
    the "auto-adjust" circuitry it uses to sync its electronics with the
    VGA scan rate. If you have similar adjustments and a suitable test
    pattern, you may want to try tweaking it a bit.
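    If your monitor doesn't ship with a suitable test image, you can generate
    one yourself. Here's a minimal pure-Python sketch that writes an
    alternating-pixel pattern (vertical 1-pixel stripes or a pixel-sized
    checkerboard, like the bitmap described above) as a PGM file most image
    viewers can display full-screen. The resolution, filename, and function
    names are my own illustrative choices, not anything from this thread --
    set WIDTH and HEIGHT to your screen's native resolution.

    ```python
    # Writes a 1-pixel alternating test pattern as a binary PGM (P5) file.
    # Such a pattern makes VGA pixel-clock/phase misadjustment very visible.

    WIDTH, HEIGHT = 1920, 1080  # change to your monitor's native resolution

    def make_pattern(width, height, checkerboard=True):
        """Return raw 8-bit grayscale pixels, row by row: a pixel-sized
        checkerboard, or alternating black/white columns if checkerboard
        is False."""
        rows = []
        for y in range(height):
            row = bytearray()
            for x in range(width):
                # phase alternates every pixel; adding y shifts odd rows,
                # turning the stripes into a checkerboard
                phase = (x + y) if checkerboard else x
                row.append(255 if phase % 2 else 0)
            rows.append(bytes(row))
        return b"".join(rows)

    def write_pgm(path, width, height, pixels):
        """Write pixels as a binary (P5) PGM image."""
        with open(path, "wb") as f:
            f.write(b"P5\n%d %d\n255\n" % (width, height))
            f.write(pixels)

    if __name__ == "__main__":
        write_pgm("test_pattern.pgm", WIDTH, HEIGHT,
                  make_pattern(WIDTH, HEIGHT))
    ```

    View the resulting file at 100% zoom with no scaling; any scaling will
    blur the pattern and defeat the purpose of the test.
    
    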
    Dolores Park, Apr 6, 2008
