Hooray! ... damn.
Aug. 18th, 2007 02:41 pm
Hooray! After saving up for a while, we finally broke the New Computer Piggy Bank.
quelonzia finally got herself a decent computer -- and got an insanely great deal on it. But I'll let her rave about that in her own journal.
While we were at Fry's, we also went monitor shopping. I have a dual-monitor setup, and I've been staring at these huge old Dell Trinitron 17" CRTs for a couple of years now. I've been wanting to replace them with LCDs, both for desk space and less power consumption.
Lo and behold, Fry's also had some nice-looking 19" widescreen LCDs for all of $149. I replaced both my old tubes for what I thought I'd have to spend on one.
...and this is where "damn" comes in.
My video card has always had a quirk. It recognizes what kind of monitor is in the primary port just fine, and when you check the nVidia control panel or the Windows > Display > Properties tab, the make of the monitor pops right up.
The secondary port, however, doesn't seem to recognize the Plug-N-Play data. No matter what you plug into it, it just says "Generic Analog Monitor".
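(Side note for anyone else chasing this kind of problem: the plug-and-play identification is really just the monitor handing back a small EDID data block over the cable, and Windows caches whatever the driver manages to read under the DISPLAY keys in the registry. Something like the rough Python sketch below -- nothing polished, just the standard winreg module plus the documented EDID 1.3 layout -- can at least show whether the card ever got a usable EDID from each port. Caveat: the registry also keeps EDIDs from monitors that were plugged in previously, so the list is a history, not a live snapshot.)

```python
# Rough sketch (Python 3, Windows): list the EDID blobs Windows has cached
# for monitors, and pull the model name out of each one.  This only shows
# what the driver managed to read -- it can't fix a port that never asks.
import winreg

DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield the names of every subkey of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

def monitor_name(edid):
    """Pull the model name out of an EDID 1.3 block, if one is present."""
    # Four 18-byte descriptor blocks sit at offsets 54, 72, 90 and 108;
    # a descriptor tagged 0xFC carries the ASCII monitor name.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if len(d) == 18 and d[0:3] == b"\x00\x00\x00" and d[3] == 0xFC:
            return d[5:18].split(b"\x0a")[0].decode("ascii", "replace").strip()
    return "(no name descriptor)"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # no cached EDID for this instance
                print(model, "->", monitor_name(bytes(edid)))
```

If only one of the two new LCDs ever shows up in that list, that's a pretty strong hint the secondary port isn't reading EDID at all rather than just mislabeling it.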
With the Trinitrons, that was just an odd quirk. It displayed just fine, and gave me a wide array of resolution options.
With the LCDs, though... it still gives me a wide array of resolution options, and, unfortunately, none of them are the 1440 x 900 native resolution of the monitor.
And all the other options look terrible.
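(Second side note: the list of resolutions in Display Properties is just whatever mode table the driver decides to expose for that output, so the problem can be confirmed from code as well. Below is a rough Python 3 sketch that walks the outputs with the stock user32 EnumDisplayDevices / EnumDisplaySettings calls via ctypes; the 1440 x 900 check at the end is purely illustrative, and mirror or virtual adapters will show up in the list alongside the real ones.)

```python
# Rough sketch (Python 3, Windows): ask each display adapter output which
# modes its driver is willing to offer, and note whether 1440 x 900 is
# anywhere on the menu.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Display variant of the union flattened out; layout matches wingdi.h.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def list_modes(device_name):
    """Yield (width, height, refresh) for every mode the driver reports."""
    i = 0
    while True:
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(DEVMODEW)
        if not user32.EnumDisplaySettingsW(device_name, i, ctypes.byref(dm)):
            return
        yield dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency
        i += 1

dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(DISPLAY_DEVICEW)
i = 0
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    native = any((w, h) == (1440, 900) for w, h, _ in list_modes(dev.DeviceName))
    print(dev.DeviceName, dev.DeviceString,
          "-- offers 1440x900" if native else "-- no 1440x900 on the menu")
    i += 1
```

If the secondary head really never lists 1440 x 900, no amount of poking at Display Properties will add it; the extra mode has to come from the driver (or a different card) recognizing the monitor.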
The primary monitor is gorgeous: bright, crisp, high contrast. I've yoinked the second LCD, though, and replaced it with one of the old Trinitrons -- which now looks kinda dingy and squinty by comparison.
So... it's time for a new video card, too. My current one is a year and a half old. I don't remember the manufacturer, but it's running the nVidia GeForce FX 5500 chipset.
My question is this:
Is this a common problem with multi-monitor video cards, or is it just a quirk of this brand -- or even just this card in particular? If I go and shell out for another card, will I still have the same problem? Or will newer cards include the 1440 x 900 resolution (which is a pretty common standard amongst LCD widescreens, I understand) as an option even for "generic" monitors?
The new monitors accept both VGA and DVI inputs. They're currently using VGA, since that's all my old card provides (a deliberate choice, since the Trinitrons were, of course, VGA). If I get a card with dual digital outputs, will it be more likely to recognize and/or provide the proper resolution?
Thanks in advance.
UPDATE (21:33):
halfelf talked me through all manner of troubleshooting, and, when the answer ultimately turned out to be "time for a new video card", helped me make an informed decision. He also pointed out a good source for inexpensive DVI cables. Thanks!!
no subject
Date: 2007-08-19 06:10 am (UTC)
The good news is that if I get a postdoc, chances are excellent that my next employer will pay for my next laptop. It seems to be quite common; one of the few perks a postdoc GETS with any sort of funding is having their computer equipment paid for. :>