I HAVE THE OPPORTUNITY... WAIT... THE CHANCE TO PURCHASE EITHER ONE OF THESE BEAUTIFUL CARDS FOR THE SAME PRICE: EIGHTY DOLLARS. WHICH ONE SHOULD I GO FOR, AND WHY? THEY'RE BOTH GONNA COST ME THE SAME, BUT I WANT TO KNOW WHICH I SHOULD GET BECAUSE I DON'T WANNA GET RIPPED OFF. I LOVE YOU 4CHANNELERS, THANK YOU ONCE AGAIN FOR HELPING ME WITH MY IMPULSE BUYING :)
Name:
Lynxis 2006-05-30 0:08
I'd have to say it depends on what you're planning to play. id Software-based games (Quake/Doom) will probably be faster on the GeForce 6600GT. Playing a Half-Life 2-based game? Then the Radeon x1600xt would be better.
Also, is the 6600GT 128MB or 256MB? What about the x1600xt? If the GT is only 128MB and the x1600xt is 256MB, then I would personally be leaning towards the x1600xt.
Name:
Anonymous 2006-05-30 0:15
both are 256mb
how do you figure id games work better on the gf?
where did you get this information?
as far as games go, i mostly play eroge and qw. but i want it for ED6:SC.. my current gf4 mx440 64mb really eats shit on this game...
also, ive never had an ati card. i dont really want to switch, but if its going to benefit me more, then why the hell not.
also, i heard these new cards require a molex connector to even work properly. is this true?
Name:
Anonymous 2006-05-30 0:30
wow. the fanboyism is rampant in this thread.
just remember one major thing: "Radeon Sucks Balls"
get the 6600gt. the performance is roughly the same anyway. the benefits of the nvidia card are "digital vibrance control" and "image sharpening" in hardware (*no* performance loss); you can enable both in the driver settings. the first one makes your colors richer and more vivid. they look pale without it, so i wouldn't be satisfied without it anymore. it's not just for games but the whole "desktop experience", you'll see when you enable it. the image sharpening does what it sounds like: it sharpens the image you see. if you have a crt this is the way to go, because fonts will look *very* sharp and crisp, even on a cheap screen like mine.
oblivion will work on both cards on medium (and some high) settings.
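for what it's worth, the vibrance knob can be scripted instead of clicked. a minimal sketch, assuming a Linux box where the proprietary driver's nvidia-settings tool exposes a [gpu:0]/DigitalVibrance attribute with a 0-1023 positive range (the attribute name and the range are assumptions; both vary by driver version, and the Windows panel of the era only had a slider):

```python
# Sketch: map a 0-100% "vibrance slider" position to the integer the
# driver attribute expects, and build the nvidia-settings command line.
# ASSUMPTION: the driver exposes [gpu:0]/DigitalVibrance with a
# 0..1023 positive range; older drivers used different ranges.

def percent_to_dv(percent: int, dv_max: int = 1023) -> int:
    """Clamp a 0-100 slider percentage and scale it to the driver's range."""
    percent = max(0, min(100, percent))
    return round(percent / 100 * dv_max)

def vibrance_command(percent: int) -> str:
    """Return the nvidia-settings invocation for the given slider position."""
    return f'nvidia-settings -a "[gpu:0]/DigitalVibrance={percent_to_dv(percent)}"'

if __name__ == "__main__":
    # roughly the "medium" (about 25% of the slider) setting people land on
    print(vibrance_command(25))
```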
Name:
Anonymous 2006-05-30 14:37
Radeon drivers are a piece of shit. The cards aren't that great either; performance- and power-consumption-wise, NVIDIA is ahead right now.
I'd spend a bit more and get a 7-series GeForce, the price is around the same as a 6600GT anyway.
As for molex, yes, GeForce 6600 GT 128/256MB cards have a molex connector for power, BUT that's the AGP version; PCI-E ones don't have an additional power connector as far as I know.
Oh yeah, that sharpening thing is pretty nice too, as #7 said.
Name:
Lynxis 2006-05-30 23:40
Well, for the record, I'm running an XFX 6600GT 128MB card in this PC right now, and I have some pretty serious complaints about it too. The main one is that it doesn't support old games with the newer Forceware drivers. I'm running Forceware 84.21 (the latest nVidia reference driver) and I cannot play the original Quake 1. Quake 2 shows graphical errors, and Quake 3 WILL NOT use trilinear filtering or anti-aliasing; even forcing them through the driver panel does not work.
Quake games aren't the only ones that don't work either. Final Fantasy 7/8 PC don't work properly or at all.
You probably don't care about any of the games I've listed but these are the only examples of older games that I have. I'm sure others have issues as well which I'm just not aware of. It's also possible that there is just something wrong with my particular card although that seems unlikely because these issues are remedied by going to an older driver version.
I've asked around various boards about this and the response is to use an older driver version if I'm going to play these games. I'm not exactly a fan of having to switch between drivers each time I feel like playing something different.
My roommate uses an ATi x800-series card and has none of these problems.
The digital vibrance control and image sharpening are nice, easy effects to implement on nVidia hardware. Trust me, similar effects ARE available for ATi cards; they just aren't as simple to get to or modify. I admit nVidia's control panel is simple yet effective, and I've heard a few people complaining about ATi's.
As far as raw performance, you'll find that both cards perform equally overall with an edge going to one or the other depending on the game played.
If you are comfortable with an nVidia card, then the 6600gt might be a good choice for you. If you'd like a change, maybe you should pick up the x1600xt.
We can't escape our biases but I try my best not to be clouded by them. I know lots of people who own cards from either company and I've personally bounced back and forth a decent amount. TNT2->GeForce2->Radeon 8500->Radeon 9600->Geforce 6600
I'm seriously considering my next card to be an ATi, for the record.
Name:
Anonymous 2006-05-31 2:09
Both seem to be neck and neck, but the 6600GT has a clear upper hand in games that use an OpenGL renderer, and there's the added comfort of Pixel Shader 3.0 support. Not to mention nVidia is the better of the two giants should you ever fiddle w/ Linux down the line.
My vote's for the 6600GT.
Name:
Anonymous 2006-05-31 15:49
>>10
x1xxx-series cards added pixel shader v3 too. www.omegadrivers.net has drivers better than the normal ATI ones.
If you can find an nVidia 6800GS, get one of those; they beat a radeon x1600XT pretty soundly and cost about the same. See http://www.driverheaven.net/reviews/X16_GS/index.htm for a comparison.
>>15
just don't overdo it with the sharpener. i usually set it to 5, which isn't that high, but gives a nice sharpening effect without fucking the image up. the digital vibrance however looks nice even on higher settings, like the 20 i usually use.
Name:
Anonymous 2006-06-01 13:05
>>17
holy shit.. the digital vibrance is nice.. but i set it to medium, which is only about 25% of the slider... i tried it maxed out and i almost went blind.
the sharpener.... i will leave that disabled because it looks like total fucking shit. i have a viewsonic vx2025wm, it already looked fresh out of the box. the digital vibrance helped a little though.
Name:
Anonymous 2006-06-01 18:56
>>18
lcds shouldn't need any kind of sharpening. on crts its very nice though.
Name:
Anonymous 2006-06-02 8:27
uh.. i just tried to play some games on it...
it crashes every time the game loads............
lol drivers?
too bad www.omegadrivers.net doesnt tweak nvidia drivers anymore
connect the power to the card if your pc is agp noob
other than that enjoy your reformat
Name:
Anonymous 2006-06-03 17:07
ok i figured out the problem... but the solution is not one that i particularly like
i was trying to use the card in my dual opteron box... i was monitoring the temperature of the card in riva tuner, and every time i ran a 3D game, the temperature steadily climbed from 59C to about 78-79C before the system locked. apparently the heat from the opterons SEATED DIRECTLY ADJACENT to the card was too much for it to handle. i would like to thank the GENIUSES AT TYAN FOR MAKING THIS HAPPEN.
anyway, i had an old 1.5ghz P4 box sitting around that i havent used in about a year... i threw the card in there, played some games, the shit works FLAWLESSLY. but i dont want to play games on that computer. for one, its 1.5ghz. what the fuck games can you play on that? also, i have 384mb of SDRAM on it. pc133 to be precise. its a slow as shit system.
i am pissed though. i cannot use the motherfucking card in the system i bought it for. what a bunch of bullshit
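the temperature climb described above is easy to watch for automatically instead of staring at a monitoring window. a minimal sketch, assuming a driver tool that can print the core temperature as a bare integer (the nvidia-smi query flags here are an assumption from later drivers; riva tuner itself has no command-line query), with the cutoff set a little under the ~79C lockup point:

```python
# Sketch: poll GPU core temperature and warn before the ~79C lockup
# point. ASSUMPTION: `nvidia-smi --query-gpu=temperature.gpu
# --format=csv,noheader` is available and prints a bare integer.
import subprocess
import time

LOCKUP_C = 79  # the card in this thread locked the system around here

def over_limit(temp_c: int, limit_c: int = LOCKUP_C, margin_c: int = 5) -> bool:
    """True once the core is within `margin_c` of the known lockup temperature."""
    return temp_c >= limit_c - margin_c

def read_temp() -> int:
    """Read the core temperature via the (assumed) nvidia-smi query."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip())

def watch(poll_s: float = 2.0) -> None:
    """Poll until the card gets close to the lockup temperature."""
    while True:
        t = read_temp()
        if over_limit(t):
            print(f"core at {t}C -- quit the game / add airflow before it locks")
            break
        time.sleep(poll_s)
```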
Name:
Anonymous 2006-06-03 17:45
got fans? anything to move some air around the card..
Name:
Anonymous 2006-06-03 23:35
you do not understand what i just got done explaining
i have TWO opterons sitting DIRECTLY NEXT TO THE FUCKING AGP SLOT
i would take a picture but im not getting underneath that desk again anytime soon