Because Nvidia and ATI are neck and neck, and ATI's drivers suck. Obviously, when there's a real disparity the cheapest option wins, but right now Nvidia takes the performance crown flat out, and when it comes to bang for buck in the mid range there is no clear winner.
~Jason
Eh... I think it's more fair to say that ATI's drivers DID suck, because they're pretty solid nowadays (even in Linux - imagine my surprise). I never have to worry about which drivers are best for which game, because all my games run pretty well no matter what version of Catalyst I'm running. I've upgraded to every new release since 10.6 (we're now up to 10.12) and I haven't seen any decrease in quality or performance in the games I regularly play. I have, however, noticed slight performance increases in very new games.
The new CCC that is currently in beta is also pretty good. It's at least as good as nVidia's control panel, which may not be saying much.
Small sample size here, but the only issues I've had with the AMD drivers were Borderlands locking up with VSync enabled for a couple of versions, and that weird glitch a while back where PopCap games wouldn't draw to the screen properly. In general, I've just learned not to upgrade my drivers unless a new version fixes a particular game, improves performance, or adds new functionality.
I played Borderlands for some time with all settings turned on (including vsync) and never experienced any lockups or crashes. What's your other hardware? Catalyst version? I could test on my own system to see if I get similar results.
There is one thing I really don't like about ATI, and that is the GLSL implementation. GLSL on ATI is apparently much stricter than it is on nVidia, so if a GLSL application isn't coded properly it will crash on startup. It's definitely not a deal breaker, and I've only had a problem with one specific application: a GLSL shader mod for Minecraft. The problem was eventually fixed, so it's irrelevant to me now.
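For anyone curious what that looks like in practice, here's a made-up minimal fragment shader (not the actual Minecraft mod's code) showing the classic culprit. nVidia's compiler historically accepted C-style sloppiness that the GLSL 1.10 spec forbids, while ATI's compiler rejects it:

```glsl
#version 110
// Illustrative example only -- not taken from any real shader pack.
uniform sampler2D tex;
varying vec2 texCoord;

void main() {
    // Implicit int-to-float conversion: nVidia's lenient compiler lets
    // this slide, but GLSL 1.10 has no implicit conversions, so ATI's
    // stricter, spec-conforming compiler fails to compile the shader.
    float brightness = 1;   // spec-correct form would be: float brightness = 1.0;
    gl_FragColor = texture2D(tex, texCoord) * brightness;
}
```

If the application never checks the compile status (glGetShaderiv with GL_COMPILE_STATUS) and just links and runs anyway, you get exactly the crash-on-startup behavior described above.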
I'm pretty much in agreement with DarkED though. The 5850 and higher cards are more than ample for today's games, they draw less power, and they're quieter than the Fermi cards were when they launched (I think I've read the newer Fermis have better power usage and fans).
Yeah, at the time that was the biggest selling point for me besides the low price; I already knew that this card would run everything at max detail, but I was building a new rig on a budget so I wanted a card that wasn't too power-hungry.
To be fair, that Nvidia driver cock-up was rare for them, and they fixed it quickly and had it pulled.
I use both Nvidia and ATi, and ATi does seem to just have fail programmed into it. Especially with Black Ops: where I struggled to push 30 FPS at 800x600 on ATi, I had zero issues running 60 FPS at 1920x1200 on Nvidia, with cards that put out roughly the same FPS in other games (UT3, for example).
I guess it's really personal preference, but I wouldn't go ATi again.
Well, from what I've heard, Black Ops is problematic on a lot of cards from both manufacturers (as well as on the consoles?).
Already did on this system, remember? It didn't work, I was disappointed. Not that it is the card's fault, but I'm not looking to buy ATi stuff again.
Sorry, I must have missed that, or forgot. Why didn't it work?