ATi or NVIDIA


Phopojijo

A Loose Screw
Nov 13, 2005
1,458
0
0
37
Canada
The_Head said:
I was comparing the 7800GTX to the X1800XT because they are similarly priced.
If you compare the X1800XT to the 7800GTX512, get the X1800XT, as it is much, much cheaper.


UT2004 will be fine on an X300, should be able to run on pretty good settings too.
Meh, the X1900XTX and the 7900GTX are within $30 of each other, and the 7900GTX blows it away in almost every situation.

(Except HDR+AA, but frankly I'm still near-certain ATI is cheating there and reverting to hardware-accelerated bloom + dynamic gamma... there's no way they'd have enough VRAM and bandwidth to truly render out and antialias HDR -- the Xbox 360's eDRAM is a tossup, I don't know, but for the cards we have now I can't see how it's possible.)

I'm mostly just saying don't make brand-wide (or even series-wide) assumptions based on one or two cards. People need to make informed decisions about what they buy -- hell, not even AMD is a smart choice in every situation anymore now that Intel is going gung-ho on Conroe. (Intel has GOT to be losing money on sales if they release at their target price... ~40% faster than the FX-60 for roughly half the price, plus three clocked-down models for cheaper.)
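For what it's worth, the VRAM side of that claim is easy to sanity-check with rough arithmetic. A minimal sketch follows; the resolution, sample count, and per-pixel byte sizes are illustrative assumptions, not measured specs of any card in this thread:

```python
# Rough framebuffer-size arithmetic for the HDR+AA bandwidth argument.
# Resolution and MSAA sample count below are illustrative assumptions.

def color_buffer_bytes(width, height, bytes_per_pixel, msaa_samples):
    """An MSAA color buffer stores one color value per sample per pixel."""
    return width * height * bytes_per_pixel * msaa_samples

W, H = 1600, 1200

# "32-bit color" target: RGBA8 = 4 bytes/pixel.
# FP16 HDR target: RGBA16F = 8 bytes/pixel.
ldr_4x = color_buffer_bytes(W, H, 4, 4)
hdr_4x = color_buffer_bytes(W, H, 8, 4)

print(f"RGBA8   + 4xAA: {ldr_4x / 2**20:.1f} MiB")
print(f"RGBA16F + 4xAA: {hdr_4x / 2**20:.1f} MiB")  # exactly double the LDR cost
```

Doubling every color read and write is where the bandwidth pressure comes from; whether that makes HDR+AA impractical on these cards is, of course, exactly the point being argued.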
 
Last edited:

The_Head

JB Mapper
Jul 3, 2004
3,092
0
36
36
UK
www.unrealized-potential.com
Phopojijo said:
Meh, the X1900XTX and the 7900GTX are within $30 of each other, and the 7900GTX blows it away in almost every situation.

(Except HDR+AA, but frankly I'm still near-certain ATI is cheating there and reverting to hardware-accelerated bloom + dynamic gamma... there's no way they'd have enough VRAM and bandwidth to truly render out and antialias HDR -- the Xbox 360's eDRAM is a tossup, I don't know, but for the cards we have now I can't see how it's possible.)
Ah, over in the UK the X1900XTX is far cheaper: around £350 for the cheapest XTX and around £400 for the GTX.

How can they be cheating? The effects look good -- as good as, if not better than, the NVidia cards' -- and they have a far better RAM architecture with the ring bus.
You say "except AA and HDR", but who would play games on cards this expensive without AA and HDR?
The only people I can think of are those who play CSS on low settings because they don't want to drop under 300fps...
 

Phopojijo

A Loose Screw
Nov 13, 2005
1,458
0
0
37
Canada
The_Head said:
Ah, over in the UK the X1900XTX is far cheaper: around £350 for the cheapest XTX and around £400 for the GTX.

How can they be cheating? The effects look good -- as good as, if not better than, the NVidia cards' -- and they have a far better RAM architecture with the ring bus.
You say "except AA and HDR", but who would play games on cards this expensive without AA and HDR?
The only people I can think of are those who play CSS on low settings because they don't want to drop under 300fps...
Because I still don't think it is AA and HDR... I think they turn off HDR and revert to a 32-bit color antialiasing method.

But yeah -- as for cheating... ATI actually was caught last generation rendering different images in screenshots than in framerate tests... it was quite humorous to hear ATI's response of "You see, that's just the complexity of the optimizations!", claiming the only reason the drivers refused to render the true anisotropic filtering pattern was that "it knows when it needs what quality".
 

The_Head

JB Mapper
Jul 3, 2004
3,092
0
36
36
UK
www.unrealized-potential.com
Phopojijo said:
Because I still don't think it is AA and HDR...
Well, it looks like HDR, which is what matters.
In this case it's not how you get to the outcome, it's the outcome itself.

Also, NVidia have been caught out with cheeky optimisations too, especially when the 6800 series came out. It's also widely accepted by all but NVidia fanboys that the picture quality on the new ATI cards far exceeds NVidia's.
I can find you numerous threads of people going from NVidia to ATI if you can't find any / are too lazy to look.
 

Phopojijo

A Loose Screw
Nov 13, 2005
1,458
0
0
37
Canada
The_Head said:
Well, it looks like HDR, which is what matters.
In this case it's not how you get to the outcome, it's the outcome itself.

Also, NVidia have been caught out with cheeky optimisations too, especially when the 6800 series came out. It's also widely accepted by all but NVidia fanboys that the picture quality on the new ATI cards far exceeds NVidia's.
I can find you numerous threads of people going from NVidia to ATI if you can't find any / are too lazy to look.
Oh indeed, nVidia's not a saint either.

However, in every optimization I've seen of nVidia's, you're allowed to shut it off. That was the key difference between ATI's and nVidia's brilinear aniso optimizations -- nVidia's drivers let you turn theirs off to bench without it. As a matter of fact, a lot of ATI's higher framerate postings came from benchmarkers unaware of ATI's optimizations: they turned off nVidia's optimizations but left ATI's on (since there was no command to disable them).

And yeah -- I've seen the screenshots; they're pretty much level, to be honest.

All this said, ATI does have a bunch of nice cards out there... research DEEPLY before you shell out over half a grand on silicon -- don't take a few benchmarks for granted. nVidia and ATI are so close there's no way you can give everyone a blanket statement on which card they should buy.
 

neilthecellist

Renegade.
May 24, 2004
2,306
0
0
San Diego, California
www. .
The_Head said:
I was comparing the 7800GTX to the X1800XT because they are approximately a similar price.
If you compare the X1800XT to the 7800GTX512 get the X1800XT, as it is much much cheaper.


UT2004 will be fine on an X300, should be able to run on pretty good settings too.

... What? ATi cards have always been cheaper than their nVidias. No duh, but so does quality.
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
Err... hardware-accelerated bloom + AA would look just as good as HDR in 99.999999% of games available right now. Part of the problem is that very few games actually use HDR to its full potential, when bloom can produce the same kind of rendering at lower cost.

I also can't see how HDR+AA can be done without huge framerate hits. In fact, the FPS numbers on benchmarks using both don't seem to have been hit nearly as hard as they should have been.
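The bloom-vs-HDR distinction above can be sketched in a few lines. This is a toy single-pixel model, not either vendor's actual pipeline; the Reinhard operator is one standard tone-mapping choice, and the bright-pass threshold is an illustrative assumption:

```python
# Toy model of the difference: bloom post-processes an already-clamped
# image, while true HDR keeps out-of-range intensities and tone-maps them.

def reinhard_tonemap(luminance):
    """Compress an unbounded HDR luminance into [0, 1)."""
    return luminance / (1.0 + luminance)

def bloom_brightpass(luminance, threshold=0.8):
    """Bloom extracts only the overbright part of a *clamped* image to
    blur and add back; anything above 1.0 was already lost to clipping."""
    clamped = min(luminance, 1.0)
    return max(0.0, clamped - threshold)

highlight = 4.0  # an HDR highlight at 4x display white
print(reinhard_tonemap(highlight))   # 0.8 -- highlight detail survives
print(bloom_brightpass(highlight))   # ~0.2 -- detail above white was clipped
```

A 4x-white and a 40x-white highlight bloom identically (both clamp to 1.0 first), which is why bloom looks "close enough" in most games while discarding exactly the information HDR exists to keep.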
 

omnius

New Member
Sep 29, 2004
14
0
0
Flexx said:
Dual-core CPUs are weaker only in old games that don't support them. UE3 will support dual-core processing, so an Athlon X2 3800+ is enough for upcoming games.
Thanks, I'm going to try RoboBlitz and see if it works.
 

Phopojijo

A Loose Screw
Nov 13, 2005
1,458
0
0
37
Canada
Sir_Brizz said:
Err... hardware-accelerated bloom + AA would look just as good as HDR in 99.999999% of games available right now. Part of the problem is that very few games actually use HDR to its full potential, when bloom can produce the same kind of rendering at lower cost.

I also can't see how HDR+AA can be done without huge framerate hits. In fact, the FPS numbers on benchmarks using both don't seem to have been hit nearly as hard as they should have been.
The thing I'm thinking about is that UE3 will use true HDR -- how will ATI cope with it?

None of the UT2007/BIA3/whatever screenshots are fully AA'd, even with Gearbox being ATI fans.

$5 says it's going to be a "patch" which just reroutes final frame control to the ATI drivers for 24-bit AA.
 
Last edited:

The_Head

JB Mapper
Jul 3, 2004
3,092
0
36
36
UK
www.unrealized-potential.com
neilthecellist said:
What? ATi cards have always been cheaper than their nVidia card counterparts. That goes without saying, but the quality between these counterpart comparisons is questionable.
That's better :)
I don't think there is any quality gap. I have certainly seen no evidence of NVidia being better quality; if anything, ATI seem to be better at the moment with the performance and price of the X1*00 series.
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
Electrolyte said:
When you say "true" you mean hardware-accelerated HDR -- the X1### series fully supports it.
We're discussing the ramifications of a fully anti-aliased HDR scene. ATI claims they can do this, but we don't believe them.
The_head said:
That's better
I don't think there is any quality gap. I have certainly seen no evidence of NVidia being better quality; if anything, ATI seem to be better at the moment with the performance and price of the X1*00 series.
I haven't seen much of ATI being cheaper this generation. In the US they are generally $50 more than the similarly performing card.
 

Phopojijo

A Loose Screw
Nov 13, 2005
1,458
0
0
37
Canada
Electrolyte said:
When you say "true" you mean hardware-accelerated HDR -- the X1### series fully supports it.
So ATI claims, but they've claimed plenty of things in the past. Like I said, let's wait for Unreal Engine 3 to come out and see how ATI deals with it. My guess is ATI will release another mysterious "HDR+AA" patch which (like my gut feeling says about the Oblivion patch) will just reroute HDR to the Catalyst drivers, force a "close but not quite" alternative, and pass it off as 100% authentic.

As for ATI being cheaper -- yeah, I've seen that; they're generally around $20-30 cheaper than a similar nVidia card... the thing is, that similar nVidia card may get its ass kicked by the ATI counterpart, or may be the one doing the ass-kicking (see: X1900XTX vs GeForce 7900 GTX 512 -- nVidia wins).
 
Last edited:

Homeslice

A mapper not in the zone
Right now I have a Pentium 4 (somewhere under 2 GHz), 768 MB of PC2700 memory, and an ATi Radeon 9550 SE card. That SO wouldn't be enough to run the game. I have a Pentium 4 3.6 GHz processor that I hope to install soon (with a new motherboard and 2 GB of PC3200 memory, eventually hoping to make it 4 GB), but I'm looking to spend no more than maybe $200 on a graphics card ATM. It looks like the most I can afford on the nVIDIA side is a 7600 GT. What about Radeon? What would be the best choice at that price, and will it run the game OK?