GeForce 6800 or Radeon X800


6800 or X800?

  • GeForce 6800

    Votes: 16 36.4%
  • Radeon X800 XT

    Votes: 28 63.6%

  • Total voters
    44

Thanatos45

Frag-tastic
Mar 19, 2004
342
0
0
38
Dutchieland
Since both the GF 6800 Ultra and the Radeon X800 XT will blow away every other card currently on the market, I'd just go with whichever one is cheapest.
 

PainAmplifier

Evil by Example
Radeon X800 XT would be my choice. A cooler card that takes up less space is worth more to me.

The GF6800 shows that Nvidia just might recover from the 5x series debacle...now they just need to make sure they avoid the benchmark wacky-weed, to recover some of their lost trust as well. ATI learned their lesson after the Quake/Quack thing...now can Nvidia learn this lesson too? (FarCry/FartCry isn't damning...but where there's smoke...they'd better be making beef jerky and not sniffing the fumes.)
 

JaFO

bugs are features too ...
Nov 5, 2000
8,408
0
0
Ati ... then you don't need to buy a new power supply. ;)
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
I always vote for nVidia, because I bought this ATI and I don't like it at all.
 

m0nk

New Member
May 3, 2004
25
0
0
The x800 is a benchmarking card. The 6800 is a feature/enthusiast card. Both rock hardcore.


ATI takes the anti-aliasing crown. Nvidia takes the OpenGL crown (Doom3 anyone :))
Since Nvidia opted to include Shader Model 3.0, they had to increase their die size to accommodate it, which hit performance a bit. ATI will have to suffer the same fate next generation.

Also, to dispel the myth: the 6800 does not require a 480W power supply. I have seen many benches that ran it on a 350W. Nvidia also released a statement saying the only reason they suggested 480W is that serious OCing needs that kind of power.

Also keep in mind that the benches you see now pit ATI's final driver against Nvidia's beta driver. I gather there is a problem with the way Nvidia's drivers handle the z-buffer that unnecessarily slows them down in DX9 games.

Let's also hope ATI doesn't repeat what they did with a recent batch of cards. Evidently there have been quite a few problems OCing them :)

The spinmeister has spun 8)

Since I am an avid OCer, 6800 all the way.
 
Last edited:

JaFO

bugs are features too ...
Nov 5, 2000
8,408
0
0
You might not need a new power supply, but they do require two leads from separate outlets in the power supply to keep it stable.

Of course there's also the fact that nVidia uses one hell of a cooling-solution (that could make the card hover if it wasn't so heavy ;)), while Ati still manages to keep it within sane limits.

Seriously, this is insane:

[image: the nVidia card's cooler]

compared to

[image: the Ati card]

OC'ing is for penny-pinching losers and their wannabe geek friends.
Of course you're going to run into problems if you go past the official limits.
Overclocking is not safe for the average user no matter how many people manage to do it without problems.

You could blame Ati for not having as big an error-margin in that department ... or you could blame nVidia for underclocking their products.
 

TomWithTheWeather

Die Paper Robots!
May 8, 2001
2,898
0
0
43
Dallas TX
tomwiththeweather.blogspot.com
I've been an nVidia guy on my home PC, but I've used nothing but ATI at work. Both have their strengths and weaknesses. nVidia has dual-monitor hardware acceleration while ATI only accelerates one monitor (not sure about the new cards, though). I can't really tell a difference in image quality, seeing as I play mostly fast-paced FPS games. ATI usually seems able to squeeze out a few more frames per second, though not enough to really matter.

That card with the massive cooler is probably just the reference model. Usually third-party companies like Albatron or MSI use smaller, better coolers.
 

m0nk

New Member
May 3, 2004
25
0
0
JaFO said:
You might not need a new power supply, but they do require two leads from separate outlets in the power supply to keep it stable.

Oh wow, so hard to hook up two 12V Molex connectors to a card...

JaFO said:
Of course there's also the fact that nVidia uses one hell of a cooling-solution (that could make the card hover if it wasn't so heavy ;)), while Ati still manages to keep it within sane limits.

I agree, but the card is built for the enthusiast. The overclocker.

JaFO said:
OC'ing is for penny-pinching losers and their wannabe geek friends.
Of course you're going to run into problems if you go past the official limits.
Overclocking is not safe for the average user no matter how many people manage to do it without problems.

So wrong...not everyone is rich. Some people want to buy a graphics card and stretch it for two years. That won't happen nowadays unless you OC.

OCing is for people who love to tweak. Who love to figure something out. Who love to stretch their hardware to the limit.

Seriously, why do you have to reply like a big giant penis?

The reason I say that the 6800 is an enthusiast card is because it was meant to be overclocked. To reiterate what I said:

m0nk said:
The x800 is a benchmarking card. The 6800 is a feature/enthusiast card. Both rock hardcore.

So go and buy your ATI. Both cards ROCK...leave the overclocking to those that want to do it.
 
Last edited:

TomWithTheWeather

Die Paper Robots!
May 8, 2001
2,898
0
0
43
Dallas TX
tomwiththeweather.blogspot.com
Overclocking usually doesn't get you enough extra fps to make a noticeable difference, much less last you two years, unless you go crazy with dry-ice/liquid-nitrogen cooling or something. Basic Coolbits overclocking is mainly a mental thing. It's rarely worth the extra heat and wear & tear on your card. :hmm:

"OMG!!!1 I get 150 in UT2004 with teh Gefarce 6800!!" Why would you need to go higher? You won't notice a difference other than the little number in the corner of the screen is slightly higher. :rolleyes:
 

m0nk

New Member
May 3, 2004
25
0
0
TomWithTheWeather said:
Overclocking usually doesn't get you enough extra fps to make a noticeable difference, much less last you two years, unless you go crazy with dry-ice/liquid-nitrogen cooling or something. Basic Coolbits overclocking is mainly a mental thing. It's rarely worth the extra heat and wear & tear on your card. :hmm:

"OMG!!!1 I get 150 in UT2004 with teh Gefarce 6800!!" Why would you need to go higher? You won't notice a difference other than the little number in the corner of the screen is slightly higher. :rolleyes:

It depends; you're looking at it from one side. Maybe not for UT2004, but in Far Cry, for instance, it can make a 15 fps difference, if not more. And that's tons considering my already aging 5900 only pushes 50 fps in the good areas...in the bad ones it hits 20 fps. But since I have it OCed...35 ;)

UT2004 is a more processor-intensive game than anything, and I OC that as well.

Question: if OCing made absolutely no difference, how come the new OCed Alienware system coming out promises a 70% performance increase over another computer built from the same components at factory settings? Of course you pay with your soul if you buy it...but technically, that system will outlast the other by 70%.

Even if it doesn't hold up at 70%, that's still kicking the pants off the competition...

And yes, with OCing, I bet you can hold out for two years. Not this time, though. DX9 is bringing a revolution with it...
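For what it's worth, the gain quoted for the overclocked 5900 above works out as straightforward percentage arithmetic:

```python
# Percentage gain from the overclock numbers quoted above (a 5900 in
# Far Cry's worst areas); purely arithmetic, no hardware claims.

stock_fps = 20.0   # stock clocks, bad areas
oced_fps = 35.0    # same card overclocked

gain = (oced_fps - stock_fps) / stock_fps
print(f"overclock gain: {gain:.0%}")  # prints "overclock gain: 75%"
```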
 
Last edited:

I_Like_Fire

Smokin Bad Ass
May 4, 2004
349
0
0
52
Why not do it yourself? It doesn't take all that much, as long as you can read, research it, and have common sense. I have my processor overclocked from 2.8 to 3.6, but I use water cooling, so heat isn't really an issue.
 
Last edited: