ATi or NVIDIA


zakapior

New Member
Apr 22, 2006
neilthecellist said:
Erm... why aren't you shooting for 512 MB of VRAM?
256 MB of VRAM is enough. I think that, like in FEAR, if we have 128 MB of VRAM we can't set maximum texture detail, only medium, and performance is a bit worse.
For me, the engine that runs FEAR and Condemned: Criminal Origins is currently the best on PC, and Unreal Engine 3.0 should be a little better with similar PC hardware requirements (maybe more CPU power for more advanced physics). Oblivion sucks; it looks like the 3DMark03 Nature test and its optimization is ****ed up.
So at 5,000-6,000 3DMark05 points I think Unreal Engine should have good performance, and there is no reason to panic. All this "quad SLI is a must-have" stuff is just marketing bull****.
:lol:
Obviously, if you want to play at 1600x1200 with 4x FSAA and 16x aniso, you need something better.
For most players 1024x768 or 1280x1024 (LCD native) is enough.
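
As a rough illustration of why texture detail is usually the first casualty on a 128 MB card, here is a back-of-the-envelope sketch; the texture counts and the DXT1 format are assumptions for illustration, not actual FEAR or UE3 asset data:

```python
# Rough sketch: VRAM used by a level's textures (illustrative numbers,
# not actual FEAR/UE3 asset data).

def texture_mb(width, height, bytes_per_texel, mipmaps=True):
    """Size of one texture in MB; a full mip chain adds roughly one third."""
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mipmaps else 1) / (1024 ** 2)

# Assumed scene budget: 400 DXT1-compressed maps (diffuse + normal pairs).
high_detail   = 400 * texture_mb(1024, 1024, 0.5)  # 1024^2, 0.5 bytes/texel
medium_detail = 400 * texture_mb(512, 512, 0.5)    # one mip level lower

print(f"high detail   ~ {high_detail:.0f} MB")    # ~267 MB: spills past 128 MB, tight on 256 MB
print(f"medium detail ~ {medium_detail:.0f} MB")  # ~67 MB: fits comfortably
```

Dropping one mip level cuts the texture pool to a quarter, which is roughly the difference between "high" and "medium" fitting in VRAM.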
 

The_Head

JB Mapper
Jul 3, 2004
UK
www.unrealized-potential.com
Quad SLI for max settings leads me to believe that the game can scale really high and look awesome.
Single graphics card and a decent system will be fine for people who play at 1024x768.
I myself will be aiming to play at 1280x1024 on decent-to-high settings (probably with an X1900). But not being able to play it on high isn't the end of the world to me.
 

zakapior

New Member
Apr 22, 2006
The_Head said:
Quad SLI for max settings leads me to believe that the game can scale really high and look awesome.
Single graphics card and a decent system will be fine for people who play at 1024x768.
I myself will be aiming to play at 1280x1024 on decent-to-high settings (probably with an X1900). But not being able to play it on high isn't the end of the world to me.
SLI or CrossFire is enough to have an awesome-looking game.

Quad, for me, is like science fiction.
For that money I would buy a PS3 or an Xbox 360.
The quad price is just sick :lol:
Epic only used quad SLI in the presentation because the game is unoptimized and they cooperate with NVIDIA. For NVIDIA it is the best advertisement.
And who needs 100-200 fps? :lol:
 

neilthecellist

Renegade.
May 24, 2004
San Diego, California
zakapior said:
256 MB of VRAM is enough. I think that, like in FEAR, if we have 128 MB of VRAM we can't set maximum texture detail, only medium, and performance is a bit worse.
For me, the engine that runs FEAR and Condemned: Criminal Origins is currently the best on PC, and Unreal Engine 3.0 should be a little better with similar PC hardware requirements (maybe more CPU power for more advanced physics). Oblivion sucks; it looks like the 3DMark03 Nature test and its optimization is ****ed up.
So at 5,000-6,000 3DMark05 points I think Unreal Engine should have good performance, and there is no reason to panic. All this "quad SLI is a must-have" stuff is just marketing bull****.
:lol:
Obviously, if you want to play at 1600x1200 with 4x FSAA and 16x aniso, you need something better.
For most players 1024x768 or 1280x1024 (LCD native) is enough.

Hm. You're making sense, although, if the forums allow, I must ask you this question.

In the far future, you'll have to upgrade your computer hardware, right? By then, we'll be on 19" monitors. Hell, my friends are all purchasing 21" ones right now and I'm stuck on 19". A bigger monitor, obviously, as you would know, gives you more screen real estate, especially at higher resolutions. So my question is: wouldn't you want a video card with 512 MB of VRAM right now so you don't have to suffer at higher resolutions later when you get a bigger monitor?

And if you're not getting a bigger monitor any time soon... That's just weird.
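
For a rough sense of how much of the card's memory the screen itself eats as resolution and AA go up, here is a small frame-buffer sketch; it assumes a 2006-style double-buffered renderer and ignores textures and geometry, which sit on top of these numbers:

```python
# Back-of-the-envelope frame-buffer cost at various resolutions (assumes a
# double-buffered renderer: two 32-bit colour buffers, a 32-bit
# depth/stencil buffer, and optional MSAA surfaces on top).

def framebuffer_mb(width, height, msaa=1):
    colour = width * height * 4 * 2                        # front + back buffer
    depth  = width * height * 4                            # 24-bit depth + 8-bit stencil
    extra  = width * height * 8 * msaa if msaa > 1 else 0  # multisampled colour + depth
    return (colour + depth + extra) / (1024 ** 2)

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):5.1f} MB plain, "
          f"{framebuffer_mb(w, h, msaa=4):5.1f} MB with 4x MSAA")
```

Even at 1600x1200 with 4x AA the raw buffers stay well under 100 MB; the real reason 512 MB helps at high resolution is that the remaining room for textures shrinks.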
 

zakapior

New Member
Apr 22, 2006
neilthecellist said:
Hm. You're making sense, although, if the forums allow, I must ask you this question.

In the far future, you'll have to upgrade your computer hardware, right? By then, we'll be on 19" monitors. Hell, my friends are all purchasing 21" ones right now and I'm stuck on 19". A bigger monitor, obviously, as you would know, gives you more screen real estate, especially at higher resolutions. So my question is: wouldn't you want a video card with 512 MB of VRAM right now so you don't have to suffer at higher resolutions later when you get a bigger monitor?

And if you're not getting a bigger monitor any time soon... That's just weird.
I have an 18" LCD; it's like a 19" CRT. I don't need more.

Buying a DX9 card now is just a waste of money.

The R600, with its unified architecture, will be a good option.
The NVIDIA G80 seems to be a joke: half DX10, without a unified architecture.

If you want a higher resolution like 1600x1200, a single X1900 XTX will be a good choice.
 

Nordoch

New Member
Dec 17, 2004
Well, thank you all for your input... I've decided to go for the X1800XT, and I guess the drivers should be the Omega drivers, right?
 

dragonfliet

I write stuffs
Apr 24, 2006
Kang the Mad said:
Could you run UT2004 on an ATI X300?
I'm sure you could if you don't mind the lack of spiffy effects and the annoying sound of your poor graphics card crying.
 

Phopojijo

A Loose Screw
Nov 13, 2005
Canada
The_Head said:
YA RLY!
*cough*
Nice load of interactive benchmarks there; this one is particularly impressive, although I suppose it's running on high settings, which is good for ATI (if you want high AA and AF, choose an X1800XT over a 7800GTX).
Check out the low-quality bench on FEAR; it seems the X1800XT wins again.
Ooh, just found one that the 7800GTX wins: if you play Black & White at low res with no AA or AF (aka fugly), the 7800GTX pulls a 6 fps lead.
Even in Doom, which is an OpenGL game, the 7800GTX barely pulls a lead at low res; put it at high res and the X1800XT pulls ahead.
I think your point is proven wrong until you have some further proof.



WTF is that supposed to mean? How on earth is that relevant?
You're comparing a 256 MB card to a 512 MB card...?

Duh? Caching out to system RAM and shuttling data across the PCIe bus generally kills performance. Go figure. Especially with the PCIe bus already busy with the CPU vertex deformation of meshes streaming back and forth for each frame of animation and each tick of physics.

Try the 7900GTX single 512 vs. the X1900XT single 512, where the FPS (even in Half-Life 2) are pretty much equal, even with the 7900GTX beating the X1900XTX at high res + AA/AF.

Especially with ATI repeatedly cheating in their benchmarks.

Sure, nVidia's drivers have the same optimization as well -- but they have the option to turn it off, AND they declare it's even there BEFORE benchmarkers discover it and email them about it more than twice.
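
To put rough numbers on the "PCIe bus already busy" point, here is a bandwidth sketch; the vertex count, vertex layout, and frame rate are illustrative assumptions, not engine measurements:

```python
# Rough bandwidth sketch for streaming CPU-deformed meshes over PCIe each
# frame (vertex count, layout, and frame rate are assumptions, not engine data).

VERTEX_BYTES    = 32        # position + normal + UV, roughly
VERTS_PER_FRAME = 500_000   # assumed animated vertices re-uploaded every frame
FPS             = 60

upload_gb_s   = VERTS_PER_FRAME * VERTEX_BYTES * FPS / 1e9
pcie_x16_gb_s = 4.0         # PCIe 1.x x16 theoretical, one direction

print(f"vertex upload  : {upload_gb_s:.2f} GB/s")
print(f"PCIe x16 budget: {pcie_x16_gb_s:.1f} GB/s "
      f"({100 * upload_gb_s / pcie_x16_gb_s:.0f}% of it, before any texture spill)")
```

Once textures that don't fit in local VRAM start spilling over the same link, that remaining budget disappears quickly, which is the effect being described above.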
 

The_Head

JB Mapper
Jul 3, 2004
UK
www.unrealized-potential.com
Phopojijo said:
You're comparing a 256 MB card to a 512 MB card...?

---snip---
I was comparing the 7800GTX to the X1800XT because they are approximately the same price.
If you compare the X1800XT to the 7800GTX 512, get the X1800XT, as it is much, much cheaper.


UT2004 will be fine on an X300; it should be able to run on pretty good settings too.
 

Juguard

The King Is Dead, Punk Rock Lives!
Nov 30, 1999
Tustin CA, USA
I would just get a card when UT2007 is out. Don't worry about the brand too much, since the battles are so close. I don't think ATI or Nvidia will have a huge lead, one over the other, when UT2007 is out.

Personally, I love NVIDIA; I've been gaming with them since the TNT2 Ultra. Never had problems, no matter what system I used them in. ATI drivers just don't do it for me; I've had problems with them in the past.
 

omnius

New Member
Sep 29, 2004
I'm sure my GPU will be fine (Radeon X1600), but I'm more worried about my CPU (AMD Athlon™ 64 X2 Dual-Core 3800+). Is it OK for 2007?
 

zakapior

New Member
Apr 22, 2006
omnius said:
I'm sure my GPU will be fine (Radeon X1600), but I'm more worried about my CPU (AMD Athlon™ 64 X2 Dual-Core 3800+). Is it OK for 2007?
:lol: :lol: :lol:
The AMD Athlon™ 64 X2 Dual-Core 3800+ was too weak a CPU even for 2005.
Stupid question, so here is the answer.
:lol: :lol: :lol: :lol: :lol:
 

The_Head

JB Mapper
Jul 3, 2004
UK
www.unrealized-potential.com
zakapior said:
:lol: :lol: :lol:
The AMD Athlon™ 64 X2 Dual-Core 3800+ was too weak a CPU even for 2005.
Stupid question, so here is the answer.
:lol: :lol: :lol: :lol: :lol:
What sort of retarded post is that?
The X2 3800+ is a great chip, and I guarantee it will perform better than most single-core processors that will be around when UT2007 comes out (presuming the game is multithreaded, as has been promised).
I'd like to see where your idea of it being a weak chip for 2005 comes from.
Give it a bit of overclocking and it can match a 4600+ with little difficulty.
 

Saverous

New Member
Mar 29, 2006
Dual-core CPUs are weaker only in old games that don't support them. UE3 will support dual-core processing, so the Athlon X2 3800+ is enough for upcoming games.
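
A quick way to sanity-check how much a second core can buy is Amdahl's law; the parallel fractions below are illustrative, since how much of a UE3 frame actually runs on the second core isn't public:

```python
# Amdahl's law: speedup from a second core when only part of the frame
# can run in parallel.

def speedup(parallel_fraction, cores=2):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for p in (0.0, 0.3, 0.6, 0.9):
    print(f"{int(p * 100):2d}% of frame multithreaded -> {speedup(p):.2f}x on a dual core")
```

So whether the X2 3800+ is "enough" depends mostly on how large that multithreaded fraction turns out to be.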
 

dragonfliet

I write stuffs
Apr 24, 2006
Juguard said:
I would just get a card when UT2007 is out. Don't worry about the brand too much, since the battles are so close. I don't think ATI or Nvidia will have a huge lead, one over the other, when UT2007 is out.

Actually, I would wait for the new ATI card (I forget the codename) that will support DX10 with a Unified Shader Architecture (USA), or, if NVIDIA wakes up in time to fit the G80 with USA, it should be a close choice between the two; but waiting an extra month or three (if it's delayed for the Vista launch) will be well worth it, as you'll have a decently futureproof GFX card.
~Jason