256 MB VRAM is enough. I think that, like in FEAR, if we have 128 MB VRAM we aren't able to set maximum texture detail, only medium, and performance is a bit worse.
neilthecellist said: Erm.... Why aren't you shooting for 512 MB on your VRAM?
With SLI or CrossFire, this is enough to have an awesome-looking game.
The_Head said: Quad SLI for max settings leads me to believe that the game can scale really high and look awesome.
Single graphics card and a decent system will be fine for people who play at 1024x768.
I myself will be aiming to play at 1280x1024 on decent-to-high settings (probs with an X1900). But not being able to play it on high isn't the end of the world to me.
zakapior said: 256 MB VRAM is enough. I think that, like in FEAR, if we have 128 MB VRAM we aren't able to set maximum texture detail, only medium, and performance is a bit worse.
For me, nowadays the engine that runs FEAR and Condemned: Criminal Origins is the best on PC, and Unreal Engine 3.0 should be a little better and have similar PC hardware requirements (maybe more CPU power for more advanced physics). Oblivion sucks, looks like the 3DMark 2003 Nature test, and its optimization is ****ed up.
So at 5-6000 3DMarks in 3DMark05, I think Unreal Engine should have good performance and there is no reason to panic. All this "quad SLI must-have" is just marketing bull****.
Obviously, if you want to play at 1600x1200 with 4x FSAA and 16x aniso, you must have something better.
For most players, 1024x768 or 1280x1024 (LCD native) is enough.
I have an 18'' LCD; it is like a 19'' CRT. I don't need more.
neilthecellist said: Hm. You're making sense, although, if the forums allow, I must ask you this question.
In the far future, you'll have to upgrade your computer hardware, right? By then, we'll be on 19" monitors. Hell, my friends are all purchasing 21" ones right now and I'm stuck on 19". A bigger monitor, obviously, as you would know, gives you more screen real estate, especially with higher resolutions and stuff. So my question is, wouldn't you want a video card with 512 MB of VRAM right now so you don't have to suffer resolution crap later when you get a bigger monitor?
And if you're not getting a bigger monitor any time soon... That's just weird.
Yep, Omega only.
Nordoch said: Well, thank you all for your inputs.... I've decided to go for the X1800XT, and I guess the drivers should be Omega drivers, right..???
I'm sure you could, if you don't mind the lack of spiffy effects and the annoying sound of your poor graphics card crying.
Kang the Mad said: Could you run UT2004 on an ATI X300?
You're comparing a 256MB card to a 512MB card...?
The_Head said: YA RLY!
*cough*
Nice load of interactive benchmarks there; this one is particularly impressive, although I suppose it's running on high settings, which is good for ATI. (If you want high AA and AF, choose an X1800XT over a 7800GTX.)
Check out the low-quality bench on FEAR; seems the X1800XT wins again.
Ooo, just found one that the 7800GTX wins on: if you play Black and White at low res with no AA or AF (aka fugly), the 7800GTX pulls a 6 fps lead.
Even in Doom, which is an OpenGL game, the 7800GTX barely pulls a lead at low res; put it at high res and the X1800XT pulls ahead.
I think your point is proven wrong until you have some further proof.
WTF is that supposed to mean? How on earth is that relevant?
I was comparing the 7800GTX to the X1800XT because they are approximately the same price.
Phopojijo said: You're comparing a 256MB card to a 512MB card...?
---snip---
omnius said: I'm sure my GPU will be fine (Radeon X1600), but I'm more worried about my CPU (AMD Athlon™ 64 X2 Dual-Core 3800+). Is it OK for 2007?
What sort of retarded post is that?
zakapior said: AMD Athlon™ 64 X2 Dual-Core 3800+ was too weak a CPU even for 2005. Stupid question, so here is the answer.
Juguard said: I would just get a card when UT2007 is out. Don't worry about the brand too much, since the battles are so close. I don't think ATI or Nvidia will have a huge lead, one over the other, when UT2007 is out.