ATi or NVIDIA

Discussion in 'Unreal Tournament 3' started by Nordoch, Apr 19, 2006.

  1. neilthecellist

    neilthecellist Renegade.

    Joined:
    May 24, 2004
    Messages:
    2,306
    Likes Received:
    0
    Erm... Why aren't you shooting for 512MB of VRAM?
     
  2. zakapior

    zakapior New Member

    Joined:
    Apr 22, 2006
    Messages:
    12
    Likes Received:
    0
    256MB of VRAM is enough. I think it's like in FEAR: with 128MB of VRAM you can't set maximum texture detail, only medium, and performance is a bit worse.
    For me, the engine that runs FEAR and Condemned: Criminal Origins is currently the best on PC, and Unreal Engine 3.0 should be a little better with similar PC hardware requirements (maybe more CPU power for more advanced physics). Oblivion sucks; it looks like the 3DMark03 Nature test and the optimization is ****ed up.
    So at around 5,000-6,000 3DMark05 points I think Unreal Engine should have good performance and there is no reason to panic. All this "quad SLI is a must-have" stuff is just marketing bull****.
    :lol:
    Obviously, if you want to play at 1600x1200 with 4x FSAA and 16x aniso you must have something better;
    for most players 1024x768 or 1280x1024 (LCD native) is enough.
     
    Last edited: Apr 23, 2006
  3. The_Head

    The_Head JB Mapper

    Joined:
    Jul 3, 2004
    Messages:
    3,092
    Likes Received:
    0
    Quad SLI for max settings leads me to believe that the game can scale really high and look awesome.
    Single graphics card and a decent system will be fine for people who play at 1024x768.
    I myself will be aiming to play at 1280x1024 on decent to high settings (probably with an X1900). But not being able to play it on high isn't the end of the world to me.
     
  4. zakapior

    zakapior New Member

    Joined:
    Apr 22, 2006
    Messages:
    12
    Likes Received:
    0
    SLI or Crossfire is enough to have an awesome-looking game.

    Quad, for me, is like science fiction.
    For that money I would buy a PS3 or an Xbox 360.
    The quad price is just sick :lol:
    Epic only used quad SLI in the presentation because they have an unoptimized game and they cooperate with NVIDIA. For NVIDIA it is the best advertisement,
    and who needs 100-200 FPS? :lol:
     
  5. neilthecellist

    neilthecellist Renegade.

    Joined:
    May 24, 2004
    Messages:
    2,306
    Likes Received:
    0
    Hm. You're making sense, although, if the forums allow, I must ask you this question.

    In the far future, you'll have to upgrade your computer hardware, right? By then, we'll be on 19" monitors. Hell, my friends are all purchasing 21" ones right now and I'm stuck on 19". A bigger monitor, obviously, as you would know, gives you more screen real estate, especially at higher resolutions. So my question is, wouldn't you want a video card with 512MB of VRAM right now so you don't have to suffer at higher resolutions later when you get a bigger monitor?

    And if you're not getting a bigger monitor any time soon... That's just weird.
     
  6. zakapior

    zakapior New Member

    Joined:
    Apr 22, 2006
    Messages:
    12
    Likes Received:
    0
    I have an 18'' LCD; it's like a 19'' CRT, I don't need more.

    Buying a DX9 card now is just a waste of money.

    The R600 with its unified architecture will be a good option;
    the NVIDIA G80 seems to be a joke, half-DX10 without a unified architecture.

    If you want a higher resolution like 1600x1200, a single 1900XTX will be a good choice.
     
  7. Nordoch

    Nordoch New Member

    Joined:
    Dec 17, 2004
    Messages:
    30
    Likes Received:
    0
    Well, thank you all for your input... I've decided to go for the X1800XT, and I guess the drivers should be Omega drivers, right?
     
  8. zakapior

    zakapior New Member

    Joined:
    Apr 22, 2006
    Messages:
    12
    Likes Received:
    0
    Yep, Omega only :lol:
     
  9. Monster Kill

    Monster Kill New Member

    Joined:
    May 16, 2005
    Messages:
    138
    Likes Received:
    0
    Could you run UT2004 on an ATI X300?
     
  10. dragonfliet

    dragonfliet I write stuffs

    Joined:
    Apr 24, 2006
    Messages:
    3,754
    Likes Received:
    31
    I'm sure you could if you don't mind the lack of spiffy effects and the annoying sound of your poor graphics card crying.
     
  11. Phopojijo

    Phopojijo A Loose Screw

    Joined:
    Nov 13, 2005
    Messages:
    1,458
    Likes Received:
    0
    You're comparing a 256MB card to a 512MB card...?

    Duh? Caching to system RAM and shuttling data across the PCI-e bus generally lags things badly. Go figure? Especially with the PCI-e bus already flooded by the CPU's vertex deformation of meshes going back and forth for each frame of animation and each tick of physics.
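    To put rough numbers on that, here is a back-of-the-envelope sketch of when a card's local memory runs out and textures start paging across the bus every frame. Every figure in it (the texture pool sizes, an uncompressed 32-bit framebuffer, 4x MSAA at 1600x1200) is an assumption for illustration, not a measurement from any real game:

    Code:
    # Back-of-the-envelope VRAM estimate; every figure here is an illustrative assumption.
    MB = 1024 * 1024

    def framebuffer_bytes(width, height, msaa=1, bytes_per_pixel=4):
        """Color target plus depth/stencil target, scaled by the MSAA sample count."""
        color = width * height * bytes_per_pixel * msaa
        depth = width * height * 4 * msaa
        return color + depth

    # Hypothetical texture pools for "medium" and "high" detail settings.
    texture_pool = {"medium": 150 * MB, "high": 300 * MB}

    for detail, pool in texture_pool.items():
        need = framebuffer_bytes(1600, 1200, msaa=4) + pool
        for card in (256, 512):
            verdict = "fits" if need <= card * MB else "spills to system RAM over PCI-e"
            print(f"{detail:>6} detail, {card} MB card: ~{need // MB} MB needed -> {verdict}")

    The pool sizes are made up; the point is just that once the working set stops fitting in local VRAM, textures get shuffled across the bus every frame and performance drops hard.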

    Try the 7900GTX single 512 vs the X1900XT single 512, where the FPS (even in Half-Life 2) are pretty much equal, even with the 7900GTX beating the X1900XTX at high res + AA/AF.

    Especially with ATI repeatedly cheating in their benchmarks.

    Sure, nVidia's drivers have the same kind of optimization as well -- but they have the option to turn it off AND they declare it's even there BEFORE benchmarkers discover it and email them about it more than twice.
     
    Last edited: Apr 25, 2006
  12. neilthecellist

    neilthecellist Renegade.

    Joined:
    May 24, 2004
    Messages:
    2,306
    Likes Received:
    0
    Cheating. I've got to investigate this ASAP. I'll post updates if I find anything, guys.
     
  13. The_Head

    The_Head JB Mapper

    Joined:
    Jul 3, 2004
    Messages:
    3,092
    Likes Received:
    0
    I was comparing the 7800GTX to the X1800XT because they are approximately the same price.
    If you compare the X1800XT to the 7800GTX 512, get the X1800XT, as it is much, much cheaper.


    UT2004 will be fine on an X300, should be able to run on pretty good settings too.
     
  14. Juguard

    Juguard The King Is Dead, Punk Rock Lives!

    Joined:
    Nov 30, 1999
    Messages:
    570
    Likes Received:
    0
    I would just get a card when UT2007 is out. Don't worry about the brand too much, since the battles are so close. I don't think ATI or Nvidia will have a huge lead, one over the other, when UT2007 is out.

    For me, I love NVIDIA; I've been gaming with them since the TNT2 Ultra. Never had problems, no matter what system I used them in. ATI drivers just don't do it for me; I've had problems with them in the past.
     
  15. omnius

    omnius New Member

    Joined:
    Sep 29, 2004
    Messages:
    14
    Likes Received:
    0
    I'm sure my GPU will be fine (Radeon X1600), but I'm more worried about my CPU (AMD Athlon™ 64 X2 Dual-Core 3800+). Is it OK for 2007?
     
  16. Saverous

    Saverous New Member

    Joined:
    Mar 29, 2006
    Messages:
    40
    Likes Received:
    0
    I think you have it backwards: your CPU is OK, but your GPU needs an upgrade.
     
  17. zakapior

    zakapior New Member

    Joined:
    Apr 22, 2006
    Messages:
    12
    Likes Received:
    0
    :lol: :lol: :lol:
    The AMD Athlon™ 64 X2 Dual-Core 3800+ was too weak a CPU even for 2005.
    Stupid question, so here is the answer.
    :lol: :lol: :lol: :lol: :lol:
     
  18. The_Head

    The_Head JB Mapper

    Joined:
    Jul 3, 2004
    Messages:
    3,092
    Likes Received:
    0
    What sort of retarded post is that?
    The X2 3800+ is a great chip, and I guarantee it will perform better than most single-core processors that are around when UT2007 comes out (presuming it is multithreaded, as has been promised).
    I'd like to see where your idea of it being a weak chip for 2005 comes from.
    Give it a bit of overclocking and it can match a 4600+ with little difficulty.
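    On the multithreading point: a second core only helps once the engine actually splits its per-frame work (physics, animation, and so on) across threads; otherwise everything queues up on one core. Here is a toy sketch of that idea; it has nothing to do with UE3's real code, and the workloads are made-up busy loops:

    Code:
    # Toy illustration of running two independent chunks of per-frame work on two cores.
    # Not UE3 code; physics_tick and animation_tick are made-up stand-in workloads.
    from concurrent.futures import ProcessPoolExecutor
    import time

    def physics_tick(steps):
        """Stand-in for one frame's physics work."""
        total = 0.0
        for i in range(steps):
            total += (i % 7) * 0.5
        return total

    def animation_tick(steps):
        """Stand-in for one frame's skeletal animation work."""
        total = 0.0
        for i in range(steps):
            total += (i % 5) * 0.25
        return total

    if __name__ == "__main__":
        steps = 2_000_000

        # One core at a time: the two jobs run back to back.
        start = time.perf_counter()
        physics_tick(steps)
        animation_tick(steps)
        serial = time.perf_counter() - start

        # Two cores: the same two jobs run side by side in separate processes.
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=2) as pool:
            jobs = [pool.submit(physics_tick, steps), pool.submit(animation_tick, steps)]
            for job in jobs:
                job.result()
        parallel = time.perf_counter() - start

        print(f"one core: {serial:.2f}s, two cores in parallel: {parallel:.2f}s")

    If the engine never splits the work like this, the second core mostly sits idle, which is why older single-threaded games don't gain anything from an X2.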
     
  19. Saverous

    Saverous New Member

    Joined:
    Mar 29, 2006
    Messages:
    40
    Likes Received:
    0
    Dual-core CPUs are weaker only in old games that don't support them. UE3 will support dual-core processing, so the Athlon X2 3800+ is enough for upcoming games.
     
    Last edited: Apr 26, 2006
  20. dragonfliet

    dragonfliet I write stuffs

    Joined:
    Apr 24, 2006
    Messages:
    3,754
    Likes Received:
    31
    Actually, I would wait for the new ATI card (I forget the codename) that will support DX10 with a Unified Shader Architecture (USA), or, if NVIDIA wakes up in time to fit the G80 with USA, then it should be a close choice between the two; but waiting an extra month or three (if it's delayed for the Vista launch) will be well worth it, as you'll have a decently future-proof GFX card.
    ~Jason
     
