Old 20th May 2000, 03:02 PM   #1
godz
Registered User
 
Join Date: Feb. 29th, 2000
Posts: 81

Please help quickly... I've heard that the GeForce2 is better than the Voodoo5... I need input. I'm looking for the best overall video card, and I've already pre-ordered the Voodoo5, but I still have a few days to cancel... please help
Old 20th May 2000, 03:06 PM   #2
Rooster
Local Legend
 
Join Date: Jan. 4th, 2000
Location: Fort Mill, SC
Posts: 5,287

Depends on what you're doing. UT only? Go with the Voodoo. Otherwise, check out all the sites out there and make your own decision. You can't hit any web page nowadays without someone doing a comparison.


|C|Rooster|PuF
Defenseman for |C|ondemned
Old 20th May 2000, 03:09 PM   #3
godz
Registered User
 
Join Date: Feb. 29th, 2000
Posts: 81

Oh yeah, what's the best GeForce2 card? I see Elsa, and ASUS, and Guillemot.
Old 20th May 2000, 03:15 PM   #4
Switch[UF]
Registered User
 
Join Date: Jan. 27th, 2000
Posts: 188

I recently saw a review showing that the two cards were about equal in speed. This was not in a Glide-based environment; I think it might have been Quake3, but I'm not sure. So I don't think it really matters; it's more a matter of taste than speed. Some games support the T&L in the GeForce cards, which is quite nice. But the Voodoo supports Glide (natively), which is also nice. So pick the one you like.



Switch[UF] of the clan Unfear

Actually I had figured out something really great to write here. But then I forgot it.
Old 20th May 2000, 03:29 PM   #5
Kukuman
HUT HUT HUT HUT HUT HUT HUT HUT HUT HUT
 
Join Date: Mar. 27th, 2000
Location: Bothell, WA
Posts: 661

I've had it with 3dfx's attitude. They won't go with T&L because "it won't be very widely supported"... HA! Q3A supports it, and I'm assuming all games based on the Q3A engine will support it too. I'm still holding back from buying until I see some REAL benchmarks (i.e. a slower system playing UT). Why doesn't anybody do real benchmarks now... oh yeah, that's why my friends and I started realsystems.ws... aaaah!



Kukuman|PuF

"Where si my magic hatt??! I AM VARY DIPSSLEASD SIR!!! PH34R M3!!!"

Which is approximate this website, and who are you from the types?

It is website for blood 2 clan, RENAGADE!!!1 is we the amazing experts at the fight with blood 2 on-line and can a blow of foot give to much donkey.


Old 20th May 2000, 03:39 PM   #6
Blistering_Pants
T2 Junkie
 
Join Date: May. 14th, 2000
Location: VA, USA
Posts: 580

Well, I have a GeForce so I am biased, but I have read a lot of articles on both. If you go with a GF2 (my choice), go for Elsa's Gladiac. Okay, not the best name (imagine bragging to your friends, "Yeah, I got the Gladiac and it rips!"), but from what I've read it's the best card. It will run you around $350, and that's without TV-out and all that jazz. Here are some benchmarks.


Pentium III 450MHz, no T&L: 2,514 3D Marks

Pentium III 450MHz, with T&L: 3,753 3D Marks

Athlon 700MHz, no T&L: 3,603 3D Marks

Athlon 700MHz, with T&L: 5,224 3D Marks

All in all, it comes down to personal choice. I used to like Voodoo until I owned a GeForce; now I'm hooked. It is just a better chip.

Hope this helped.

Old 20th May 2000, 06:09 PM   #7
Clayeth
Classic
 
Join Date: Apr. 10th, 2000
Location: Kentucky
Posts: 5,602

GeForce2. It should be a little faster than the Voodoo5, especially in games using T&L. The Voodoo5 will be faster in games built specifically for Glide (e.g. UT), but most games have much better OpenGL support than UT, which helps the GeForce a lot.
Old 20th May 2000, 08:51 PM   #8
Trebux
Registered User
 
Join Date: Apr. 30th, 2000
Posts: 91

3dfx's future products won't be as good as they used to be.

The company lost about $30 million in one quarter, thanks to the crappy Voodoo3s and to losing to nVidia in performance.

That's a LOT of money.

3dfx is probably struggling right now to get back on top with the Voodoo4 and 5, but I am not so confident. The company is unstable....

In the reviews I've seen comparing the GeForce2 with the Voodoo5 5000, the GeForce2 is definitely better. Unless you are getting a Voodoo5 6000, which I couldn't find any reviews of; it just has two more 3dfx graphics chips than the 5000.

I think you should cancel the order and wait for the GeForce2 GTS with 64MB of RAM, which might/prolly come out some time soon. Or you can just buy the GeForce2 GTS right now.

UT will prolly run faster with the Voodoo5, but come on..... UT is not the only game out there, you know.... and who knows? Support might turn away from Glide and move to something newer and different.


I want to see YOU snipe a moving target with a 56K connection!

Consider yourself Un-Godlike if you can do it.
Old 20th May 2000, 09:08 PM   #9
Clayeth
Classic
 
Join Date: Apr. 10th, 2000
Location: Kentucky
Posts: 5,602

On paper, the 32MB GeForce2 should be able to outperform the V5 6000. The V5 6000 has a fill rate of 1.32-1.47 Gtexels/sec, whereas the GeForce2 has a 1.6 Gtexels/sec fill rate. The thing that is supposed to make the 6000 so great is its multiple processors, which create higher fill rates; but the GeForce2 GTS is so much more powerful that a single chip can produce a higher fill rate. I'm still not convinced that the 32MB GeForce2 will outperform the 6000, but the 64MB version of the GTS will still cost less than the 6000. And don't forget that the GeForce2 can support up to 128MB of RAM!!! I'm not even sure any more that 3dfx will have a card that competes with nVidia at the high end. But only time will tell.
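
For reference, here is the arithmetic behind those fill-rate figures. This is a rough sketch; the clock speeds and pipeline counts are the commonly quoted specs for these chips, assumed here rather than taken from this thread.

[code]
# Peak texel fill rate = chips x pipelines x texture units per pipe x clock.
# Assumed specs: GeForce2 GTS = 1 chip, 4 pipes, 2 tex units each, 200 MHz;
# Voodoo5 6000 = 4 VSA-100 chips, 2 pipes each, 1 tex unit, 166-183 MHz.
def fill_rate_gtexels(chips, pipes, tex_units, clock_mhz):
    return chips * pipes * tex_units * clock_mhz * 1e6 / 1e9

print(fill_rate_gtexels(1, 4, 2, 200))  # GeForce2 GTS        -> 1.60
print(fill_rate_gtexels(4, 2, 1, 166))  # V5 6000, low clock  -> ~1.33
print(fill_rate_gtexels(4, 2, 1, 183))  # V5 6000, high clock -> ~1.46
[/code]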
Old 20th May 2000, 10:20 PM   #10
WW Aldrick
Registered User
 
Join Date: May. 18th, 2000
Posts: 14

UT 2 will only use D3D... so in that case maybe the GeForce is better...
Old 20th May 2000, 10:46 PM   #11
RW
Registered User
 
Join Date: Nov. 24th, 1999
Location: Sparks, Nevada USA
Posts: 234

Well, I posted this over at the GT forum, so I guess I will also post it here.

Here are a few of the reasons I will be getting the V5-5500 this week.

1: The V5-5500 supports Glide rendering; the GF2 doesn't. There are still a lot of games out there using Glide.
2: Hardware FSAA with less of a performance hit than the GF2. I will be able to run most games at 1024x768x32 with 2x FSAA, and some games at 800x600 with 4x FSAA (see the quick math after this list).
3: T&L is a fine thing, but it's not even an issue today, and by the time it is we will all be running 1GHz or higher CPUs and may never really need or use it anyway, IMHO.
4: The V5-5500 is at least $50 less than the GF2, with 64MB vs. 32MB.
5: My experience with nVidia cards is searching the net all week for the latest leaked or beta drivers in hopes that they will fix the card's problems. My experience with 3dfx cards is loading the drivers that came in the box and maybe checking the 3dfx site for new drivers every two months.
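
A quick sketch of why those FSAA settings cost what they do: supersampling FSAA renders several samples per pixel, so the fill work scales with the sample count. The resolutions are from point 2 above; the script itself is just illustrative arithmetic.

[code]
# Samples rendered per frame = width x height x FSAA sample count.
def samples_per_frame(width, height, fsaa=1):
    return width * height * fsaa

print(samples_per_frame(1024, 768))     #   786,432 (no FSAA)
print(samples_per_frame(1024, 768, 2))  # 1,572,864 (2x FSAA: double the fill work)
print(samples_per_frame(800, 600, 4))   # 1,920,000 (4x FSAA at a lower res)
[/code]

So 4x FSAA at 800x600 actually pushes more samples than 2x at 1024x768, which is why the lower resolution is needed.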


Old 21st May 2000, 02:07 AM   #12
Lilsting12
Registered User
 
Join Date: Apr. 15th, 2000
Posts: 5

I would definitely go with the GeForce2 GTS. I have the GeForce 256 now and it works great. I tried it in all my computers and it works fine in all of them. But when I got to the Voodoos, they gave me headaches. I don't know, maybe it's my specs, but if I were you I would go with the GeForce2 GTS. It only has 1 chip while the Voodoo5 6000 has 2. 1 chip makes less heat and fewer freezes; 2 might heat up your computer and freeze most of your games =) Go with the GeForce2 GTS........ =)


Dont mess with me
Old 21st May 2000, 02:18 AM   #13
Rooster
Local Legend
 
Join Date: Jan. 4th, 2000
Location: Fort Mill, SC
Posts: 5,287

Lilsting.. heh.. two chips cause more heat? Only if they're being pushed twice as hard as one.

Heat does not combine like that. I.e., 40C (chip 1) + 40C (chip 2) does not equal 80C, whereas the same total power through a single identical chip would probably equate to 60C.

Two chips are always better than one for heat dissipation. I'm not saying which chipset is better, but your arguments FOR the GeForce2 are totally bunk. Sorry man.
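
A toy model of the point above, assuming a simple linear thermal-resistance relationship; every number here is made up for illustration. Each chip's temperature rise depends only on its own power draw, so two chips splitting the work each run cooler than one chip doing all of it.

[code]
# T_chip = T_ambient + P x R_theta (linear thermal-resistance model).
AMBIENT_C = 20.0   # assumed case temperature, degrees C
R_THETA = 2.0      # assumed thermal resistance, degrees C per watt

def chip_temp(power_w):
    return AMBIENT_C + power_w * R_THETA

print(chip_temp(20.0))  # one chip doing all 20 W      -> 60 C
print(chip_temp(10.0))  # each of two chips doing 10 W -> 40 C
[/code]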


|C|Rooster|PuF
Defenseman for |C|ondemned
Old 21st May 2000, 02:30 AM   #14
Kukuman
HUT HUT HUT HUT HUT HUT HUT HUT HUT HUT
 
Join Date: Mar. 27th, 2000
Location: Bothell, WA
Posts: 661

Unless 3dfx gets their act together and releases decent OpenGL wrappers (that's what they are, not full implementations), the GeForce2 will beat 3dfx handily in Q3A.

I have no idea how well a GeForce2 will perform in UT, since I have seen no benchmarks involving UT.

RW, do you work for 3dfx? I could have sworn those exact words were said by a 3dfx employee.
http://www.sharkyextreme.com/hardwar...ce2_gts_guide/
Now tell me which one is better.



Kukuman|PuF
Old 21st May 2000, 02:56 AM   #15
Kokensu
Fire in Ma Belly!
 
Join Date: Jan. 4th, 2000
Location: Shut yo mouth!
Posts: 2,912

I haven't read everyone's reply to this post, so I may be repeating someone (sorry), but I've heard that Epic is no longer supporting Glide? And I've also heard that 3dfx is planning on dropping that API as well in the not-too-distant future. Are these just rumors, or is there some truth to them?


You like that?
Old 21st May 2000, 06:54 AM   #16
Wingznut PEZ
Registered User
 
Join Date: Nov. 30th, 1999
Location: Portland, OR, USA
Posts: 293

Have you not seen the comparisons over at HardOCP? The V5-5500 is within 4 fps of the GeForce2 on Q3 at 1024 resolution.

The V5-6000 will kick @ss all over the GeForce2 (and any other card, for the next few months). However, ~$600 is too rich for my blood.

Kukuman, Q3 only supports hardware transform; it still uses its own lighting engine. And anyway, Q3 runs just as fast on a V5-5500.

Clayeth, "on paper" is exactly right. Memory bandwidth restrictions don't allow the GTS to realistically get over 1 gigatexel. This is why the V5-5500 does so well: it doesn't have the same bottleneck (because of the dual processors).

FSAA looks so sweet. Not that I plan to use it with FPS games, but other games like NFS5, flight sims, etc. will be awesome with FSAA.

I came "this close" to getting a GTS a couple weeks ago. But with the latest reviews, I really don't see the point of going with the GTS. Sure, T&L will probably take off someday, but that'll be at least one more upgrade away. Whereas with FSAA, I can use it right now, in all my games.


Wingznut [PEZ]
ICQ #29598363
Old 21st May 2000, 07:57 AM   #17
Switch[UF]
Registered User
 
Join Date: Jan. 27th, 2000
Posts: 188

That's the review I was referring to, the one Wingznut mentioned. So they're about the same speed, and that's with the GF2 using T&L (well, obviously only the T part). I also read over at HardOCP that the FSAA on the GF2 is actually just for show; you can't use it because everything slows down to a crawl. Now it might seem like I'm pro-3dfx, but I'm not in the market for a 3D card, so I'm not really pro anything. Except pro-check-everything-out-before-you-buy, which is quite important, at least for me. I just wanted to say that they are about the same speed, so it's a matter of taste. And money.

I wonder how big this thread will get and when it will come to personal insults. It's bound to, sooner or later.

PS.
Me drools at the thought of the 6000, two times as fast as the 5500 (well, almost). That's quite speedy. Someone ran a test and Q3 was _really_ playable at 2000x1600 or something like that. Not that anyone would play at that res, I think, but you could add some more detail instead.
DS.



Switch[UF] of the clan Unfear

Actually I had figured out something really great to write here. But then I forgot it.
Old 21st May 2000, 10:59 AM   #18
Devistator
Registered User
 
Join Date: May. 19th, 2000
Location: PA
Posts: 3

http://www.tomshardware.com/graphic/...519/index.html

My favorite review site. I would go with the GeForce2 if you've got the cash. Here is a comparison of several GeForce2 cards.
Old 21st May 2000, 12:00 PM   #19
DemBones
Registered User
 
Join Date: Feb. 16th, 2000
Location: Clarksville, MD, USA
Posts: 3

The initial comparisons between the GF2 and the V5 with FSAA showed that at lower resolutions, the GF2 crunched the V5 thoroughly. However, once you start going to high resolutions, the V5 takes the lead due to its larger capacity for a frame buffer than a 32MB GF2, and due to the GF2's poor memory bandwidth.

Without FSAA enabled, the GeForce has performed remarkably well compared to the V5. In some benchmarks, its lead over the V5 has been quite significant. That, coupled with the feature load-out of most of the new cards, makes the GF2 very attractive to consumers and OEMs alike. Many of its features are designed for tomorrow's games, while still giving powerful performance for today's.

As for memory bandwidth issues, this is not necessarily nVidia's fault. They decided to go with DDR SDRAM because of its speed benefits versus SDR SDRAM. However, high-speed DDR SDRAM yields have been rather limited. The GF2's memory bandwidth is not so much limited by its design as by the lack of sufficient high-frequency memory chips. A daring company could take a big risk, buy all the high-speed DDR SDRAM chips it can find, and clock them higher on its boards. However, that move would rely on two things: 1) that the company could manage to advertise this fact to the consumer, and 2) that it can produce boards in sufficient quantity to meet demand. The risks involved are not ones that companies take lightly, and not many will be willing to even consider this option.
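
The DDR arithmetic, for what it's worth: peak bandwidth = memory clock x 2 transfers per clock x bus width. The 166 MHz and 128-bit figures below are the commonly quoted GeForce2 GTS specs, assumed here rather than stated in this thread.

[code]
# Peak memory bandwidth in GB/s.
def bandwidth_gb(clock_mhz, bus_bits, ddr=True):
    transfers_per_sec = clock_mhz * 1e6 * (2 if ddr else 1)
    return transfers_per_sec * (bus_bits / 8) / 1e9

print(bandwidth_gb(166, 128, ddr=True))   # ~5.3 GB/s with DDR
print(bandwidth_gb(166, 128, ddr=False))  # ~2.7 GB/s with plain SDR
[/code]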

Also, regarding the V5's 64MB of memory... that's not entirely accurate. The V5 5500 has only 32MB per chip. Multiply that by 2 chips and you have 64MB, right? Not really. You see, those banks of memory have to duplicate texture memory for each chip. So, let's say a game needs 16MB for texture data. With 16MB of memory on each chip used for texture info, that leaves 32MB for the frame buffer (since it is split across the chips). A 32MB graphics card would only have 16MB left for the frame buffer, while a TRUE 64MB card would have a full 48MB left.
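
Spelling out that texture-duplication math: textures are replicated on every chip, while the frame buffer is split across the chips and so effectively spans the combined memory. The 16MB texture load is the example figure from the paragraph above.

[code]
def framebuffer_left_mb(total_mb, chips, textures_mb):
    # Each chip keeps its own copy of the textures; the rest is frame buffer.
    per_chip = total_mb / chips
    return (per_chip - textures_mb) * chips

print(framebuffer_left_mb(64, 2, 16))  # V5 5500 "64MB"   -> 32.0
print(framebuffer_left_mb(32, 1, 16))  # 32MB single-chip -> 16.0
print(framebuffer_left_mb(64, 1, 16))  # true 64MB card   -> 48.0
[/code]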

As for the price: yes, you can get a 64 MB V5 for less than a 32 MB GF2, but with the GF2 you're paying for more features, including the DDR SDRAM, which is still quite expensive. Prices will plummet as soon as DDR SDRAM production ramps up to full speed.

Personally, I'm buying a 64MB GF2 as soon as I see one I like. The two best brands I've seen so far are the Elsa and the ASUS cards. The Hercules card shows promise with its heat-sinked memory modules, but initial tests show it's not very overclockable.

Of course, I have some money to burn due to a rather unfortunate car accident before my birthday. Thanks Nationwide!! ;-)

Realistically, I'd say wait. Prices almost always drop in the summer. Only fools (like me) rush in and buy a graphics card as soon as it debuts. You need to give the manufacturers a couple months to be competitive. Then prices should be a bit more reasonable.


DemBones @ www.riva3d.com
Old 21st May 2000, 12:49 PM   #20
Rooster
Local Legend
 
Join Date: Jan. 4th, 2000
Location: Fort Mill, SC
Posts: 5,287

Just to be fair, DemBones: I've seen a review with a "gold card" and beta drivers, and the V5's FSAA was MUCH better than the GF2 GTS's. The beta card/beta drivers did perform like crap in previous reviews. As well, I couldn't give a rat's *** about T&L, because Q3 is practically the only game that supports it, and I don't play and never intend on playing anything on the Q engine. I just don't really play FPSs; UT is it.

That being said... I would definitely wait until about 2 months after the cards come out and some real reviews, not previews, have been done. Overall, the GF2 GTS looks like the more "promising" card, but not that much better than a GeForce DDR. It's only about 15-20% faster than a GF DDR, and like you really need to go from 60fps to 70fps?! Gimme a break; not for an overpriced $320. No way in hell. I paid $118 for my V3 3000 and am STILL happy with it. Probably much happier than any bloke who just recently plopped down $300 on a GeForce DDR and gets the same performance I do.

When they come down to a reasonable price of like $200, then we can talk, but even the DDRs are not there yet and it's been almost a year. nVidia definitely has to work on their pricing.


|C|Rooster|PuF
Defenseman for |C|ondemned