GeForce2 vs Voodoo5


godz

New Member
Feb 29, 2000
81
IN CONSTRUCTION
Please help quickly... I've heard that the GeForce2 is better than the Voodoo5, and I need input. I'm looking for the best overall video card. I've already pre-ordered the Voodoo5, but I still have a few days to cancel... please help!
 

Rooster

Local Legend
Jan 4, 2000
5,287
Fort Mill, SC
www.legionoflions.com
Depends on what you're doing. UT only? Go with the Voodoo. Otherwise, check out all the sites out there and make your own decision. You can't hit a web page these days without someone doing a comparison.


|C|Rooster|PuF
Defenseman for |C|ondemned
 

Switch[UF]

New Member
Jan 27, 2000
188
www.unfear.com
I recently saw a review showing that the two cards were about equal in speed. This was not in a Glide-based environment; I think it might have been Quake3, but I'm not sure. So I don't think it really matters; it's more a matter of taste than speed. Some games support the T&L in the GeForce cards, which is quite nice. But Voodoo supports Glide (natively), which is also nice. So pick the one you like. :)



Switch[UF] of the clan Unfear

Actually I had figured out something really great to write here. But then I forgot it.
 

Kukuman

HUT HUT HUT HUT HUT HUT HUT HUT HUT HUT
Mar 27, 2000
661
38
Bothell, WA
I've had it with 3dfx's attitude. They won't go with T&L because "it won't be very widely supported"... HA! Q3A supports it, and I'm assuming all games based on the Q3A engine will support it too. I'm still holding back from buying until I see some REAL benchmarks (i.e. a slower system playing UT). Why doesn't anybody do real benchmarks these days... oh yeah, that's why my friends and I started realsystems.ws... aaaah!



Kukuman|PuF

"Where si my magic hatt??! I AM VARY DIPSSLEASD SIR!!! PH34R M3!!!"

Which is approximate this website, and who are you from the types?

It is website for blood 2 clan, RENAGADE!!!1 is we the amazing experts at the fight with blood 2 on-line and can a blow of foot give to much donkey.
 

Blistering_Pants

T2 Junkie
May 14, 2000
580
45
VA, USA
Well, I have a GeForce so I am biased, but I have read a lot of articles on both. If you go with a GF2 (my choice), go for Elsa's GLADIAC. Okay, not the best name (imagine bragging to your friends: "Yeah, I got the GLADIAC and it rips!"), but from what I've read it's the best card. It will run you around $350, and that's without TV-out and all that jazz. Here are some benchmarks.


Pentium III 450MHz, no T&L: 2,514 3D Marks

Pentium III 450MHz, with T&L: 3,753 3D Marks

Athlon 700MHz, no T&L: 3,603 3D Marks

Athlon 700MHz, with T&L: 5,224 3D Marks
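
For what it's worth, here's the T&L gain from those scores worked out in percent (just arithmetic on the numbers above, as a throwaway Python snippet):

# T&L gain computed from the 3D Mark scores quoted above
scores = {
    "PIII 450":   (2514, 3753),   # (no T&L, with T&L)
    "Athlon 700": (3603, 5224),
}
for cpu, (base, tnl) in scores.items():
    print(f"{cpu}: {100 * (tnl - base) / base:.0f}% faster with T&L")
# PIII 450:   49% faster with T&L
# Athlon 700: 45% faster with T&L

So T&L is worth roughly 45-50% in that synthetic test, on either CPU.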

All in all, it comes down to personal choice. I used to like Voodoo until I owned a GeForce; now I'm hooked. It's just a better chip.

Hope this helped.
 

Clayeth

Classic
Apr 10, 2000
5,602
41
Kentucky
GeForce 2. It should be a little faster than the Voodoo5s, especially in games using T&L. Voodoo5s will be faster in games built specifically for Glide (e.g. UT), but most games have much better OpenGL support than UT does, which helps the GeForce a lot.
 

Trebux

New Member
Apr 30, 2000
91
Future 3dfx products won't be as good as they used to be.

The company lost about $30 million in a single quarter, due to the crappy Voodoo3 and to losing to Nvidia on performance.

That's a LOT of money.

3dfx is probably struggling right now to get back on its feet with the Voodoo4 and 5, but I am not so confident. The company is unstable....

In the reviews I've seen comparing the GeForce 2 with the Voodoo5 5500, the GeForce 2 is definitely better. Unless you are getting a Voodoo5 6000; I couldn't find any reviews of it, but it just has two more of 3dfx's graphics chips than the 5500.

I think you should cancel the order and wait for the GeForce 2 GTS with 64MB of RAM, which will probably come out some time soon. Or you can just buy the 32MB GeForce 2 GTS right now.

UT will probably run faster with the Voodoo5, but come on... UT is not the only game out there, you know... and who knows? Support might turn away from Glide and move to something newer and different.


I want to see YOU snipe a moving target with a 56K connection!

Consider yourself Un-Godlike if you can do it.
 

Clayeth

Classic
Apr 10, 2000
5,602
41
Kentucky
On paper, the 32MB GeForce2 should have the ability to outperform the V5 6000. The V5 6000 has a fill rate of 1.32-1.47 Gtexels/sec, whereas the GeForce2 has a 1.6 Gtexels/sec fill rate. The thing that is supposed to make the 6000 so great is the multiple processors, which create higher fill rates. But the GeForce 2 GTS is so much more powerful that a single chip can produce a higher fill rate. I'm still not convinced that the 32MB GeForce2 will outperform the 6000, but the 64MB version of the GTS will still cost less than the 6000. And don't forget that the GeForce2 can support up to 128MB of RAM!!! I'm not even sure any more that 3dfx will have a card that competes with Nvidia at the high end. But only time will tell.
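
Those fill-rate figures fall straight out of clock speed x pipelines x texture units. A rough sketch of the arithmetic in Python (the clocks and pipe counts are the commonly quoted specs, so treat them as assumptions):

# Theoretical texel fill = core clock * pixel pipes * texture units per pipe * chips
def fill_rate_gtexels(clock_mhz, pipes, tex_per_pipe, chips=1):
    return clock_mhz * 1e6 * pipes * tex_per_pipe * chips / 1e9

# GeForce2 GTS: one chip, 4 pipes, 2 texture units each, 200 MHz core
print(fill_rate_gtexels(200, 4, 2))           # 1.6 Gtexels/s
# V5 6000: four VSA-100 chips, 2 pipes with 1 texture unit each,
# at the 166-183 MHz clocks that have been floated
print(fill_rate_gtexels(166, 2, 1, chips=4))  # ~1.33 Gtexels/s
print(fill_rate_gtexels(183, 2, 1, chips=4))  # ~1.46 Gtexels/s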
 

RW

New Member
Nov 24, 1999
234
Sparks, Nevada USA
Well, I posted this over at the GT forum, so I guess I will also post it here.

Here are a few of the reasons I will be getting the V5-5500 this week.

1: The V5-5500 supports Glide rendering; the GF2 doesn't. There are still a lot of games out there using Glide.
2: Hardware FSAA with less of a performance hit than on the GF2. I will be able to run most games at 1024x768x32 with 2x FSAA, and some games at 800x600 with 4x FSAA (rough fill-rate math after this list).
3: T&L is a fine thing, but it's not even an issue today, and by the time it is, we will all be running 1GHz or higher CPUs and may never really need or use it anyway, IMHO.
4: The V5-5500 is at least $50 less than the GF2, with 64MB vs. 32MB.
5: My experience with Nvidia cards is searching the net all week for the latest leaked or beta drivers in hopes that they will fix the card's problems. My experience with 3dfx cards is loading the drivers that came in the box and maybe checking the 3dfx site for new drivers every two months. :)
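
On point 2, the reason supersampling FSAA costs what it does is simple pixel math: every on-screen pixel gets rendered 2 or 4 times. A quick Python sanity check, using the resolutions from the list above:

# Supersampling FSAA cost scales with the sample count
def pixel_load(width, height, samples=1):
    return width * height * samples

base = pixel_load(1024, 768)
print(pixel_load(1024, 768, samples=2) / base)  # 2.0 -> half the effective fill rate
print(pixel_load(800, 600, samples=4) / base)   # ~2.44
# 800x600 at 4x FSAA costs about the same fill as 1600x1200 with no FSAA at all.

So a card that can hold its frame rate at 1600x1200 should manage 800x600 with 4x FSAA, at least as far as fill rate goes.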
 

Lilsting12

New Member
Apr 15, 2000
6
I would definitely go with the GeForce 2 GTS. I have the GeForce 256 now and it works great. I tried it in all my computers and it works fine in every one. But when I got to the Voodoos, they gave me headaches. I don't know, maybe it's my specs, but if I were you I would go with the GeForce 2 GTS. It has only 1 chip, while the Voodoo5 6000 has 4. 1 chip gives off less heat and avoids freezes; more chips might heat up your computer and freeze most of your games =) Go with the GeForce 2 GTS ........=)


Don't mess with me
 

Rooster

Local Legend
Jan 4, 2000
5,287
Fort Mill, SC
www.legionoflions.com
Lilsting.. heh.. more chips means more heat? Only if each of them is being pushed as hard as the single chip would be.

Heat does not combine like that, i.e., 40°C (chip 1) + 40°C (chip 2) does not equal 80°C. Whereas the same total power through a single identical chip would probably put it at around 60°C.
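
To put numbers on that: what adds up across chips is power, not temperature. Temperature rise is roughly power times the thermal resistance of each chip's heatsink. A toy model in Python (every figure here is invented for illustration):

# Toy thermal model: T_chip = T_ambient + power * thermal_resistance
def chip_temp(ambient_c, power_w, theta_c_per_w):
    return ambient_c + power_w * theta_c_per_w

AMBIENT, THETA = 20.0, 1.0   # 20 C case air, 1 C/W per chip+heatsink (made up)

print(chip_temp(AMBIENT, 20, THETA))  # 40 C: each of two chips doing 20 W
print(chip_temp(AMBIENT, 40, THETA))  # 60 C: one chip handling the whole 40 W
# Two 40 C chips don't add up to an 80 C board; each one just sits at 40 C.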

Two chips are always better than one for heat dissipation. I'm not saying which chipset is better, but your arguments FOR the GeForce2 are totally bunk. :) Sorry man.


|C|Rooster|PuF
Defenseman for |C|ondemned
 

Kukuman

HUT HUT HUT HUT HUT HUT HUT HUT HUT HUT
Mar 27, 2000
661
38
Bothell, WA
Unless 3dfx gets their act together and releases decent OpenGL wrappers (that's what they are, not full implementations), the GeForce2 will beat 3dfx handily in Q3A.

I have no idea how well a GeForce2 will perform in UT, since I have seen no benchmarks involving UT.

RW, do you work for 3dfx? I could have sworn those exact words were said by a 3dfx employee.
http://www.sharkyextreme.com/hardware/articles/nvidia_geforce2_gts_guide/
Now tell me which one is better.



Kukuman|PuF
 

Kokensu

Fire in Ma Belly!
Jan 4, 2000
2,912
Shut yo mouth!
I haven't read everyone's reply to this post, so I may be repeating someone (sorry), but I've heard that Epic is no longer supporting Glide, and I've also heard that 3dfx is planning on dropping that API in the not-too-distant future. Are these just rumors, or is there some truth to them?


You like that?
 

Wingznut PEZ

New Member
Nov 30, 1999
293
Portland, OR, USA
Have you not seen the comparisons over at HardOCP? The V5-5500 is within 4 fps of the GeForce2 in Q3 at 1024x768.

The V5-6000 will kick @ss all over the GeForce2 (and any other card, for the next few months). However, ~$600 is too rich for my blood.

Kukuman, Q3 only supports hardware transform; they still use their own lighting engine. And anyway, Q3 runs just as fast on a V5-5500.

Clayeth, "on paper" is exactly right. Memory bandwidth restrictions don't realistically let the GTS get over 1 gigatexel. This is why the V5-5500 does so well: it doesn't have the same bottleneck, thanks to the dual processors.
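
The arithmetic behind that bottleneck, for anyone curious (the 166 MHz DDR memory clock and 128-bit bus are the commonly quoted GTS specs; the bytes-per-pixel figure is my own rough assumption):

# Effective pixel fill is capped by memory bandwidth, not just the pipelines
MEM_CLOCK_HZ = 166e6        # GTS DDR memory clock
BUS_BYTES    = 128 // 8     # 128-bit memory bus
bandwidth = MEM_CLOCK_HZ * 2 * BUS_BYTES   # DDR moves data twice per clock
print(bandwidth / 1e9)      # ~5.3 GB/s

# Assume ~8 bytes of color/Z traffic per 32-bit pixel (optimistic; Z reads
# and texture fetches make it worse in practice):
print(bandwidth / 8 / 1e9)  # ~0.66 Gpixels/s, far below the 1.6 Gtexel paper spec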

FSAA looks so sweet. Not that I plan to use it with FPS games, but other games like NFS5, flight sims, etc. will be awesome with FSAA.

I came "this close" to getting a GTS a couple of weeks ago. But with the latest reviews, I really don't see the point of going with the GTS. Sure, T&L will probably take off someday, but that'll be at least one more upgrade away. Whereas with FSAA, I can use it right now, in all my games.


Wingznut [PEZ]
ICQ #29598363
 

Switch[UF]

New Member
Jan 27, 2000
188
www.unfear.com
That's the review I was referring to, the one Wingznut mentioned. So they're about the same speed, and that's with the GF2 using T&L (well, obviously only the T part). I also read over at HardOCP that the FSAA on the GF2 is really just for show; you can't use it, because it slows everything to a crawl. Now it might seem like I'm pro-3dfx, but I'm not in the market for a 3D card, so I'm not really pro anything. Except pro-check-everything-out-before-you-buy, which is quite important, at least for me. I just wanted to say that they are about the same in speed, so it's a matter of taste. And money.

I wonder how big this thread will get, and when it will come to personal insults. It's bound to, sooner or later. :)

P.S. Me drools at the thought of the 6000, two times as fast as the 5500 (well, almost). That's quite speedy. Someone ran a test and Q3 was _really_ playable at 2000x1600 or something like that. Not that anyone would play at that res, I think, but you could add some more detail instead.



Switch[UF] of the clan Unfear

Actually I had figured out something really great to write here. But then I forgot it.
 

DemBones

New Member
Feb 16, 2000
3
Clarksville, MD, USA
The initial comparisons between the GF2 and the V5 with FSAA showed that at lower resolutions, the GF2 crunched the V5 thoroughly. However, once you start going to high resolutions, the V5 takes the lead, due to its larger frame buffer capacity compared to a 32 MB GF2 and the GF2's poor memory bandwidth.
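
The frame-buffer half of that is easy to put numbers on (the buffer layout below, double-buffered 32-bit color plus a 32-bit Z buffer, is my assumption):

# Rough frame-buffer footprint: (front + back color + Z) * FSAA samples
def framebuffer_mb(width, height, bytes_per_px=4, surfaces=3, samples=1):
    return width * height * bytes_per_px * surfaces * samples / 2**20

print(framebuffer_mb(1024, 768))             # ~9 MB
print(framebuffer_mb(1600, 1200))            # ~22 MB
print(framebuffer_mb(1024, 768, samples=4))  # ~36 MB: 4x FSAA alone outgrows a 32 MB card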

Without FSAA enabled, the GeForce has performed remarkably well compared to the V5. In some benchmarks, its lead over the V5 has been quite significant. That, coupled with the feature load-out of most of the new cards, makes the GF2 very attractive to consumers and OEMs alike. Many of its features are designed for tomorrow's games, while still giving powerful performance for today's.

As for the memory bandwidth issues, this is not necessarily nVidia's fault. They decided to go with DDR SDRAM because of its speed benefits versus SDR SDRAM. However, high-speed DDR SDRAM yields have been rather limited. The GF2's memory bandwidth is not so much limited by its design as by the lack of sufficient high-frequency memory chips. A daring company could take a big risk, buy all the high-speed DDR SDRAM chips they can find, and clock them higher on their boards. However, that move would rely on two things: 1) that they could manage to advertise this fact to the consumer, and 2) that they could produce boards in sufficient quantity to meet demand. The risks involved are not ones that companies take lightly, and not many will be willing to even consider this option.

Also, regarding the V5's 64 MB of memory... that's not entirely accurate. The V5 5500 has only 32 MB per chip. Multiply that by 2 chips and you have 64 MB, right? Not really. You see, those banks of memory have to duplicate the texture data for each chip. So, let's say a game needs 16 MB for texture data. With 16 MB of memory on each chip used for texture data, that leaves 32 MB for the frame buffer (since it is split across the chips). A 32 MB graphics card would have only 16 MB left for the frame buffer, while a TRUE 64 MB card would have a full 48 MB left.
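
Here is that split as a quick script, using the same 16 MB texture load as the example above:

# V5 5500: textures are duplicated on each chip; the frame buffer is split across them
def v5_framebuffer_mb(total_mb, chips, texture_mb):
    per_chip = total_mb / chips
    return (per_chip - texture_mb) * chips   # what's left once each chip holds its texture copy

print(v5_framebuffer_mb(64, 2, 16))  # 32 MB left for the frame buffer
print(32 - 16)                       # 16 MB left on a plain 32 MB card
print(64 - 16)                       # 48 MB left on a true 64 MB card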

As for the price: yes, you can get a 64 MB V5 for less than a 32 MB GF2, but with the GF2 you're paying for more features, including the DDR SDRAM, which is still quite expensive. Prices will plummet as soon as DDR SDRAM production ramps up to full speed.

Personally, I'm buying a 64 MB GF2 as soon as I see one I like. The two best brands I've seen so far are the Elsa and the ASUS cards. The Hercules card shows promise with its heat-sinked memory modules, but initial tests show it's not very overclockable.

Of course, I have some money to burn due to a rather unfortunate car accident before my birthday. Thanks Nationwide!! ;-)

Realistically, I'd say wait. Prices almost always drop in the summer. Only fools (like me) rush in and buy a graphics card as soon as it debuts. You need to give the manufacturers a couple of months to get competitive. Then prices should be a bit more reasonable.


DemBones @ www.riva3d.com
 

Rooster

Local Legend
Jan 4, 2000
5,287
Fort Mill, SC
www.legionoflions.com
Just to be fair, DemBones: I've seen a review with a "gold" card and beta drivers, and the V5's FSAA was MUCH better than the GF2 GTS's. The beta card/beta drivers did perform like crap in previous reviews. As well, I couldn't give a rat's *** about T&L, because Q3 is practically the only game that supports it, and I don't play, and never intend on playing, anything on the Q engine. I just don't really play FPSs; UT is it.

That being said... I would definitely wait until about two months after the cards come out, once some real reviews, not previews, have been done. Overall, the GF2 GTS looks like the more "promising" card, but it's not that much better than a GeForce DDR. It's only about 15-20% faster than a GF DDR, and do you really need to go from 60fps to 70fps?!? Gimme a break; not for an overpriced $320. No way in hell. I paid $118 for my V3 3000 and am STILL happy with it. Probably much happier than any bloke who just recently plopped down $300 on a GeForce DDR and gets the same performance I do. :)

When they come down to a reasonable price, like $200, then we can talk. But even the DDRs are not there yet, and it's been almost a year. Nvidia definitely has to work on its pricing.


|C|Rooster|PuF
Defenseman for |C|ondemned