View Full Version : Aside from Envy...


Neddaf
30th Mar 2005, 02:36 PM
How is the engine looking in terms of performance on today's computers?

I have no idea when it's supposed to be out, but does anyone have any idea what the "minimum" specs will be? Even an estimate would be helpful.

Zygar
30th Mar 2005, 02:47 PM
It won't run on any card without support for SM3.0 (Shader Model 3.0).
So basically the only cards today that could run it are the GeForce 6x00 series. Now, you can forget playing it on a 6200; that's just not going to happen. So that leaves the 6600 series and the 6800 series. A 6600 apparently gets around 20fps at a standard resolution with medium settings, so if you want to play it with a decent framerate, the 6800 series is pretty much your only option.
By the way, UEngine3 supports 64-bit, multithreading, and SLI. So a computer with 4GB of RAM, a dual-core 64-bit processor, and a couple of GeForces will run it VERY nicely.

Bazzi
30th Mar 2005, 03:02 PM
It WILL run on SM2 cards.

Epic runs a business, remember?

Zygar
30th Mar 2005, 03:09 PM
I think I read somewhere that it wasn't going to have support for SM2.0. Most cards that don't have SM3.0 will be too slow to run UE3 anyway, with the exception of the Radeon X800 series.
EDIT: Looking around, I may have my facts mixed up. I just read that you must have a DX9 graphics card or the game simply won't run. I probably mixed this up with some other information I picked up somewhere. Bah.

-AEnubis-
30th Mar 2005, 03:11 PM
Well, maybe you should go back to that "somewhere" and reference your sources ;)

It is early yet though, and graphical optimizations for smaller systems usually come last.

Bazzi
30th Mar 2005, 03:11 PM
Marketing anyone?

ATI's current high-end cards don't support SM3, and those will likely still be in use in 2006.

After all, SM3 is more hype than anything else... it doesn't turn an ugly game into a fancy one.

Zygar
30th Mar 2005, 03:16 PM
By the way, SM3.0 isn't just hype. Check out this thread over at AnandTech for some good resources on SM3.0.
http://forums.anandtech.com/messageview.aspx?catid=31&threadid=1547736&enterthread=y

Bazzi
30th Mar 2005, 03:30 PM
You can compare SM2 vs SM3 to AGP vs PCIe.

Right now there isn't much difference, but eventually there will be, not because of more raw performance today but because of better technology/scalability.

But since they want to sell games to as many people as possible, they have to support older generations of PCs.
And the main problem is that ATI cards with SM3 will be released in June/July, so realistically they'll be available in fall/winter. The people who bought an X800/X850 right now want to play UE3 games, though. And since UT3 will still look almost as good with SM2 as with SM3, why force people to trash their newly bought $500 card?

Sir_Brizz
30th Mar 2005, 03:35 PM
Right now, with the engine running at full detail on a dual-core 64-bit Opteron system with 3GB of RAM and two GF6800GTs in SLI, they were clearly getting 25-35 fps consistently in the 1UP videos.

That means next-gen video cards are pretty much going to be the only ones running this sucker at full power.

They would be stupid, however, not to support the 9600 Pro series and up, which is where much of the player base is likely to lie in 2006.

RaptoR
30th Mar 2005, 03:45 PM
Current builds of UE3 probably require SM 2 or 3; however, you can expect this to change before Envy is released. Just as in UE2 games like 2k3/2k4, Epic will probably use fallback textures to allow the game to run on older cards with less advanced shader support.
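
By "fallback" I mean roughly something like this made-up little sketch (the names, fields, and shader-model numbers are purely illustrative, not Epic's actual API): the engine keeps a chain of progressively simpler materials and walks down it until it finds one the card can handle.

#include <cstdio>

// Made-up sketch of a material fallback chain; names and fields are
// illustrative only.
struct Material {
    const char* name;
    int requiredShaderModel;   // e.g. 3, 2, or 0 for a plain texture
    const Material* fallback;  // simpler material to use on weaker cards
};

// Walk the chain until we find something the card can handle.
const Material* PickMaterial(const Material* m, int cardShaderModel) {
    while (m && m->requiredShaderModel > cardShaderModel)
        m = m->fallback;
    return m;  // may be null if even the last fallback is too demanding
}

int main() {
    Material plain  = {"plain_diffuse", 0, nullptr};
    Material bumped = {"normal_mapped_sm2", 2, &plain};
    Material fancy  = {"virtual_displacement_sm3", 3, &bumped};

    const Material* chosen = PickMaterial(&fancy, 2);  // an SM2.0-only card
    std::printf("using %s\n", chosen ? chosen->name : "nothing");
    return 0;
}

A real engine would obviously hook this into its caps detection and asset pipeline, but that's the general idea: degrade gracefully instead of refusing to run.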

T2A`
30th Mar 2005, 05:54 PM
I'm waiting as long as possible before I do any upgrading. My computer runs UT2004/HL2 fine and Far Cry/Doom3 decently, and I don't buy many games, so I don't have much to worry about. Once UT3 comes along, I'm getting everything anew. Time to start saving up! It's gonna be so sweet having a dual-core processor and two video cards... And I'm gonna be so poor! \o/ Has there been any talk of dual dual-core processors? :eek:

I wonder how many gigs of textures the game is gonna ship with... There's, what, 3+ in UT2004? Since UE3 is gonna be based so much on normal mapping and such, I'm predicting 10+ gigs of textures. :D

Bang Yr Head
30th Mar 2005, 06:24 PM
I'm waiting as long as possible before I do any upgrading. My computer runs UT2004/HL2 fine and Far Cry/Doom3 decently, and I don't buy many games, so I don't have much to worry about. Once UT3 comes along, I'm getting everything anew. Time to start saving up! It's gonna be so sweet having a dual-core processor and two video cards... And I'm gonna be so poor! \o/ Has there been any talk of dual dual-core processors? :eek:

Yeah I agree, there is really no reason at all for me to upgrade right now. I can wait :)
I'll probably upgrade a month before the game ships. Man this is gonna be sweet :D

Dark Pulse
30th Mar 2005, 06:54 PM
[...]Once UT3 comes along, I'm getting everything anew.[...]

UT4.

UT1 = Unreal Tournament (AKA UT99)
UT2 = UT2003
UT3 = UT2004
UT4 = Envy

T2A`
30th Mar 2005, 06:58 PM
UT3.

UT1 = UT
UT2 = UT2003
UT2.5 = UT2004
UT3 = Envy

Dark Pulse
30th Mar 2005, 07:07 PM
Who ****ing cares? :p

Dakin
30th Mar 2005, 07:31 PM
you started it.

Dark Pulse
30th Mar 2005, 08:02 PM
you started it.
Shh.

Nereid
30th Mar 2005, 11:09 PM
Unreal Tournament = UT1
UT2003 = forget
UT2004 = UT2
Envy = UT3

Shh.

And this is where he gets his 1600 posts from.....

Dakin
30th Mar 2005, 11:49 PM
And this is where he gets his 1600 posts from.....

Exactly.

I honestly can't wait for the new UT, or for the excuse to upgrade my computer. I always enjoy upgrading; I'm always eager to buy the top-of-the-line stuff every year or so.

Dark Pulse
31st Mar 2005, 01:54 AM
What? I think it's 4, he thinks it's 3. Who cares? It probably won't be called either.

Do try not to rub me the wrong way, Dakin.

T2A`
31st Mar 2005, 02:49 AM
Watch out, Dakin! He's teh uber-1773 staff! Ph34r! Plague! Pestilence! He can smite you with his spear of big-headedness!

:rolleyes:

Dakin
31st Mar 2005, 03:19 AM
Uhhh so scared? :P


At any rate, with this slated to release forever in the future, hopefully some badass hardware comes out that I can spend too much money on. I suppose I should budget $3.5k for the next big computer upgrade I do.

B
31st Mar 2005, 07:43 AM
Do try not to rub me the wrong way, Dakin.

Or what? Your weewee starts growing? :con:

Bot_40
31st Mar 2005, 01:34 PM
I think Sweeney said a while back that the engine is designed to render a shader effect as fast as possible based on the shader version of the graphics card, or something.
So if you have a PS2.0 card the effect will still look the same, but it will just take multiple passes to render, so it will be slower.
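
To put the multi-pass idea in concrete terms, here's a tiny made-up sketch (the helper functions and shader names are invented; it just shows one draw call on an SM3.0 path versus one additive pass per light on an SM2.0 path):

#include <cstdio>
#include <string>
#include <vector>

// Minimal stand-ins so the sketch compiles; a real engine has its own types.
struct Mesh  { std::string name; };
struct Light { float x, y, z; };

static void DrawPass(const Mesh& m, const char* shader) {
    std::printf("draw %s with shader %s\n", m.name.c_str(), shader);
}

// The same lighting effect: one draw call on an SM3.0 card (the shader can
// loop over all lights), or one additive pass per light on an SM2.0 card.
void RenderLitMesh(const Mesh& mesh, const std::vector<Light>& lights, int shaderModel) {
    if (shaderModel >= 3) {
        DrawPass(mesh, "AllLightsOnePass_SM3");      // 1 pass total
    } else {
        for (size_t i = 0; i < lights.size(); ++i)
            DrawPass(mesh, "OneLightPerPass_SM2");   // N passes, same look, slower
    }
}

int main() {
    Mesh rock = {"rock"};
    std::vector<Light> lights(4);
    RenderLitMesh(rock, lights, 2);  // SM2.0 path: four passes
    RenderLitMesh(rock, lights, 3);  // SM3.0 path: one pass
    return 0;
}

Same image either way, just more work per frame on the older card.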

Neddaf
31st Mar 2005, 03:41 PM
Thanks for the info, guys. Most of it sounded like technical jargon to me, but that's OK. It just sounds like I won't be getting this one.

Tournament0
31st Mar 2005, 03:49 PM
Don't worry everyone! NVIDIA and/or ATI will probably make a new graphics card and it will probably run this game nicely.

SanitysEdge
31st Mar 2005, 04:13 PM
^Master of the obvious!
R520 will be teh pwange, for those who know what I'm talking about.
I think with my next card I will return to ATI; NVidia has me slightly disappointed.

Sir_Brizz
31st Mar 2005, 04:49 PM
That's funny, because I'm returning to nVidia because ATI has me more than slightly disappointed.

-AEnubis-
31st Mar 2005, 05:53 PM
Yeah, it feels weird... I actually stuck with nVidia even though ATI runs UEngine better, due to API prioritization, but I figure there are so many other games that use GL that it'd be worth it if I did wanna try Doom3, or the next Quake that comes out :)

I've bought cards for this franchise before, yet had no hesitation with that decision.

Dark Pulse
31st Mar 2005, 05:55 PM
Or what? Your weewee starts growing? :con:
Yeah. Yeah it does. :)

Bazzi
31st Mar 2005, 06:03 PM
Seems that ATI/nVidia users are currently hopping back and forth because one or the other has disappointed them ;)

B
1st Apr 2005, 08:08 AM
Yeah. Yeah it does. :)
Oh man it all connects....you know we're back at the log. yeh!

Tournament0
3rd Apr 2005, 07:07 AM
NVIDIA, the way it's meant to be played. :tup:
Do I need to say anything else? ;)

Dark Pulse
5th Apr 2005, 07:04 PM
NVIDIA, the way it's meant to be played. :tup:
Do I need to say anything else? ;)
Yeah, you need to wonder how you're going to run a game that's heavily based on DirectX on nVidia cards, as they're better suited for OpenGL last I looked.

-AEnubis-
6th Apr 2005, 01:30 AM
Direct3D.

Yeah, nVidia cards are definitely cross-API in 2k4.

edhe
6th Apr 2005, 04:41 AM
The difference would be a half-dozen frames anyway - IMHO, go for the better-priced option that's cooler.

L0cky
6th Apr 2005, 08:14 AM
The min will be DX9, not SM3. There isn't a card that can't do DX9 that would be able to run it above a slideshow anyway. If someone has a Rad9600 or a GF5200, then at least they'll get the game up and running and want to upgrade.

btw:
Unreal Tournament = UT1
Unreal Tournament 2003 = UT2
Unreal Tournament 2004 = UT2 Expansion Pack
Unreal Tournament: Envy = UT3

If you think UT2004 is a sequel, then you probably think GTA: Vice City is GTA4, Doom3: Resurrection of Evil is Doom4, and Painkiller: Battle out of Hell is Painkiller 2.

edhe
6th Apr 2005, 09:03 AM
shouldn't 2k3 be a UT2-beta then? :)

Bazzi
6th Apr 2005, 10:40 AM
Ack.

Renegade Retard
6th Apr 2005, 10:49 AM
That's funny, because I'm returning to nVidia because ATI has me more than slightly disappointed.


You know you miss nVidia, don't you? I have yet to try an ATI card.

Do try not to rub me the wrong way, Dakin, or a genie will fly out of my arse!

Corrected. ;)

Sir_Brizz
6th Apr 2005, 11:58 AM
Yeah, you need to wonder how you're going to run a game that's heavily based on DirectX on nVidia cards, as they're better suited for OpenGL last I looked.
Um....no.

Xipher
6th Apr 2005, 01:43 PM
Seeing as they are going to have Direct3D AND OpenGL renderers, I don't see what the fuss is about anyhow. Mainly, Direct3D will be the primary renderer in Windows, while other OSes use OpenGL. This is no different than UT200x. Now, the OpenGL renderer was also available in Windows, although I don't think it was commonly used, since it normally didn't perform as well and also had missing features (render to texture, I think it's called, which is used for the Hellbender license plate).

Timonator
6th Apr 2005, 02:26 PM
It IS called render to texture, but I really don't know why the hell they didn't do it. RenderToTexture _IS_ supported by OpenGL, just not by the game :(
Basically you can do almost everything with OGL that you can do with D3D and vice versa, though OGL has features D3D doesn't have and the other way around.

Personally I like OGL more, because the NVidia support is better, there is Linux support (really good Linux support!), and it doesn't have Microsoft all over it :P
(+ it's easier to code, so you can spend more time on the gameplay/engine code than on the graphics code!)
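
For anyone curious what render to texture looks like on the GL side, here's a rough minimal sketch using a framebuffer object (assuming GLEW and GLUT for context/extension setup, and a driver that exposes the framebuffer object entry points; this is just to show the feature exists in OpenGL, it's not code from UT2004's renderer):

// Pass 1 renders into a texture, pass 2 draws that texture on a quad.
#include <GL/glew.h>
#include <GL/glut.h>
#include <cstdio>

GLuint fbo = 0, colorTex = 0;

void initRenderTarget(int w, int h) {
    // Texture that will receive the rendered image (think: the little
    // mirrored scene on the Hellbender's license plate).
    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    // Framebuffer object with that texture as its color attachment.
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorTex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        std::printf("FBO incomplete\n");
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}

void display() {
    // Pass 1: render into the texture instead of the window.
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, 256, 256);
    glClearColor(1.0f, 0.3f, 0.0f, 1.0f);   // stand-in for the mirrored scene
    glClear(GL_COLOR_BUFFER_BIT);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // Pass 2: draw a quad textured with the result to the real backbuffer.
    glViewport(0, 0, 512, 512);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-0.5f, -0.5f);
    glTexCoord2f(1, 0); glVertex2f( 0.5f, -0.5f);
    glTexCoord2f(1, 1); glVertex2f( 0.5f,  0.5f);
    glTexCoord2f(0, 1); glVertex2f(-0.5f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(512, 512);
    glutCreateWindow("render to texture sketch");
    glewInit();  // pulls in the framebuffer object entry points
    initRenderTarget(256, 256);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}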

CyMek
6th Apr 2005, 02:36 PM
The difference would be a half-dozen frames anyway - IMHO, go for the better-priced option that's cooler.

Oh, so you mean go with ATI? :P

I don't see the point in debating the video cards. To each his own. I'm a hardcore ATI guy, and short of major unseen disasters, I will be sticking with them until the 8th circle freezes over. Then I'll take that as a sign and go back to nVidia.

Not to knock them as a company, though - I love my nForce 4 mobo. :D

Timonator
6th Apr 2005, 02:53 PM
For me ATI is teh suxx0r, but that's just due to ****ing bad Linux drivers and the worst OpenGL support I have ever seen :)

-AEnubis-
6th Apr 2005, 03:08 PM
Just because both APIs are available in a game doesn't mean it won't matter. Developers usually focus on one API as the main one, hence said discrepancies. I tried running GL on my 6600, and D3D is still more stable, with minimal quality increase from GL. Running D3D, though, means I can't jack my settings through the roof (in general).

I miss Glide.

Xipher
6th Apr 2005, 11:00 PM
Just because both APIs are available in a game doesn't mean it won't matter. Developers usually focus on one API as the main one, hence said discrepancies. I tried running GL on my 6600, and D3D is still more stable, with minimal quality increase from GL. Running D3D, though, means I can't jack my settings through the roof (in general).

I miss Glide.
Yes, but when that main one isn't available across all platforms (Mac and Linux get the OpenGL renderer, as Direct3D is a Microsoft thing), it's good to give a choice. Although I will agree the OpenGL renderer isn't nearly as good in Windows as Direct3D, it's basically the only option on other PC OSes.

-AEnubis-
7th Apr 2005, 02:08 AM
Actually, I prefer the way games optimized for GL look, as opposed to games optimized for D3D; it's just that this game looks better in D3D because of that optimization.

edhe
7th Apr 2005, 05:05 AM
OGL is only included for the Linux users to be able to play.

In that case, go for nVidia, who have a stronger OGL architecture.

ATI have been knocking out the best framerates for 2k4 at the lower cost...

Look at benchmarks and prices and choose for *yourself*, people.

Doikor
7th Apr 2005, 05:42 AM
Well, it's not only that. To make UT3 really portable to all platforms they need OGL (as I remember, the PlayStation doesn't have DirectX). And the ATI drivers suck big time in Linux. (You should be happy to get a FireGL to work.)

Timonator
7th Apr 2005, 09:45 AM
Yeah, and DON'T TRY TO UPDATE THEM!!11oneone
They don't tell you that for nothing :D
(Updating the Linux drivers from ATI may result in either a big explosion, a big fireball, a big cloud of smoke, or your mom getting shaved :) )