Question about CPU and graphical settings


Capt.Toilet

Good news everyone!
Feb 16, 2004
Ottawa, KS
Ok so here is the story.

Old PC
Intel quad core Q6600 @ 2.4GHz
2GB OCZ PC6400
8800GT 512MB
600W PSU
WinXP

The computer flaked out last week, so in the meantime I'll be using this:

AMD Phenom 9150e quad @ 1.8GHz (will change when I get paid)
7GB PC6400
Same graphics card
300W PSU (will change this out when funds come in)
MSI motherboard, 780G chipset
Windows Vista 64-bit (will get 7 hopefully soon)

My question is: does CPU speed affect what a game decides is an acceptable optimal setting? I will use Crysis as an example here.

On the old rig, Crysis set everything to High as its optimal settings, and it ran well. On the current rig it defaults everything to Medium. Now, I know this processor is crap and is probably a bottleneck, but I figured that when a game chose what it felt was best, it was looking at the graphics card for those settings.

Thanks in advance.

Darkdrium

20% Cooler
Jun 6, 2008
Montreal
Yes, of course it will affect how games automatically adjust their settings.
For the few games I have that do this, I've found they compare your hardware against the system requirements programmed into them.
If your hardware looks too weak (here, the CPU speed), the game lowers the settings, even though the game itself might be GPU-bound and the CPU wouldn't be a bottleneck at all for that particular title.
That said, 1.8GHz is pretty low by today's standards. ;) At least for Intel; I don't know if AMD still has this "rated" vs. "effective" speed thing.
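
Just to illustrate the kind of check I mean (the tiers, names, and numbers below are completely made up for illustration, not any real game's code), the lowest-scoring component drags the whole preset down:

[CODE]
// Made-up sketch of an auto-detect routine: compare detected hardware
// against requirement tiers baked into the game.
#include <iostream>

enum class Preset { Low, Medium, High };

struct Hardware {
    double cpuGHz;    // raw clock speed as reported by the OS
    int    gpuVramMB; // detected video memory
};

struct Tier { Preset preset; double minCpuGHz; int minVramMB; };

Preset autoDetect(const Hardware& hw) {
    // Highest tier first; both checks must pass, so a slow CPU clock
    // drags the preset down even if the GPU alone could go higher.
    const Tier tiers[] = {
        { Preset::High,   2.2, 512 },
        { Preset::Medium, 1.6, 256 },
    };
    for (const Tier& t : tiers)
        if (hw.cpuGHz >= t.minCpuGHz && hw.gpuVramMB >= t.minVramMB)
            return t.preset;
    return Preset::Low;
}

const char* name(Preset p) {
    switch (p) {
        case Preset::High:   return "High";
        case Preset::Medium: return "Medium";
        default:             return "Low";
    }
}

int main() {
    const Hardware oldRig{ 2.4, 512 }; // Q6600 @ 2.4GHz, 8800GT 512MB
    const Hardware newRig{ 1.8, 512 }; // Phenom 9150e @ 1.8GHz, same card
    std::cout << "old rig: " << name(autoDetect(oldRig)) << "\n"; // High
    std::cout << "new rig: " << name(autoDetect(newRig)) << "\n"; // Medium
}
[/CODE]

Same GPU in both rigs, but the lower CPU clock alone is enough to knock the preset from High down to Medium.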

NRG

Master Console Hater
Dec 31, 2005
Generally, games aren't very sophisticated about CPU speed detection. Normally they just look at the frequency, and the higher the better; for example, games often assume that if you have a 3.0GHz+ processor, you spent a lot of money on a fast one. This logic would probably be a lot more valid if overclocking didn't exist.

No big deal, though. It only becomes a problem if the game checks minimum requirements and prevents you from starting/installing if you fail.
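
Something like this toy check, in other words (the thresholds are invented for illustration); note how an overclocked budget chip sails right past it:

[CODE]
// Toy version of a frequency-only CPU check. Only the reported clock
// is consulted, so overclocking fools it completely.
#include <iostream>

const char* rateCpu(double reportedGHz) {
    if (reportedGHz >= 3.0) return "fast";     // "must be an expensive chip"
    if (reportedGHz >= 2.0) return "adequate";
    return "slow";
}

int main() {
    std::cout << rateCpu(1.8) << "\n"; // Phenom 9150e at stock: "slow"
    std::cout << rateCpu(3.2) << "\n"; // overclocked budget chip: "fast"
}
[/CODE]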

Capt.Toilet

Good news everyone!
Feb 16, 2004
Ottawa, KS
Darkdrium said:
Yes, of course it will affect how games automatically adjust their settings.
For the few games I have that do this, I've found they compare your hardware against the system requirements programmed into them.
If your hardware looks too weak (here, the CPU speed), the game lowers the settings, even though the game itself might be GPU-bound and the CPU wouldn't be a bottleneck at all for that particular title.
That said, 1.8GHz is pretty low by today's standards. ;) At least for Intel; I don't know if AMD still has this "rated" vs. "effective" speed thing.

Yeah, I don't think they rate their CPUs that way anymore. I am noticing a fair amount of FPS drops in games that I know ran perfectly on my other machine. It is a shame too, because I used to have an AMD 3200+ @ 1.6GHz that felt like it was running at over 2GHz.

Anywho thanks for the reply :)

EDIT: And you too NRG :)

-n7-

Member
May 12, 2006
Edmonton, AB
Darkdrium said:
...don't know if AMD still has this "rated" vs. "effective" speed thing.

Clock for clock, AMD used to offer better performance, hence their rating system.

But ever since Intel came out with the Core 2 Duo (or Core Duo on notebooks), it has been the opposite for pretty much every comparable Intel vs. AMD CPU: Intel is faster, clock for clock.