dual core bottlenecking 560ti?

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
Well, it is odd. I don't know about dual core processors. I have a Q6600 and a Radeon 5750, and I can play all of the games you've mentioned at 1600x1200 with some AA and mostly max settings, and they run smooth (I don't really check the framerates, but I don't notice hitching or frame lag).

Ok, how is this for weird: Precision shows Metro 2033 always around 95% GPU usage, and my framerate is usually in the 20s-30s at 1920x1080 on Very High settings. That is normal according to the benchmarks. I just played a bit of Borderlands, and the GPU usage never peaks above 60%. The only way for me to achieve 60fps is to disable dynamic shadows, which is very weird since this card should handle them just fine. No other setting increases or decreases frames; only shadows have an impact. JD mentioned that UE3 is mostly CPU bound, so that tells me my CPU is what is holding back Borderlands.

This might explain why I am seeing better performance in games like Metro and Crysis 2 compared to when I had my 8800 GT. They apparently utilize the GPU, unlike Borderlands.
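For anyone who wants to log these numbers rather than eyeball the Precision overlay, here is a minimal sketch assuming the nvidia-ml-py (pynvml) and psutil Python packages are installed. It polls GPU utilization next to per-core CPU load, so you can see whether a game is pinning the GPU (GPU-bound) or pinning a core while the GPU sits around 60% (the Borderlands pattern described above).

```python
# Minimal bottleneck check: poll GPU utilization next to per-core CPU load
# while a game is running. Assumes the nvidia-ml-py (pynvml) and psutil packages.
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
        cores = psutil.cpu_percent(interval=1.0, percpu=True)
        # GPU pinned near 100% -> GPU-bound; a core pinned while the GPU
        # hovers around 60% suggests a CPU limit (the Borderlands case).
        print(f"GPU {util:3d}% | per-core CPU: {cores}")
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```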
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
84
48
I would say UE3 is unbelievably CPU-bound compared to some other modern games. If shadows are the only thing making a difference, though, that seems weird.
 
H

hotsoniaborden5

Guest
Hi everyone,
I'm new to this forum.
It is great to join this forum, hope I'm welcome in ;)
 

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
I would say UE3 is unbelievably CPU-bound compared to some other modern games. If shadows are the only thing making a difference, though, that seems weird.

I love how this bot used that avatar for more emphasis. :gonk:

If you guys could do me a favor: run Borderlands with your monitoring tool of choice (Kantham and I use Precision, for example) and let me know if your GPU usage is the same as mine.
 

Kantham

Fool.
Sep 17, 2004
18,034
2
38
I don't have Borderlands installed right now, and I'm on a bandwidth cap. I should get Crysis 2 soon.

I would assume you have L4D2 installed, but that game isn't a GPU brainer anyway.

I can tell you this: I run APB maxed out, and GPU usage mostly juggles between 40 and 50%.
 
Last edited:

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
I don't have Borderlands installed right now, and I'm on a bandwidth cap. I should get Crysis 2 soon.

I would assume you have L4D2 installed, but that game isn't a GPU brainer anyway.

I can tell you this: I run APB maxed out, and GPU usage mostly juggles between 40 and 50%.

I have played L4D2 since getting the card, and I can run insane amounts of AA and be fine, unlike before. But I also know Source is very CPU heavy, so it does go somewhat south when a horde begins or when a pipe bomb goes off.

I would be interested in seeing Crysis 2 from your point of view, though. I usually get around 90% GPU usage in that game, but when my frames drop, so does my usage.

Metro 2033 is the lone exception to that rule, as my GPU usage does not drop when my frames do. I am assuming that's because the game is much more GPU bound.

Just found this too: http://forums.nvidia.com/index.php?showtopic=202579

Sounds like the issue I am having, though I haven't tried Oblivion yet. It also ties in with the Tom's article I posted earlier. Perhaps my dual core just isn't enough for this card to power a CPU-hungry game? I will need to check my Borderlands CPU usage to see if it is in fact as CPU hungry as I think it is.
 

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
I think NVIDIA has been putting a lot of work into optimizing their Forceware drivers for multi-core processors now that 4+ cores are commonplace. So yes, you could say graphics cards these days (or at least their drivers) are more 'CPU hungry'.

If you look at the screenshot I posted in the UT2004 forum, you can see that multiple threads are being used, despite UT2004 using a single-threaded engine. What do you think is using those other cores?

As for your GPU temperature and fan speed:

You should not have to manually set a high RPM to get decent temps. Unfortunately I didn't have my 560 Ti long enough to get well acquainted with its fan control behaviour, but I wouldn't expect it to be running at 84° during normal gaming use unless you had poor airflow or high ambient temps. 84° is a normal load temp for heavy benchmarks (say, Furmark), *providing* there is plenty of fan speed leeway.

What you should do is return the fan to automatic control and run Furmark. Observe the GPU temp and fan speed stats. What should happen is that the temp rises to 85°, where it should stay (give or take a few degrees). You should see the fan speed increase to keep it at the 85° reference temp and then stabilize somewhere well before 100%.

If the temp exceeds 85° with no fan increase from whatever the idle speed is, you have a problem. If the fan speed increases to 100% and the temp continues to climb, you have another problem. I would expect the fan speed to become stable at around 70% with that card, maybe even less.

Alright, I will try that when I get home tomorrow afternoon. As far as the temps are concerned, I recall Metro and Crysis 2 topping out around 85c under normal game load with fan speed set to auto. This may be because my card is rather close to my PSU. It isn't cramped by any means, but I imagine some of the heat is being pushed back onto the card because of it.

Also, let me ask you this: if both cores are being maxed at 100%, what does that usually signify?
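As a side note, the "watch the temp and fan speed during Furmark" check quoted above can be written down as a simple logger instead of watched live. A rough sketch, assuming the nvidia-ml-py (pynvml) Python package; the 85c figure and the failure cases in the comments are just the ones described in the quoted advice.

```python
# Log GPU temperature and fan speed once per second during a stress run
# (e.g. Furmark), so the fan behaviour described above can be checked
# after the fact. Assumes the nvidia-ml-py (pynvml) Python package.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    with open("fan_log.csv", "w") as log:
        log.write("seconds,temp_c,fan_percent\n")
        start = time.time()
        for _ in range(600):  # ten minutes at one sample per second
            temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
            fan = pynvml.nvmlDeviceGetFanSpeed(gpu)  # percent of the fan's max speed
            log.write(f"{time.time() - start:.0f},{temp},{fan}\n")
            # Temps climbing well past 85c with no fan response, or 100% fan
            # with temps still rising, match the failure cases quoted above.
            time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```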
 

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
Chronic bottleneck. :rockon:

Ok, that's what I figured. That occurred with Crysis 2 of all things. I had set it to windowed mode since alt-tabbing doesn't give me a true CPU reading.

On the flip side, I did the same thing with Metro and both my cores were roughly at 50%. Seems Crysis 2 is more CPU bound, if I am reading this correctly.
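To make that Crysis 2 vs Metro comparison less of an eyeball test, per-core load can be sampled over a whole session and summarized. A small sketch, assuming the psutil Python package; the 95% cutoff is just an illustrative threshold for "pegged", not anything standard.

```python
# Sample per-core CPU load for a while and report how often each core was
# effectively maxed out. Assumes the psutil package.
import psutil

SAMPLES = 300          # 300 one-second samples, about five minutes
PEGGED = 95.0          # percent load treated as "maxed"

pegged_counts = None
for _ in range(SAMPLES):
    loads = psutil.cpu_percent(interval=1.0, percpu=True)
    if pegged_counts is None:
        pegged_counts = [0] * len(loads)
    for i, load in enumerate(loads):
        if load >= PEGGED:
            pegged_counts[i] += 1

for i, count in enumerate(pegged_counts):
    print(f"core {i}: pegged {100.0 * count / SAMPLES:.0f}% of the time")
```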
 

KaL976

*nubcake*
Nov 28, 2003
2,515
5
38
Cardiff | UK
Visit site
...if both cores are being maxed at 100% what does that usually signify?

You really needed to ask that? :eek:

I'm not reading the whole thread to find out, but I'm curious what resolution your TV/monitor is? [& I'm assuming a DVI or HDMI connection]

I'm still running my C2D @ 3GHz, 2GB RAM & a GTS 250 512MB @ 1680x1050, and it's having to work a lot harder than @ 1280x1024, but I still don't have a game that maxes the GPU, CPU, or both, even though at 'Extra' settings in MW2 [for example] it's a slideshow when there's lots of fog/dust and 30-60FPS the rest of the time.

[H]OCP have a recent article about CPU bottlenecking too; the conclusions page is here: http://enthusiast.hardocp.com/article/2009/05/19/real_world_gameplay_cpu_scaling/11
 

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
You really needed to ask that? :eek:

I'm not reading the whole thread to find out, but I'm curious what resolution your TV/monitor is? [& I'm assuming a DVI or HDMI connection]

I'm still running my C2D @ 3GHz, 2GB RAM & a GTS 250 512MB @ 1680x1050, and it's having to work a lot harder than @ 1280x1024, but I still don't have a game that maxes the GPU, CPU, or both, even though at 'Extra' settings in MW2 [for example] it's a slideshow when there's lots of fog/dust and 30-60FPS the rest of the time.

[H]OCP have a recent article about CPU bottlenecking too; the conclusions page is here: http://enthusiast.hardocp.com/article/2009/05/19/real_world_gameplay_cpu_scaling/11

Yes, I did need to ask that. I may know how to put a computer together, but my knowledge of bottlenecks and such is limited. I am not a jack of all trades, you know :)

What's wrong with just leaving Task Manager open on the Performance tab? The CPU graph continues to update in the background...

Yeah, I hear what you mean; I was just being anxious. Plus, when I got the FPS dips I would alt-tab while it was lagging to see if my CPU was affected or not.

Either way, I should be home tomorrow afternoon to check that Furmark program. I am rather anxious to see what that will produce.
 

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
Just don't leave furmark unattended, because if there really is something wrong with your card's fan control, you'll probably fry the GPU. If it gets near 100°C, stop the benchmark.

Ok, I ran the benchmark with everything I could check at 1080p, no AA. Ran it for 10 minutes and it topped out at 89c. Fan speed went up to around 60% from what I saw.

I also took a screenshot of this GPU Shark utility. The low MHz readings must be because the card is in adaptive mode. The current performance state should be accurate, since I had just finished a preset benchmark on that run.

Anything you can spot that may seem out of place?
 

Attachments

  • gpu shark capture.JPG (59.7 KB)
Last edited:
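For what it's worth, the "low MHz because the card is in adaptive mode" reading can be confirmed directly: the card drops to a low-power performance state (a higher P-number) at idle and only clocks up under load. A quick sketch, assuming the nvidia-ml-py (pynvml) Python package; run it once at idle and once with a game or benchmark going.

```python
# Print the current performance state and clocks; compare an idle run against
# a run taken while a benchmark is active. Assumes nvidia-ml-py (pynvml).
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

pstate = pynvml.nvmlDeviceGetPerformanceState(gpu)  # P0 = full speed, higher P = power saving
core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)

print(f"performance state: P{pstate}")
print(f"core clock: {core} MHz, memory clock: {mem} MHz")

pynvml.nvmlShutdown()
```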

NRG

Master Console Hater
Dec 31, 2005
1,727
0
36
35
90c at 60% fan speed sounds normal to me. Video cards won't hesitate to hit 100% fan speed if temperatures start getting too high. It's probably set to maintain roughly 90c at around 100% load. If your video card has never had to hit over 95% fan speed, it has never reached temperatures the manufacturer deemed dangerous.

If you still don't like it, just set the fan speed manually. If you want, you can even modify the card's BIOS with NVFlash and change the fan speed threshold values. The second option is obviously a lot riskier.
 

Capt.Toilet

Good news everyone!
Feb 16, 2004
5,826
3
38
42
Ottawa, KS
90c at 60% fan speed sounds normal to me. Video cards won't hesitate to hit 100% fan speed if temperatures start getting too high. It's probably set to maintain roughly 90c at around 100% load. If your video card has never had to hit over 95% fan speed, it has never reached temperatures the manufacturer deemed dangerous.

If you still don't like it, just set the fan speed manually. If you want, you can even modify the card's BIOS with NVFlash and change the fan speed threshold values. The second option is obviously a lot riskier.

Nah, I think it is fine. I achieved roughly 70c when running my fan speed at 90% through a custom profile. Surprisingly it idles around 35c, which isn't bad considering how close it is to the PSU. The size of the case and the card's proximity to the PSU are most likely factors in the higher temps.

Thought my Furmark temps were a little suspicious, and on further investigation it turns out NVIDIA throttles its cards when it detects Furmark (supposedly to prevent damage). Using the Heaven benchmark my card sits at 85C.

So I thought I should let you know: Furmark is now useless.

Thanks for the heads up. I will try this when I get home tonight. Overall I am pleased with the performance of my card. Borderlands and both Crysis games seem a tad troubling, but I think my processor is at fault there.

Thanks again guys.
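One way to tell whether the driver's Furmark throttling (rather than the fan curve) is what is keeping temps down is to log the core clock alongside the temperature during the stress run: a clock that drops while the temperature is still below the target points at throttling. A rough sketch, assuming the nvidia-ml-py (pynvml) Python package; the 85c target is just the reference temp quoted earlier in the thread.

```python
# Watch for driver throttling during a stress test: a core clock that drops
# while the temperature is still below the ~85c target suggests the driver is
# throttling the workload rather than the fan curve doing its job.
# Start this after the stress test has been running for a few seconds, so the
# first reading reflects the full load clock. Assumes nvidia-ml-py (pynvml).
import time
import pynvml

TARGET_TEMP = 85  # reference load temp mentioned earlier in the thread

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    baseline = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
    for _ in range(300):
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        # Allow small clock fluctuations before flagging anything.
        throttled = clock < 0.9 * baseline and temp < TARGET_TEMP
        flag = "  <- possible throttling" if throttled else ""
        print(f"{clock} MHz @ {temp}c{flag}")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```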