Epic's Mike Capps: We Love PC, But...

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
I don't think it matters where you focus development as long as you give the PC the appropriate amount of polish. Look at Borderlands. It's essentially a console game, but it works perfectly well on the PC anyway. Why? Because they spent enough time making the PC version what it should be instead of just copying and pasting the code (I know that's understating the amount of work involved).

It's also a good example that, if your game is good enough, having a few issues in the UI doesn't really matter (as long as it is mostly functional).
 

WHIPperSNAPper

New Member
Mar 22, 2003
444
0
0
Bad developers are killing PC games.

Shocking online functionality is killing PC games.

DRM is killing PC games.

It's a simple triumvirate of failure, and we're the only ones who are going to suffer. There's a reason I long since made the choice to stop supporting Epic products.

I agree.

Ironically, if any games were going to be immune to piracy, it would be the ones intended for online multiplayer--like Unreal Tournament. The company could just keep track of the specific CD keys it issued and only allow those keys to play online. Stardock's Impulse system seems to have kept Sins of a Solar Empire pirates offline, or at least limited them to inferior Hamachi setups.
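
Just to make that concrete, here is a rough Python sketch of the kind of server-side key check I mean; the key values and function name are invented for illustration, and this isn't how Epic or Stardock actually implement it:

Code:
# Hypothetical server-side gate: only keys the publisher actually issued
# (and hasn't revoked) are allowed online. Keys below are made up.
ISSUED_KEYS = {"AAAA-BBBB-CCCC-DDDD", "EEEE-FFFF-GGGG-HHHH"}  # would come from the publisher's database
REVOKED_KEYS = {"EEEE-FFFF-GGGG-HHHH"}  # e.g. keys known to be leaked or duplicated

def can_play_online(cd_key: str) -> bool:
    """Allow a client online only if its key was issued and not revoked."""
    key = cd_key.strip().upper()
    return key in ISSUED_KEYS and key not in REVOKED_KEYS

print(can_play_online("aaaa-bbbb-cccc-dddd"))  # True: issued and still valid
print(can_play_online("1234-5678-9012-3456"))  # False: never issued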
 

WHIPperSNAPper

New Member
Mar 22, 2003
444
0
0
S2 is great, but I know them better for their lesser-known titles, Savage and Savage 2 (you should check out Savage 2 if you get around to it)

Savage 2 is a great game--and it is FREE! (S2 makes its money by selling a small improvement to the game for $10, but it's optional.) I have been having a huge amount of fun with it. It's a melee-style FPS that plays like Onslaught, and it also has RPG and RTS elements. One player can be the team commander (the RTS element) and decide where things get built, along with buffing players. The RPG element is that on each map players can level up to level 15, increasing abilities such as strength and health. There are also different character classes you select on each spawn. Check it out.

http://www.Savage2.com
 

Severin

New Member
Feb 8, 2008
199
0
0
Yeah, I don't understand why somebody would want the death of GPUs.
What would we be stuck with? Onboard?

Then we have to either replace the motherboard or rig our own crap.

EFF THAT~!

To replace them with multi-core CPUs or CPU-like graphics cards.
The majority of CPUs tend to cost less than high-end GPUs, so upgrades to play the latest games may work out cheaper if this comes to pass.

As for the "why would they want to" part: because you don't get limited by the fact that your hardware is designed to do things in a set way. With a more generalised hardware design it's down to the developers how their game renders to the screen, giving far more scope for original-looking effects etc.
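
To illustrate the "developers decide how it renders" point, here is a toy Python sketch of fully software-defined rendering; the tiny resolution and the gradient "shader" are invented for the example and aren't from any real engine:

Code:
# On general-purpose hardware the per-pixel logic is just code the
# developer writes, rather than a fixed-function pipeline stage.
WIDTH, HEIGHT = 8, 4  # deliberately tiny, illustrative framebuffer

def my_pixel_shader(x: float, y: float) -> float:
    # Any effect expressible as code; here a simple diagonal gradient.
    return (x + y) / 2.0

framebuffer = [
    [my_pixel_shader(x / (WIDTH - 1), y / (HEIGHT - 1)) for x in range(WIDTH)]
    for y in range(HEIGHT)
]

for row in framebuffer:
    print(" ".join(f"{v:.2f}" for v in row))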
 

ambershee

Nimbusfish Rawks
Apr 18, 2006
4,519
7
38
37
Nomad
sheelabs.gamemod.net
To replace them with multi-core CPUs or CPU-like graphics cards.
The majority of CPUs tend to cost less than high-end GPUs, so upgrades to play the latest games may work out cheaper if this comes to pass.

As for the "why would they want to" part: because you don't get limited by the fact that your hardware is designed to do things in a set way. With a more generalised hardware design it's down to the developers how their game renders to the screen, giving far more scope for original-looking effects etc.

This is the argument from a marketing perspective, but in reality it's completely backwards. Think about it:

1) If you start faffing about with how your CPU works in order to allow it to handle the kinds of rapid mathematical processing a GPU is designed for, it is going to become considerably more expensive as a result.

2) Most people who upgrade a machine to play games will upgrade their GPU once or twice over the lifetime of their machine. Upgrading a CPU is less common because it often comes with a more significant generational gap that also requires a motherboard upgrade, and it isn't as flexible as an expansion card with a suitable interface. The result of needing to upgrade to keep up with games? Not more memory and a new graphics card - you now effectively need a whole new machine.

3) When you take a piece of hardware designed to do a very specific task and try to generalise it to perform more generic tasks, it becomes **** at doing that specific task. Graphics cards exist because of a need to do very, very specific tasks, and a generalised piece of hardware does not suit these ever-evolving needs, nor can it easily keep up with them. Any ****ty Intel motherboard with integrated graphics is a fantastic example of this.
 

Maniacbob

New Member
Jul 7, 2009
7
0
1
I don't think it matters where you focus development as long as you give the PC the appropriate amount of polish. Look at Borderlands. It's essentially a console game, but it works perfectly well on the PC anyway. Why? Because they spent enough time making the PC version what it should be instead of just copying and pasting the code (I know that's understating the amount of work involved).

It's also a good example that, if your game is good enough, having a few issues in the UI doesn't really matter (as long as it is mostly functional).

Agreed. Most computer games now slap the same UI they made for the console onto the PC version and just assume it will work, which is annoying and borderline offensive. That was the problem I had with Fallout 3: the terminals and computers are annoying to navigate, and the same goes for the Pip-Boy. Ironically, Fallout 3 was voted the second-best PC game of all time, but whatever. The funny part is that a few slight changes would make a lot of these games exponentially better.
 

elmuerte

Master of Science
Jan 25, 2000
1,936
0
36
42
the Netherlands
elmuerte.com
I call bull****. What games do have a proper UI? Or for that matter, what software does have a proper UI? People will always find something in the game/software to complain about. People are just looking for excuses to bitch about things.
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
It really just depends on how functional and user-friendly the UI is. It's not the primary factor, but it is a very noticeable contributing factor.
 

Grobut

Комиссар Гробут
Oct 27, 2004
1,822
0
0
Soviet Denmark
I call bull****. What games do have a proper UI? Or for that matter, what software does have a proper UI? People will always find something in the game/software to complain about. People are just looking for excuses to bitch about things.

Feeling a little butthurt over something there? :lol:

The thing about UIs is that a UI is a utility, which makes it unremarkable. It's something we take for granted until the day it stops working, or until we get one that is horribly flawed. Much like a screwdriver: I don't really remember all the ones I've owned or used that worked perfectly fine, but I do remember the two cheap ones that broke whilst I was using them for a normal, mundane task, and the one with the badly shaped handle that I couldn't get a good grip on no matter what I tried.

The same is true of UIs: the best a UI can be is forgettable, because that means it just worked and nothing about it really bothered you. It's the bad ones that really stick out in your mind, and it's the bad ones that people will mention and argue about.

And there's really no shortage of games out there where the UI never got brought up, because it just worked. A UI doesn't have to be groundbreaking, it doesn't have to be artistic or pretty; it just has to work and flow with a minimum of annoyance. Then it's a perfectly forgettable utility, and forgettable is good. It's only when you cock one up that we take notice!
 

Bi()ha2arD

Toxic!
Jun 29, 2009
2,808
0
0
Germany
phobos.qml.net
I call bull****. What games do have a proper UI? Or for that matter, what software does have a proper UI? People will always find something in the game/software to complain about. People are just looking for excuses to bitch about things.

It's probably highly subjective, but a well-thought-out UI will make workflows faster and games easier to use.
 

Severin

New Member
Feb 8, 2008
199
0
0
@ambershee:

On the whole I agree with you. I was just trying to clarify the case for Larrabee-like cards.

Though point 1 does not really hold water. CPUs are general-purpose devices; you are not changing them as such to be graphics chips (though you could have dedicated cores for that purpose), it's just continuing the trend that has already started: more cores per CPU. The price of a CPU at a particular (relative) performance level has not really changed that much in many years.


[edit] Point 3: while I don't disagree, it is the case that GPUs are becoming more generalised with each new generation, and Intel integrated graphics (though crap) is dedicated hardware.

A real benefit from something like Larrabee or a yet more generalised GPU is that when you're not playing games it can do general computing tasks, acting as a second 'cluster' of CPUs. Great for heavy-duty number-crunching tasks.
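
As a rough Python sketch of that "second cluster of CPUs" idea (the workload is an arbitrary stand-in, nothing Larrabee-specific):

Code:
# When the parallel hardware isn't rendering, the same cores can chew
# through a general number-crunching job, one worker per available core.
from multiprocessing import Pool

def heavy_task(n: int) -> int:
    # Stand-in for any expensive per-item computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [200_000] * 16
    with Pool() as pool:
        results = pool.map(heavy_task, jobs)
    print(len(results), "jobs finished")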

I do see that the lines between GPU and CPU are starting to blur and that (imo) GPUs won't be around forever. Though whether that is because you will have an NVIDIA-based PC or a PC with a massively parallel Intel CPU doing all the graphics, and over what timescale this will happen, who can tell.
 

Severin

New Member
Feb 8, 2008
199
0
0
...

The thing about UIs is that a UI is a utility, which makes it unremarkable. It's something we take for granted until the day it stops working, or until we get one that is horribly flawed. Much like a screwdriver: I don't really remember all the ones I've owned or used that worked perfectly fine, but I do remember the two cheap ones that broke whilst I was using them for a normal, mundane task, and the one with the badly shaped handle that I couldn't get a good grip on no matter what I tried.

The same is true of UIs: the best a UI can be is forgettable, because that means it just worked and nothing about it really bothered you. It's the bad ones that really stick out in your mind, and it's the bad ones that people will mention and argue about.

And there's really no shortage of games out there where the UI never got brought up, because it just worked. A UI doesn't have to be groundbreaking, it doesn't have to be artistic or pretty; it just has to work and flow with a minimum of annoyance. Then it's a perfectly forgettable utility, and forgettable is good. It's only when you cock one up that we take notice!


Spot on.
I would add that a good UI also gives a reasonable amount of access to the software's feature set, allowing the user to easily configure and use the software in the manner they prefer. Subjective, I know, but if a large number of people don't find that it meets those criteria, it's a bad UI.
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
Graphics will never move onto the CPU in any usable form for lots of reasons, but the primary one is heat. CPUs already have a somewhat difficult time dissipating the heat they generate. Imagine sticking a GPU on there as well that will easily quadruple the heat output (on a related note, GPUs tend to run about 40°C hotter than CPUs do). Anything more serious than an Intel GMA is not going to be able to coexist with a CPU like that.
 

Severin

New Member
Feb 8, 2008
199
0
0
It's not a case of adding a GPU to the CPU. It's one replacing the other as they both move towards each other in terms of design.

Or to put it another way: the GPUs Nvidia and the like make may at some point be called CPUs, and rival manufacturers' CPUs will be easily capable of coping with massively parallel tasks.
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
It's not a case of adding a GPU to the CPU. It's one replacing the other as they both move towards each other in terms of design.

Or to put it another way: the GPUs Nvidia and the like make may at some point be called CPUs, and rival manufacturers' CPUs will be easily capable of coping with massively parallel tasks.
They already are, but it doesn't matter. If GPUs were going to take over for CPUs or vice versa, that shift would have already taken place last generation (when a physical barrier was seemingly reached on processor clock speeds).

To be honest, other than AMD/ATI, it's not in CPU manufacturers' best interest to push GPU makers out, nor vice versa. Intel keeps pushing their stupid Intel GMA, but you can be sure that it will continue to be the red-headed stepchild of GPUs for many years to come.
 

elmuerte

Master of Science
Jan 25, 2000
1,936
0
36
42
the Netherlands
elmuerte.com
It's probably highly subjective, but a well-thought-out UI will make workflows faster and games easier to use.

Only if your goal for the UI is a fast workflow.
But what if the goal is to make the UI part of the game world? Fallout 3 is a great example of this: the UI directly reflects the game world. It doesn't have a fast workflow, but then not a single computer (or other interface) in the Fallout 3 world has a fast workflow.

UIs shouldn't be forgettable per se. Especially in RPG or strategy games the UI plays a very important role. For a lot of other genres the UI doesn't serve a role beyond starting the game and changing some settings.
 

Severin

New Member
Feb 8, 2008
199
0
0
@Sir_Brizz
If GPUs were going to take over for CPUs or vice versa, that shift would have already taken place last generation (when a physical barrier was seemingly reached on processor clock speeds).

Clock speeds have not gone up much over the last few years, but processing power has gone up significantly and will continue to do so.

A 3.2 GHz P4 is not even in the same ballpark as an i7 at the same clock speed.
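
Back-of-the-envelope version in Python, with made-up per-clock and core figures purely to illustrate the point (not benchmark data):

Code:
# Very rough model: usable throughput ~ cores * instructions-per-clock * clock.
# The IPC and core counts are illustrative guesses, not measurements.
def relative_throughput(cores: int, ipc: float, clock_ghz: float) -> float:
    return cores * ipc * clock_ghz

p4 = relative_throughput(cores=1, ipc=1.0, clock_ghz=3.2)  # single core, low IPC
i7 = relative_throughput(cores=4, ipc=2.5, clock_ghz=3.2)  # quad core, higher IPC

print(f"i7 is roughly {i7 / p4:.0f}x the P4 at the same clock")  # ~10x with these numbers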
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,020
83
48
Who said it was?

My point was that the move to building multiple cores into the same unit should have been the moment when GPUs were built in. However, there are tons of logistical problems that would have to be overcome. The business models for CPUs and GPUs ruin what could be a good combination. Would they have EVERY model of CPU combined with EVERY model of GPU, so a single CPU model would have 20 different variants for all the GPUs it could be combined with? This is just a tiny example of the problems with doing what AMD is trying to do and what you are suggesting. The most probable outcome is that GPUs built into CPUs will be extremely neutered versions of the bulkier (and more expensive) add-on cards.