Tim Sweeney: "End of the GPU by 2020"

Dark Pulse

Dolla, Dolla. Holla, Holla.
Sep 12, 2004
6,186
0
0
38
Buffalo, NY, USA
darkpulse.project2612.org
Tim Sweeney's a man who's made a lot of interesting predictions about just how gaming and PC hardware are going to evolve. His latest prediction: we're all going back to software rendering.

Epic Games founder and 3D engine architect Tim Sweeney has presented what he calls "The end of the GPU roadmap", in which he essentially says that GPUs as we know them are too limited, and predicts that by 2020 developers will switch to a more flexible massively parallel programming model where all fixed functionality (texture filtering, anti-aliasing, rasterization) has been replaced with a software implementation, backed by massive computing power. There's no denying that such a prospect is exciting for software engineers, and Tim Sweeney commands a lot of respect: the Unreal engine was one of the best software 3D engines before we all jumped onto the GPU boat, and Unreal Engine 3 is used in about 150 games today.
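To give a flavour of what "replacing rasterization with software" could look like, here is a minimal sketch (my own illustration, not anything from Sweeney's slides): a CUDA-style kernel that performs the per-pixel triangle coverage test that a fixed-function rasterizer does today. The names and the simple edge-function test are assumptions made for the example.

// Illustration only: per-pixel triangle coverage computed "in software" on a
// data-parallel processor instead of by fixed-function rasterizer hardware.
__global__ void rasterizeTriangle(unsigned int* framebuffer, int width, int height,
                                  float2 a, float2 b, float2 c, unsigned int color)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per pixel
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    float px = x + 0.5f, py = y + 0.5f;              // pixel centre

    // Edge functions: the pixel is inside the triangle if it lies on the
    // same side of all three edges.
    float e0 = (b.x - a.x) * (py - a.y) - (b.y - a.y) * (px - a.x);
    float e1 = (c.x - b.x) * (py - b.y) - (c.y - b.y) * (px - b.x);
    float e2 = (a.x - c.x) * (py - c.y) - (a.y - c.y) * (px - c.x);

    if ((e0 >= 0 && e1 >= 0 && e2 >= 0) || (e0 <= 0 && e1 <= 0 && e2 <= 0))
        framebuffer[y * width + x] = color;
}
// Launched over the whole screen, e.g. rasterizeTriangle<<<grid, block>>>(...),
// with grid/block sized to cover width x height pixels.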
 

elmuerte

Master of Science
Jan 25, 2000
1,936
0
36
42
the Netherlands
elmuerte.com
Current games already make extensive use of software in their rendering. Remember, pixel and vertex shaders are actually small software programs executed by the graphics card.
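For anyone who hasn't seen one: a pixel shader is conceptually just a tiny function the card runs once for every pixel. Real shaders are written in HLSL/GLSL/Cg; the sketch below is my own toy example expressing the same idea as a CUDA kernel.

// Rough sketch of what a "pixel shader" boils down to: one small program,
// run in parallel for every pixel, that decides that pixel's colour.
__global__ void toyPixelShader(uchar4* framebuffer, int width, int height, float time)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // A trivial animated gradient stands in for real lighting/texturing math.
    float u = (float)x / width;
    float v = (float)y / height;
    float glow = 0.5f + 0.5f * sinf(time + u * 10.0f);

    framebuffer[y * width + x] = make_uchar4((unsigned char)(u * 255.0f),
                                             (unsigned char)(v * 255.0f),
                                             (unsigned char)(glow * 255.0f),
                                             255);
}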
 

Soggy_Popcorn

THE Irish Ninja
Feb 3, 2008
564
0
0
Maybe, but not that soon. There's a reason GPUs are used so much: they're the only viable method of the massively parallel computing he talks about. Just try doing that on a consumer level CPU. Nope, maybe 2050 ;)
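A rough illustration of that gap (my own toy example, nothing from the talk): the same per-particle update written as a single-threaded CPU loop and as a CUDA kernel that hands one element to each of thousands of GPU threads. A 2009 consumer quad-core offers a handful of hardware threads; a high-end GeForce of the time keeps tens of thousands of threads in flight.

// The same per-particle update, serial on the CPU vs. massively parallel on the GPU.

// CPU version: one thread walks every particle in turn.
void updateCpu(float* pos, const float* vel, int n, float dt)
{
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// GPU version: each of n threads handles exactly one particle.
__global__ void updateGpu(float* pos, const float* vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;
}

// Launched with enough 256-thread blocks to cover all particles:
//   updateGpu<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, dt);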
 

JaFO

bugs are features too ...
Nov 5, 2000
8,408
0
0
Sounds good in theory, because most of these hardware limitations never made any sense.
In a way this mimics the way computers have evolved (from static/fixed processing units that could barely add two digits to the complex CPUs we've got today).

Let's just hope that this 'fully programmable parallel processing unit' gets a language that is standardised between manufacturers like the current CPUs are (and preferably a bit more).

There's just one question remaining: will this super-powerful PPU ever replace the CPU as the main processing chip?
And if it does so ... won't that have proven Intel's idea that the CPU should handle everything?
 

Phopojijo

A Loose Screw
Nov 13, 2005
1,458
0
0
37
Canada
I'd personally say we'd still have two chips (not necessarily separate) -- one General Purpose, one heavily parallel.
 

Kantham

Fool.
Sep 17, 2004
18,034
2
38
Funnily enough, I had made a thread a few days ago about people's predictions for 2020. :p
What amazes me is that Tim is talking PC. Epic has only proven to be a console company nowadays.
 

SleepyHe4d

fap fap fap
Jan 20, 2008
4,152
0
0
There's just one question remaining: will this super-powerful PPU ever replace the CPU as the main processing chip?
And if it does so ... won't that have proven Intel's idea that the CPU should handle everything?

What? You don't understand what he's saying. :lol:

Edit: Instead of making a dick comment and not explaining I'll add that that is exactly what Tim is talking about, the cpu doing everything. Not a separate processor.

Also, Tim has mentioned this before... so it's not really new. I don't remember where though, anyone remember or have a link? I think it was a UT3 interview or it was around that time with the UT3 hype.
 
Last edited:

Continuum

Lobotomistician
Jul 24, 2005
1,305
0
0
43
Boise
By 2020 you'll be playing PC games that are installed on a Google/Amazon VM and you'll only need 256 MB of RAM with a 512 MHz processor.
 

elmuerte

Master of Science
Jan 25, 2000
1,936
0
36
42
the Netherlands
elmuerte.com
What? You don't understand what he's saying. :lol:

Edit: Instead of making a dick comment and not explaining I'll add that that is exactly what Tim is talking about, the cpu doing everything. Not a separate processor.

No he's not. One of the things Tim is saying in his presentation is that the GPU as we currently know it (communication through APIs like Glide/OpenGL/DirectX) will go away. Instead, you will get a more general-purpose processor on which you execute code. This is already happening right now on the GeForce 8x and later through CUDA, and similarly on AMD/ATI. Combine this with the "normal" CPU and you get the platform Tim is talking about.
The "CPU" platform Tim is talking about is much like the Cell architecture (not as it currently exists in the PS3 - a lot of changes are needed, but the overall idea is the same). Instead of a GPU you have a bunch of stream processors (which most modern GPUs contain and use for the shaders) which perform "pure functional" routines (like physics, collision detection, scene traversal, path finding, ...) and execute graphics shaders. And you have a bunch of cores which run the game state code and other more structurally complex objects. And a "controller" which does the old and slow sequential stuff like I/O.
This is different from the current system because there is no (rendering) pipeline. The software is in full control of what is executed by the various processors, and when.
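To make the "pure functional routines on stream processors" part concrete, here's a minimal CUDA-style sketch (my own example, not from the presentation): a physics integration step written as a data-parallel kernel, with no rendering pipeline involved at all; the game code decides when to launch it, just as it would schedule a shading or collision pass.

// Non-graphics work on the stream processors: a simple Euler integration step
// for a particle system. There is no fixed pipeline here; the application
// schedules this kernel itself, alongside shading, collision, etc.
struct Particle {
    float3 position;
    float3 velocity;
};

__global__ void integrate(Particle* particles, int count, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;

    Particle p = particles[i];          // read old state

    p.velocity.y -= 9.81f * dt;         // apply gravity
    p.position.x += p.velocity.x * dt;  // advance position
    p.position.y += p.velocity.y * dt;
    p.position.z += p.velocity.z * dt;

    particles[i] = p;                   // write new state: a "pure" transform
}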
 
Last edited:

Zur

surrealistic mad cow
Jul 8, 2002
11,708
8
38
48
So to summarize things, there's a planned shift from a specialized circuit that handles graphics only to something like a general-purpose signal processor which can handle other tasks like physics simulation?
 
Last edited:

SleepyHe4d

fap fap fap
Jan 20, 2008
4,152
0
0
No he's not. One of the things Tim is saying in his presentation is that the GPU as we currently know it (communication through APIs like Glide/OpenGL/DirectX) will go away. Instead, you will get a more general-purpose processor on which you execute code.

Well I didn't see the presentation, but if this is true then they need to make the article more clear. I didn't read anything about a new type of processor, just "software implementation, backed by massive computing power." So I just assumed he was talking about CPUs rendering everything with software.

I was also going by what I read on Tim talking about this subject before. Don't really remember the details though.

As for the rest, I'm completely lost, I don't understand the inner workings of GPUs. :lol:
 
Last edited:
It's still a HELLUVA long time, but why do I feel like I wasted $500 on a GPU?

Or am I just misunderstanding the fact that GPUs will eventually be considered part of a computer system's general-purpose processing and not just used for graphics/physics calculations? (Meaning, they won't be obsolete - instead, integrated.)

I feel like the title "End of the GPU by 2020" breeds a misconception. Does it mean GPUs become useless, or that GPUs simply turn into something more than a "video card"? (Seeing the cGPU architecture nVidia wants to introduce with the GeForce 300 series.)
 

Zur

surrealistic mad cow
Jul 8, 2002
11,708
8
38
48
As for the rest, I'm completely lost, I don't understand the inner workings of GPUs. :lol:

The strength of graphics processors and DSPs is that they can process a lot of information in parallel (think of a factory with many production lines), whereas a CPU serves its purpose as a jack of all trades.

Or am I just misunderstanding the fact that GPUs will eventually be considered part of a computer system's general-purpose processing and not just used for graphics/physics calculations? (Meaning, they won't be obsolete - instead, integrated.)

As I understand things, they will gradually be transformed into something that can serve a larger purpose. Graphics would then already be integrated into the main chip. The advantage would be that information could transit very rapidly (a few tens of GB per second), but that would make CPUs more dense and more expensive to produce.

Since the 0.18 micron barrier was broken in terms of fabrication process, it's become apparent that silicon is slowly reaching its limits, with data being corrupted by quantum effects. Since transforming electronic signals into optical ones is still messy, the only way to go forward at the moment is to design efficient multicore or multiprocessor systems.

At the moment, the fact that information has to transit through the main board makes things much slower even if a dedicated bus like PCI-E is used. However, things are handled such that all the raw information is digested in one go and all that is needed when handling graphics is to regurgitate the processed information to the screen. There are a few other types of tasks that could be handled in the same way.
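In CUDA terms, that "digest it in one go" pattern looks roughly like the sketch below (a simplified example of my own, not tuned code): one bulk copy across the PCI-E bus, the whole batch processed in parallel on the card, and one bulk copy back, instead of chatting back and forth per element.

#include <cuda_runtime.h>
#include <vector>

// Each GPU thread squares one element of the batch.
__global__ void square(float* data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= data[i];
}

int main()
{
    const int n = 1 << 20;                      // ~1 million elements
    std::vector<float> host(n, 2.0f);

    float* device = 0;
    cudaMalloc((void**)&device, n * sizeof(float));

    // One bulk transfer over the bus to the card...
    cudaMemcpy(device, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // ...the whole batch chewed through in parallel...
    square<<<(n + 255) / 256, 256>>>(device, n);

    // ...and one bulk transfer back with the results.
    cudaMemcpy(host.data(), device, n * sizeof(float), cudaMemcpyDeviceToHost);

    cudaFree(device);
    return 0;
}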
 
Last edited:

TurdDrive

sam k
Oct 31, 2008
3,445
2
38
WALES
I think when games and such reach a point where they are 100% lifelike, CPUs will slowly start to creep up on the GPU until they overtake it, thus making it redundant.