Tim Sweeney: "End of the GPU by 2020"

elmuerte
Master of Science · Joined Jan 25, 2000 · 1,936 posts · the Netherlands · elmuerte.com
Well I didn't see the presentation, but if this is true then they need to make the article more clear. I didn't read anything about a new type of processor, just "software implementation, backed by massive computing power." So I just assumed he was talking about CPUs rendering everything with software.

I didn't see the presentation either, I just read the slides that were linked from the article.

It's still a HELLUVA long time but why do I feel like I wasted $500 on a GPU?

You feel like that because you spent $500 on a graphics card (not a GPU). Buying a $250 graphics card is more than enough.

Or am I just misunderstanding the fact that GPUs will eventually be considered part of a computer system's general-purpose processing and not just used for graphics/physics calculations? (Meaning, they won't be obsolete - instead, integrated.)

I feel like the title of "End of the GPU by 2020" creates a misconception. Does it mean GPUs become useless, or that they simply turn into something more than a "video card"? (Seeing the cGPU architecture NVIDIA wants to introduce in the GeForce 300s.)

Graphics cards won't cease to exist, but the specialized 3D rendering GPU will. You still need something that produces a VGA/DVI/HDMI/... output so that stuff will be shown on your monitor. But a lot of the rendering, which is currently more or less offloaded to the GPU through an API and rendering pipeline, will be performed by the "system", where "system" is the combination of what we currently call the CPU and parts of the current GPU (the parts that are used for GPGPU, see Wikipedia). What this system will eventually look like is unknown. It might be a module you insert into a slot on your motherboard, like you did in the past with the Pentium II, Pentium III and first-generation Athlons. Or it might be multiple processors you place on your motherboard, like you did even further back with the separate floating-point coprocessor.
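To make that a bit more concrete, here is a minimal sketch (my own illustration, not anything from Tim's slides) of the "system renders in software" idea: ordinary code fills a framebuffer that lives in plain memory, and the only job left for dedicated display hardware is to scan that buffer out to the monitor.

Code:
#include <cstdint>
#include <vector>

// A framebuffer is just memory: width * height 32-bit (0xAARRGGBB) pixels.
struct Framebuffer {
    int width, height;
    std::vector<uint32_t> pixels;
    Framebuffer(int w, int h) : width(w), height(h), pixels(w * h, 0) {}
};

// "Rendering" done entirely by general-purpose code: here just a gradient,
// but it could equally be a full rasterizer or ray tracer.
void render(Framebuffer& fb) {
    for (int y = 0; y < fb.height; ++y) {
        for (int x = 0; x < fb.width; ++x) {
            uint32_t r = 255u * x / fb.width;   // red ramps left to right
            uint32_t g = 255u * y / fb.height;  // green ramps top to bottom
            fb.pixels[y * fb.width + x] = 0xFF000000u | (g << 8) | r;
        }
    }
}

int main() {
    Framebuffer fb(640, 480);
    render(fb);
    // In the scenario described above, this buffer would simply be handed to
    // the display hardware (a "dumb framebuffer"), e.g. via a DMA transfer.
    return 0;
}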
 

Leak
New Member · Joined Sep 1, 2008 · 105 posts · Linz, Austria
No he's not. One of the things Tim is saying in his presentation is that the GPU as we currently know it (communication through APIs like Glide/OpenGL/DirectX) will go away. Instead, you will get a more general-purpose processor on which you execute code. This is already happening right now on GeForce 8x and later through CUDA, and similarly on AMD/ATI. Combine this with the "normal" CPU and you get the platform Tim is talking about.

So instead of plugging a CPU into the mainboard and adding a graphics card, you'll be plugging a GPU into the mainboard and adding a CPU card? :D

(All this "GPU getting more general purpose" makes me wonder when they'll remove the video connector(s) (and the frame buffer(s)) from the graphics card and move them onto the mainboard - with the last calculation pass of the "chip formerly known as GPU" just being a DMA transfer of the rendered, errr, calculated frame to the 2D frame buffer of the retro 2D graphics hardware on the mobo...)

EDIT: Augh. I really should have read the thread to its end... *slaps himself with a trout*
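As an aside on the CUDA remark quoted above: "executing code on the GPU" without touching any graphics API looks roughly like the sketch below. Nothing here is from the presentation; the kernel just does an arbitrary numeric job (SAXPY) on the card's stream processors, which is the general-purpose style of programming being described.

Code:
#include <cstdio>
#include <cuda_runtime.h>

// General-purpose code running on the GPU's stream processors:
// no Glide/OpenGL/DirectX, just a kernel launched over a plain array.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host data
    float* hx = new float[n];
    float* hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device data
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch: one thread per element, 256 threads per block
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);   // expect 5.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}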
 

elmuerte
Master of Science · Joined Jan 25, 2000 · 1,936 posts · the Netherlands · elmuerte.com
No... not a GPU, but a stream processor cluster (or something). It's not a graphics chip because it doesn't have any connectors, and it doesn't know anything about framebuffers.
The software will eventually just push the rendered image to the framebuffer of the graphics card/chip. So... it's just a dumb framebuffer just like graphics cards were 15 years ago.
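To picture that last step: once the image has been rendered in software, "presenting" it really is just a copy into whatever memory the display hardware scans out. A minimal sketch, assuming a Linux-style /dev/fb0 device whose resolution, stride and 32-bit pixel format happen to match the rendered buffer (real code would query those through the fbdev ioctls instead of assuming them).

Code:
#include <cstdint>
#include <cstring>
#include <vector>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

// The "dumb framebuffer" step: the software-rendered image is simply copied
// into the memory the display hardware scans out.
int present(const uint32_t* rendered, int width, int height) {
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0) return -1;

    size_t bytes = static_cast<size_t>(width) * height * sizeof(uint32_t);
    void* scanout = mmap(nullptr, bytes, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (scanout == MAP_FAILED) { close(fd); return -1; }

    std::memcpy(scanout, rendered, bytes);  // the whole "graphics card" job: show this memory

    munmap(scanout, bytes);
    close(fd);
    return 0;
}

int main() {
    const int width = 640, height = 480;
    // A solid colour standing in for a frame "rendered" in software.
    std::vector<uint32_t> frame(width * height, 0xFF2040A0u);
    return present(frame.data(), width, height);
}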
 

Zur
surrealistic mad cow · Joined Jul 8, 2002 · 11,708 posts
In 2020, scientists will realize they made a big mistake by reducing the methane produced by cows.
 

MonsOlympus
Active Member · Joined May 27, 2004 · 2,225 posts
Don't software and hardware go hand in hand? Oh well, I'm sure in the age of quantum computing we won't need anything but light and atoms to render a picture, oh wait... it's called life hehe

Honestly though, with all the trickery that goes on behind the scenes in games/engines these days, I wouldn't mind more thought put into accuracy instead of pushing some kind of invisible "purty" bar developers put on themselves.

I'm predicting that in 2020 games and engines will still be just as buggy as they are now, if not more so! All I know is the strain on artists to make use of the engines is higher than it's ever been. Sure, there's plenty of people out there working on gameplay, but to make use of all this render power we'll have in 2020, the tools and pipelines are where the most time will be saved.

If you can scan someone in 3D for cheap in a matter of minutes, and get them into an engine and animated in a couple of hours, that's worth more than any renderer on the planet!!
 

MonsOlympus
Active Member · Joined May 27, 2004 · 2,225 posts
I know a couple of companies that actually already do that for things like heads and faces :lol:

Scratch one bogey! Yeah, perhaps, but there's a lot to be said for creativity and art. Photography spawned things like the Futurists; we'll see what this motion capture does for 4D art, shall we? Hmmz... not a lot so far *goto slide, duck, cover generic anim mk400

Well, at least it isn't another MP40 mesh!

It's funny though, jumping to interfaces for a sec (and not the code type): I find it strange how some people prefer the methods that require more effort to achieve something, like a Wiimote or a touch screen. A flick of a scroll mouse does seem easier to me, call me crazy though!

Saw an ad on TV today for a voice-activated CD player in a car. They were like "change track", the computer responds "which track would you like to... blah... change to", the lady is about to respond and a dog barks, so "4" the computer replies. Anyway, while all that is going on I'm 30 seconds into track 4 because I clicked a couple of times; mind you, I probably rear-ended a car in the process, but that's kinda not the point :p

I would've been more impressed if you knew a company doing the first thing I mentioned, seriously... Teleporters, mass drivers... and when's Tim gonna convert a Tesla Roadster into a flying DeLorean with 1.21 gigawatts of power? At this rate we'll need that much just for the quadtrodblogsli.

Shared RAM for GPUs is where it's at if you ask me; 1 GB per board is a waste. Hell, even some of the new NVIDIA ones don't do it with the new design, do they? It's what, 864 MB or something per board. CPUs/multicores share RAM, and it's making a mess of the mobo as it is with all those SATA and RAM slots. EEEExATX, and don't forget the one-tonne heatsink to go over the top.

Here's one for ya though: what happens when CPU manufacturing processes get so small the current doesn't have room to pass through? Will we have no choice but to go bigger with more power draw?

There's always another sound barrier to break; speed of light, here we come. Then again, I'd hate to be the test pilot :lol:
 

dinwitty
DeRegistered User · Joined Nov 10, 1999 · 860 posts · midwest, USA · www.qtm.net
The GPU is a kind of parallel processor. He predicts you're just moving everything down to the CPU. The problem is that when you can improve the CPU, you can improve the GPU at the same time thanks to the same technical advances, so I still don't see the GPU disappearing. You may see a change in how it's implemented, though. Future notebooks could run today's high-end stuff with ease.
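For what it's worth, the "GPU is a kind of parallel processor" point is easy to picture on the CPU side as well: the same data-parallel work can be carved up across however many cores are available. A minimal sketch (my own, not dinwitty's) using standard threads to shade rows of a framebuffer in parallel:

Code:
#include <algorithm>
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

// Shade a horizontal band [y0, y1) of the framebuffer. Each worker thread
// handles its own band, so no synchronization is needed between them.
void shade_rows(std::vector<uint32_t>& fb, int width, int y0, int y1) {
    for (int y = y0; y < y1; ++y)
        for (int x = 0; x < width; ++x)
            fb[y * width + x] = 0xFF000000u | static_cast<uint32_t>((x ^ y) & 0xFF);
}

int main() {
    const int width = 640, height = 480;
    std::vector<uint32_t> fb(width * height);

    int cores = static_cast<int>(std::thread::hardware_concurrency());
    if (cores <= 0) cores = 1;                     // hardware_concurrency() may report 0
    int rows_per_core = (height + cores - 1) / cores;

    std::vector<std::thread> workers;
    for (int c = 0; c < cores; ++c) {
        int y0 = c * rows_per_core;
        int y1 = std::min(height, y0 + rows_per_core);
        if (y0 >= y1) break;
        workers.emplace_back(shade_rows, std::ref(fb), width, y0, y1);
    }
    for (auto& t : workers) t.join();
    return 0;
}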