Well I didn't see the presentation, but if this is true then they need to make the article clearer. I didn't read anything about a new type of processor, just "software implementation, backed by massive computing power." So I just assumed he was talking about CPUs rendering everything in software.
I didn't see the presentation either, I just read the slides that were linked from the article.
It's still a HELLUVA long time but why do I feel like I wasted $500 on a GPU?
You feel like that because you spent $500 on a graphics card (not just a GPU). Buying a $250 graphics card is more than enough.
Or am I just misunderstanding the fact that GPUs will eventually be considered part of a computer system's general-purpose processing and not used just for graphics/physics calculations? (Meaning, they won't be obsolete - instead, integrated.)
I feel like the title "End of the GPU by 2020" invites a misconception. Does it mean GPUs become useless, or that GPUs simply turn into something more than a "video card"? (Seeing the cGPU architecture nVidia wants to introduce in the GeForce 300 series.)
Graphics cards won't cease to exist, but the specialized 3D-rendering GPU will. You still need something that produces a VGA/DVI/HDMI/... output so that stuff shows up on your monitor. But a lot of the rendering, which is currently more or less offloaded to the GPU through an API and its rendering pipeline, will be performed by the "system" - where "system" means the combination of what we currently call the CPU and parts of the current GPU (the parts that are used for GPGPU; see Wikipedia). What this system will eventually look like is unknown. It might be a slot you insert on your motherboard, like you did in the past with the Pentium II, Pentium III and first-generation Athlons. Or it might be multiple processors you place on your motherboard, like you did even further in the past with the separate floating-point coprocessor.
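To make the "rendering performed by the system" idea concrete, here's a minimal sketch of software rendering: the CPU itself decides the color of every pixel, instead of handing triangles to a GPU through an API like Direct3D or OpenGL. This isn't anyone's proposed architecture, just a toy rasterizer using the classic edge-function test so you can see what "rendering in software" means at the lowest level:

```python
def edge(ax, ay, bx, by, px, py):
    # Signed-area test: positive when point (px, py) lies on the
    # left side of the directed edge a -> b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, width, height):
    # tri: three (x, y) vertices in counter-clockwise order.
    (x0, y0), (x1, y1), (x2, y2) = tri
    fb = [[0] * width for _ in range(height)]  # framebuffer, 0 = background
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            w0 = edge(x1, y1, x2, y2, px, py)
            w1 = edge(x2, y2, x0, y0, px, py)
            w2 = edge(x0, y0, x1, y1, px, py)
            # The pixel is inside the triangle when it is on the
            # inner side of all three edges.
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                fb[y][x] = 1
    return fb

# Fill a right triangle covering half of an 8x8 framebuffer.
fb = rasterize([(0, 0), (8, 0), (0, 8)], 8, 8)
covered = sum(map(sum, fb))  # number of lit pixels
```

A real GPU does essentially this inner loop in massively parallel fixed-function hardware; the presentation's claim is that general-purpose cores will be fast and numerous enough to take the loop back.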