
Tim Sweeney: "End of the GPU by 2020"

Discussion in 'News & Articles' started by Dark Pulse, Aug 12, 2009.

  1. elmuerte

    elmuerte Master of Science

    Joined:
    Jan 25, 2000
    Messages:
    1,936
    Likes Received:
    0
    I didn't see the presentation either; I just read the slides that were linked from the article.

    It feels like that because you spend $500 on a graphics card (not just a GPU). A $250 graphics card is more than enough.

    Graphics cards won't cease to exist, but the specialized 3D-rendering GPU will. You still need something that produces a VGA/DVI/HDMI/... output so that stuff shows up on your monitor. But a lot of the rendering, which is currently more or less offloaded to the GPU through an API and rendering pipeline, will be performed by the "system", where "system" means the combination of what we currently call the CPU and the parts of the current GPU that are used for GPGPU (see Wikipedia). What this system will eventually look like is unknown. It might be a module you insert into a slot on your motherboard, like you did in the past with the Pentium 2, 3 and first-generation Athlons. Or it might be multiple processors you place on your motherboard, like you did even further back with the separate floating-point coprocessor.
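
    (For illustration only: a rough, hypothetical sketch in C++ of what "rendering performed by the system" could look like, per-pixel work written as ordinary code instead of going through a 3D API. The shade() function, Color layout, and parameters are all made up for the example, not taken from anything Sweeney or the slides describe.)

        #include <cstddef>
        #include <cstdint>
        #include <vector>

        struct Color { std::uint8_t r, g, b, a; };

        // A trivial per-pixel "shader", written as ordinary C++ instead of being
        // fed through a 3D API and its fixed rendering pipeline.
        Color shade(int x, int y, int width, int height) {
            auto u = static_cast<std::uint8_t>(255 * x / width);
            auto v = static_cast<std::uint8_t>(255 * y / height);
            return Color{u, v, 128, 255};
        }

        // The "system" (CPU cores plus GPGPU-style stream processors) would run
        // this as normal software and hand the finished image to whatever scans
        // it out to the monitor.
        std::vector<Color> render(int width, int height) {
            std::vector<Color> image(static_cast<std::size_t>(width) * height);
            for (int y = 0; y < height; ++y)
                for (int x = 0; x < width; ++x)
                    image[static_cast<std::size_t>(y) * width + x] = shade(x, y, width, height);
            return image;
        }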
     
  2. Leak

    Leak New Member

    Joined:
    Sep 1, 2008
    Messages:
    105
    Likes Received:
    0
    So instead of plugging a CPU into the mainboard and adding a graphics card, you'll be plugging a GPU into the mainboard and adding a CPU card? :D

    (All this "GPU getting more general purpose" makes me wonder when they'll remove the video connector(s) (and the frame buffer(s)) from the graphics card and move it onto the mainboard - with the last calculation pass of the "chip formerly known as GPU" just being a DMA transfer of the rendered, errr, calculated frame to the 2D frame buffer of the retro-2D graphics hardware on the mobo...)

    EDIT: Augh. I really should have read the thread to its end... *slaps himself with a trout*
     
  3. elmuerte

    elmuerte Master of Science

    Joined:
    Jan 25, 2000
    Messages:
    1,936
    Likes Received:
    0
    No... not a GPU, but a stream processor cluster (or something). It's not a graphics chip because it doesn't have any connectors, and it doesn't know anything about framebuffers.
    The software will eventually just push the rendered image to the framebuffer of the graphics card/chip. So it's just a dumb framebuffer, like graphics cards were 15 years ago.
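
    (A minimal sketch of that hand-off, assuming Linux's /dev/fb0 and a fixed 32-bit pixel format with no row padding - both assumptions of the example, not anything stated in the thread. The renderer fills ordinary memory, and the last step is just a copy into the display's scan-out buffer; a real program would query the actual video mode from the driver instead of hard-coding it.)

        #include <fcntl.h>
        #include <sys/mman.h>
        #include <unistd.h>
        #include <cstddef>
        #include <cstdint>
        #include <cstring>
        #include <vector>

        int main() {
            // Assumed, not queried: 1024x768 at 32 bits per pixel, no padding.
            const int width = 1024, height = 768;
            std::vector<std::uint32_t> image(std::size_t(width) * height, 0xFF336699u); // CPU-rendered pixels

            int fb = open("/dev/fb0", O_RDWR);          // the "dumb framebuffer" device
            if (fb < 0) return 1;

            std::size_t bytes = image.size() * sizeof(std::uint32_t);
            void* scanout = mmap(nullptr, bytes, PROT_READ | PROT_WRITE, MAP_SHARED, fb, 0);
            if (scanout == MAP_FAILED) { close(fb); return 1; }

            std::memcpy(scanout, image.data(), bytes);  // the entire "graphics card" step: a copy

            munmap(scanout, bytes);
            close(fb);
            return 0;
        }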
     
    Last edited: Aug 13, 2009
  4. Bersy

    Bersy New Member

    Joined:
    Apr 7, 2008
    Messages:
    910
    Likes Received:
    0
    He made this prediction like a year ago.
     
  5. BITE_ME

    BITE_ME Bye-Bye

    Joined:
    Jun 9, 2004
    Messages:
    3,566
    Likes Received:
    0
    In 2020 we will run out of oil.

    How are we going to power these monster CPUs?
     
  6. Interbellum

    Interbellum I used to be a man

    Joined:
    May 17, 2008
    Messages:
    719
    Likes Received:
    0
    With people (and a form of fusion).
     
  7. elmuerte

    elmuerte Master of Science

    Joined:
    Jan 25, 2000
    Messages:
    1,936
    Likes Received:
    0
    Power will be generated from the heat of internet flamewars.
     
  8. Zur

    Zur surrealistic mad cow

    Joined:
    Jul 8, 2002
    Messages:
    11,717
    Likes Received:
    4
    In 2020, scientists will realize they made a big mistake by reducing the methane produced by cows.
     
  9. Plumb_Drumb

    Plumb_Drumb yumb

    Joined:
    Mar 19, 2002
    Messages:
    8,623
    Likes Received:
    0
    cool.

    " praise hell SATAN"
     
    Last edited: Aug 28, 2011
  10. ambershee

    ambershee Nimbusfish Rawks

    Joined:
    Apr 18, 2006
    Messages:
    4,521
    Likes Received:
    7
    I'm sure AMD made this claim about five or so years ago. Either way, I'd much prefer upgrading my GPU every so often, rather than effectively my entire machine :|
     
  11. MonsOlympus

    MonsOlympus Active Member

    Joined:
    May 27, 2004
    Messages:
    2,224
    Likes Received:
    0
    Don't software and hardware go hand in hand? Oh well, I'm sure in the age of quantum computing we won't need anything but light and atoms to render a picture... oh wait, it's called life, hehe.

    Honestly though, with all the trickery that goes on behind the scenes in games/engines these days, I wouldn't mind more thought put into accuracy instead of pushing some kind of invisible prettiness bar developers set for themselves.

    I'm predicting that in 2020 games and engines will still be just as buggy as they are now, if not more so! All I know is that the strain on artists to make use of the engines is higher than it's ever been. Sure, there are plenty of people out there working on gameplay, but to make use of all this render power we'll have in 2020, the tools and pipelines are where the most time will be saved.

    If you can scan someone in 3D cheaply in a matter of minutes, and get them into an engine and animated in a couple of hours, that's worth more than any renderer on the planet!!
     
    Last edited: Aug 23, 2009
  12. ambershee

    ambershee Nimbusfish Rawks

    Joined:
    Apr 18, 2006
    Messages:
    4,521
    Likes Received:
    7
    I know a couple of companies that actually already do that for things like heads and faces :lol:
     
  13. tomcat ha

    tomcat ha Well-Known Member

    Joined:
    Feb 2, 2002
    Messages:
    2,750
    Likes Received:
    52
    Here he goes again. All this leads to is more features not getting implemented, like in the past.
     
  14. KeithZG

    KeithZG will forever be nostalgic

    Joined:
    Oct 14, 2003
    Messages:
    118
    Likes Received:
    0
    Coincidentally, 2020 is also the year that the Linux and OSX ports of UT3 will come out ;)
     
  15. MonsOlympus

    MonsOlympus Active Member

    Joined:
    May 27, 2004
    Messages:
    2,224
    Likes Received:
    0
    Scratch one bogey! Yeah, perhaps, but there's a lot to be said for creativity and art. Photography spawned things like the Futurists; we'll see what this motion capture does for 4D art, shall we? Hmm... not a lot so far *goto slide, duck, cover generic anim mk400*

    Well, at least it isn't another MP40 mesh!

    It's funny though, jumping to interfaces for a sec (and not the code type): I find it strange how some people prefer the methods that require more effort to achieve something, like a Wiimote or a touch screen. A flick of a scroll mouse seems easier to me, but call me crazy!

    Saw an ad on TV today for a voice-activated CD player in a car. They were like "change track", the computer responds "which track would you like to... blah... change to", the lady is about to answer and a dog barks, so "4" the computer replies. Anyway, while all that is going on I'm 30 seconds into track 4 because I clicked a couple of times. Mind you, I probably rear-ended a car in the process, but that's kind of not the point :p

    I would have been more impressed if you knew a company doing the first thing I mentioned, seriously... teleporters, mass drivers... and when's Tim gonna convert a Tesla Roadster into a flying DeLorean with 1.21 gigawatts of power? At this rate we'll need that much just for the quadtrodblogsli.

    Shared RAM for GPUs is where it's at if you ask me; 1 GB per board is a waste. Hell, even some of the new NVIDIA ones don't do it with the new design, do they? It's what, 864 or something per board. CPUs/multicores share RAM, and it's already making a mess of the mobo with all those SATA and RAM slots. EEEExATX, and don't forget the one-tonne chunk of heatsink to go over the top.

    Here's one for you though: what happens when the processes in CPU manufacturing get so small that the current doesn't have room to pass through? Will we have no choice but to go bigger, with more power draw?

    There's always another sound barrier to break; speed of light, here we come. Then again, I'd hate to be the test pilot :lol:
     
    Last edited: Aug 29, 2009
  16. dinwitty

    dinwitty DeRegistered User

    Joined:
    Nov 10, 1999
    Messages:
    851
    Likes Received:
    0
    The GPU is a kind of parallel processor. He predicts you're just moving everything down to the CPU. The problem is that when you can improve the CPU, you can improve the GPU at the same time thanks to the same technical advances, so I still don't see the GPU disappearing. You may see a change in how it's implemented, though. Future notebooks could run today's high-end stuff with ease.
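
    (A rough sketch of that idea: the per-pixel parallelism a GPU exploits, expressed as plain CPU threads in C++. The shade() function and the resolution are invented for the example; it just shows the data-parallel pattern, not anyone's actual renderer.)

        #include <cstddef>
        #include <cstdint>
        #include <thread>
        #include <vector>

        // Stand-in for real per-pixel shading work; made up for the example.
        static std::uint32_t shade(int x, int y) {
            return (std::uint32_t(x & 0xFF) << 16) | (std::uint32_t(y & 0xFF) << 8) | 0xFFu;
        }

        int main() {
            const int width = 1920, height = 1080;
            std::vector<std::uint32_t> framebuffer(std::size_t(width) * height);

            unsigned workers = std::thread::hardware_concurrency();
            if (workers == 0) workers = 1;

            // Each thread shades an interleaved slice of scanlines - the same
            // kind of data-parallel split a GPU spreads across its stream processors.
            std::vector<std::thread> pool;
            for (unsigned t = 0; t < workers; ++t) {
                pool.emplace_back([&, t] {
                    for (int y = int(t); y < height; y += int(workers))
                        for (int x = 0; x < width; ++x)
                            framebuffer[std::size_t(y) * width + x] = shade(x, y);
                });
            }
            for (auto& th : pool) th.join();
            return 0;
        }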
     
