Tim Sweeney Excited About Intel's Larrabee


hal

Dictator
Staff member
Nov 24, 1998
21,409
19
38
55
------->
www.beyondunreal.com
Gamespot has a really short Q&A with Epic Games' Tim Sweeney about Intel's Larrabee Processor. I think Tim likes it.

The most exciting opportunity for Larrabee is the possibility of Intel moving it down to the mid-range and low-end over time. If Larrabee eventually displaces Intel Integrated Graphics, that would bring compelling graphics to the masses. Intel could become a real force for good in the graphics market, which -- to be blunt -- hasn't been the case in recent years.
 

GG-Xtreme

You are a pirate!
Mar 12, 2008
332
0
0
I couldn't care less about a 'Larrabee' GPU, but it would be nice if Intel stopped making their Integrated Graphics **** the mainstream.
 

Anuban

Your reward is that you are still alive
Apr 4, 2005
1,094
0
0
What is so funny about this is that Carmack actually does NOT like Larrabee ... and we all know Carmack is saying that id's new engine is setting a new bar, and he also said that Doom 4 will be beyond anything anyone has ever seen. Man, I love it ... these two dudes, Sweeney and Carmack, are arguably the most brilliant game engine developers around, and seeing them go head to head once id Tech 5 comes out is going to be something. Crytek is going to get pushed back and so is Valve ... it's all about Epic vs id ... just like the good old days. Although to be fair there are some pretty amazing engines being developed for games these days ... the Evolution engine used for Dark Sector is pretty amazing, and Too Human also has a good solid engine. But in my mind I really think the battle for the future of graphics is going to come down to Sweeney vs Carmack ... it just seems destined.
 

MonsOlympus

Active Member
May 27, 2004
2,225
0
36
43
Well, I think Intel is in a pretty good position to take on the graphics market myself, since they already do chipsets, which is where Nvidia has been heading in recent years with their SLI support. Intel actually makes boards which support both SLI and CrossFire, but the majority of their chipsets support only CrossFire, and the third parties tend to stick with that formula, which could be hurting Intel some.

Of course you can still run Intel CPUs in Nvidia boards, but you can also run AMD, so the market is really in a bit of a spot. For some reason Nvidia thinks they can be competitive in the mobo market, when in my experience the features, power usage etc. are only starting to be worth the price now. See, ATI really have something going, since both AMD and Intel seem to be supporting them almost fully for whatever reason; perhaps it's because of Nvidia's move into the mobo market.

Intel also has the bonus of being in the Apple Mac market, so they could move to take a position there for more blue-chip-like offerings. If Intel does make a move for the console market, IBM could be in for a little competition; either way it'll be interesting to see who goes with what for the next wave of consoles. For PC we kinda know what to expect; the main problem is the incompatibilities, though. For SLI etc. to really take off it's got to become a standard, so people have more choice :cool:
 

MonsOlympus

Active Member
May 27, 2004
2,225
0
36
43
Yeah, but Epic usually doesn't have good things to say when it comes to Intel and graphics :p
 

ZeelessOne

Guest
Sweeney has been talking about reintegrating the CPU and GPU for many years. I'm sure he's wetting himself over getting out of the boundaries of shaders and opening up to programming his graphics in standard C without much fixed functionality. But he only says he's interested so that mainstream computers won't suck.
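
To illustrate what "graphics in standard C" even means in practice, here's a rough sketch (totally hypothetical, not from anyone's actual engine): with no fixed-function stages in the way, the whole "pipeline" is just ordinary code you can rewrite at will.

Code:
// Hypothetical sketch: per-pixel shading as plain C-style code,
// no fixed-function hardware stages -- change the math however you like.
#include <cmath>
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Shade one pixel with basic Lambertian lighting; on a fully programmable
// chip this function could be replaced by anything at all.
static uint32_t shadePixel(Vec3 normal, Vec3 lightDir) {
    float ndotl = std::fmax(0.0f, dot(normal, lightDir));
    uint8_t c = static_cast<uint8_t>(ndotl * 255.0f);
    return (0xFFu << 24) | (c << 16) | (c << 8) | c;  // opaque ARGB grey
}

int main() {
    const int width = 4, height = 4;               // tiny framebuffer for the sketch
    std::vector<uint32_t> framebuffer(width * height);
    Vec3 n{0.0f, 0.0f, 1.0f}, l{0.0f, 0.0f, 1.0f};  // surface facing the light
    for (uint32_t& px : framebuffer)               // the whole "pipeline" is a loop
        px = shadePixel(n, l);
}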

I'd think Carmack would have the same desires, though. He's one of the founding fathers of gaming graphics as well. Why wouldn't he like a fully-programmable GPU?
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,021
86
48
It could work out well, if its graphical performance is up to the same standards as NVIDIA/ATI and it is used to replace Intel GMAs.
What this guy said.

Who here really thinks it will be better than the current integrated graphics line? I'm not betting on it, but let's hope Tim knows something we don't.
 

gregori

BUF Refugee
May 5, 2005
1,411
0
0
38
Baile Atha Cliath, Eireann
They haven't said how well it will stack up against dedicated graphics cards, so no one knows. Except maybe Tim :D

The good thing about this, as opposed to the Integrated Graphics, is that it may allow people to upgrade, since it's all one chip/board that can be slotted in. With integrated graphics, people practically needed to buy a whole new computer.

The guys who designed the GMAs aren't behind this one; the lab that created the Pentium 4 is, so perhaps it will turn out better.

Here is a better article about it: http://arstechnica.com/news.ars/pos...biggest-leap-ahead-since-the-pentium-pro.html
 

SPIDEYUT2K7

I see a blue screen
Feb 22, 2008
162
0
0
47
City of the Polution
it's all about Epic vs id ... just like the good old days. But in my mind I really think the battle for the future of graphics is going to come down to Sweeney vs Carmack ... it just seems destined.

Agree with you, m8! We want to see this like the good old days! Quake vs Unreal! Which is the best game? Who's got the best graphics engine and everything? :)

Also, like everyone said before, we hope that in the future, whatever it is, Intel makes a good graphics option for basic PCs in the mainstream market, not only for office and home tasks. The crowd wants better performance for less $$$, but maybe I'm just dreaming about that :(
 

[SAS]Solid Snake

New Member
Jun 7, 2002
2,633
0
0
41
New Zealand
www.digitalconfectioners.com
Well, first of all, let's make it clear that Carmack isn't anti-Larrabee, much like he isn't anti-DirectX. He's just saying that what Intel is touting at this point is a little pointless until they've got some proven data on it. Right now, Intel doesn't really have anything to show with the Larrabee chipset. It's like all of those nVidia/ATi-killer GPUs that never quite made it to being the killer GPU [Matrox, Bitboys]. During production they tout the features and how it'll be the best-performing GPU out there. The card comes out, does less than stellar across the board, and winds up being forgotten. So before we all claim that Larrabee is going to be the most awesome GPU out there, let's actually test the thing first.

Sweeney has been talking about reintegrating the CPU and GPU for many years. I'm sure he's wetting himself over getting out of the boundaries of shaders and opening up to programming his graphics in standard C without much fixed functionality. But he only says he's interested so that mainstream computers won't suck.

I'd think Carmack would have the same desires, though. He's one of the founding fathers of gaming graphics as well. Why wouldn't he like a fully programmable GPU?
Well, the thing is, the techniques we have now already offer fully programmable GPUs. You can modify the texel output however you like, so programmable GPUs are here. The point of a GPU is to accelerate, via brute force [silicon], things that are meaningless work to a 3D engine. For example, vertex projection transformation (the act of displacing a vertex so it appears correctly on your 2D screen) is so basic to any 3D engine that it might as well be hardware accelerated; that is, why lose FPS calculating something that is going to be done a billion times? The GPU is a refined piece of silicon that's designed for programmers to simply upload 2D/3D data to, and then it does the boring work for you, so that you can start working on the interesting stuff.
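
To make that concrete, here's roughly what that "boring work" looks like in software (a hypothetical sketch; a real engine would build a proper perspective matrix from field of view, aspect ratio and clip planes). This one transform runs for every vertex, every frame, which is exactly why it's worth baking into silicon.

Code:
// Hypothetical sketch of the per-vertex "boring work": multiply a vertex by a
// 4x4 projection matrix, then do the perspective divide to get screen space.
#include <cstdio>

struct Vec4 { float x, y, z, w; };

// Column-major 4x4 matrix times vector: one vertex's projection transform.
static Vec4 transform(const float m[16], Vec4 v) {
    return {
        m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w,
        m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w,
        m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w,
        m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w,
    };
}

int main() {
    // Identity stand-in just to exercise the math; a real engine builds the
    // matrix from field of view, aspect ratio, and near/far planes.
    const float proj[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    Vec4 clip = transform(proj, {1.0f, 2.0f, -5.0f, 1.0f});
    // Perspective divide: clip space -> normalized device coordinates.
    std::printf("ndc: %f %f %f\n", clip.x/clip.w, clip.y/clip.w, clip.z/clip.w);
}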

What Larrabee is trying to offer is total control over the whole rendering process, which oddly enough is just software rendering. So, with that in mind, if Larrabee is really going to be a killer GPU, it'll need to emulate a 'normal' GPU faster than a GPU's dedicated silicon. It could work, perhaps. The other real application it has is engines that aren't designed around the normal 3D techniques, so engines that use ray tracing or volumetric displacement may have an opportunity to shine here. The problem is, a lot of your favorite games may see a decrease in performance.
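
For example, the core step of a ray tracer is just math like this (hypothetical sketch): a ray-sphere intersection test, plain branchy float code that suits general-purpose cores rather than a fixed rasterization pipeline.

Code:
// Hypothetical sketch: the core step of a ray tracer, a ray-sphere
// intersection test.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Distance along the ray to the nearest hit, or -1 on a miss
// (assumes dir is normalized).
static float raySphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = sub(origin, center);
    float b = dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - c;
    if (disc < 0.0f) return -1.0f;      // ray misses the sphere entirely
    return -b - std::sqrt(disc);        // nearest of the two intersections
}

int main() {
    float t = raySphere({0,0,0}, {0,0,-1}, {0,0,-5}, 1.0f);
    std::printf("hit at t = %f\n", t);  // sphere 5 units away, radius 1 -> 4.0
}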

So, even judging by what Sweeney has actually said, he's not pro or con Larrabee; he's just saying that there's an opportunity here, much like Carmack. I suppose Carmack is a little more hesitant because there isn't actually any proof of concept just yet, which is why he comes across slightly more negative about Larrabee than Sweeney does.
 

MonsOlympus

Active Member
May 27, 2004
2,225
0
36
43
One bonus I kinda see in Intel's approach is that it could speed up things between the CPU, system RAM and GPU. Currently Nvidia and ATI only offer GPUs, so they don't have a great deal to do with the other end of things, but that might change for ATI in the future since they are part of AMD now. Nvidia, on the other hand, relies on chipsets to provide features their cards can use, though some features don't seem to have a chipset requirement.

E.g., if a game is heavier on GPU it could bottleneck, whereas Larrabee might prevent that bottleneck by load balancing, and the load balancing can be adjusted by Intel to suit certain games. Well, that's what I'm getting from the article gregori posted, but maybe I'm wrong. The same could be said for a CPU-heavy game too; if it's heavy on both, of course you'll have to make some detail adjustments of sorts. It's an interesting approach if I've got it right. It could mean more power for your power usage, or even more of your system used for gaming instead of having bits sitting around doing nothing. I guess it's a move to yet another, more unified architecture, to allow more silicon to multitask.
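
Roughly the picture I got from the article, as a sketch (purely my guess, nothing like Intel's actual scheduler): identical cores pulling from one shared work queue, so whichever kind of work is heavier just soaks up more cores.

Code:
// Purely hypothetical sketch of the load-balancing idea: identical cores pull
// tasks from one shared queue, so whichever workload is heavier in a given
// frame automatically gets more of the chip.
#include <atomic>
#include <thread>
#include <vector>

int main() {
    const int numTasks = 1000;            // stand-ins for mixed CPU/GPU-style jobs
    std::atomic<int> next{0};

    unsigned numCores = std::thread::hardware_concurrency();
    if (numCores == 0) numCores = 4;      // fallback when the count is unknown

    std::vector<std::thread> cores;
    for (unsigned i = 0; i < numCores; ++i) {
        cores.emplace_back([&] {
            // Each "core" grabs whatever task is next, regardless of type,
            // so no silicon idles while the other kind of work piles up.
            for (int t; (t = next.fetch_add(1)) < numTasks; ) {
                volatile int work = t * t; // placeholder for real work
                (void)work;
            }
        });
    }
    for (std::thread& c : cores) c.join();
}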

Hey, I'm no engineer though; it sounded good to me in the article either way :lol:
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,021
86
48
Signs of failure:
"As [blogger and CPU architect] Peter Glaskowsky said, the 'large' Larrabee in 2010 will have roughly the same performance as a 2006 GPU from Nvidia or ATI."
 

Sir_Brizz

Administrator
Staff member
Feb 3, 2000
26,021
86
48
Today's Intel graphics are about on par with 2001 graphics cards for the most part. Moving that up to 2006 is only going to fix the problem for so long (maybe a year).