Well, first of all, let's be clear that Carmack isn't anti-Larrabee, much like he isn't anti-DirectX. He's [Carmack] just saying that what Intel is touting at this point is a little pointless until they've got some proven data behind it. Right now, Intel doesn't really have anything to show for Larrabee. It's like all of those would-be nVidia/ATi killer GPUs that never quite made it [Matrox, Bitboys]. During development the vendor touts the features and claims it'll be the best-performing GPU out there; then the card comes out, performs less than stellar across the board, and winds up being forgotten. So before we all claim that Larrabee is going to be the most awesome GPU out there, let's actually test the thing first.
Sweeney has been talking about reintegrating the CPU and GPU for many years. I'm sure he's wetting himself over getting out of the boundaries of shaders and opening up to programming his graphics in standard C without much fixed functionality. But his stated reason for being interested is simply that mainstream computers won't suck.
I'd think Carmack would have the same desires, though. He's one of the founding fathers of gaming graphics as well. Why wouldn't he like a fully-programmable GPU?
Well, the thing is that the hardware we have now already offers fully programmable GPUs. You can modify the texel output however you like, so in that sense programmable GPUs are already here. The point of a GPU is to accelerate, via brute-force silicon, the work that is routine busywork to a 3D engine. For example, vertex projection transformation (the act of transforming a vertex so it appears correctly on your 2D screen) is so basic to any 3D engine that it might as well be hardware accelerated; why lose FPS calculating something that is going to be done a billion times? The GPU is a refined piece of silicon designed so programmers can simply upload 2D/3D data to it and let it do the boring work, freeing them to get on with the interesting stuff.
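To make that concrete, here's a minimal sketch (plain C++, my own made-up Vec4/Mat4 types, OpenGL-style projection conventions assumed) of the per-vertex projection math I'm talking about. It's exactly the kind of repetitive arithmetic you want baked into silicon rather than eating CPU cycles, because it's done for every vertex, every frame:

    // Sketch of the per-vertex projection work a GPU grinds through in bulk.
    #include <cstdio>
    #include <cmath>

    struct Vec4 { float x, y, z, w; };
    struct Mat4 { float m[4][4]; };   // row-major: m[row][col]

    // v' = M * v
    Vec4 transform(const Mat4& M, const Vec4& v) {
        Vec4 r;
        r.x = M.m[0][0]*v.x + M.m[0][1]*v.y + M.m[0][2]*v.z + M.m[0][3]*v.w;
        r.y = M.m[1][0]*v.x + M.m[1][1]*v.y + M.m[1][2]*v.z + M.m[1][3]*v.w;
        r.z = M.m[2][0]*v.x + M.m[2][1]*v.y + M.m[2][2]*v.z + M.m[2][3]*v.w;
        r.w = M.m[3][0]*v.x + M.m[3][1]*v.y + M.m[3][2]*v.z + M.m[3][3]*v.w;
        return r;
    }

    // Basic right-handed perspective projection (OpenGL-style clip space).
    Mat4 perspective(float fovYRadians, float aspect, float zNear, float zFar) {
        float f = 1.0f / std::tan(fovYRadians / 2.0f);
        Mat4 P = {};
        P.m[0][0] = f / aspect;
        P.m[1][1] = f;
        P.m[2][2] = (zFar + zNear) / (zNear - zFar);
        P.m[2][3] = (2.0f * zFar * zNear) / (zNear - zFar);
        P.m[3][2] = -1.0f;
        return P;
    }

    int main() {
        // One vertex in camera (eye) space, 5 units in front of the viewer.
        Vec4 eyeSpace = { 1.0f, 2.0f, -5.0f, 1.0f };

        Mat4 proj = perspective(60.0f * 3.14159265f / 180.0f, 16.0f / 9.0f, 0.1f, 100.0f);
        Vec4 clip = transform(proj, eyeSpace);

        // Perspective divide -> normalized device coordinates -> 1920x1080 screen.
        float ndcX = clip.x / clip.w, ndcY = clip.y / clip.w;
        float screenX = (ndcX * 0.5f + 0.5f) * 1920.0f;
        float screenY = (1.0f - (ndcY * 0.5f + 0.5f)) * 1080.0f;

        std::printf("screen position: (%.1f, %.1f)\n", screenX, screenY);
        return 0;
    }

Trivial on its own, but multiply it by every vertex in a scene at 60 FPS and you see why dedicated hardware for it exists.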
What Larrabee is trying to offer is total control over the whole rendering process, which, oddly enough, is just software rendering. So, with that in mind, if Larrabee is really going to be a killer GPU, it'll need to emulate a 'normal' GPU pipeline faster than dedicated GPU silicon can run it. That could perhaps work. The other real application it has is engines that aren't designed around the usual rasterization techniques, so engines that use ray tracing or volumetric displacement may have an opportunity to shine here (see the sketch below). The problem is that a lot of your favorite games may see a decrease in performance.
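For example, here's a toy ray-tracing sketch (plain single-threaded C++, a hypothetical hitSphere helper, no shading or acceleration structures, nothing like a real engine) just to illustrate the kind of per-pixel work that fights the triangle pipeline on today's GPUs but is ordinary code on a chip built around general-purpose cores:

    // One ray per "pixel": a loop that doesn't map neatly onto rasterization
    // hardware, but is trivial to write as plain code on a programmable chip.
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // True if a ray from 'origin' along 'dir' hits a sphere at 'center', radius 'r'.
    bool hitSphere(const Vec3& origin, const Vec3& dir, const Vec3& center, float r) {
        Vec3 oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
        float a = dot(dir, dir);
        float b = 2.0f * dot(oc, dir);
        float c = dot(oc, oc) - r * r;
        return b*b - 4.0f*a*c >= 0.0f;   // discriminant of the quadratic
    }

    int main() {
        const int width = 60, height = 30;
        Vec3 eye = { 0.0f, 0.0f, 0.0f };
        Vec3 sphereCenter = { 0.0f, 0.0f, -3.0f };

        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                float u = (x + 0.5f) / width  * 2.0f - 1.0f;
                float v = 1.0f - (y + 0.5f) / height * 2.0f;
                Vec3 dir = { u, v, -1.0f };
                std::putchar(hitSphere(eye, dir, sphereCenter, 1.0f) ? '#' : '.');
            }
            std::putchar('\n');
        }
        return 0;
    }

On Larrabee-style hardware that whole loop is supposedly just x86 code spread across many cores, which is the appeal for engines built around techniques like this.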
So, even judging by what Sweeney has actually said, he's not pro or con Larrabee; he's just saying that there's an opportunity here, much like Carmack. However, I suppose Carmack is a little more hesitant because there isn't actually any proof of concept just yet, which is why he comes across slightly more negative about Larrabee than Sweeney does.