Yeah, sucks how "specific setups" covers such a wide range of systems in this case, huh?
Actually I didn't encounter anyone who had that problem until you came around...
Is it possible that you found people with similar issues as you *because* you had the issue and looked for people with your problem?
Kind of a self-selected survey.
You enjoy trolling anyone who doesn't see the same things as you or doesn't experience the same issues as you, don't you?
In closing -- play Battlefield 2... you'll enjoy UnrealEngine 1/2/3's stability.
But yes -- I later found out the issue doesn't apply to Vista, only XP. So it might be more widespread than I experienced.
As for Crysis, it would probably benefit from having PhysX support as well, considering how much physics simulation is going on there. The devs opted against supporting a card with an install base of way less than 1%, though, and optimized CPU physics instead. Sensible decision in my opinion; maybe once the install base broadens, Crysis will get patched for PhysX support as well.
As for the "only series 8" thing, it's because only the newest series has the newest features. In this particular case, it's a special chipset feature (CUDA, apparently, whatever that means) that can seemingly duplicate what the PhysX chip does, particularly now that they have access to all the patents and know-how from Ageia.
No, you won't have access to the newest features with a less-than-newest card, ever. In fact, I'm mightily impressed by Nvidia porting PhysX to series 8, I'd have expected that only series 9 and onwards would benefit from the deal they made with Ageia.
CUDA is nVidia's C-based programming language for running general-purpose code on their GPUs.
Videocards used to be fixed-function hardware with dedicated units -- this chip did texture assignment, that chip did the N-dot-L lighting calculation... eventually people wanted the GPU to do more and more -- some materials in UnrealEngine3 alone end up being a few hundred shader instructions for a single pixel.
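That N-dot-L term is just a dot product between the surface normal and the light direction, clamped at zero. A minimal sketch in CUDA-style C (the function name is made up, not from any real driver or engine):

```cuda
// Lambertian (N-dot-L) diffuse term: cosine of the angle between
// the surface normal n and the light direction l, clamped at zero.
// Both vectors are assumed to already be normalized.
__device__ float lambert(float3 n, float3 l)
{
    float ndotl = n.x * l.x + n.y * l.y + n.z * l.z;
    return fmaxf(ndotl, 0.0f);
}
```

Old fixed-function chips had a hardwired circuit for exactly that; now it's just one instruction sequence among hundreds in a pixel shader.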
nVidia decided "Hey -- if we're going to generalize the GPU... why don't we make a C-esque programming language to communicate with GPUs for anything we want?"
And there came CUDA... sure it's mostly used for shader ops... but you can use it for Physics if you wanted to -- why not? It's generalized.
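To illustrate that generality, here's a hypothetical CUDA kernel (all names and numbers invented) that spends its parallelism on a basic physics step instead of pixels -- one thread per particle:

```cuda
// Semi-implicit Euler integration for n particles, one thread each.
__global__ void integrate(float3 *pos, float3 *vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    vel[i].y -= 9.81f * dt;     // apply gravity to velocity
    pos[i].x += vel[i].x * dt;  // advance position with the
    pos[i].y += vel[i].y * dt;  // updated velocity
    pos[i].z += vel[i].z * dt;
}
```

The host would launch it with something like `integrate<<<(n + 255) / 256, 256>>>(d_pos, d_vel, dt, n);` -- the same hardware that shades pixels happily crunches particles.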
I've even seen a videocard run a program to turn it -- basically -- into a soundcard. Fourier analysis and waveform blending, all while outputting the final waveform to the screen in a cool little oscilloscope thing.
Crytek used their own method for physics (I believe based off Havok 3 or 4 -- but I dunno for sure)... it was a design tradeoff they made. More control, but more development time. You don't *need* middleware to make every videogame... Insomniac made it a point NOT to license middleware, and Resistance and Ratchet & Clank Future: Tools of Destruction weren't failures by any stretch of the imagination. That being said -- look at all of UE3's satisfied customers.
It's all in what you need, what you can accomplish, and what you want to spend time doing.