Some graphics cards' drivers are designed and optimized for higher resolutions. Take the Radeon, for example: playing at 16-bit color @ 1024x768 can give you lower FPS than playing at 32-bit. That's because the drivers shut down certain features and optimizations at the lower color depth.
The same goes for most other GCs: their drivers only enable certain optimizations at higher resolutions. Once you drop below the default resolution, those features are either turned off or run at half speed.
So if you run UT at 1024x768 and get 60 fps, then drop to 640x480, it's quite possible you'll end up with the same FPS (or lower) and never see a gain. The driver writers don't expect, or understand, why someone would play most games at such low rez, so they don't spend the time optimizing the drivers for lower settings.
Some people don't realize they could play at a higher rez and probably get better FPS. I know I didn't. I was getting poor FPS at 800x600 @ 16-bit until I started reading the reviews and realized the optimal setting was 1024x768 @ 32-bit.
Test your GCs. Not every GC has optimized drivers, and most companies don't say anything about it (they feel it's a form of cheating during benchmark tests).
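If you want a quick way to run the same workload at each mode and compare the numbers, here's a minimal Python sketch using pygame (my assumption, you'd need pygame installed; it only pushes 2D fill, not a 3D scene, so treat it as an illustration of the method rather than a serious benchmark). In UT itself, just type timedemo 1 in the console to get a frame rate readout.

# Rough sketch: same workload at several video modes, compare the FPS.
# NOT a real 3D benchmark (2D fill only); it just shows the testing method.
import pygame

MODES = [((640, 480), 16), ((800, 600), 16), ((1024, 768), 32)]
TEST_SECONDS = 5

pygame.init()
for size, depth in MODES:
    # Request the fullscreen mode at the given resolution and color depth.
    screen = pygame.display.set_mode(size, pygame.FULLSCREEN, depth)
    frames = 0
    start = pygame.time.get_ticks()
    while pygame.time.get_ticks() - start < TEST_SECONDS * 1000:
        pygame.event.pump()  # keep the window/OS happy during the loop
        screen.fill((0, 0, 0))
        pygame.draw.circle(screen, (255, 0, 0),
                           (size[0] // 2, size[1] // 2), size[1] // 4)
        pygame.display.flip()
        frames += 1
    elapsed = (pygame.time.get_ticks() - start) / 1000.0
    print("%dx%d @ %d-bit: %.1f fps" % (size[0], size[1], depth,
                                        frames / elapsed))
pygame.quit()

If the low-rez numbers aren't clearly higher than the high-rez ones in the actual game, your drivers are probably doing exactly what I described above.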