I find that many people tend to underrate the importance of good, solid image quality, preferring to go for speed instead. This is sensible to a degree. After all, what's the good of a photorealistic experience if it is jerky and unrealistic in its movement? Correct! There is no point to that. However, most new-ish cards are far from slow and will not jerk around unless you run at an incredibly high resolution. If a game does not run at 30 fps on a Voodoo2 or higher card, then it is the program that is at fault, not the card. Once you reach 30 fps, speed is no longer really an issue. Yes, 60 fps does look subtly better, but higher colour depths and rich textures are infinitely preferable to such barely noticeable speed increases. As an example of what is possible in terms of 30 fps graphics on current systems, look at this picture.
This shot is from 3DMark 99, a 3D benchmarking program designed to exercise your PC as fully as possible and return a benchmark result. Look at the detail present in this animated scene: the flare, filtering, shading, transparency, translucency and texturing. Impressive, isn't it? Surprisingly, this wasn't rendered on a Silicon Graphics workstation. It was rendered on a standard PC with a 3D accelerator; a Voodoo2 can perform this feat quite adequately. Of course, the power is not currently there to render this scene AND perform all the physics and AI calculations as well, but that only requires a little more CPU power. It can't be long before our games start to look like this, and therefore what you want is as clear and colourful an image as possible.
At the moment, Matrox have the image quality crown with their G400 card. It supports 32-bit colour, hardware bump-mapping, AGP 4X and VCQ (Vibrant Colour Quality). Going down a peg, the G200 chip has the best image quality of the second-generation 3D cards. NVidia's TNT2 does not have quite as big a feature set, but it comes a close second to the G400 on image quality, and the TNT would probably come second to the G200 too. S3 would be next up with their Savage and Savage4, which have similar image quality to the NVidia cards. The best thing about these S3 cards is that they alone support texture compression, a feature of DirectX 6 and higher. This makes them more attractive in terms of the image because, obviously, they can store more and larger textures than other cards. Intel's and ATI's 3D offerings have good image quality too. 3dfx, unfortunately, have not kept up with recent technology innovations and have stuck with tried and tested 16-bit rendering. Once upon a time this was revolutionary, but, like I said, once you've seen 32-bit colour, 16-bit never looks the same. The Voodoo2 looks decidedly dull and, to be honest, so does the Voodoo3. In its favour, the Voodoo3 does have a 22-bit colour output converter which apparently improves it a little, but not much, and there is no hardware support for bump-mapping, texture compression, anisotropic filtering or anything else, so 3dfx definitely lack punch in the image quality stakes at the moment. There is hope yet!
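To make the two points above concrete, here is a small illustrative sketch (mine, not from any card's driver): 16-bit RGB565 colour throws away the low bits of each channel, which is why subtle gradients band compared to 32-bit, and a texture compression scheme in the style of S3TC (roughly 4 bits per pixel) lets a card hold several times more texture data in the same memory. The exact formats and ratios here are assumptions for illustration.

```python
def to_rgb565(r, g, b):
    """Quantise an 8-bit-per-channel colour down to 16-bit RGB565."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(c):
    """Expand RGB565 back to 8 bits per channel; the discarded low bits are gone."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return (r << 3, g << 2, b << 3)

# A mid-grey loses precision in the red and blue channels at 16 bits:
print(from_rgb565(to_rgb565(100, 100, 100)))  # (96, 100, 96)

# Memory for one 256x256 texture under different storage schemes:
pixels = 256 * 256
print(pixels * 4)   # 32-bit uncompressed: 262144 bytes
print(pixels * 2)   # 16-bit uncompressed: 131072 bytes
print(pixels // 2)  # ~4 bits/pixel S3TC-style compression: 32768 bytes
```

So a card with texture compression can keep roughly eight 32-bit textures in the space one uncompressed copy would take, which is exactly why the S3 parts can afford larger textures.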
Now that that's established, the question remains, which one?