Anti-aliasing is very pretty, but most of the time, the extra prettiness isn't really worth dropping your FPS by 25-50%.
And uh, whoever says that it's useless, even at high native resolutions, doesn't know what they're talking about.
I never said that it was useless. It certainly has a place during
pre-rendering. And it's especially useful in rendering for modeling programs.
That said, for games, I've never seen any AA worth the drop.
------------------
Turning anti-aliasing up by two levels is (in my experience) more resource intensive than going from 1024x768 to 1280x960, and damned near more resource intensive than going from there to 1600x1200. The result? Your game looks better and runs faster by merely increasing the resolution than if you had blurred the Hell out of everything. Some games (and video cards) handle AA better than others, but I've never seen an improvement that was worth the drain. And in some cases I've seen AA that is outright jarring when enabled. Enemy Territory: Quake Wars handles AA in such a way that, if you use it, it blurs the entire screen rather than just the edges.
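A quick back-of-the-envelope on the pixel counts makes the comparison concrete. This is only a rough sketch: it assumes cost scales with the number of pixels or samples shaded (real hardware only loosely follows that), and it treats the AA as a naive 4x supersample rather than the cheaper multisampling most cards actually do.

# Back-of-the-envelope: raw pixel counts vs. a naive 4x AA at the base resolution.
# Assumes cost scales with the number of pixels/samples shaded, which real GPUs
# only loosely follow (and multisampling is cheaper than this full supersample).

resolutions = [(1024, 768), (1280, 960), (1600, 1200)]
base_pixels = resolutions[0][0] * resolutions[0][1]

for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base_pixels:.2f}x the base load)")

# A naive 4x supersample at 1024x768 shades four samples per pixel:
samples = base_pixels * 4
print(f"1024x768 + naive 4x AA: {samples:,} samples ({samples / base_pixels:.2f}x the base load)")

By that crude count, 1600x1200 is about 2.4x the base load while brute-force 4x AA is 4x; real multisampling lands somewhere in between, but the ordering matches the point above.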
If the game's engine and assets were set up properly (hah), then no object should have noticeable 'jaggies' at the highest (currently) possible resolution. If they do (see: Oblivion), then
someone screwed up. Raising stock Deus Ex to 1600x1200 gives me no reason to turn AA on, at all.
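For a sense of how much smaller the stair-steps actually get, here's the pixel pitch at those resolutions on a hypothetical monitor with roughly 365 mm of viewable width (an assumed figure, purely for illustration):

# Pixel pitch at different resolutions on a hypothetical monitor with
# ~365 mm of viewable width (assumed figure; varies by monitor).
# Smaller pixels mean each stair-step on an edge is physically smaller.

viewable_width_mm = 365.0  # assumption for illustration

for horizontal_pixels in (1024, 1280, 1600):
    pitch_mm = viewable_width_mm / horizontal_pixels
    print(f"{horizontal_pixels} px across: ~{pitch_mm:.2f} mm per pixel")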
------------------
Anti-aliasing is a necessity when using any LCD monitor. Heck. WINDOWS USES IT ON THE TEXT!
* BloodEagle is the proud owner of a CRT monitor.
And using anti-aliasing on text (which is usually pre-rendered, if you really want to use the word "rendered" here) is far different from using it on, say, a space-shuttle model.
[EDITED TO CORRECT A SINGLE, RATHER EMBARRASSING SPELLING ERROR]