A couple of questions. First, what's the official term for that wavy, rippling flicker effect near large textures? For example, beams that look like comic-book thunderbolts, with jagged edges flickering in and out of existence?
Second, are there any other steps I can take, or flags I can add to the command line, to smooth the performance out?
Not that I know of... except -img2dds, which converts all textures to DDS format. That lowers system and GPU memory requirements and also ensures fairly smooth mip mapping. The catch is that, as far as I know, DDS compression is rather unfriendly to effects with smooth gradients in them, so it might cause some pixelation. Or not. Anyway, you might want to play around with it: if it causes any visible pixelation or compression artefacts but doesn't improve framerates in any meaningful way, you might as well strike it from your command line. And enable mip mapping in the driver control panel.
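To see why -img2dds cuts memory use, here's a rough back-of-the-envelope sketch. DDS files typically use DXT block compression: DXT5 stores 8 bits per pixel versus 32 for uncompressed RGBA8, and DXT1 only 4. A full mipmap chain adds roughly a third on top of the base level. The function below is illustrative only, and it ignores the 4x4-block minimum of DXT, so the smallest mip levels come out slightly undersized.

```python
# Rough VRAM footprint of a texture: uncompressed RGBA8 vs. DXT-compressed DDS.
# Per-pixel figures: RGBA8 = 32 bits, DXT5 = 8 bits (4:1), DXT1 = 4 bits (8:1).
# Approximation only: real DXT data is stored in 4x4 blocks, so mip levels
# smaller than 4x4 actually occupy a whole block each.

def texture_bytes(width, height, bits_per_pixel, mipmaps=False):
    """Approximate memory footprint of a texture, optionally with a mip chain."""
    total = width * height * bits_per_pixel // 8
    if mipmaps:
        # Each mip level halves both dimensions; the whole chain adds ~1/3.
        w, h = width, height
        while w > 1 or h > 1:
            w, h = max(w // 2, 1), max(h // 2, 1)
            total += w * h * bits_per_pixel // 8
    return total

base = texture_bytes(1024, 1024, 32)               # 4 MiB uncompressed, no mips
dxt5 = texture_bytes(1024, 1024, 8, mipmaps=True)  # ~1.33 MiB even WITH mips
print(base, dxt5)
```

So a 1024x1024 texture as DXT5 with its full mip chain still takes only about a third of the memory of the uncompressed base image alone, which is where both the memory savings and the "free" smooth mip mapping come from.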
Third, I see that my video card's temperature is around 67 degrees Celsius during the game. Is that too hot for a GeForce 7600 GT? While writing this post (i.e. almost idle) it's at 58 degrees. Is that too hot?
It's not too hot; the throttle-down temperatures are way higher than that, and GPUs in general can take that much heat easily. However, if you really want to improve it, fit your GPU with a new cooler. I've got an XFX GeForce 7600 XXX Edition, which is actually factory overclocked higher than the manufacturer's web page says it should be: a default core clock of 650 MHz versus the stated 590 MHz (quite a difference, I dare say), and 2x800 MHz for the memory. Ever since I fitted it with a Zalman VF700-Cu heatsink/fan, it rarely goes past 50 degrees Celsius. Most of the time it idles at about 43-45 degrees when the ambient room temperature is about 20 degrees.
I hope the core is at an appropriate temperature, because FreeSpace has never looked this good on my system and it seems to be holding up OK so far.
Sounds all right to me for a retail cooler; mine used to idle at 50+ as well and load at about 60 degrees. And my older MSI GeForce 6600 ran even hotter with its stock cooler. I never tried fitting that one with a new cooler, and it still works perfectly fine; I'm just not using it.
By the way, did you know that NVIDIA's Quadro drivers work quite well on GeForce cards (which normally use the ForceWare drivers)? On my particular PC, Direct3D performance in some games seems to be a lot better with the Quadro drivers; in others there's no observable effect. The downside is that the Quadro drivers make things look really bad at 8xS AA: everything goes kind of fuzzy.
