Hard Light Productions Forums
General FreeSpace => FreeSpace Discussion => Topic started by: fsphiladelphia on March 27, 2007, 06:46:09 pm
-
Hi -- I've been reading the board for a little while, first time post. Hopefully someone will beam me. After playing the HotU version for a while, I found the HLP forums and took the plunge into FsOpen. To the people responsible for all of this, I'd like to say a sincere thanks. This game is gorgeous, fun to play, and just very complete. Your work is definitely appreciated.
After a couple of weeks with it, I loved the game so much I had to break my modest grad student budget and pick up a video card. I got a 7300gs and returned it after 2 days, exchanging it today for a 7600gt. So far, I can say the new card rocks.
What I want to know (and I've searched through the forum as well as other websites, but can't tell definitively) is: what graphical settings make the biggest impact on gameplay? For background info, my system is as follows: Athlon 64 3200+, 512MB RAM. While experimenting with the 7300 and now with the 7600gt, I've been playing the Derelict mission 'Rites of Passage' over and over, comparing FPS.
Right now I'm getting a constant 60 FPS whether I'm looking at space or at cap ships; when a battle starts it drops to about 34. When facing the enormous space station (whose name I should have memorized by now: Altair Station?) in the middle of a battle, it dropped to under 10 for a few moments; I think something was either warping in or there was a shockwave.
I've got the AF up to 16, and the AA is at 8xS. In the nVidia ForceWare settings I have the texture filtering set to Quality, and transparency filtering is on supersampling. I'm running the game at 1280x1024.
This is my command line:
C:\Games\Freespace2\fs2_open_3_6_9.exe -mod derelict,mediavps -spec -glow -env -jpgtga -mipmap -2d_poof -noscalevid -missile_lighting -cache_bitmaps -orbradar -rearm_timer -3dwarp -warp_flash -snd_preload -alpha_env -fps -spec_exp 11 -spec_point 0.8 -spec_static 0.8 -spec_tube 0.8 -ambient_factor 65 -ogl_spec 80
Basically, I'm looking to have the best possible graphical experience, and I'd love to do so without dropping below 25 FPS in the most intense moments. I'd also like to keep it in 1280 x 1024 because I've got a 19" monitor. I'm sure this is possible, but what graphics settings most negatively impact the performance? I am guessing it's the AA being set to 8xS.
Thanks for any input, and, glad to be here.
-
The biggest culprits for your framerate, without diminishing the FSO visual experience, are AA and AF, along with your texture settings: basically, the things you set in your video software, not in FS. You should be able to max out the graphical options in FSO itself. Just tinker with the video card's settings; the highest ranges actually don't make an enormous difference to visual quality versus the high or moderate ranges, but they do use more resources.
-
Thanks for the info. I'm going to keep tinkering with the nvidia control panel to see where it goes. I do have the settings maxed in FSO.
I am emboldened by the fact that the framerates are pretty high already, including a constant 60 in smaller dogfights. It might be higher than 60 if I unpeg the vsync, so maybe I'll do that just to take a look and see where it really is.
-
I turned off vertical synch to see the 'true' fps, and lowered the AA to 4x. Here were the results:
120 FPS most of the time, including smaller dogfights.
~85-90 FPS while staring at Luxor Station prior to the battle.
A wide range of ~38-90 FPS while staring at Luxor Station and some capships during a firefight, with a drop to a low of around 13 FPS for a second or two while three Fenris cruisers jumped in.
Finally, it stayed at or above 30 FPS during a huge fight between Luxor Station and 6 or 7 other cap ships all firing beams on screen, though like the above scenario, it would momentarily freeze whenever there was a shockwave or something warped in.
Are the above results basically typical? I'm somewhat satisfied and will probably keep tinkering to improve it. Also, I've read in other threads about moving the shockwave files from one .vp to another, or to another directory, but I don't quite grasp the concept. Is this something that would improve performance, and how would one go about it?
-
I have a very similar system: same CPU, but I have 1024 MB RAM, and my GeForce 7600 GT is (supposedly) from XFX's XXX range of products. I'm running it clocked to 720/1750 MHz core/memory (versus default clocks of 650/1600 MHz, versus the stock 7600 GT clocks of 560/1400 MHz...).
My settings for FS2_Open are:
1280x1024@32bit resolution
cmdline: fs2_open_3_6_9.exe -spec -glow -env -jpgtga -mipmap -nomotiondebris -2d_poof -missile_lighting -img2dds -no_vsync -cache_bitmaps -dualscanlines -targetinfo -orbradar -rearm_timer -ship_choice_3d -3dwarp -warp_flash -snd_preload -alpha_env -voicer -fps -ambient_factor 0 -no_emissive_light -fov 0.55 -spec_exp 16.7 -spec_point 0.6 -spec_static 0.9 -spec_tube 1 -mod mediavp368zeta
8xS AA
16xAF set from the Launcher
Trilinear mip mapping set from Launcher
All settings maxed in NVidia Control Panel; visual quality set to High Quality etc. etc.
Vertical synchronization disabled in the launcher, but enabled in the NVidia Control Panel; that makes the game run faster while keeping the framerate from causing screen tearing.
Running with MediaVP 3.6.8 zetas with adveffects enabled.
For most of the time I'm running happily at over 60 FPS. The lowest FPS I've seen was in Axem's Just Another Day mission 'Disco Inferno', where a gargantuan particle-spewing explosion cluster dropped the FPS to <19 with the screen full of explosions...
As has been already stated, the biggest impact to performance is AA and AF levels from drivers, and at least with my card, specifically the AA setting. Changing the setting from 8xS AA to 4xAA improves the frame rates greatly without sacrificing much visual quality. AF setting can be kept maxed out rather safely.
Most likely the biggest thing affecting your system's performance is the rather low amount of system RAM. Many of the textures in the MediaVPs are rather big, and that can cause slowdowns, especially during explosions.
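Since big textures keep coming up: a rough back-of-the-envelope sketch (my own arithmetic, nothing measured from the MediaVPs) of what one large texture costs in memory, which is also why DDS/DXT compression such as -img2dds helps so much:

```python
# Bytes-per-pixel figures are the standard ones:
# uncompressed RGBA = 4 B/px, DXT5 = 1 B/px, DXT1 = 0.5 B/px.
BYTES_PER_PIXEL = {"rgba": 4.0, "dxt5": 1.0, "dxt1": 0.5}

def texture_bytes(width, height, fmt, mipmaps=True):
    """Size of one texture; a full mipmap chain adds ~1/3 on top."""
    base = width * height * BYTES_PER_PIXEL[fmt]
    return int(base * 4 / 3) if mipmaps else int(base)

for fmt in ("rgba", "dxt5", "dxt1"):
    mb = texture_bytes(1024, 1024, fmt, mipmaps=False) / 2**20
    print(f"1024x1024 {fmt}: {mb:.1f} MB")
```

So a single uncompressed 1024x1024 map eats 4 MB, where DXT1 gets the same texture down to 0.5 MB; a mission full of explosion and shockwave animations multiplies that quickly on a 512 MB machine.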
-
Thanks for the info there. It's great to get feedback from someone with a similar system and almost the same video card. I'll try disabling vsync in the launcher and setting it to 'force on' in the nvidia control panel. Without more RAM, I'm going to keep the AA at 4x, though 8xS did look nice.
Also, thanks for the heads-up re: the RAM. I know RAM is one of the usual suspects, and it's probably time for an upgrade there as well. I am getting great framerates, but when shockwaves and warp-ins happen, there can be a lag.
Still, I can't say enough about the 7600 GT -- for $100 after rebate at Micro Center, I'm not sure there is a better value out there.
Are the 3.6.8 zeta vp's better/more efficient, performance-wise, than the ones that would have come straight from the SCP installer? Do they look a lot different? Also (revealing real noob status here), how do I know if I have adveffects enabled? The game looks stunning and the visuals I'm getting are on par with the screenshots I've seen in the forums, so I imagine I have them on, but I don't know.
-
Hey fsphiladelphia,
I've been through the process you're going through now, looking for best performance/quality. The biggest changes I could make to help frame rates without changing overall quality are detailed in this post:
http://www.hard-light.net/forums/index.php/topic,44757.msg913653.html#msg913653
Let me know if you try any of these and what you think!
Huggybaby
-
MediaVP 3.6.8 zetas are the latest (and probably last) pre-release incarnation of the 3.6.8 VPs; a final 3.6.8 release isn't going to exist. The MediaVP cycle will jump directly to the 3.6.9 MediaVPs, hopefully in the near future.
As far as I know, the SCP installer does install these latest MediaVP's, but there are numerous things that can make performance better without affecting visual quality. One of the most important things for PC's with low memory is to download and install DDS Beamglows, which replace the ANI beamglows in the MediaVP's and reduce the memory use while retaining practically the same quality.
Another way to reduce slowdowns is to extract the medium-resolution shockwaves from mv_effects.vp (they should be image files named shockwave-<something>) and place them in the mediavps\data\maps\ directory; from there they will override the ultra-high resolution shockwaves that reside in mv_adveffects.vp.
Those ultra-high res shockwave textures are one of the things that cause the most FPS drops when using adveffects, as opposed to only using normal effects.
Unfortunately, I can't post more accurate instructions at the moment. :blah:
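For the curious, the .vp container format itself is simple enough to script against. A rough Python sketch of walking an archive's index (the 16-byte "VPVP" header and 44-byte directory entries are my reading of the commonly documented layout, not official tooling; double-check against a real extractor before trusting it with your files):

```python
import struct

def list_vp(path):
    """Yield (path, offset, size) for every file in a .vp archive.
    Directory entries have size 0; a ".." entry pops one level."""
    with open(path, "rb") as f:
        sig, version, diroffset, direntries = struct.unpack("<4siii", f.read(16))
        assert sig == b"VPVP", "not a VP archive"
        f.seek(diroffset)
        crumbs = []  # current directory path inside the archive
        for _ in range(direntries):
            offset, size = struct.unpack("<ii", f.read(8))
            name = f.read(32).split(b"\0", 1)[0].decode("ascii")
            f.read(4)  # timestamp, unused here
            if size == 0:
                if name == "..":
                    crumbs.pop()
                else:
                    crumbs.append(name)
            else:
                yield "/".join(crumbs + [name]), offset, size
```

To pull the medium-res shockwaves out, you'd then seek to each matching entry's offset and write its size bytes into mediavps\data\maps\.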
-
Turn off missile lighting. There's no way in hell you'll be able to see it in the course of normal playing, and it causes a reasonably large slowdown, especially when someone fires off a fish.
-
Hi fsphiladelphia. I have the exact same system specs as you do, except I have 1.5 GB of RAM. System RAM is not the culprit here, even with mv_adveffects; the culprit is your video card settings. As was said already, those don't need to be maxed out to make much of a difference, so it's best to go into the 7600gt's settings and tweak for performance right from there. FSAA is very resource-intensive, so consider just turning it off. AF, on the other hand, is worth turning on: even as low as 2x or 4x it makes a big difference in visual quality, and while it sacrifices some performance, you'll probably find the cost negligible. Another thing to turn on is triple buffering, which can improve performance. One more performance-sacrificing thing I recommend turning on is trilinear filtering, because bilinear filtering looks like ****; again, the drop from trilinear is negligible.
Another thing is to take control of your refresh rate. The constant 60 FPS and nothing higher is probably because you have vsync on in the Features tab of FS2. Either turn off vsync (vsync matches the framerate of your game to the refresh rate of your monitor) and get variable but higher framerates, or do what I do: for FS2 I play at 1024x768 at 85 Hz with vsync on. This helps ensure that my framerate stays constant and steady, never going above 85 FPS and doing a pretty damn good job of not going below it. Tweaking the framerate this way, using vsync and the refresh rate, gives extremely smooth and stable gameplay with virtually no tearing of the 3D environment even when I turn my fighter (vsync helps alleviate tearing; that's one reason many use it).
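That "constant and steady" behavior falls out of a little arithmetic. A toy model of what vsync does to the displayed frame rate (my own simplification; real drivers, render-ahead queues, and triple buffering complicate this):

```python
import math

def displayed_fps(refresh_hz, render_fps, triple_buffered=False):
    """With vsync on and double buffering, a frame that misses a
    refresh waits for the next one, so the displayed rate snaps to
    refresh/1, refresh/2, refresh/3, ...  Triple buffering lets the
    GPU keep rendering, so you simply get min(render, refresh)."""
    if triple_buffered:
        return min(render_fps, refresh_hz)
    return refresh_hz / math.ceil(refresh_hz / render_fps)

print(displayed_fps(85, 120))       # plenty of headroom: locked at 85.0
print(displayed_fps(60, 55))        # just misses 60 Hz: snaps down to 30.0
print(displayed_fps(60, 55, True))  # triple buffering: 55
```

That snapping is why a card that can render "almost 60" can feel much worse with vsync on, and why triple buffering (mentioned above) helps.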
You have virtually the same setup that I have. I'd really recommend getting at least a gig of system memory, though; otherwise, in games like Half-Life 2, FEAR, or even FS2 SCP for that matter, you may get hops and jumps in gameplay from the system leaning on virtual memory, compared to my machine, which doesn't need to touch it much in games thanks to its 1.5 GB (my system hasn't hopped and skipped in any game since I upgraded from 512 MB).
Last thing to say: there should be no reason you can't run with max settings in FS2 SCP. I have all of the mediavps including mv_adveffects, with all of the high-end features checked in the launcher, my 7600gt control panel tweaked for performance (except for my trilinear filtering), and I get absolutely flawless gameplay. And please try getting an extra 512 MB stick of memory, or just plain old add a 1 GB stick to pair with your existing memory.
Missile lighting? I notice it all the time; it especially helps when avoiding missiles or just plain old looking for enemy missiles (it's not really a performance dropper either; then again, I notice it because I look for it, or I wouldn't have it checked in the launcher). Ultra-high-res shockwaves? I'm not sure whether those were replaced with the medium or high-end shockwaves, or are going to be, but whatever shockwaves are in adveffects, they don't ruin my performance either.
You have a good gaming machine; learn to tweak its settings properly.
-
Thanks for all of the input, guys. It's a big help. FSO is great, but optimizing it seems to be system-specific and there are lots of little inside tricks; without this board I'd never be able to try them.
When I get home today I'm going to try the medium-res beams and shockwaves. I did notice last night that I'm having far more issues with shockwaves than with beams. The mission I'm benchmarking with features something like 10+ cap ships firing beams at one another at one point, and the FPS doesn't drop into the 20s, but if there is only a negligible difference in quality, I'll try the new beams anyway. I'll also continue messing with the nvidia control panel. I really would like to keep the AA at 4x if possible and the AF at 16.
Question RE: AF -- if I have it set to 16 in the launcher, can I turn it off in the nvidia control panel -- ie: is the system running that effect twice now, once through FSO itself and once through the nvidia control panel? If so, is it better to have it on in one versus the other, or should I have it set to 16 in both places?
As for the shockwaves, I don't mind dropping them down, since they are only onscreen a second and really are one of the main culprits.
Is it correct to assume that in the 3.6.9 mediavps most of this stuff will be better optimized, so these workarounds will no longer be necessary?
Another question -- my computer uses DDR RAM, PC3200. At this point I am considering heading to Micro Center after school to pick up at least 512MB more. I know that DDR RAM is forwards-compatible (if that's the right concept) as far as speeds -- but does it make a huge impact if the second DIMM is at PC4000 or higher as opposed to PC3200?
Thanks again for the answers; this is a big help. I haven't been this interested in a game and getting it to work so well in probably 10 years, and this forum is great.
-
Question RE: AF -- if I have it set to 16 in the launcher, can I turn it off in the nvidia control panel -- ie: is the system running that effect twice now, once through FSO itself and once through the nvidia control panel? If so, is it better to have it on in one versus the other, or should I have it set to 16 in both places?
I think that it is indeed better to set those features to "software controlled" mode in NVidia's drivers. That includes anisotropic filtering and MIP mapping, as both of those can be set to the same levels that the drivers support (16x for AF and trilinear filtering for MIP mapping).
Of course, you can test whether there's any difference in FPS/quality if you first set these features to software controlled in the drivers, then disable them in the launcher but enable them in the drivers, or if you run them both.
Another question -- my computer uses DDR RAM, PC3200. At this point I am considering heading to Micro Center after school to pick up at least 512MB more. I know that DDR RAM is forwards-compatible (if that's the right concept) as far as speeds -- but does it make a huge impact if the second DIMM is at PC4000 or higher as opposed to PC3200?
Find out whether you have two 256 MB modules or one 512 MB module. If you have one 512 MB module, buy a similar one, or as similar as you can get, and you're set to use the memory to its full extent. But if you have two 256 MB modules, it gets more complicated.
In case you don't know: dual channel is a feature that lets the computer double the memory bandwidth without increasing the memory or front side bus clock frequencies. But paired modules are a prerequisite. To enable dual channel, memory modules must be installed in pairs with matching size and speed, and preferably even the same brand. So, adding a single 512 MB module would not let at least that module run in dual channel mode; dunno if it would affect the other two modules as well.
You can of course stick a third, 512 MB module into the socket, and it will work... kinda. The PC will detect the new memory installed, but it will only operate in single channel mode. How it affects the two older modules, I don't know.
So... if you have two 256 MB modules, and you want the PC to keep using the dual channel feature for the new memory as well... then you can either buy two new 256 MB modules to get your memory up to 1 GB, or you can buy two 512 MB modules to get it to 1.5 GB.
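To put numbers on the dual-channel claim above, here's the standard peak-bandwidth arithmetic (just the JEDEC naming conventions, nothing measured):

```python
# PC3200 = DDR-400 = 400 MT/s on a 64-bit (8-byte) channel;
# the module name is literally its peak MB/s for one channel.
def channel_bandwidth_mb_s(megatransfers_per_s, bus_bits=64):
    """Peak bandwidth of one channel: transfer rate x bus width in bytes."""
    return megatransfers_per_s * bus_bits // 8

single = channel_bandwidth_mb_s(400)  # PC3200, one channel
dual = 2 * single                     # matched pair in dual channel mode
print(single, dual)                   # prints: 3200 6400
```

So a matched PC3200 pair peaks at 6.4 GB/s versus 3.2 GB/s single channel, which is the whole argument for buying modules in matching pairs.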
-
Of course that would be dependent on how many memory banks he has spare to put the memory in! :)
Oh and fsphiladelphia .................................
:welcome:
You have now been beamed :D
-
The biggest performance killer is the shockwave. Switching to a smaller version of that will help a lot. AF should be set to 16x globally at all times, unless you find the 7 series shimmering too annoying. The performance hit from AF is something like 2% on any card from the last four years.
You can of course stick a third, 512 MB module into the socket, and it will work... kinda. The PC will detect the new memory installed, but it will only operate in single channel mode. How it affects the two older modules, I don't know.
This only works on the old Socket A processors, where you do get full dual channel. Anything newer will only boot at all with 1, 2 or 4 sticks.
-
This thread is quite interesting. fsphiladelphia, keep asking questions, I'll be reading the answers... :D
-
Jr2 - there's definitely lots of info here. I'd love to ask more questions but I don't know if I'll have any until later this afternoon after I am back home experimenting.
I am going to have to go home and see if I have 2 memory modules or one before I commit to buying anything. That's a bummer because the store I go to is near my school, 25 minutes from my apartment. So I'll have to go home, check it all out, and go back.
I know I have four memory slots to work with. I seem to recall seeing two modules when I installed the video card yesterday, but I also seem to recall the crucial system scanner telling me I only had 1 512MB module installed. I bought the computer about 15 months ago, and I know it came with 512. It's possible that I took the memory from my previous setup and installed it as well and it isn't being recognized, but I doubt I would have just chucked any old kind of memory in there. I'll easily be able to figure this out in a couple of hours when I get home by removing one or the other and running the scanner/checking system properties. Crossing my fingers to find a single 512MB module in there.
The one thing I really like about getting into a game which can be somewhat intensive is that it forces you to keep your machine up to date. It's been a while since I've taken such an active interest in my setup.
:welcome:
You have now been beamed :D
Thanks!
-
@fsphil: Sometimes I've seen computers accept two different types of memory and just ignore the incorrect one (the one in the lower-priority bank, e.g. bank 1 instead of 0). (I've also seen this happen with an incorrectly seated memory chip.)
Also, if you remember your make/model, go to the manufacturer's website, punch in the model info in the support section, and see what they gave you, it could possibly be under 'technical documents' or similar. Or google your make model specs, eg Dell Dimension XPS 5400 specs... just make sure you're looking at factory specs, not someone's customized one.
-
7600GT should be more than able to handle it, I guess, but you could try without the -env flag -- environment mapping tends to be one of the heavier options.
-
7600GT should be more than able to handle it, I guess, but you could try without the -env flag -- environment mapping tends to be one of the heavier options.
and it looks fugly on some ships.
-
Also, if you remember your make/model, go to the manufacturer's website, punch in the model info in the support section, and see what they gave you, it could possibly be under 'technical documents' or similar. Or google your make model specs, eg Dell Dimension XPS 5400 specs... just make sure you're looking at factory specs, not someone's customized one.
Already a step ahead of you :) I just went to crucial.com and put in my computer -- compaq presario 1625nx, apparently it comes with 256 standard. I distinctly remember there being 2 modules inside, however, because I unlatched one of them yesterday while installing the video card.
The second is definitely something I added myself, most likely from my past computer. If I had to guess, I would say that right now my memory is probably not running in dual channel. I think one of the modules might be PC2100 and one is PC3200.
I could easily go get a 1GB stick or two 512MB sticks. However, if I get the two 512 sticks, I'll be up to 1.5 GB and possibly 2 of the 4 sticks would be running dual channel. On the other hand, if I get the single 1GB stick, I may not be able to boot up with 3 sticks in there, so I'd have to get rid of one of the 256s, which would leave me at 1.25GB, also not running in dual channel because the modules would not be matching sizes.
I'd like to minimize travel time, if possible, and avoid driving around Philadelphia at rush hour when school is right next to the store. I'm guessing at this point that the two matched 512 sticks are probably the best/safest bet. Best case scenario, the 2 existing sticks are already in dual channel and the new ones are as well, and I have 1.5 GB all in dual channel. Worst case scenario, I'm at 1.5 GB without dual channel.
Question about that, however -- would it be a noticeable upgrade to add 1gb of ram if I do not receive the benefit of dual channel?
At least I can rule out buying a single 512MB module, because in that case, I can't boot up with 3 modules, and I'd have to lose one of the 256's and I'd definitely not be in dual channel.
...and I suppose while I'm here, may as well add another option -- getting 2 additional 256mb modules?
Hm, the more I write, the more I realize I should go home and see whether the existing sticks are in dual channel or not. That will probably make it a lot easier. Final question (for this post): would it be preferable to run 1 GB in full dual channel or 1.5 GB not in dual channel? (Buying a matched pair of 512s, or even just a solo 1 GB module and losing the 256s altogether, might be a final option.)
Haha...what a can of worms I've opened up. :lol:
7600GT should be more than able to handle it, I guess, but you could try without the -env flag -- environment mapping tends to be one of the heavier options.
It's definitely something I could experiment with in a couple of hours.
-
Just get 2x 1GB 3200 sticks and be done with it. :lol:
-
http://www.kingston.com/branded/default.asp
http://www.memorystock.com/desktop-memory.htm
Check those out, they seem to offer quite accurate information about desktops and their memory upgrades.
->CP5670: Thanks for correcting me; that information was new to me... So, I wouldn't be able to buy just a single 1 GB module in addition to my two 512 MB modules, as I have an S939 mobo/processor? I would have to buy two new modules if I want to upgrade my PC's memory (presuming I keep the old modules in the PC along with the new ones).
...well, that sucks. :ick: I guess I'll have to get those two 1GB sticks as well... :drevil:
-
Just get 2x 1GB 3200 sticks and be done with it. :lol:
I'd love to...but that upgrade may not be in the card$ :lol:
-
Try shopping online?
EDIT: Careful you get it from a good source, though. (Newegg, TigerDirect, BestBuy, some others...)
Or you could eBay for a NIB (new in box) one, as long as the seller has a high rating.
-
Try shopping online?
EDIT: Careful you get it from a good source, though. (Newegg, TigerDirect, BestBuy, some others...)
Or you could eBay for a NIB (new in box) one, as long as the seller has a high rating.
I think I may go that route. I hate waiting for, and paying for, shipping; however, the prices at tigerdirect.com are literally almost 50% off what Micro Center charges, and Micro Center is pretty competitive as far as in-person retailers go. The 7600gt was only $100 after rebate, the same as newegg/tigerdirect, but on RAM it seems like MC is getting killed. I'll likely go home, find out exactly what I have, decide what I need to get, find it online, and jot the price down. If the store here has it within $10 or so of the online price, I'll just get it in person, but the online savings on RAM seem too big to pass up.
-
...
And the jack@ss of the day award goes to: myself.
You guys probably won't believe this, but I did have 1GB of ram all along (which, incidentally, I *thought* I did prior to yesterday). Turns out I dislodged one of the two modules yesterday when installing the 7600GT. It's two 512 PC3200's, so presumably I'm running in dual channel. Looks like RAM won't be an issue after all.
I feel a little silly, but at least I won't be spending any money. Now back to fiddling with my graphics settings.
-
...
And the jack@ss of the day award goes to: myself.
You guys probably won't believe this, but I did have 1GB of ram all along (which, incidentally, I *thought* I did prior to yesterday). Turns out I dislodged one of the two modules yesterday when installing the 7600GT. It's two 512 PC3200's, so presumably I'm running in dual channel. Looks like RAM won't be an issue after all.
I feel a little silly, but at least I won't be spending any money. Now back to fiddling with my graphics settings.
:lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol:
-
Emboldened by the 1 gig of ram, and realizing I had therefore not yet actually seen what the 7600GT was truly capable of, I set the graphics settings as follows (fully expecting more of the same):
FSO: Everything on, adveffects on, AF 16, Vsynch off
Nvidia: AF = software controlled, Vsynch On, AA 8xS, Texture filtering: quality
Any situation without capships (normal flying, dogfights): 120fps
Staring at the fleet + luxor station: ~85-95fps
Dogfighting while fleet and/or luxor station are on screen: ~45-90fps, never less than 45 though
Not moving, watching capships fire beams at one another: ~30-50fps
Dogfighting in the middle of beams going off, capships moving, shockwaves exploding: ~25-50fps
Near a shockwave, or if something warped in, a possible small pause; in the middle of one huge shockwave it dropped for a moment to 13 FPS.
First, the game remained at all times fluid and playable. Second, with the settings that high, it looked absolutely amazing. Third, I am floored by these results. Without much tinkering, my rig is really plowing through a lot of activity and holding up relatively well. With tinkering, I think it's conceivable I could get it to my goal of never really dropping below 25, and not hiccuping.
Perhaps the best part of the whole thing was that when a shuttle or smaller cap ship, like a food container or frigate, got blown up in the 2000+ km range, the shockwaves did not cause any interruption at all to the action.
Now let's see what improvements I can make.
-
Huggybaby -- I took your suggestions and installed the more sane glowmaps for the beams as well as the mid-range shockwaves.
The game ran flawlessly -- no pauses, no hiccups, and even in the most intense situations, tons of stuff, capships, whatever, it never dropped below maybe 35.
:D :D :D :D :D
Thanks to everyone here for their suggestions!
-
Hi fsphiladelphia. I have the exact same system specs as you do, except I have 1.5 GB of RAM. System RAM is not the culprit here, even with mv_adveffects.
:p Look at his results, 512 vs 1024. :p
-
Well, for me, I have 1.5 GB of RAM, and it's all dual channeled. Of course, dual channeling your RAM only affects system memory and not what's on your video card (this is a given, but I wasn't sure if fsphiladelphia knew or not... probably did).
Jr2, get your numbers right: that's actually 512 vs 1536. And good job to him for already having a gig and making it work again :)
While we're on the subject, the 7600gt is a ridiculously good midrange card. The 7600gt outperforms the radeon x1800 and comes on par with the x1900. Nvidia has seriously outdone themselves with the geforce 7 series, otherwise i wouldn't have bought one.
Apparently none of you know the advantages of utilizing vsync properly. Vsync will lock your framerate to your refresh rate; if you have a monitor that does over 60 Hz, you'll have more fun. With something less strobe-like, such as 85 Hz, you can sustain a framerate that won't try to go over or under 85 FPS. The benefit is that it helps eliminate tearing and keeps the framerate steady, all adding up to an extremely smooth visual experience. Another benefit is that while vsync keeps your framerate from jumping around, it also frees up the processor for other work such as visuals (providing a good balance), as opposed to chasing the highest framerate the game can do on top of all the geometry, shading, lighting, and other **** GPUs do. Quite frankly, it provides the best game experience when utilized properly, as opposed to maintaining the maximum framerate at all times; chasing the maximum possible framerate gets counterproductive after a while.
-
Apparently none of you know the advantages of utilizing vsync properly.
:p Besides me. 1024x1080 (I think), 85 Hz, V-Sync on. 8)
-
Ok, cool, so I'm not the only one. 85 Hz is a good choice, because it's not a strobe light like 60 Hz, and it's also not so high a refresh rate that your card can't maintain a matching framerate. I'm sure that at 120 Hz your card would have a hard time keeping up all the time, as opposed to 85 Hz (also, for me at 1024x768, 85 Hz is the highest refresh rate my monitor will do in that resolution).
-
The 7600gt outperforms the radeon x1800 and comes on par with the x1900. Nvidia has seriously outdone themselves with the geforce 7 series, otherwise i wouldn't have bought one.
Uh, no, those cards are not comparable at all. The X1900s are more than twice as fast.
The 7 series was overall not nearly as good as the 6 and 8 series. It was only an evolutionary improvement over the 6 series in terms of speed, and they also introduced the AF shimmering. You could basically get rid of it by switching to high quality mode, but the performance hit from that was anywhere between 5 and 30% depending on the game. The 6 and 8 series cards never had/have this problem.
-
Actually, I forgot... I run my desktop at 1024x1080, FS I think I cut down to 1024x768, because of distortion in x1080 (aspect ratio different).
-
This may seem like a silly question to most, but I'm a bit out of date when it comes to the latest graphics cards (need to start reading the computer mags again). What's everyone's opinion on the ATI Radeon 9250 (256MB) as a card? My brother-in-law has just got one second hand from his brother and wants me to fit it for him; I just want to know if it would be worth it.
-
Well, all I know is that the x1600 and the x1800 suck. So far the best midrange card out of the 7 series is the 7600gt, and it's an even better buy compared to the x1600 especially.
Who gives a **** about the AF shimmering? I don't really see it in my games; then again, I don't bring AF over 2x, as I usually don't use anything higher. Just my preference, I guess.
I took your word for it and found one of the benchmarks i referenced.
http://enthusiast.hardocp.com/article.html?art=MTAzOCwxLCxoZW50aHVzaWFzdA==
It's the card i own versus an x1800 gto. It's about of equal performance to an x1800 gto. I remembered my information incorrectly (i was checking benchmarks seeing if it was a good idea for me to get an x1600 which the x1600 pretty much sucked). I mistakenly remembered 19 instead of 18. But when looking for a midrange card i recommend a 7600gt, great performance, it's a good buy and will satisfy.
At least the geforce 7 series performs much better than the 6 series and i believe it was the 7600gt's and higher have hd-dvd and blu-ray hardware acceleration that remove as much as 40% of the work away from the processor when watching hd off the two mediums (highly recommended to have dual core when playing hd dvd or blu ray).
-
Apparently none of you know the advantages of utilizing vsync properly.
Well... I'm confined to 60 FPS by hardware; that's as much as my monitor can do, but since it's an LCD it doesn't have the "strobe" effect that CRTs have.
Anyhow, I've noticed that I usually get better results when I disable VSync in the Launcher but enable driver-level VSync from Nvidia's control panel. For some reason, the framerate drops are larger if VSync is enabled in the Launcher... dunno why. Perhaps because the code has to do that much more in one pass or somesuch. :rolleyes:
-
Apparently none of you know the advantages of utilizing vsync properly.
Anyhow, I've noticed that I usually get better results when I disable VSync in the Launcher but enable driver-level VSync from Nvidia's control panel. For some reason, the framerate drops are larger if VSync is enabled in the Launcher... dunno why. Perhaps because the code has to do that much more in one pass or somesuch. :rolleyes:
I ran with this VSync suggestion as well: I've got it off in the Launcher and on in the Nvidia control panel. On the desktop I have the resolution set to the same as the game, 1280x1024, but for some reason the highest refresh rate I can get is 75 Hz. I will experiment tonight to see whether setting VSync on in the Launcher yields a constant 75 FPS. It would be nice, because I can't much tell the difference between 75 FPS and 120 FPS, but it would definitely improve those moments where it drops to 35 or so. For now, though, I like the results I've had, and it's off in the Launcher.
While I agree in theory that a constant framerate would create a more consistent experience, and possibly free up resources that would otherwise be working to raise the framerate when it drops, I'm not convinced that turning it on will help in situations where it drops to 35 or 50 FPS. I'm of the mind that in those situations, that's just the best the rig can do with what FSO is throwing at it.
Nonetheless, system RAM has some kind of impact: the difference was dramatic when I reinstalled (!) that disconnected module.
-
Who gives a **** about the AF shimmering? I don't really see it in my games, then again I don't bring AF over 2x; I usually don't use anything higher, just my preference I guess.
You need a better monitor in that case. :p The default quality AF looks like garbage compared to cards from the other generations of either company's lines. There is a reason why many of the later reviews of the 7 series models used HQ mode after the issue became more well known. A 7600GT is still a great buy and easily the best choice in the sub-$100 price range, but let's be realistic about its capabilities.
Also, 2x AF hardly does anything. You may as well have it at 16x (the performance hit from 16x is tiny on default quality mode) or leave it off altogether if you want to minimize the shimmering and other artifacts.
The HD acceleration may be useful, but you have to pay extra for Nvidia's decoder software for it to do anything (ATI's AVIVO thing has the same problem). Apparently that has something to do with royalty fees.
Vsync and triple buffering are definitely a must in a game with smooth turning motions like FS2. It's actually on by default. Note, however, that vsync slightly reduces overall framerates (even when the framerate is below the refresh rate), and triple buffering also causes a slight stuttering effect in some situations. You'll get the highest framerates with it turned off, but motion also won't look smooth.
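The framerate behaviour described above can be sketched with a bit of arithmetic. This is a simplified model (fixed render time per frame, idealized driver behaviour; the function names are just made up for illustration), but it shows why double-buffered vsync quantizes the framerate to refresh rate / N, while triple buffering mostly just caps it at the refresh rate:

```python
import math

def vsync_fps_double_buffered(render_ms, refresh_hz):
    """With double buffering, a finished frame must wait for the next
    vertical blank, so every frame occupies a whole number of refresh
    intervals (60, 30, 20, 15... on a 60 Hz display)."""
    interval_ms = 1000.0 / refresh_hz
    intervals = math.ceil(render_ms / interval_ms)
    return 1000.0 / (intervals * interval_ms)

def vsync_fps_triple_buffered(render_ms, refresh_hz):
    """With triple buffering the GPU keeps rendering into a spare
    buffer, so the average rate is min(render rate, refresh rate)."""
    return min(1000.0 / render_ms, float(refresh_hz))

# A 20 ms frame (50 FPS raw) on a 60 Hz monitor:
print(round(vsync_fps_double_buffered(20, 60)))  # 30
print(round(vsync_fps_triple_buffered(20, 60)))  # 50
```

So a card that is only slightly too slow for the refresh rate loses a lot under plain vsync, which is why enabling triple buffering alongside it matters.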
-
But what's everyone's opinion on the ATI Radeon 9250 (256MB) as a card.
Ok for desktop use.
Ok for older games on 1024*768, no AA, no AF.
-
Hmm, I just checked my settings...
I have a BFG 6200OC 256, 1GB system memory (soon 2GB), 3GHz P4
Could someone suggest settings for these? Also, when AF and AA are set to app-controlled, what does FSO set them at?
Anisotropic filtering = app-controlled
Anisotropic mip filter optimization = off
Anisotropic sample optimization = on
Antialiasing settings = app-controlled
Conformant texture clamp = use hardware
Extension limit = off
Force mipmaps = none
Hardware acceleration = multiple display performance mode
Negative LOD bias = allow
Texture filtering = Performance
Transparency antialiasing = off
Trilinear optimization = on
Triple buffering = off
V-Sync = force on
-
Your AF setting is good, since most of the time you control AF from the game.
Anisotropic mip filter optimization you want on; it says optimization, why wouldn't you want it on?
Anisotropic sample optimization: it's on, yay.
Antialiasing app-controlled (looking good so far).
Conformant texture clamp: I've got mine set to off. I don't know what it's for, really; it doesn't seem as important as the other stuff though.
Extension limit: pretty much don't touch that, it makes FS2 unplayable. Best not to mess with this setting.
Force mipmaps: you should set that to trilinear if you want extra visual quality.
Hardware acceleration: I've got one monitor, so mine is set to single display mode. If you have one monitor as well, then you should change it to single.
Negative LOD bias: mine's set to clamp (I don't know what this is... I left it at the default).
Texture filtering: Performance, yes.
Transparency antialiasing: off, yes.
Trilinear optimization: on, yes.
Triple buffering: you should have that on; triple buffering is faster than double buffering. This is a speed-increasing setting.
VSync: I've got mine set to force on. It's good to have on for games that don't let you turn it on in their own settings, and I really didn't notice any downside to having it on at all.
And to those with LCD monitors: have fun with your ghosting and less sharp picture... lol. Actually, LCD monitors have gotten way better; I've seen the widescreen one my dad bought, and the ghosting on it was so low I could barely notice it, and it had a sharp picture. Those of you with CRTs: don't use 60 Hz.
-
Thx... BTW, those settings have an explanation at the bottom... IDK what half of it means; maybe it would help you though.
-
I already know what all that **** means except for these:
Conformant texture clamp helps prevent 3D artifacting. Maybe I'll try it when I overclock my card by a serious amount or something; it doesn't really seem to be a necessity.
Extension limit is for older OpenGL apps that have long extension strings and can't deal with unlimited ones, like the setting's description says. Leave it disabled if you're not using old OpenGL ****.
Negative LOD bias appears to be an extra "nicety" when using antialiasing; might as well leave it disabled, as it doesn't really serve much of a purpose.
Oh yes, absolutely, knowing what these mean, I will drastically change my settings... no. I'm just going to leave them the way they were, since after further review they're good the way I left them.
-
Oh yes, absolutely, knowing what these mean, I will drastically change my settings... no. I'm just going to leave them the way they were, since after further review they're good the way I left them.
Ah, so you do have grey matter between the ears. (Which was why I asked.) I was simply asking if it would help, (since you said you didn't know what they mean) that's all. No need to get all snippy. :ick:
-
All of the various "optimizations" should definitely be disabled. They all reduce image quality for negligible speed improvements. The only one you may want to keep is quality mode, since HQ mode has a sizeable performance hit.
Negative LOD bias should be set to clamp. That significantly reduces the AF shimmering effect. Conformant texture clamp and Extension limit are for old game compatibility and can be enabled in a game's profile if needed, but should not be turned on globally.
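For those wondering what clamping actually does: standard OpenGL-style mipmap selection computes the mip level as log2 of the texture-to-screen scale factor plus the LOD bias, so a negative bias forces a sharper (higher-resolution) mip than the filter would normally pick, and that under-filtering is exactly what shows up as shimmering in motion. A rough sketch of the arithmetic (illustrative functions, not actual driver code):

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0):
    """OpenGL-style mip selection: level = log2(scale factor) + bias,
    clamped so we never go below the base (sharpest) mip level."""
    return max(0.0, math.log2(texels_per_pixel) + lod_bias)

def mip_level_clamped_bias(texels_per_pixel, lod_bias=0.0):
    """What the driver's 'clamp' setting effectively does: negative
    biases are raised to 0 before the mip level is chosen."""
    return mip_level(texels_per_pixel, max(0.0, lod_bias))

# A surface where 4 texels map to each screen pixel normally uses mip 2.
print(mip_level(4.0))                     # 2.0 (properly filtered)
# A -1 bias (which some games apply) forces the sharper mip 1,
# which under-filters and shimmers in motion:
print(mip_level(4.0, -1.0))               # 1.0
# With the driver set to clamp, the negative bias is ignored:
print(mip_level_clamped_bias(4.0, -1.0))  # 2.0
```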
-
http://www.tweakguides.com/NVFORCE_6.html (http://www.tweakguides.com/NVFORCE_6.html)
That gives more detailed explanations of all the settings. However, it seems everyone here knows most of this stuff (and thanks to the wonder of google, probably already has seen this webpage). Nonetheless, that may be of some use for some.
Just for fun, I think I'm going to see what happens with HQ mode. ;)
(I'll probably be back in here in a few minutes saying how bad the framerate hit was.)
EDIT:
Certainly, it looked nice, but I did notice a performance hit going from 'quality' to 'high quality'. At one point the framerate dropped to 25, and at a couple of other points it was in the mid-30s. That's not horrible, but when it starts jumping from 120 down to 25, the experience comes off slightly... uneven, I guess, for lack of a better way of putting it. I'll note that this didn't slow me down, per se; when I started with no video card I was probably averaging around 25 FPS, and I was still able to play through the entire retail campaign. But I'm not sure whether it's worth it to stay at the 'high quality' setting.
I guess my question is, what exactly is the difference between 'quality' and 'high quality' in the actual game? While I had the vague sense that it looked better, I couldn't really tell exactly what I was looking for.
-
On the 7 series cards, it mainly affects the trilinear and AF optimizations. AF looks far better with it turned on, although the framerate drops you're describing seem rather extreme. It tends to have more of a performance impact in OpenGL games than D3D ones, but I've never seen it fall by more than 30%. You may want to leave it on quality mode if it's that bad.
-
On the 7 series cards, it mainly affects the trilinear and AF optimizations. AF looks far better with it turned on, although the framerate drops you're describing seem rather extreme. It tends to have more of a performance impact in OpenGL games than D3D ones, but I've never seen it fall by more than 30%. You may want to leave it on quality mode if it's that bad.
I agree with you that it is pretty extreme, especially since my lowest framerates in 'quality' mode are around 45 FPS even in the most intense moments. So, because the change wasn't worthwhile enough and the framerate hit was pretty big, I think I'm toning it down.
Really I just wanted to see what would happen. While it was playable, I think the experience is better with it one notch lower. Which is a shame -- I am _this_ close to getting away with literally every setting in nvidia and fso set to their maximums :drevil: Oh well. :lol:
-
Do you have two PCI-e ports? If you do, you could eventually go to SLI, then you'd be back here laughing your butt off at your ridiculous quality settings... Oh well. I have a BFG 6200, there's no way I could do anything close to what you've got.
...But I do have 2GB of RAM! :p
-
I can read that bit there about the RAM ;)
Unfortunately, I only have one PCI-E slot. It's not possible to go SLI with a PCI-E card and and AGP card, is it? If so, what sort of benefits am I looking at?
Wait -- I live on student loans. This is about as good as it gets for me :)
-
Unfortunately, I only have one PCI-E slot. It's not possible to go SLI with a PCI-E card and an AGP card, is it? If so, what sort of benefits am I looking at?
No; the benefits are basically two GPUs sharing your rendering work. (Your video RAM isn't effectively doubled, though; SLI mirrors the same textures and data into each card's memory.)
EDIT: Of course you could always return your Graphics card and get the BFG version (http://www.bfgtech.com/7600GT_256_PCIX.html).