Author Topic: The difference in ATI and Nvidia.


Re: The difference in ATI and Nvidia.
Fine by me. I've just been sick of getting sub-optimal performance in the most recent games like MW2 and R.U.S.E. unless I sacrifice the high-quality settings. I'm hoping I can put this card to the test with the newest games and get some pretty solid performance.
Fun while it lasted.

Then bitter.

 

Offline Klaustrophobia

  • 210
  • the REAL Nuke of HLP
Re: The difference in ATI and Nvidia.
i run a 4850, an AMD 5600+ (OC'd to 3.2GHz) and 2 GB of ram, and i get around 22-30 fps in crysis (extremely playable for crysis) with most settings on high (i run XP and am locked out of very high). some of the less noticeable settings i turn down to medium for more performance. it looks very nice, and the only thing that bugs me is the occasional loading hitch. warhead gets a little glitchy with textures sometimes, but i'm 99% sure that's warhead's fault.

newer hardware would get you better results, but honestly i don't see anything other than the very top of the line cracking crysis all the way open yet. you don't really need it to, though. for reasons unknown, it plays butter smooth even at 22-24 fps, while in some other games even 30 fps makes me feel like i'm watching stop-motion.
I like to stare at the sun.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The difference in ATI and Nvidia.
I've had the same experience in Crysis - it runs weirdly smoothly even at a counted 20-25 FPS. I think it's the excellent motion blur.

 

Offline Klaustrophobia

  • 210
  • the REAL Nuke of HLP
Re: The difference in ATI and Nvidia.
crysis does everything differently. it really is revolutionary. i wish it DIDN'T do antialiasing all weird, though; whatever they did doesn't really work all that well, and i can see a white halo around my hand/weapon.
I like to stare at the sun.

 

Offline CP5670

  • Dr. Evil
  • Global Moderator
  • 212
Re: The difference in ATI and Nvidia.
Quote
but you don't really need it to.  for reasons unknown, it plays butter smooth even at 22-24 fps.

I hear a lot of people say this, but it looks just as bad as any other game to me at such framerates. I turn down the resolution until I can at least get 40fps consistently.

There is still no single card that can handle Crysis truly well. They all bog down somewhere in the game, especially during the snow levels later on. Crysis is a real exception among games though, and most of the titles we get today are just console ports that don't look nearly as good. For the last two years, speed increases in video cards have been pretty modest, but it's just as well since there are hardly any games to justify a much faster card, with game graphics having more or less peaked in 2007.

 
Re: The difference in ATI and Nvidia.
Quote
There is still no single card that can handle Crysis truly well. They all bog down somewhere in the game, especially during the snow levels later on. Crysis is a real exception among games though, and most of the titles we get today are just console ports that don't look nearly as good. For the last two years, speed increases in video cards have been pretty modest, but it's just as well since there are hardly any games to justify a much faster card, with game graphics having more or less peaked in 2007.

I agree with you on that one.
I think I'll wait till Christmas or later this year before upgrading. The PC game market is pretty ****ty right now, with only a few games such as Crysis worth playing smoothly, IMO. I don't expect it to improve, either, with everyone rushing to consoles. I think the PC will die as a hardcore gaming platform within the next decade.

Back on Crysis though, the weird thing about that game is that, with half the settings on medium and less-intensive settings such as textures and geometry set to high, I can get a few minutes of completely solid 60 FPS gameplay. Then, completely at random, it crawls down to ~25 FPS for about 10 minutes before suddenly shooting back up to 60 FPS again. I think this has something to do with either my card's 256MB of memory or my 2GB of RAM, but I still find it rather bizarre. I have the issue to a lesser extent in Mass Effect 2, but other than that I've never encountered it in another game.
Fun while it lasted.

Then bitter.

 

Offline Dark RevenantX

  • 29
  • anonymity —> animosity
Re: The difference in ATI and Nvidia.
The HD 5850 is an excellent card.  Look into it.

The HD 5870 is about even with nVidia's best card, but at a lower price!


There's a reason that ATi isn't really going to do their usual price drop this season: they don't have to.

 
Re: The difference in ATI and Nvidia.
Still, I think it's worth waiting a few months, as GPUs always drop fifty bucks in price by the end of the year.
Fun while it lasted.

Then bitter.

 

Offline CP5670

  • Dr. Evil
  • Global Moderator
  • 212
Re: The difference in ATI and Nvidia.
The 5850 and 5870 have actually increased in price over time. The people who got them when they were released lucked out. The 5850 launched at $260, but went up to $300 after a month and has remained at that level ever since.

Quote
Back on Crysis though, the weird thing about that game is that, with half the settings on medium and less-intensive settings such as textures and geometry set to high, I can get a few minutes of completely solid 60 FPS gameplay. Then, completely at random, it crawls down to ~25 FPS for about 10 minutes before suddenly shooting back up to 60 FPS again. I think this has something to do with either my card's 256MB of memory or my 2GB of RAM, but I still find it rather bizarre. I have the issue to a lesser extent in Mass Effect 2, but other than that I've never encountered it in another game.

I don't think low memory would cause a consistently low framerate like that. Make sure your drivers are up to date. This actually reminds me of a bug in the Nvidia drivers at one point, where the power saving features weren't working right and the card would randomly drop to 2D speeds in the middle of a game.
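(A minimal way to actually test that guess, assuming an NVIDIA card and the Python pynvml bindings, neither of which is given in the thread: log the core clock once a second while the game runs. If it reads idle/2D speeds during the slow stretches, the driver's power management is the culprit. Period tools like GPU-Z or RivaTuner logged the same data; this is just an illustrative sketch.)

Code:
# Hypothetical diagnostic sketch: log the GPU core (SM) clock once per second.
# Assumes an NVIDIA card and the pynvml bindings (nvidia-ml-py); neither is
# stated in the thread.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    while True:
        sm_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        print(time.strftime("%H:%M:%S"), "SM clock:", sm_clock, "MHz")
        time.sleep(1)  # a drop to a few hundred MHz mid-game means 2D/idle clocks
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()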

 
Re: The difference in ATI and Nvidia.
Quote
I don't think low memory would cause a consistently low framerate like that. Make sure your drivers are up to date. This actually reminds me of a bug in the Nvidia drivers at one point, where the power saving features weren't working right and the card would randomly drop to 2D speeds in the middle of a game.

They're definitely up to date. I've had this problem since I got my card.
Fun while it lasted.

Then bitter.

 

Offline Ghostavo

  • 210
  • Let it be glue!
Re: The difference in ATI and Nvidia.
Could it be that the graphics card's memory somehow fills up, and the card then goes through a few minutes of thrashing every now and then?
"Closing the Box" - a campaign in the making :nervous:

Shrike is a dirty dirty admin, he's the destroyer of souls... oh god, let it be glue...

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
Re: The difference in ATI and Nvidia.
No. If the card were swapping textures between main and video RAM, the framerate would probably sit at a constant low (unless these games do some tricky streaming).
Based on the description, my guess would be some heat throttling going on, but that's a totally uninformed guess.
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns
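(The thrashing and throttling guesses above could be separated the same way, again assuming an NVIDIA card and the pynvml bindings, which nothing in the thread confirms: log temperature, clock, and VRAM use through one of those ten-minute slow spells and see which one moves.)

Code:
# Hypothetical sketch: log temperature, SM clock and VRAM use once per second,
# so a slow spell can be matched against heat (throttling) or a full frame
# buffer (thrashing). Assumes pynvml and an NVIDIA card.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(time.strftime("%H:%M:%S"),
              temp, "C,", clock, "MHz,",
              mem.used // (1024 * 1024), "/", mem.total // (1024 * 1024), "MB VRAM")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()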

 

Offline pecenipicek

  • Roast Chicken
  • 211
  • Powered by copious amounts of coffee and nicotine
Re: The difference in ATI and Nvidia.
instead of deconstructing every ati fanboy post here, i'm just going to say that nvidia is best and that i'll be staying with it for the foreseeable future.


then again, i may be biased, because i've been on nvidia for the last 7-8 years, i think. it was the best value for me at the times i "bought" cards. i rarely buy new, since a used card is sometimes almost half the price of a new one here in croatia.

An XFX GTX260 XXX (216sp, 896MB ram) is more than enough power for 99% of today's games, unoptimised crap like Crysis and similar bull**** notwithstanding. And there's the sad fact that the ancient 9800GX2 still trashes most of today's newer GPUs. The difference between modern dual cards and the GX2 is that modern ones put two chips on a single board, whereas the GX2 is really two cards glued to a heatsink and linked internally with an SLI cable.


Ahem, so yes. ATi may have price going for it, and the performance is nice, and with nvidia's Fermi fail... well, i dunno. i won't be going with ati, simply because i've had very bad experiences with it. and anyone saying that the Catalyst Control Center is better than nvidia's control panel needs their brain checked.



so much from an nvidia fanboy.

(also, i actually like what nvidia is doing with the whole CUDA deal, and some of the raytracing engines using it are looking promising, so that's one more reason for me to stick with nvidia. probably the biggest reason, actually, as i don't play many games these days. the most demanding thing i've played was metro 2033 when it came out, and mass effect 2 isn't really that demanding.)
Skype: vrganjko
Ho, ho, ho, to the bottle I go
to heal my heart and drown my woe!
Rain may fall and wind may blow,
and many miles be still to go,
but under a tall tree I will lie!

The Apocalypse Project needs YOU! - recruiting info thread.

 

Offline Nuke

  • Ka-Boom!
  • 212
  • Mutants Worship Me
Re: The difference in ATI and Nvidia.
these days i mostly just play starcraft, so throwing money away on video cards is pointless. i'd rather buy booze, marijuana, hookers, etc.
I can no longer sit back and allow communist infiltration, communist indoctrination, communist subversion, and the international communist conspiracy to sap and impurify all of our precious bodily fluids.

Nuke's Scripting SVN

 

Offline BloodEagle

  • 210
  • Bleeding Paradox!
Re: The difference in ATI and Nvidia.
Quote
these days i mostly just play starcraft, so throwing money away on video cards is pointless. i'd rather buy booze, marijuana, hookers, etc.

You're actually just renting the first and third items.  :p

  

Offline NGTM-1R

  • I reject your reality and substitute my own
  • 213
  • Syndral Active. 0410.
Re: The difference in ATI and Nvidia.
All three, really, if you argue that consumables are rented.
"Load sabot. Target Zaku, direct front!"

A Feddie Story

 

Offline Admiral LSD

  • 27
  • Shorter of breath and one day closer to death
Re: The difference in ATI and Nvidia.
Quote
instead of deconstructing every ati fanboy post here, i'm just going to say that nvidia is best and that i'll be staying with it for the foreseeable future.

then again, i may be biased, because i've been on nvidia for the last 7-8 years, i think. it was the best value for me at the times i "bought" cards. i rarely buy new, since a used card is sometimes almost half the price of a new one here in croatia.

An XFX GTX260 XXX (216sp, 896MB ram) is more than enough power for 99% of today's games, unoptimised crap like Crysis and similar bull**** notwithstanding. And there's the sad fact that the ancient 9800GX2 still trashes most of today's newer GPUs. The difference between modern dual cards and the GX2 is that modern ones put two chips on a single board, whereas the GX2 is really two cards glued to a heatsink and linked internally with an SLI cable.

Ahem, so yes. ATi may have price going for it, and the performance is nice, and with nvidia's Fermi fail... well, i dunno. i won't be going with ati, simply because i've had very bad experiences with it. and anyone saying that the Catalyst Control Center is better than nvidia's control panel needs their brain checked.

so much from an nvidia fanboy.

(also, i actually like what nvidia is doing with the whole CUDA deal, and some of the raytracing engines using it are looking promising, so that's one more reason for me to stick with nvidia. probably the biggest reason, actually, as i don't play many games these days. the most demanding thing i've played was metro 2033 when it came out, and mass effect 2 isn't really that demanding.)

:lol:

This post is all kinds of comedy gold.
00:19  * Snail cockslaps BotenAnna
00:19 -!- Snail was kicked from #hard-light by BotenAnna [Don't touch me there! RAPE!!!]

15:36 <@Stealth_T1g4h> MASSIVE PENIS IN YOUR ASS Linux

I normally enjoy your pornographic website... - Stealth
Get Internet Explorer!

 

Offline Klaustrophobia

  • 210
  • the REAL Nuke of HLP
Re: The difference in ATI and Nvidia.
or really sad, depending on your viewpoint.
I like to stare at the sun.

 

Offline Admiral LSD

  • 27
  • Shorter of breath and one day closer to death
Re: The difference in ATI and Nvidia.
That's part of what makes it so funny though.
00:19  * Snail cockslaps BotenAnna
00:19 -!- Snail was kicked from #hard-light by BotenAnna [Don't touch me there! RAPE!!!]

15:36 <@Stealth_T1g4h> MASSIVE PENIS IN YOUR ASS Linux

I normally enjoy your pornographic website... - Stealth
Get Internet Explorer!

 

Offline asyikarea51

  • 210
  • -__-||
Re: The difference in ATI and Nvidia.
The only reason I stay with the green camp is that it doesn't screw up in most cases (apart from the failed batch, which my previous card was actually part of), and I haven't found much to fault in their drivers (be it Windows or Ubuntu) so far... although the last driver I used before the card went kaput was 16x.xx, when the newest was 18x or 19x. Don't break what works, so to speak.

If I had to name some good points for ATI... picture quality, maybe, and it's a tiny bit easier on the power bill, but that's subjective. And of course the current prices are still cheaper (kinda). But I guess I'll find out sooner or later...

As for Crysis (not Warhead; Warhead wasn't so bad)... I think the best stress test is actually the last part on the carrier, when the huge alien thing jumps onto the ship and you have to kill it (no, not the mothership nuke part; the not-so-huge thing before that). On my previous card, whoop-dee-doo, 4-10 fps at 1024x768 (or was it 1280x1024, I don't remember) with a specific mix of low and medium settings...

Since then, seeing most of the review sites just run the standard benchmark or a short gameplay segment from some other level, I've lost a bit of faith in their numbers...