Hard Light Productions Forums

Off-Topic Discussion => General Discussion => Topic started by: haloboy100 on June 25, 2010, 10:06:23 pm

Title: The difference in ATI and Nvidia.
Post by: haloboy100 on June 25, 2010, 10:06:23 pm
What's the competition like with these guys right now? I've heard ATI just released a new card a long while back that everybody was talking about, but I forgot what most of the technical stuff meant when I looked at the specs.

Seeing as my current setup is:

Nvidia 8800 GTS (256 MB)
2.66 Intel Duo Core CPU (E6750)
2GB RAM
Windows 7 64 bit.

and I still can't run Mass Effect 2 with more than a few minutes of more than 50 FPS performance, I'm considering upgrading the GPU the upcoming Christmas. I've also been looking forward to the day where I can run all of the most gorgeous games like Crysis at ultra quality with negligible performance loss (like my rich-ass friends and their alienware computers :doubt:).
I would get it sooner, with my own money, but I'm saving my ~200$ cash for Rock Band 3 and Halo Reach Limited edition.
Yeah, I'm too lazy for a job, and obviously this isn't the highest priority for me, which is why I'm not asking for particular device on an actual card upgrade.

My first card was this thing called the ATI All-In-Wonder radeon, which was some kind of hybrid video capture card and some supposedly enthusiast-level GPU. It worked so far back then, but at the time It couldn't run much more than Halo and FS2. It couldn't even do UT2004 above 30 fps.
Next, after switching to my old nvidia 6700 and then to my current 8800 GTS, I never thought twice about going back to ATI; it seems their cards always have compatibility and durability issues (judging from my friend's experiences). Though it seems nowadays that Nvidia is moving away from the gaming market and more towards the media industry in general, making cards for video effects processing and movie production and all that stuff. I chose nvidia because their marketing is easier to understand (the names are multiples of 100, unlike whatever ATI's setup is. Frankly, I just get alienated by card names like HD 5750) but now I'm thinking about going back to ATI, mostly because all my friends use them now, they're shiny black and red, and they're a Canadian-based company.

See, those are pretty stupid reasons to switch, which is why I'm here. I'm wondering what the difference in the two companies' cards are. Obviously I doubt those differences will be more than driver software and compatibility, but I'm hoping to find more compelling reasons. So, can you guys help me draw a comparison between the two? It's been months since I looked up the GPU competition.
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 25, 2010, 10:59:38 pm
Long story short: nVidia dropped the ball after the 8-series GeForces allowing ATi to move in, first with the 4000-series and now with the 5000-series. I mention it in a little more detail in my post in Battutas thread. (http://www.hard-light.net/forums/index.php?topic=70066.msg1384385#msg1384385)

It's funny you find nVidia's naming less confusing than ATi's, because it's actually the opposite. With ATi you have clear indicators to where a given GPU sits in relation to all the others within the same series. You can tell at a glance that the 5770 is better than the 5750, but inferior to the 5850. nVidia are infinitely more confusing because not only do they badge the same GPU under multiple different names (9800GT for example, is virtually identical to the 8800GT and the GTS250 is identical to the 9800GTX+), but they also release vastly different products under the same name (192 and 216 shader GPUs both marketed under the name GTX 260). They're slowly getting the message, but they're still a long way from ATi's XYZ0 Series/Family/Model structure.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 25, 2010, 11:08:47 pm
Yeah, nvidia's naming scheme was confusing at first, and I agree it's rather unnecessary. I thought ATI had something similar, but it seems the numbers are all that's needed to understand relative performance.

What about bug/compatibility issues with games? I always see tech problems crop up on video game forums regarding some kind if texture or shading related problem with ATI cards.

But what I'm more interested in is the software support; Nvidia has it's control panel, which I find alright but I heard ATI has cool features like Eyefinity that would certainly help with my dual-monitor setup (though, that's only an example). Are ATI's software utilities very useful?
Title: Re: The difference in ATI and Nvidia.
Post by: Klaustrophobia on June 25, 2010, 11:10:40 pm
for a while NVIDIA was a good ways ahead in terms of raw horsepower.  they got the most 3DMarks for a long time and consequently became the "best" cards according to reviewers and benchmarkers, who a great deal of the gaming mainstream blindly follow.  they then proceeded to sit on this name brand while ATI actually started cranking out fundamentally improved designs rather than just run the clocks higher.  at a cheaper price (usually) none the less.  

edit:  i haven't used NVIDIA drivers since the days of FS1 and my old MX440, but from what i remember they weren't exactly friendly.  i've never had a problem with ATI's catalyst, and from what i can see they are continually improving.  i don't use eyefinity or any of that extra stuff though. 

(well i actually did have one problem with catalyst, which was when they stopped AGP support for my 9800pro in whatever the latest update was, but that was sorted out in a day via support ticket)
Title: Re: The difference in ATI and Nvidia.
Post by: BloodEagle on June 25, 2010, 11:51:33 pm
I'd like to note here that 3DMark is (or was, at least) complete bull****, for actual benchmarking.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 25, 2010, 11:54:30 pm
I don't even know what 3DMark is. :P
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 26, 2010, 12:06:53 am
I've been running ATi cards for something like 6 years now (They've always had the better value option when I've been in the market for a video card) and the only issue I've ever really had with the Catalysts was the AGP mess a couple of years ago and even that was forgiveable because AGP was long obsolete at that point. A large chunk of the negative attitudes toward ATi's drivers stems from before they instituted the Catalyst program, when their drivers were quite terrible. Even though that's changed, people's attitudes haven't. I'd be almost willing to bet that, if you went back through both companies drivers, you'd find they'd be fairly closely matched when it came to number of issues. At one point MS were saying that 25% of Vista crashes were the result of the nVidia drivers, a statistic AMD/ATi were more than happy to take and run with.

As for the control panel, I have to say I much prefer ATi's (despite it being a .NET app) over nVidia's. It may be light on options, but at least they're arranged in a somewhat logical and easy to follow manner. Trying to make head or tail of the nTune-styled nVidia one was a nightmare.

Yeah, nvidia's naming scheme was confusing at first, and I agree it's rather unnecessary. I thought ATI had something similar, but it seems the numbers are all that's needed to understand relative performance.

They used to, prior to the 3000-series they had a whole mess of suffixes: Pro, XT, SE, GT, GTO, XTX and probably others I can't remember. nVidia and ATi also played a game where one would give a higher end card a certain suffix only to have the other attach the same suffix to a lower end card to confuse people even further. The series/family/model numbering ATi started using in the 3000-series is far, far simpler. nVidia are slowly going in a similar direction with their numbering, but it's still not as simple as ATi's system.
Title: Re: The difference in ATI and Nvidia.
Post by: Klaustrophobia on June 26, 2010, 12:49:00 am
i don't remember ATI ever having a GT or GTO.  they did get a bit confusing ala NVIDIA briefly, but in the olden days it used to be "pro" for the more or less stanard high end (like the --70 now) and "XT" for the overclocked (--90).  i think the XTX was the doing of a manufacturer, not ATI.

anywho, drivers are hardly worth debating one way or the other, because 90% of the bashings handed out to either side are from fanboys of the other.  neither are bad enough to be deal breakers, that's for damn sure. 
Title: Re: The difference in ATI and Nvidia.
Post by: General Battuta on June 26, 2010, 01:04:15 am
You an get an HD 4870 for a very reasonable price (between $100 and $200) and be able to play any game released to this date at a very reasonable framerate.

There is no reason to buy NVIDIA right now.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 01:09:11 am
So, I'm looking at ATI cards right now at my local retailer. Thinking I might save Rock Band 3 for the christmas gift and get the video card first.
Thoughts on this HD 5670 (http://www.memoryexpress.com/Products/PID-MX27158%28ME%29.aspx)?
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 26, 2010, 01:16:26 am
There were GT and GTO versions of the X800, the latter being one of the chips that ATi like to throw out every now and again (and nVidia traditionally lack the balls to) that performs well above its price bracket (others being the 9500 Pro, the X1950 Pro and the 4670). The GT was a lower spec card than nVidia's GTs, which were the range toppers at the time (they didn't add GTX until the 7-series iirc), and was meant to take the shine off nVidia's offerings. Around the same time, nVidia started using the "XT" suffix, which was the top of ATi's range at the time, on lower spec cards to do the same thing.

anywho, drivers are hardly worth debating one way or the other, because 90% of the bashings handed out to either side are from fanboys of the other.  neither are bad enough to be deal breakers, that's for damn sure.

That's what I was getting at. There was a time when ATi's drivers were terrible, but they've long since cleaned up their act with the Catalyst program. Most of the people still doing the bashing either never owned an ATi product at all, never owned one since the shift to Catalyst 6-7 years ago or if they've owned one in that time, either go out of their way to look for problems or get their panties in a bunch the moment they even think they see one (it may not be serious and it may not even be real, but when they're looking for an excuse to bash the drivers, they'll take what they can get).

You an get an HD 4870 5770 or 5850 for a very reasonable price (between $100 and $200) and be able to play any game released to this date at a very reasonable framerate.

There is no reason to buy NVIDIA right now.

There, fixed :P

So, I'm looking at ATI cards right now at my local retailer. Thinking I might save Rock Band 3 for the christmas gift and get the video card first.
Thoughts on this HD 5670 (http://www.memoryexpress.com/Products/PID-MX27158%28ME%29.aspx)?

I prefer Sapphire for ATi cards, but it really doesn't matter so long as the price is right. Since everyone mostly just copies the reference designs laid out by ATi or nVidia, there's next to no differentiation in performance between (stock clocked, overclocked cards are another story, you need to factor that in when shopping around) cards all sharing a common GPU. The 5670 isn't a bad choice if you game at less than 1680x1050 but if you can afford it, consider stepping up to the 5750 or 5770 which offer better performance at higher resolutions and stand a better chance of staying useful if/when DX11 becomes more common place.
Title: Re: The difference in ATI and Nvidia.
Post by: Bobboau on June 26, 2010, 01:33:53 am
I got a recent 'good' ATI card, like all other ATI cards I've owned it's had, what I am assuming is, driver issues, I can't get video to play in full screen reliably and games randomly grey screen, were the GPU puts it's self into a reset loop and I have to turn the computer off to get it working again.
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on June 26, 2010, 01:38:25 am
As others have said, ATI has had the best cards at almost every price point ever since the 5000 series was released. Mass Effect 2 is not that demanding though, and almost any decent card from the last two years can handle that game fine.

I think both companies' driver control panels are useless. The third party utilities out there put them to shame. However, I prefer the nVidia utilities (nHancer and Rivatuner) over the ATI ones (ATI Tray Tools), and they tend to be more frequently updated too. As for driver bugs, both drivers have issues in a handful of older games, but generally work well otherwise.

Quote
I'd like to note here that 3DMark is (or was, at least) complete bull****, for actual benchmarking.

It's even more so today than it once was. It doesn't even look that good anymore.

Quote
I got a recent 'good' ATI card, like all other ATI cards I've owned it's had, what I am assuming is, driver issues, I can't get video to play in full screen reliably and games randomly grey screen, were the GPU puts it's self into a reset loop and I have to turn the computer off to get it working again.

That sounds more like a hardware problem with the card itself. You might have gotten a bad one.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 01:42:23 am
I prefer Sapphire for ATi cards, but it really doesn't matter so long as the price is right. Since everyone mostly just copies the reference designs laid out by ATi or nVidia, there's next to no differentiation in performance between (stock clocked, overclocked cards are another story, you need to factor that in when shopping around) cards all sharing a common GPU. The 5670 isn't a bad choice if you game at less than 1680x1050 but if you can afford it, consider stepping up to the 5750 or 5770 which offer better performance at higher resolutions and stand a better chance of staying useful if/when DX11 becomes more common place.
The highest my television can support is 720p, so resolution won't go higher than that (anything higher than 720p with 2x AA is more than necessary, IMHO). I just need a card that can run me Crysis on ultra with negligible performance loss - or is that too much to ask from 100-200$ cards these days, still?
You mentioned Sapphire, that brand is avaliable for this card at the same price, but it's sold out all over the city.

This 5750 (http://www.memoryexpress.com/Products/PID-MX26121%28ME%29.aspx) is available for the extra 50 bucks. Think that's good? Has a cool looking fan on it. ;7

Mass Effect 2 is not that demanding though, and almost any decent card from the last two years can handle that game fine.
My current 8800 GTS can, but only when I first start it up. After a few minutes of gameplay, the FPS drops to the high twenties; low twenties for intensive parts like talking with the illusive man. I notice after minimizing the game for 10 minutes (for lunch or something) performance goes back to ~55 FPS before dropping down again over time. I thought it was a heating issue, but after testing it with my fan and various other ways I found out it's not. I think this card is just getting old from constant abuse.
This is of course with all unnecessary processes turned off, including windows 7 gadgets, which i find to be a big slowdown during games.
Title: Re: The difference in ATI and Nvidia.
Post by: General Battuta on June 26, 2010, 01:48:32 am
Unless I'm badly mistaken, won't the 57 series somewhat underperform the 48 series?  :nervous:
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 26, 2010, 01:57:40 am
Unless I'm badly mistaken, won't the 57 series somewhat underperform the 48 series?  :nervous:

As I said in the other thread, the 5770 performs somewhere in between the 4870 and 4890. It's not worth it if you had either of those cards (or even a 4850) before, but a decent upgrade from a 46-- or 47-- card. The other advantage of the 5000-series is the 40nm process results in much lower power consumption and heat output than the 55nm 4000-series cards. For the kind of price range you were suggesting, it's not really worth going for the last-gen stuff.

I got a recent 'good' ATI card, like all other ATI cards I've owned it's had, what I am assuming is, driver issues, I can't get video to play in full screen reliably and games randomly grey screen, were the GPU puts it's self into a reset loop and I have to turn the computer off to get it working again.

I can't say I've seen anything like that with any of the ATi cards I've owned over the years, at least under Windows. I've had various issues in Linux (some of which are Linux' fault, others ATi's. It's never only been just ATi at fault), but even that's getting better now and you don't by a high-end card with the intention of using it under Linux anyway.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 02:08:15 am
As I said in the other thread, the 5770 performs somewhere in between the 4870 and 4890. It's not worth it if you had either of those cards (or even a 4850) before, but a decent upgrade from a 46-- or 47-- card. The other advantage of the 5000-series is the 40nm process results in much lower power consumption and heat output than the 55nm 4000-series cards. For the kind of price range you were suggesting, it's not really worth going for the last-gen stuff.
The older series actually out-performs the new? Though, I think that happens a lot in the video card market.
But I should stick with buying 56/57 cards, still?
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 26, 2010, 02:35:02 am
The older series actually out-performs the new? Though, I think that happens a lot in the video card market.
But I should stick with buying 56/57 cards, still?

This highlights the one problem in ATi's model numbering scheme: It works best when you're looking at cards within the same series or comparing a given card in one series with its predecessor in the previous series. Trying to map out where each card is in relation to others in other series muddies the waters a whole lot (not that it's ever been easy to begin with...). For example, the 5870 will be faster than the 5850, the 5850 will be faster than the 5770 and the 5770 will be faster than the 5670, and each of those will be faster than their respective predecessors in the 4000-series (4870, 4850, 4770 and 4670), but because of the overall speed bump of the 5000-series over the 4000-series, you find the 5770, a mid range part in this generation, performing around the same as last-generations high-end.

If you're looking to replace a 4800-series Radeon, then the 5770 is more a sidegrade than an upgrade, you'd need to look at the 5800-series before you would start seeing decent performance improvements. If, however, you're on something slower than that like an 8 or 9-series GeForce, 4700-series Radeon or lower, then the 5770 would net you a decent performance improvement. A 4800-series Radeon would too, but the savings in power consumption and heat output the 40nm process brings to the table make the 5770 are better choice.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 02:43:51 am
Yeah, I suspected something like that.
What I'm worried about most is exactly what performance I'll get; is expecting a 100-150 dollar card these days to run games such as Crysis and Bioshock 2 smoothly on ultra settings (with my 720p resolution) too much?
Title: Re: The difference in ATI and Nvidia.
Post by: General Battuta on June 26, 2010, 02:46:39 am
Yeah, I suspected something like that.
What I'm worried about most is exactly what performance I'll get; is expecting a 100-150 dollar card these days to run games such as Crysis and Bioshock 2 smoothly on ultra settings (with my 720p resolution) too much?

A 4870 running on a rig with a slightly better processor than yours can run Crysis on maximum settings at a playable though not excellent framerate.

Presumably a 5770 would do so as well or better.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 02:48:03 am
Acceptable by me. I've just been sick of playing the most recent games like MW2 and R.U.S.E. with sub-optimal performance without having to sacrifice the high quality settings. I'm hoping I can put this card to the test with the newest games and get some pretty solid performance.
Title: Re: The difference in ATI and Nvidia.
Post by: Klaustrophobia on June 26, 2010, 03:42:33 am
i run a 4850, AMD 5600+ (OC at 3.2ghz), 2 gb of ram and i run crysis from around 22-30 fps (extremely playable for crysis) with most settings on high (i run XP and am locked out of very high).  some of the less noticable settings i reduce to medium for more performance.  it looks very nice and the only thing that bugs me is the occasional loading hitch.  warhead gets a little glitchy with textures sometimes but i'm 99% sure that is warhead's fault. 

newer hardware would get you better results, but honestly i don't see anything other than very top of the line cracking crysis all the way open yet.  but you don't really need it to.  for reasons unknown, it plays butter smooth even at 22-24 fps.  30 fps is enough to make me feel like i'm watching stop-motion in some other games.
Title: Re: The difference in ATI and Nvidia.
Post by: General Battuta on June 26, 2010, 03:46:42 am
I've had the same experience in Crysis - it runs weirdly smoothly even at a counted 20-25 FPS. I think it's the excellent motion blur.
Title: Re: The difference in ATI and Nvidia.
Post by: Klaustrophobia on June 26, 2010, 03:51:22 am
crysis does everything different.  it really is revolutionary.  i wish it DIDN'T do antialiasing all weird though, it doesn't really work all that well whatever they did.  and i can see a white halo around my hand/weapon.
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on June 26, 2010, 11:31:39 am
Quote
but you don't really need it to.  for reasons unknown, it plays butter smooth even at 22-24 fps.

I hear a lot of people say this, but it looks just as bad as any other game to me at such framerates. I turn down the resolution until I can at least get 40fps consistently.

There is still no single card that can handle Crysis truly well. They all bog down somewhere in the game, especially during the snow levels later on. Crysis is a real exception among games though, and most of the titles we get today are just console ports that don't look nearly as good. For the last two years, speed increases in video cards have been pretty modest, but it's just as well since there are hardly any games to justify a much faster card, with game graphics having more or less peaked in 2007.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 12:50:26 pm
There is still no single card that can handle Crysis truly well. They all bog down somewhere in the game, especially during the snow levels later on. Crysis is a real exception among games though, and most of the titles we get today are just console ports that don't look nearly as good. For the last two years, speed increases in video cards have been pretty modest, but it's just as well since there are hardly any games to justify a much faster card, with game graphics having more or less peaked in 2007.
I agree with you on that one.
I think I'll wait till christmas or later this year before upgrading. the PC game market is pretty ****ty right now, with only a few games such as Crysis worth playing smoothly IMO. I don't expect it will improve, either, with everyone rushing to play consoles. I think PC will die within the next decade when it comes to hardcore gaming.

Back on Crysis though, the weird thing about that game is that, with half the settings on medium and less-intensive settings such as textures and geometry set on high, I can get a few minutes of completely solid 60 FPS gameplay. Then completely randomly it crawls down to ~25 FPS for about 10 minutes, before suddenly shooting back up to 60 FPS again. I think this has something to do with either my cards 256MB memory or my 2GB of RAM, but I still think it's rather bizarre. I have that issue to a lesser extent in Mass Effect 2, but other than that I've never encountered this on another game.
Title: Re: The difference in ATI and Nvidia.
Post by: Dark RevenantX on June 26, 2010, 12:56:25 pm
HD 5850 is a very excellent card.  Look into it.

HD 5870 is about even with nVidia's best card, but at a lower price!


There's a reason that ATi isn't really going to do their usual price drop this season: they don't have to.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 01:01:23 pm
Still, it's worth it to wait a few months I think, as GPU's always fall in price fifty bucks by the end of the year.
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on June 26, 2010, 01:28:41 pm
The 5850 and 5870 have actually increased in price over time. The people who got them when they were released lucked out. The 5850 launched at $260, but went up to $300 after a month and has remained at that level ever since.

Quote
Back on Crysis though, the weird thing about that game is that, with half the settings on medium and less-intensive settings such as textures and geometry set on high, I can get a few minutes of completely solid 60 FPS gameplay. Then completely randomly it crawls down to ~25 FPS for about 10 minutes, before suddenly shooting back up to 60 FPS again. I think this has something to do with either my cards 256MB memory or my 2GB of RAM, but I still think it's rather bizarre. I have that issue to a lesser extent in Mass Effect 2, but other than that I've never encountered this on another game.

I don't think low memory would cause a consistently low framerate like that. Make sure your drivers are up to date. This actually reminds me of a bug in the Nvidia drivers at one point, where the power saving features weren't working right and the card would randomly drop to 2D speeds in the middle of a game.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 26, 2010, 03:00:21 pm
I don't think low memory would cause a consistently low framerate like that. Make sure your drivers are up to date. This actually reminds me of a bug in the Nvidia drivers at one point, where the power saving features weren't working right and the card would randomly drop to 2D speeds in the middle of a game.
They're definitely up to date. I've had this problem since I got my card.
Title: Re: The difference in ATI and Nvidia.
Post by: Ghostavo on June 26, 2010, 04:52:36 pm
Could it be that somehow the memory of the graphic card gets full and then proceeds through a few minutes of thrashing every now and then?
Title: Re: The difference in ATI and Nvidia.
Post by: The E on June 26, 2010, 04:55:45 pm
No. If the card was swapping textures from main to video RAM, the framerate probably would be at a constant low (Unless these games do some tricky streaming).
Based on the description, I'd rather guess at some heat throttling going on, but it would be just a totally uninformed guess.
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on June 26, 2010, 06:00:40 pm
instead of deconstructing every ati fanboy post here, i'm just gonna go with saying that nvidia is best and i'll be staying with it for the foreseeable future.


then again, i may be biased because i have been on nvidia for the last 7-8 years i think. best value for me at the moments when i "bought" it. i rarely buy new since its sometimes almost 2 times cheaper to buy a used card than a new one here in croatia.

XFX GTX260 XXX (216sp, 896MB ram) is more than enough power for 99% of todays games. Unoptimised crap like Crysis and similar bull**** notwithstanding. And there's the sad fact that the ancient 9800GX2 trashes most of todays newer gpu's. The difference between modern dual cards and the gx2 is the fact that its actually two chips on a single board, instead of two cards glued to a heatsink and linked with a sli/crossfire cable internally.


Ahem, so yes. ATi may have the price as a good point, performance is nice, and with nvidia's Fermi Fail... well. i dunno. I wont be going with ati simply because i had very bad experiences with it. And anyone saying that the Catalyst Control Center is better than nvidia's control panel needs their brain checked.



so much from nvidia fanboy.








(also, i actually like what nvidia is doing with the whole CUDA deal, and some of the raytracing engines using it are looking promising, so thats one more reason for me to stick with nvidia. probably the biggest reason, as i dont play much games today. the most demanding i've played was metro 2033 when it came out and mass effect 2 isnt really that demanding)
Title: Re: The difference in ATI and Nvidia.
Post by: Nuke on June 26, 2010, 07:45:09 pm
these days i mostly just play starcraft. so throwing money away on video cards is pointless. id rather buy booze, marijuana, hookers, ect.
Title: Re: The difference in ATI and Nvidia.
Post by: BloodEagle on June 26, 2010, 09:29:49 pm
these days i mostly just play starcraft. so throwing money away on video cards is pointless. id rather buy booze, marijuana, hookers, ect.

You're actually just renting the first and third items.  :p
Title: Re: The difference in ATI and Nvidia.
Post by: NGTM-1R on June 26, 2010, 09:33:35 pm
All three, really. If you argue that consumables are rented.
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 26, 2010, 11:29:27 pm
instead of deconstructing every ati fanboy post here, i'm just gonna go with saying that nvidia is best and i'll be staying with it for the foreseeable future.


then again, i may be biased because i have been on nvidia for the last 7-8 years i think. best value for me at the moments when i "bought" it. i rarely buy new since its sometimes almost 2 times cheaper to buy a used card than a new one here in croatia.

XFX GTX260 XXX (216sp, 896MB ram) is more than enough power for 99% of todays games. Unoptimised crap like Crysis and similar bull**** notwithstanding. And there's the sad fact that the ancient 9800GX2 trashes most of todays newer gpu's. The difference between modern dual cards and the gx2 is the fact that its actually two chips on a single board, instead of two cards glued to a heatsink and linked with a sli/crossfire cable internally.


Ahem, so yes. ATi may have the price as a good point, performance is nice, and with nvidia's Fermi Fail... well. i dunno. I wont be going with ati simply because i had very bad experiences with it. And anyone saying that the Catalyst Control Center is better than nvidia's control panel needs their brain checked.



so much from nvidia fanboy.








(also, i actually like what nvidia is doing with the whole CUDA deal, and some of the raytracing engines using it are looking promising, so thats one more reason for me to stick with nvidia. probably the biggest reason, as i dont play much games today. the most demanding i've played was metro 2033 when it came out and mass effect 2 isnt really that demanding)

:lol:

This post is all kinds of comedy gold.
Title: Re: The difference in ATI and Nvidia.
Post by: Klaustrophobia on June 27, 2010, 12:03:19 am
or really sad, depending on your viewpoint.
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 27, 2010, 12:10:38 am
That's part of what makes it so funny though.
Title: Re: The difference in ATI and Nvidia.
Post by: asyikarea51 on June 27, 2010, 12:17:45 am
The only reason I stay with the green camp is because it doesn't screw up in most cases (apart from the failed batch which was actually my previous card) and I haven't found much to fault with their drivers (be it Windows or Ubuntu) so far... although the last driver I used before it went kaput was 16x.xx and at the time the newest was 18x or 19x. Don't break what works, so to speak.

If I'd throw some good points about ATI... Picture quality maybe, tiny bit easier on the power bill but that's subjective. And of course the current prices still being cheaper (kinda). But guess I'll eventually find out sooner or later...

As for Crysis (not Warhead, Warhead wasn't so bad)... I think the best stress test is actually the last part on the carrier when the huge alien thing jumps on and you have to kill it off the ship (No, not the mothership nuke part, rather the not-so-huge thing before that). On my previous card, whoop-dee-doo, 4-10fps on 1024x768 (or was it 1280x1024, don't remember) with a specific mix of low and medium settings...

Since then when I saw most of the review sites simply doing the benchmark or short gameplay at some other level, I lost faith somewhat...
Title: Re: The difference in ATI and Nvidia.
Post by: Grizzly on June 27, 2010, 01:15:09 am
Quote
and I still can't run Mass Effect 2 with more than a few minutes of more than 50 FPS performance,

Upgrading your card will not change that at all. Mass Effect 2 'tunes' itself to run between 22 fps - 62 fps to prevent stuttering due to FPS spikes (it smoothens performance). It's a feature, not a hardware problem.

Looking at your system, I am quite sure you will be able to run the game fine on ultra high. There is no difference between 30fps and 50 fps anyway (not visable), so why bother?
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 27, 2010, 01:19:52 am
Surely though having a faster system would influence how it tunes itself, no?
Title: Re: The difference in ATI and Nvidia.
Post by: Grizzly on June 27, 2010, 01:21:59 am
Surely though having a faster system would influence how it tunes itself, no?

No. There's just an 'cap' on the FPS rate.
Title: Re: The difference in ATI and Nvidia.
Post by: General Battuta on June 27, 2010, 01:22:40 am
Doesn't sound like the issue Hades is having. Even with this tuning his FPS should stay consistently high. If it drops off rapidly in this game as well as others something else is going wrong.
Title: Re: The difference in ATI and Nvidia.
Post by: Grizzly on June 27, 2010, 01:24:48 am
Doesn't sound like the issue Hades is having. Even with this tuning his FPS should stay consistently high. If it drops off rapidly in this game as well as others something else is going wrong.

It depends. I would not worry if my FPS rate drops below 50, as there is is no notable difference...

But yeah, if it starts to stutter... That might not be graphics card related, as his card should be capable of the job.
But if it goes to 30 fps or something, don't worry. ME2 just tries to use the minimum amount of system resources.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 27, 2010, 02:10:04 am
30 FPS is playable, I agree. Though I really have no idea what they expect the performance you're supposed to have when you're system meets the "Recommended" requirements.
I find the difference of 30-50 FPS very drastic. 30 is easily noticeable and annoying but playable, 40 is less noticeable and ignorable, 50 is relatively unnoticeable when you're not paying attention, and anything above that is irrelevant.
20 I find very annoying. anything lower than that I find completely breaks immersion.
Though of course, those are my subjective opinions. Call me a nit-picker, it's just what I am.

I have a large feeling my problem is something to do with system resources. If I minimize the game for more than a minute or two I can usually get my FPS right back up to solid 60s without any performance decrease. Then after a while it starts to crawl down to the 20s again, even if I stay in the same area, without any loading screens.
This could be a cooling problem, but seeing as how my card isn't overclocked and my case isn't modified in anyway (though sometimes I play with a large fan running next to it because I'm paranoid), I don't see why my card's heat would be a problem. Using some utilities (and from almost burning my finger myself) I find that my card runs at around 90 degrees Celsius, even when idling. Though I've come to expect that as normality from GPU's of my 8800s time.

This is just my naive speculation, though. I don't expect to be correct in anyway. Though like I said, this problem is even more dramatic in Crysis, where at moments my game goes to 15/20 FPS for a while and then suddenly spikes up to 60 again. However in Crysis the 60 spikes are rather spontaneous, and again with absolutely no relation to what is happening on screen.
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on June 27, 2010, 02:44:09 am
If you're getting 90C on idle, you definitely have a problem there. Some cards do get that hot on load (although it's rare for the 8800s) but an idle temperature that high suggests a bad heatsink contact or some other hardware issue.

Quote
Upgrading your card will not change that at all. Mass Effect 2 'tunes' itself to run between 22 fps - 62 fps to prevent stuttering due to FPS spikes (it smoothens performance). It's a feature, not a hardware problem.

This cap can be disabled. It's in almost every UE3 game, and the first thing I do in those games is to go into the ini files and turn it off. The fixed 62fps looks really bad with vsync on most refresh rates above 60hz.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 27, 2010, 02:09:51 pm
If you're getting 90C on idle, you definitely have a problem there. Some cards do get that hot on load (although it's rare for the 8800s) but an idle temperature that high suggests a bad heatsink contact or some other hardware issue.
Well, around 85C, but that's just details.
My bigass fan does nothing to cool my card, so I guess my cooling problem is unsolvable without spending money, which I'd rather save for a new card altogether.
This cap wouldn't fix my problem if it was off, would it? Keep in mind that if I run at a lower resolution (1024x768 on my other monitor) I get absolutely solid ~60 FPS throughout the whole game, suggesting to me that it's a hardware issue.
Title: Re: The difference in ATI and Nvidia.
Post by: Jeff Vader on June 27, 2010, 02:17:37 pm
My laptop is at 85-90 C during basic desktop usage. Except immediately after booting, when it might be as low as 65 C. Sometimes, for no apparent reason, it ventures into ~95 C, stays there for an hour or so and then shuts down because of overheating.

Fujitsu <3
Title: Re: The difference in ATI and Nvidia.
Post by: General Battuta on June 27, 2010, 03:29:20 pm
My computer runs at 35.  :nervous:
Title: Re: The difference in ATI and Nvidia.
Post by: S-99 on June 27, 2010, 10:30:02 pm
My system runs at 40C. Idk about everyone else, but there's a clear difference between Farenheight and celcius. Unless haloboy and jeff vader just don't give a **** about their computers being very close to the temperature that boils water.
Title: Re: The difference in ATI and Nvidia.
Post by: Jeff Vader on June 27, 2010, 10:53:12 pm
Idk about everyone else, but there's a clear difference between Farenheight and celcius. Unless haloboy and jeff vader just don't give a **** about their computers being very close to the temperature that boils water.
Oh damn, I actually meant Kelvins. :rolleyes:

Yes, I know the difference between C and F. Yes, I care about it. Yes, those are the readings I'm getting. At least according to Rainmeter, which gets its intel from Speedfan. And the damn thing certainly feels like it's that hot.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 28, 2010, 12:13:55 am
My GPU card feels the same as an oven to touch. My CPU runs at around the mid-high seventies, which I checked is standard temperature for mine. I used to have huge overheating problems with that, too, but that was solved when I finally bought a new CPU fan with actual working screws.
Title: Re: The difference in ATI and Nvidia.
Post by: Klaustrophobia on June 28, 2010, 01:52:43 am
the latest generations of cards (at least ATI, not sure about NVIDIA but i would assume so) have been built to take high heat.  the ATI 4k series reference design idles above 80c.  manufacturers really only attached the giant heatsinks for consumer comfort for those who have been conditioned that those temps are way too high.  and more overclocking room i guess.
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on June 28, 2010, 02:07:33 am
They load around that level. I haven't seen them idling anywhere near that high. Maybe it could happen with a high room temperature and with very bad case cooling. This (http://www.anandtech.com/show/2977/nvidia-s-geforce-gtx-480-and-gtx-470-6-months-late-was-it-worth-the-wait-/19) shows what idle temperatures typically look like. My 280 idles at 50C and loads in the mid-80s.

Laptops are a different matter, and I can imagine it happening there given how poorly most laptop cooling setups are designed.
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 28, 2010, 03:26:26 am
Hell, I remember back when I played with the right side of my computer case off, I could lay my hand on it and feel the motherboard bearing being quite warm. When my fan is sitting next to it, the CPU typically cools by 10-15 degrees, and the bearing feels cool to the touch, at least compared to the rest of the room. sure doesn't budge my GPU temperature one bit, though.
Title: Re: The difference in ATI and Nvidia.
Post by: Hades on June 28, 2010, 03:44:03 am
Doesn't sound like the issue Hades is having. Even with this tuning his FPS should stay consistently high. If it drops off rapidly in this game as well as others something else is going wrong.
You mean Haloboy100, right? :p
Title: Re: The difference in ATI and Nvidia.
Post by: S-99 on June 28, 2010, 04:59:37 am
You really see the difference between nvidia and ati driver sets when you use linux. The ati driver going opensource for linux has been surprisingly good. The nvidia proprietary driver on the other hand is another matter since it's proprietary, but it's also of good quality. What really sucked in the beginning was the ati proprietary driver for linux. Then the opensource driver sucked when it first began. But, it's all good, you have ati hardware, no need to install a driver anymore when you install linux. The linux intel driver is also awesome.
Title: Re: The difference in ATI and Nvidia.
Post by: asyikarea51 on June 28, 2010, 06:11:22 am
The ATI proprietary drivers on my laptop on Linux for some reason result in my computer not shutting down properly; I have to force shutdown and that's no good to me for my case. Though the driver loaded does not match the exact card model, only the closest equivalent.

The fix I have for this, just switch between cards and ignore driver installation - the Intel chip lags horribly in heavy work, can't play games and it screws things up horribly when going into projector mode, video streaming sites take a total crap and irritating screen switching with the resolution also taking a crap among other things (I've done project presentations on it, it sucks to take the flak from the Windows/Mac masses, but that's the life of it), but in normal use it's better.

So far so good, hmm I guess I'm more neutral these days but still a bit partial to the green camp (even though I'm neither fond of the idea of video game market cornering with nV specific programming nor their tendency for shameless price tags)...

-------------------------------------------------------------------------------------

XD at the hades-haloboy confusion. another one of those strange things to me... =\
Title: Re: The difference in ATI and Nvidia.
Post by: haloboy100 on June 28, 2010, 05:24:49 pm
I think it's because I'm only a little bit older than Hades and our usernames both start with an H...
Title: Re: The difference in ATI and Nvidia.
Post by: S-99 on June 28, 2010, 08:41:22 pm
The ATI proprietary drivers on my laptop on Linux for some reason result in my computer not shutting down properly; I have to force shutdown and that's no good to me for my case. Though the driver loaded does not match the exact card model, only the closest equivalent.
Give the opensource ati driver a try. Fglrx has sucked for a while, even to the point of not being able to do compositing without modifying a bunch of system files.
the Intel chip lags horribly in heavy work, can't play games and it screws things up horribly when going into projector mode, video streaming sites take a total crap and irritating screen switching with the resolution also taking a crap among other things (I've done project presentations on it, it sucks to take the flak from the Windows/Mac masses, but that's the life of it), but in normal use it's better.
I think it's just the fact that it's an intel chip period. They are made for pretty basic things when it comes with a motherboard. The intel chip in my netbook was really tossed in there strictly for meager desktopping and for watching dvd's. It's just the fact that it's pretty ok at playing games like openarena at medium settings was pretty cool for what it can also get away with (netbook is good for older games).

What makes me happy about a driver going opensource is that it gets included in the linux kernel. When it gets included in the linux kernel, that's another device that fits the description of "works right out of the box" with linux. So, you can imagine how much i love the ati and intel opensource drivers because after i'm done installing linux, all my hardware really is working without needing to grab extra drivers. Opensource ati drivers are cool even further, i believe it works on a one driver works for all ati cards basis (no planned obselescence, they're going to go ahead and have compatibility for older ati cards too). I know for the new nouveau nvidia opensource driver that's how it is (i still use the nvidia proprietary driver still because nouveau isn't what i'd consider usable right now).

I've been pretty happy with nvidia for a while. I use to use ati after nvidia bought 3dfx. Ati wasn't much to balk at during the day until the radeon 9xxx series came out (those were good). And then ati pretty much dropped the ball with radeon x300 x400 x800 etc(the next gen radeon cards after the 9xxx series). The geforce 6 and 7 series were spectacular for performance and price back in the day. Also, because nvidia was way better for running linux with for a long time. Before the ati opensource driver, you had the nvidia and ati proprietary drivers. The nvidia proprietary driver was and still is that much better than the ati proprietary driver.

To recap, the ati driver going opensource is awesome, this means that ati is now an actual solution for 3d acceleration under linux whereas previously it was mediocre and now ati people can hold something pretty major over nvidia people (opensource driver being it). For nvidia, there's the proprietary driver which is still a really good and made nvidia a long time the preferred choice for compatibility and 3d acceleration under linux (still a good solution it is, and it works great, but i want the nouveau driver to mature so i don't need to rely on the proprietary driver anymore).

It's good to know ati picked up the ball again in two areas instead of just one (gaining better performance than nvidia and opensource driver).
Title: Re: The difference in ATI and Nvidia.
Post by: Ghostavo on June 28, 2010, 09:08:03 pm
And then ati pretty much dropped the ball with radeon x300 x400 x800 etc(the next gen radeon cards after the 9xxx series). The geforce 6 and 7 series were spectacular for performance and price back in the day.

I have to disagree with you on that.

The first cards in ATI's x series (as you referred, the x800, x700, etc...) were top notch and went head to head with the GeForce 6 series, even having better performance/price in some cases. The only downside to them (and this can only be said in hindsight) was the lack of shader 3.0 support which would be used extensively a few years later.

Where ATI really dropped the ball was with the x1000 series and NVidia started having jumps in performance with the GeForce 7 series.
Title: Re: The difference in ATI and Nvidia.
Post by: Klaustrophobia on June 28, 2010, 10:16:19 pm
i was playing America's Army when the x800 came out.  everyone flipped their **** over it.  at that time it was "Nvidia who"?  and then they struck even harder in the price/performance category with the XL version, which was essentially an underclocked XT at almost half the price.
Title: Re: The difference in ATI and Nvidia.
Post by: Grizzly on June 29, 2010, 03:08:17 am
You really see the difference between nvidia and ati driver sets when you use linux. The ati driver going opensource for linux has been surprisingly good. The nvidia proprietary driver on the other hand is another matter since it's proprietary, but it's also of good quality. What really sucked in the beginning was the ati proprietary driver for linux. Then the opensource driver sucked when it first began. But, it's all good, you have ati hardware, no need to install a driver anymore when you install linux. The linux intel driver is also awesome.

My intell 855 does not work all that well under Ubuntu Lucid Lynx...
Title: Re: The difference in ATI and Nvidia.
Post by: S-99 on June 29, 2010, 05:47:02 am
Try not using ubuntu, that might fix your problem (ubuntu sucks). Lucid lynx uses a hybrid version of x.org. It's a hybrid of the old and the new version (the old version got patched all to hell and works questionably when instead ubuntu devs could have gone with simply the old version or the new version...now they get to support this abomination for years...and yes x.org may not be your problem).

After that ubuntu sucks, fedora sucks, opensuse is a wierd smattery of slackware plus rpms, mandriva sucks. Really all i can recommend is starting from scratch with debian stable or arch (maybe gentoo also), or use pclinuxos or mepis (pclinuxos and mepis are not start from scratch linux types).

How hard would it really be to download a mepis livecd and test your video card with it for ****s and giggles? I'd be interested to know of the details (i don't have too much of a low down on intel graphics aside from what gets soldered to a motherboard). For now, the intel driver for my meager simple task intel pos in the netbook is performing and functioning as expected.
Title: Re: The difference in ATI and Nvidia.
Post by: asyikarea51 on June 29, 2010, 08:41:31 am
I don't even feel safe moving to 10.04 without screwing something up somewhere ala 7.xx and I don't have backup media to spare right now, urgk... and everything is already set up pretty nicely...

I tried one of the earlier pclinuxos versions once (not on the laptop) but didn't really get into it for some reason despite sound working straight out of the box. Maybe sooner or later when I move everything and do a reformat... and the whole ALSA-OSS issue is almost like kicking a dead horse in some ways...
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 29, 2010, 10:49:50 am
Try not using ubuntu, that might fix your problem (ubuntu sucks). Lucid lynx uses a hybrid version of x.org. It's a hybrid of the old and the new version (the old version got patched all to hell and works questionably when instead ubuntu devs could have gone with simply the old version or the new version...now they get to support this abomination for years...and yes x.org may not be your problem).

The reason they did it that way is so they could implement certain features from X.org 1.8 (such as 3D/KMS support for the 4000-series Radeons, for example) without breaking compatibility with binary drivers that only work with 1.7. It would be such a big problem if the X and kernel guys didn't make something of a sport out of breaking things every other week...

As for Intel support, they've changed the way things work meaning only IGPs newer than the 915 work right now. That's probably asyikarea51's problem, not his choice in distro.
Title: Re: The difference in ATI and Nvidia.
Post by: S-99 on June 29, 2010, 09:14:49 pm
I normally blame ubuntu for one big thing, instability.
I don't even feel safe moving to 10.04 without screwing something up somewhere ala 7.xx and I don't have backup media to spare right now, urgk... and everything is already set up pretty nicely...

I tried one of the earlier pclinuxos versions once (not on the laptop) but didn't really get into it for some reason despite sound working straight out of the box. Maybe sooner or later when I move everything and do a reformat... and the whole ALSA-OSS issue is almost like kicking a dead horse in some ways...
As much as i don't feel safe moving my mom to 10.04 myself. If you didn't quite like pclinuxos, you could try mepis instead, of course when you get some burnable media. But why use burnable media? Use . It lets you extract the contents of an livecd iso onto a fat32 formatted mobile storage of any kind (in this case usually thumb drives, but i usually use an sd card) in a very easy fashion, so you can bootup with said mobile storage.

As for the alsa oss issue. Look in the repositories for a package called "alsa-oss". This package is a compatibility layer that lets oss applications work through alsa. Invoking the wrapper is super easy for any applications that only use oss. But for the most part, oss is dead and has been for years, everyone has been and is still using alsa. So this should be easy, use alsa, and for any program that uses oss and has no functionality for switching to alsa, you use the alsa-oss wrapper. (http://unetbootin.sourceforge.net/)unetbootin[/url)
Title: Re: The difference in ATI and Nvidia.
Post by: asyikarea51 on June 29, 2010, 09:46:51 pm
Not so much on OSS programs through ALSA but rather OSS itself. I actually prefer using OSS over ALSA... kinda... (as in the current v4, but then again I'm not recording) but like plenty of things out there, they all have pros and cons...

My integrated GPU is a GMA45 if I remember correctly. And the other reason I'm hesitant to go up to 10.04 is because of the Studio package...

But plenty of that is going off-topic and I'm elsewhere right now so that's it from me for now XD :nervous:
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 29, 2010, 09:53:46 pm
Use unetbootin (http://unetbootin.sourceforge.net/). It lets you extract the contents of an livecd iso onto a fat32 formatted mobile storage of any kind (in this case usually thumb drives, but i usually use an sd card) in a very easy fashion, so you can bootup with said mobile storage.

unetbootin kind of sucks, but if you're already running a *buntu then the Startup Disk creator that comes with that isn't too bad. I've been using it to put various ISOs on to SD cards for booting my Eee PC and it works fairly well.
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on June 30, 2010, 12:50:13 am
Use unetbootin (http://unetbootin.sourceforge.net/). It lets you extract the contents of an livecd iso onto a fat32 formatted mobile storage of any kind (in this case usually thumb drives, but i usually use an sd card) in a very easy fashion, so you can bootup with said mobile storage.

unetbootin kind of sucks, but if you're already running a *buntu then the Startup Disk creator that comes with that isn't too bad. I've been using it to put various ISOs on to SD cards for booting my Eee PC and it works fairly well.
which is actually the same damn thing if i remember correctly. (unetbootin/*buntu startup disk creator that is)
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on June 30, 2010, 02:03:27 am
That may have been the case at one point, and may still be in the back end, but the latest versions of the Startup Disk Creator in *buntu have a far better UI than unetbootin.
Title: Re: The difference in ATI and Nvidia.
Post by: Mikes on July 01, 2010, 01:34:28 am
I got a recent 'good' ATI card, like all other ATI cards I've owned it's had, what I am assuming is, driver issues, I can't get video to play in full screen reliably and games randomly grey screen, were the GPU puts it's self into a reset loop and I have to turn the computer off to get it working again.

Gonna call bull****. I've had a X1800, HD3850 and now HD4870x2 and every card worked flawlessly.

Not every driver release worked perfect, but it was never an issue finding a driver that worked just great that you could stick with (forums help a lot).

And Nvidia ain't any different there either. I used an 8800GTX between the X1800 and the HD3850 and had drivers that caused grey/yellow triangles to appear until I changed them, and my girlfriend, with her GeForce 285, fondly remembers the DX10 shadows bug in LotRO that made everything look like crap until the drivers finally got updated.

If you just go to the webpage and install whatever driver is newest right now... then driver support is definitely an issue with Nvidia AND ATI (LOL).
If you take a minute to look around and ask people which drivers actually work, then both companies' cards usually run without issues.

Obviously, the newest card(s) in either lineup may have spotty driver support in the first 1-2 months (while there are only one or two drivers that may or may not work well :p), but after that it's pretty much always been smooth sailing.


What should be kept in mind, however, is that Nvidia isn't "just" Nvidia and ATI isn't just "ATI"... you gotta pick a decent card manufacturer too.
What I said above comes from experience with Sapphire/EVGA/Asus cards... if you always buy the cheapest offer from whatever crap company got their hands on Nvidia or ATI chipsets, then yeah... your experience may vary :lol:
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on July 01, 2010, 03:08:28 am
What should be kept in mind, however, is that Nvidia isn't "just" Nvidia and ATI isn't just "ATI"... you gotta pick a decent card manufacturer too.
What I said above comes from experience with Sapphire/EVGA/Asus cards... if you always buy the cheapest offer from whatever crap company got their hands on Nvidia or ATI chipsets, then yeah... your experience may vary :lol:

The thing here though is that all of them copy the reference designs set down by nVidia and AMD/ATi; they just add their own branding, which usually entails little more than replacing the AMD/ATi/nVidia stickerage on the cooler with their own. What you get as a result is next to no differentiation between cards sharing a common (stock-clocked) GPU. In other words, it doesn't really matter whose card you get, since they all work out mostly the same at the end of the day. Several companies sell overclocked versions, sometimes with a beefier cooler than the reference design, but they're all still based on the reference PCB layout. Very few companies actually go to the trouble of designing their own custom PCB layouts. Sapphire is one, and my preference for them aside, that's why I bought their dual-slot 4850: it was the only 4850 card on the market that was 8" long and would fit neatly in here (http://www.hexellent.com/files/26/2q1fcbk.jpg).
Title: Re: The difference in ATI and Nvidia.
Post by: QuantumDelta on July 01, 2010, 04:30:18 am
That's definitely not true; there have been a few articles about which manufacturers to go with. It matters more with ATi/AMD (general rule of thumb for them: go Sapphire).
Can't remember it mattering as much with nV.
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on July 01, 2010, 04:57:48 am
I'll admit you get what you pay for to an extent, but as long as you avoid the overly cheap brands as well as the overly expensive brands then everything in the middle is more or less the same.

edit: let me put this another way: if someone asks me what GPU they should get, then at this point in time I'll recommend something from ATi, and when they then ask which of the over 9000 brands of card featuring said ATi GPU they should buy, I'll recommend Sapphire. If they turn around and say they have a preference for Gigabyte, MSI, PowerColor, XFX, HIS or whoever, I won't object, assuming the price isn't that much different from the Sapphire. If, on the other hand, they said they were looking at a Biostar- or ECS-built card, then I would insist they choose another brand.
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 01, 2010, 07:38:57 am
Don't be dissin' ECS; their cards are more solidly built than most low-price cards.


I've got a two-year-old nvidia 9600GT which was ****ed with so much it's a wonder it still works. It even got a full cup of coffee spilled on it, and only gracefully shut down.

On the other hand, I thought ECS pulled out of the retail market and went pure OEM or something?
Title: Re: The difference in ATI and Nvidia.
Post by: Admiral LSD on July 01, 2010, 09:06:59 pm
I keep hearing about how ECS have cleaned up their act in recent times, but I'm still not inclined to trust them: in part because they operate at the cheap and nasty end of the price spectrum, but mostly, I suspect, because I'm old enough to be familiar with the kind of **** they used to pull when they were PC Chips: http://www.redhill.net.au/b/b-bad.html
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on July 02, 2010, 01:24:57 am
I would say XFX is the best brand for ATI due to their support and warranty. Sapphire and HIS had crappy warranty setups a few years ago, although they are said to have improved since then.

Note that most of these card companies don't actually make the cards themselves. The cards all come from the same one or two manufacturers. The various brands mainly differ in the choice of cooler, the clock speeds, and the length and quality of the warranty.
Title: Re: The difference in ATI and Nvidia.
Post by: QuantumDelta on July 02, 2010, 01:32:20 am
The reason Sapphire stands out from the rest is that they are basically 'in-house':
Sapphire used to produce the official ATi cards, i.e. they're not really third party.

And official ATi cards are no longer made (I don't remember why; I think Sapphire wanted too much money from ATi).
Either way, it means they tend to produce slightly higher-quality boards, even now.
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on July 02, 2010, 02:12:44 am
If they do make the cards, then chances are all the other companies buy from them, so the quality would be identical. :p I think it's more likely that a few of the big motherboard companies do this. ASUS and Foxconn have been mentioned for both Nvidia and ATI.

There were several reports of Sapphire charging $15 for warranty service and requiring you to ship to a Hong Kong address a few years ago, but they're supposed to be better now.
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 02, 2010, 07:36:44 am
I keep hearing about how ECS have cleaned up their act in recent times, but I'm still not inclined to trust them: in part because they operate at the cheap and nasty end of the price spectrum, but mostly, I suspect, because I'm old enough to be familiar with the kind of **** they used to pull when they were PC Chips: http://www.redhill.net.au/b/b-bad.html
Didn't know that ECS was PC Chips... eh.

Back when I bought that GPU, I also bought one of their mobos, one of the first AM2+ ones, when the first Phenoms came out, and while it wasn't a very serious overclocker or anything, it was damn stable. When I switched over to an ASRock P43DE and an Intel E7500 I wept... that crap was unstable as all ****. In the end I sold it to a friend and got a Gigabyte MA78G-DS3H (rev 1.0, meaning faulty VRMs on the CPU phases, meaning unstable voltages as you raise them in the BIOS: you'd think you raised the voltage by 0.125V when it had actually gone up by 0.75V or the like; fixed in rev 2 of the same mobo) and an AMD 7750 BE @ 2.7 GHz...

I'm still weeping for that ECS mobo and the old Phenom... the performance wasn't anything special, but **** me, it was rock-solid stable.

So yeah: still an nvidia fanboy for GPUs, and Intel can go **** itself as far as CPUs and mobos go.

[edit] This mobo was the one I had from ECS (http://www.overclockersclub.com/reviews/ecs_a770m_a_review/)
Title: Re: The difference in ATI and Nvidia.
Post by: Wanderer on July 02, 2010, 08:12:32 am
It might be that my experience is rather limited to FS Open circles, but ATI has done more than enough to leave a very persistent foul stench behind it. Like ATI with their 'clever' X1k series & VTF... (or, possibly, the recent HD 5870 problems...)
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 02, 2010, 08:36:47 am
ATI was never that concerned with, or reliable about, their OpenGL quality (excluding the FireGL series and such, but those aren't for gaming, now are they?).

Nvidia was always a much better bet in regards to OpenGL, so yeah.
Title: Re: The difference in ATI and Nvidia.
Post by: Mikes on July 02, 2010, 09:43:06 am
I'm actually not quite sure about this...

... but I doubt that all card manufacturers use the same brand of capacitors for their cards... and if they don't, well, that would explain the huge differences you see in reliability and longevity right there :p

Otherwise, even with a reference design, you could still easily have huge differences in heat compound, fan manufacturers and, most importantly, the manufacturing PROCESS.

.... someone actually has to put these cards together. And it doesn't matter if it's actually done by a human or fully/mostly automated by a machine...  the quality of the manufacturing process will have a huge impact on the reliability of the final product.


The difference between ATI and Nvidia, frankly, is less important to me these days than the difference between card manufacturers.

I've had plenty of both Nvidia AND ATI cards that worked well over the years. (And as far as problems go, I had my share with either company as well.) The impression that stuck was that one had better be careful with new Nvidia products when their market position is too strong at that moment (the mobile GPU disaster with mass failures of inferior components comes to mind, as well as the rather infuriating memory/stutter bug in several cards of the 8800 GTS series), and, on the other hand, that ATI is prone to pushing too hard for performance when pressed (which, for example, resulted in the rather grotesque heat/power/noise monster cards of the X2x000 series...)

Lately though, we've seen very nice and comparable offerings from both companies with Nvidia's Fermi and ATI's 4x000 and 5x000 series.


... in any case... Nvidia or ATI isn't really the issue in my eyes; in the past both companies have had really nice and really crappy cards (GeForce 5x000 series, that's you, for example, lol).

More importantly, however, I've never had a heatsink part just fall off an EVGA or Sapphire card inside the case while playing a game... (I'm looking at you, PNY, LOL - and that was 3 weeks after purchase, mind you.)
Title: Re: The difference in ATI and Nvidia.
Post by: Hades on July 02, 2010, 01:41:59 pm
that ATI is prone to pushing too hard for performance when pressed (which, for example, resulted in the rather grotesque heat/power/noise monster cards of the X2x000 series...)
Nvidia does this too, y'know (GTX 260, GTX 280, etc).
Title: Re: The difference in ATI and Nvidia.
Post by: CP5670 on July 02, 2010, 03:07:40 pm
Quote
More importantly, however, I've never had a heatsink part just fall off an EVGA or Sapphire card inside the case while playing a game... (I'm looking at you, PNY, LOL - and that was 3 weeks after purchase, mind you.)

I think you just got unlucky. All of the professional Quadro cards sold at retail are PNY models, so their quality has to be at least decent. I highly doubt that there are any widespread reliability differences between cards (at least models with reference speeds and cooling), as they are all made by some third party company.

Quote
Nvidia does this too, y'know (GTX 260, GTX 280, etc).

Those cards are not that hot or noisy at all. The problem with them was the prices they were released at. However, Microsoft happened to have the original, huge cashback deal going on when they were released (it only lasted a day or two), which many people got in on. I got a 280 for half price back then, which I'm still using.

Better examples of this are the 2900XT and now the GTX 480 and 465. The X1900 line was very loud too, but it did at least have superior performance for its time.
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 02, 2010, 04:52:15 pm
that ATI is prone to pushing too hard for performance when pressed (which, for example, resulted in the rather grotesque heat/power/noise monster cards of the X2x000 series...)
Nvidia does this too, y'know (GTX 260, GTX 280, etc).
With a very simple solution: peel off the cooler and replace the TIM (on the chip only; don't, under any circumstances, remove the memory TIM pads).

The downside is that they use a lot of paste, like a whole pack of Arctic Cooling MX-3. The problem is the spacers between the card and the heatsink, which leave an approximately 2-3 millimetre gap between the chip and the heatsink that has to be filled with thermal paste.

Then again, it's not like those cards are mostly for "standard" users. The expectation these days with high-end cards is that if you've got enough money to buy them, you most likely also have either the money to get enough airflow for them or the money to get someone to sort it out for you.
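To get a feel for why a paste-filled gap that thick matters thermally, here's a back-of-envelope conduction estimate; every figure in it is an illustrative guess, not a measurement:

    # Rough thermal resistance of a paste layer: R = t / (k * A).
    # All numbers here are illustrative guesses, not measured values.
    t = 2.5e-3           # layer thickness in metres (the ~2-3 mm gap above)
    k = 8.0              # W/(m*K), roughly what good pastes claim
    area = 0.024 ** 2    # m^2, assuming a ~24 mm square die
    r = t / (k * area)   # kelvin of temperature rise per watt conducted
    print(f"~{r:.2f} K/W, i.e. ~{r * 150:.0f} K rise at a 150 W load")

Which is a big part of why a cooler sitting that far off the die needs serious airflow to compensate.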
Title: Re: The difference in ATI and Nvidia.
Post by: Hades on July 02, 2010, 06:09:13 pm
Usually I'd expect the cards to use less power and generate less heat if I'm buying an expensive one. :\
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 02, 2010, 09:27:34 pm
It's actually quite the opposite, and it has been so for quite a while now.
Title: Re: The difference in ATI and Nvidia.
Post by: asyikarea51 on July 03, 2010, 02:04:09 am
Is the ATI 5xxx series stable in FSO or are there any issues with it (minor, major, etc)?

Well that's another one for the green camp, sadly...
Title: Re: The difference in ATI and Nvidia.
Post by: Mongoose on July 03, 2010, 02:12:17 am
The fact that there are now kilowatt power supplies is sort of a scary thing.
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 03, 2010, 02:25:43 am
Eeeh, it's hard to build a rig that actually uses that much power even at full load. We're talking triple/quadruple card setups, 6+ HDDs and some pretty extreme CPUs there (the newest hex-cores come to mind).

My rig, which isn't exactly a power saver, draws under 300W at full load, so yeah. 90% of the time it's around 120-ish W, and I definitely ain't packing any power-saving components.
Title: Re: The difference in ATI and Nvidia.
Post by: QuantumDelta on July 03, 2010, 04:49:01 am
How do you know what wattage you're using?
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 03, 2010, 07:26:03 am
A wattmeter plugged into the outlet from which the PC, monitor and speakers get their power?

The connection is as follows: outlet -> wattmeter -> extension cord -> PC, monitor, speakers.

And note, this utterly ignores the power consumption of individual components; I only get how much the whole thing guzzles, ignoring any efficiency thingamajiggies and such. The wall outlet is pretty much the best spot to get an accurate measurement of total consumption.
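The arithmetic behind such a reading is trivial, if anyone wants to turn it into a power bill; a minimal sketch (the electricity rate below is a made-up example figure, not anyone's actual tariff):

    def energy_cost(avg_watts, hours, price_per_kwh=0.15):
        # kWh = average draw in kilowatts times hours of use
        kwh = avg_watts / 1000.0 * hours
        return kwh, kwh * price_per_kwh

    # e.g. the ~120 W idle figure above, 8 hours a day for a month
    kwh, cost = energy_cost(120, 8 * 30)
    print(f"{kwh:.1f} kWh, roughly ${cost:.2f}")  # -> 28.8 kWh, about $4.32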
Title: Re: The difference in ATI and Nvidia.
Post by: QuantumDelta on July 03, 2010, 08:17:15 am
*looks at cable sprawl under desk*
.....Yeah, that wouldn't work here, but I'd guesstimate around 400-450 watts on my box :<
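A rough way to sanity-check a guesstimate like that without a meter is to add up the component TDPs and divide by the PSU's efficiency; a sketch with purely illustrative figures:

    # Every figure below is illustrative, not a measurement of any real build.
    components = {
        "CPU": 95,               # a typical quad-core TDP of the era
        "GPU": 180,              # a single high-end card under load
        "board/RAM/drives": 60,
        "fans/misc": 15,
    }
    dc_load = sum(components.values())  # watts drawn from the PSU
    psu_efficiency = 0.82               # assume a decent mid-range unit
    print(f"~{dc_load / psu_efficiency:.0f} W at the wall under full load")

With those guesses it comes out around 427 W, right in that 400-450 ballpark.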
Title: Re: The difference in ATI and Nvidia.
Post by: pecenipicek on July 04, 2010, 06:36:54 am
And? Are you telling me you have multiple extension cords coming in from different outlets?

Just get an ordinary wattmeter of that type, find the outlet where it all plugs in, and put the meter in before the extension cord.