Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Liberator on April 22, 2004, 11:23:42 pm
-
I don't want this to turn into an ATI vs. NVidia free-for-all, so you ATI folks stay away.
I'm going to have a little extra money in a couple of months (school's gonna pay me the rest of the grant they held back from me for letting my GPA drop), and I'm curious: who makes a good GeForce FX card? I know to stay away from anything below the 5700, and bear in mind I only have a 4x AGP slot.
-
Seriously, while I'm not an ATi fanatic, I'm advising you _not_ to get an FX series GeForce.
Wait a bit and get a 6800 (XT for low mainstream, normal for mainstream, Ultra for high-end) if you really want a GeForce.
-
"who makes a good GeForce FX card?"
nobody
-
Originally posted by Liberator
I don't want this to turn into an ATI vs. NVidia free-for-all, so you ATI folks stay away.
I'm going to have a little extra money in a couple of months (school's gonna pay me the rest of the grant they held back from me for letting my GPA drop), and I'm curious: who makes a good GeForce FX card? I know to stay away from anything below the 5700, and bear in mind I only have a 4x AGP slot.
The only decent FX series card, Lib, is the 5900-based board. I've had good luck with Asus and Elsa cards.
-
Originally posted by ChronoReverse
Wait a bit and get a 6800 (XT for low mainstream, normal for mainstream, Ultra for high-end) if you really want a GeForce.
Only if you have a spare PCI slot for cooling; oh, and not to mention a 480-watt power supply.
-
Wrong; tests have shown the 6800U draws at most only 20 more watts than the 9800XT and about 7 more watts than the 5950U. Using a high-quality 200W power supply, the 6800U has been run through a non-stop loop of 3DMark overnight without any overheating problems.
The only time you'll have problems is if you have a crappy power supply, in which case even a 480W one will be outperformed by a 350-watt one.
Moreover, only the Ultra is a two-slot solution. The XT and the normal version are both single-slot.
Finally, who uses the PCI slot next to the AGP slot? It's been standard practice for years not to use that slot anyway.
I still recommend waiting if you want a 5900 board. The prices should drop a bit more when the 6800 comes out. Beware that the 5900XT is actually the low-end card and is the successor to the 5700U.
-
You know, I've got a problem with dual-slot GAME cards (i.e. Radeons and GeForces) but not professional cards like a Wildcat.
However, ChronoReverse is right: I've not put anything in the first PCI slot in... um... wow. Not since my TNT2, which was my first video card to have a fan on the chipset. That's been a long damn time. *heh*
Of course, I can't remember the last time I had more than three PCI cards in my machine at once. I've always got at least two (usually three) PCI slots open.
-
I was a hardcore nVidia user, proud of it, and was almost swayed to ATI's side till I saw the reviews of the GeForce 6800 Ultra. My GOD, that card blew my socks off when I saw the difference. I believe there's hope for nVidia after all! I honestly would wait for the 6800s to come down in price and/or wait for the next revision to come out, simply to see what kinds of bugs they might have in them and such. ATI's still got the visual quality, but nVidia's coming back from what I've seen.
-
I just don't like the driver support, or rather the lack thereof, that ATI is famous for. With an nVidia, regardless of actual performance, it almost always works the way it's supposed to.
Bear in mind I've owned two 3D cards in my life that weren't salvaged: a 3dfx Voodoo 3 2000 PCI w/16MB of RAM and a VisionTek GeForce 4 MX 420 PCI w/64MB of RAM. ANYTHING is going to be an improvement.
-
I've had my 9500 Pro for over a year now and I've yet to have a serious problem with the Catalyst drivers. In fact, they've been so good I'd go as far as saying I had more trouble with nVidia's drivers than I've ever had with the Cats.
-
The only problem I had with the Cats was when I failed to update them from the time I bought them to about the time I bought Halo. I was running the Catalyst 2.4 drivers forever, and that was the main source of my problems :P
And on my mobo, I only have 3 PCI slots, and the top one is too close to my Radeon 9000 to put anything other than a slim little IDE controller card in there.
-
My current GeForce was built by ASUS too; I can recommend them.
-
And mine was by MSI, who're very good - the cards held out fine and only died on me (frame-rate-wise) a handful of times:
The Far Cry demo.
The Doom 3 alpha.
The Half-Life 2 alpha.
Battlefield: Vietnam with everything on full including sound settings.
As you can see, that's a fairly good indication that a GeForce 4 Ti 4600 isn't going to be good enough for this year's crop of games, but it'll cope just fine with everything before that (within reason - you need a system good enough to back it).
I would seriously recommend looking at ATI cards though, apparently the support is a lot better and, when the prices drop, I'm going to pick up a top-of-the-range 9800.
-
I just spent the whole week (literally) reading up on Tom's Hardware (http://www.tomshardware.com/) - all the relevant nVidia vs. ATi articles in the Graphics (http://www.tomshardware.com/graphics/) section. Aside from the spanking-new nVidia GeForce 6800 Ultra, which can't even be purchased yet, ATi cards are pretty much the equivalent of nVidia cards without quality enhancements (FSAA and/or anisotropic filtering), but easily take the lead with said quality enhancements turned on - with one exception: OpenGL. Most OpenGL games run a slight bit faster on nVidia.
That said, those quality enhancements make so much of a difference to games that I'm buying my next card with the "requirement" that it be able to use FSAA and anisotropic filtering at the higher resolutions (1280x1024 is ideal for my monitor) without frame rates dipping below "very playable".
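For reference, here's roughly what "turning them on" amounts to at the code level. This is only a minimal C++/GLUT sketch using the standard extensions (GL_ARB_multisample for FSAA, GL_EXT_texture_filter_anisotropic for AF) - the window size and the 8x anisotropy figure are placeholder choices of mine, and the driver's control panel can override whatever the application requests:
[code]
// Minimal sketch: request an FSAA framebuffer and enable anisotropic
// filtering, roughly as an OpenGL game of this era would.
#include <GL/glut.h>

// Extension constants, in case the headers are old.
#ifndef GL_MULTISAMPLE_ARB
#define GL_MULTISAMPLE_ARB 0x809D
#endif
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT 0x84FE
#endif

static void draw() {
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    // FSAA is a property of the framebuffer: ask for a multisampled one.
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE);
    glutInitWindowSize(1280, 1024);
    glutCreateWindow("FSAA + AF sketch");

    glEnable(GL_MULTISAMPLE_ARB);  // turn multisampling on

    // AF is per-texture: after binding a texture, raise its max anisotropy.
    // 8.0f is an arbitrary choice; hardware caps vary by card.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);

    glutDisplayFunc(draw);
    glutMainLoop();
    return 0;
}
[/code]
The point being: FSAA is something you request for the whole framebuffer up front, while AF is a per-texture filtering setting - which is part of why their performance costs behave so differently.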
I'm getting a Hightech Radeon 9800 Pro ICEQ card from NewEgg.com for $250. A steal.
-
You may have just argued me into going ATi for my next chipset, Sandwich. :D
-
Nix: wait until ATI releases their next card.
I highly advise against nVidia at all times due to their thermal problems.
Once again someone said there is a Radeon that uses two slots - NAME IT. Last time someone said that, they ate crow because they realized they were not thinking of an ATI board at all.
-
Both ATI and nVidia are fairly closely matched in terms of performance, hardware ability, technical ability, and driver quality. The notion that ATI drivers are of any lesser quality than the nVidia ones is utter bollocks, because I have both GeForces and Radeons around here and I run into the same shenanigans on both sets of Detonators (or Forceware or whatever you call it now :D) and Catalysts. Find the drivers that work best and stay there.
Presently my powerhouse is a Radeon 9700 Pro. It's fast, it's pretty, and it looks really good in games. No problems, no issues, nothing to suggest that I made a bad purchase.
That said, see what card is best for you. The FX 5700 and 5900 are the best of the bunch, the FX 5200 is a budget card, and the FX 5600 and FX 5800 are to be avoided. The Radeon 9800 Pro or XT are the top of the bunch right now, the 9600XT is an excellent buy, and so on.
Be aware that budget versions of all cards in the ATI brand are known as SE (so 9600SE, for instance) and the budget cards in the nVidia brand are XT (in an obvious attempt to sow confusion, since the ATI top-level cards are labeled XT).
Read the reviews and see where it lands you.
Certainly the ATI cards, as Sandwich mentions, lose less FPS for enabling FSAA or AF... while without them they are virtually identical to the nVidia cards. The GeForce 6 brand is supposed to redress that problem as well....
-
Originally posted by Kazan
Nix: wait until ATI releases their next card.
I highly advise against nVidia at all times due to their thermal problems.
Once again someone said there is a Radeon that uses two slots - NAME IT. Last time someone said that, they ate crow because they realized they were not thinking of an ATI board at all.
Kazan, stop jumping on anyone who says anything remotely like it's against ATi in any way.
You know, I've got a problem with dual-slot GAME cards (i.e. Radeons and GeForces) but not professional cards like a Wildcat.
This is saying that game cards like GeForces and Radeons _shouldn't_ have to use two slots, unlike a professional card. It doesn't actually imply that the Radeon uses two slots.
Moreover, even if I owned a Radeon (I haven't upgraded), I'd still be using a two-slot solution, since I like silent cooling. If I do upgrade, I'll probably use a two-slot active cooler to vent air out of my case.
-
I guess the verdict rules that ATI is better or something?
-
Originally posted by .::Tin Can::.
I guess the verdict rules that ATI is better or something?
Find a relatively in-depth review of any series of video cards at any of the notable sites that suggests that one company has a major advantage over the other...
The fact is that neither of them do. But there are different issues to each and strengths and weaknesses....
I don't see any verdict that ATI is better...just that their FSAA performance is better (this is a verifiable and quantified performance result - not subjective in any way) and that their speed performance is neck and neck with nVidia.
-
Of the current generation (GFFX and R9x00), the cards run at about the same speed when AA and AF are turned off.
The moment you turn them on, the GFFX lags far behind.
Moreover, if you play PS2.0 games, the GFFX series reverts to PS1.1 since they don't run PS2.0 fast enough (true even for the 5950).
I'd say it's safe to say that ATi is better for this generation...
The next generation is still up for grabs though. I have a feeling it'll be closer this round.
-
It really does boil down to personal preference though - Kazan does have a point that GF cards run hot. Not saying Radeons are cool - but certainly they have more effective cooling...
...anyway, like I said, it's personal taste. If you don't plan on playing tomorrow's games but you insist on performance for today's, then ATI is probably the way forward for you. Certainly their mid-range cards also seem a better bet, but some of the older GF cards (like my GF4!) can still hold their own with current software... though that's beginning to change.
At any rate, you pay your money and take your choice - you can't really go wrong with one of the £300 cards - they're fast. Start compromising with £100 cards and you'll find you have to lower some detail settings... but then you saved yourself £200. See where I'm going with this? :)
Certainly though, if you wait for the next generation of cards to come out, you should see the current lines come down quite significantly in price... making them much more affordable, but not as powerful as the next line... that said, the next line is only really needed for the likes of Far Cry, Doom 3, HL2 and so forth... tomorrow's games, basically.
-
Oh Kazan should love this. From the Tom's Hardware review of the GeForce 6800 (http://www20.tomshardware.com/graphic/20040414/geforce_6800-11.html)--
While NVIDIA is loudly and proudly advertising the new shader model, ATi is attempting to downplay it. A wonderful and quite humorous example of this tactic is a developer presentation titled "Save the Nanosecond" by ATI's Richard Huddy, which was accidentally leaked on the internet. The trouble was, the presentation still sported some personal notes which made it clear that the presenter was trying to convince developers to stay away from Flow Control in PS3.0, as it incurs a very tangible performance hit. Quote:
"Steer people away from flow control in ps3.0 because we expect it to hurt badly. [Also it's the main extra feature on NV40 vs R420 so let's discourage people from using it until R5xx shows up with decent performance...]"
Snarf.
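For anyone wondering why flow control would "hurt badly": pixel shaders run pixels in lockstep batches, so when a data-dependent branch splits a batch, the hardware effectively executes both sides of the if and masks out the inactive pixels. Here's a toy cost model in C++ - the instruction counts are entirely made up, just to show the shape of the problem:
[code]
// Toy cost model for dynamic branching on lockstep pixel hardware.
// All instruction counts are invented; real costs vary by chip.
#include <cstdio>

int main() {
    const int thenCost = 40;  // instructions on the expensive side of the if
    const int elseCost = 10;  // instructions on the cheap side

    // If every pixel in a lockstep group takes the same side, you pay for
    // that side only. If the group diverges, the hardware runs BOTH sides
    // and masks out the pixels that didn't take each one.
    const int uniformCost  = elseCost;
    const int divergedCost = thenCost + elseCost;

    std::printf("uniform group:  %2d instr/pixel\n", uniformCost);
    std::printf("diverged group: %2d instr/pixel (%.1fx slower)\n",
                divergedCost, (double)divergedCost / uniformCost);
    return 0;
}
[/code]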
-
Snarf?
-
Originally posted by IceFire
the 9600XT is an excellent buy
I'm still waiting for ATi to release the new card, so I can watch this baby's price drop even more. It's currently US$164 for an R9600XT with 256MB. That's great and all, but I'd hate to spend all that and watch it plummet even more.
When is ATi's newest card coming out, BTW? Any estimates?
-
Originally posted by ZylonBane
Snarf.
Was that a Thundercats reference?
-
Originally posted by Raa Tor'h
I'm still waiting for ATi to release the new card, so I can watch this baby's price drop even more. It's currently US$164 for an R9600XT with 256MB. That's great and all, but I'd hate to spend all that and watch it plummet even more.
When is ATi's newest card coming out, BTW? Any estimates?
The press release is the 26th and hopefully cards will be out by May (around the same time the 6800 is supposed to be available too).
This is for the X800Pro though. The XT will come later supposedly.
-
Originally posted by 01010
Was that a Thundercats reference?
It's an onomatopoeia.
And Thundercats was stupid.
-
Thundercats was stupid... yes, well, there's a conclusive and well-put-forth argument ;)
-
As I said previously, I have never had the pleasure of a high-end card. I play at 1024x768 on most things. I usually turn AS filtering all the way up because I've never seen a big performance hit, but I leave AA off because I can't tell a difference for the massive performance hit it requires.
-
Originally posted by Liberator
As I said previously, I have never had the pleasure of a high-end card. I play at 1024x768 on most things. I usually turn AS filtering all the way up because I've never seen a big performance hit, but I leave AA off because I can't tell a difference for the massive performance hit it requires.
I have what amounts to last year's high-end card and I'm essentially playing last year's high-end games, and I still run at 1024x768. Two reasons: Raven Shield's text doesn't scale up to higher resolutions, so while my FPS is still mid-80s at high res, I can't read the text. And Forgotten Battles (like any flight sim) is a graphics/CPU hog by nature, so 1024x768 plus 2xAA and 4xAS does the trick :D (and it looks GREAT).
-
I didn't know whether to make a new post or edit my last one.
Anyway,
The following screens were taken in Neverwinter Nights with all the supported graphical settings turned all the way up, except Creature Shadow detail, which is one setting from max. The only effect this has is giving NPCs blob shadows instead of detailed shadows. Also, Shiny Water is turned off because my card doesn't support pixel shaders.
Please forgive any JPG artifacts.
The first one is without Anti-Aliasing:
(http://swooh.com/peon/liberator/NWNex1.jpg)
The second one has Nice 4X Sampling (or something like that; it's maxed out):
(http://swooh.com/peon/liberator/NWNex2.jpg)
Both are at 800x600 with 2x Anisotropic Filtering.
I can't tell a difference between the two, but if someone with better eyes wants to point out the differences, I'd be happy to look at them.
-
Erm, AA isn't turned on in the second screenshot either. I don't see the pattern at the edges that 4xAA should yield.
Which card do you have and which driver set?
-
GF 4MX 420 PCI
drivers=53.04
BTW, I just used the slider in NWN; I may have to turn it on manually since I have it set to off.
Okay, the image is changed, and I can see the difference. I guess I've never actually had AA on. *grins sheepishly*
It's still useless to me though, the framerate in the first one is ~9 and the framerate in the second is ~4.
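(Side note: if you ever want to check whether AA is really on without squinting at screenshots, an OpenGL program can simply ask the driver what kind of framebuffer it got. A rough C++/GLUT sketch - the GL_ARB_multisample constants are standard; everything else here is throwaway scaffolding:
[code]
// Sanity check: did the driver actually give us a multisampled (FSAA)
// framebuffer? Constants are from GL_ARB_multisample.
#include <GL/glut.h>
#include <cstdio>

#ifndef GL_SAMPLE_BUFFERS_ARB
#define GL_SAMPLE_BUFFERS_ARB 0x80A8
#endif
#ifndef GL_SAMPLES_ARB
#define GL_SAMPLES_ARB 0x80A9
#endif

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_MULTISAMPLE);
    glutCreateWindow("AA check");  // need a live context before glGet* calls

    GLint buffers = 0, samples = 0;
    glGetIntegerv(GL_SAMPLE_BUFFERS_ARB, &buffers);  // 1 if an AA buffer exists
    glGetIntegerv(GL_SAMPLES_ARB, &samples);         // e.g. 4 for 4xAA
    std::printf("sample buffers: %d, samples per pixel: %d\n", buffers, samples);
    return 0;
}
[/code]
If "samples" comes back 0, the pretty slider did nothing.)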
-
(http://www.andersonrossi.hpg.ig.com.br/thundercats2.jpg)
-
DON'T SPAM MY THREADS!!!!!!!!!!!!!!!11111111111111111111111
-
Realistically, have fun with your choice of chipsets, but pick a good name. Asus, Abit, Sapphire, or MSI.
-
Originally posted by Liberator
GF 4MX 420 PCI
drivers=53.04
BTW, I just used the slider in NWN; I may have to turn it on manually since I have it set to off.
Okay, the image is changed, and I can see the difference. I guess I've never actually had AA on. *grins sheepishly*
It's still useless to me though, the framerate in the first one is ~9 and the framerate in the second is ~4.
It's definitely not worth the framerate hit, but look at something else you apparently turned on in the second shot: Anisotropic filtering (I hate typing that out). It basically makes farther-away textures as sharp as close ones. To see the difference, compare the stones the BG dude in red is standing on in both pics - see the blurriness in the first one?
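The intuition, in made-up numbers: each screen pixel covers a small patch of the texture, and on a floor viewed at a grazing angle that patch stretches into a long thin strip. Plain trilinear filtering picks one mip level for the whole strip, so it blurs; AF spends extra samples along the long axis instead. A toy C++ sketch (the derivative values are invented purely for illustration):
[code]
// Why oblique floors blur without anisotropic filtering: a pixel's
// footprint in texture space gets stretched. Derivatives are made up.
#include <algorithm>
#include <cstdio>

int main() {
    // dudx/dvdy: how fast the texture coordinates change per screen pixel.
    struct View { const char* name; double dudx; double dvdy; };
    const View views[] = {
        { "wall, head-on ", 1.0, 1.0 },  // square footprint, 1:1
        { "floor, grazing", 1.0, 8.0 },  // footprint stretched 8:1
    };
    for (const View& v : views) {
        double ratio = std::max(v.dudx, v.dvdy) / std::min(v.dudx, v.dvdy);
        // Trilinear picks one mip level for the whole footprint, so the long
        // axis forces a blurry mip; AF instead takes ~ratio samples along it
        // and keeps the sharp mip.
        std::printf("%s: footprint %.0f:1 -> wants %.0fx AF\n",
                    v.name, ratio, ratio);
    }
    return 0;
}
[/code]
That stretch ratio is also why the effect depends on viewing angle - a wall seen head-on has a square footprint and gets no benefit from AF at all.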
Originally posted by Kazan
Once again someone said there is a Radeon that uses two slots - NAME IT. Last time someone said that, they ate crow because they realized they were not thinking of an ATI board at all.
I didn't say it, but I'm getting one - a Radeon 9800 Pro from HIS / Hightech (http://www.newegg.com/app/ViewProductDesc.asp?description=14-161-109&depa=0). Image attached.
Most Radeons - the vast majority - don't have such a cooling system.
-
Doesn't beat my GF4 Ti 4200 with two-slot passive cooling =P
-
Originally posted by Liberator
DON'T SPAM MY THREADS!!!!!!!!!!!!!!!11111111111111111111111
Ask yourself, what would Jesus do? :cool:
-
Originally posted by Sandwich
It's definitely not worth the framerate hit, but look at something else you apparently turned on in the second shot: Anisotropic filtering (I hate typing that out). It basically makes farther-away textures as sharp as close ones. To see the difference, compare the stones the BG dude in red is standing on in both pics - see the blurriness in the first one?
Sandy, AS filtering is on in the first one also. I always leave it on.
Add to that it's a slightly different angle and he's in a different position.
-
That's a he, Lib?
My, you lead an interesting life, don't you. You might be alright after all. :D
-
Sandwich and I were both talking about the Red Guy in the background, not the giant sword-wielding Elf babe in spiffy, CEP-styled chain in the foreground.
-
Oh, in that case, I take it back. :D You're not alright at all. ;)
-
Originally posted by IceFire
...FX5600 and FX5800 are to be avoided
Why is this? I have an FX5600 and it runs fine for me. I have never had any problems...
-
Because, at the same price point, an ATi card is really far better performance-wise, ubermetroid. Also, these cards perform horribly for the generation of video cards in which they were released. They're definitely decent cards, but they're not on par with the hardware they compete with.
-
Sandwich: I do not consider any boards not manufactured by ATI to be ATI boards.
Only "Built By ATI" cards are ATI boards; everything else is simply "Powered By" and is made by whatever third-party manufacturer - the reference design is the official design.
NO REFERENCE DESIGN from ATI requires two slots - unlike the reference designs from a certain other manufacturer.
-
Originally posted by Liberator
Sandy, AS filtering is on in the first one also. I always leave it on.
Add to that it's a slightly different angle and he's in a different position.
Ahh, OK. It must be the angle between the camera and the ground. That has an effect on whether the AS filter kicks in. Anyway, that kind of blurriness/sharpness difference is what you get - practically for free (performance-wise) - with ATi chips. It places a much heavier load on nVidia chips.
Originally posted by Kazan
Sandwich: I do not consider any boards not manufactured by ATI to be ATI boards.
Only "Built By ATI" cards are ATI boards; everything else is simply "Powered By" and is made by whatever third-party manufacturer - the reference design is the official design.
NO REFERENCE DESIGN from ATI requires two slots - unlike the reference designs from a certain other manufacturer.
Right, I realize all that, and agree with it. What's yer point? :p I'm simply getting a "Powered by ATi" card that doesn't require such heavy-duty cooling, but it certainly will benefit from it. Besides, the first PCI slot apparently shares resources with the AGP slot and should never be used anyway - it'll just slow graphics performance down.
-
Nice waffle, Kaz.
If it has an ATi GPU on it, it's an ATi card, just like any card with an nVidia GPU is an nVidia card. Don't split hairs when you're wrong.
-
No, actually he's right on this... the two categories of cards are even packaged and labeled differently. The 100% ATi boards are labeled "Built by ATi", the 3rd-party boards using ATi chipsets are limited to "Powered by ATi".
-
I'll not argue the labelling. He's patently correct on the labelling.
However, to suddenly, at this late stage, apply a narrow definition to the term 'ATi card' in order to remain right when someone has shown proof that would invalidate any looser definition is, I'm afraid, bollocks.
-
Considering his whole reason for defining an "ATi card" was to state that none of them require two-slot cooling solutions, his definition was perfectly correct. Who better than ATi to determine what cooling their VPUs need? Officially, ATi VPUs don't need two-slot cooling. Just like how any campaign we may make, while still FS/FS2 "powered", is not officially how the FreeSpace universe unfolds. :p
-
Originally posted by Liberator
I don't want this to turn into an ATI vs. NVidia free-for-all, so you ATI folks stay away.
:rolleyes:
Well, there's always the next time :D
-
Yes, and the reference design is the correct one to use for comparing. nVidia doesn't have a reference design... but they do have minimum cooling requirements (which are MUCH higher than ATI's).
-
Why should some company be judged based on what another company does?
ATI didn't build that card; they shouldn't be blamed for it.
-
Originally posted by ZylonBane
And Thundercats was stupid.
You're stupid :p
*kicks ZB in the shins and runs away*
-
Hmm... somehow I was expecting... more...
-
I burned out my half a brain cell watching porn earlier.
-
Happens to the best of us
-
I "consider" bacon to be a vegetable.
-
Originally posted by mikhael
Because, at the same price point, an ATi card is really far better performance-wise, ubermetroid. Also, these cards perform horribly for the generation of video cards in which they were released. They're definitely decent cards, but they're not on par with the hardware they compete with.
REALLY???? :eek:
Oops! :mad2:
I only got the FX5600 because I got the Nvidia MB too...
I will have to shop around more the next time I buy.
-
Now see, there's something I would never do: Nvidia chipset motherboards. That's kind of like buying an Intel graphics card. Dude, that's just wrong.
-
Never mind that it's actually one of the best chipsets for an AMD-based system.
-
No, I don't think so, Lib. They're pretty much garbage in a serious server or high-availability environment. They have distinct and deep issues with some peripheral hardware that is sensitive to timing, etc.
They're decent, but so were the Intel Starfire (If I Recall the name Correctly) video chipsets. They're not what you should use in serious hardware.
-
Originally posted by mikhael
No, I don't think so, Lib. They're pretty much garbage in a serious server or high-availability environment. They have distinct and deep issues with some peripheral hardware that is sensitive to timing, etc.
Yeah cause we all use servers to surf, word process and play Halo :rolleyes:
-
Originally posted by Kazan
nvidia doesn't have a reference design...
Complete and utter crap. What do you think all these reviews of the GeForce 6800 (and practically every review of pre-release nVidia hardware) are being based on? That's right, reference cards provided by nVidia built to nVidia's reference design.
-
Originally posted by mikhael
Now see, there's something I would never do: Nvidia chipset motherboards. That's kind of like buying an Intel graphics card. Dude, that's just wrong.
Yeah, well, everything looked nice when I was building the PC on www.ibuypower.com. And I am not complaining about how fast it runs... :D
-
My board is a top-of-the-line 32-bit board from Asus. What's it got? The nVidia nForce2 Ultra chipset. Bad? I doubt it.
Heh, my bottleneck is the processor and the RAM. :doubt:
-
Originally posted by Admiral LSD
Complete and utter crap. What do you think all these reviews of the GeForce 6800 (and practically every review of pre-release nVidia hardware) are being based on? That's right, reference cards provided by nVidia built to nVidia's reference design.
Most retail cards you find are built to nVidia's reference design, since most manufacturers are very, very lazy. Especially once you get out of the first tier.
-
[totally brand-independent] I generally steer away from integrated motherboards, if only for the driver issues they cause. Half the friends whose computer problems I fix have problems with screwy drivers. The other half have their problems with Windows. :p [/totally brand-independent]
-
Originally posted by mikhael
No, I don't think so, Lib. They're pretty much garbage in a serious server or high-availability environment. They have distinct and deep issues with some peripheral hardware that is sensitive to timing, etc.
They're decent, but so were the Intel Starfire (If I Recall the name Correctly) video chipsets. They're not what you should use in serious hardware.
So... which brand of chipsets should we go for, then? They fixed the IDE issues the nF2 had ages ago, and I don't see what other major problems there are.
In any case, all of the problems that the nForce2 would have as a server chipset are completely moot, since the nForce2 is for Athlon XPs, which are desktop chips.
And guess what: the nF2 has the best desktop performance.
-
So which brand do you actually like, mikhael? VIA, ALi, SiS? Everyone is integrating now.
And on the nF2: they do have the best performance, even though that lead is very, very small, but they're very picky about RAM and the like. My old KT333 cared much less about timings and let me push my RAM farther.
-
Everyone is not integrating - unless you focus on consumer boards. Those are pretty much all integrated, unfortunately.
As for the 'Yeah cause we all use servers to surf, word process and play Halo', I have to say that, as an argument, is bollocks. Some people watch movies on TVs and some watch them on digital projectors. Why? Because we like better gear. You can use whatever gear you want to use. I stated my preference to avoid gear that I think, in my professional judgement, is wrong for the tasks to which I will put it.
-
So... I'll reiterate: which chipset do you use, then?
It's also interesting to note that server hardware isn't faster or better for common everyday tasks, since it's designed to be reliable first.
-
Reliable. That's a good thing. :)
It depends on the machine, Chrono. I think - and without opening cases I can't tell you for sure - that all my Intel hardware is on BX chipsets (it's old stuff), and I know my AMD stuff is on stock AMD chipsets (though which specific ones I cannot recall). I've found that for my AMD stuff, stock AMD chipsets have been the most solid (fewer BSODs/lockups/crashes/kernel panics/etc. than other 'sexier' chipsets in large heterogeneous hosting/lab/server-room environments).
Pretty soon, all I'll even have in the house is the BX chipset server, since the desktops are all going away to be replaced by IBM laptops and there, I hate to say it, I'm not terribly concerned with the hardware details. They run FreeBSD stably and that's all I care about for the terminals they'll be.
[edit]
Appears I was wrong. In one machine, it's a KT333. That was supposed to be gone long ago. The other two are 760MPX chipsets.
There's two other machines I can't check because they're physically in another city.
[/edit]
-
So, you have dualies. Of course you wouldn't choose the nForce2 then; it's not even for multiple processors.
Stability may be good, but an nForce2 or KT400 system with plain unbuffered RAM is more than stable enough for almost any kind of work besides the heavy-duty database, 24-hour, number-crunching, mission-critical stuff.
In any case, it certainly doesn't warrant your comment about never buying an nVidia chipset.
_You_ might never buy one because of what you need to do, but it's not good advice for everyone else, since they don't have the same requirements as you.
And frankly, a properly set-up nForce2 or KT400 system can handle number crunching just fine as long as you have decent parts to go with it and use conservative settings. You'll want the uber-reliable ECC stuff for lab conditions, though, I admit (and of course the nF2 isn't going to be able to use that).
-
Originally posted by mikhael
As for the 'Yeah cause we all use servers to surf, word process and play Halo', I have to say that, as an argument, is bollocks. Some people watch movies on TVs and some watch them on digital projectors. Why? Because we like better gear. You can use whatever gear you want to use. I stated my preference to avoid gear that I think, in my professional judgement, is wrong for the tasks to which I will put it.
If you want to use a projector instead of a TV, that's fine, but your earlier comment amounted to little more than saying that buying a TV instead of a projector was just wrong and that anyone who bought a TV was an idiot. That's what I took issue with.
Of course there is better hardware than the nForce2 boards on the market, but for the majority of people, paying extra for a non-integrated solution just isn't worth it.
-
I'm a consumer. I have a 'consumer board'. To be honest, I couldn't give a flying **** about the technology anymore - so long as it works and I can get paid using it, I couldn't care less if it's the best or worst.
There's more important things to argue about than this stuff... like footie.
-
Chrono, I actually have NO dual-processor machines, just uniprocessor machines with dual-processor chipsets. We used them at Cisco for some of the *nix boxes. I was very well pleased with their stability (especially under certain flavors of *nix).
Originally posted by karajorma
If you want to use a projector instead of a TV, that's fine, but your earlier comment amounted to little more than saying that buying a TV instead of a projector was just wrong and that anyone who bought a TV was an idiot. That's what I took issue with.
Of course there is better hardware than the nForce2 boards on the market, but for the majority of people, paying extra for a non-integrated solution just isn't worth it.
I'm a professional. It would be rather stupid of me to ignore my professional experience and judgement when buying gear for my house.
At no point did I say someone was stupid for using Nvidia chipsets. Read what is written, not what you think was written.
-
We keep forgetting that you're not really a "Gaming Geek" like the rest of us, mik. Sorry.
-
No. In general, I'm not a 'gaming geek'. I gave that up around the time I left high school.
-
Let's disassemble this a bit.
From ubermetroid
I only got the FX5600 because I got the Nvidia MB too...
and the response
Now see, there's something I would never do: Nvidia chipset motherboards. That's kind of like buying an Intel graphics card.
Up to this point you were doing just fine.
Then you added this
Dude, that's just wrong.
And hence implied that an nVidia chipset mobo is wrong, where in fact it's just not suitable for high-reliability purposes.
In any case, it was an interesting and valid comment in the wrong place, implying the wrong thing.
-
Originally posted by mikhael
I'm a professional. It would be rather stupid of me to ignore my professional experience and judgement when buying gear for my house.
At no point did I say someone was stupid for using Nvidia chipsets. Read what is written, not what you think was written.
Originally posted by mikhael
Now see, there's something I would never do: Nvidia chipset motherboards. That's kind of like buying an Intel graphics card. Dude, that's just wrong.
What more do I need to say? If you personally don't like NForce boards that's your choice. You have the right to choose whatever gear you want for your own use but to say it's just wrong to buy one because everyone should buy a more expensive board is just plain wrong.
An NForce board will give you plenty of bang for your buck. Most people don't do anything that will hit the limits of an NForce board, so seriously, what is the problem with your average home user buying an NForce board if all they're going to do is play games and surf with it?
-
I did not say everyone should buy a more expensive board. I very carefully DID NOT ESPOUSE any other board. I expressed my opinion about a specific chipset (a motherboard chipset made by a video chipset manufacturer) by comparing it to another chipset (a video chipset made by a motherboard chipset manufacturer).
I'm sorry if I offended your delicate sensibilities. I stand by my statements, and will neither retract nor modify them. If you have a problem with that position, I invite you to ignore me.