Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Gloriano on October 06, 2003, 10:45:58 am
-
A thread at Gearbox Software's forums claims that nVIDIA has been banned from future DirectX developments:
Nvidia's NV38 has been dubbed a substandard card by Team DX. This means that DX will not include NV in its development range for DirectX 10. Team DX made the decision "as a favor to the graphics industry". Team DX claims that NV violated their partnership agreement by changing the DX9 code with their latest set of drivers, as caught by X-bit Labs recently. This violates the licensing agreement and compromises DX's quality in order to make it seem as if ATi and NV cards alike display the same image quality. This can only be fixed by reinstalling DX9b. So by all means, do not download the Detonator 50 drivers!!!
//www.teamxbox.com
here (http://www.gearboxsoftware.com/forums/showflat.php?Cat=&Board=technical&Number=25270&page=&view=&sb=&o=&fpart=1&vc=1) (original thread link added)
-
ruh-roh[/scooby-doo]
-
Ouch... the original thread's here (http://www.gearboxsoftware.com/forums/showflat.php?Cat=&Board=technical&Number=25270&page=&view=&sb=&o=&fpart=1&vc=1), btw.
-
Well if they have been cheating on their benchmarks and generally buggering up DX just to make their stuff look better then they deserve a penalty.
I wonder if MS will remember that they did virtually the same thing to Sun with Java though?
-
This is the sort of thing that can make me switch to ATI.
-
I might have been using an ATI now if they hadn't been sold out when I went to buy the card ;)
-
I heard from a friend that NVIDIA actually quit producing graphics cards because they lied about their latest model being DX9 compatible. This was proved thanks to the HL2 benchmark, which showed that NVIDIA's latest card wasn't DX9 compatible, unlike ATI's cards. So bye bye NVIDIA and hello to ATI's new technologies. I knew it wasn't a mistake to buy an ATI card. NVIDIA really acted like a b***** by signing contracts with game companies so that only their video cards would show the best performance. I mean, how g** is that? They deserve this and probably even worse. :yes:
-
do they do anything other than g/cards ?
-
They make pretty nice chipsets for motherboards. Those of you with an NForce 2 board will know what I mean :) They do a fair few other things too IIRC.
-
They do the nforce2 sound chip for motherboards, IIRC.
Although I'm not sure I'd trust a set of forums as an entirely trustworthy news source... I mean, pretty much everyone you find online has their own agenda, so I'd be inclined to wait for official confirmation.
Of course, if it was true, it could kill nVidia. And piss me off, because I chose their cards since the games under the 'Play the Game Programme' are supposed to work much better on nVidia than ATI cards, which was a large reason for me choosing the FX5600 over the Radeon 9600.
-
I personally prefer nVidia cards because of their price/performance ratio. 16-year-old kiddos like me without 500 bucks must aim at low-end cards - and that's where the FX5200 series comes in.
Have you any idea how much it's possible to overclock that chip? And it costs about 80-85 bucks!
-
[q]Have you any idea how much it's possible to overclock that chip?[/q]
No, and I have the 5600 so I'd like to know ;)
-
Originally posted by vyper
[q]Have you any idea how much it's possible to overclock that chip?[/q]
No, and I have the 5600 so I'd like to know ;)
Well, the default settings (for the FX5200 Ultra) are 325 MHz for the GPU and 650 MHz for the memory. I've managed to get the core frequency to 400, which is a bit better than the Radeon 9800 Pro, and the memory to 750, which also exceeds the 9800 Pro chip. I'm pretty sure I could get those two even higher, but I don't wanna put too much pressure on it.
So, the only thing it lacks is that magical "horsepower", which comes from the design I suppose.
-
Heh, you guys should read the thread. They basically call him a liar, and allegedly he has a history of nVIDIA bashing.
So stop running around like beheaded chickens.
-
Originally posted by Ulundel
I personally prefer nVidia cards because of their price/performance ratio. 16-year-old kiddos like me without 500 bucks must aim at low-end cards - and that's where the FX5200 series comes in.
Have you any idea how much it's possible to overclock that chip? And it costs about 80-85 bucks!
No! That's where my Ti4200 falls! Beats the 5200 hands down, and all it lacks is DX9 support :)
And if MS was actually dumping nVidia (which it won't because nVidia still has a sizable portion of the market share), it would be a bad thing. The greater the number of competitors, the better it is for the industry.
-
It's a pity the OpenGL ARB didn't take the same stance - that way games etc. developed on the OpenGL API would actually run on cards made by the people that DESIGN OpenGL.
What's the point in screwing with a standard to commercially jump the gun? It's a standard so you don't get five billion screwy little extensions that don't work on half the hardware out there.
nVidia had this coming for a long time. I hope they learn to play fair, or someone does a 3dfx on them.
-
Is this the start of a long road to Nvidia's fall, à la 3DFX? ;)
-
What happened to 3DFX, really? All I know is that one day, nVidia bought up 3DFX and poof, no more Voodoo cards.
-
Basically, yes. 3DFX was losing money, nVidia bought it, and had the employees design GeForce cards.
-
It's little wonder that the next Xbox will have an ATI graphics chipset... :nod:
nVIDIA has had a major falling-out with Microsoft, it seems...
-
No, because nVIDIA will hijack the Vast Right-Wing Conspiracy's orbital mind control lasers.
-
Originally posted by Shrike
No, because nVIDIA will hijack the Vast Right-Wing Conspiracy's orbital mind control lasers.
nah... they'd have to beat microsoft to it first :p
-
Originally posted by Turnsky
nah... they'd have to beat microsoft to it first :p
That's what you think.
-
Originally posted by Shrike
That's what you think.
:drevil:
-
Hm... Aside from the last bit, this is all interesting...
I was trying to decide between a GF FX 5200 and a Radeon 9200, now there's nothing to worry about. :D
-
For the last time: that's an entirely unconfirmed report from someone who is reportedly full of BS. If you use that as your criteria for choosing a card, well.....
-
Well? Weren't you all saying that the Radeon was better than the 5200 anyway?
They're the same price, if anyone wants to enlighten me, I'm listening.
-
I use a Ti4200, myself. I'm not getting another gfx card for a couple years.
-
my 9000 hasn't let me down yet... i'm thinking of getting another ATI card.. since i know that they're /reasonably/ reliable... just go with what you know.. :)
-
I also use a Ti4200 card - trouble-free and keeps up with everything I want to play. Shrike has a very good point: one suspect source does not count as researching the products.
H
-
Originally posted by Hellbender
I also use a Ti4200 card - trouble-free and keeps up with everything I want to play. Shrike has a very good point: one suspect source does not count as researching the products.
H
Better the devil you know than the devil you don't.
Meaning... if you know an ATI card works great for you, and know them pretty well, etc., get an ATI card... if you know an nVIDIA card works for you, use that. Or, if you wanna switch over from ATI to nVidia, or vice versa... it's up to you.
It all boils down to one thing... choice.
It's your money that you're spending... choose what's best for your needs and price range... :nod:
And don't fall for the evils of rumour and hearsay... I only mentioned the next Xbox having an ATI chipset because I /read/ that in a magazine, and it has been confirmed officially...
-
/me wants a new vid card to replace his GeForce2 MX for under $100
-
Originally posted by GalacticEmperor
/me wants a new vid card to replace his GeForce2 MX for under $100
Try looking at a Radeon 9000 (or the pro variant) they offer "bang for your buck" in my opinion
-
Monopoly, anyone???
-
Originally posted by mikhael
This is the sort of thing that can make me switch to ATI.
**** I been done did that lol, not many complaints so far though!
-
I hope that is false. If it is true and Nvidia goes under because of it, ATI will become a monopoly in the gaming video cards industry and that could be devastating. Competition between companies is important and something you don’t want to lose.
-
/*goes of to invest in ATI stock*/
-
Originally posted by Shrike
For the last time: that's an entirely unconfirmed report from someone who is reportedly full of BS. If you use that as your criteria for choosing a card, well.....
:nod: I agree - there's nothing official, so it's only rumour.
-
[Q]Originally posted by Shrike
Heh, you guys should read the thread. They basically call him a liar, and allegedly he has a history of nVIDIA bashing.[/q]
He speaks truth. ;) Check out the thread - it's long, but worth it if you want to get to the bottom of this.
Originally posted by Shrike
So stop running around like beheaded chickens.
Try telling that to a beheaded chicken. ;)
-
Originally posted by Krackers87
Monopoly, anyone???
OK, but I wanna be the boot. ;)
:nervous:
-
Originally posted by pyro-manic
OK, but I wanna be the boot. ;)
:nervous:
[Grumble] I never get to be the boot... [/Grumble]
-
Boot? I'll see your boot and raise you a battleship :D
-
Originally posted by Jonathan_S47
I hope that is false. If it is true and Nvidia goes under because of it, ATI will become a monopoly in the gaming video cards industry and that could be devastating. Competition between companies is important and something you don’t want to lose.
Actually, VIA is in the process of releasing a decent video card (DeltaChrome) which should compete well in the mid-range market, and XGI has started producing the Volari video cards, which should compete at all levels (with the V5, V8, V5 Duo, and V8 Duo). That will make three large video card presences with good cards in all price brackets (nVidia, ATI, and XGI) and four companies making mid-range cards. This will pad the market in the event of one of the two giants collapsing.
-
I'll keep the MX440 for now to be honest, I haven't come across many problems with it at all. If it becomes a problem I'll replace it :D
As for Nvidia doing dodgy things as far as providing information is concerned... isn't DirectX a MICROSOFT product??
Flipside :D
-
If Microsoft threw stones, it'd be covered in glass......
-
Originally posted by Flipside
I'll keep the MX440 for now to be honest, I haven't come across many problems with it at all. If it becomes a problem I'll replace it :D
You'd be surprised. I was quite content with my MX440 until I found a Ti42 for 60 squids and made the swap. The difference was blatantly noticeable, 'specially where fancier stuff like particles and smoke were concerned. Course, having paid 60 squids for it, it died and took my CPU with it. But that gave me an excuse to get a 4800... now that's *****in' fast :nod:
-
I wish that the Ti4200 was a bit faster with particle rendering; it's the only problem I really have with it (other than the fact I went for the 64MB version). Mine is quite nice though: it overclocks quite well (up to 308/614) and has BGA RAM, an unusual thing to find in a Ti4200.
-
Originally posted by diamondgeezer
You'd be surprised. I was quite content with my MX440 until I found a Ti42 for 60 squids and made the swap. The difference was blatantly noticeable, 'specially where fancier stuff like particles and smoke were concerned. Course, having paid 60 squids for it, it died and took my CPU with it. But that gave me an excuse to get a 4800... now that's *****in' fast :nod:
ROFL - Well, yes, my Bro's Radeon thingy (can't remember the number) wees all over mine when there are lots of bitmaps and particles going on, ATI certainly have the edge there ;)
Still, playing HW2 earlier and it (the GeForce) was getting jerky on 800 x 600!! That's very very nearly a decent excuse....... :D
Flipside :D
-
What GeForce do you have, Flipside? If it's a GF4 MX, then you can't expect good framerates out of it since it's really a GF2 MX with HT&L. And if it's a GF4 Ti, then your computer is screwed up.
-
Originally posted by pyro-manic
OK, but I wanna be the boot. ;)
:nervous:
:lol:
That elicited a laugh out of me. Well done. :D
-
Originally posted by karajorma
They make pretty nice chipsets for motherboards. Those of you with an NForce 2 board will know what I mean :) They do a fair few other things too IIRC.
nForce2 is a decent chipset (best the Athlon has ever seen) but the drivers, particularly in the IDE area, need a lot of work.
-
Originally posted by Grey Wolf 2009
What GeForce do you have, Flipside? If it's a GF4 MX, then you can't expect good framerates out of it since it's really a GF2 MX with HT&L. And if it's a GF4 Ti, then your computer is screwed up.
Yep, it's an MX, normally it behaves perfectly well, but I must admit, reading my way through these posts is slowly convincing me that an upgrade would certainly be in order soon ;)
Flipside :D
-
For the last time: that's an entirely unconfirmed report from someone who is reportedly full of BS. If you use that as your criteria for choosing a card, well.....
nVidia was caught red-handed cheating big time in at least one benchmark.
nForce2 is a decent chipset (best the Athlon has ever seen)
It's most certainly better than the nForce 1 was.
-
Originally posted by Kosh
nVidia was caught red-handed cheating big time in at least one benchmark.
And ATI hasn't done it too? Everyone cooks their books. Get a card that works for you; don't listen to some half-assed babble you hear on the net.
-
ATI and Nvidia were both caught cheating on 3DMark03.
[Link] (http://www.xbitlabs.com/news/video/display/20030523152718.html)
-
Can't even get a graphics card I can trust, these days. :sigh:
-
Originally posted by Setekh
Can't even get a graphics card I can trust, these days. :sigh:
Bah... wait for an independent test ;)
-
I think the tests at Tom's Hardware Guide are independent enough.
-
Tom's is crap. Read their A64 review to see what I mean. They're the only hardware site to review it and not give it a good score. They also used better timings and better RAM on the P4 than on the A64 3200+, where there was no reason to do that whatsoever. A very biased review.
-
Voodoo!
-
S3 Virge!
(be glad if you don't know what it is... ;) )
-
Urgh....cough....cough....
I did...