Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Dark_4ce on November 18, 2002, 03:04:58 pm
-
(http://www.nvidia.com/docs/IO/3648/SUPP/large03.jpg)
NOW THAT'S WHAT I'M TALKING ABOUT!
The GeforceFX!
The card I'm gonna upgrade to when it comes out in a couple of months!:D
Click here and drool! (http://www.nvidia.com/view.asp?PAGE=geforcefx) :eek:
-
I wonder how much time they spent tweaking that. :eek2:
Here's another link of interest (http://www.3dgpu.com/previews/geforcefx_3.php). Scroll down a bit. :)
EDIT:
A similar page (basically the same) @ nvidia.com (http://www.nvidia.com/view.asp?PAGE=geforcefx_games).
-
Wow.
But what's the price tag?
-
Originally posted by Thunder
Wow.
But what's the price tag?
Doesn't matter, so long as it pushes a Ti4400 down into a reasonable price range. ;)
-
Following standard nVidia policy, opening price will be anywhere between 300 and 500 bucks. Better off to wait, IMHO.
-
Amen, I've learnt my lesson. Besides - they're bound to have a second, more balanced edition out in six months' time.
-
Originally posted by Dark_4ce
(http://www.nvidia.com/docs/IO/3648/SUPP/large03.jpg)
This certainly won't make it to the Hot women thread.
-
It's about time nVidia released their new card. ATI's Radeon 9700 has been out for a while now and trounces the GF4. I guess they weren't joking when they said nVidia took on a lot of 3dfx when it went under. :rolleyes:
-
my humbled gpu
is id implementing any DX9 code into Doom 3?
-
Originally posted by mikhael
Doesn't matter, so long as it pushes a Ti4400 down into a reasonable price range...
I hear that!
BTW, my shady underworld contacts suggest that a card with 128mb is actually worse than an old-skool 64mb job, on the basis that the time it takes to search through the larger swap file negates and actually outweighs the benefits of lots of RAM. Discuss in not more than 200 words, and without using the letter E.
-
Originally posted by diamondgeezer
I hear that!
BTW, my shady underworld contacts suggest that a card with 128mb is actually worse than an old-skool 64mb job, on the basis that the time it takes to search through the larger swap file negates and actually outweighs the benefits of lots of RAM. Discuss in not more than 200 words, and without using the letter E.
d*p*nds on your bus lat*ncy for tranf*r, caching and y*r whol* m*mory organisation. In th*ory, a 64-MB card could b* fast*r, but probably mainly with piss-poorly design*d m*mory (128MB) organisations.
But, um.... I think wh*n you run out of on card Vid*o RAM you hav* to start transf*rring stuff in / out from th* physical storag* (CD/HD) - and that's a p*rformanc* kill*r.
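For anyone who wants numbers rather than a vowel-free answer: a rough back-of-envelope sketch of why running out of on-card video RAM hurts so much more than any "bigger memory is slower to search" effect. The bandwidth figures here are ballpark assumptions for 2002-era hardware, not measured values.

```python
# Back-of-envelope: time to move a 16 MB texture set, depending on where it lives.
# Bandwidth figures are rough 2002-era assumptions, not measured values.

def transfer_ms(megabytes, gb_per_s):
    """Milliseconds to move `megabytes` at a sustained rate of `gb_per_s` GB/s."""
    return megabytes / (gb_per_s * 1024) * 1000

textures_mb = 16
on_card = transfer_ms(textures_mb, 8.0)   # on-card DDR, ~8 GB/s (assumed)
over_agp = transfer_ms(textures_mb, 1.0)  # AGP 4x to system RAM, ~1 GB/s (assumed)

print(f"on-card:  {on_card:.2f} ms")   # a couple of milliseconds
print(f"over AGP: {over_agp:.2f} ms")  # an order of magnitude worse
```

Whatever tiny cost a larger on-card memory adds, it's nothing next to an order-of-magnitude slower path to system RAM once you spill.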
-
Hmm, that's just pictures. I want to see benchmarks and more.
Originally posted by IncendiaryLemon
my humbled gpu
is id implementing any dx9 code into doom3?
It's an OpenGL game... so no DX9. I don't know if it'll use advanced pixel/vertex shaders, though.
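As an aside on how that usually gets decided: an OpenGL game typically checks the driver's extension string at startup and picks a shader path from that. A minimal sketch - in a real program the string comes from glGetString(GL_EXTENSIONS); the sample below is hard-coded purely for illustration.

```python
# Sketch: how an OpenGL game can pick a shader path at startup.
# A real program would get the string from glGetString(GL_EXTENSIONS);
# the sample below is hard-coded purely for illustration.

def has_extension(ext_string, name):
    """True if `name` appears as a whole token in the space-separated list."""
    return name in ext_string.split()

sample = "GL_ARB_multitexture GL_NV_vertex_program GL_ARB_texture_env_combine"

if has_extension(sample, "GL_NV_vertex_program"):
    print("using the vertex-program path")
else:
    print("falling back to fixed-function")
```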
-
The D3 Alpha config file contained an option for using the nv25 and (separately) the nv30 instruction sets. I'm assuming it'll have some kind of implementation, and I wouldn't be surprised if nVidia had been working with id to ensure use of some of the FX's new features...
-
and the ****er still gets its ass handed to it by the R300
-
:nervous:
Proof?
-
Originally posted by Kazan
and the ****er still get's it's ass handed to it by the R300
(http://secondaryfusion.net/~psylent/sa_pics/FLAMEON.jpg)
-
the GeforceFX suffers from memory bandwidth problems [having nowhere near the bandwidth to support the features it has at any kind of performance] and it suffers from nVidia's unending heat problems - just look at the cooling system! it's pathetic!
the moment the GeforceFX hits the stores the next R300 revision will be out there - the current R300 is already ahead of it in memory bandwidth, and it produces less heat
these are just the evident differences
-
Have you got some links to tests carried out with the card? I tend not to trust people who take a look at a cards specs and say "she'll never manage such-and-such"...
-
the only specs out on the card are its theoretical maximums - and nVidia has a long sad history of having "theoretical maximums" in about the 120% to 130% of real-world performance range
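For what it's worth, those theoretical maximums come straight from clock speed times bus width. A quick sketch using the launch specs being quoted at the time (500 MHz/128-bit for the FX, 310 MHz/256-bit for the 9700):

```python
# Theoretical peak memory bandwidth = effective clock * bus width.
# Clock and bus figures are the launch specs being quoted at the time.

def bandwidth_gb_s(mem_clock_mhz, bus_width_bits, ddr=True):
    pumps = 2 if ddr else 1              # DDR transfers data twice per clock
    bytes_per_clock = bus_width_bits / 8
    return mem_clock_mhz * 1e6 * pumps * bytes_per_clock / 1e9

print(f"GeForce FX  (500 MHz, 128-bit): {bandwidth_gb_s(500, 128):.1f} GB/s")
print(f"Radeon 9700 (310 MHz, 256-bit): {bandwidth_gb_s(310, 256):.1f} GB/s")
```

The 256-bit bus is why the 9700 comes out ahead on paper despite its much lower memory clock.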
-
oh.. tom's hardware has specs
-
Originally posted by Kazan
the GeforceFX suffers from memory bandwidth problems [having nowhere near the bandwidth to support the features it has at any kind of performance] and it suffers from nVidia's unending heat problems - just look at the cooling system! it's pathetic!
the moment the GeforceFX hits the stores the next R300 revision will be out there - the current R300 is already ahead of it in memory bandwidth, and it produces less heat
these are just the evident differences
(http://secondaryfusion.net/~psylent/sa_pics/fabulous.jpg)
How bout waiting until the two are properly compared side by side before passing judgement, huh?
-
who is this binary asshat with the images
-
Kazan, cut the bad language - you don't have to prove anything here anymore.
-
Originally posted by Kazan
who is this binary asshat with the images
Have we learnt a new dirty word? Did I touch upon a nasty raw nerve?
How about you read what I put under the image - about not passing judgement on the card until it has been properly tested.
-
It's alright 01, I like the piccies. Post some more, why not...
-
Originally posted by diamondgeezer
Its alright 01, I like the piccies. Post some more, why not...
I just found it amusing, but yeah, nVidia's website has more; there's one particular outdoor scene which I'm sure is a real photograph.
(http://www.nvidia.com/docs/IO/3650/SUPP/large01.jpg)
-
The truck rusts progressively in that demo. Stepping from pristine to impressively decayed.
-
Originally posted by Thunder
Have you got some links to tests carried out with the card? I tend not to trust people who take a look at a cards specs and say "she'll never manage such-and-such"...
http://www.tomshardware.com/graphic/02q4/021118/index.html
Happy now?
-
Yes thanks, I like to be able to make my own mind up as opposed to going on word of mouth :)
-
Here's another (http://www.beyond3d.com/previews/nvidia/nv30launch/).
Here's an older article (http://www.beyond3d.com/articles/nv30r300/), based on what was known about NV30 a few weeks (months?) ago. Interesting read, but remember there's a good deal of speculation.
Source: Beyond3D (www.beyond3d.com)
-
So far it looks quite impressive, but I will suspend any judgement on this until we get some actual benchmark results. Of course, the raw fill rate is probably the most important factor for me; given a choice between extra fps and better quality, I almost always take the latter. :D
However, is there much point in getting this if the rest of your hardware is not top-of-the-line as well, i.e. how dependent is it on other hardware? (I only have an Athlon 1400 TB, 512mb 266mhz DDR and an AGP4x motherboard)
Also, the card is apparently double-sized, taking up two card slots on your computer case due to the unusual cooling mechanism.
-
That's right, it is a bit worrying that it runs so hot as to need that arrangement.
I'm going to wait on the next revision of the card I think, there simply aren't that many games that are good enough to make it worth paying so much for. I'd sooner upgrade my RAM and CPU... perhaps a new HD as well. I'd still have money to spare...
-
I'd wait until a Mod2 version is out, something that requires less cooling. I could put one of these in my machine (since my second monitor died, that Matrox card is doing nothing anyway), but my machine runs hot as it is. I'm not looking to put yet more forced air or worse, a water cooler, on my box just for prettier graphics.
Maybe I'll save my money and skip the GF4 and the FX and go straight to a Wildcat. ;)
-
[newb]
Wildcat?
[/newb]
-
Originally posted by Thunder
[newb]
Wildcat?
[/newb]
As wEvil might point out, the Wildcat is a real graphics card (http://www.3dlabs.com/product/wildcat4/index.htm). :D
Oh, not to p1mp, but I updated the Mjolnir pics, Thunder. ;)
-
Is it any good at games? Also, what is the price?
-
Originally posted by CP5670
Is it any good at games? Also, what is the price?
Games? Buy a cheap game card for games. Buy a Wildcat for 3d visualisation. As for price? The most recent Wildcat is more than I paid for my entire computer. :D
-
I am also saving money for such a card. wEvil and I have talked about it many times :)
Great cards for 3D apps but not for gaming.
-
Games? Buy a cheap game card for games. Buy a Wildcat for 3d visualisation. As for price? The most recent Wildcat is more than I paid for my entire computer. :D
Sounds rather useless to me then; the only non-game 3D visualization I am into is 3D surface plots. :p Does it improve processor floating-point performance any? (faster Mathematica calculations :D)
-
So, this is the new NV30 huh?
I've been saving a lot of money for this.....
-
Originally posted by mikhael
the Wildcat is a real graphics card (http://www.3dlabs.com/product/wildcat4/index.htm)
:shaking:
-
Meh, I'm still holding out for one of these--
(http://www.lamerkatz.com/stories/images/*****in_small.jpg)
-
LOL! I remember seeing that thing! :D
-
Originally posted by mikhael
the Wildcat is a real graphics card (http://www.3dlabs.com/product/wildcat4/index.htm). :D
:eek2: holy goat-pee!!!! This card is awesome!
-
Originally posted by CP5670
However, is there much point in getting this if the rest of your hardware is not top-of-the-line as well, i.e. how dependent is it on other hardware? (I only have an Athlon 1400 TB, 512mb 266mhz ddr and a AGP4x motherboard)
I'd bet it needs an 8x AGP to get anywhere near full power... of course, I don't think there are many games nowadays that need even a GF3 to run, so I really don't see the point of nVidia releasing the FX as opposed to spending a few months more optimising the GF4 so it takes less space, makes less heat, etc.
-
Originally posted by aldo_14
I'd bet it needs an 8x AGP to get anywhere near full power... of course, I don't think there are many games nowadays that need even a GF3 to run, so I really don't see the point of nVidia releasing the FX as opposed to spending a few months more optimising the GF4 so it takes less space, makes less heat, etc.
Take a look at the game list for the FX; if that continues to grow then that card will be worth getting. And yes, I'd say you need 8x, but if you can afford the FX then you're gonna afford a new m/board anyway!
-
I'd bet it needs an 8x AGP to get anywhere near full power... of course, I don't think there are many games nowadays that need even a GF3 to run, so I really don't see the point of nVidia releasing the FX as opposed to spending a few months more optimising the GF4 so it takes less space, makes less heat, etc.
Well, that's what I have at the moment (GF3), but I am kind of in two minds on whether I should upgrade when this next generation of cards is released.
Although the article at Anandtech said that there is very little difference between 4x and 8x AGP with today's games, and that other factors are much more of a bottleneck. (in my case, it would probably be the processor)
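The arithmetic behind that Anandtech point is easy to check: AGP bandwidth scales off a 66 MHz, 32-bit base rate, and even 4x gives you plenty of data per frame. A rough sketch (figures approximate):

```python
# AGP bandwidth per speed grade: the base rate is a 66 MHz, 32-bit (4-byte) bus.
# Figures are approximate (the real clock is 66.66 MHz).

def agp_mb_s(multiplier):
    return 66.0 * 4 * multiplier  # MB/s

for mult in (1, 2, 4, 8):
    per_frame = agp_mb_s(mult) / 60  # MB available per frame at 60 fps
    print(f"AGP {mult}x: {agp_mb_s(mult):.0f} MB/s, ~{per_frame:.1f} MB per frame at 60 fps")
```

If a game only streams a few MB of textures and geometry across the bus each frame, both 4x and 8x have headroom to spare, which is why the measured difference between them is so small.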
-
Originally posted by CP5670
Well, that's what I have at the moment (GF3), but I am kind of in two minds on whether I should upgrade when this next generation of cards is released.
Although the article at Anandtech at least said that there is very little difference between 4x and 8x AGP with today's games and other factors are much more of a bottleneck. (in my case, it would probably be the processor)
or perhaps memory speed, drive access time, sound decodingy stuff, m/board connections.... :blah:
-
Which is better? View comparisons of graphics cards. (http://www.bestbuy.com/compare.asp?tp=7&title=&m=488&cat=521&scat=522&b=0&ta=&txtcount=25&T0=11090904&1=On&T1=11176475&T2=11183718&T3=11180024&4=On&T4=11101115&T5=11161710&6=On&T6=11101114&Remove.x=111&Remove.y=16)
-
(http://www.muropaketti.com/uutiskuvat/2002/1119fx_1.jpg)
Some hardware comparisons... But one thing seems a tad odd: they claimed it to be 4 times faster than the GF4!! This comparison shows it's only 1.5 to 2 times faster! Ok... So now that I'm over the initial shock, I'm gonna wait until there are some PROPER comparison tests between the Radeon 9700 and the GeforceFX. I'm keeping my money in my pocket until I'm REALLY sure what its actual performance is...
-
Originally posted by vyper
Take a look at the game list for the FX; if that continues to grow then that card will be worth getting. And yes, I'd say you need 8x, but if you can afford the FX then you're gonna afford a new m/board anyway!
Yeah, but those will only be games that support FX features... they obviously won't require one, because that'd be financial suicide. Incidentally, UT2003 had its maximum detail levels removed because no current GFX cards supported them..... maybe FX / Radeon 9700 will change that.
But, to be honest, I don't think even a GF3 will be min spec for at least another year.