Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Turambar on August 19, 2004, 07:08:47 pm
-
this thing is so amazingly powerful, i haven't been able to make it stutter or slow at all with any game I own. FS runs great and somehow looks better too.
I'm waiting for the next amazing breakthrough now
-
Die.
-
Originally posted by vyper
Die.
-
Originally posted by vyper
Die.
Painfully!! :mad:
-
I don't think I've done this for you guys yet
ahem *cough*
Mwahahahahahaahahahahahahahahahahaahahahah!!!!!!
-
[color=66ff00]I can happily say enjoy; probably only because I've been saving for a bit and can afford a monster system if I so wished. ;)
[/color]
-
You can have your 6800 GT. I'm not jealous. Why? Because I:
- have a sweet 9600 XT which runs excellently and cost me a fraction of what you spent
- Just bought a kick ass spoiler for my car (Which has been repaired and will get painted in a few weeks)
- Still have enough money to buy both sweet rims and tires for my car, and a new processor and memory upgrade for my PC.
So enjoy. ;) :p
-
GeForce cards suck anyway. Radeon 0wnage.
-
Most people don't give a rat's patootie about 3-4 percentage points, Lightspeed.
-
Originally posted by vyper
Die.
Shinu ("Die" in Japanese)
Crkni ("Die" in Croatian)
Umri ("Die" in Serbian)
Dö ("Die" in Swedish)
-
Originally posted by Liberator
Most people don't give a rat's patootie about 3-4 percentage points, Lightspeed.
Maybe they give a damn about considerably higher energy consumption, a considerably worse card layout, a considerably worse thermal solution....
And... additionally to that.... it's nVidia *shudders* (I know I'm biased :p )
-
I don't want to hear that from someone who was using a 3dfx Voodoo while everyone else had something that was using an AGP slot.
Oh yeah, I forgot something important.
nVidia sucks.
-
Why, my Voodoo rocked. I could even play Freelancer half-decently with it!
3dfx owned. Sad that they got left behind, actually.
-
Not to mention the fact that some of us were too stupid back then to get (that's right, "get", not "build") a computer with an AGP slot. BTW, I upgraded from a Voodoo 3 2000 16MB to an nVidia GeForce 4 MX 420 64MB.
-
My Voodoo was an AGP-card. :D
-
I can sense jealousy of the poor. And I don't need jedi powers for that. ;)
-
Just bought a kick ass spoiler for my car (Which has been repaired and will get painted in a few weeks)
lol. i'll bet you can go at high speeds with greatly improved traction and handling now
Maybe they give a damn about considerably higher energy consumption, a considerably worse card layout, a considerably worse thermal solution....
if the card sucks so much, why are so many people buying it, when they could get, for more or less the same price, the Radeon X800... which is obviously superior ;)
-
Lightspeed's right about the card layout. on my new 9800 Pro, the card is about an inch shorter in length than my old Geforce 4400 card. Plus, having to shell out money for another power supply along with a 6XXX series card just makes me shudder. I lost a piddly modem due to a power brownout that happened a while back. With that much power running through that card, imagine what could happen if there was a brownout/spike/whatever.
Especially after spending the major dough for that card, to see it fry. *SHIVER*
-
As I understand it, the card doesn't actually need all that much more power. It just needs more than the AGP slot can provide. PCI-E has addressed this, but you'd have thought they would have seen the power needs coming and beefed up the standard ahead of time. Regardless, nVidia's recommended PSU size is there to ensure a stable current flow to the card, like it gets from the bus.
-
i've heard it's more power demanding, and a lot of people that get it get new power supplies too.
i still think it's funny that a video card has more RAM and draws more power than the rest of the computer ;) :D
-
Turambar, nice, but is it paired with a fast CPU and good memory?
Fine, the NV40 layout sucks... so why is it running anywhere from 5 to 25% faster in most games?
Also, a regular X800 would suck, with only 12 pipes, while a GT is overclockable to Ultra speeds with the standard cooler; and no one runs the standard cooler anyway, you run something better.
-
You'll have to put in TWO separate molex connectors (i.e. not on one string), and nVidia puts it as nicely as "we recommend a 480 W power supply".
The X800 PE drains LESS power than the 9800 Pro, needs one connector, and thus no special power supply either. That's life, huh?
-
Originally posted by Lightspeed
You'll have to put in TWO separate molex connectors (i.e. not on one string), and nVidia puts it as nicely as "we recommend a 480 W power supply".
The X800 PE drains LESS power than the 9800 Pro, needs one connector, and thus no special power supply either. That's life, huh?
Most new power supplies come with 5 or more molex connectors; that should be enough for your needs. And while nVidia recommends 480W, the 6800 Ultra has been demonstrated to run on a 350W PSU in a full rig, and on 300W with a bare-bones system.
The X800 PE at the price of an Ultra is not worth it, especially since a GT, $100 cheaper, can beat it.
So what if it drains less than a 9800? Are you lacking power? Again, with PCI-E the GeForce only needs 1 connector, putting it on the same level as the Radeon.
While I like Radeon cards, and currently own one, I freely admit that unless upcoming games like HL2, The Sims, Rome, and further titles like STALKER and FEAR favour ATI in the same way that Doom 3 favours nVidia (I won't get in-depth here), there is no way ATI can claim it has 'won' this round.
Now, if the current situation continues through the card refresh and ATI pulls an nVidia and suddenly is totally competitive, then we will be back to the 9xxx vs. FX situation, only with roles reversed.
-
'cos a 480W PSU is so much hassle.... :wtf:
-
Originally posted by vyper
'cos a 480W PSU is so much hassle.... :wtf:
See above; a good 350 should serve you well if the computer isn't loaded with more stuff, 400 in that case.
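To put rough numbers behind the "a good 350 should serve you well" advice, here's a back-of-the-envelope power-budget sketch for a 2004-era rig. Every wattage figure below is an illustrative assumption, not a measured spec:

```python
# Back-of-the-envelope PSU sizing, in the spirit of the advice above.
# All component wattages are illustrative guesses, not measured specs.
component_watts = {
    "CPU": 90,
    "motherboard + RAM": 50,
    "GeForce 6800 under load": 110,
    "hard drives + optical": 40,
    "fans + misc": 20,
}

total = sum(component_watts.values())   # estimated peak draw in watts
recommended = total * 1.3               # ~30% headroom for load spikes
print(f"Estimated draw: {total} W, suggested PSU: {recommended:.0f} W")
```

With these guesses the whole box peaks around 310 W, so a quality 350-400 W unit has headroom; the gap up to nVidia's blanket 480 W recommendation is the margin for optimistically rated PSUs.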
-
Running hardware on a power supply that doesn't match the hardware's specifications is not too good an idea.
Other than that, the power drain is the cause of a lot of problems. Heat, for example.
-
Originally posted by Lightspeed
Running hardware on a power supply that doesn't match the hardware's specifications is not too good an idea.
Other than that, the power drain is the cause of a lot of problems. Heat, for example.
Actually, nVidia admitted it could run safely at 350W, and that the 480W figure was only precautionary, since there are many PSUs that, while claiming to provide 480W, are actually far weaker.
On heat, as Xbit proved, you can use a GeForce 6800 GT's stock cooler and still surpass the Ultra in speed with no heat problems; the cooler is excellent, and most brand cards have far better coolers.
-
Originally posted by Stealth
lol. i'll bet you can go at high speeds with greatly improved traction and handling now
With my automatic, non-V-TEC Integra? What are you smoking? :p
I didn't say 'Hug3 r34r w1ng!!!1! Liek F4st a|\| Furiuss!!' :p I got the factory OEM style Type-R rear spoiler. Because it looks good.
I'm not a ricer so :booty: to you. :p
-
yeah i know, i was sarcastically humoring you with that statement ;)
OK, as long as it's the factory spoiler, and not one of those two-foot-high spoilers that's supposed to look cool ;)
-
Originally posted by Raa
You can have your 6800 GT. I'm not jealous. Why? Because I:
- have a sweet 9600 XT which runs excellently and cost me a fraction of what you spent
- Just bought a kick ass spoiler for my car (Which has been repaired and will get painted in a few weeks)
- Still have enough money to buy both sweet rims and tires for my car, and a new processor and memory upgrade for my PC.
So enjoy. ;) :p
Die
-
Four months back I spent 4,000 € on two computers (I think that's 4,928.99 USD) and last month 5,000 € on a new car (6,161.24 USD). The car costs 15,500 € in total (19,099.84 USD).
It is good to have a job that pays reasonably. :p
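As a quick sanity check on the figures above, here's a sketch using the EUR→USD rate implied by the first conversion (the exact rate is inferred from the post, not taken from a quoted source):

```python
# Sanity-check the EUR -> USD conversions above, assuming the exchange
# rate implied by the first figure (4,000 EUR = 4,928.99 USD).
rate = 4928.99 / 4000  # ~1.2322 USD per EUR (roughly the mid-2004 rate)

def eur_to_usd(eur):
    """Convert euros to dollars at the implied rate, rounded to cents."""
    return round(eur * rate, 2)

print(eur_to_usd(5000))   # 6161.24 -- matches the car payment figure
print(eur_to_usd(15500))  # 19099.84 -- matches the car's total price
```

All three quoted dollar amounts are consistent with a single exchange rate, so the arithmetic in the post checks out.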
-
what car was it?
-
Originally posted by Stealth
what car was it?
yep, i wanted to ask the same question. :ha:
-
Originally posted by Turambar
this thing is so amazingly powerful, i haven't been able to make it stutter or slow at all with any game I own. FS runs great and somehow looks better too.
I'm waiting for the next amazing breakthrough now
HAH! I'm getting the Ultra as soon as it comes out, possibly even the Ultra Overclocked version :P
-
Originally posted by Krackers87
HAH! I'm getting the Ultra as soon as it comes out, possibly even the Ultra Overclocked version :P
What a waste of money; an Ultra is already just an overclocked GT, and with the standard GT cooler you can easily reach Ultra speeds and beyond.
-
Originally posted by Stealth
what car was it?
Nissan Micra. Good and economical car for one person. :)
-
and i should care that you spent a fortune on a computer component that is gonna SUCK in 6 months? i can barely make stuff stutter on my system too, and i have an R9600XT ....
-
All computer components (more or less) suck in 6 months - that's just how it is.
-
Originally posted by Kalfireth
All computer components (more or less) suck in 6 months - that's just how it is.
Not suck, just not state of the art. The nice thing about the 6800 is that it's quite advanced for now, SM3.0 and all that, and if you buy something upgradeable, you've got a clear road.
-
Yeah, I kinda meant "suck compared with the state of the art", y'know what I mean :)
-
Originally posted by Kalfireth
Yeah, I kinda meant "suck compared with the state of the art", y'know what I mean :)
Aha, well, for GPUs 6 months out of date would be about right for the card refresh, but it's not sucky; my Radeon 9700 can often beat the 9800.
For other stuff, like RAM, if you were lucky and got 2-2-2 RAM you are better than the state of the art ;).
-
When buying new computers, it does not make sense to buy mid- or low-end hardware. When you're buying, buy high-end hardware. It lasts longer. Any hardware you buy today will get outdated sooner or later; it's an unwritten law, so I am not going to worry about it. And I don't regret the money I spent on my two computers, just as I don't regret what I spent on three previous computers. Low- or mid-end hardware is just for computer upgrades, not for new computers. That's my motto on this matter. :)
-
I go for cash-specific choices - enables you to pick up real gems that only got shunted out of the big-buck range because the next big thing hit the market (case in point: The Radeon 9800 Pro). These days I can't afford £300+ every time a new-gen graphics card hits the market.
-
buy high-end hardware
True. But I always skip one card generation (now I have a 9800 XT; next will be an ATI 500-series card)
-
the two molex connectors are supposed to be on DEDICATED lines too - i.e. each line normally has two molex connectors on it -- but you're supposed to dedicate an entire _line_ to each connector on the card -- i.e. you're actually using up 4 connectors
people _think_ the nVidias are superior because they crank slightly higher FPS - at a cost of massive amounts of image quality
-
Originally posted by Kazan
the two molex connectors are supposed to be on DEDICATED lines too - i.e. each line normally has two molex connectors on it -- but you're supposed to dedicate an entire _line_ to each connector on the card -- i.e. you're actually using up 4 connectors
people _think_ the nVidias are superior because they crank slightly higher FPS - at a cost of massive amounts of image quality
Where the hell did you get that? No PSU I've seen has 2 ends on 1 line; each line is dedicated to one end, and as I have said above, you can run it safely on 350W, or 400W if you want to play it safe.
The 4-connector thing is just BS.
People think nVidia won this round because it has superior FPS in all next-gen games, and finally image quality that is equal to ATI's.
Find me one picture from a reliable source (i.e. anandtech.com, Xbitlabs.com, Tomshardware.com, Firingsquad.com and more) that shows that the NV40 has bad image quality.
-
um, every PSU I have ever seen has _two_ connectors on each set of power lines
2 lines x 2 connectors = 4 connectors
i.e. you're supposed to dedicate _LINES_
um, the pictures they're going to be showing are going to be nVidia-advertisement images -- i.e. properly optimized to hide errors
there is a reason why all the high-knowledge people here are against nVidia
-
Originally posted by Kazan
um, every PSU I have ever seen has _two_ connectors on each set of power lines
2 lines x 2 connectors = 4 connectors
i.e. you're supposed to dedicate _LINES_
um, the pictures they're going to be showing are going to be nVidia-advertisement images -- i.e. properly optimized to hide errors
there is a reason why all the high-knowledge people here are against nVidia
Okay, since I'm not that knowledgeable on PSUs I'll pass on that, though mine isn't built like that.
Those pictures are taken each time by the people working at those sites, and they aren't nVidia ads, just regular screenshots.
EDIT: here are some very basic pictures, both cards with optimizations disabled.
geforce IQ (http://www.xbitlabs.com/misc/picture/?src=/images/video/graphics_cards_1H-2004/halo_6800_true.jpg&1=1)
X800 IQ (http://www.xbitlabs.com/misc/picture/?src=/images/video/graphics_cards_1H-2004/halo_x800_true.jpg&1=1)
-
Yeah, and my name is Queen of England too.
-
Originally posted by BD
Yeah, and my name is Queen of England too.
What is that supposed to mean? That every single professional tech site on the net is in nVidia's pay? Each with different IQ comparisons?
Sorry if I come off hostile, that just ticked me off.
-
First thing I would like to draw your attention to:
cooling systems
Nvidia 6800 http://graphics.tomshardware.com/graphic/20040414/images/card-overview.jpg
ATI X800
http://graphics.tomshardware.com/graphic/20040504/images/card01.jpg
the profile of the cooling system tells you a lot - ATI's NEWER, HIGHER-PERFORMANCE X800 board requires less cooling than nVidia's 6800
nVidia - clear pixelation error
http://graphics.tomshardware.com/graphic/20040414/images/meermaid.jpg
ATI
http://graphics.tomshardware.com/graphic/20040504/images/hdg-01.jpg
http://graphics.tomshardware.com/graphic/20040504/images/hdg-02.jpg
http://graphics.tomshardware.com/graphic/20040504/images/ruby02.jpg
http://graphics.tomshardware.com/graphic/20040504/images/ruby03.jpg
the images Tom's Hardware produced on the ATI board are more graphically advanced
[edit]
assholes - THG is blocking hotlinked images
PS Ace, those screenshots are under CLEARLY different video card settings
-
Preview before posting :p
-
when i posted it the first time they worked
-
Yes, the cooling is a problem, but many cards already have single-slot cooling; the same happened with the FXs.
Also, for pixelation errors:
Doom3 ATi Errors (http://www.xbitlabs.com/misc/picture/?src=/images/video/doom-3-tests/quality-modes/2_ati_ultra_2.jpg&1=1)
It isn't the be-all and end-all, as the game was optimized for nVidia, but it clearly shows ATi is not perfect.
EDIT: I could see the pictures as well.
-
Originally posted by Kazan
there is a reason why all the high-knowledge people here are against nVidia
There is a difference between being highly knowledgeable and being biased. Neither ATI nor NVIDIA is perfect; both have their own flaws. The tables turn every now and then, or did you already forget how the first- and second-generation Radeons and their drivers sucked?
Anyway, I am sure people can form their own opinions from reviews if they are going to buy something. If they blindly buy hardware, it's their own fault. And by reviews I don't mean advertisements.
In any case, in my personal opinion NVIDIA has gotten it quite right with their latest GeForce series, as does ATI. But neither are perfect.
I wouldn't mind owning either GF 6xxx or Radeon Xxxx ( :wtf: ) series, but on the other hand, I wouldn't touch GF FX series or pre-Radeon 9xxx series.
-
Originally posted by Mr. Fury
In any case, in my personal opinion NVIDIA has gotten it quite right with their latest GeForce series, as does ATI. But neither are perfect.
Thank you, that was what I was trying to say: neither is perfect, neither is the absolute best (for FS, however, nVidia is needed, because of the shinemap issue).
The real question, if you're an FPS gamer deciding, is: are you going to play HL2 or Doom3? ;)
-
Fury: you know better than to post drivel like that - I go with the better option -- nVidia has a history of card burnup, driver cheating, low image quality, factory overclocking, etc.
As for the Doom pixelation error, that's not ATI's fault - that's id's fault for using nVidia-specific features and writing some hacks (never thought I'd find myself admonishing Carmack - but when he did this I was like WTF ARE YOU THINKING MAN!)
-
Originally posted by Ace Pace
The real question, if you're an FPS gamer deciding, is: are you going to play HL2 or Doom3? ;)
Or not. Maybe you can get a few FPS more with either a GeForce or a Radeon, but it doesn't matter in the end. You can still play both games.
If your computer can handle the latest games, it can handle D3 and HL2 as well.
Kazan: You said it yourself. History. Most people want to look at the present.
And ATI has also had its own share of bad publicity; a few driver cheats come to mind. Not nearly as many as NVIDIA, but still.
But it is useless to discuss these things with you because you're stubborn, something IPAndrews likes to say about me, which I tend to naturally overlook.
-
Originally posted by Kazan
nVidia has a history of card burnup, driver cheating, low image quality, factory overclocking, etc.
as for the Doom pixelation error, that's not ATI's fault - that's id's fault for using nVidia-specific features and writing some hacks (never thought I'd find myself admonishing Carmack - but when he did this I was like WTF ARE YOU THINKING MAN!)
That's all ended now; everyone agrees the drivers are fine and the IQ is fine. Also, did you miss the part about ATi cheating as well?
If I recall correctly, Doom3 had a render path specifically for ATI? Or was it a path for both? Because the FXs had their own.
-
Originally posted by Mr. Fury
Or not. Maybe you can get a few FPS more with either a GeForce or a Radeon, but it doesn't matter in the end. You can still play both games.
If your computer can handle the latest games, it can handle D3 and HL2 as well.
For Doom 3 especially, the difference is that nVidia is up to 30-40 PERCENT faster, with no loss of IQ.
HL2 benchmarks for now show it about equal, with ATI leading with AA and AF, but those are non-final benchmarks.
-
Ace Pace, I wouldn't worry about that. Sounds like an overshoot to me; more like maybe 10%.
Future ATI drivers will improve OGL performance and I am sure future D3 patches will also improve performance. Why? Because ID isn't selling D3 to only NVIDIA users.
-
actually they are, since they're using an nVidia-specific feature
-
It has nothing to do with sales. And to ensure that people in both camps stay happy to promote even more sales.
-
Ya, well, when DirectX 9.0c-enabled programs start coming out, ATI's X800 will show its true flaws.
-
When true 9.0C titles come, we're already one or two graphics card generations further.
-
Everyone has his own opinion; some people like ATI and some nVidia.
Me, I always liked ATI Radeons over nVidia.
-
I've historically preferred 3Dfx, then nVidia, then ATI.
3Dfx's drivers have always been solid. I still have my Voodoo 2 (can't actually use it, since Win2k+ doesn't have any V2 drivers... :().
nVidia's drivers have always been fairly stable and solid, and they do update them on a fairly regular basis. Their hardware has always leaned on the crap side, though, in terms of quality. They seem to just throw transistors at every problem. No one can deny their stuff has all the bells and whistles (heck, they're the only card with the latest pixel shader engines), but the way they're designed feels so... inelegant. It's like writing a Notepad program in MFC vs WinAPI.
ATI sucked hard until the Radeon series came out. I was pretty amazed by that - Going from **** hardware and **** software to pretty darn good hardware and okay software :)
The software has been getting better too (I guess they finally realised that they need to employ more than one guy to QA their drivers :D), but IMHO it still doesn't match nVidia's. And ATI cheats as much as nVidia; they're just sneakier about it.
*Rant!*
But sod all that, I'm sticking with my Ti4600. I mean, does anybody not find it... I dunno... '****ed up' that a video card needs more power than any other component in the system?!
The Voodoo5 was bad enough - It was stupid then and it still is. But then nVidia go and trump them!
This thing about the 6800 needing two power connectors?! Has nobody else thought "WTF?!"
First of all, all Molex plugs are wired into the same originating points inside a PSU, so unless the current draw is enough to melt the cables you shouldn't need to plug 2 independent cables in!
And the fact that they also state that you can't plug any other components into those sockets apart from wussy crap like fans just scares the crap out of me!
If things keep on like this we'll need to buy a separate PSU to power the whole bloody card!
Jeez I can just see it...
"Intel announces a revision to the forthcoming BTX case specification, adding provision for 3 extra power supplies to be added.
In other news, they have also revised PCI-X to provide extra power rails for the more power-hungry PCI-X cards
(Insert picture of 2cm-thick copper tracks embedded in motherboard)"
-
Originally posted by Krackers87
Ya, well when Direct X 9.0c enabled progys start coming out, ATI's X800 will show its true flaws.
True, but by then every single card from today will be too slow, 6800 included; look how long it took for SM2 to arrive.
-
Nice card, the 6800GT. Almost ordered one of those the other day, but in the end I decided on a Radeon X800 Pro VIVO, because you can flash them to X800XT, and they're a bit cheaper than the GTs ;). Sticking it in with an Athlon64 3500+ and a gig of RAM. Should be rather fast.... :D
EDIT: Oh, and isn't this a rather silly argument?* "nVidia 0wnz0rz ATI!" "No, ATI 0wnz0rz nVidia!" etc. They're both very equally matched, and they both kick the living crap out of the previous generation, so whatever you get, you'll have more than enough power for anything you can throw at them in the near future.
*Unless you lot enjoy bickering about it, in which case carry on! :D
-
Originally posted by pyro-manic
Nice card, the GT. Almost ordered one of those the other day, but in the end I decided on a Radeon X800 Pro VIVO, because you can flash them to X800XT and save £50+ ;). Sticking it in with an Athlon64 3500+ and a gig of RAM. Should be rather fast.... :D
Are you sure it's safely flashable? I can't remember which card it was that you could flash, but in high-end games like D3 it would create a mass of artifacts, with no way to unflash. :sigh:
-
I think it's only the VIVO version. The normal Pro has the extra 4 pipes physically disabled, so you have to mod the board, but the VIVOs are (apparently) rebranded XTs, so you just change the BIOS and it switches them back on. Linkage below - this is the card I'm getting, and it says how to do it in the review. :)
http://www.ocprices.com/index.php?action=reviews&rev_id=230&page=6
-
Originally posted by pyro-manic
EDIT: Oh, and isn't this a rather silly argument?* "nVidia 0wnz0rz ATI!" "No, ATI 0wnz0rz nVidia!" etc. They're both very equally matched, and they both kick the living crap out of the previous generation, so whatever you get, you'll have more than enough power for anything you can throw at them in the near future.
*Unless you lot enjoy bickering about it, in which case carry on! :D
As I see it here, it's not about which is better; it's really more about nVidia than about ATi.
And yes, I enjoy bickering about graphics. :thepimp:
-
Just to finish matters with the debate:
Fact: the GeForce 6800 GT can easily be overclocked to Ultra levels safely.
Fact: the GT needs only 1 power connector.
Fact: the GT is a single-slot cooling design.
What's the problem again?
EDIT: picture supporting the single-slot design and single power connector: Evidence (http://www.beyond3d.com/previews/nvidia/6800s/images/all.jpg)
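For scale on the "GT overclocked to Ultra levels" claim, a tiny sketch of the implied headroom. The reference clocks used here (350 MHz GT core, 400 MHz Ultra core) are the commonly cited stock figures for these cards and should be treated as assumptions:

```python
# Rough sketch of the GT -> Ultra overclock headroom being claimed.
# Assumed reference core clocks (commonly cited stock figures):
gt_core_mhz = 350     # GeForce 6800 GT
ultra_core_mhz = 400  # GeForce 6800 Ultra

overclock_pct = (ultra_core_mhz - gt_core_mhz) / gt_core_mhz * 100
print(f"GT -> Ultra core overclock: {overclock_pct:.1f}%")  # ~14.3%
```

Under those assumptions, "Ultra levels" is only about a 14% core overclock, which is why it's plausible on the stock cooler.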
-
Am I the only one here who's going with "who cares" as a final conclusion... so long as what I have can run what I want to a decent level of quality and speed - the rest is bickering for the sake of bickering :)
-
Originally posted by Kalfireth
Am I the only one here who's going with "who cares" as a final conclusion... so long as what I have can run what I want to a decent level of quality and speed - the rest is bickering for the sake of bickering :)
Exactly, I don't really care; I'm probably getting an ATi anyway, just for loyalty, but it's fun to debate.
-
I got my 6800 GT because i like nvidia cards, and for the simple fact that it Pwnz j00!
It's not the only card that Pwnz j00!, several ATI cards Pwn J00! too, but the 6800GT is my card, and it Pwnz J00!
-
[color=66ff00]That last remark makes me think you're trying to augment something...
[/color]
-
Should check his e-mail for that.
-
Would you like a bigger penis? Where would you like it? I can suggest some places...
-
you'll care when your card burns up
-
Originally posted by Kazan
you'll care when your card burns up
Why should it burn up when I have it properly cooled and running standard drivers? I've never heard of proper nVidia cards burning up recently.
-
Originally posted by Ace Pace
never heard of proper nVidia cards burning up recently.
i do regularly
-
Originally posted by Kazan
i do regularly
Got links?
I'd sincerely doubt they burn up regularly, or nVidia couldn't find a single buyer.