Hard Light Productions Forums

Off-Topic Discussion => General Discussion => Topic started by: Turambar on August 19, 2004, 07:08:47 pm

Title: I Got My 6800 Gt!!!!!!
Post by: Turambar on August 19, 2004, 07:08:47 pm
this thing is so amazingly powerful, i haven't been able to make it stutter or slow down at all with any game I own. FS runs great and somehow looks better too.

I'm waiting for the next amazing breakthrough now
Title: I Got My 6800 Gt!!!!!!
Post by: vyper on August 19, 2004, 07:18:00 pm
Die.
Title: I Got My 6800 Gt!!!!!!
Post by: Anaz on August 19, 2004, 07:24:04 pm
Quote
Originally posted by vyper
Die.
Title: I Got My 6800 Gt!!!!!!
Post by: Ghostavo on August 19, 2004, 07:34:40 pm
Quote
Originally posted by vyper
Die.


Painfully!! :mad:
Title: I Got My 6800 Gt!!!!!!
Post by: Turambar on August 19, 2004, 08:01:23 pm
I don't think I've done this for you guys yet
ahem *cough*

Mwahahahahahaahahahahahahahahahahaahahahah!!!!!!
Title: I Got My 6800 Gt!!!!!!
Post by: Martinus on August 19, 2004, 08:26:25 pm
[color=66ff00]I can happily say enjoy; probably only because I've been saving for a bit and can afford a monster system if I so wished. ;)
[/color]
Title: I Got My 6800 Gt!!!!!!
Post by: Taristin on August 19, 2004, 08:42:34 pm
You can have your 6800 GT. I'm not jealous. Why? Because I:
  • have a sweet 9600 XT which runs excellently and cost me a fraction of what you spent
  • Just bought a kick ass spoiler for my car (Which has been repaired and will get painted in a few weeks)
  • Still have enough money to buy both sweet rims and tires for my car, and a new processor and memory upgrade for my PC.

So enjoy. ;) :p
Title: I Got My 6800 Gt!!!!!!
Post by: Lightspeed on August 19, 2004, 08:47:59 pm
GeForce cards suck anyway. Radeon 0wnage.
Title: I Got My 6800 Gt!!!!!!
Post by: Liberator on August 19, 2004, 09:36:41 pm
Most people don't give a rat's patoot about 3-4 percentage points, Lightspeed.
Title: I Got My 6800 Gt!!!!!!
Post by: BlackDove on August 19, 2004, 09:50:31 pm
Quote
Originally posted by vyper
Die.


Shinu

Crkni

Umri

Title: I Got My 6800 Gt!!!!!!
Post by: Lightspeed on August 19, 2004, 09:53:34 pm
Quote
Originally posted by Liberator
Most people don't give a rat's patoot about 3-4 percentage points, Lightspeed.


Maybe they give a damn about much higher energy consumption, a much worse card layout, a much worse thermal solution....

And... additionally to that.... it's nVidia *shudders* (I know I'm biased :p )
Title: I Got My 6800 Gt!!!!!!
Post by: BlackDove on August 19, 2004, 10:09:52 pm
I don't want to hear that from someone who was using a 3dfx Voodoo while everyone else had something that was using an AGP slot.

Oh yeah, I forgot something important.

nVidia sucks.
Title: I Got My 6800 Gt!!!!!!
Post by: Lightspeed on August 19, 2004, 10:23:19 pm
Why, my Voodoo rocked. I could even play Freelancer half-decently with it!

3dfx owned. Sad that they got left behind, actually.
Title: I Got My 6800 Gt!!!!!!
Post by: Liberator on August 19, 2004, 10:42:00 pm
Not to mention the fact that some of us were too stupid back then to get (that's right, "get", not "build") a computer with an AGP slot. BTW, I upgraded from a Voodoo 3 2000 16mb to an Nvidia GeForce 4 MX 420 64mb.
Title: I Got My 6800 Gt!!!!!!
Post by: Lightspeed on August 19, 2004, 11:11:14 pm
My Voodoo was an AGP-card. :D
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 20, 2004, 02:23:37 am
I can sense jealousy of the poor. And I don't need jedi powers for that. ;)
Title: I Got My 6800 Gt!!!!!!
Post by: Stealth on August 20, 2004, 02:58:19 am
Quote

Just bought a kick ass spoiler for my car (Which has been repaired and will get painted in a few weeks)


lol.  i'll bet you can go at high speeds with greatly improved traction and handling now




Quote
Maybe they give a damn about much higher energy consumption, a much worse card layout, a much worse thermal solution....


if the card sucks so much, why are so many people buying it, when they could get, for more-or-less the same price, the Radeon X800... which is obviously superior ;)
Title: I Got My 6800 Gt!!!!!!
Post by: Nix on August 20, 2004, 02:59:41 am
Lightspeed's right about the card layout. On my new 9800 Pro, the card is about an inch shorter than my old GeForce 4400 card. Plus, having to shell out money for another power supply along with a 6XXX series card just makes me shudder. I lost a piddly modem due to a power brownout a while back. With that much power running through that card, imagine what could happen if there was a brownout/spike/whatever.
Especially after spending the major dough for that card, only to see it fry. *SHIVER*
Title: I Got My 6800 Gt!!!!!!
Post by: Liberator on August 20, 2004, 03:07:32 am
As I understand it, the card doesn't actually need all that much more power. It just needs more than the AGP slot can provide; PCI-E has addressed this, but you'd have thought they would have seen the power needs coming and beefed up the standard ahead of time. Regardless, Nvidia's recommended PSU size is to ensure a stable current flow to the card, like it gets from the bus.
Title: I Got My 6800 Gt!!!!!!
Post by: Stealth on August 20, 2004, 03:09:44 am
i've heard it's more power demanding, and a lot of people that get it get new power supplies too.

i still think it's funny that a video card has more RAM and draws more power than the rest of the computer ;) :D
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 20, 2004, 04:27:15 am
Turambar, nice, but is it paired with a fast CPU and good memory?


Fine, the NV40 layout sucks... so why is it running anywhere from 5 to 25% faster on most games?

Also, a regular X800 would suck, with only 12 pipes, while a GT is overclockable to Ultra speeds with the standard cooler, and no one runs the standard cooler, you run something better.
Title: I Got My 6800 Gt!!!!!!
Post by: Lightspeed on August 20, 2004, 11:14:16 am
You'll have to put in TWO separate molex connectors (i.e. not on one string), and nVidia puts it as nicely as "we recommend a 480 W power supply".

The X800 PE drains LESS power than the 9800 Pro, needs one connector, and thus no special power supply either. That's life, huh?
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 20, 2004, 11:20:45 am
Quote
Originally posted by Lightspeed
You'll have to put in TWO separate molex connectors (i.e. not on one string), and nVidia puts it as nicely as "we recommend a 480 W power supply".

The X800 PE drains LESS power than the 9800 Pro, needs one connector, and thus no special power supply either. That's life, huh?


Most new power supplies come with 5 or more molex connectors; that should be enough for your needs. And while nVidia recommends 480W, the 6800 Ultra has been demonstrated to run on a 350W PSU in a full rig, and on 300W with a bare-bones system.

The X800 PE at the price of an Ultra is not worth it, especially since a GT, at $100 cheaper, can beat it.

So what if it drains less than a 9800? Are you lacking in power? Again, with PCI-E, the GeForce only needs 1 connector, putting it on the same level as the Radeon.


While I like Radeon cards, and currently own one, I freely admit that unless upcoming games like HL2, The Sims, Rome and further titles like STALKER and FEAR favour ATI in the same way that Doom3 favours Nvidia (I won't get in-depth here), there is no way ATI can claim it has 'won' this round.


Now if the current situation continues and, with the card refresh, ATI pulls an nVidia and suddenly is totally competitive, then we will be back to the 9xxx and FX days, only with roles reversed.
Title: I Got My 6800 Gt!!!!!!
Post by: vyper on August 20, 2004, 11:21:50 am
'cos a 480W PSU is so much hassle.... :wtf:
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 20, 2004, 11:24:48 am
Quote
Originally posted by vyper
'cos a 480W PSU is so much hassle.... :wtf:


See above; a good 350W should serve you well if the computer isn't loaded with more stuff, 400W in that case.
Title: I Got My 6800 Gt!!!!!!
Post by: Lightspeed on August 20, 2004, 11:35:02 am
Running hardware on power supplies that don't match the hardware specifications is not too good an idea.

Other than that, the power drain is the cause of a lot of problems. Heat, for example.
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 20, 2004, 11:45:37 am
Quote
Originally posted by Lightspeed
Running hardware on power supplies that don't match the hardware specifications is not too good an idea.

Other than that, the power drain is the cause of a lot of problems. Heat, for example.


Actually, nVidia admitted it could run safely at 350W, and that the 480W was only precautionary, since there are many PSUs that, while claiming to provide 480W, are actually far weaker.

On heat: as Xbit proved, you can use a cooler built to the GeForce 6800 GT's specification and still surpass the Ultra in speed, with no heat problems. The cooler is excellent, and most brand cards have far better coolers.
Title: I Got My 6800 Gt!!!!!!
Post by: Taristin on August 20, 2004, 01:33:16 pm
Quote
Originally posted by Stealth


lol.  i'll bet you can go at high speeds with greatly improved traction and handling now



With my automatic, non-v-tec integra? What are you smoking? :p

I didn't say 'Hug3 r34r w1ng!!!1! Liek F4st a|\| Furiuss!!' :p  I got the factory OEM style Type-R rear spoiler. Because it looks good.

I'm not a ricer so :booty: to you. :p
Title: I Got My 6800 Gt!!!!!!
Post by: Stealth on August 21, 2004, 03:33:43 am
yeah i know, i was sarcastically humoring you with that statement ;)

OK, as long as it's the factory spoiler and not some two-foot-high, supposed-to-look-cool kind of spoiler ;)
Title: I Got My 6800 Gt!!!!!!
Post by: Fear on August 21, 2004, 07:46:07 am
Quote
Originally posted by Raa
You can have your 6800 GT. I'm not jealous. Why? Because I:
  • have a sweet 9600 XT which runs excellently and cost me a fraction of what you spent
  • Just bought a kick ass spoiler for my car (Which has been repaired and will get painted in a few weeks)
  • Still have enough money to buy both sweet rims and tires for my car, and a new processor and memory upgrade for my PC.


So enjoy. ;) :p


Die
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 21, 2004, 08:09:06 am
Four months back I spent 4000 € on two computers (I think that's 4,928.99 USD) and last month 5000 € on a new car (6,161.24 USD). The car costs 15500 € total (19,099.84 USD).

It is good to have a job that pays reasonably. :p
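For what it's worth, Fury's three conversions are all consistent with a single EUR→USD rate of roughly 1.2322 (plausible for mid-2004); a quick sketch to check, where the rate is simply implied by his first figure rather than an official quote:

```python
# Back out the implied EUR->USD rate from the first figure and check
# that the other two conversions use the same rate (a sanity check,
# not an official exchange rate).
RATE = 4928.99 / 4000  # ~1.2322 USD per EUR, implied

conversions = [(4000, 4928.99), (5000, 6161.24), (15500, 19099.84)]
for eur, usd in conversions:
    # allow a cent of rounding slack
    assert abs(eur * RATE - usd) < 0.01
```

All three pairs pass, so the quoted USD amounts were computed at one fixed rate.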
Title: I Got My 6800 Gt!!!!!!
Post by: Stealth on August 21, 2004, 11:36:50 am
what car was it?
Title: I Got My 6800 Gt!!!!!!
Post by: Fear on August 21, 2004, 06:45:37 pm
Quote
Originally posted by Stealth
what car was it?


yep, i wanted to ask the same question. :ha:
Title: Re: I Got My 6800 Gt!!!!!!
Post by: Krackers87 on August 21, 2004, 07:19:15 pm
Quote
Originally posted by Turambar
this thing is so amazingly powerful, i haven't been able to make it stutter or slow down at all with any game I own. FS runs great and somehow looks better too.

I'm waiting for the next amazing breakthrough now


HAH! I'm getting the Ultra as soon as it comes out, possibly even the Ultra Overclocked version :P
Title: Re: Re: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 12:03:14 am
Quote
Originally posted by Krackers87


HAH! I'm getting the Ultra as soon as it comes out, possibly even the Ultra Overclocked version :P



what a waste of money; an Ultra is already just an overclocked GT, and with the standard GT cooler you can easily reach Ultra speeds and beyond.
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 22, 2004, 02:54:14 am
Quote
Originally posted by Stealth
what car was it?

Nissan Micra. Good and economical car for one person. :)
Title: I Got My 6800 Gt!!!!!!
Post by: Ashrak on August 22, 2004, 04:55:53 am
and i should care that you spent a fortune on a computer component that is gonna SUCK in 6 months? i can barely make stuff stutter on my system too, and i have an R9600XT....
Title: I Got My 6800 Gt!!!!!!
Post by: Fineus on August 22, 2004, 05:14:56 am
All computer components (more or less) suck in 6 months - that's just how it is.
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 06:14:36 am
Quote
Originally posted by Kalfireth
All computer components (more or less) suck in 6 months - that's just how it is.


Not suck, just not state of the art. The nice thing about the 6800 is that it's quite advanced for now, SM3.0 and all that, and if you buy something upgradeable, you've got a clear road.
Title: I Got My 6800 Gt!!!!!!
Post by: Fineus on August 22, 2004, 06:17:30 am
Yeah, I kinda meant "suck compared with the state of the art", y'know what I mean :)
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 06:22:59 am
Quote
Originally posted by Kalfireth
Yeah, I kinda meant "suck compared with the state of the art", y'know what I mean :)



Aha, well, for GPUs 6 months out of date would be about right for the card refresh, but it's not sucky; my Radeon 9700 could beat the 9800 a lot.

For other stuff, like RAM: if you were lucky and got 2-2-2 RAM, you are better than the state of the art ;).
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 22, 2004, 07:40:19 am
When buying new computers, it does not make sense to buy mid- or low-end hardware. When you're buying, buy high-end hardware. It lasts longer. Any hardware you buy today will get outdated sooner or later; it's an unwritten law, so I am not going to worry about it. And I don't regret the money I spent on my two computers, as I don't regret what I spent on three previous computers. Low- or mid-end hardware is just for computer upgrades, not for new computers. That's my motto on this matter. :)
Title: I Got My 6800 Gt!!!!!!
Post by: Fineus on August 22, 2004, 07:58:08 am
I go for budget-conscious choices - that lets you pick up real gems that only got shunted out of the big-buck range because the next big thing hit the market (case in point: the Radeon 9800 Pro). These days I can't afford £300+ every time a new-gen graphics card hits the market.
Title: I Got My 6800 Gt!!!!!!
Post by: Gloriano on August 22, 2004, 09:41:29 am
Quote
buy high-end hardware


True. But I always jump one generation of cards (now I have a 9800XT; next will be an ATI 500 series card).
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 09:55:26 am
the two molex connectors are supposed to be on DEDICATED lines too - ie each line normally has two molex connectors on it -- but you're supposed to dedicate an entire _line_ to each connector on the card -- ie so you're actually using up 4 connectors


people _think_ the nVidias are superior because they crank slightly higher FPS - at a cost of massive amounts of image quality
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 10:01:09 am
Quote
Originally posted by Kazan
the two molex connectors are supposed to be on DEDICATED lines too - ie each line normally has two molex connectors on it -- but you're supposed to dedicate an entire _line_ to each connector on the card -- ie so you're actually using up 4 connectors


people _think_ the nVidias are superior because they crank slightly higher FPS - at a cost of massive amounts of image quality


Where the hell did you get that? No PSU i've seen has 2 connectors on 1 line; each line is dedicated to one connector, and as I have said above, you can run it safely with 350W, or 400W if you want to play it safe.

The 4-line thing is just BS.


People think nVidia won this round because it has superior FPS in all next-gen games, and finally image quality that is equal to ATI's.

Find me one picture from a reliable source (i.e. anandtech.com, Xbitlabs.com, Tomshardware.com, Firingsquad.com and more) that shows the NV40 has bad image quality.
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 10:03:24 am
um every PSU i have ever seen has _two_ connectors on each set of power lines

2 lines x 2 connectors = 4 connectors

ie you're supposed to dedicate _LINES_

um the images they're going to be showing are going to be nVidia-advertisement images -- ie properly optimized to hide errors


there is a reason why all the highly knowledgeable people here are against nVidia
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 10:05:43 am
Quote
Originally posted by Kazan
um every PSU i have ever seen has _two_ connectors on each set of power lines

2 lines x 2 connectors = 4 connectors

ie you're supposed to dedicate _LINES_

um the images they're going to be showing are going to be nVidia-advertisement images -- ie properly optimized to hide errors


there is a reason why all the highly knowledgeable people here are against nVidia


Okay, since i'm not that knowledgeable on PSUs i'll pass on that, though mine isn't built like that.

Those pictures are taken each time by the people working at those sites, and they aren't nVidia ads, just regular screenshots.

EDIT: here are some very basic pictures, both cards with optimizations disabled.

geforce IQ (http://www.xbitlabs.com/misc/picture/?src=/images/video/graphics_cards_1H-2004/halo_6800_true.jpg&1=1)
X800 IQ (http://www.xbitlabs.com/misc/picture/?src=/images/video/graphics_cards_1H-2004/halo_x800_true.jpg&1=1)
Title: I Got My 6800 Gt!!!!!!
Post by: BlackDove on August 22, 2004, 10:08:26 am
Yeah, and my name is Queen of England too.
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 10:11:03 am
Quote
Originally posted by BD
Yeah, and my name is Queen of England too.


What is that supposed to mean? That every single professional tech site on the net is in nVidia's pay? Each with different IQ comparisons?



sorry if I come off hostile, that just ticked me off.
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 10:18:42 am
1st thing i would like to draw your attention to

cooling systems
Nvidia 6800 http://graphics.tomshardware.com/graphic/20040414/images/card-overview.jpg
ATI X800
http://graphics.tomshardware.com/graphic/20040504/images/card01.jpg

the profile of the cooling system tells you a lot - ATI's NEWER, HIGHER-PERFORMANCE X800 board requires less cooling than nVidia's 6800



nVidia - clear pixelation error
http://graphics.tomshardware.com/graphic/20040414/images/meermaid.jpg

ATI
http://graphics.tomshardware.com/graphic/20040504/images/hdg-01.jpg
http://graphics.tomshardware.com/graphic/20040504/images/hdg-02.jpg
http://graphics.tomshardware.com/graphic/20040504/images/ruby02.jpg
http://graphics.tomshardware.com/graphic/20040504/images/ruby03.jpg

the images done by tomshardware on the ATI board are more graphically advanced

[edit]
assholes - THG is blocking hotlinking images


PS Ace those screenshots are under CLEARLY different video card settings
Title: I Got My 6800 Gt!!!!!!
Post by: BlackDove on August 22, 2004, 10:20:05 am
Preview before posting :p
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 10:21:27 am
when i posted it the first time they worked
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 10:22:26 am
Yes, the cooling is a problem, but many cards already have single-slot cooling; the same happened with the FX's.

Also, for pixelation error

Doom3 ATi Errors (http://www.xbitlabs.com/misc/picture/?src=/images/video/doom-3-tests/quality-modes/2_ati_ultra_2.jpg&1=1)

It's not the be-all and end-all, as the game was optimized for nVidia, but it clearly shows ATi is not perfect.

EDIT: I could see the pictures as well.
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 22, 2004, 10:23:22 am
Quote
Originally posted by Kazan
there is a reason why all the highly knowledgeable people here are against nVidia

There is a difference between being highly knowledgeable and being biased. Neither ATI nor NVIDIA is perfect; both have their own flaws. The tables turn every now and then, or did you already forget how the first and second generation Radeons and their drivers sucked?

Anyway, I am sure people can form their own opinions based on reviews if they are going to buy something. If they blindly buy hardware, it's their own fault. And by reviews I don't mean advertisements.

In any case, in my personal opinion NVIDIA has gotten it quite right with their latest GeForce series, as does ATI. But neither are perfect.

I wouldn't mind owning either a GF 6xxx or Radeon Xxxx ( :wtf: ) series card, but on the other hand, I wouldn't touch the GF FX series or pre-Radeon 9xxx series.
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 10:26:16 am
Quote
Originally posted by Mr. Fury


In any case, in my personal opinion NVIDIA has gotten it quite right with their latest GeForce series, as does ATI. But neither are perfect.


Thank you, that was what I was trying to say: neither is perfect, neither is the absolute best (for FS however, nVidia is needed, because of the shinemap issue).

The real question, if you're an FPS gamer considering one: are you going to play HL2 or Doom3? ;)
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 10:31:05 am
Fury: you know better than to post drivel like that - i go with the better option -- nVidia has a history of card burnup, driver cheating, low image quality, factory overclocking, etc

as for the Doom pixelation error, that's not ATI's fault - that's id's fault for using nVidia-specific features and writing some hacks (never thought i'd find myself admonishing Carmack - but when he did this i was like WTF ARE YOU THINKING MAN!)
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 22, 2004, 10:32:39 am
Quote
Originally posted by Ace Pace
The real question if your a FPS gamer considering, are you going to play HL2 or Doom3? ;)

Or not. Maybe you can get a few FPS more with either GeForce or Radeon, but it doesn't matter in the end. You can still play both games.
If your computer can handle the latest games, it can handle D3 and HL2 as well.

Kazan: You said it yourself. History. Most people want to look at the present.
And ATI has had its own share of bad publicity too; a few driver cheats come to mind. Not nearly as much as NVIDIA, but still.

But it is useless to discuss these things with you because you're stubborn, something IPAndrews likes to say about me, which I naturally tend to overlook.
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 10:34:08 am
Quote
Originally posted by Kazan
nVidia has a history of card burnup, driver cheating, low image quality, factory overclocking, etc

as for the Doom pixelation error that's not ATI's fault - that's ID's fault for using nvidia-specific features and writing some hacks (never thought i'd find myself admonishing carmack - but when he did this i was like WTF ARE YOU THINKING MAN!)



That's all over now, as everyone agrees the drivers are fine and the IQ is fine now. Also, did you miss the part where ATi was cheating as well?

If I recall correctly, Doom3 had a render path specifically for ATI? Or was it a path for both? Because the FX's had their own.
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 10:35:34 am
Quote
Originally posted by Mr. Fury

Or not. Maybe you can get few FPS's more with either GeForce or Radeon but it doesn't matter in the end. You can still play both games.

If your computer can handle latest games, it can handle D3 and HL2 as well.


For Doom 3 especially, the difference is that nVidia is up to 30-40 PER CENT faster, with no loss of IQ.

HL2 benchmarks for now show them about equal, with ATI leading with AA and AF, but those are non-final benchmarks.
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 22, 2004, 10:39:03 am
Ace Pace, I wouldn't worry about that. Sounds like an overshoot to me; more like maybe 10%.
Future ATI drivers will improve OGL performance, and I am sure future D3 patches will also improve performance. Why? Because id isn't selling D3 to only NVIDIA users.
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 10:42:22 am
actually they are, since they're using an nVidia-specific feature
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 22, 2004, 10:51:40 am
It's not only about sales. It's also to ensure that people in both camps stay happy, which promotes even more sales.
Title: I Got My 6800 Gt!!!!!!
Post by: Krackers87 on August 22, 2004, 11:34:58 am
Ya, well when DirectX 9.0c-enabled programs start coming out, ATI's X800 will show its true flaws.
Title: I Got My 6800 Gt!!!!!!
Post by: Fury on August 22, 2004, 11:43:34 am
When true 9.0c titles come, we'll already be one or two graphics card generations further along.
Title: I Got My 6800 Gt!!!!!!
Post by: Gloriano on August 22, 2004, 11:43:47 am
Everyone has his own opinion; some people like ATI and some Nvidia.

Me, I've always liked ATI Radeons over Nvidia.



Title: I Got My 6800 Gt!!!!!!
Post by: Cyker on August 22, 2004, 12:11:52 pm
I've historically preferred 3Dfx, then nVidia, then ATI.
3Dfx's drivers have always been solid. I still have my Voodoo 2 (can't actually use it since Win2k+ doesn't have any V2 drivers... :().

nVidia's drivers have always been fairly stable and solid, and they do update them on a fairly regular basis. Their hardware has always leaned to the crap side, though, in terms of quality. They seem to just throw transistors at every problem. No one can deny their stuff has all the bells and whistles (heck, they're the only card with the latest pixel shader engines), but the way they're designed feels so... inelegant. It's like writing a Notepad program in MFC vs WinAPI.

ATI sucked hard until the Radeon series came out. I was pretty amazed by that - going from **** hardware and **** software to pretty darn good hardware and okay software :)
The software has been getting better too (I guess they finally realised that they need to employ more than one guy to QA their drivers :D), but IMHO it still doesn't match nVidia's. And ATI do cheat as much as nVidia, they're just sneakier about it.

*Rant!*

But sod all that, I'm sticking with my Ti4600. I mean, does anybody not find it... I dunno... '****ed up' that a video card needs more power than any other component in the system?!

The Voodoo5 was bad enough - It was stupid then and it still is. But then nVidia go and trump them!
This thing about the 6800 needing two power connectors?! Has nobody else thought "WTF?!"

First of all, all molex plugs are wired to the same originating points inside a PSU, so unless the current draw is enough to melt the cables you shouldn't need to plug 2 independent cables in!
And the fact that they also state that you can't plug any other components into those sockets, apart from wussy crap like fans, just scares the crap out of me!

If things keep on like this we'll need to buy a separate PSU to power the whole bloody card!

Jeez I can just see it...

"Intel announces a revision to the forthcoming BTX case specification, adding provision for 3 extra Power supplys to be added.
In other news, they have also revised the PCI-X to provide extra power rails for the more power-hungry PCI-X cards
(Insert picture of 2cm-thick copper tracks embedded in motherboard)"
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 12:45:21 pm
Quote
Originally posted by Krackers87
Ya, well when Direct X 9.0c  enabled progys start coming out, ATI's X800 will show its true flaws.


True, but by then every single card from today will be too slow, 6800 included; look how long it took for SM2 to arrive.
Title: I Got My 6800 Gt!!!!!!
Post by: pyro-manic on August 22, 2004, 01:15:52 pm
Nice card, the 6800GT. Almost ordered one of those the other day, but in the end I decided on a Radeon X800 Pro VIVO, because you can flash them to X800XT, and they're a bit cheaper than the GTs ;). Sticking it in with an Athlon64 3500+ and a gig of RAM. Should be rather fast.... :D

EDIT: Oh, and isn't this a rather silly argument?* "nVidia 0wnz0rz ATI!" "No, ATI 0wnz0rz nVidia!" etc. They're both very equally matched, and they both kick the living crap out of the previous generation, so whatever you get, you'll have more than enough power for anything you can throw at them in the near future.

*Unless you lot enjoy bickering about it, in which case carry on! :D
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 01:18:57 pm
Quote
Originally posted by pyro-manic
Nice card, the GT. Almost ordered one of those the other day, but in the end I decided on a Radeon X800 Pro VIVO, because you can flash them to X800XT and save £50+ ;). Sticking it in with an Athlon64 3500+ and a gig of RAM. Should be rather fast.... :D



Are you sure it's safely flashable? I can't remember which card it was that you could flash, but in high-end games like D3 it would create a mass of artifacts, with no way to unflash. :sigh:
Title: I Got My 6800 Gt!!!!!!
Post by: pyro-manic on August 22, 2004, 01:25:20 pm
I think it's only the VIVO version. The normal Pro has the extra 4 pipes physically disabled, so you have to mod the board, but the VIVOs are (apparently) rebranded XTs, so you just change the BIOS and it switches them back on. Linkage below - this is the card I'm getting, and it says how to do it in the review. :)

http://www.ocprices.com/index.php?action=reviews&rev_id=230&page=6
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 01:28:37 pm
Quote
Originally posted by pyro-manic

EDIT: Oh, and isn't this a rather silly argument?* "nVidia 0wnz0rz ATI!" "No, ATI 0wnz0rz nVidia!" etc. They're both very equally matched, and they both kick the living crap out of the previous generation, so whatever you get, you'll have more than enough power for anything you can throw at them in the near future.

*Unless you lot enjoy bickering about it, in which case carry on! :D


As I see it here, it's not about which is better; it's really more about nVidia than about ATi.

And yes, I enjoy bickering about graphics. :thepimp:
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 03:01:38 pm
Just to finish matters with the debate.

Fact: the GeForce 6800 GT can easily be overclocked to Ultra levels safely.

Fact: the GT needs only 1 power connector.
Fact: the GT is a single-slot cooling design.

What's the problem again?

EDIT: a picture supporting the single-slot design and single power connector: Evidence (http://www.beyond3d.com/previews/nvidia/6800s/images/all.jpg)
Title: I Got My 6800 Gt!!!!!!
Post by: Fineus on August 22, 2004, 03:07:29 pm
Am I the only one here who's going with "who cares" as a final conclusion... So long as what I have can run what I want at a decent level of quality and speed, the rest is bickering for the sake of bickering :)
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 04:03:55 pm
Quote
Originally posted by Kalfireth
Am I the only one here who's going with "who cares" as a final conclusion... So long as what I have can run what I want at a decent level of quality and speed, the rest is bickering for the sake of bickering :)


Exactly, I don't really care; probably getting an ATi anyway, just for loyalty, but it's fun to debate.
Title: I Got My 6800 Gt!!!!!!
Post by: Turambar on August 22, 2004, 04:26:51 pm
I got my 6800 GT because i like nvidia cards, and for the simple fact that it Pwnz j00!

It's not the only card that Pwnz j00!, several ATI cards Pwn J00! too, but the 6800GT is my card, and it Pwnz J00!
Title: I Got My 6800 Gt!!!!!!
Post by: Martinus on August 22, 2004, 04:33:43 pm
[color=66ff00]That last remark makes me think you're trying to augment something...
[/color]
Title: I Got My 6800 Gt!!!!!!
Post by: vyper on August 22, 2004, 04:37:15 pm
Should check his e-mail for that.
Title: I Got My 6800 Gt!!!!!!
Post by: Fineus on August 22, 2004, 04:43:00 pm
Would you like a bigger penis? Where would you like it? I can suggest some places...
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 05:43:47 pm
you'll care when your card burns up
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 05:45:47 pm
Quote
Originally posted by Kazan
you'll care when your card burns up


Why should it burn up when I have it properly cooled and running standard drivers? I've never heard of proper nVidia cards burning up recently.
Title: I Got My 6800 Gt!!!!!!
Post by: Kazan on August 22, 2004, 05:53:39 pm
Quote
Originally posted by Ace Pace
never heard of proper nVidia cards burning up recently.


i do regularly
Title: I Got My 6800 Gt!!!!!!
Post by: Ace Pace on August 22, 2004, 05:57:10 pm
Quote
Originally posted by Kazan


i do regularly


Got links?

I'd sincerely doubt they burn up regularly, or nVidia couldn't find a single buyer.