Author Topic: I Got My 6800 GT!!!!!!  (Read 3406 times)


Quote
Originally posted by Kalfireth
Yeah, I kinda meant "suck compared with the state of the art", y'know what I mean :)



Aha, well, for GPUs, six months out of date is about right given the card refresh cycle, but it's not sucky; my Radeon 9700 could beat the 9800 a lot of the time.

For other stuff, like RAM, if you were lucky and got 2-2-2 RAM you're better than the state of the art ;).

 

Offline Fury

  • The Curmudgeon
  • 213
When buying new computers, it does not make sense to buy mid- or low-end hardware. When you're buying, buy high-end hardware; it lasts longer. Any hardware you buy today will get outdated sooner or later - it's an unwritten law, so I am not going to worry about it. And I don't regret the money I spent on my two computers, just as I don't regret what I spent on the three previous ones. Low- or mid-end hardware is just for upgrades, not for new computers. That's my motto on this matter. :)

 

Offline Fineus

  • ...But you *have* heard of me.
  • Administrator
  • 212
    • Hard Light Productions
I go for budget-specific choices - that lets you pick up real gems that only got shunted out of the big-buck range because the next big thing hit the market (case in point: the Radeon 9800 Pro). These days I can't afford £300+ every time a new-generation graphics card hits the market.

 

Offline Gloriano

  • silver dracon
  • 210
  • Oh
Quote
buy high-end hardware


True. But I always skip a generation of cards (right now I have a 9800 XT; the next will be an ATI 500-series card).
You must have chaos within you to give birth to a dancing star.- Nietzsche

When in despair I remember that all through history the way of truth and love has always won; there have been tyrants and murderers, and for a time they can seem invincible, but in the end they always fall.- Mahatma Gandhi

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
The two molex connectors are supposed to be on DEDICATED lines too - i.e. each line normally has two molex connectors on it, but you're supposed to dedicate an entire _line_ to each connector on the card, so you're actually using up 4 connectors.


People _think_ the nVidias are superior because they crank slightly higher FPS - at a cost of massive amounts of image quality.
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 
Quote
Originally posted by Kazan
The two molex connectors are supposed to be on DEDICATED lines too - i.e. each line normally has two molex connectors on it, but you're supposed to dedicate an entire _line_ to each connector on the card, so you're actually using up 4 connectors.


People _think_ the nVidias are superior because they crank slightly higher FPS - at a cost of massive amounts of image quality.


Where the hell did you get that? No PSU I've seen has two connectors on one line; each line is dedicated to one connector. And as I said above, you can run it safely on 350 W, or 400 W if you want to play it safe.

The 4 line thing is just BS.


People think nVidia won this round because it has superior FPS in all next-gen games and, finally, image quality that is equal to ATI's.

Find me one picture from a reliable source (e.g. anandtech.com, xbitlabs.com, tomshardware.com, firingsquad.com and others) that shows the NV40 has bad image quality.

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Um, every PSU I have ever seen has _two_ connectors on each set of power lines.

2 lines x 2 connectors = 4 connectors

i.e. you're supposed to dedicate _LINES_
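
As a rough illustration of why a dedicated line matters, here is a sketch of the arithmetic (all numbers below are assumptions for the sake of the example, not measured figures for the 6800 GT or for any particular PSU):

Code:
# Rough power-budget arithmetic - illustrative, assumed numbers only.
CARD_DRAW_WATTS = 70.0    # assumed total draw through the card's two molex plugs
RAIL_VOLTAGE = 12.0       # the 12 V line does most of the work
LINE_RATING_AMPS = 8.0    # assumed safe rating of a single molex line

# Both plugs daisy-chained on one line: that line carries the whole load.
shared_line_amps = CARD_DRAW_WATTS / RAIL_VOLTAGE
# One plug per dedicated line: each line carries roughly half the load.
dedicated_line_amps = (CARD_DRAW_WATTS / 2) / RAIL_VOLTAGE

print(f"Shared line:    {shared_line_amps:.1f} A of a {LINE_RATING_AMPS:.0f} A budget")
print(f"Dedicated line: {dedicated_line_amps:.1f} A of a {LINE_RATING_AMPS:.0f} A budget")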

Um, the images they pick to show are going to be nVidia advertisement images - i.e. properly optimized to hide errors.


There is a reason why all the knowledgeable people here are against nVidia.
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 
Quote
Originally posted by Kazan
Um, every PSU I have ever seen has _two_ connectors on each set of power lines.

2 lines x 2 connectors = 4 connectors

i.e. you're supposed to dedicate _LINES_

Um, the images they pick to show are going to be nVidia advertisement images - i.e. properly optimized to hide errors.


There is a reason why all the knowledgeable people here are against nVidia.


Okay, since I'm not that knowledgeable about PSUs I'll let that pass, though mine isn't built like that.

Those pictures are taken each time by the people working at those sites, and they aren't nVidia ads, just regular screenshots.

EDIT: Here are some very basic pictures, both cards with optimizations disabled.

GeForce IQ
X800 IQ
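
For anyone who wants to compare screenshots like these with something more objective than eyeballing, here is a minimal per-pixel difference sketch (assuming the Pillow library is installed and the two captures are saved locally under the hypothetical filenames used below; both images must be the same resolution):

Code:
# Minimal per-pixel comparison of two same-resolution screenshots.
# The filenames are hypothetical placeholders for locally saved captures.
from PIL import Image, ImageChops, ImageStat

geforce = Image.open("geforce_iq.png").convert("RGB")
x800 = Image.open("x800_iq.png").convert("RGB")

diff = ImageChops.difference(geforce, x800)  # absolute per-channel difference
if diff.getbbox() is None:
    print("The screenshots are pixel-identical.")
else:
    mean_r, mean_g, mean_b = ImageStat.Stat(diff).mean  # crude "how different" score
    print(f"Mean per-channel difference: R={mean_r:.2f} G={mean_g:.2f} B={mean_b:.2f}")
    diff.save("iq_diff.png")  # inspect the difference image by eye

A difference image like this only shows where two renders diverge, not which one is correct, so it complements rather than replaces the side-by-side shots the review sites publish.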
« Last Edit: August 22, 2004, 10:09:59 am by 348 »

 

Offline BlackDove

  • Star Killer
  • 211
  • Section 3 of the GTVI
    • http://www.shatteredstar.org
Yeah, and my name is Queen of England too.

  
Quote
Originally posted by BD
Yeah, and my name is Queen of England too.


What is that supposed to mean? That every single professional tech site on the net is in nVidia's pay? Each with different IQ comparisons?



Sorry if I come off as hostile; that just ticked me off.

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
First thing I would like to draw your attention to:

Cooling systems
nVidia 6800: http://graphics.tomshardware.com/graphic/20040414/images/card-overview.jpg
ATI X800: http://graphics.tomshardware.com/graphic/20040504/images/card01.jpg

The profile of the cooling system tells you a lot - ATI's NEWER, HIGHER-PERFORMANCE X800 board requires less cooling than nVidia's 6800.



nVidia - clear pixelation error
http://graphics.tomshardware.com/graphic/20040414/images/meermaid.jpg

ATI
http://graphics.tomshardware.com/graphic/20040504/images/hdg-01.jpg
http://graphics.tomshardware.com/graphic/20040504/images/hdg-02.jpg
http://graphics.tomshardware.com/graphic/20040504/images/ruby02.jpg
http://graphics.tomshardware.com/graphic/20040504/images/ruby03.jpg

Graphically, the images Tom's Hardware produced on the ATI board are more advanced.

[edit]
Assholes - THG is blocking hotlinked images.


PS: Ace, those screenshots were taken under CLEARLY different video card settings.
« Last Edit: August 22, 2004, 10:21:12 am by 30 »
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline BlackDove

  • Star Killer
  • 211
  • Section 3 of the GTVI
    • http://www.shatteredstar.org
Preview before posting :p

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
When I posted it the first time, they worked.
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 
Yes, the cooling is a problem, but many cards already have single-slot cooling; the same thing happened with the FXs.

Also, on pixelation errors:

Doom3 ATi Errors

It's not the be-all and end-all, since the game was optimized for nVidia, but it clearly shows ATi is not perfect.

EDIT: I could see the pictures as well.

 

Offline Fury

  • The Curmudgeon
  • 213
Quote
Originally posted by Kazan
There is a reason why all the knowledgeable people here are against nVidia.

There is a difference between being knowledgeable and being biased. Neither ATI nor NVIDIA is perfect; both have their own flaws. The tables turn every now and then - or did you already forget how the first- and second-generation Radeons and their drivers sucked?

Anyway, I am sure people can form their own opinions based on reviews if they are going to buy something. If they blindly buy hardware, it's their own fault. And by reviews I don't mean advertisements.

In any case, in my personal opinion NVIDIA has gotten it quite right with their latest GeForce series, as has ATI. But neither is perfect.

I wouldn't mind owning either a GF 6xxx or a Radeon Xxxx ( :wtf: ) series card, but on the other hand, I wouldn't touch the GF FX series or the pre-9xxx Radeons.
« Last Edit: August 22, 2004, 10:28:53 am by 173 »

 
Quote
Originally posted by Mr. Fury


In any case, in my personal opinion NVIDIA has gotten it quite right with their latest GeForce series, as has ATI. But neither is perfect.


Thank you, that is what I was trying to say: neither is perfect, neither is the absolute best (for FS, however, nVidia is needed because of the shinemap issue).

The real question, if you're an FPS gamer weighing the two: are you going to play HL2 or Doom3? ;)

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Fury: you know better than to post drivel like that - I go with the better option. nVidia has a history of card burn-up, driver cheating, low image quality, factory overclocking, etc.

As for the Doom pixelation error, that's not ATI's fault - that's id's fault for using nVidia-specific features and writing some hacks (never thought I'd find myself admonishing Carmack, but when he did this I was like WTF ARE YOU THINKING, MAN!).
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline Fury

  • The Curmudgeon
  • 213
Quote
Originally posted by Ace Pace
The real question, if you're an FPS gamer weighing the two: are you going to play HL2 or Doom3? ;)

Or not. Maybe you can get a few more FPS with either the GeForce or the Radeon, but it doesn't matter in the end; you can still play both games.
If your computer can handle the latest games, it can handle D3 and HL2 as well.

Kazan: You said it yourself. History. Most people want to look at the present.
Also, ATI has had its own share of bad publicity - a few driver cheats come to mind. Not nearly as many as NVIDIA, but still.

But it is useless to discuss these things with you because you're stubborn - something IPAndrews likes to say about me, which I naturally tend to overlook.
« Last Edit: August 22, 2004, 10:40:49 am by 173 »

 
Quote
Originally posted by Kazan
nVidia has a history of card burn-up, driver cheating, low image quality, factory overclocking, etc.

As for the Doom pixelation error, that's not ATI's fault - that's id's fault for using nVidia-specific features and writing some hacks (never thought I'd find myself admonishing Carmack, but when he did this I was like WTF ARE YOU THINKING, MAN!).



That's all in the past now; everyone agrees the drivers are fine and the IQ is fine. Also, did you miss the part about ATi cheating as well?

If I recall correctly, didn't Doom3 have a render path specifically for ATI? Or was it one path for both? Because the FXs had their own.

 
Quote
Originally posted by Mr. Fury

Or not. Maybe you can get a few more FPS with either the GeForce or the Radeon, but it doesn't matter in the end; you can still play both games.

If your computer can handle the latest games, it can handle D3 and HL2 as well.


For Doom 3 especially, the difference is that the nVidia card is up to 30-40 PERCENT faster, with no loss of IQ.
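
To put that percentage into concrete frame rates, here is a quick sketch (the baseline below is an assumed number purely for illustration, not a benchmark result):

Code:
# What a 30-40% speedup means in frames per second, at an assumed baseline.
baseline_fps = 50.0  # hypothetical ATI result, not a real benchmark figure
for speedup in (0.30, 0.40):
    faster = baseline_fps * (1 + speedup)
    print(f"{speedup:.0%} faster than {baseline_fps:.0f} fps -> {faster:.0f} fps")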

HL2 benchmarks for now show them roughly equal, with ATI leading once AA and AF are enabled, but those are non-final benchmarks.