Author Topic: Whee! We're building a new 'puter!


Whee! We're building a new 'puter!
Quote
Originally posted by phatosealpha

A single 6800GT will outperform a 2x 6600GT SLI setup even in SLI-enabled games, and will run circles around them in anything not SLI enabled.


Actually, that's not true; you can see the benchmarks at www.tomshardware.com. The 6600GT SLI setup outperforms a 6800GT, and both cards plus the extra price of a motherboard that supports SLI come to the same price as one 6800GT. So I opted for the dual 6600GTs, and hey, I can always ask for two 6800s next Christmas :)
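
(For the curious, here's the math as a rough sketch, using hypothetical early-2005 street prices; the real numbers vary by store:)

Code:
# Rough cost comparison of the two setups. All prices are
# hypothetical early-2005 street prices, for illustration only.
price_6600gt = 180        # per card (assumed)
price_6800gt = 400        # single card (assumed)
sli_board_premium = 40    # extra for an SLI-capable board (assumed)

dual_6600gt_setup = 2 * price_6600gt + sli_board_premium
single_6800gt = price_6800gt

print(f"2x 6600GT + SLI board premium: ${dual_6600gt_setup}")  # $400
print(f"1x 6800GT:                     ${single_6800gt}")      # $400
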
Carpe Diem Poste Crastinus

"When life gives you lemons...
Blind people with them..."

"Yah, dude, penises rock." Turambar

FUKOOOOV!

 
Whee! We're building a new 'puter!
SLI enabled games, yes.  Nvidia has to write a profile for each and every game that tells it how to implement SLI in the game.  Without it, you get no benefit from SLI at all - it runs in single card mode.

Or in short, unless nVidia activates SLI with a profile for each and every game, it does jack squat and you're running a single 6600GT. You never have to worry about that with a single 6800GT.
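
(To illustrate what that means in practice, here's a minimal sketch of the per-game lookup being described; the profile entries and names are made up, and this is not nVidia's actual driver code:)

Code:
# Sketch of the per-game SLI profile lookup described above.
# Profile entries are made up for illustration.
sli_profiles = {
    "doom3.exe":     "AFR",  # alternate frame rendering
    "halflife2.exe": "SFR",  # split frame rendering
}

def pick_render_mode(game_exe):
    # No profile shipped by nVidia -> fall back to a single GPU,
    # i.e. your second 6600GT just sits there idle.
    return sli_profiles.get(game_exe, "SINGLE_GPU")

print(pick_render_mode("doom3.exe"))       # AFR
print(pick_render_mode("freespace2.exe"))  # SINGLE_GPU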

Reappraising the benchmarks, it's a bit more back and forth than I remember... of course, the real problem is that the SLI setups choke big time when you enable image quality features. Pretty much across the board, if you kick up AF and AA, SLI loses. Are you really gonna buy a $500 video subsystem to play without antialiasing and anisotropic filtering?

 
Whee! We're building a new 'puter!
Well, I brought this up with my da', and according to him, you're incorrect.

From the SLI Developer's FAQ:

Quote
Will my game or application just run on NVIDIA's SLI technology?

Yes.
Developers are not required to make changes to make their application work on NVIDIA's SLI solution.  In fact, developers are not even required to make changes to enable the speed-boost available on a multi-GPU system.



...might wanna tell nVidia they don't know what their own stuff is or isn't capable of. ;)

 
Whee! We're building a new 'puter!
Re-read that.

DEVELOPERS don't have to make any changes.  And they don't.

NVIDIA has to make a profile.



http://www.hardocp.com/article.html?art=NzEx

http://www.nzone.com/object/nzone_sli_certifiedgames.html



In short, unless nVidia makes the profile, the game doesn't run in SLI.  Which is exactly what I said.

  

Offline WMCoolmon

  • Purveyor of space crack
  • 213
Whee! We're building a new 'puter!
Quote
Originally posted by Jetmech Jr.
SoundBlaster Audigy 7.1 Channel Integrated on the Motherboard.

We ain't exactly into a decked out sound system.


Eep.
-C

 
Whee! We're building a new 'puter!
Quote
Originally posted by phatosealpha
Re-read that.

DEVELOPERS don't have to make any changes.  And they don't.

NVIDIA has to make a profile.



http://www.hardocp.com/article.html?art=NzEx

http://www.nzone.com/object/nzone_sli_certifiedgames.html



In short, unless nVidia makes the profile, the game doesn't run in SLI.  Which is exactly what I said.


Overreaction.

There are already several user-made programs available, designed for just that: to create profiles for games.

http://grestorn.webinit.de/yape/Readme.rtf

In particular, this line:

Quote
And one of the most important advantages is that you have full control over all SLI settings, so you can make all the games work with your SLI setup, even though nVidia doesn't provide a profile for them yet.


 
Whee! We're building a new 'puter!
Well, good luck to you with that.   I wouldn't go that way, for the wide variety of reasons listed, but hey, not my machine :)

At any rate, reconsider your power supply.

 

Offline Liberator

  • Poe's Law In Action
  • 210
Whee! We're building a new 'puter!
Dumb question, but I thought all SLI did was split the video processing in two, giving each GPU half the load (screen) at a slight CPU overhead?

So why should it need a special hardware setting? Since, according to my knowledge (which is far, far from complete), it would be splitting the load all the time.
So as through a glass, and darkly
The age long strife I see
Where I fought in many guises,
Many names, but always me.

There are only 10 types of people in the world: those that understand binary and those that don't.

 
Whee! We're building a new 'puter!
AFAIK, it's a matter of how exactly to implement the SLI. You can either have the two cards rendering alternating frames (alternate frame rendering, AFR), or you can have it so one renders the top of the screen and the other renders the bottom (split frame rendering, SFR).

The former mode is generally faster, but if the game uses framebuffer effects that blur between frames, it won't work properly. The second has more overhead because the driver has to constantly rebalance the load between the GPUs, but it should be more compatible. Thus the profiles, to determine which mode each game should run in.
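
(A toy sketch of the two modes, assuming a two-GPU setup; the real scheduling lives in the driver, this is just the idea:)

Code:
# Toy illustration of the two SLI modes described above.

def afr_gpu_for_frame(frame):
    """Alternate Frame Rendering: whole frames alternate between
    the two GPUs. Fast, but breaks effects that read the previous
    frame, since that frame lives on the other card."""
    return frame % 2

def sfr_split(screen_height, gpu0_share):
    """Split Frame Rendering: GPU 0 renders the top slice, GPU 1
    the bottom. The split line gets rebalanced every frame based
    on how heavy each half was; that's the extra overhead."""
    split = int(screen_height * gpu0_share)
    return (0, split), (split, screen_height)

for f in range(4):
    print(f"frame {f} -> GPU {afr_gpu_for_frame(f)}")
# If the top half is mostly cheap skybox, GPU 0 takes more lines:
print(sfr_split(1024, gpu0_share=0.6))  # ((0, 614), (614, 1024))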

Why exactly nVidia defaults to single-GPU operation I'm not sure. I think it's a matter of them not wanting people to use it and hit unexpected image quality problems, but I can't say for certain.

 

Offline Ford Prefect

  • 8D
  • 26
  • Intelligent Dasein
Whee! We're building a new 'puter!
Shocks... pegs...

Luckyyyy!
"Mais est-ce qu'il ne vient jamais à l'idée de ces gens-là que je peux être 'artificiel' par nature?"  --Maurice Ravel