Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Liberator on April 29, 2005, 11:05:18 pm
-
Poking around before bed and saw this:
http://www.theinquirer.net/?article=22917
Nvidia was asking PCI-SIG for 75W more power.
So you'd basically have:
75W from the slot
75W from one 6-pin connector
75W from a second 6-pin connector
That is just plain obscene.
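Assuming the card actually pulled the full rating from each source, that would be 75W + 75W + 75W = 225W for the graphics card alone.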
-
And that's why I'm glad I went for the NV40.
*Grey Wolf waits for his 6800GT to arrive
-
*pats his GF6600GT*
:nervous:
-
ATI 9800
-
The new P4s are supposed to pull down about 130W.
-
Just buy the biggest, baddest power supply available and don't worry.
-
Well, if you figure the following:
Mobo == 45W
Processor == 50-60W
HDD == 25W each (min 2)
Optical == 25W each (min 1)
Video == 225W
225 (video) + 25 (optical) + 50 (two HDDs) + 60 (CPU) + 45 (mobo) == 405W minimum...MINIMUM!!! There is no reason for it to be that high. That's more power for the video card alone than for the rest of the system combined...
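As a rough back-of-the-envelope check (using the estimates above; the component names and figures are just illustrative, not measured values):
[code]
# Rough system power budget using the estimates above (not measured values).
components = {
    "motherboard": 45,
    "cpu": 60,        # upper end of the 50-60W estimate
    "hdd": 2 * 25,    # two hard drives at ~25W each
    "optical": 25,    # one optical drive
    "video": 225,     # 75W from the slot plus two 75W six-pin connectors
}

total = sum(components.values())
rest = total - components["video"]
print(f"Estimated total: {total}W")                                     # 405W
print(f"Video card: {components['video']}W, everything else: {rest}W")  # 225W vs 180W
[/code]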
-
You forgot memory, mouse, keyboard, and any USB peripherals you might have.
Edit: And fans/heat pipes.
-
I see rising electricity bills with that...
Time to make a power generator out of a perpetual motion machine then :P
-
Or a heat->energy capacitor.
-
What's the best budget video card?
I was thinking of an ATI 9800 Pro. I haven't been following Nvidia lately.
I'll probably be looking to upgrade very soon. It should be fairly cheap (within $300-$350 AU), but if there's a drastic difference, I'd consider paying slightly more. The less the better, though.
The 9800 Pro seems to be the way to go. Anyone who's got one, have you ever experienced considerable slowdown in any game?
-
Originally posted by Grug
What's the best budget video card?
Originally posted by WMCoolmon
*pats his GF6600GT*
I experience considerable slowdown if I set 8x AA, 16x AF, and "High" for everything in Counter-Strike/Half-Life 2 at 1280x1024.
No AA, it runs pretty swell. :)
-
I'd seriously have trouble recommending a 9800 Pro these days, not with the 6600GTs out there for the same price.
-
Out of curiosity, how much do the 6600GTs go for?
-
$150-$200 nowadays.
-
Newegg is showing a Chaintech 6600GT for $159 US. The most expensive AGP 6600GT is the $225 BFG. 9800 Pros in AGP run from $130 to $290.
-
Hmm...that isn't horridly expensive. I bet if I traded in this 9200 at a few shops here and there, I'd get a decent discount on it too.
Edit: Where are my manners? Thank you, WMCoolmon.
-
Anyway, in regard to the G70, that's no surprise to me. Current-gen graphics cards are pretty darned power hungry. A 6800 Ultra requires two Molex connectors as it is.
And maybe I'm just being optimistic here, but maybe this is exactly the kind of motivation people need to stop skimping on power supplies?
-
'Skimping' is relative. I'd be skimping just as much with my current 450W power supply and two 6800 Ultras as I would with a 300W and my 6600GT.
Plus it's annoying if you use MicroATX or SFF.
-
I was actually referring more to those who sell cases with PSUs, and to OEM system manufacturers.
And to be honest, I'd expect that putting a G70 in a MicroATX or SFF case would get you more thermal problems than electrical issues.
-
Originally posted by Liberator
Well, if you figure the following:
Mobo == 45W
Processor == 50-60W
HDD == 25W each (min 2)
Optical == 25W each (min 1)
Video == 225W
225 (video) + 25 (optical) + 50 (two HDDs) + 60 (CPU) + 45 (mobo) == 405W minimum...MINIMUM!!! There is no reason for it to be that high. That's more power for the video card alone than for the rest of the system combined...
Correct, but wouldn't that be the maximum power needed? Right now, my system is sitting comfortably consuming 220-230 watts. My PSU has a gauge hooked to it showing the total consumption, and it's pretty dead-on. My system, by the way, using a 450W Coolermaster power supply, is:
Asus K8VSE DLX, Athlon 64 3000+, not overclocked
Radeon 9800 Pro, not overclocked.
1 Seagate SATA drive, 2 IBM Deskstars in RAID,
Playing an audio CD in the DVD burner drive, an additional DVD reader, and the rest of the usual components: modem, NIC, Audigy sound card, Axim USB link, the onboard components, etc. Never ever has this PSU been stressed past 300 watts.
And yeah, the needle on this gauge constantly bounces up and down, so it does work properly.
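Taking that gauge at face value, the worst reading still leaves about 450W - 300W = 150W of headroom, so the supply is only loaded to roughly two thirds of its rating.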
-
You'd think we'd have developed less power intensive processors (both CPU and GPU) by now - but nooooo.... these bastards just want to keep running up my electricity bill...
-
Originally posted by Grug
What's the best budget video card?
I was thinking of an ATI 9800 Pro. I haven't been following Nvidia lately.
I'll probably be looking to upgrade very soon. It should be fairly cheap (within $300-$350 AU), but if there's a drastic difference, I'd consider paying slightly more. The less the better, though.
The 9800 Pro seems to be the way to go. Anyone who's got one, have you ever experienced considerable slowdown in any game?
No slowdown when running WoW at 1280x1024, except when entering areas with large numbers of players, which is a limitation of one's connection, hard drive, and processor speed rather than the graphics card, AFAIK.
-
Rawr.
http://www.jscustompcs.com/power_supply/
[edit] And Lib, that's the nature of technology. The more advanced it gets, the more power it will require. A few years ago, a 350-watt PSU was unheard of.
-
I've just bought a 400W power supply, glad I did now :)
-
Hmm... I've only got a 330W...
But, ehh. I'm really, really happy with my 9600XT, so I don't need to worry about this krupke.
-
I've had some limitations with the 9800 series, more along the lines of how high I can raise the settings in modern games. I can usually pull off the best LoD, but AA and anisotropic filtering suffer in some.
-
Originally posted by Thorn
[edit] And Lib, that's the nature of technology. The more advanced it gets, the more power it will require. A few years ago, a 350-watt PSU was unheard of.
I know that. :rolleyes:
It just seems a little obscene for one component to use more electricity than the rest of the system combined.
-
That's not totally true. The Cray systems were about 10% as fast as a modern microprocessor, yet burned off enough power to require a liquid-bath cooling system. But yes, in general, more computational power = more actual power. Though power is one of the facets of microprocessor design that is optimized pretty heavily, believe it or not.
-
Originally posted by vyper
You'd think we'd have developed less power intensive processors (both CPU and GPU) by now - but nooooo.... these bastards just want to keep running up my electricity bill...
I guess comparing processors to something like car engines is like comparing oranges and apples, huh? It seems to make sense, though: make more powerful processors that use the same or less juice.
-
Just borrow your dad's nuclear generator...
-
Now we just need the Mr. Fusion and the portable liquid nitrogen bath...
-
My PSU is 700W...
-
Originally posted by Clave
My PSU is 700W...
So you have a server-class machine or are you just mad? ;)
-
Originally posted by Liberator
Poking around before bed and saw this:
http://www.theinquirer.net/?article=22917
Nvidia was asking PCI-SIG for 75W more power.
So you'd basically have:
75W from the slot
75W from one 6-pin connector
75W from a second 6-pin connector
That is just plain obscene.
Combining that with a 110W dual-core Prescott would bump that minimum requirement up to 335W. Can you imagine how hot a system like that could get?
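(That is, roughly 225W for the rumoured card plus 110W for the CPU = 335W, before the motherboard, drives, and fans are even counted.)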
-
Originally posted by Nix
So you have a server-class machine or are you just mad? ;)
He's got a dual-CPU Mac with all the bells and whistles, if I remember correctly.
-
Originally posted by Kosh
Combining that with a 110W dual-core Prescott would bump that minimum requirement up to 335W. Can you imagine how hot a system like that could get?
110W for a Pentium D? The standard Prescott is 115W, and that's using Intel's poorly representative numbers. The Pentium D is listed at 130W, so I'd estimate up to 150W at maximum load.
On a related note, people who don't understand P=IV can really make your head hurt...
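For anyone curious, a minimal sketch of what P=IV means for the current draw here (the wattages are just the figures thrown around in this thread, and this ignores that the PCIe slot actually splits its 75W across the 3.3V and 12V rails):
[code]
# P = I * V, so I = P / V: current needed from a 12V rail for a few example loads.
RAIL_VOLTAGE = 12.0  # volts

loads = [("one 6-pin connector", 75), ("rumoured G70 total", 225), ("Pentium D, as listed", 130)]
for label, watts in loads:
    amps = watts / RAIL_VOLTAGE
    print(f"{label}: {watts}W -> roughly {amps:.1f}A at {RAIL_VOLTAGE:.0f}V")
[/code]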
-
I thought the standard Prescott was 90-something watts. My mistake.
-
Originally posted by kode
he's got a dual cpu mac with all the bells and whistles, if I remember correctly.
Single CPU, one bell, one whistle, but I guess the same power supply is used in the dual-processor ones as well. But even so, I just don't get it - as far as I know, the IBM processors actually use less power than a P4....:confused:
-
They do. Apple just probably felt like putting an overly large PSU in so they could brag about the specs. Also, the efficiency may not be quite as high as some others.
-
All I know is that my back room gets pretty warm after a while - Apple need a slap if they put that PSU in there for no good reason :mad:
-
Originally posted by Clave
But even so, I just don't get it - as far as I know, the IBM processors actually use less power than a P4....:confused:
Have you actually seen the heat sink on the G5? Those chips run dangerously close to cooking most of the time, which is a pretty good indication that they actually use a lot of power. They don't clock as fast, but there's way more going on inside them than a P4.
-
My recent upgrade to an Athlon 64 3500+ has led to headaches re: heat. The CPU gets very hot even when idle, and this is after I bought a Thermalright XP-90 heatsink, a 30CFM 92mm fan, and some Arctic Silver 5. And lapped both the heatsink and the CPU heatspreader...
Hence my planned upgrade to watercooling and a ground-up rebuild of my system. I just hope I can get a decent price for my current rig...
-
Like the G5? http://www.apple.com/powermac/design.html
It's just a bit scary when stuff is running so damn hot; there must be another way...
BTW: my office comp (a G4) is running at 57.1°C right now (ambient 25.2°C)
-
Originally posted by Descenterace
My recent upgrade to an Athlon 64 3500+ has led to headaches re: heat. The CPU gets very hot even when idle, and this is after I bought a Thermalright XP-90 heatsink, a 30CFM 92mm fan, and some Arctic Silver 5. And lapped both the heatsink and the CPU heatspreader...
Hence my planned upgrade to watercooling and a ground-up rebuild of my system. I just hope I can get a decent price for my current rig...
How hot exactly? I'm running a similar setup, an Athlon 64 3000+ and a Venus 12, and I idle at 38°C if my room is at actual room temperature. It tops out at 52°C maxed out running the UD agent.
And you lapped the heatspreader on the processor? I wouldn't think they'd need that sort of treatment. Heh. I've never actually lapped my heatsinks at all, and I get great performance.
Another thing to consider: how are you putting the AS5 compound on? If you slather the entire chip with it, it might not be transferring the heat as efficiently (though, when you think about it, it probably doesn't matter). Arctic Silver's website has been updated with a new way to apply the compound. Not only does it make perfect sense, it makes it way easier when you have to take the heatsink off. One guy I knew slathered it on thick all over the chip. Couldn't get it off unless we maxed out the CPU for a bit, got the temps up, shut down the system, and got the chip off.
-
I've run into that same problem; nearly took the pins off the chip the first time I removed it, and that was just with silicone grease!
I applied the AS5 in the 'correct' way: a very small blob in the middle of the heatspreader, then lower the XP-90 onto it (insofar as it's possible to lower a heatsink that can really only be clipped down if it pivots around one edge...). It makes a nice thin circle of compound over the core.
My temps are about the same, but I bought the XP-90 to allow me to overclock this system. Seems to have been a waste of money.
On the other hand, the fan doesn't seem to be doing much. I can actually remove the CPU fan entirely and get no difference in temperature.
Given that I'm using an ABIT (Actual is Below Indicated Temperature) mainboard, I'm not inclined to trust the built-in temperature monitor.
This system's performance should change when that massive Delta fan gets here in a week's time... 140CFM and 57dB of turbine.
-
Dood, if you're gonna do that, make a variable-resistor switch and turn it down when you don't need it topped out all the time.
I don't trust automatic thermal controls.
-
I'm investing in a 20W fan controller for this purpose. I would build my own, but I'm lazy :p
-
Are you experiencing any actual stability problems, though? I ask because I'm running something very similar, an XP-90 on an MSI Neo2 Platinum, and the temperature readings are just plain wrong. If your CPU is a Winnie (Winchester core), the problem is probably just very bad readouts from the thermistor. Something changed between the Newcastles and the Winchesters; apparently it causes quite a few motherboards to give wildly inaccurate results.
-
It's a Newcastle. And I know it becomes unstable at readings around 65-70°C, which is about right for Athlon 64s.