Originally posted by Swamp_Thing
80Hz means 80 frames per second. I seriously doubt you could see any difference in performance from 80fps to 100fps.
We are not talking about how fast a game runs or how fast a graphics card renders a frame; we are talking about how fast the monitor draws a complete frame.
Ever seen a PC monitor on video? You can see the refresh lines coming down the screen. That's because a video camera captures video at a faster speed than the monitor refreshes, which allows you to actually see the scan lines. That's what we are talking about here. Not a game's fps.
You are wrong on the video issue, Swamp_Thing.
TV and VHS/DVD video runs at 50 Hz (PAL), while a PC monitor refreshes at anywhere from 60 to 120 Hz.
The problem is that even with the monitor at 60 Hz, the camera's frame rate won't properly match the monitor's, so the scan lines still show up on tape.
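If you want to put actual numbers on it, here's a quick C sketch of why the dark band rolls across the screen on video (the 60 Hz monitor / 50 fps PAL camera figures are just example values, not anyone's exact setup):

[code]
#include <stdio.h>
#include <math.h>

/* Back-of-the-envelope: how fast the dark refresh band appears to
   roll when a video camera films a CRT. Example figures only. */
int main(void)
{
    double monitor_hz = 60.0;   /* CRT refresh rate */
    double camera_fps = 50.0;   /* PAL video camera */

    /* Each camera frame spans monitor_hz / camera_fps refresh cycles;
       the fractional part is how far the band drifts per frame. */
    double cycles_per_frame = monitor_hz / camera_fps;
    double drift = cycles_per_frame - floor(cycles_per_frame);
    if (drift > 0.5)
        drift = 1.0 - drift;    /* band can roll either direction */

    printf("Band sweeps the screen about %.1f times per second\n",
           drift * camera_fps);
    return 0;
}
[/code]

With 60 Hz against 50 fps that works out to about 10 sweeps a second; match the two rates exactly and the band would sit still.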
As for a too-high hertz setting being dangerous - that's bull****. Yes, it takes a toll on the monitor, since it has to operate closer to its limits. However, nowadays monitors won't even run if you try to force an unsupported mode on them.
The ones I saw all displayed something like: outside horizontal refresh rate OR refresh rate not supported at this resolution.
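You can actually watch the driver do that filtering for you. Here's a small C sketch using the standard Win32 calls EnumDisplaySettings and ChangeDisplaySettings - it lists the modes the display driver claims to support, then test-requests a mode without actually switching to it (the 1024x768 @ 100 Hz test values are just an example):

[code]
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* List every mode the display driver is willing to run. */
    for (i = 0; EnumDisplaySettings(NULL, i, &dm); i++)
        printf("%lux%lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);

    /* Ask "would this mode work?" without actually switching. */
    dm.dmPelsWidth        = 1024;
    dm.dmPelsHeight       = 768;
    dm.dmDisplayFrequency = 100;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
        printf("1024x768 @ 100 Hz would be accepted\n");
    else
        printf("1024x768 @ 100 Hz gets rejected, no harm done\n");

    return 0;
}
[/code]

Anything that isn't in that list the driver simply refuses, same as the monitor throwing up its "not supported" message.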
I have a Philips Brilliance 109P, and I've been running it at 100-120 Hz non-stop for the last 5 years.
Another thing: the reason you should use at least a 75-85 Hz refresh rate is that the higher the refresh rate, the less perceptible the flicker, and the less the monitor will aggravate your nervous system.
If I sit down at a system running the default 60 Hz refresh rate, I get a headache within 10 minutes. It won't "look" better at a glance, but try viewing the monitor with your peripheral vision (look "next to" it) - the difference will be immediately apparent.
Besides a fried IC, it's possible that some high-voltage equipment or a surge in the power source knocked your monitor's calibration off.
However, AFAIS Darkage can tell you how likely that is.
...and no, I've never seen a case where a monitor allowed itself to be harmfully overdriven.