Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: TopAce on October 05, 2003, 02:52:49 pm
-
I opened this thread because I am curious whether I am the only one who cannot see a difference between 16-bit and 32-bit. I always use 16-bit, simply because I find it just as nice as the 32-bit setting. The only difference I can see is that 32-bit is darker in games (Quake 3, and even FreeSpace!).
Am I the only one here with this?
-
Older games, not much of a difference. Newer games, they look a bit better with 32 bit.
-
What games do you consider 'older'?
-
Well, both of your examples are fairly old, if I remember the release date of Quake 3 correctly.
-
It's all about color depth. When you are rendering a bunch of 256-color images in an old engine, quite honestly you aren't going to use up the 65,000+ colors available. When you have a bunch of gradients, though, and full-palette images plus shading, having those 16+ million colors available in 32-bit makes things look much better.
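Rough numbers, if anyone wants them - a quick C sketch of my own (assuming 16-bit is laid out as R5G6B5 and 32-bit as R8G8B8A8, which is the usual arrangement) of how many colors each mode can actually express:

#include <stdio.h>

int main(void)
{
    unsigned long paletted = 1UL << 8;            /* 256 palette entries                    */
    unsigned long rgb565   = 1UL << (5 + 6 + 5);  /* 65,536 colors in 16-bit                */
    unsigned long rgb888   = 1UL << (8 + 8 + 8);  /* 16,777,216 colors; the extra 8 bits of
                                                     "32-bit" are usually alpha or padding  */

    printf("8-bit paletted: %lu colors\n", paletted);
    printf("16-bit:         %lu colors\n", rgb565);
    printf("24/32-bit:      %lu colors\n", rgb888);
    return 0;
}

So the jump from 16-bit to 32-bit is really a jump from about 65 thousand to about 16.7 million colors.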
-
I can't even see a change in the graphics of Jedi Knight 2 at 32-bit. :doubt:
-
Even when there is a difference, it's insignificant.
-
In still images the difference is not that big.
Actually 16-bit - in other words, 2^16 colors - is more than enough for most things.
The reason why I still vote for 32-bit is that when the colors are computed, the lower precision can make a critical difference when dealing with big gradients or anything advanced that involves mixing colors.
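To show what I mean, here's a minimal C sketch (my own toy example, not taken from any game) of the same 50/50 blend done with 8-bit channels and with the 5-bit channels a 16-bit mode gives you:

#include <stdio.h>

/* Blend two 8-bit channel values equally, but do the math at the given depth. */
static unsigned blend_half(unsigned a, unsigned b, unsigned bits)
{
    unsigned max = (1u << bits) - 1;   /* 255 at 8 bits, 31 at 5 bits         */
    unsigned qa  = a * max / 255;      /* quantize the inputs to that depth   */
    unsigned qb  = b * max / 255;
    unsigned mix = (qa + qb) / 2;      /* integer blend, truncates            */
    return mix * 255 / max;            /* scale back to 0..255 for comparison */
}

int main(void)
{
    printf("blend at 8 bits: %u\n", blend_half(200, 57, 8)); /* 128                    */
    printf("blend at 5 bits: %u\n", blend_half(200, 57, 5)); /* 123, a few shades off  */
    return 0;
}

One blend is already a few shades off; stack a few of those up across a gradient and you get visible errors.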
-
I can hardly tell. :)
-
It's noticeable in IW2. When I get home from school I'll put some screens up.
-
It makes a noticeable, but by no means crucial difference to the look of the game. With new games it makes much more of a difference.
-
There is a difference, it's visible, it's very noticeable IMHO. It's nasty in FPSes which are supposed to look uber cool but... end up not. That is all :D
-
If you run games at 800x600 or 640x480, the difference is easily discernible. 16-bit color (on nVidia cards at least) introduces a visible dither pattern onscreen. You can also get banding with lots of layered transparencies.
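For the curious, that dither pattern comes from the driver trading the missing bits for a fixed screen-space pattern. Here's a rough C sketch of a generic 2x2 ordered dither from 8 bits down to 5 - not the actual driver code, just the idea:

#include <stdio.h>

static const int bayer2x2[2][2] = { { 0, 2 },
                                    { 3, 1 } };

/* Reduce an 8-bit channel to 5 bits, nudged by the pixel's screen position. */
static unsigned dither_to_5bit(unsigned value, int x, int y)
{
    unsigned nudge = bayer2x2[y & 1][x & 1] * 2;  /* spread the rounding threshold over 0..6 */
    unsigned v = value + nudge;
    if (v > 255) v = 255;
    return v >> 3;                                /* keep the top 5 bits */
}

int main(void)
{
    /* A flat mid-grey comes out as a checkerboard of two adjacent 5-bit levels. */
    for (int y = 0; y < 2; y++) {
        for (int x = 0; x < 4; x++)
            printf("%2u ", dither_to_5bit(140, x, y));
        printf("\n");
    }
    return 0;
}

That checkerboard of neighbouring levels is exactly the pattern you can pick out on flat surfaces in 16-bit.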
-
There's an obvious difference if you know where to look. If you have on-screen a gradient of the rainbow in 16-bit, you'll see banding ("sharp" borders or steps between colors). 32-bit has (to our eyes, at least) no banding.
On the flip side, I've always had problems with 32-bit color depths making the mouse cursor, of all things, run a bit laggy - enough to be a hindrance in an RTS like Ground Control or HW.
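To put rough numbers on the banding: a minimal C sketch (my own figures, assuming a 1024-pixel-wide ramp, R5G6B5 for 16-bit and R8G8B8 for 32-bit) of how wide each colour band ends up:

#include <stdio.h>

int main(void)
{
    int width    = 1024;      /* pixels the gradient is stretched across  */
    int levels16 = 1 << 5;    /* 32 shades in a 5-bit red or blue channel */
    int levels32 = 1 << 8;    /* 256 shades in an 8-bit channel           */

    printf("16-bit: bands about %d px wide\n", width / levels16);  /* 32 px - obvious steps     */
    printf("32-bit: bands about %d px wide\n", width / levels32);  /*  4 px - smooth to the eye */
    return 0;
}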
-
There is indeed a noticeable difference between 16-bit and 32-bit. For example, in some games backgrounds may look blocky because the colours do not blend well enough if you use 16-bit. On the other hand, most current video cards can render 32-bit as fast as 16-bit or even faster, because of optimizations for 32-bit.
So if you've got a GeForce- or Radeon-series video card, you shouldn't have any slowdowns because of 32-bit, and colour blending is much better.
-
Originally posted by Sandwich
On the flip side, I've always had problems with 32-bit color depths making the mouse cursor, of all things, run a bit laggy - enough to be a hindrance in an RTS like Ground Control or HW.
Indeed...
-
First, it should be pointed out that '32 bit' is really '24 bit with 8 bits being ignored', 99% of the time.
The places where you'll see a serious difference between 16-bit and 24/32-bit rendering modes are in programs that use specularity mapping, luminosity mapping, shadow mapping, normal mapping, reflection mapping, transparency mapping, bump mapping, etc. In anything else, you only see an increase in color depth, which is negligible.
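A minimal C sketch of that point (my own numbers, not Bobboau's): four hypothetical passes accumulating light through a framebuffer that only keeps 5 bits per channel, assuming plain truncation on each write:

#include <stdio.h>

/* Write an 8-bit channel into a 5-bit framebuffer and read it back
   (assumes plain truncation; real hardware may round or dither).   */
static unsigned store_5bit(unsigned v)
{
    return (v >> 3) << 3;
}

int main(void)
{
    /* Four hypothetical passes (base, lightmap, specular, glow), each adding
       some light, each written through the 16-bit framebuffer.              */
    unsigned add[4] = { 90, 40, 30, 20 };
    unsigned exact = 0, banded = 0;

    for (int i = 0; i < 4; i++) {
        exact  += add[i];                       /* ideal accumulation         */
        banded  = store_5bit(banded + add[i]);  /* quantized after every pass */
    }
    printf("exact: %u, through a 16-bit buffer: %u\n", exact, banded);
    return 0;
}

The exact sum comes out 180, the truncated one 168 - an error of more than one full 5-bit step after only four passes, which is why heavily layered effects suffer far more than flat textures do.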
-
The difference is, 32-bit is A1SUPAR.
-
...if you take advantage of it.
Otherwise it's a waste of resources.
Even 16-bit is enough to show more colors than you can distinguish from each other.
-
Maybe someone with a better eye can see more of a difference. But I agree with Flaser: it is a needless waste of resources and framerate. I lower the brightness by 20%, and I get the same quality.