Hard Light Productions Forums

Off-Topic Discussion => General Discussion => Topic started by: TopAce on October 05, 2003, 02:52:49 pm

Title: 16-bit and 32-bit
Post by: TopAce on October 05, 2003, 02:52:49 pm
I opened this thread because I'm curious whether I'm the only one who can't see a difference between 16-bit and 32-bit. I always use 16-bit, simply because I find it as nice as the 32-bit setting. The only difference I can see is that 32-bit is darker in games (Quake 3, and even FreeSpace!)

Am I the only one here with this?
Title: 16-bit and 32-bit
Post by: Grey Wolf on October 05, 2003, 02:56:05 pm
Older games, not much of a difference. Newer games, they look a bit better with 32-bit.
Title: 16-bit and 32-bit
Post by: TopAce on October 05, 2003, 02:56:52 pm
What games do you consider 'older'?
Title: 16-bit and 32-bit
Post by: Grey Wolf on October 05, 2003, 03:02:06 pm
Well, both of your examples are fairly old, if I remember the release date of Quake 3 correctly.
Title: 16-bit and 32-bit
Post by: StratComm on October 05, 2003, 03:02:44 pm
It's all color depth, and when you're rendering a bunch of 256-color images in an old engine, quite honestly you aren't going to use up the 65+ thousand colors available.  When you have a bunch of gradients and such, though, and full-palette images plus shading, having those 16.7 million colors available in 32-bit makes things look much better.
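
Just to put numbers on it, here's a quick Python sketch of the raw counts (nothing engine-specific, just the arithmetic):

Code:
for name, bits in [("8-bit (palette)", 8),
                   ("16-bit (high color)", 16),
                   ("24-bit (true color)", 24)]:
    print(f"{name}: {2 ** bits:,} colors")

# 8-bit (palette): 256 colors
# 16-bit (high color): 65,536 colors
# 24-bit (true color): 16,777,216 colors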
Title: 16-bit and 32-bit
Post by: TopAce on October 05, 2003, 03:03:02 pm
I can't see any change in the graphics of Jedi Knight 2 even at 32-bit. :doubt:
Title: 16-bit and 32-bit
Post by: Grey Wolf on October 05, 2003, 03:04:20 pm
Even when there is a difference, it's insignificant.
Title: 16-bit and 32-bit
Post by: Flaser on October 05, 2003, 03:58:28 pm
In still images the difference is not that big.

Actually 16-bit - in other words, 2^16 = 65,536 colors - is more than enough for almost anything.

The reason why I still vote for 32-bit is that when the colors are computed, it can make a critical difference with big gradients or anything advanced that mixes colors.
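
If you want to see the gradient problem concretely, here's a minimal Python sketch, assuming the 16-bit mode is RGB565 (the usual layout, where red and blue get only 5 bits each):

Code:
def to_rgb565_red(v8):
    # keep the top 5 bits of an 8-bit red value, then expand back to 8
    v5 = v8 >> 3
    return (v5 << 3) | (v5 >> 2)

ramp = list(range(256))                      # smooth 24-bit red ramp
banded = [to_rgb565_red(v) for v in ramp]    # the same ramp at 16-bit
print(len(set(ramp)), "levels at 24-bit")    # 256
print(len(set(banded)), "levels at 16-bit")  # 32 -> visible steps

Thirty-two steps across a whole ramp is exactly the kind of banding you see in a big sky gradient.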
Title: 16-bit and 32-bit
Post by: Setekh on October 06, 2003, 05:03:54 am
I can hardly tell. :)
Title: 16-bit and 32-bit
Post by: phreak on October 06, 2003, 09:05:29 am
It's noticeable in IW2. When I get home from school I'll put some screens up.
Title: 16-bit and 32-bit
Post by: pyro-manic on October 06, 2003, 09:08:43 am
It makes a noticeable, but by no means crucial, difference to the look of the game. With newer games it makes much more of a difference.
Title: 16-bit and 32-bit
Post by: vyper on October 06, 2003, 09:10:57 am
There is a difference, it's visible, it's very noticeable IMHO, and it's nasty in FPSes which are supposed to look uber-cool but... end up not. That is all :D
Title: 16-bit and 32-bit
Post by: ZylonBane on October 06, 2003, 04:32:59 pm
If you run games at 800x600 or 640x480, the difference is easily discernible. 16-bit color (on nVidia cards at least) introduces a visible dither pattern onscreen. You can also get banding with lots of layered transparencies.
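
For the curious: the dither is the driver trading banding for noise. Here's a rough Python sketch of the general idea, using a standard 4x4 Bayer matrix - the constants are illustrative, not what any particular card actually does:

Code:
BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def dither_to_5bit(v8, x, y):
    # position-dependent threshold in [0, 8), one 5-bit step wide
    bias = BAYER4[y % 4][x % 4] * 8 / 16.0
    return min(31, int((v8 + bias) / 8))

# A flat mid-gray area alternates between the two nearest 5-bit
# levels instead of quantizing every pixel the same way:
print([dither_to_5bit(140, x, 0) for x in range(8)])
# [17, 18, 17, 18, 17, 18, 17, 18]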
Title: 16-bit and 32-bit
Post by: Sandwich on October 06, 2003, 06:23:34 pm
There's an obvious difference if you know where to look. If you have on-screen a gradient of the rainbow in 16-bit, you'll see banding ("sharp" borders or steps between colors). 32-bit has (to our eyes, at least) no banding.

On the flipside, I've always had problems with 32-bit color depths making the mouse cursor of all things run a bit laggy - enough to be a hindrance in an RTS like Ground Control or HW.
Title: 16-bit and 32-bit
Post by: Fury on October 07, 2003, 08:10:14 am
There is indeed a noticeable difference between 16-bit and 32-bit. For example, in some games backgrounds may look blocky because the colours do not blend well enough if you use 16-bit. On the other hand, most current video cards can render 32-bit as fast as 16-bit, or even faster, because of optimizations for 32-bit.

So, if you've got a GeForce or Radeon series video card, you shouldn't have any slowdowns because of 32-bit, and colour blending is much better.
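
A rough Python sketch of why the blending bands (hypothetical numbers - just the red channel of a glow fading over a background):

Code:
def blend(dst, src, alpha):
    return dst * (1 - alpha) + src * alpha

background, glow = 40, 220
ramp = [blend(background, glow, a / 255) for a in range(256)]

levels24 = {round(v) for v in ramp}             # 8 bits per channel
levels16 = {round(v / 255 * 31) for v in ramp}  # 5 bits per channel
print(len(levels24), "distinct shades at 24-bit")  # 181
print(len(levels16), "distinct shades at 16-bit")  # 23 -> visible bands

A smooth fade that should pass through 181 shades only gets 23 of them in a 16-bit framebuffer, so the background steps.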
Title: 16-bit and 32-bit
Post by: Hippo on October 07, 2003, 08:14:12 am
Quote
Originally posted by Sandwich

On the flipside, I've always had problems with 32-bit color depths making the mouse cursor of all things run a bit laggy - enough to be a hindrance in an RTS like Ground Control or HW.


Indeed...
Title: 16-bit and 32-bit
Post by: mikhael on October 07, 2003, 01:06:32 pm
First, it should be pointed out that '32 bit' is really '24 bit with 8 bits being ignored', 99% of the time.

The places where you'll see a serious difference between 24-bit and 32-bit rendering modes are programs that use specularity mapping, luminosity mapping, shadow mapping, normal mapping, reflection mapping, transparency mapping, bump mapping, etc. In anything else, you only see an increase in color depth, which is negligible.
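
A tiny Python sketch of the "24 bits plus 8 ignored" point, assuming the common XRGB8888 packing:

Code:
def pack_xrgb8888(r, g, b, x=0):
    # 8 unused (or alpha) bits, then 8 bits each of red, green, blue
    return (x << 24) | (r << 16) | (g << 8) | b

pixel = pack_xrgb8888(200, 64, 255)
print(hex(pixel))            # 0xc840ff -- the top byte stays empty
print((pixel >> 16) & 0xFF)  # red channel back out: 200

Only 24 of the 32 bits describe the color; the rest is padding or alpha, which is why the color depth itself doesn't improve past 24-bit.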
Title: 16-bit and 32-bit
Post by: Zeronet on October 07, 2003, 01:08:59 pm
The difference is, 32-bit is A1SUPAR.
Title: 16-bit and 32-bit
Post by: Flaser on October 07, 2003, 01:17:18 pm
...if you take advantage of it.

Otherwise it's a waste of resources.

Even 16-bit is enough to show more colors than you can distinguish from each other.
Title: 16-bit and 32-bit
Post by: TopAce on October 07, 2003, 02:43:30 pm
Maybe someone with a better eye can see more of a difference. But I agree with Flaser: it's a needless waste of resources and framerate. I lower the brightness by 20% and get the same quality.