Author Topic: Shader Model 3.0  (Read 2027 times)


No, I mean HDR.  It implies a large dynamic range of brightnesses, and I'm pretty sure modern displays simply can't do it.  An LCD has like 8 bits per channel if it's a good one, and it's not enough for real HDR.  CRTs are a bit better, but even they aren't exactly up to being called High Dynamic.

 

Offline ZylonBane

  • The Infamous
  • 29
You know, I could have sworn that Giants: Citizen Kabuto was doing years ago what everyone is now calling HDR.

Everything old is new again.
ZylonBane's opinions do not represent those of the management.

 

Offline CP5670

  • Dr. Evil
  • Global Moderator
  • 212
The AOE3 HDR looks rather messed up from what I'm seeing in screenshots. It's overdone. Far Cry and SCCT provide much better examples of what it can do, although you have to tone it down in FC to get it to look right. Does anyone know how the pseudo HDR in Half Life 2 looks?

Quote
The reason some games support HDR only on nVidia cards is because the nV cards support 32 bit framebuffers. I'm fairly sure that those aren't actually part of the SM3.0 specification, and are just something NV does and ATi doesn't.


It seems that the X1800 line will have the OpenEXR FP16 support in addition to SM3 since games are starting to use that. SM3 and HDR are quite separate as you said, but they seem to go together for some reason in terms of both video card and game support.

Quote
No, I mean HDR. It implies a large dynamic range of brightnesses, and I'm pretty sure modern displays simply can't do it. An LCD has like 8 bits per channel if it's a good one, and it's not enough for real HDR. CRTs are a bit better, but even they aren't exactly up to being called High Dynamic.


Some CRTs and 10-bit LCDs can display something closer to HDR. Well, technically CRTs have no hard limit on how many brightness levels they can display, but that's without considering external lighting and glare. Of course, none of the gaming video cards can output more than 32-bit color, so it's kind of a moot point. I think a range-compression technique called tone mapping is used to show HDR images on a normal 24/32-bit screen.
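As a rough sketch of that last idea (assuming a simple Reinhard-style operator purely for illustration, not any particular game's implementation):

```python
# Minimal sketch of tone mapping: compressing an HDR luminance value
# into the 0-255 range of one 8-bit channel on a normal display.
# The Reinhard operator L/(1+L) is one common illustrative choice.

def reinhard(luminance):
    """Map HDR luminance (0..infinity) into [0, 1)."""
    return luminance / (1.0 + luminance)

def to_8bit(luminance):
    """Quantize the tone-mapped value to one 8-bit channel."""
    return round(reinhard(luminance) * 255)

# Mid-grey input stays mid-grey, while a value 10x reference white is
# compressed onto the screen instead of clipping at 255.
print(to_8bit(0.5), to_8bit(10.0))
```

The point is that arbitrarily bright scene values still land somewhere below 255, so the image keeps detail that a plain clamp would throw away.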
« Last Edit: September 30, 2005, 01:52:26 am by 296 »

 
Quote
Originally posted by phatosealpha
No, I mean HDR.  It implies a large dynamic range of brightnesses, and I'm pretty sure modern displays simply can't do it.  An LCD has like 8 bits per channel if it's a good one, and it's not enough for real HDR.  CRTs are a bit better, but even they aren't exactly up to being called High Dynamic.


Yeah, and games do it by EMULATING it. HDR in games does not require an HDR monitor; in fact, the whole point of doing it in the graphics hardware is so you don't NEED a special monitor.
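To illustrate the emulation point (a toy sketch, not any engine's actual code): the trick is rendering with extra range internally, then mapping it down for an ordinary monitor at the very end.

```python
# Toy comparison: an 8-bit-style framebuffer clamps overbright values
# the moment they are written, while an HDR (e.g. FP16) buffer keeps
# them so a final tone-map pass decides how they appear on screen.

def ldr_write(value):
    # Traditional pipeline: clamp to [0, 1] at write time.
    return min(max(value, 0.0), 1.0)

def hdr_display(values):
    # Keep the full range while rendering; compress only at display
    # time (simple Reinhard-style map, for illustration only).
    return [v / (1.0 + v) for v in values]

lights = [0.5, 4.0, 16.0]
clipped = [ldr_write(v) for v in lights]  # 4.0 and 16.0 both become 1.0
mapped = hdr_display(lights)              # bright sources stay distinct
print(clipped)
print(mapped)
```

In the clamped version the two bright lights are indistinguishable; in the HDR version their relative intensity survives all the way to the display step, which is why no special monitor is needed.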
'And anyway, I agree - no sig images means more post, less pictures. It's annoying to sit through 40 different sigs telling about how cool, deadly, or assassin like a person is.' --Unknown Target

"You know what they say about the simplest solution."
"Bill Gates avoids it at every possible opportunity?"
-- Nuke and Colonol Drekker

  

Offline CP5670

  • Dr. Evil
  • Global Moderator
  • 212
Doesn't hurt to have one though. :D

It's too bad HDR causes a sizeable performance hit with current cards. It's not as much as many people say (about the same as 4xAA), but I have to drop down a resolution level to use it. Still well worth it in games where it's supported, though.

 

Offline Turnsky

  • FOXFIRE Artisté
  • 211
  • huh?.. Who?.. hey you kids, git off me lawn!
Newer Radeon cards use Shader Model 2.0a and 2.0b, I think.
   //Warning\\
---------------------------------------------------------------------------------
do not torment the sleep deprived artist, he may be vicious when cornered,
in case of emergency, administer caffeine to the artist,
he will become docile after that,
and less likely to stab you in the eye with a mechanical pencil
-----------------------------------------------------------------------------------

 

Offline Prophet

  • 210
  • The know-it-all
Stupid shaders... I bought Project Snowblind recently, then noticed that my Radeon 9200 can't handle the game.
It makes me angry, because the first two levels work like a dream, look great and run smoothly. Then the third level is just this grayish-green haze; only the weapon and HUD displays are drawn correctly. Other people at the Eidos forums are reporting the same problem and think it may have something to do with pixel shaders... So I can't play the game until I get a new graphics card.
BTW, due to the numerous bugs present in Snowblind, Eidos has decided not to patch the game. Well done, Eidos...

Anyway, this is my opinion on shaders: :ick:
I'm not saying anything. I did not say anything then and I'm not saying anything now. -Dukath
I am not breaking radio silence just cos' you lot got spooked by a dead flying ****ing cow. -Sergeant Harry Wells/Dog Soldiers


Prophet is walking in the deep dark places of the earth...

 

Offline DaBrain

  • Screensniper
  • 212
    • Shadows of Lylat board
The main problem is (imho) that most people think a game has to use pixel shaders to look good, which is just not true.

There are many games that rely on pixel shaders too much while the content is pretty bad (low-res maps, low-poly models).


Pixel shaders are great. I think FS2 could look incredibly cool with some shader effects, but we should never forget that shaders are no guarantee of good graphics.
--------------------------------------------------
SoL is looking for a sound effect artist
Please PM me in case you want to apply
---------------------------------
Shadows of Lylat - A Freespace 2 total conversion
(hosted by Game-Warden)
----------------------------------

 

Offline CP5670

  • Dr. Evil
  • Global Moderator
  • 212
You might be right about that. Textures in modern games simply suck. They look quite pathetic compared to what we had a few years ago and the detail texturing effect is nowhere to be seen. The models have been okay though.

With FS2 we already have very good content (except for high res textures for the FS2-only ships, but those are going to be tough to do), so things like bloom on bright objects or even full blown HDR might look nice.
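A bloom pass of the sort mentioned above is conceptually simple. Here is a hypothetical 1-D sketch (real implementations run on the GPU, typically with separate horizontal and vertical blur passes):

```python
# Hypothetical 1-D sketch of bloom: extract the overbright part of the
# image, blur it, and add it back so bright objects appear to glow.

def bright_pass(row, threshold=1.0):
    """Keep only the energy above the threshold."""
    return [max(v - threshold, 0.0) for v in row]

def box_blur(row):
    """3-tap box blur with clamped edges."""
    n = len(row)
    return [(row[max(i - 1, 0)] + row[i] + row[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

def bloom(row, threshold=1.0):
    """Original image plus the blurred overbright component."""
    glow = box_blur(bright_pass(row, threshold))
    return [v + g for v, g in zip(row, glow)]

# One bright pixel (2.0) bleeds light onto its dim neighbours.
print(bloom([0.2, 2.0, 0.2]))
```

The blurred overbright term is what makes a sun or engine glow spill over the edges of the model, which is cheap compared to full HDR rendering.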

 

Offline aldo_14

  • Gunnery Control
  • 213
The sheer work requirements of most modern games IMO mean that you'll see a continuing drop in texture quality as artists get overloaded.  Just look at all the GTA clones promising a world of xx km; that's got to require corner-cutting by the artists to get it done on time, whether it's reuse of assets, a generally lower quality of textures/models, or resorting to less visually impressive procedural techniques.

 

Offline IceFire

  • GTVI Section 3
  • 212
    • http://www.3dap.com/hlp/hosted/ce
Quote
Originally posted by Turnsky
newer radeon cards use shader 2.0a, and 2.0b i think

The next generation, which is due out in a few days now, will support SM3.0.

Yes they dropped the ball on that a bit...but eh.
- IceFire
BlackWater Ops, Cold Element
"Burn the land, boil the sea, you can't take the sky from me..."