Hard Light Productions Forums
Modding, Mission Design, and Coding => FS2 Open Coding - The Source Code Project (SCP) => Topic started by: Mr_Maniac on July 12, 2006, 05:18:24 am
-
Hi there!
Some people are using HDRish to get bloom effects in OpenGL games (it's ATI-only).
nVidia users who want bloom too can look at this:
http://doom3.filefront.com/file/DOOM_3_Bloom;50594 (http://doom3.filefront.com/file/DOOM_3_Bloom;50594)
It should work with any OpenGL game; at least it's been tested with jfDuke3D:
Screenshot (http://img66.imageshack.us/my.php?image=duke3dbloom4vs.jpg)
I can't test it, because it needs at least a GeForce 4 Ti and I only have a GeForce 3...
But I hope that it'll be useful for some of you guys ;)
-
Incidentally, file this (http://scp.indiegames.us/forum_viewtopic.php?10.1387) under reasons why you should check the Dev site occasionally :D
-
Well, I can't get it to work - although it does break the HUD and crash my game. :)
-
Incidentally, file this (http://scp.indiegames.us/forum_viewtopic.php?10.1387) under reasons why you should check the Dev site occasionally :D
am i missing something here? that link goes straight to http://scp.indiegames.us/forum.php (http://scp.indiegames.us/forum.php)
-
This doesn't work - it breaks the HUD and that's it. I have spoken with the developer of this plugin. He told me he'd take a look at Freespace Open and see about compatibility for it, but don't hold your breath as I get the impression the developer is not really actively developing the plugin.
-
Ah. Looks like you have to have logged in to get access to the Feature Requests forum.
-
If you have a good gfx card, it's mostly useless if you want to play at normal fps. But if you don't have a good gfx card and still want bloom, then this is the solution for you.
-
Of course, everyone could just wait until we actually get the shader code ready for daily use. Then it won't matter which friggin video card you have so long as it supports shaders.
-
yes taylor, but we want results NOW!!!! you know the score - users don't care about your hard work and the nifty things you've done to make the code elegant - they just want it to work x way all the time, and ASAP!!! :nod: :cool:
-
Of course, everyone could just wait until we actually get the shader code ready for daily use. Then it won't matter which friggin video card you have so long as it supports shaders.
Taylor, I have a question. My GeForce FX has problems running shader code in DirectX 9 in 2.0 mode. But I was looking at the wiki entry for the GeForce FX, and apparently the performance problems don't exist in OpenGL, because OpenGL interfaces with the shaders in a different way (I don't know much about Direct3D or OpenGL, btw).
Can you comment on this and let me know what the score is? I have too many expenses right now to buy a new card for this machine, and when I have extra money for the computer it's going to go toward upgrading to second-generation PCIe anyway, so there's no point in wasting cash on an AGP card this late; I'll have to suffer along for a while.
-
yes taylor, but we want results NOW!!!! you know the score - users don't care about your hard work and the nifty things you've done to make the code elegant - they just want it to work x way all the time, and ASAP!!! :nod: :cool:
Actually, the shader work is being done by wolf. I'm cleaning up his diffs and adding more error checking/handling (I'm pretty anal about that, makes debugging a lot easier for me), plus a few other minor things, but the bulk of the work is being done by him. And he's doing a great job too, I just haven't had the time to keep up with him. 3.6.9 isn't going to get this though and that's where all of my attention is focused. But after 3.6.9 ships then I plan to get the shader code in CVS for all of the bleeding-edge builds. The code does use GLSL (since that works best in OpenGL), so I don't know how we are going to deal with D3D and the user-made shaders yet, but then I don't much care either. ;)
EDIT: wolf posted these shots to the scp.indiegames.us forums:
http://team.pld-linux.org/~wolf/krtqavv.png
http://team.pld-linux.org/~wolf/zgqgapi.png
"These are pure in-game screenshots, no smartshaders or anything were used. The effect is not final." - wolf
Taylor, I have a question. My GeForce FX has problems running shader code in DirectX 9 in 2.0 mode. But I was looking at the wiki entry for the GeForce FX, and apparently the performance problems don't exist in OpenGL, because OpenGL interfaces with the shaders in a different way (I don't know much about Direct3D or OpenGL, btw).
Can you comment on this and let me know what the score is? I have too many expenses right now to buy a new card for this machine, and when I have extra money for the computer it's going to go toward upgrading to second-generation PCIe anyway, so there's no point in wasting cash on an AGP card this late; I'll have to suffer along for a while.
I think you'll be fine with what you've got. There will be initial test builds in the next month or so in order to get a quick performance rundown of the code before it hits CVS. We'll all have a better idea of how it will perform at that point, but optimizations and performance tuning will certainly follow since I like to tinker with that stuff. :)
-
That is outstanding stuff. Can't wait to see it when it's complete! :yes:
-
:((( but i wanted to be the first to bring bloom into freespace source :(
anyway, I've been searching these past days for a way to turn my software bloom into a hardware one.
Then I found GLSL, but unfortunately when I searched for a tutorial I just couldn't find one decent tutorial. They all just began with a "void main" function and then the shader code, and I was like "what the hell? but there is already a main in the program!"
And there wasn't one tutorial which explained where to write this "void main" thing.
-
Then I found GLSL, but unfortunately when I searched for a tutorial I just couldn't find one decent tutorial. They all just began with a "void main" function and then the shader code, and I was like "what the hell? but there is already a main in the program!"
And there wasn't one tutorial which explained where to write this "void main" thing.
"Shaders" is a generic term that's largely used incorrectly. OpenGL names it properly, "vertex program", rather than "vertex shader" like D3D does. I generally use the "shader" term to describe it all since most people seem to understand its reference more readily. A shader is a program though, so it is laid out like a program, with a main() function and everything. It is then compiled (when loaded the first time) and executed at a particular point in the rendering pipeline.
I don't really know of any good tutorials (I'm still looking to understand it better myself), but if you go over the docs for ARB_vertex_program and ARB_fragment_program then there are some examples and much of the terminology/functionality used.
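For anyone following along, a shader source really is laid out like a tiny self-contained program. A minimal (and deliberately useless) GLSL vertex/fragment pair might look like this - purely an illustration, not anything from the fs2_open code:

```glsl
// Vertex program -- runs once per vertex.  (In practice this and the
// fragment program below are two separate source strings; they are
// shown together here only for illustration.)
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// Fragment program -- runs once per rendered pixel.  This one just
// paints everything solid red.
void main()
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```

The game never runs these itself; it hands each source to the driver as plain text (via glShaderSource()/glCompileShader() in GLSL's case) and the driver compiles them for the card at load time.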
-
You didn't have to say all that. I knew everything you said except the last part. The only thing I need is where to compile the void main function, or at least where to include it and how to call the shader from the OpenGL program itself, i.e. beam.cpp.
I'll look tomorrow for the ARB_fragment_vertex_program or whatever the name is, because now it's night and I need some sleep.
-
coder toes-stepping :eek: ;)
-
you didn't have to say all that. i knew everything you said except the last part
Not everyone else does though. It's better to be descriptive enough to help everyone understand what we're talking about. :)
the only thing i need is where to compile the void main function. or at least, where to include it and how to call the shader from the opengl program
You don't compile it, the GL does. You just have to send the program as text to the card via the extension functions and it compiles it. But don't go re-inventing the wheel here; the interface for this has been in the works for about 3 weeks now.
-
great, looking forward to it when it is in. It keeps getting better.
-
I already see Shader-TBMs coming to fs2_open ;)
That'll be great...
What GFX-Card do I need to make use of GLSL?
Because my GeForce 3 does have shaders already... But an old Version...
Hell... I really need a new PC :(
-
i think you need pixel shader 2.0 but im not sure.
-
hi,
GLSL is only a language, and it's a component of OpenGL 2.0.
I don't really know if GLSL works with older versions of OpenGL, like 1.x, but any recent graphics card driver today has support for OpenGL 2.0.
If the hardware isn't 100% compatible, the driver emulates the missing features.
So I think it should work, but I'm not sure.
Maybe it's possible in the program to tell it to compile the code down to an older version.
In HLSL that's possible: you can write code once and compile it to many different pixel shader versions for Direct3D, imho.
Mehrpack
-
What GFX-Card do I need to make use of GLSL?
Because my GeForce 3 does have shaders already... But an old Version...
GeForce FX or Radeon 9500 are minimum.
-
What GFX-Card do I need to make use of GLSL?
Because my GeForce 3 does have shaders already... But an old Version...
GeForce FX or Radeon 9500 are minimum.
I think it depends more on what the effect is, really. I'm pretty sure that GLSL vertex shaders are supported in hardware on the GeForce3 and above, at the least. Some other things are emulated on the old GeForce3+ cards. We'll just have to be sure to support various feature levels, getting the most functionality out of the greatest number of cards. But we'll do so without turning the code into complete spaghetti in the process, of course. :)
-
Yey!
Pure Happiness! ;)
It's unbelievable, what you guys made so far with the FS2-Code!
And it's also unbelievable what you will be making!
I love it!
-
yes, it is all rather impressive, isn't it :D
-
it's too bad the campaigns don't move at the same speed as the source code.
-
We've got to wait for all of the data to catch up too. The MediaVPs are still lagging behind at the moment. With the shader support, we'll actually need shader effects, so that will be something for people to come up with. And I've got partially working bumpmap code with these test images I've been sent, so that will be something else that the map makers will have to deal with when that hits CVS. I'll probably add the bumpmap support about the same time as the shader support hits CVS so those things will hit everyone at once. The problem is, of course, that those features are useless without the data to go with them. ::)
-
it's too bad the campaigns don't move at the same speed as the source code.
that's 'cause every time the modders finish something, the SCP guys give 'em something else to implement. :D
I'd love to spend more time on my mod, but it's just not a feasible thing for an adult to do. Sure, people who are in high school or college have plenty of time to mod; once you get older, your free time gets eaten alive! Coders can do it only because their brains have been conditioned to be efficient. taylor must be a mentat :D
-
Remember, for every feature the SCP add, most campaigns have a large number of ships to update. Many of us are still trying to catch up with the HT&L updates, let alone animated subsystems etc etc, and the new shader system is going to kick up another round of it ;)
-
taylor must be a mentat Big grin
<mentat>"By the juice of Safu I set my mind in motion, my thoughts acquire speed, my lips acquire stains, the stains become a warning. By the juice of Safu I set my mind in motion"</mentat>
-
[...] The problem is, of course, that those features are useless without the data to go with them. ::)
hi,
they are not useless; definitely not.
Because if there's no sign of a feature, the people who have the skills see no real reason to start creating things for it, and they can't really test or work on content without the new features anyway.
But once it's in, the things will come out.
It's true, though, that it takes time to make use of any new feature.
I think it's normal that there's a big gap between new features coming out and the release of content that supports them.
I mean, look at SSE or a new DirectX version or OpenGL release.
It takes some time before the first games or software are released that really use the new features.
So it's not right to say that without content it's useless to release new features.
Mehrpack
-
taylor must be a mentat Big grin
<mentat>"By the juice of Safu I set my mind in motion, my thoughts acquire speed, my lips acquire stains, the stains become a warning. By the juice of Safu I set my mind in motion"</mentat>
THAT WAS NEVER IN THE BOOK!!!!! :D
though I'm sure taylor hits the sapho from time to time. The guy's just a genius. Anyway, if the shader language is about the same as Lua script, I should be able to do some shaders. I've done a couple of shader scripts for Quake games, so if it's anything like those, I can hack it.
-
EDIT: wolf posted these shots to the scp.indiegames.us forums:
http://team.pld-linux.org/~wolf/krtqavv.png
http://team.pld-linux.org/~wolf/zgqgapi.png
"These are pure in-game screenshots, no smartshaders or anything were used. The effect is not final." - wolf
Wow... those pics are pretty nice. I can see the well-known blooming problems, though: colorful effects get oversaturated, just like the warp effect. Anyway, I hope it will be possible to fix the shader effect, or we could just optimize the content for it.
Is wolf posting at indiegames only? Cause I'd really like to congratulate him for this. :)
-
Wow... those pics are pretty nice. I can see the well-known blooming problems, though: colorful effects get oversaturated, just like the warp effect. Anyway, I hope it will be possible to fix the shader effect, or we could just optimize the content for it.
wolf said it wasn't a final effect (I've seen slightly different versions already), plus you'll be able to use your own if you want. The effects will work like tbl's basically, functionality wise, where the defaults are stored in the executable, but you can override those with versions on the disk or in a VP.
And I still want per-effect/per-ship effects rather than the overall effect that it is now. That's slower of course, but it's something new for me to spend countless hours trying to optimize. :D Plus, it would give better control to the artists/designers out there so I think that the work would be worth it in the end.
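The tbl-style override order described above (defaults stored in the executable, overridden by a version on disk or in a VP) could be sketched roughly like this; every name here is invented for illustration and is not from the fs2_open code:

```cpp
#include <map>
#include <optional>
#include <string>

// Hypothetical representation of a loaded shader source and where it
// came from.
struct ShaderSource {
    std::string text;
    std::string origin;
};

// Look for an override on disk or inside a VP archive (modeled here as
// a simple name->source map for the sketch).
std::optional<ShaderSource> find_on_disk(
    const std::string& name,
    const std::map<std::string, std::string>& disk_files)
{
    auto it = disk_files.find(name);
    if (it == disk_files.end()) return std::nullopt;
    return ShaderSource{it->second, "disk/VP"};
}

// The disk/VP copy wins over the built-in default, just like tbl/tbm
// data overriding retail tables.
ShaderSource load_shader(
    const std::string& name,
    const std::map<std::string, std::string>& disk_files,
    const std::map<std::string, std::string>& builtin_defaults)
{
    if (auto found = find_on_disk(name, disk_files)) return *found;
    return ShaderSource{builtin_defaults.at(name), "built-in"};
}
```

The nice property of this scheme, as the post says, is that mods get control without the executable having to change.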
Is wolf posting at indiegames only? Cause I'd really like to congratulate him for this. :)
He's posted in this thread already, but I'm not sure how often he checks here. Feel free to congratulate away though, the man certainly deserves it. :)
-
And I still want per-effect/per-ship effects rather than the overall effect that it is now. That's slower of course, but it's something new for me to spend countless hours trying to optimize. :D Plus, it would give better control to the artists/designers out there so I think that the work would be worth it in the end.
Like I said:
Shader-TBMs ;)
-
wolf said it wasn't a final effect (I've seen slightly different versions already)
There's some tweaking needed, but basically that's how it will look.
And I still want per-effect/per-ship effects rather than the overall effect that it is now.
We will need artists to create greyscale textures that will represent the amount of glow. Right now everything is maxed out, and it can look very bad in some cases.
-
can't we just use the glowmaps for that? we already have too many maps to work with as it is.
-
Well, I think that is something the coders are looking into, but I'm not certain. I know the idea was thrown around for a material format that would put all the layers for a texture into one file, but I don't know if anything came of it beyond that. I'll leave it to the people in the know to let you know ;)
-
I wonder if there will ever be a multilayer DDS format that lets you put in as many channels as you want while it's still compressed and can be loaded into vidmem compressed. That would be awesome. There was the concept of putting a bunch of greyscale maps into different channels of a texture, using red for shine, green for env, blue for bump, alpha for shine, etc. I was against color shinemaps at first for this reason, but the iridescence effect is too nice to do away with.
-
well, if software shaders still count, then I've improved the software bloom shader.
I've made a function with more control. Sorry, but it's the same speed (well, it's better speed, but there are now more options, so it works out to the same speed), and better bloom.
taylor - patch: http://planet.nana.co.il/itay390d/bloom.txt
There are bloom and bloom2 functions; bloom is the previous function and bloom2 is now the current one.
The problem that the screen gets lightened when a friendly ship is closing in or when you shoot still exists. I don't know how to solve it, because I'm no expert on the FreeSpace source, but it comes from the read-pixels and draw-pixels functions and not from the bloom function.
some documentation on the bloom2 function.
void bloom2(unsigned char *pixeldata, unsigned char *blurlayer, int w, int h, int kernelsize, int minbrightness, int multiplation);
*pixeldata is the pointer to the original FreeSpace image, which will be bloomed at the end.
*blurlayer is the pointer to the blurred FreeSpace image.
w - the width of the window you want to bloom
h - the height of the window you want to bloom
kernelsize - the recommended value is 5, I think. The higher the number, the wider, more blurred and more spread out the bloom gets; also, the higher the kernelsize, the slower the bloom is produced.
minbrightness - this is a special feature of the new bloom2 function (well, it was also in bloom v1, but it was bugged). You can specify from what brightness, i.e. "gamma", the bloom will take place. For example, if you want only explosions, jumps and bright stuff to get bloomed, the minbrightness should be 100 or more. If you want everything to get bloomed, then the value is 0.
multiplation - this is the number you multiply the blurred image by; a higher value means greater glow. The recommended value is 2.
640x480 fps is normally 7 on a 2GHz computer (that's the price you have to pay for using software).
Note that the function is optimized. I've tried some tricks and hacks in the function to get a few extra fps (the starting fps was 1-2). That's the best I could do; without my tricks the function would've been 3x shorter.
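The overall shape of what bloom2() does can be sketched like this. This is a simplified single-channel reconstruction inferred from the parameter descriptions above, not the actual code from the patch (the real function works on the framebuffer and a separate blur layer):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Sketch of a software bloom pass: threshold bright pixels, box-blur
// them with a square kernel, scale the result, and add it back onto
// the original image.  "multiplation" keeps the patch's parameter
// name; the rest of the behaviour is a guess from the post.
std::vector<uint8_t> bloom2(const std::vector<uint8_t>& pixels,
                            int w, int h,
                            int kernelsize,     // blur radius; bigger = wider, slower
                            int minbrightness,  // only pixels >= this contribute
                            int multiplation)   // gain applied to the blur layer
{
    std::vector<uint8_t> out(pixels);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            // Average the thresholded neighbourhood (the "blur layer").
            int sum = 0, count = 0;
            for (int dy = -kernelsize; dy <= kernelsize; ++dy) {
                for (int dx = -kernelsize; dx <= kernelsize; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    int v = pixels[ny * w + nx];
                    if (v >= minbrightness) sum += v;
                    ++count;
                }
            }
            int glow = (sum / count) * multiplation;
            out[y * w + x] =
                (uint8_t)std::min(255, pixels[y * w + x] + glow);
        }
    }
    return out;
}
```

Even this naive version makes it clear why the software path is so slow: it is O(w * h * kernelsize^2) per frame, all on the CPU.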
images
=====
640x480 images
http://img92.imageshack.us/img92/3692/1it5.jpg
http://img133.imageshack.us/img133/6659/2du4.jpg
http://img98.imageshack.us/img98/1927/3kj8.jpg
http://img134.imageshack.us/img134/9875/4lz9.jpg
1024x768 images (different pictures)
http://img136.imageshack.us/img136/930/5vf0.jpg
http://img88.imageshack.us/img88/6066/6ox3.jpg
http://img150.imageshack.us/img150/701/7gx3.jpg
http://img150.imageshack.us/img150/9482/8kp2.jpg - note that when there is a tremendous amount of light, the surrounding area gets blurred.
well, that's it. I'm done with software; I'm not going to make another upgraded bloom. I'm moving to hardware.
-
I'm gonna guess...
3.2 GHz P4 w/ HT.
1.75GB RAM.
Nvidia 6600 w/ 256MB VRAM on AGP 8x.
200GB HDD.
Would this work with all the advanced effects at 30+ FPS? I'm still trying to get FS Open with all effects to work. Will it?
-
We've got to wait for all of the data to catch up too. The MediaVPs are still lagging behind at the moment. With the shader support, we'll actually need shader effects, so that will be something for people to come up with. And I've got partially working bumpmap code with these test images I've been sent, so that will be something else that the map makers will have to deal with when that hits CVS. I'll probably add the bumpmap support about the same time as the shader support hits CVS so those things will hit everyone at once. The problem is, of course, that those features are useless without the data to go with them. ::)
I'm no coder (at all, lmao), but I might have some fun tweaking shaders a bit once I see the shader system you guys are doing. I'm a tinkerer...maybe I'll figure some junk out or something. =P
Are there any programs out there that do bloom for any Direct3D application? I have a bunch of (non SCP) programs that are Direct3D only that I want to put bloom on... >_>
-
can't we just use the glowmaps for that? we already have too many maps to work with as it is.
The more maps you've got, the more possibilities are open to you. ;)
The number of used maps is rather high in FS2. Well, at least I can do something about that, for standard FS2 and mods based on the FS2 content.
It will be part of the MediaVP. Actually... I'm almost done with it too. :drevil:
FS2 never made use of mipmaps and instead used downsized versions of the original maps for the lower LODs. So now that we've got mipmapping, almost all of the separate LOD textures aren't needed anymore.
We will need artists to create greyscale textures that will represent the amount of glow. Right now everything is maxed out, and it can look very bad in some cases.
Well, it's no problem for me to convert all glow maps into greyscale maps. It would actually take a few minutes only.
I'm a bit worried about animated textures. How are we going to handle them?
-
Well, it's no problem for me to convert all glow maps into greyscale maps. It would actually take a few minutes only.
Such a conversion would be pointless, as it can be done within the shaders with a negligible performance hit. The point is, however, that the current glow maps may not be appropriate as the bloom glow maps.
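For what it's worth, the in-shader conversion being described is essentially a one-liner in GLSL, something along these lines (a sketch only; the `glowMap` uniform name is illustrative, not from the real effect code):

```glsl
// Sketch only: sample the existing colored glow map and collapse it
// to a single brightness value with the usual Rec. 601 luminance
// weights.
uniform sampler2D glowMap;

void main()
{
    vec4 glow = texture2D(glowMap, gl_TexCoord[0].st);
    float intensity = dot(glow.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(vec3(intensity), glow.a);
}
```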
I'm a bit worried about animated textures. How are we going to handle them?
Are there any bloom related issues with them?
-
Well, how do I have to set up a map for the bloom effect? An example would be great.
And about animated effects:
If there is an animated texture, and let's say it's an EFF using DDS frames, how would I have to set up the bloom file for it?
AnimatedMap_0000.dds, AnimatedMap_0001.dds
Like... errmm:
AnimatedMap-bloom_0000.dds, AnimatedMap-bloom_0001.dds
Is DDS fine for this? It can be greyscale. And would we need mip maps for this? If it works a bit like I think, mip maps might speed it up...
-
I really hope procedural materials will replace animated glowmaps. Considering the way they're being used now, that fade effect could be scripted quite easily.
-
A lot of animations could be replaced, but some are just too complex to be replaced. We still need support for them.
-
Keep support for them, yes, but for efficiency, swap 'em out for procedurals if those look better or provide a significant performance increase, and swap 'em especially if both of those turn out to be true.
-
I don't suppose we could see a, uh... test build... before it gets into CVS, could we? You know, for the hardcore...
-
:3?!
-
I don't suppose we could see a, uh... test build... before it gets into CVS, could we? You know, for the hardcore...
There will be a test build before the code hits CVS. BUT, 3.6.9 comes first, so I'm not putting much work into this until after we finally get 3.6.9 out the door. The test build will be for the shader support and the normal mapping support, so everyone has to wait until both features are pretty much working before a build comes out to test them.
-
I love those big improvements. Getting two great visual improvements (or the possibilities for them) at once will be awesome.
3.6.9 has a higher priority of course, but I fear most people will rather stick to your test build. ;)
Anyway, the final 3.6.9 build will be pretty important for the BtRL demo release. Since it would delay the release date too much, while the possibility of problems is rather high, the demo probably won't feature normal mapping or any pixel shader effects/materials.
I will continue to work on normal maps, unless you tell me that my preview maps aren't usable and have to be created in a different way.
-
3.6.9 has a higher priority of course, but I fear most people will rather stick to your test build. ;)
I think that I've only mentioned this in internal, but all of my upcoming feature work will use 3.6.9 as a base. It's one of those "easier to build a house on a mountain than a sand dune" type things. :)
The test build will basically be 3.6.9, but with the bump mapping and shader support. So in that respect it won't really matter if people use the test build over 3.6.9 or not. If there is a bug somewhere then it will be related to either shaders or normal maps, and both of those things will only be enabled via cmdline options. That will make it a lot easier to track down and/or identify bugs since I won't have to worry about whether that particular bug is something in unstable CVS or my new code. If what ever goes wrong happens in my test build and not 3.6.9 then I'll know for sure that it's something with my code changes.
I will continue to work on normal maps, unless you tell me that my preview maps aren't useable and have to be created in a different way.
Your maps appear to work fine. I was able to get a generally bumpy Perseus that looked halfway decent (it will be a lot better once the lighting code for it catches up). The maps I got from Vasudan Admiral didn't work out so well though; they seem inverted. Whether that's just the maps or partially the code, I don't know.
The code mainly just needs a lot more work to blend and light it all properly. Plus some speed adjustments will be needed, not to mention a change or two to how the extra lighting passes get handled.
Unless you just have a good bit of time to work on the normal maps, you may want to get a decent sample to test with for the test build and then wait on making maps for everything. We'll need maps for some fighters, bombers, cruisers, capital ships, and some mods (SoL and BtRL mainly) in order to get a good test of speed and memory usage. Doing everything can wait, just in case some changes are needed. :)
-
Once the shader support stabilizes (I assume we'll be able to do other things than bloom using this, simply by writing additional vertex programs, right? Is there some kind of loading support to come with this, or will these additional vertex programs need to be put into CVS by a coder?) and we have normal maps for everything... won't that truly be deserving of a version bump to 3.7.0?
-
@taylor
Sounds reasonable. ;)
I'm doing the normal maps in a cheap way, so only the surface will get more detail. I'm just creating a height map and converting it. Normal mapping would be far more effective in connection with a high-poly model, so that a low-poly model looks almost like a high-poly one.
So this feature has even more potential than I can get out of it right now...
I've just read some very good articles about normal mapping. Well, I may start working on normal maps properly soon, but I'm not as good with models as I am with textures... yet. :drevil:
This should get some people motivated to help with the release candidate testing:
if you help to squeeze out the last bugs of the test builds, you'll get to see the new features sooner. ;)
In a few days I'll be able to help with the testing too.
-
Once the shader support stabilizes (I assume we'll be able to do other things than bloom using this, simply by writing additional vertex programs, right? Is there some kind of loading support to come with this, or will these additional vertex programs need to be put into CVS by a coder?) and we have normal maps for everything... won't that truly be deserving of a version bump to 3.7.0?
Shaders will work out of special tbls, or tbms rather, so it will support future upgrades and such. How this will work exactly is still up in the air, and one of the major hold-ups, since it would be good to support full scene shaders without code changes. Doing that isn't a problem, but actually deciding on the best/easiest way to do it is. :)
At this point, there will be no default shaders included. They will all be MediaVP-type data only, so you will require the actual tbms in order to use any shader. I figure this will work out better for everyone, since it will give mods far more control over how things work, rather than just some default effects getting applied that they can't control.
We won't go to 3.7 until the new pilot file code is in. 3.7 is going to be big. Seriously BIG. There are quite a few upgrades to various parts of the code that have been waiting for the new pilot file code, so there is a seriously large number of changes coming, and that's just from me. Then there is the new sound code, the all-new OpenGL init code (finally with real AA support in OGL), the new ANI replacement, etc. I'm still sitting on a few changes that I originally started at the first of last year. Between my 3 different code trees of new features I've got 100,000+ lines of code changes just eating a hole in my hard drive. ;)
That's why I'm focusing on getting 3.6.9 as good as possible, I'm going to stop fixing bugs and start working almost 100% on new features. I've fixed like 600+ bugs over the past 2 ½ years and it's time to just work on my own new features for a change. :)
-
Whoa, okay, I guess we'll be looking at 3.6.10-type chicanery then.
I hope you're putting this work on your resume, Taylor, as it's really a lot of work from you to upgrade a 7-year-old game engine :)
Thanks to you and all the others on the forum who improve it too (you know who you are)
-
Shaders will work out of special tbls, or tbms rather, so it will support future upgrades and such. How this will work exactly is still up in the air, and one of the major hold-ups, since it would be good to support full scene shaders without code changes. Doing that isn't a problem, but actually deciding on the best/easiest way to do it is. :)
At this point, there will be no default shaders included. They will all be MediaVP-type data only, so you will require the actual tbms in order to use any shader. I figure this will work out better for everyone, since it will give mods far more control over how things work, rather than just some default effects getting applied that they can't control.
We won't go to 3.7 until the new pilot file code is in. 3.7 is going to be big. Seriously BIG. There are quite a few upgrades to various parts of the code that have been waiting for the new pilot file code, so there is a seriously large number of changes coming, and that's just from me. Then there is the new sound code, the all-new OpenGL init code (finally with real AA support in OGL), the new ANI replacement, etc. I'm still sitting on a few changes that I originally started at the first of last year. Between my 3 different code trees of new features I've got 100,000+ lines of code changes just eating a hole in my hard drive. ;)
That's why I'm focusing on getting 3.6.9 as good as possible, I'm going to stop fixing bugs and start working almost 100% on new features. I've fixed like 600+ bugs over the past 2 ½ years and it's time to just work on my own new features for a change. :)
hi,
wow, sounds really great and interesting.
Mehrpack
-
What kind of shader support is this exactly? Is it going to be limited to pixel shader v2.0 and above? There are neat things you could do with v1.3 nevertheless (I'm asking this because of my old "v1.3"-card and a nasty lack of money :nervous: ).
-
I think we're not limited to a specific shader version.
Of course we shouldn't blindly use 3.0 shaders, but afair 1.3 is pretty limited, especially in the number of instructions that can be worked on at the same time.
I think I just got a good idea.
I bet it's no problem to find out the supported shader versions of a card. How about having different versions of the shader entries in the tables, like fallback modes for people with older cards? I know this means more work for the modders, but most modders would neither want to use worse graphics than they could actually have, nor make low-end users unable to run their mod.
1.1, 2.0, 3.0?
If there is no way to recreate an effect with a lower version, the modder could still add a dummy shader that does nothing.
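The fallback idea could be sketched roughly like this; every name here is invented for illustration, since nothing like it exists in the code yet:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch: a table entry carries one shader body per
// target version, and the code picks the highest version the card
// reports it can run.
struct ShaderVariant {
    int version;         // e.g. 11, 20, 30 for SM 1.1 / 2.0 / 3.0
    std::string source;  // the shader text for that version
};

// Returns the best variant at or below the hardware limit, or nullptr
// if the entry has nothing the card can run.  The "dummy shader"
// fallback mentioned above would simply be a variant whose body does
// nothing.
const ShaderVariant* pick_best_variant(
    const std::vector<ShaderVariant>& variants, int hw_max_version)
{
    const ShaderVariant* best = nullptr;
    for (const auto& v : variants) {
        if (v.version <= hw_max_version &&
            (!best || v.version > best->version)) {
            best = &v;
        }
    }
    return best;
}
```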
-
You probably want to aim for 2.0 shaders, since those are the most widely supported version that allows for long programs. All cards from the GeForce FX (2003) and Radeon 9500 (2003) up support these, so that's the widest base. Also, the GeForce FX doesn't suffer from performance problems in OpenGL, if the wiki entry is to be believed:
However, the GeForce FX still remained competitive in OpenGL applications, which can be attributed to the fact that most OpenGL applications use manufacturer-specific extensions to support advanced features on various hardware and to obtain the best possible performance, since the manufacturer-specific extension would be perfectly optimized to the target hardware.
However, I'm not a graphics programmer and have very limited knowledge of OpenGL or DirectX. Hopefully Taylor can shed some more light on the specifics.
-
The attempt is going to be to have a tbm that can have multiple versions of the same effect. The code will parse the tbm and get the best version that the hardware appears to accept. It's just going to be a matter of whether or not someone writes a shader in that version for that particular effect. There won't be different tbms for each version of each effect, it will all be in the tbm for that effect. That will hopefully make it easy to not only keep track of but also upgrade and modify in the future. At first it's going to be 2.0+, then I'll add some more code to support other languages/versions where possible.
The basic idea would be to have a bloom.fsh ("FreeSpace SHader"; the extension will probably change though, too close to "fish" :)) which would have a section for each shader language/version that someone makes, and then be split into vertex and fragment sections. The code will initially handle any number of different versions in the tbm (just to maintain backwards compatibility for improved effects), even if the game wouldn't be able to actually use them all.
There will be a tbm for fullscreen effects, which would call the .fsh files, and those would be executed on a fullscreen basis (ie, bloom effects, sharpen effects, etc). Then you will also be able to have ship/weapon tbl entries for shaders for a specific ship model or weapon, these would typically be something that doesn't require the framebuffer to work out (unlike the bloom effect). Then I also want to either create a new table for general effect shaders (explosions, warp, etc.) or try to incorporate that into the existing tbls for those things.
There is time to work all of that out though. The first test will be more limited, mainly the fullscreen effects and some basic ship-specific shader support. If that goes well then the rest of the details will be finalized (obviously with mod support and a fair bit of input from DaBrain, Wanderer, Axem, ..., etc.). Then there will probably be one more test build after that which will have more shader support, then it will hit CVS.
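As a purely illustrative sketch of the layout taylor describes, a bloom.fsh might look something like this. The section markers and their names are invented for the example; the actual syntax was never finalized in this thread:

```
; hypothetical layout only -- the real .fsh format was never finalized
$Effect: bloom

$Version: GLSL 1.10
  $Vertex:
    ... vertex shader source ...
  $Fragment:
    ... fragment shader source ...

$Version: ARB assembly    ; a possible lower-end fallback entry
  $Vertex:
    ...
  $Fragment:
    ...
```

The point of the layout is that all versions of one effect live in one file, so the code can parse it once and keep the best version the hardware accepts.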
-
Hey, this is really awesome. Much nicer than the HDRish or HDRish-lite SmartShaders (HDRish fake-HDRs your HUD). I saw the screenshots of the software-implemented bloom, and while the hardware version will obviously be superior in many ways, the software one made the game look like the cutscenes, only at a much higher resolution.
-
are we ever gonna get that materials system or has that been canned?
-
are we ever gonna get that materials system or has that been canned?
You'll have to ask Bobboau, I've got nothing to do with that.
-
I think we're not limited to a special shader version.
Of course we shouldn't blindly use 3.0 shaders, but AFAIR 1.3 is pretty limited.
Valve didn't think so. I am playing Half-Life 2 again these days and they used some pretty decent v1.3-shaders.
-
The difference between version 1.1 and 1.3 is smaller than the difference between 1.3 and 1.4.
I'm no expert on this, but as far as I know, you can do lots of stuff even with version 1.1; it's just more complicated than with 2.0, and you often need more instructions to achieve an effect similar to a 2.0 shader.
The backwards compatibility needs quite a bit of work.
Valve didn't care about this. HL2 was sure to become a bestseller, so the effort to support older cards was justified by the plus of potential customers with older gfx cards.
Imo it's a logistical issue. Many other games only support 2.0 cards, or have only a DX7 fallback mode for all older cards. Only the really big games support multiple shader versions and/or multiple APIs. (HL1 had D3D, OGL, Glide and Software rendering...)
-
The maps I got from Vasudan Admiral didn't work out so well, though; they seem inverted. Whether that's just the maps or partially the code, I don't know.
Well blargh. :\
I suppose it's quite possible they're inverted - I only had Truespace to test them with, and it's entirely possible it's the wrong way round (wouldn't be the first time either).
However, AFAIK they were Black->White = Recess->Bump?
-
What kind of shader support is this exactly?
GLSL.
Is it going to be limited to pixel shader v2.0 and above?
Yes.
There are neat things you could do with v1.3 nevertheless
That's true, but it's writing in assembly and using hacks.
Also, Geforce FX doesn't suffer from performance problems in OpenGL, if the wiki entry is to be believed
Only if using half precision AFAIK. And it's not currently being done.
-
so basically, you are restricting the people that can use this feature to those with radeon 9500 and up or Geforce 6 series and up, leaving Geforce FX, Radeon <9500, Geforce 4, and Geforce 3 users in the cold.
That's not very nice :(
-
What shader model will people need? Radeon 9500 and up for ATI, and other cards from that generation are generally SM2.0, which is enough to do real HDR in HL2. All of the GeForce 6 cards have SM3.0, and I assume that means the GeForce 5 line had SM2.0. Only the Radeon X1xxx series has SM3.0; everything from the Radeon 9500 up has SM2.0. SM2.0 is all that's needed for HDR... in HL2. DaBrain has shown why many more people can play with HDR in HL2 and why people can't play with HDR in Far Cry (Far Cry only likes SM3.0, though you can do HDR on SM2.0).
My figuring is that for those without a card that can do the new FS2 bloom, it won't matter too much. It's like just enjoying Half-Life 2 or Far Cry without HDR (or at least without the bloom effect). Those who can't do it will still be able to enjoy the game for absolutely everything it has except the bloom. Not to mention that until the bloom effect is performance-optimized, from what I've read it may be a performance issue for many, so a lot of people might not use it anyway because they'd be running too slow with it. Who gives a **** if people get left out in the cold on this one? A lot more people got left in the cold with old graphics cards and Far Cry, a lot fewer with HL2. Who cares? The GeForce 3 by now is quite old (although the original Xbox uses that GPU), and with such an old card you shouldn't expect it to do everything; the same goes for people who own a GeForce 4 (you can't even play FEAR at all with that card).
-
Shader model 2.0 is not usable in full precision mode (aka Radeon-compatible) on the GeForce 5 series because of the crap way nVidia designed the GeForce FX. It basically slows down to a crawl unless it's in partial precision.
-
What shader model will people need? Radeon 9500 and up for ATI, and other cards from that generation are generally SM2.0, which is enough to do real HDR in HL2. All of the GeForce 6 cards have SM3.0, and I assume that means the GeForce 5 line had SM2.0. Only the Radeon X1xxx series has SM3.0; everything from the Radeon 9500 up has SM2.0. SM2.0 is all that's needed for HDR... in HL2. DaBrain has shown why many more people can play with HDR in HL2 and why people can't play with HDR in Far Cry (Far Cry only likes SM3.0, though you can do HDR on SM2.0).
My figuring is that for those without a card that can do the new FS2 bloom, it won't matter too much. It's like just enjoying Half-Life 2 or Far Cry without HDR (or at least without the bloom effect). Those who can't do it will still be able to enjoy the game for absolutely everything it has except the bloom. Not to mention that until the bloom effect is performance-optimized, from what I've read it may be a performance issue for many, so a lot of people might not use it anyway because they'd be running too slow with it. Who gives a **** if people get left out in the cold on this one? A lot more people got left in the cold with old graphics cards and Far Cry, a lot fewer with HL2. Who cares? The GeForce 3 by now is quite old (although the original Xbox uses that GPU), and with such an old card you shouldn't expect it to do everything; the same goes for people who own a GeForce 4 (you can't even play FEAR at all with that card).
There is a big difference between games like FEAR (which runs like crap on almost every card) and HL2/HL2 Ep 1, which runs smoothly and looks just as good, while still having a graceful degradation into shader 1.x mode.
Supporting shaders 2.0 is fine, but at least offer a mode for shader 1.x folks. All the best games do.
Also, there is a big difference between that entire generation of games and FreeSpace Open, which is probably run on a wider range of cards than just the latest and greatest. Maybe we should do a poll to find out. All I'm saying is, doing a crappy bloom effect is perfectly possible in shader 1.x; just look at Half-Life 2: Episode One as a great example. Also, Oldblivion is a loader for Oblivion using shader 1.x, and there are even software bloom shaders for Oblivion that use 1.x code. I've seen it and it's high-level; it looked to me like C. Certainly far from "assembly".
-
Well, good for me that I have a GeForce 7600GT coming in the mail. Yes, the GeForce FX series is ****; all the people I know who own a GeForce 5 card usually got it for free out of an older computer, or for $10 out of a box at a computer shop. Everyone else I know who owns a GeForce 5 was a sucker falling for a $120 deal on the 5700LE (they just don't care what the LE stands for, or about the narrow DDR memory interface either). Anyway, what's so bad about restricting graphics cards? HL2 actually opens up to about as wide a variety of graphics cards as FSO does. At least we don't have people here still trying to play on anything older than a high-end GeForce 2, and the same goes for HL2. But I think people should really be worrying if they're an ATI user with a Radeon 9250 with DX8.1 compliance, like Cobra, who can't even run the HDRish SmartShaders. Besides, hardware blooms and glows are better, and it sounds like most of this stuff is going to depend on cards with DX9 compliance, maybe DX8.1 also. Maybe people should take the route Valve took with HL2 and its HDR; there are a lot of people using HDR on ****ty cards that nobody thought possible. One of them was me, with a Radeon 9600SE running full HDR in Lost Coast, slowly but surely.
-
It WILL support less than SM2.0.
It WILL work with more than GLSL.
The code just won't hit CVS otherwise. Nothing else will be acceptable to me.
Regardless of what it does now (SM2.0+ and GLSL only) I'll add support for the rest. And I'll continue adding support and making optimizations as needed just in case. Does that mean it will all work for everyone? No. Someone still has to actually write the shaders for everything, but the code should be there to use it if possible. Does that mean that every effect will be available for every card? No. It won't support everything for everyone, but there will be GLSL support, Cg support, and basic support for GeForce3+ series cards at least in code if not the initial shader files. Different versions of the shader effects can be added at any point.
GLSL/SM2.0+ will be greatly preferred for every effect however. And although Cg support will be there, its use will likely be frowned upon.
-
Hmm, well it seems like you're saying one thing, and wolf is saying another.
-
Well, good for me that I have a GeForce 7600GT coming in the mail.
:(
-
Hmm, well it seems like you're saying one thing, and wolf is saying another.
And that makes sense, wolf's code is SM2.0+ and GLSL. But I'm adding the rest, plus a few extra things. ;)
I'm still working on the extra stuff, so I haven't copied wolf on it all yet. What we've got is only a start; there needs to be a good bit more work done before it's truly usable.
-
Can you give us a link or two that you may be using for reference, so some of us can get a head start on learning shader languages, and when this hits CVS we can get to work straight away?
-
Hi,
there are only two high-level languages: GLSL for OpenGL and HLSL for Direct3D.
And there are several tools which can compile a high-level language down to several shader versions, like Cg from nVidia, RenderMonkey (or something like that) from ATI, and there's an editor from MS, if my memory is correct.
On shader versions: if a shader is slower than one for an older version, that doesn't necessarily mean it's the shader version itself.
With shader 1.3 you can make more effects, or get more speed, than with 1.1; the same goes for 1.4 over 1.3, 2.0 over 1.x, and so on.
The newer versions are more powerful than the previous ones and allow a programmer to implement more things more easily, with more speed or more effects on screen.
Possible reasons why a shader is slower than one for a lower version:
- the hardware isn't fast enough for the higher version
- the shader doesn't display the same thing, so the lower version doesn't need as many resources
- the shader code is too complex for current hardware and will run better on the next, stronger generation
- the code is **** or isn't optimized correctly and wastes a lot of resources
Mehrpack
-
I'm searching for tutorials. Do you have any idea what the shader file structure will look like?
Is there a standard format for this, so we can just pick stuff up from the net, or will we have our own way of doing this?
I'm well aware of the fact that you might not be able to answer that at this point. ;)
-
I'm well aware of the fact that you might not be able to answer that at this point. ;)
Yeah, it's a little tough to answer right now with any firm details. :)
I want to try to make this as easy as possible, though, so the plan is to just have the tbl there to offer multiple versions of the same effect, but the effects themselves will be in a standard shader language. Just think of it like multi_text in other tbls: the shader script will just get sucked out of a tbl, and that script can be a copy&paste job out of a tutorial or something. There will be some specific layout/order to the tbl format so that the code can pick out just what it's looking for, but beyond that the actual shader should be 100% standard.
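To illustrate the multi_text comparison, a tbl entry might embed a standard shader directly, something like the following. The surrounding table markers are invented for this example; only the shader source between them would be standard GLSL:

```
$Shader: grayscale
$GLSL Fragment:
    uniform sampler2D scene;

    void main()
    {
        // sample the rendered frame and convert to luminance
        vec4 c = texture2D(scene, gl_TexCoord[0].st);
        float g = dot(c.rgb, vec3(0.299, 0.587, 0.114));
        gl_FragColor = vec4(g, g, g, c.a);
    }
$end_shader
```

The shader body here is a perfectly ordinary GLSL 1.10-era fragment shader, which is the point: the tbl parser would only need to cut the script out and hand it to the driver unchanged.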
-
I'm searching for tutorials. Do you have any idea what the shader file structure will look like?
Is there a standard format for this, so we can just pick stuff up from the net, or will we have our own way of doing this?
I'm well aware of the fact that you might not be able to answer that at this point. ;)
Hi,
I've searched a little bit; I hope these links will help:
GLSL – An Introduction:
http://nehe.gamedev.net/data/articles/article.asp?article=21
GLSL Tutorial:
http://www.lighthouse3d.com/opengl/glsl/
Here you can find the Shader Designer, to play with shaders:
http://www.typhoonlabs.com/
GLSL documentation:
http://www.opengl.org/documentation/glsl/
GLIntercept, another tool for shaders:
http://home.swiftdsl.com.au/~radlegend/GLIntercept/index.html
And DaBrain, ask in the programming forum of 3dcenter.de.
I found these links there, and I think they'll help you with material on shading and shader code itself.
Mehrpack
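To go with those links, here is a minimal example of the kind of GLSL fragment shader the tutorials cover: a bright-pass filter, the usual first step of a bloom effect. This is generic GLSL 1.10-era code for illustration, not anything from the planned FSO implementation, and it assumes a vertex shader that writes gl_TexCoord[0]:

```glsl
// Bright-pass: keep only pixels above a brightness threshold.
// A full bloom effect would blur this result and add it back
// onto the original scene.
uniform sampler2D scene;   // the rendered frame
uniform float threshold;   // e.g. 0.8

void main()
{
    vec4 color = texture2D(scene, gl_TexCoord[0].st);
    float luma = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = (luma > threshold) ? color
                                      : vec4(0.0, 0.0, 0.0, 1.0);
}
```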
-
We've got to wait for all of the data to catch up too. The MediaVPs are still lagging behind at the moment. With the shader support, we'll actually need shader effects, so that will be something for people to come up with. And I've got partially working bumpmap code with these test images I've been sent, so that will be something else that the map makers will have to deal with when that hits CVS. I'll probably add the bumpmap support about the same time as the shader support hits CVS so those things will hit everyone at once. The problem is, of course, that those features are useless without the data to go with them. ::)
I did not read the whole thread yet, but I already posted a bump map shader here:
http://www.hard-light.net/forums/index.php/topic,29853.0.html
Maybe you can use it.
-
Oh... Vertex Shader 1.0... ;)
Should work for all cards <Geforce 3.
I don't know if we can use it as it is. But I think the maths might be helpful... at least for me.
taylor is probably already done with this part, since he has tested some normal maps already...
-
Correction : that should read :
Should work for all cards >= Geforce3.
-
I'm really sorry for bumping this, but I was curious if there's a finished version of this around, considering the download site says it isn't finished.
And does any nVidia card naturally support bloom at all?
-
Umm... nope, and nope. Well, nVidia can support HDR (what bloom is), but there's no cheat program like the ATI one here.
-
All the more reason for the SCP lads to get off their fannies and make proper HDR support built-in.
-
Umm... nope, and nope. Well, nVidia can support HDR (what bloom is), but there's no cheat program like the ATI one here.
So you're saying not even a custom-built nVidia card specifically made for something like the Unreal 3 engine could support bloom?
-
Lol, the ATI HDRish SmartShader is not real HDR; in fact it's fake HDR that's static. It's always been said that cards without shader model 3.0 couldn't do HDR, which is not true; it just depends on which shader model you write the HDR for. The real HDR in Far Cry is meant only for SM3.0; the real HDR in HL2 works with SM2.0 and 3.0, and idk which others.
And does any nVidia card naturally support bloom at all?
...
...yes, nVidia and ATI cards all can; it just depends on whether the card has the right shader model, which is what ultimately decides whether it can do bloom. Pretty much any card equipped with SM2.0 can do bloom. Maybe lower shader models can do it too, like 1.1 and 1.0, but I really don't know about that. Then again, any card doing bloom is a lot easier than older cards being able to do HDR. HDR is more than just that famed bloom effect you're looking for; HDR includes bloom, but is mainly overall dynamic brightness and contrast that depends on the game environment you're in. Bloom had been done for years before HDR even came out in games. You can have bloom without real full-featured HDR, and that's what the SCP was after: mainly just the bloom aspect. They didn't want to do the full HDR. Don't mistake bloom for HDR; they really are two different effects.
-
All the more reason for the SCP lads to get off their fannies and make proper HDR support built-in.
They're off their fannies; they're just doing the important stuff that we don't even know we appreciate.
-
Hi,
bloom support: imho that's the effect you see in FS2 when you look at the suns; simple bloom, but I think it goes by the same name.
Not full HDR: supported by all cards with pixel shader 2.0 support (DirectX 9 cards, like the Radeon 9500 or higher).
Still not full HDR, but better than plain 2.0: pixel shader 2.x cards (like the GeForce FX 5800 or higher, but not the X800 series).
And full HDR: supported by all cards with pixel shader 3.0 support (Radeon X1800 or higher and GeForce 6 or higher).
HDRish was a driver program from ATI to use the pixel shaders on their cards for post-processing effects.
Most games only use HDR on cards that support 3.0 (Far Cry with patch 1.3, Oblivion) and bloom for cards that only support 2.x.
Only Half-Life 2 and some demos use HDR with 2.x cards, but that doesn't have all the features of the HDR that's possible on 3.0 cards, and it's more work for the programmer,
because you need 2 or more implementations of HDR and have to do most of the work 2 or more times.
The reason is: the 2.0 implementation is different from the 2.x and 3.0 ones, as far as I know.
The other thing is, I believe you need separate content for each HDR implementation, too.
So it's a lot of work for an HDR that doesn't look as good as with 3.0, and most 2.x cards are really old and can run into performance problems.
So it isn't really enticing for the studios to make a 2.x HDR implementation.
Mehrpack
-
They didn't want to do the full hdr. Don't mistake bloom for hdr, they really are two different affects.
To make things more confusing, there are two distinct effects in games that are both called bloom, and they're both different from HDR. One is the type that applies a static glow to all light sources in view. This has been around for quite a while now (I think I first saw it in the original Splinter Cell in 2002) and the HDRish thing is basically this effect, although somewhat more primitive. Another effect that is also called bloom is the one that dynamically changes the brightness level depending on the scene you're looking at. This is the kind of thing you see in Far Cry, for example, and it's easy to do with the FP16 EXR HDR, although it's not actually a part of HDR.
The other thing is, I believe you need separate content for each HDR implementation, too.
The new lighting information is apparently only needed for the SM2.0 version. Far Cry and the two newer Splinter Cell games don't have any additional content to make it work. I think it's needed for the second kind of bloom I referred to earlier. The radiance values need to be manually specified for all the individual walls/rooms/etc., whereas with the EXR HDR, they're automatically computed and the range of the tone mapping is adjusted to change the brightness level.
HDR in FS2 would be nice but I don't think it would be that noticeable in this type of game. The main improvement would come from the first kind of bloom.
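The distinction between the two effects described above can be sketched in a few lines of illustrative Python, operating on a toy 1-D "image" of brightness values rather than real rendering code (the function names are invented for the example):

```python
# Toy sketch of the two effects that both get called "bloom":
# a static glow around bright pixels vs. dynamic scene-wide exposure.

def static_bloom(pixels, threshold=0.8, spill=0.1):
    """First kind: add a fixed glow around anything brighter than a
    threshold, regardless of overall scene brightness."""
    out = list(pixels)
    for i, p in enumerate(pixels):
        if p > threshold:
            for j in (i - 1, i + 1):       # spill light onto neighbours
                if 0 <= j < len(out):
                    out[j] = min(1.0, out[j] + spill)
    return out

def dynamic_exposure(pixels):
    """Second kind: rescale the whole scene by its average brightness,
    like the eye-adaptation effect seen in Far Cry."""
    avg = sum(pixels) / len(pixels)
    key = 0.5                              # target mid-grey level
    return [min(1.0, p * key / avg) for p in pixels]

scene = [0.2, 0.9, 0.3, 0.1]
print(static_bloom(scene))      # glow spills around the 0.9 pixel
print(dynamic_exposure(scene))  # whole scene rescaled toward mid-grey
```

Real implementations do the bright-pass and blur on the GPU, but the shape of the two effects is the same: one is local and static, the other global and scene-dependent.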
-
All the more reason for the SCP lads to get off their fannies and make proper HDR support built-in.
They're off their fannies; they're just doing the important stuff that we don't even know we appreciate.
QFT. Especially Taylor.