Hard Light Productions Forums
Modding, Mission Design, and Coding => FS2 Open Coding - The Source Code Project (SCP) => Topic started by: The E on October 29, 2010, 08:56:19 pm
-
Okay. As we all know, materials are something that we kinda want in FSO.
So I've been thinking about this a bit, and here's what I came up with.
I think that any system like this should plug into the existing texturing pipeline as easily as possible. Based on this, what I propose would be to create a new quasi-tbl that can be plugged in as a supported texture type into the texture loading mechanism, similar to the way effs are integrated.
This text file would look like this:
$Diffuse Map: <filename>
$Glow Map: <filename>
$Spec Map: <filename>
$Normal Map: <filename>
$Vertex Shader: <filename>
$Fragment Shader: <filename>
$Uniform: <String> ;; The name of the uniform variable that is passed to the shaders
+Type: <One of: int, float, texture>
$Value: <As appropriate>
$Preprocessor define: <String> ;; Name of a preprocessor define
$Flags: (
"Lighting" ;; Shader handles lighting. Number of lights will be passed to the shader as a uniform.
"Environment" ;; Shader handles env mapping
"Fog" ;; Shader handles fog calculations.
"Timer" ;; Shader gets current missiontime and frametime as floats.
)
Now, in terms of shader compilation etc., this should be handled during mission load, the first time such a material is encountered. If the shaders fail to compile, the engine would fall back to the "standard" texture loading process, trying the various supported formats until it finds something it can load.
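To illustrate, a complete file under this syntax might look like the following (all filenames and the uniform name are hypothetical, just to show the shape of the thing):

```
$Diffuse Map: awesome
$Glow Map: awesome-glow
$Normal Map: awesome-normal
$Vertex Shader: pulse-v.sdr
$Fragment Shader: pulse-f.sdr
$Uniform: pulseSpeed ;; made available to both shaders
+Type: float
$Value: 2.0
$Flags: (
"Lighting"
"Timer"
)
```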
Comments?
-
I did a quick google search, but I'm still not too clear on what exactly a material is. Is it some sort of procedurally generated texture?
Don't flame me
-
I would strongly discourage the use of parameters like 'Diffuse Map' or 'Glow Map' as fields. I propose something like this instead:
$Map: filename ; Without file extension
+Diffuse: 255 ; 0 to 255, 0 is transparent and 255 is opaque
+Blending: 0 ; Type 0 is no alpha, 1 uses 1-bit green alpha, 2 uses 32 bit alpha channel, 3 uses additive blending
+Lighting: 0 ; 0 to 255, affects self-illumination value
+Specular: 0 ; 0 to 255, affects specularity
+Env: 0 ; 0 to 255, affects environment mapping
$Normal: NO ; Whether map is a normal map
$Translation: NO ; Whether this texture moves at all
Basically, something close to the materials systems of rendering engines, with plenty of room for expansion. This way, groovy effects like animated texture translation or pulsing glowmaps could be generated with a single texture instead of with inefficient and inflexible EFFs, and maps could be layered for additional effects, such as a huge ambient occlusion texture and a tiled hull plating texture on the same material.
-
Well, the way it is designed right now would allow just that. The "$Diffuse Map" option, for example, is just a shortcut for the following combination of information that is passed to the shader:
$Uniform: sBasemap
$Type: Texture
$Value: <filename>
$Preprocessor define: FLAG_DIFFUSE_MAP
This system is designed for ease-of-use, and to make it easier to fit existing assets into the material pipeline.
Make no mistake, the heavy lifting has to be done in the shader here; and with the various options for data to pass into the shader, we can do anything we want.
-
I did a quick google search, but I'm still not too clear on what exactly a material is. Is it some sort of procedurally generated texture?
No, not really. A material, as I understand it, is a file that specifies all of the properties (textures, shaders) of a surface or object. The name itself comes from Valve and their Half-Life engine, where a material also specifies the sound of something hitting the surface and, IIRC, the friction of the surface.
Though, I suppose a materials system could be used to specify procedurally generated textures as well if you really wanted it to.
-
i kinda like it. would it be possible to add additional maps which may be needed by a particular shader?
-
i kinda like it. would it be possible to add additional maps which may be needed by a particular shader?
Yes, via the "$Uniform:" option.
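For example, a shader that needs an extra height map could receive it like this (the uniform name, filename, and define are made up for illustration):

```
$Uniform: sHeightmap
+Type: texture
$Value: awesome-height
$Preprocessor define: FLAG_HEIGHT_MAP
```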
-
Could you explain how would your average modder use this proposed material system?
If I have understood right, this material system would make separate -shine, -normal and -glow maps obsolete? So how exactly do you do those maps with only shaders? I'm guessing shaders would be capable of doing the same work as image editing software would? Only in real-time. But since you don't usually get the wanted results with one method when doing normal maps in particular, wouldn't you need a lot of different shaders to get the results you want?
And since the material system is supposed to do away with inefficient use of animated textures and effects, how exactly do you do animated effects and maps with the material system? How would you do, say, the current mediavps exp06 in the material system?
-
If I have understood right, this material system would make separate -shine, -normal and -glow maps obsolete? So how exactly do you do those maps with only shaders? I'm guessing shaders would be capable of doing the same work as image editing software would? Only in real-time. But since you don't usually get the wanted results with one method when doing normal maps in particular, wouldn't you need a lot of different shaders to get the results you want?
Nope, that's not how it works. You will still need separate diffuse/glow/shine/normal/whatever maps. And recreating something like a full explosion effect using shaders alone will not look as good as an EFF could (explosions are computationally very intensive; generating them on the fly will never look quite right). Now, the main purpose of this is to allow a modder to specify exactly which shader is used on a texture, as opposed to the current implementation, where we have one shader set that is used for everything.
And since the material system is supposed to do away with inefficient use of animated textures and effects, how exactly do you do animated effects and maps with the material system? How would you do, say, the current mediavps exp06 in the material system?
As said above, you can't recreate an explosion effect like that. However, what you CAN do, for example, is something like the pulsating effect seen on Shivan glowmaps, using only one or two textures that are blended in the shader.
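As a rough sketch of what such a fragment shader could look like (the sampler and uniform names here are invented for illustration, not engine-defined; "missionTime" is assumed to arrive via the "Timer" flag):

```glsl
// GLSL 1.20-era sketch: pulse between two glow maps over time.
uniform sampler2D sGlowmapA;
uniform sampler2D sGlowmapB;
uniform float missionTime;

varying vec2 texCoord;

void main()
{
    vec3 glowA = texture2D(sGlowmapA, texCoord).rgb;
    vec3 glowB = texture2D(sGlowmapB, texCoord).rgb;
    // 0..1 blend factor with a roughly two-second period
    float pulse = 0.5 + 0.5 * sin(missionTime * 3.14159);
    gl_FragColor = vec4(mix(glowA, glowB, pulse), 1.0);
}
```

Two static textures plus one time uniform replace an entire stack of animation frames.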
What this system will also allow (eventually, anyway) is for the sexp and scripting systems to pass information to the shaders. This can be used (for example) to implement optical stealthing.
-
So eh, then what is it that is supposed to do away with inefficient use of animation frames? How do other games do such effects?
-
Sounds like a good plan. I'm wondering about a few things though.
How will this work with meshes that use multiple textures/multi-materials? Wouldn't we need some kind of "material" we can save in our models?
If it depends completely on the texture name, we lose some room for content optimizations.
I.e. the subspace jump in/out effects could use the same frames (maybe reverse), but different materials to recolor them.
And about the $Uniform, I was wondering what exactly will be allowed? Will we be able to have a dynamic value that we can animate somehow?
Later I'd like to be able to assign materials to my meshes directly in my 3d application, by adding the name to the custom mesh properties. In most 3d apps this would make editing multiple meshes at once a very simple task.
The materials should have names and be stored in a table, and could be used as material "templates".
I.e. there would be a 'Vasudan_Fighter' material which you'd add to the mesh properties in Blender, Max, Maya, etc., or directly in PCS2.
If you just add the material name, the default values of the material will be loaded from the table (ps, vs, textures, uniforms), but you can also add values in the mesh properties to override the default values.
For the fighter mesh:
MatId = 1
UseMat = 'Vasudan_Fighter'
And if you want to change something:
MatId = 1
UseMat = 'Vasudan_Fighter'
DiffText = 'vas_fighter_02_diffuse'
SpecText = 'vas_fighter_02_spec'
DetailNormText = 'gen_vas_detail_normal'
SpecGloss = 35
@Galemp
Looks fine at first, but that wouldn't be flexible enough. Not all shaders will need a diffuse texture. If we ever want to have (heat) distortion effects, the objects with this shader will not even be visible. They'll just provide information for the post-processing.
-
so i take it shaders themselves would be kept in another file, and you would select the one most appropriate for the material. i also take it there would be a default material used for backwards compatibility, which would be about equivalent to the kind of rendering we have now.
i would also want some kind of script output to the variables in the material system. like make the shivan ships glow brighter when they have full shields. stuff like that.
-
So eh, then what is it that is supposed to do away with inefficient use of animation frames? How do other games do such effects?
First, you need to define what "inefficient" is in this context.
For an explosion effect, 100 frames aren't that inefficient; and unless you have a super-efficient way to generate explosions procedurally, those things are going to stay.
Now, something like an animated glowmap on a ship, that's inefficient.
Imagine you want to pull off an effect like the one seen on TBP's Shadow ships. This means an animated diffuse map and, if you're interested in real awesomeness, an animated normal map on top. Depending on map size and effect quality, this can get pretty ugly pretty fast.
However, using a custom shader, with the current mission time and frame time passed as arguments, the same effect can be pulled off using only two maps.
A material system is not a magic bullet to kill effs, nor is it intended to be. There are scenarios where an eff is the best choice. There are others in which a well-designed material is the best one.
Sounds like a good plan. I'm wondering about a few things though.
How will this work with meshes that use multiple textures/multi-materials? Wouldn't we need some kind of "material" we can save in our models?
This is designed to plug into the existing texturing system. There is nothing stopping you from using multiple materials on a model, just as there is nothing stopping you from using multiple textures right now.
If it depends completely on the texture name, we lose some room for content optimizations.
I.e. the subspace jump in/out effects could use the same frames (maybe reverse), but different materials to recolor them.
Why? You can just pass different arguments to the shader via uniforms.
And about the $Uniform, I was wondering what exactly will be allowed? Will we be able to have a dynamic value that we can animate somehow?
Stated in the first post. In terms of uniforms, we are limited by what GLSL data types are available.
Later I'd like to be able to assign materials to my meshes directly in my 3d application, by adding the name to the custom mesh properties. In most 3d apps this would make editing multiple meshes at once a very simple task.
As I am not a modeller, I cannot help you there. As I said, this system is designed to plug into the existing pipeline (Especially PCS2) as seamlessly as possible.
The materials should have names and be stored in a table, and could be used as material "templates".
I.e. there would be a 'Vasudan_Fighter' material which you'd add to the mesh properties in Blender, Max, Maya, etc., or directly in PCS2.
If you just add the material name, the default values of the material will be loaded from the table (ps, vs, textures, uniforms), but you can also add values in the mesh properties to override the default values.
A tbl might be a good idea. Will have to think about this some more.
so i take it shaders themselves would be kept in another file, and you would select the one most appropriate for the material. i also take it there would be a default material used for backwards compatibility, which would be about equivalent to the kind of rendering we have now.
i would also want some kind of script output to the variables in the material system. like make the shivan ships glow brighter when they have full shields. stuff like that.
Doing something like this, and leaving out scripting/sexp integration, would be stupid.
-
Ok, I think I didn't get something right then.
I have a texture/EFF called "warp_animation"; now I want to recolor it by using two materials with two different uniforms for the color.
How do I do that?
As I am not a modeller, I cannot help you there. As I said, this system is designed to plug into the existing pipeline (Especially PCS2) as seamlessly as possible.
Well, you wouldn't have to care about that part. The values will be stored in the POFs like i.e. the detail box values. How they get there is entirely a problem of the artist and doesn't require any modification in the POF format.
And on a sidenote about the explosions: many modern games, if not most of them, use particles for their explosions now. Some in combination with short frame animations, others with just one texture per particle system.
I'm not saying that's possible in FSO though. I'd even say that with the current options for particles, it's probably impossible to create a decent effect with just particles (as in, more than one particle with a frame animation).
Edit: Will we be able to use materials and shaders for effects and particles? That would be incredibly awesome! :)
-
I have a texture/EFF called "warp_animation"; now I want to recolor it by using two materials with two different uniforms for the color.
How do I do that?
By passing a colour value as a uniform parameter? I should have said this before: the system would allow an arbitrary number of uniform variables to be created and passed as arguments to the shaders.
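For example, two .mat files could reference the same animation frames but pass different tint values, which the fragment shader then applies. (This sketch assumes a vector/colour uniform type, or three floats, alongside the int/float/texture types listed in the draft; the names are illustrative only.)

```glsl
// Sketch: tint a shared animation with a per-material colour.
uniform sampler2D sBasemap;
uniform vec3 tintColor; // e.g. (0.2, 0.4, 1.0) for a blue warp

varying vec2 texCoord;

void main()
{
    vec4 base = texture2D(sBasemap, texCoord);
    gl_FragColor = vec4(base.rgb * tintColor, base.a);
}
```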
Well, you wouldn't have to care about that part. The values will be stored in the POFs like i.e. the detail box values. How they get there is entirely a problem of the artist and doesn't require any modification in the POF format.
Ah, I see. Hadn't thought of that. It's an interesting idea, but not one I really like, since I want to keep data in as few places as possible. What I would want to do instead is to have new fields in ships.tbl and weapons.tbl to pass material information.
-
By passing a colour value as a uniform parameter? I should have said this before: the system would allow an arbitrary number of uniform variables to be created and passed as arguments to the shaders.
Well... the two effects are supposed to use the same textures, and I thought the system would tap in here and pick the material depending on the texture name. I think that's the part I got wrong. I'm just not sure how it's supposed to work then.
Ah, I see. Hadn't thought of that. It's an interesting idea, but not one I really like, since I want to keep data in as few places as possible. What I would want to do instead is to have new fields in ships.tbl and weapons.tbl to pass material information.
I'm not a fan of redundant data myself, but I'd still suggest it for usability reasons.
There are tons of options and tools to rename/alter multiple meshes at once in modern 3d software packages.
On top of that you can use Python (Blender), MAXScript (3DS), MEL (Maya) and other scripting languages to create your own little tools for the material/mesh setup.
Btw this is how it works in the UDK:
You set up a "shader" (not quite correct, but let's say it's a shader) in the editor and decide which values will later be exposed to the material. Then you create a material which uses this shader. Now you can assign the material to meshes, or better, to the material IDs of a mesh. This will create a copy of your material instance. You can go with the default values you set, or change them.
-
Okay, here's what I had in mind. Hopefully it'll clear something up.
We start off with a model. Said model refers to a texture; let's call it "awesome".
In our maps folder, we have a file called awesome.mat, which is a text file as outlined in the first post. This defines our base material for that texture.
In ships.tbl, we add a "$Material:" section, where we can redefine the used material for that ship class only, using the same syntax as found in the .mat file. Does that make sense?
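In other words, a ship entry could carry something like the following (the exact entry names are illustrative only, reusing the draft syntax from the first post):

```
$Material:
$Diffuse Map: awesome-red
$Uniform: pulseSpeed
+Type: float
$Value: 4.0
```

The base awesome.mat supplies everything not overridden here, so two ship classes sharing a model could differ in nothing but these few lines.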
-
Ah alright, I get it now.
So you refer to a texture on a specific model. That will indeed cause way less problems.
The only limitation I see is that if I want to use the same texture on the model twice or more often, I can't set different shader parameters...
Hmm, but wait. I can assign a texture called "1" and a texture called "2" and simply use the textures like material ids and overwrite the diffuse, spec, whatever textures via the material.
That would work perfectly. :)
Edit: I could even use the actual textures in Max and write a script that will replace the texture names by the actual material IDs and use it before I export a DAE.
On the other hand, the texture fallback wouldn't work this way, but I guess an option to set a fallback texture in the material would fix that.
Edit2: How would we handle things that don't have a mesh, like beam effects or the lens flares, and of course all particles?
-
Well, weapon effects are rather easy. After all, they do have a mesh, even if it is autogenerated.
However, that will take some more time, as I have to figure out how the batch renderer (which is responsible for these things) interacts with everything. Let's try to get this running on normal models first, then move on to that.
-
Sure, I didn't want to make it even more complicated.
I just see a lot of room for effect optimizations with materials/special shaders. I hope to get rid of many frame animations. The memory can be used better elsewhere. :)
There is more than enough content to optimize without the POF-less meshes though. Animated glow maps and the shockwave effects are my first target.
One last question. Can we add position/scale animations via the vertex shaders as well?
-
What would those be?
-
Just moving meshes around or scaling them up. Just for effect.
The 3d shockwave mesh is being scaled up by the engine... somehow. I'd just like to be able to do the same.
Since I don't see PCS2 importing mesh animations any time soon (or FS2 supporting them), this seems to be the way to go.
-
Nope, that's a geometry shader effect. Unless someone donates me a new PC, I'm not going to work on that.
-
There is one last thing I'd like to suggest though.
It's not important right now, but will come in handy later.
I'd like to have a "shader" tab in the lab where you can edit shader values and (if possible) re-load textures. That way you could work on the textures and shaders and always get an instant preview of how the effect will look in the game.
-
ok, first off, as others have said, taking our existing texture loading systems and predefined effects is a pointless endeavor; to that extent, having things like $Glow_Map: is pointless. the whole idea of the materials system is to get away from predefined effects. what if the modeler wants something that doesn't have a diffuse map, or wants the diffuse map and the shine map to be the same thing, or a million other ideas that would be awkward to handle with named textures? materials should be purpose agnostic; the artist should not be constrained by the preconceived notions of whoever designed the system initially.
second, putting this in the ships table is a bad idea: multiple ships are going to want to use the same effect, and a single ship will want to use more than one material. materials should be defined in their own table and referenced in the POF at the polygon level. it may sound like this would be a lot of work, but polygons already have an int for defining how they are drawn, and this int would not be needed anymore after the materials system; I am speaking, of course, of the texture id. it will require the addition of a material chunk to the POF to define what textures are given to the material. in order to know how to draw a poly, the game needs to know what textures to use, and this is where that choice would be made: the material [MTRL] chunk would be an array that has, for each element, the name of the material (defined in the material table) and an array of ints that would be the textures defined in the texture [TXTR] chunk. additionally, there might be a list of floats or ints to be used as constants in the material, but that's an advanced feature that can be added on later.
now the real meat of the material system would be the material definition as defined in the material table. a material would be a way of using zero to many textures in one to many passes (each consisting of one to eight stages). to further complicate the matter, graphics hardware can only handle so many lights at a time (8, last I checked) and so many textures at a time (which is why there need to be multiple passes). in each pass you would set poly-wide values like alpha blending, z-buffering, etc., and in each stage you would set stage settings, like using the first and second textures passed to the material as the input of one stage, or the result of the last stage and a third texture for the next, blending options, etc. you would also need a way of defining what happens on the first lighting pass vs subsequent ones.
you should look over this thread (http://www.hard-light.net/forums/index.php?topic=32067.0) (now I realize how far I've fallen :( ) as it starts out on how I was going to implement my system. keep in mind one of the core concepts with the way I was going to do it involved having multiple variable definitions, so the POF would have multiple vertex and index buffers defined that used only what the material needed and this would have allowed for a great deal of flexibility in the future so you could have multiple lighting normals or maybe a normal that was to be used to define geometry shader growth or something no one has thought about yet.
-
ok, first off, as others have said, taking our existing texture loading systems and predefined effects is a pointless endeavor; to that extent, having things like $Glow_Map: is pointless. the whole idea of the materials system is to get away from predefined effects. what if the modeler wants something that doesn't have a diffuse map, or wants the diffuse map and the shine map to be the same thing, or a million other ideas that would be awkward to handle with named textures? materials should be purpose agnostic; the artist should not be constrained by the preconceived notions of whoever designed the system initially.
Thank you for your input. Can you try it again, but this time, read what I said here (http://www.hard-light.net/forums/index.php?topic=72337.msg1429036#msg1429036)? Those options are meant to be shortcuts for existing functionality. All in the name of making this stuff easier to use.
second, putting this in the ships table is a bad idea: multiple ships are going to want to use the same effect, and a single ship will want to use more than one material.
This should work in the same way as tbl-level texture replacement. And since I'm hooking into the texture loading mechanisms, the ability to use more than one material is on the same level as using multiple textures is now.
Of course, this is where having multiple UV channels will come in.
materials should be defined in their own table and referenced in the POF at the polygon level. it may sound like this would be a lot of work, but polygons already have an int for defining how they are drawn, and this int would not be needed anymore after the materials system; I am speaking, of course, of the texture id.
So instead of making this system so that it can be used without having to alter the pof file format, or without altering pof data at all, you'd want to change pofs as well? I realize that there are advantages to doing it this way, but seriously? I find that trying to change too much stuff at once will only result in failure and misery.
but it will require the addition of a material chunk to the POF to define what textures are given to the material. in order to know how to draw a poly, the game needs to know what textures to use, and this is where that choice would be made: the material [MTRL] chunk would be an array that has, for each element, the name of the material (defined in the material table) and an array of ints that would be the textures defined in the texture [TXTR] chunk. additionally, there might be a list of floats or ints to be used as constants in the material, but that's an advanced feature that can be added on later.
See above. I have an intense dislike for squirreling stuff like this away in the pof, when I can just as easily define it in a text file somewhere.
now the real meat of the material system would be the material definition as defined in the material table. a material would be a way of using zero to many textures in one to many passes (each consisting of one to eight stages). to further complicate the matter graphics hardware can only handle so many lights at a time (8 last I checked) and so many textures at a time (which is why there needs to be multiple passes). each pass you would set poly wide values like alpha blending zbuffering ect and in each stage you would set stage settings like setting the first and second texture passed to the material as the input or the result of the last stage and a third texture for the second, blending options, ect. you would need a way of defining what happens on the first lighting pass vs subsequent ones
Again, feel free to break the engine in any way you please. Me, I'm going to try and make this work within the existing framework first, because I know what I am capable of (and, more importantly, what I am capable of testing).
-
So instead of making this system so that it can be used without having to alter the pof file format, or without altering pof data at all, you'd want to change pofs as well? I realize that there are advantages to doing it this way, but seriously? I find that trying to change too much stuff at once will only result in failure and misery.
fallback behaviour is easily implemented: if a POF does not have a material chunk, you make one that uses a default material for every texture in the POF
Again, feel free to break the engine in any way you please. Me, I'm going to try and make this work within the existing framework first, because I know what I am capable of (and, more importantly, what I am capable of testing).
ok, so then you are not going to have any new functionality; you are just going to add an extra way of doing the same thing we are doing now, without the ability to do anything that can't be done now? well, I guess you can assign a different shader, that is SOMETHING, but I don't think this really qualifies as a system so much as simply better shader file handling.
-
Way I see it, a system like this needs to be able to do just one thing: Pass arbitrary data to shaders. Which this system is set up to do.
Besides, the really interesting things can be done using some render-to-texture magic. Which I plan to do here as well.
Again, my goal here was to design a system that I can finish, maintain and extend later. Not something that does everything anyone would want to do from the get-go.
-
but if you make bad design choices now for the sake of simplicity you will be stuck with them for the rest of time.
a simple fact is that materials need to deal with multiple passes, lighting constraints, fallbacks in case the effect the artist wants won't work, etc. at the very least you need to have a pass-based declaration that is purpose agnostic.
as far as passing data to shaders goes, you would probably be better off using scripting for that; people are not going to be satisfied with constants, they are going to want engine output, weapon energy, shield strength, etc.
-
but if you make bad design choices now for the sake of simplicity you will be stuck with them for the rest of time.
I am, at this time, unconvinced that they are bad design choices. You had different plans, yes. Not sure if they're better plans. And again, let me stress this once more, I am aiming for stuff where I am pretty certain that I can pull it off in a reasonable amount of time.
a simple fact is that materials need to deal with multiple passes, lighting constraints, fallbacks in case the effect the artist wants won't work, etc. at the very least you need to have a pass-based declaration that is purpose agnostic.
Not necessarily wrong, but not necessarily right either. What you see as a necessity, I see as an optional nice-to-have-eventually extra. Also remember that there aren't that many artists around in this community who can deal with this system as it is (unless people have been hiding their shader-writing skills under a rock).
as far as passing data to shaders goes, you would probably be better off using scripting for that; people are not going to be satisfied with constants, they are going to want engine output, weapon energy, shield strength, etc.
...
And what, exactly, made you think I was unaware of this? What exactly made you think that I did not take this into consideration? Or that I rejected it?
Remember, this is just a draft. In terms of implementation, I am still at the very beginning, and it'll be some time before the first test builds.
-
ok, so, aside from giving some ability to assign a different shader to a ship, how will this be an improvement on what is possible? what would happen if someone wanted to make an effect that used 6 textures? this draft you have posted seems like little more than a more complicated interface to the current fixed materials; you can't do anything new, only tweak what exists.
-
That's what I thought at first as well, but I think you can pass as many textures into a material as you like, just like constants.
The original texture on the model doesn't even matter. That's just the name of the material.
-
The draft I posted allows you to define an arbitrary* number of arguments to be passed to a shader. What the shader(s) then do with that data is none of my concern.
Now, for the moment, it's limited to a single-pass rendering, but there's a lot you can do in a single pass.
*For values of arbitrary lower than 16
-
so a texture can be passed as an argument?
how do you handle an object being lit by 37 lights?
how do you handle alpha blending?
-
so a texture can be passed as an argument?
Yes.
how do you handle an object being lit by 37 lights?
I don't. Not until there's multi-pass rendering. And as I said, that's something to try later.
how do you handle alpha blending?
Cross that bridge when I come to it.
-
as a side question, what are the concrete shader system limits? (complexity, shader size, shader number, etc.)
-
Way I see it, a system like this needs to be able to do just one thing: Pass arbitrary data to shaders. Which this system is set up to do.
Besides, the really interesting things can be done using some render-to-texture magic. Which I plan to do here as well.
Again, my goal here was to design a system that I can finish, maintain and extend later. Not something that does everything anyone would want to do from the get-go.
i like the sound of "render-to-texture magic". i kinda hope this makes things much, much faster as far as rtt goes. right now it's somewhat crippling when you use it through scripting; a couple of low-res sensor displays can really set you back several fps.
-
as a side question, what are the concrete shader system limits? (complexity, shader size, shader number, etc,etc)
A lot of those depend on the GPU, or rather on the supported Shader Model.