So eh, then what is it that is supposed to do away with inefficient use of animation frames? How do other games do such effects?
First, you need to define what "inefficient" is in this context.
For an explosion effect, 100 frames aren't that inefficient, and unless you have a super-efficient way to generate explosions procedurally, those frames are going to stay.
Now, something like an animated glowmap on a ship, that's inefficient.
Imagine you want to pull off an effect like the one seen on TBP's Shadow ships. This means an animated diffuse map, and if you're interested in real awesomeness, an animated normal map on top. Depending on map size and effect quality, this can get pretty ugly pretty fast.
However, using a custom shader, with the current mission time and frame time passed in as arguments, the same effect can be pulled off using only 2 maps.
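To illustrate the idea, here's a minimal fragment shader sketch. All names (uniforms, samplers, the scrolling/pulsing math) are assumptions for illustration, not the engine's actual interface; the point is just that one static diffuse map and one static glow mask, plus a time uniform, replace dozens of pre-rendered frames:

```glsl
// Hypothetical sketch (legacy GLSL 1.20 style) -- names are invented.
uniform sampler2D baseMap;     // static diffuse texture
uniform sampler2D glowMap;     // static glow mask
uniform float missionTime;     // seconds since mission start, passed by the engine

varying vec2 texCoord;

void main()
{
    vec4 base = texture2D(baseMap, texCoord);

    // Animate by scrolling the glow mask and pulsing its intensity over time,
    // instead of storing a separate bitmap for every animation frame.
    vec2 scrolled = texCoord + vec2(missionTime * 0.1, 0.0);
    float pulse   = 0.5 + 0.5 * sin(missionTime * 2.0);

    gl_FragColor = base + texture2D(glowMap, scrolled) * pulse;
}
```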
A material system is not a magic bullet to kill .eff animations, nor is it intended to be. There are scenarios where an .eff is the best choice, and others in which a well-designed material is the best one.
Sounds like a good plan. I'm wondering about a few things though.
How will this work with meshes that use multiple textures/multi-materials? Wouldn't we need some kind of "material" we can save in our models?
This is designed to plug into the existing texturing system. There is nothing stopping you from using multiple materials on a model, just as there is nothing stopping you from using multiple textures right now.
If it depends completely on the texture name, we lose some room for content optimizations.
E.g. the subspace jump-in/out effects could use the same frames (maybe reversed), but different materials to recolor them.
Why? You can just pass different arguments to the shader via uniforms.
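Something like this hypothetical sketch shows what "different arguments via uniforms" could mean in practice; the names are invented, but the shared frame set is sampled identically and only the per-material tint differs:

```glsl
// Hypothetical sketch -- uniform and sampler names are assumptions.
uniform sampler2D frameMap; // the shared animation frames
uniform vec4 tintColor;     // set per material: e.g. one color for jump-in,
                            // another for jump-out, same frames in both cases

varying vec2 texCoord;

void main()
{
    // One set of frames, recolored by a per-material uniform.
    gl_FragColor = texture2D(frameMap, texCoord) * tintColor;
}
```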
And about the $Uniform, I was wondering what exactly will be allowed? Will we be able to have a dynamic value that we can animate somehow?
As stated in the first post: in terms of uniforms, we are limited by what GLSL data types are available.
Later I'd like to be able to assign materials to my meshes directly in my 3d application, by adding the name to the custom mesh properties. In most 3d apps this would make editing multiple meshes at once a very simple task.
As I am not a modeller, I cannot help you there. As I said, this system is designed to plug into the existing pipeline (especially PCS2) as seamlessly as possible.
The materials should have names and be stored in a table and could be used as material "templates".
E.g. there would be a 'Vasudan_Fighter' material which you'd add to the mesh properties in Blender, Max, Maya, etc., or directly in PCS2.
If you just add the material name, the default values of the material will be loaded from the table (pixel shader, vertex shader, textures, uniforms), but you could also add values in the mesh properties to override the defaults.
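A purely hypothetical sketch of what such a table entry might look like, using the usual FSO .tbl conventions; the section name, field names, and values are all invented for illustration, since no syntax has been specified:

```
#Materials

$Name: Vasudan_Fighter
$Vertex Shader: vasudan.vert
$Pixel Shader: vasudan.frag
$Texture: vasudan_diffuse
$Uniform: glow_strength
    +Type: float
    +Value: 1.0

#End
```

A mesh would then only need `material=Vasudan_Fighter` in its properties, with any per-mesh overrides (say, a different `glow_strength`) listed alongside it.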
A tbl might be a good idea. Will have to think about this some more.
So I take it the shaders themselves would be kept in another file, and you would select the one most appropriate for the material. I also take it there would be a default material, used for backwards compatibility, that would be roughly equivalent to the kind of rendering we have now.
I would also want some way for scripts to write to the variables in the material system, like making Shivan ships glow brighter when they have full shields. Stuff like that.
Doing something like this, and leaving out scripting/sexp integration, would be stupid.