Those limits seem much more reasonable across the board.
How do you figure? I can understand the ships and weapons (though they really just need to be dynamic and not have any limit, but we all know that already), but the model related things are just bad.
If you need 300 different POFs loaded at the same time then you have some serious mission design problems. That would make a mission so memory hungry that almost no one would be able to play it very well. At one point that might have been more of an issue (like with the duplicate model thing for texture replacement, which is now obsolete), but it's pretty difficult to even use that many now, much less need that many. But in spite of that, the number of different POFs which can be loaded at once will also end up going dynamic before too much longer.
And the vertex buffers per submodel thing: that limit would only matter if an Inferno model actually used 24 textures on a single submodel. That doesn't count glowmaps, specmaps, or anything like that, just the base maps. Even a horribly inefficient model would maybe use 10. 95% of existing models (which aren't absolute crap) would use no more than 3. And if you follow the efficient model texturing guidelines then you should really only need 1 texture per submodel.
Do you know off the top of your head what the performance hit is?
Not really. I have never profiled an Inferno build (mainly because I don't use them), but we are probably talking about sub-nanosecond differences. While that's a pretty small number by itself, it will add up to something bigger eventually. For the most part, though, you would never notice the speed drop. By the time you ended up using enough stuff to be able to profile it, the memory usage would be so high and the rendering performance drop so great that it would be practically impossible to measure.