OK, I might be missing something here, but that n² complexity would only apply on a per-subobject basis, wouldn't it? Maybe even just on a 'groups of linked verts' basis? (Obviously you can't smooth across disconnected verts, so there'd be no point checking them - see the sketch below for the kind of grouping I mean.)
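Just to illustrate the idea - this is a rough hypothetical sketch, not anything from PCS2's actual code: union together verts that share a face, then run the expensive pairwise smoothing check only within each group rather than over the whole vertex list.

```cpp
// Hypothetical sketch only, NOT PCS2 code.
// Group vertices that share a face so the O(n^2) smoothing pass can run
// per group instead of across every vertex in the model.
#include <cstdio>
#include <numeric>
#include <vector>

struct UnionFind {
    std::vector<int> parent;
    explicit UnionFind(int n) : parent(n) { std::iota(parent.begin(), parent.end(), 0); }
    int find(int v) { return parent[v] == v ? v : parent[v] = find(parent[v]); }
    void unite(int a, int b) { parent[find(a)] = find(b); }
};

int main() {
    // Two disconnected triangles: verts 0-2 never need to be checked
    // against verts 3-5 when smoothing.
    std::vector<std::vector<int>> faces = {{0, 1, 2}, {3, 4, 5}};
    UnionFind uf(6);
    for (const auto& f : faces) {
        uf.unite(f[0], f[1]);
        uf.unite(f[1], f[2]);
    }
    for (int v = 0; v < 6; ++v)
        printf("vert %d -> group %d\n", v, uf.find(v));
    return 0;
}
```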
I ask because it currently appears to apply to the whole polycount. I have a 40 292-poly version of the same model, split into 3 subobjects of 13 506, 11 626 and 15 160 polys. The Lucifer, at 12 304 polys, loads in 20 seconds. Logically, if PCS2 were applying the n² cost on a per-subobject basis, it should take roughly a minute to load, right? (Three subobjects, each roughly Lucifer-sized, so roughly three 20-second passes.)
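To put some rough numbers on that (back-of-envelope only, assuming load time scales purely with polycount squared, calibrating off the 12 304-poly / 20-second Lucifer and ignoring any constant overhead):

```cpp
// Back-of-envelope estimate, not PCS2 code: scale the Lucifer's 20 s by (n / 12304)^2.
#include <cstdio>

int main() {
    const double base_polys = 12304.0;  // Lucifer polycount
    const double base_secs = 20.0;      // observed Lucifer load time
    const double sub[] = {13506.0, 11626.0, 15160.0};

    double per_subobject = 0.0;
    for (double n : sub)
        per_subobject += base_secs * (n / base_polys) * (n / base_polys);

    const double whole = 40292.0;
    const double whole_model = base_secs * (whole / base_polys) * (whole / base_polys);

    printf("per-subobject n^2: ~%.0f s\n", per_subobject);  // ~72 s, a bit over a minute
    printf("whole-model   n^2: ~%.0f s\n", whole_model);    // ~214 s, i.e. roughly 3.5 minutes
    return 0;
}
```

The whole-polycount estimate lines up suspiciously well with what I'm actually seeing: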
Currently it takes three and a half minutes just to load the POF, which is a painfully long time. I just saved the 40 292-poly model from PCS2 and reopened it - it took just as long, even though the smoothing data should already have been calculated and saved by that point.
Oh - you say this smoothing calculation is only supposed to happen on COB conversion? What's happening on POF load that's taking forever, then?
