Loading a binary table would certainly be faster, but it has the major downside that you now need a tool to edit it, or even to inspect it... you can't just pull it up w/ Notepad.
And a corrupted binary file is much more difficult to detect.
I say this with a certain degree of confidence. At work, we are currently converting all of our text-based tracking files to a binary format. These are big files; the text files total about 500 MB, and we must process them every 15 minutes, so load time and file size are a big deal...
What we've experienced:
- File size drops to about 60% of the original (FS2 tables would probably compress even better, since there's so much whitespace)
- Loading is about five times faster for fixed-length records; if there's a lot of variable-length stuff, it's only about 1.5 times faster.
- A corrupt file (especially one w/ a bad record length indicator) can wreck your whole day