Author Topic: Woo, VPMagic 0.2 source release!  (Read 9105 times)


Offline dizzy

  • 26
    • http://dizzy.roedu.net
Woo, VPMagic 0.2 source release!
I can try to make the autotools stuff for VPCS. Just give me a URL to your latest version.

Other thing: would you consider unifying all this VP management code? It's kinda useless to have the same code duplicated across all these different projects. It makes maintenance a living hell (if it weren't for the VP file's very simple structure, I'd say there are probably bugs in one VP implementation that aren't in another, and so on, instead of a common library where all the bugs get fixed in one place). It would also let one person (or a few people) focus just on this; specialization is a good thing(tm) :)

About the VP archive speeding up FS access, hmm, I'm not sure. If the game loads all the needed data files before starting a mission, then you mean it would only speed up that loading step (there's also the streaming case, but that should work the same whether from a VP, from separate files, or from a compressed VP, because it means streaming from a single file's contents).

Hmm, I don't have much experience with Windows filesystems, but on what I'm used to (Linux), with filesystems like reiserfs this shouldn't be a real issue. OK, the index access should be a lot faster (being all in memory), but that also means the index always has to fit in memory (i.e. it can't get huge). Another thing: actually reading the file contents shouldn't be faster from a VP than from a normal filesystem, because the same thing that slows the filesystem down applies: fragmentation. Both on a normal filesystem and inside a VP, loading a sequence of files can trigger reads from random positions on the disk, which slows the operation down because of disk latency. But all of this is just theoretical talk; a benchmark should prove it quite easily :)
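Something like this quick (untested) sketch is the kind of benchmark I have in mind -- the paths and file count are made up, the point is just to time many small reads against one big packed read of the same data (and you'd want to drop the OS cache between the two runs for a fair comparison):

Code: [Select]

#include <chrono>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

int main()
{
    using clock = std::chrono::steady_clock;

    // case 1: ~1000 separate small files (made-up test paths)
    std::vector<std::string> names;
    for (int i = 0; i < 1000; ++i)
        names.push_back("testdata/file" + std::to_string(i) + ".bin");

    std::vector<char> buf;
    clock::time_point t0 = clock::now();
    for (size_t i = 0; i < names.size(); ++i) {
        std::ifstream in(names[i].c_str(), std::ios::binary);
        buf.assign(std::istreambuf_iterator<char>(in), std::istreambuf_iterator<char>());
    }
    clock::time_point t1 = clock::now();

    // case 2: the same data packed into one big file
    {
        std::ifstream in("testdata/all_files.vp", std::ios::binary);
        buf.assign(std::istreambuf_iterator<char>(in), std::istreambuf_iterator<char>());
    }
    clock::time_point t2 = clock::now();

    std::printf("separate files: %ld ms\npacked file:    %ld ms\n",
        (long)std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count(),
        (long)std::chrono::duration_cast<std::chrono::milliseconds>(t2 - t1).count());
    return 0;
}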

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
http://cvs.sourceforge.net/viewcvs.py/alliance/vpcs2/

Unifying code: i already have - all my apps use my VP implementation

I was the first one out there with a serious tool for making VPs -- VPView could read them before mine, but mine could write them first, and the DM tools are written in MFC *barf*

I'm not rewriting four or five programs to appease someone's unification dream - and like hell i'm standardizing ANYTHING on an STL class -- they're decent classes but you'll never see me use them as parent classes.


It's faster to search through an array in memory (which you can load the VP FAT into) than to iterate through a list on disk.

"compressed" VP mode would

NTFS is more like a big database than a filesystem -- lotsa overhead because of that.

Fragmentation of one large file has less impact than a hundred smaller files being fragmented.
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline dizzy

  • 26
    • http://dizzy.roedu.net
Woo, VPMagic 0.2 source release!
Nobody is telling you what to do; those are suggestions based on arguments, and we can talk arguments. These are technical decisions and as such should be based strictly on technical arguments. Many brains always think better than a single one.

I will look into adding autotools support to that. I will also look through the API and make my suggestions on it (if any).

Having unified code for reading/writing VPs is better than having a dozen separate codebases (this applies to any set of code that is supposed to do exactly the same thing), because bugs get fixed in a single tree, and because working on a single tree concentrates more brainpower than having people work on separate trees. All the other code might just end up using your API if that's what we conclude in the end, but that would still be unification :) I was hoping you'd be more open to this type of discussion, but I guess you already had a lot of talks about these things long before I started looking at this, so my suggestions may sound like a waste of time.

Do you have any technical problems with the C++ standard library? Trying to program according to a standard (by using its features and designing your own code around similar APIs) has advantages; that's why standards exist: so that people don't reinvent the wheel all the time and all "speak a different language". Of course, I'm in no way promoting "always do it the C++ standard library way" if that's technically proven not to be so good :)

I wasn't talking about fragmentation inside the VP file when reading a single file, but about the I/O seeking involved (the reason fragmentation is bad in general) when reading some set of files, whether from the filesystem or from a VP. That list of files isn't necessarily sorted in a way that puts their contents one after another in the VP; they can be in totally random positions. So when reading such a set (say 1000 random files, from a VP or from the filesystem) you might get almost the same latency from I/O seeks. It also depends on file sizes: if most of the files read from VPs are small enough that reading them from a filesystem would stress the OS block cache more than reading them from a single file (i.e. you get more cache hits when reading from a single file), then again the VP wins.

In general, I am not trying to prove that VP is slower. It is very clear that IN GENERAL it is faster. But until someone comes up with actual benchmark numbers, I can argue that the difference might not be very big, that's all :)

About the argument that compression would eliminate the VP speed benefits, I tend to disagree. It depends on how you program it. For example, if we go the route of compressing the "data content" as a whole, then the problem would be seeking to a specific file inside that compressed stream. But because "deflate" compresses in blocks of 32 KB, the index can store, instead of the direct offset into the VP file, an offset to the start of the compression block that contains the beginning of that file, plus an offset inside that block (counted as if it were uncompressed), so you know EXACTLY where the file starts.

So starting to read a specific file from the compressed VP means seeking to the beginning of the compression block that contains the (uncompressed) start of the file, decompressing that block, and then reading the file data from where the file starts inside the decompressed block. The rest of the file content is read just by continuing with the next compression blocks (which are contiguous in the VP file) and decompressing them until you reach the end of the file. So, as you can see, with this design there are no additional I/O seeks; there is only ONE seek (just as in the uncompressed case). The only additional cost I can think of here is the decompression itself. But since (as far as I can tell from watching my system's I/O activity when loading a mission) reading from a VP is I/O bound, the extra CPU cost of decompression should go unnoticed and, if programmed well, should not introduce additional delays.
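A rough sketch of what I mean (untested, names made up; I assume here that every 32 KB block is stored as its own little zlib stream so it can be decompressed on its own, and that the index also remembers each block's compressed size):

Code: [Select]

#include <zlib.h>
#include <cstdio>
#include <vector>

// hypothetical index entry for a compressed VP (.cvp)
struct cvp_index_entry {
    char filename[32];
    int  block_offset;     // byte offset of the compression block in the .cvp file
    int  block_csize;      // compressed size of that block
    int  offset_in_block;  // where the file starts inside the DECOMPRESSED block
    int  file_size;        // uncompressed size of the file
};

// read and decompress the block that holds the start of the file:
// one seek, one read, then pure CPU work - no extra I/O seeks
std::vector<unsigned char> read_first_block(std::FILE* fp, const cvp_index_entry& e)
{
    std::vector<unsigned char> comp(e.block_csize);
    std::fseek(fp, e.block_offset, SEEK_SET);          // the ONE seek
    std::fread(comp.data(), 1, comp.size(), fp);

    std::vector<unsigned char> plain(32 * 1024);       // 32 KB block size
    uLongf plain_len = plain.size();
    uncompress(plain.data(), &plain_len, comp.data(), comp.size());
    plain.resize(plain_len);
    return plain;   // the file's data begins at plain[e.offset_in_block]
}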

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
Quote
Originally posted by dizzy
I will look into adding autotools support to that. I will also look through the API and make my suggestions on it (if any).

ty


Quote
Originally posted by dizzy
Do you have any technical problems with the C++ standard library?


yes - there are compiler portability issues, severe performance issues and it's impossible to debug into the STL and see anything useful


Quote
Originally posted by dizzy
Trying to program according to a standard (by using its features and designing your own code around similar APIs) has advantages; that's why standards exist:


however coding to standards for the sake of coding to standards is inherently stupid - VPs are NOT anything like ANY of your standard streams - trying to treat them as if they were is inefficient at best

Quote
Originally posted by dizzy
so that people don't reinvent the wheel all the time and all "speak a different language". Of course, I'm in no way promoting "always do it the C++ standard library way" if that's technically proven not to be so good :)


Writing a class that treats a file in a manner that is SANE to its internal formatting is not "reinventing the wheel" - attempting to treat a VP like it could be class vp : public fstream is making a square wheel and calling it perfect


the file listing in a VP is structured in a specific manner - files are sorted alphabetically (case sensitive) inside their folders - and i believe folders are sorted alphabetically as well, i can go back and check this.



Quote
Originally posted by dizzy
About the argument that compression would eliminate the VP speed benefits, I tend to disagree. It depends on how you program it. For example, if we go the route of compressing the "data content" as a whole, then the problem would be seeking to a specific file inside that compressed stream. But because "deflate" compresses in blocks of 32 KB, the index can store, instead of the direct offset into the VP file, an offset to the start of the compression block that contains the beginning of that file, plus an offset inside that block (counted as if it were uncompressed), so you know EXACTLY where the file starts.


A) you have no way to sort that block in the VP - there is NO ROOM for it in the FAT - we would have to either break support or calculate the block and seek to its beginning instead of knowing the position of the file and seeking to it to read
B) You have to DECOMPRESS the bloody file

Effect(A): Break Compatibility, or Increase CPU Costs
Effect(B): Massive Increase in CPU Costs
Effect(A+B): Requires a MASSIVE rework of the filesystem module of fs2_open

A = Unacceptable
B = Unacceptable
A+B = You're out of your mind

I FLAT OUT REFUSE to support ANY type of compression in VP files as it is antithetical to the design philosophy of the file and its usage


You seem to be blissfully ignorant of the CPU costs of decompression, the delicacy of the filesystem module in fs2_open, and the purpose of VP files.

Reading from a VP:
* Open File, Seek to FAT, bitblt FAT to memory
* (Close file if you wish)
* Find file in FAT
* (Open File if you closed it)
* Seek to File's offset
* Read File (often i just bitblt small files like textures into a memory block and do everything in RAM)
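in rough code that path is about this - from memory and untested, using the header/FAT structs i post further down the thread:

Code: [Select]

#include <cstdio>
#include <cstring>
#include <vector>

struct Read_VPHeader { char signature[4]; int version; int diroffset; int direntries; };
struct Read_VPDirent { int offset; int size; char filename[32]; int timestamp; };

// one seek to the FAT, one read of the whole FAT, then everything is in RAM
std::vector<Read_VPDirent> load_fat(std::FILE* fp, Read_VPHeader& hdr)
{
    std::fread(&hdr, sizeof(hdr), 1, fp);      // header sits at offset 0
    std::vector<Read_VPDirent> fat(hdr.direntries);
    std::fseek(fp, hdr.diroffset, SEEK_SET);
    std::fread(fat.data(), sizeof(Read_VPDirent), fat.size(), fp);
    return fat;
}

// find the entry, seek once to its offset, bitblt the file into memory
std::vector<char> read_file(std::FILE* fp, const std::vector<Read_VPDirent>& fat, const char* name)
{
    for (const Read_VPDirent& e : fat) {
        if (e.size > 0 && std::strcmp(e.filename, name) == 0) {
            std::vector<char> buf(e.size);
            std::fseek(fp, e.offset, SEEK_SET);
            std::fread(buf.data(), 1, buf.size(), fp);
            return buf;
        }
    }
    return std::vector<char>();   // not found
}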


Reading from a theoretical compressed VP
* Open File, Read Header, Seek to the FIRST BLOCK that contains the FAT, decompress it: find the beginning of the FAT in that block, start reading it - decompress any more blocks you need
* Store FAT in memory
* (Close file if you wish)
* Find file in FAT
* (Open file if you closed it)
* Seek to first compression block, decompress it, find beginning of file, read on, decompressing any additional blocks you need


---------------------

Compression VASTLY increases EVERY form of overhead, breaks compatibility, and serves no real purpose

[edit again]

oh.. and most of the files put in VPs aren't compressible because the file type itself is compressed data (.dds, .jpg, .png) or is uncompressible data (POFs aren't really very compressible.. )

uncompressible/already-compressed data accounts for probably about 80% of the data in a VP
« Last Edit: August 31, 2005, 09:57:33 am by 30 »
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline Goober5000

  • HLP Loremaster
  • Moderator
  • 214
    • Goober5000 Productions
Woo, VPMagic 0.2 source release!
Try not to get annoyed at Kazan, dizzy... whenever he gets in a discussion, whether his points are valid or not, he tends to come off as rather brash. :)

 

Offline taylor

  • Super SCP/Linux Guru
  • Moderator
  • 212
    • http://www.icculus.org/~taylor
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan
my VPCS 2.x should compile and run on linux just fine - just like PCS 2.x

Sorry to say, but no.  You are using several Windows specific commands in there just like you tend to do with most everything you write and don't provide a replacement, or even provide notice, that said functions are Windows only.  GCC likes to hate how you do some C++ things too.  It's just like the fs2netd server which took me all of 20 minutes to make Linux friendly but over 5 hours to find replacements for the DOS specific commands.

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
Quote
Originally posted by taylor

Sorry to say, but no.  You are using several Windows specific commands in there just like you tend to do with most everything you write and don't provide a replacement, or even provide notice, that said functions are Windows only.


A) what functions am I using that are Windows only
B) I don't provide replacements because I don't know what functions I'm using that are Windows only
C) I would find out which functions these are when I get around to making sure it compiles in Linux - however I am working on functionality right now and not compatibility - compatibility is a cleanup phase

Quote
Originally posted by taylor

 GCC likes to hate how you do some C++ things too.


Yeah.. some shiat that, when done the stylistically correct way, makes MSVC *****.

I should get in the habit of writing on GCC then porting to MSVC - it would be faster and easier - but I'm not using Linux as my primary OS at home.

Quote
Originally posted by taylor

  It's just like the fs2netd server which took me all of 20 minutes to make Linux friendly but over 5 hours to find replacements for the DOS specific commands.


what DOS specific function was I using?

[edit]
filelength(FILE *);

DUH

i forgot to remove that and replace it with a seekg/tellg
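something like this (quick, untested sketch) is the seekg/tellg replacement i mean -- open the stream in binary mode so the byte count is exact:

Code: [Select]

#include <fstream>

// portable replacement for filelength(): remember where we are, seek to the
// end, read the position (== size in bytes for a binary stream), seek back
std::streamoff portable_filelength(std::ifstream& in)
{
    std::streampos here = in.tellg();
    in.seekg(0, std::ios::end);
    std::streamoff len = in.tellg();
    in.seekg(here);
    return len;
}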

some of my older code that is getting "rolled up" into newer apps was written before my emphasis on platform agnosticism so until it gets cleaned some of it will contain old references.

feel free to do those cleanups anytime you find one that needs done
« Last Edit: August 31, 2005, 12:03:19 pm by 30 »
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline taylor

  • Super SCP/Linux Guru
  • Moderator
  • 212
    • http://www.icculus.org/~taylor
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan
A) what functions am I using that are Windows only
B) I don't provide replacements because I don't know what functions I'm using that are Windows only
C) I would find out which functions these are when I get around to making sure it compiles in Linux - however I am working on functionality right now and not compatibility - compatibility is a cleanup phase

Mainly file access stuff I think, like you said "filelength".  I think there was something else too but I don't remember off the top of my head since I've been too involved with the OSX version the past few days and can't really concentrate on anything else.  It's nothing that can't be fixed, but it is annoying that you have to compile, fix, compile, fix all day since those things aren't marked, and going line by line through the code looking for stuff isn't any faster.  That stuff is pretty much always small though, and it's the compiler errors which can take real time to deal with.

Quote
what DOS specific function was I using?

kbhit and something else.  It's been months since I looked at any of that so I'm not sure exactly what all I changed.  I eventually found a GPL'd replacement for kbhit that fit easily into what you already had coded and used it instead.  That's stuff that never hit CVS though since I still don't know if it even worked right or not.

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
kbhit is from conio.h .. and why there was a reference to kbhit in a GUI app is beyond me :D

like i said - VPHandler was old functional code imported up into a new project

-------------

look at this much newer code

http://cvs.sourceforge.net/viewcvs.py/alliance/pcs2/FileList.h?rev=1.5&view=auto

(lol i just noticed i had two case-change functions in there from when it was used in another app with std::string - kaz_string has str_to_lower and str_to_upper in the class)

http://cvs.sourceforge.net/viewcvs.py/alliance/pcs2/FileList.cpp?rev=1.5&view=auto
« Last Edit: August 31, 2005, 01:08:40 pm by 30 »
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline taylor

  • Super SCP/Linux Guru
  • Moderator
  • 212
    • http://www.icculus.org/~taylor
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan
kbhit is from conio.h .. and why there was a reference to kbhit in a GUI app is beyond me :D

No, I'm talking about fs2openPXO.  It used kbhit, which is old DOS functionality and by no means cross-platform.

 

Offline dizzy

  • 26
    • http://dizzy.roedu.net
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan

yes - there are compiler portability issues, severe performance issues and it's impossible to debug into the STL and see anything useful

however coding to standards for the sake of coding to standards is inherently stupid - VPs are NOT anything like ANY of your standard streams - trying to treat them as if they were is inefficient at best



Oki, actually at work we use the STL on large projects (projects which handle tens of megabytes per second of file processing and many clients at a time, so I'm not talking about GUIs or something, heh) and I wouldn't say it's impossible to debug... :) But anyway, I can't really prove this with words without a specific example, so I'll let my code speak for itself, as I do make use of the STL and the rest of the standard C++ library where that usage is good. Also, about portability, I don't think FS2, being a C++ program, should try to support systems that are not fully ANSI C++98 compliant; otherwise why don't we switch FS2 to C89 or something...

Quote

Writing a class that treats a file in a manner that is SANE to its internal formatting is not "reinventing the wheel" - attempting to treat a VP like it could be class vp : public fstream is making a square wheel and calling it perfect


I wouldnt use "perfect" for any codes in the real world :) Also, I very much agree with your point, if this is the case here, which is why I'm trying to discuss about this, I would expect great chances to be wrong because I don't have much FS2 codes experience.

Quote

the file listing in a VP is structured in a specific manner - files are sorted alphabetically (case sensitive) inside their folders - and i believe folders are sorted alphabetically as well, i can go back and check this.


Is this sorting a mandatory requirement? Why? :) (please bear with me...)


Quote

A) you have no way to sort that block in the VP - there is NO ROOM for it in the FAT - we would have to either break support or calculate the block and seek to its beginning instead of knowing the position of the file and seeking to it to read
B) You have to DECOMPRESS the bloody file


Here you lost me... By FAT do you mean the index in the VP file? Presuming you're talking about the index, and that it really needs to be sorted, I don't understand what sorting the index has to do with how the files are stored (compressed or not). In the end, sorting the index just means reordering the index entries; their values (like the offset to the compressed block that contains the beginning of the file) stay the same. Do you mean the file contents should also be stored in the order found in the index? What would be the purpose of that? :)

Quote

You seem to be blissfully ignorant of the CPU costs of decompression, the delicacy of the filesystem module in fs2_open, and the purpose of VP files.


Well, I've been saying that all along, no need for 3 replies to realise that :) Yes, I am TOTALLY ignorant of FS2's needs from VP files, which probably means their main USE, their main REASON to exist. That's why I'm arguing here, so someone can clarify this for me.

Quote

Reading from a VP:
* Open File, Seek to FAT, bitblt FAT to memory
* (Close file if you wish)
* Find file in FAT
* (Open File if you closed it)
* Seek to File's offset
* Read File (often i just bitblt small files like textures into a memory block and do everything in RAM)


Ok, this doesn't sound too far from what I had in mind for FS2's usage, even without reading the code...

Quote

Reading from a theoretical compressed VP
* Open File, Read Header, Seek to the FIRST BLOCK that contains the FAT, decompress it: find the beginning of the FAT in that block, start reading it - decompress any more blocks you need
* Store FAT in memory
* (Close file if you wish)
* Find file in FAT
* (Open file if you closed it)
* Seek to first compression block, decompress it, find beginning of file, read on, decompressing any additional blocks you need


Actually, compressing the index (if that is what you mean by FAT) was just an option. My last proposal for the compressed VP file didn't have that option, so the index is still stored uncompressed. The flow then becomes:
* Open file, read header, read FAT into memory (if that is what you mean by bitblt)
* (Close file if you wish)
* Find file in index (or FAT)
* (Open file if you closed it)
* Seek to first compression block (the offset to it is already in the index, so it's just one I/O seek), decompress it (OK, some CPU load), find the beginning of the file (the block was decompressed in the previous step IN MEMORY (where else), so "finding the beginning of the file" just means returning the data that starts there; you don't need to "seek" to it, since the offset within the decompressed block is stored in the index too)

So again, the only overhead I see here is the CPU used by decompression. Since you still think that's a huge overhead and I don't, I've already started working on my own VP library that WILL have this kind of compression support, so you can expect some benchmark numbers really soon :) Let the numbers speak for themselves...

Quote

Compression VASTLY increases EVERY form of overhead, breaks compatibility, and serves no real purpose


About breaking compatibility, I think Taylor's idea of giving these files their own extension (.cvp) would solve that, because people will see whether their tools support only .vp or also .cvp.

About purpose, well, I can't say I see a big one either... it's not like disk usage is much of an issue, and as you also said, many files are already compressed. However, there are a lot of "large" files in VPs that do compress a lot; people who distribute uncompressed VP files (i.e. non-zipped) should be hunted and killed :)

Quote

oh.. and most of the files put in VPs aren't compressible because the file type itself is compressed data (.dds, .jpg, .png) or is uncompressible data (POFs aren't really very compressible.. )

uncompressible/already-compressed data accounts for probably about 80% of the data in a VP


Again, I don't have much experience, but take the fsport VP files for example:
-rw-r--r--  1 dizzy users  82120116 Aug 29 11:45 tango_fs1.zip
-rw-r--r--  1 dizzy users 159327089 Aug 29 00:57 tango_fs1.vp

I would say that is a huge difference :) (happens on some other fsport VP files too).

In conclusion (if I'm allowed to draw one :)), let's see how my VP lib project develops and wait for the benchmark numbers. There doesn't seem to be much purpose for compressed VP files, because their only purpose would be to reduce used disk space AFTER INSTALLATION (that is, people who DISTRIBUTE uncompressed VP files should be killed as stated; that is different from the VP files you then keep in your FS2 dir, which can stay uncompressed), and I don't think many games out there do that right now, as disk space is very cheap. About unification of the VP code, I still think it's something that would prove good over time, but if someone as experienced and influential as Kazan says it's stupid then I'll drop it (for the moment :)).

 

Offline dizzy

  • 26
    • http://dizzy.roedu.net
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan

however I am working on functionality right now and not compatibility - compatibility is a cleanup phase


Ok, I don't want to start another flame war or get too personal, but I think this principle is totally wrong. If you intend (from the beginning, that is) to have your code working on multiple platforms, then compatibility is NOT a cleanup phase but actually a DESIGN phase. Of course this requires good knowledge of how things are done on multiple platforms, and if this knowledge is lacking (hell, I would be the first to admit that I don't know how a lot of things are done on Win32), then of course things get done within the limits of the existing knowledge.

But if someone knows, for example, how to set up a timer on Win32 and how to set up a timer on POSIX systems, and notices that the Win32 call has arguments that do not exist/have no meaning on POSIX, and that those arguments are NOT necessary for the project they are working on, then I think that person should write a wrapper with the common interface instead of a direct call, with specific implementations for the specific systems. Would you agree? :)
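Something like this is what I mean (just a sketch, the names are made up; the common interface is a monotonic millisecond tick and each platform fills it in its own way):

Code: [Select]

// the one interface everybody calls (hypothetical name)
unsigned long get_ticks_ms();   // milliseconds since some arbitrary start

#ifdef _WIN32
#include <windows.h>
unsigned long get_ticks_ms()
{
    return GetTickCount();                     // Win32-specific call hidden behind the wrapper
}
#else
#include <time.h>
unsigned long get_ticks_ms()
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);       // POSIX-specific call hidden the same way
    return (unsigned long)(ts.tv_sec * 1000UL + ts.tv_nsec / 1000000UL);
}
#endif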

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
the File Allocation Table in a VP has a VERY specific format in data and in ORDER

here is the header and the FAT table entry

Code: [Select]


struct Read_VPHeader
{
    char signature[4];   //"VPVP"
    int  version;        //"2"
    int  diroffset;      //bytes from beginning of file
    int  direntries;     //number of files in directory
};

struct Read_VPDirent
{
    int  offset;         //from beginning of file
    int  size;
    char filename[32];   //Null-terminated string
    int  timestamp;      //The time the file was last modified in seconds since 1.1.1970 0:00
                         //Same as from calling findfirst/findnext file using any C compiler.
};


DIRECTORY data is stored in here too - in a sorta weird way.

FS2 reads the FAT sequentially and watches for "special" entries that it takes as directory change directives

[this is off the top of my head and i haven't rewritten a VP writer in several years - consult code and take it as the authoritative source in any discrepancies]
a "enter directory" 'directive' has offset = (offset of first file in directory), size = 0, filename = directory name, timestamp = 0

a "leave directory" `directive has offset = (offset of first file in the next directory), size = 0, filenane = "..", size = 0"

code note - VPHandler writes
Code: [Select]

//Filename[32] = 0x2E 2E 00 CCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCC [Hex Bytes]

for "leave directory" - which is exactly what [V]'s tool wrote


Here follows an example of a simple FAT
----------------------

FILE { 16, 0, 'data', 0 }
FILE { 16, 0, 'effects', 0 }
FILE { 16, 1024, 'somefile', timestamp }
FILE { 1040, 126, 'fileb', timestamp }
FILE { 1166, 0, '..', 0 }
FILE { 1166, 0, 'tables', 0 }
FILE { 1166, 2048, 'menu.tbl', 0 }

-----------

[edit]

and data is packed

[16-byte header]
[file1]
[file2]
[file3]
[...]
[fileN]
[FAT ENTRY 0]
[FAT ENTRY 1]
[...]
[FAT ENTRY N]
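
in rough code (again off the top of my head, untested - the real reader is the authoritative source) scanning the FAT with the directory directives looks something like this:

Code: [Select]

#include <cstring>
#include <string>
#include <vector>

// same Read_VPDirent as above
struct Read_VPDirent { int offset; int size; char filename[32]; int timestamp; };

// walk the FAT in order, keeping a directory stack; size == 0 entries are the
// directory-change directives (".." = leave, anything else = enter)
std::vector<std::string> list_full_paths(const std::vector<Read_VPDirent>& fat)
{
    std::vector<std::string> dirs;    // current directory stack
    std::vector<std::string> paths;   // full path of every real file

    for (const Read_VPDirent& e : fat) {
        if (e.size == 0) {
            if (std::strcmp(e.filename, "..") == 0) {
                if (!dirs.empty()) dirs.pop_back();   // leave directory
            } else {
                dirs.push_back(e.filename);           // enter directory
            }
        } else {
            std::string path;
            for (size_t i = 0; i < dirs.size(); ++i) path += dirs[i] + "/";
            paths.push_back(path + e.filename);       // e.offset/e.size locate the data
        }
    }
    return paths;
}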
« Last Edit: August 31, 2005, 01:44:53 pm by 30 »
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
Quote
Originally posted by dizzy


Ok, I don't want to start another flame war or get too personal, but I think this principle is totally wrong. If you intend (from the beginning, that is) to have your code working on multiple platforms, then compatibility is NOT a cleanup phase but actually a DESIGN phase.



perhaps you should pay attention to nuance and context a little more closely BEFORE YOU ****ING FLAME

Cross-platform compatibility is CONSIDERED FROM THE DESIGN STAGE - the "cleanup issues" are CLEANING UP THINGS THAT MAKE ONE COMPILER ***** BUT ANOTHER NOT, AND CROSS-PLATFORMIZING OLD LIBRARIES THAT GOT IMPORTED UP INTO THE NEW APP


Quote
Originally posted by dizzy

But if someone knows, for example, how to set up a timer on Win32 and how to set up a timer on POSIX systems, ...


I'm using wxWidgets - NONE of my GUI code is platform-specific in my 2.x trees
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
i don't see the POINT of a .cvp - it's POINTLESS

it just SLOWS things down

i can get a 300GB Maxtor 5400rpm 2MB cache ATA133 drive on pricewatch for $111 [inc shipping]

harddrive space = not a problem
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline dizzy

  • 26
    • http://dizzy.roedu.net
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan

perhaps you should pay attention to nuance and context a little more closely BEFORE YOU ****ING FLAME

Cross-platform compatibility is CONSIDERED FROM THE DESIGN STAGE - the "cleanup issues" are CLEANING UP THINGS THAT MAKE ONE COMPILER ***** BUT ANOTHER NOT, AND CROSS-PLATFORMIZING OLD LIBRARIES THAT GOT IMPORTED UP INTO THE NEW APP


hahahaha, perhaps I should do that more often (i.e. read more carefully before replying), sorry for that :):)

 

Offline taylor

  • Super SCP/Linux Guru
  • Moderator
  • 212
    • http://www.icculus.org/~taylor
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan
i don't see the POINT of a .cvp - it's POINTLESS

it just SLOWS things down

Generic compression just to compress is pointless and slows things down.  I'm in full agreement there, and that's why I hated the idea of a compressed VP because of its "just compress it" nature.  I want to do it right though and get compression with speed valued more than size, and I guarantee that if I can't work out the speed issues to my satisfaction the code isn't ever going to see CVS.

You have to remember though that all of the Quake and Doom3 based games store their files in compressed archives.  It is feasible to do in a large game.  I'm not sure that same approach would work well for FS but something better is needed than the standard VP.  I've currently got 4.7Gig worth of FS2 and related mods on my hard drive.  That's insane.  Hard drive space may not be a problem for you (or me for that matter) but it will be for some.  Remember that we are supporting multiple platforms and on some of those it may not be so easy to just get a cheap hard drive for extra space.  Something will have to be done eventually and though it may be a few months before I actually get working code, I'm hoping that CVP will be part of the solution to that.

 

Offline Kazan

  • PCS2 Wizard
  • 212
  • Soul lives in the Mountains
    • http://alliance.sourceforge.net
Woo, VPMagic 0.2 source release!
well i can think of a MUCH SANER way to store the FAT in a packfile


Code: [Select]



struct fe_packhead // 16 bytes
{
    fe_char filesig[8];
    fe_int  filever;
    fe_int  numfiles;
};

struct fe_pack_frecord // 256 bytes
{
    fe_char filename[120];
    fe_char direct[128];
    fe_int  file_size;
    fe_int  file_offset; // offset from the beginning of the file
};

http://cvs.sourceforge.net/viewcvs.py/alliance/ferrium/FileSystem/FE_Pack.h?rev=1.6&view=markup

http://cvs.sourceforge.net/viewcvs.py/alliance/ferrium/FileSystem/FE_Pack.cpp?rev=1.11&view=auto
PCS2 2.0.3 | POF CS2 wiki page | Important PCS2 Threads | PCS2 Mantis

"The Mountains are calling, and I must go" - John Muir

 

Offline Goober5000

  • HLP Loremaster
  • Moderator
  • 214
    • Goober5000 Productions
Woo, VPMagic 0.2 source release!
Quote
Originally posted by Kazan
perhaps you should pay attention to nuance and context a little more closely BEFORE YOU ****ING FLAME

Cross-platform compatibility is CONSIDERED FROM THE DESIGN STAGE - the "cleanup issues" are CLEANING UP THINGS THAT MAKE ONE COMPILER ***** BUT ANOTHER NOT, AND CROSS-PLATFORMIZING OLD LIBRARIES THAT GOT IMPORTED UP INTO THE NEW APP
Kazan, that was completely uncalled for.  Not only was dizzy not flaming you, he was trying to avoid a possible flame war by prefacing his comments with "Ok I don't want to start another flame war or get too personal".

Yours is entirely unwarranted behavior for a member of the SCP, especially since he was trying to offer some helpful advice and especially since you were the cause of the misunderstanding.  Cross-platform programming is part of the design stage, and if you're splitting hairs then it's no wonder your post was confusing.

 

Offline Inquisitor

Woo, VPMagic 0.2 source release!
Because someone disagrees does not mean someone is flaming.

I am afk this week, I will be back Sunday night.

Be nice.
No signature.