I can't think of many applications where computers need that kind of temporal precision. Games come to mind, maybe weapons guidance systems and avionics, and most of that stuff is handled at the hardware level by a real-time clock chip. For database and financial applications I don't see that kind of precision being required. I'm sure there were a few applications where the bug genuinely needed fixing.
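For what it's worth, the failure mode people kept pointing at wasn't really about precision, just two-digit year arithmetic. A made-up sketch of the idea (the struct, field, and function names here are hypothetical, not from any real system):

    #include <stdio.h>

    /* Hypothetical record: many old systems stored only the last
       two digits of the year to save space. */
    struct account {
        int open_year;  /* two-digit year, e.g. 85 for 1985 */
    };

    /* Hypothetical helper: account age, with the "current year"
       also passed in as two digits. */
    int account_age(const struct account *a, int current_yy) {
        return current_yy - a->open_year;
    }

    int main(void) {
        struct account a = { 85 };  /* opened in 1985 */
        printf("age in 1999: %d\n", account_age(&a, 99));  /* 14, fine  */
        printf("age in 2000: %d\n", account_age(&a, 0));   /* -85, oops */
        return 0;
    }

Whether that kind of code was actually common enough to justify the panic is a separate question.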
The Y2K bug was blown way out of proportion by people who didn't get what was going on. I can imagine a programmer being told by his boss to fix a non-existent problem in an application, then using the opportunity to twiddle his thumbs and code one of his own projects while pretending to fix the bug for two years, raking in the dough the whole time.
The other thing I didn't get about the Y2K bug was that 2000 is a very round number in decimal, whereas computers like powers of two. I could understand a crash in 2048, for example. I had some programming knowledge before Y2K, and even now it still seems like a bunch of BS.
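To be fair to that instinct, there is a real binary rollover of roughly that kind: Unix time kept in a signed 32-bit counter runs out in January 2038. A quick C sketch showing the cutoff (just standard library calls; the printed hour depends on your timezone, and it assumes your system's time_t is wider than 32 bits):

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* The largest second count a signed 32-bit time_t can hold:
           2147483647 seconds after 1970-01-01. */
        time_t last = (time_t)INT32_MAX;
        printf("32-bit unix time runs out at: %s", ctime(&last));
        /* On a machine with 64-bit time_t this prints a local-time
           date in January 2038. */
        return 0;
    }

So the power-of-two rollover exists, it's just a different bug than the decimal two-digit one everyone called Y2K.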