Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Kosh on July 09, 2009, 11:02:44 pm
-
How big of a problem would it really have been?
-
Clicky. (http://en.wikipedia.org/wiki/Y2K)
The Year 2000 problem (also known as the Y2K problem, the millennium bug, the Y2K bug, or simply Y2K) was a notable computer bug resulting from the practice in early computer program design of representing the year with two digits. This time code ambiguity caused some date-related processing to operate incorrectly for dates and times on and after January 1, 2000 and on other critical dates which were billed "event horizons". Without corrective action, long-working systems would break down when the "...97, 98, 99..." ascending numbering assumption suddenly became invalid. Companies and organizations worldwide checked, fixed, and upgraded their computer systems.
While no globally significant computer failures occurred when the clocks rolled over into 2000, preparation for the Y2K bug had a significant effect on the computer industry. Countries that spent very little on tackling the Y2K bug (including Italy and South Korea) experienced as few problems as those that spent much more (such as the United Kingdom and the United States), causing some to question whether the absence of computer failures was the result of the preparation undertaken or whether the significance of the problem had been overstated.
-
I know what the wiki says, I'm just asking for other people's opinions.
-
....?
How bad would it have been? :confused:
-
In truth, it depends on which systems were being relied on and how heavily. For example, Y2K would have totally crippled the Fortran-based code that ran the UK's telephone routing system and left everyone unable to contact anyone else; that would have been serious.
-
How big of a problem would it really have been?
(http://www.gcgraphix.com/AMST1/2.jpg)
this big.
-
I'm feeling just a touch more optimistic than that.
In my picture there's at least someone still alive.
-
Y2K would have given us Clan heavy 'mechs? Coo.
-
Y2K would have given us Clan heavy 'mechs? Coo.
Damn straight.
-
There would have been a rain of fish, because the Firewalls around the Sea of Sky would have failed, and yea, there would have been a great wailing and proliferation of fish heads.
-
Y2K would have given us Clan heavy 'mechs? Coo.
I couldn't find a picture of MechWarrior: Dark Age that didn't have said title written across it in blood red block letters.
-
This thread is funny
-
(http://www.contrahour.com/contrahour/images/conan_in_the_year_2000.png)
Put in your own funny joke here.....
-
Forget Y2K... you should start worrying about Y2K38.
(http://imgs.xkcd.com/comics/2038.png)
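For anyone curious, here's a minimal C sketch (purely illustrative, not from any real system) of what that rollover looks like: a signed 32-bit count of seconds since 1970-01-01 UTC tops out at 2^31 - 1, which lands on 19 January 2038, and one more tick wraps it negative, i.e. back to 1901 on typical two's-complement machines.
[code]
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int32_t t = INT32_MAX;                  /* last representable second */
    printf("last second: %ld\n", (long)t);

    /* do the +1 in unsigned arithmetic to avoid signed-overflow UB,
     * then convert back; on typical two's-complement systems the
     * result is the familiar wraparound value */
    t = (int32_t)((uint32_t)t + 1u);
    printf("one tick on: %ld\n", (long)t);  /* -2147483648, i.e. 1901 */
    return 0;
}
[/code]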
-
i dont see how it uses less memory to store the year in a decimal format anyway. i mean your typical unsigned 16 bit integer is good all the way up to the year 65535. so i always said y2k was a hoax, long before it happened.
-
I remember dad was working at the health department here, and spent New Year's 1999/2000 at work.
Apparently medical equipment that did stuff with dates was particularly at risk.
To cut a long story short, everything went smoothly. Except the door system, which decided it couldn't let them out.
-
i dont see how it uses less memory to store the year in a decimal format anyway. i mean your typical unsigned 16 bit integer is good all the way up to the year 65535. so i always said y2k was a hoax, long before it happened.
That's because computers don't store years, they store nanoseconds (or ticks).
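Roughly like this on Unix-style systems, at least (a minimal C sketch, not any particular real program; Unix time counts seconds rather than nanoseconds, but the idea is the same): the "current time" is just a count since 1970, and the calendar year is derived from it on demand rather than stored anywhere.
[code]
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);              /* seconds since 1970-01-01      */
    struct tm *cal = localtime(&now);     /* derive year/month/day from it */
    printf("%lld seconds since the epoch -> year %d\n",
           (long long)now, 1900 + cal->tm_year);
    return 0;
}
[/code]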
-
i cant think of many applications where computers need that kind of temporal precision. games come to mind, weapons guidance systems, avionics maybe. and most of that stuff is handled at the hardware level by a realtime clock chip. for database and financial applications i dont see that kind of precision being required. im sure there were a few applications where the bug needed fixing.
the y2k bug was something blown way out of proportion by people who didnt get what was going on. i can imagine a programmer being told by his boss to fix a non-existent problem with an application, and then using the opportunity to twiddle his thumbs and work on coding one of his own projects while pretending to fix the bug for two years, and raking in the dough.
the other thing i didnt get about the y2k bug was that 2000 is a very round number, whereas computers like powers of two. i could understand a crash in 2048 for example. before y2k i had some programming knowledge, but even now it still seems like a bunch of bs.
-
the other thing i didnt get about the y2k bug was that 2000 is a very round number, whereas computers like powers of two.
The issue wasn't related to that; it was because years were stored by their last two digits and it was assumed the first two would be "19", so when the year 2000 came the computer would think it was "1900" instead of "2000". I can see how that might mess up the tax system......
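A toy C example of the kind of breakage that causes (the numbers and variables here are made up, not from any real system): any age or interval computed by subtracting two-digit years goes wrong the moment "99" rolls over to "00".
[code]
#include <stdio.h>

int main(void)
{
    int birth_yy   = 65;   /* stored as two digits, meant as 1965 */
    int current_yy = 99;   /* 1999 */
    printf("age in 1999: %d\n", current_yy - birth_yy);  /* 34, correct   */

    current_yy = 0;        /* 2000 stored as "00" */
    printf("age in 2000: %d\n", current_yy - birth_yy);  /* -65, nonsense */
    return 0;
}
[/code]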
-
i dont know why it uses less memory to store the year in a decimal format anyway. i mean your typical unsigned 16 bit integer is good all the way till the year 65536. so i always said y2k was a hoax, long before it happened.
You also have to remember that some of these systems were designed in the '70s and '80s, when you had, if you were lucky, 64 kilobytes to play with, so they may well have compressed the year into a portion of a 16-bit number, say, 12 bits for the year and 4 bits for the month, etc. And, as Kosh explains above, other systems only stored the last two digits.
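A rough C sketch of that sort of packing (the field widths are just the ones suggested above, not any particular system's layout): the year goes in the top 12 bits of a 16-bit word and the month in the bottom 4, so both fit in two bytes.
[code]
#include <stdio.h>
#include <stdint.h>

/* pack a year (up to 4095) and a month into one 16-bit word */
static uint16_t pack_year_month(unsigned year, unsigned month)
{
    return (uint16_t)((year << 4) | (month & 0x0Fu));
}

static void unpack_year_month(uint16_t packed, unsigned *year, unsigned *month)
{
    *year  = packed >> 4;       /* top 12 bits   */
    *month = packed & 0x0Fu;    /* bottom 4 bits */
}

int main(void)
{
    unsigned y, m;
    uint16_t packed = pack_year_month(1999, 12);  /* 1999 fits in 12 bits */
    unpack_year_month(packed, &y, &m);
    printf("%u-%02u packed into %u bytes\n", y, m, (unsigned)sizeof packed);
    return 0;
}
[/code]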
-
What sort of question is that? It already happened; obviously somewhere between what happened and total Armageddon and genocide of the entire human species is "as bad as it would have been". :rolleyes:
-
What sort of question is that? It already happened; obviously somewhere between what happened and total Armageddon and genocide of the entire human species is "as bad as it would have been". :rolleyes:
Well, it assumes that we did little to prepare for it.
-
i can see breaking up a char into two nibbles and using one for each digit in the year, or just storing the two-digit number in a char type. if the former then yea i can see it breaking quite easily, but in the latter case it would just keep ticking over to 100 (which would have made the year 19100). more likely you would store the entire date in one int: it would have taken 7 bits to store the year, 4 bits for the month, and 5 bits for the date, or 16 bits in all. but you would still need to put all that bit-shifting code and the constant 19 into the program, which would consume program space in memory, probably more than the extra byte or two you would need to store the whole date accurately.
most antique database and spreadsheet programs ive seen or used constantly paged data to and from the hard disk (or tape) to save memory. being a memory scrooge would have only slightly increased the size of the batch of info that could be processed at once, if at all. its hard to fathom that in 30 years of computing nobody upgraded how dates were handled.
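For the "ticking over to 100" case, here's a small C sketch; it leans on the fact that C's struct tm counts years from 1900, which is exactly where the infamous "19100" displays came from.
[code]
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct tm t = {0};
    t.tm_year = 100;                            /* years since 1900 -> 2000 */
    printf("broken : 19%d\n", t.tm_year);       /* prints 19100             */
    printf("fixed  : %d\n", 1900 + t.tm_year);  /* prints 2000              */
    return 0;
}
[/code]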
-
It's possible. I think the bigger problem was the one Kosh described, but for some of these systems, dealing with nibbles was the norm. Take the Intel 4004: it was an inherently 4-bit processor, so everything was, at machine level, dealt with as nibbles, and in fact larger numbers took more computation.
-
i still consider y2k a joke, considering how much was spent on a bug which may not have existed in 99% of software on the market.
when computers moved from 4 bit architectures to 8 and 16 and eventually 32 bit systems, id figure a range of recompiles and tweaks would have needed to be done to simply make the old code work on newer systems, and not updating the code in that re-integration phase to take advantage of the new capabilities seems silly to me.
-
I think there was a media pandemic over it, which the media-sensitive governments of the West reacted to when they should have ignored it, to be honest.
There were risks, some pretty big ones, but Y2K wasn't so much a 'problem' for the industry as a 'big annoying job'.
As for upgrading, I could tell you a story about the Fortran system that used to run the UK's telephone system. It still does.
The reason for this is that the original Fortran got so convoluted and extensive (tens of millions of lines of code) that no-one can actually figure out how it works. There's a massive project going on to decode the original Fortran and write a more modern system, but until then, the UK communication network is relying heavily on 35-year-old Fortran that nobody understands.
-
why not just use a more modern off-the-shelf system?
-
Because the code is too deeply rooted. It's kind of like the problem with the Freespace 2 Engine in a way: limited or no encapsulation, so it's hard to tell where the code ends and the interfaces begin. Sometimes the interface is actually inside the code designed to use it, sometimes it's in an entirely different module, depending on whether a new interface was needed or whether an old one was being re-used, so it's hard to tell what is accessing what in order to achieve the required result.
That's a lot of what the decompiling is centring on: finding the 'link' points between the code and the hardware and abstracting them so that new software can be written, and identifying the algorithms used for defining where signals are sent.
-
It's possible. I think the bigger problem was the one Kosh described, but for some of these systems, dealing with nibbles was the norm. Take the Intel 4004: it was an inherently 4-bit processor, so everything was, at machine level, dealt with as nibbles, and in fact larger numbers took more computation.
The major mission-critical systems I was referring to were often custom-built mainframe systems (IIRC), not rinky-dink Intel-based PCs.
The reason for this is that the original Fortran got so convoluted and extensive (tens of millions of lines of code) that no-one can actually figure out how it works. There's a massive project going on to decode the original Fortran and write a more modern system, but until then, the UK communication network is relying heavily on 35-year-old Fortran that nobody understands.
Lol, that's comforting. So not even the software engineers who designed it know how it works?
-
Yup, that's about it. Because the code was written by a very large number of coders over a long time span, nobody is actually sure how large sections of it are designed. There is extremely limited documentation left, and everyone's afraid to tinker, because if they break something, there's no guarantee that they can fix it...
-
That is at once sad and awesome. :yes:
-
I've got to siggify that, it's just too funny to pass up.
-
i dont see how it uses less memory to store the year in a decimal format anyway. i mean your typical unsigned 16 bit integer is good all the way up to the year 65535. so i always said y2k was a hoax, long before it happened.
In the grim future of the 66th millennium, there are only crashes...
-
of course the bug would come in handy if the machines take over by then :D
-
Forget Y2K... you should start worrying about Y2K38.
(http://imgs.xkcd.com/comics/2038.png)
Offtopic: I lol'd at that. XKCD :D