Author Topic: Okay, Y2K


Offline Unknown Target

  • Get off my lawn!
  • 212
  • Push.Pull?
What sort of question is that? It already happened; obviously, "as bad as it would have been" is somewhere between what actually happened and total Armageddon and the genocide of the entire human species. :rolleyes:

 

Offline Kosh

  • A year behind what's funny
  • 210
Quote
What sort of question is that? It already happened; obviously, "as bad as it would have been" is somewhere between what actually happened and total Armageddon and the genocide of the entire human species. :rolleyes:

Well, the question assumes we had done little to prepare for it.
"The reason for this is that the original Fortran got so convoluted and extensive (10's of millions of lines of code) that no-one can actually figure out how it works, there's a massive project going on to decode the original Fortran and write a more modern system, but until then, the UK communication network is actually relying heavily on 35 year old Fortran that nobody understands." - Flipside

Brain I/O error
Replace and press any key

 

Offline Nuke

  • Ka-Boom!
  • 212
  • Mutants Worship Me
I can see breaking a char into two nibbles and using one for each digit of the year, or just storing the two-digit number in a char type. In the former case, yeah, I can see it breaking quite easily; in the latter, it would just keep ticking over to 100 (which would have made the year print as 19100). More likely you'd store the entire date in one int: the year would take 7 bits, the month 4 bits, and the day 5 bits, so 16 bits in all. But you'd still need to put all that bit-shifting code and the constant 19 into the program, which would consume program memory, probably more than the extra byte or two you'd need to store the whole date accurately.
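A minimal C sketch of both schemes (the 1900 offset, field widths, and names here are illustrative, not taken from any particular legacy system):

Code:
#include <stdio.h>
#include <stdint.h>

/* Scheme 1: a two-digit year counter behind a hard-coded "19" prefix.
   The counter keeps climbing past 99, so 2000 prints as "19100". */
static void print_year(unsigned char years_since_1900)
{
    printf("19%d\n", years_since_1900); /* 99 -> "1999", 100 -> "19100" */
}

/* Scheme 2: the whole date packed into one 16-bit word:
   7 bits of year-since-1900 (0..127), 4 bits of month, 5 bits of day. */
static uint16_t pack_date(unsigned year_since_1900, unsigned month, unsigned day)
{
    return (uint16_t)((year_since_1900 << 9) | (month << 5) | day);
}

static void unpack_date(uint16_t d, unsigned *y, unsigned *m, unsigned *day)
{
    *y   = (d >> 9) & 0x7F; /* 100 = year 2000; this scheme lasts until 2027 */
    *m   = (d >> 5) & 0x0F;
    *day =  d       & 0x1F;
}

int main(void)
{
    print_year(99);  /* 1999 */
    print_year(100); /* 19100 -- the classic display bug */

    unsigned y, m, d;
    unpack_date(pack_date(100, 7, 16), &y, &m, &d);
    printf("%u-%02u-%02u\n", 1900 + y, m, d); /* 2000-07-16 */
    return 0;
}

The packed version survives 2000 fine; the cost is exactly the shifting and masking (plus the 1900 constant) that has to live somewhere in program memory.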

Most antique database and spreadsheet programs I've seen or used constantly paged data to and from the hard disk (or tape) to save memory. Being a memory scrooge would only have slightly increased the size of the batch of data that could be processed at once, if at all. It's hard to fathom that in 30 years of computing nobody upgraded how dates were handled.
« Last Edit: July 16, 2009, 08:25:57 pm by Nuke »
I can no longer sit back and allow communist infiltration, communist indoctrination, communist subversion, and the international communist conspiracy to sap and impurify all of our precious bodily fluids.

Nuke's Scripting SVN

 

Offline Flipside

  • əp!sd!l£
  • 212
It's possible. I think the bigger problem was the one Kosh described, but for some of these systems dealing with nibbles was the norm. The Intel 4004, for example, was an inherently 4-bit processor, so at machine level everything was handled as nibbles; in fact, larger numbers took more computation.
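A small sketch of what that looks like (in C rather than 4004 assembly, and purely illustrative): a two-digit year kept as packed BCD, one decimal digit per nibble, wraps from 99 straight back to 00.

Code:
#include <stdio.h>

/* Packed BCD year: high nibble = tens digit, low nibble = ones digit,
   so 0x99 means '99'. */
static unsigned char bcd_increment(unsigned char y)
{
    if ((y & 0x0F) < 9)
        return y + 1;             /* bump the ones digit */
    if ((y >> 4) < 9)
        return (y & 0xF0) + 0x10; /* carry into the tens digit, ones -> 0 */
    return 0x00;                  /* 99 + 1 wraps to 00: the Y2K rollover */
}

int main(void)
{
    unsigned char year = 0x99; /* '99' in packed BCD */
    year = bcd_increment(year);
    printf("19%x%x\n", year >> 4, year & 0x0F); /* prints "1900", not "2000" */
    return 0;
}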

 

Offline Nuke

  • Ka-Boom!
  • 212
  • Mutants Worship Me
I still consider Y2K a joke, considering how much was spent on a bug which may not have existed in 99% of the software on the market.

When computers moved from 4-bit architectures to 8-, 16-, and eventually 32-bit systems, I'd figure a range of recompiles and tweaks would have been needed just to make the old code work on the newer systems. Not updating the code in that re-integration phase to take advantage of the new capabilities seems silly to me.
« Last Edit: July 16, 2009, 08:33:21 pm by Nuke »
I can no longer sit back and allow communist infiltration, communist indoctrination, communist subversion, and the international communist conspiracy to sap and impurify all of our precious bodily fluids.

Nuke's Scripting SVN

 

Offline Flipside

  • əp!sd!l£
  • 212
I think there was a media pandemic over it, which the media-sensitive governments of the West reacted to when, to be honest, they should have ignored it.

There were risks, some pretty big ones, but Y2K wasn't so much a 'problem' for the industry as a 'big annoying job'.

As for upgrading, I could tell you a story about the Fortran system that used to run the UK's telephone system. It still does.

The reason for this is that the original Fortran got so convoluted and extensive (tens of millions of lines of code) that no one can actually figure out how it works. There's a massive project going on to decode the original Fortran and write a more modern system, but until then, the UK communication network is actually relying heavily on 35-year-old Fortran that nobody understands.

 

Offline Nuke

  • Ka-Boom!
  • 212
  • Mutants Worship Me
Why not just use a more modern off-the-shelf system?
I can no longer sit back and allow communist infiltration, communist indoctrination, communist subversion, and the international communist conspiracy to sap and impurify all of our precious bodily fluids.

Nuke's Scripting SVN

 

Offline Flipside

  • əp!sd!l£
  • 212
Because the code is too deeply rooted. It's kind of like the problem with the Freespace 2 engine, in a way: limited or no encapsulation, so it's hard to tell where the code ends and the interfaces begin. Sometimes the interface is actually inside the code designed to use it, sometimes it's in an entirely different module, depending on whether a new interface was needed or an old one was being re-used, so it's hard to tell what is accessing what in order to achieve the required result.

That's a lot of what the decompiling is centring on: finding the 'link' points between the code and the hardware and abstracting them so that new software can be written, and identifying the algorithms used for defining where signals are sent.
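To make that concrete, a hypothetical before-and-after (in C for brevity; every name here is made up, and the real system is Fortran and far messier):

Code:
#include <stdio.h>
#include <stdint.h>

/* Stand-in for a memory-mapped switch register (a plain global so the
   sketch runs anywhere; real code would poke a fixed hardware address). */
static uint8_t switch_register;

/* Before: the hardware interface sits inside the routing logic itself,
   so you can't tell where the algorithm ends and the interface begins. */
static void route_call_tangled(uint16_t line)
{
    switch_register = (uint8_t)(line >> 8);   /* interface buried in logic */
    switch_register = (uint8_t)(line & 0xFF);
    printf("tangled: line %u selected\n", line);
}

/* After: the 'link' point is abstracted behind one function, so the
   routing algorithm above it can be rewritten without touching hardware. */
static void select_line(uint16_t line)
{
    switch_register = (uint8_t)(line >> 8);
    switch_register = (uint8_t)(line & 0xFF);
}

static void route_call_clean(uint16_t line)
{
    select_line(line); /* algorithm only; hardware details live elsewhere */
    printf("clean: line %u selected\n", line);
}

int main(void)
{
    route_call_tangled(42);
    route_call_clean(42);
    return 0;
}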
« Last Edit: July 16, 2009, 09:11:22 pm by Flipside »

 

Offline Kosh

  • A year behind what's funny
  • 210
Quote
It's possible. I think the bigger problem was the one Kosh described, but for some of these systems dealing with nibbles was the norm. The Intel 4004, for example, was an inherently 4-bit processor, so at machine level everything was handled as nibbles; in fact, larger numbers took more computation.

The major mission-critical systems I was referring to were often custom-built mainframe systems (IIRC), not rinky-dink Intel-based PCs.


Quote
The reason for this is that the original Fortran got so convoluted and extensive (tens of millions of lines of code) that no one can actually figure out how it works. There's a massive project going on to decode the original Fortran and write a more modern system, but until then, the UK communication network is actually relying heavily on 35-year-old Fortran that nobody understands.

Lol, that's comforting. So not even the software engineers who designed it know how it works?
"The reason for this is that the original Fortran got so convoluted and extensive (10's of millions of lines of code) that no-one can actually figure out how it works, there's a massive project going on to decode the original Fortran and write a more modern system, but until then, the UK communication network is actually relying heavily on 35 year old Fortran that nobody understands." - Flipside

Brain I/O error
Replace and press any key

 

Offline Flipside

  • əp!sd!l£
  • 212
Yup, that's about it. Because the code was written by a very large number of coders over a long time span, nobody is actually sure how large sections of it are designed. There's extremely limited documentation left, and everyone's afraid to tinker, because if they break something, there's no guarantee that they can fix it...

 

Offline redsniper

  • 211
  • Aim for the Top!
That is at once sad and awesome. :yes:
"Think about nice things not unhappy things.
The future makes happy, if you make it yourself.
No war; think about happy things."   -WouterSmitssm

Hard Light Productions:
"...this conversation is pointlessly confrontational."

 

Offline Kosh

  • A year behind what's funny
  • 210
I've got to siggify that; it's just too funny to pass up.
"The reason for this is that the original Fortran got so convoluted and extensive (10's of millions of lines of code) that no-one can actually figure out how it works, there's a massive project going on to decode the original Fortran and write a more modern system, but until then, the UK communication network is actually relying heavily on 35 year old Fortran that nobody understands." - Flipside

Brain I/O error
Replace and press any key

 

Offline Ace

  • Truth of Babel
  • 212
    • http://www.lordofrigel.com
I don't know why anyone thought it used less memory to store the year in a decimal format anyway. I mean, your typical unsigned 16-bit integer is good all the way to the year 65535. So I always said Y2K was a hoax, long before it happened.
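The arithmetic, sketched in C (the sizes are the usual ones; nothing here is taken from any specific old program):

Code:
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    char two_digit[2] = { '9', '9' }; /* two bytes of text: caps out at 99 */
    uint16_t year = 1999;             /* also two bytes: good up to 65535  */

    printf("decimal text: %zu bytes, last year (19)%c%c\n",
           sizeof two_digit, two_digit[0], two_digit[1]);
    printf("binary int:   %zu bytes, year %u, last year %u\n",
           sizeof year, (unsigned)year, (unsigned)UINT16_MAX);
    return 0;
}

Same two bytes either way; the binary form just doesn't hit a wall at 99.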

In the grim future of the 66th century, there are only crashes...
Ace
Self-plagiarism is style.
-Alfred Hitchcock

 

Offline Nuke

  • Ka-Boom!
  • 212
  • Mutants Worship Me
Of course, the bug would come in handy if the machines take over by then. :D
I can no longer sit back and allow communist infiltration, communist indoctrination, communist subversion, and the international communist conspiracy to sap and impurify all of our precious bodily fluids.

Nuke's Scripting SVN

  

Offline Gibbusflame

  • 25
  • Vous ne serez pas silencieux ?
Forget Y2K... you should start worrying about Y2K38.
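For anyone who hasn't met it: Y2K38 is the rollover of the classic signed 32-bit Unix time_t, a count of seconds since 1970-01-01 UTC. A minimal C sketch of the wrap (the dates in the comments are the standard figures):

Code:
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Largest signed 32-bit second count = 2038-01-19 03:14:07 UTC. */
    int32_t t = INT32_MAX;
    printf("last good value: %d\n", t);

    /* One tick later the sign bit flips (wrapping via unsigned here to
       keep the demo clear of C's signed-overflow undefined behaviour)
       and the counter reads as 1901-12-13 20:45:52 UTC. */
    t = (int32_t)((uint32_t)t + 1u);
    printf("one second on:   %d\n", t);
    return 0;
}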



Offtopic: I lol'd at that XKCD. :D