
Y10K compatible been asked yet?

  • 11-12-2005 3:18am
    #1
    Registered Users Posts: 4,142 ✭✭✭


    Anyone been asked for Y10K compliance yet?



Comments

  • Registered Users Posts: 7,500 ✭✭✭BrokenArrows


    Ok, when people say plan for the future, I don't think they meant 8000 years into the future.

    The way things will work in 8000 years will be nothing like today.

    Example: in the past 60 years we've gone from needing a giant room full of vacuum tubes to calculate things to the same things being done in the palm of our hands.

    Why did you bring up this question anyway?


  • Closed Accounts Posts: 7,145 ✭✭✭DonkeyStyle \o/


    It'll all have changed over to star-dates by then anyway.


  • Moderators, Business & Finance Moderators, Motoring & Transport Moderators, Society & Culture Moderators Posts: 67,850 Mod ✭✭✭✭L1011


    I'd say the 2038 (int32 overflow in UNIX time) problem is a far, far bigger problem than Y2K ever was, to be honest...
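    The int32 limit mentioned above is easy to check: a signed 32-bit time_t counts seconds from the 1970 epoch and runs out in January 2038. A minimal Python sketch:

    ```python
    import datetime

    # A signed 32-bit time_t counts seconds since the Unix epoch
    # (1970-01-01 00:00:00 UTC) and tops out at 2**31 - 1.
    INT32_MAX = 2**31 - 1

    epoch = datetime.datetime(1970, 1, 1)
    rollover = epoch + datetime.timedelta(seconds=INT32_MAX)
    print(rollover)  # 2038-01-19 03:14:07

    # One second later the counter wraps to -2**31, i.e. back to 1901:
    wrapped = epoch + datetime.timedelta(seconds=-2**31)
    print(wrapped)   # 1901-12-13 20:45:52
    ```

    (That rollover instant is UTC; articles quoting local time often give January 18.)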


  • Closed Accounts Posts: 703 ✭✭✭SolarNexus


    Here's my revolutionary prophecy: no human will be programming in Y10K.

    Why? Well, it's obvious: computers can be programmed to learn (it's been done successfully already), and that means they can be programmed to self-program (again, already done successfully). So you have a thinking computer, and the human element has been taken out of the problem; and as anyone knows, take the human element out of an equation and more than likely you take all the errors and mistakes with it.

    Basically, I'm saying that by Y10K computers will program themselves to such a degree that human error like the idiotic Y2K 'bug' will never happen. Then again, by Y10K computers may not even exist - given that amount of time, I'm sure we would have blown ourselves up, or at least given it a good whack ;)

    PS.
    If people still program by then, I can guarantee you that the bugs will be more widespread and more complicated than the Y2K 'bug'. The Y2K 'bug' was just an industry-wide oversight, a failure to take into account something they should have been well aware of; given the current trend towards distributed computing, I can guarantee that next time round the YnK bug will be far less of an anti-climax than Y2K was (how many of you were disappointed the world didn't end? I know I was... I was kind of thinking Planet of the Apes. Better luck next time, eh?)


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,827 Mod ✭✭✭✭Capt'n Midnight


    MYOB wrote:
    I'd say the 2038 (int32 overflow in UNIX time) problem is a far, far bigger problem than Y2K ever was, to be honest...
    http://msdn.microsoft.com/library/default.asp?url=/archive/en-us/dnarguion/html/msdn_090798a.asp - we can call it the Y2.038K bug. Here's one: http://support.microsoft.com/kb/q270617/ - Jan 18 2038

    But for many user DBs 2030 is even closer http://www.microsoft.com/TechNet/prodtechnol/office/office2000/proddocs/field/02migrat.mspx
    Access 2.0/95 interprets the year as 19XX. Access 97/2000 interprets 2-digit years between 0 and 29 as 20XX, and values between 30 and 99 as 19XX. Because of this, Access queries using date literals with 2-digit years may return different result sets under Access 2.0/95 and Access 97/2000.

    http://en.wikipedia.org/wiki/Virtual_Memory_System
    OpenVMS should have no trouble with time until 31-JUL-31086 02:48:05.47. At this time, all clocks and time-keeping operations in OpenVMS will suddenly stop, as system time values go negative.
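    The Access behaviour quoted above is a "pivot year" windowing rule. A sketch of the same rule in Python (`expand_year` is an illustrative helper, not an Access API):

    ```python
    def expand_year(two_digit, pivot=30):
        """Expand a 2-digit year the way Access 97/2000 does:
        values 0-29 become 20xx, values 30-99 become 19xx."""
        return 2000 + two_digit if two_digit < pivot else 1900 + two_digit

    print(expand_year(29))  # 2029
    print(expand_year(30))  # 1930
    ```

    Access 2.0/95 effectively used pivot=100 (everything became 19xx), which is exactly why the same 2-digit date literal can return different result sets across versions.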


  • Closed Accounts Posts: 7,145 ✭✭✭DonkeyStyle \o/


    Well the easy solution here is to have everyone agree to just call it 1970 again... might be confusing for a while, but if it's an excuse to wear bellbottoms, I'm all for it.


  • Closed Accounts Posts: 703 ✭✭✭SolarNexus


    Well the easy solution here is to have everyone agree to just call it 1970 again... might be confusing for a while, but if it's an excuse to wear bellbottoms, I'm all for it.
    Hell no, I'm not going back in time to before I was born! Unless... I am my own father... hmmm... ew ew arrgh, *shoots head*


  • Closed Accounts Posts: 3,152 ✭✭✭ozt9vdujny3srf


    SolarNexus wrote:
    Here's my revolutionary prophecy: no human will be programming in Y10K.

    Why? Well, it's obvious: computers can be programmed to learn (it's been done successfully already), and that means they can be programmed to self-program (again, already done successfully). So you have a thinking computer, and the human element has been taken out of the problem; and as anyone knows, take the human element out of an equation and more than likely you take all the errors and mistakes with it.

    Basically, I'm saying that by Y10K computers will program themselves to such a degree that human error like the idiotic Y2K 'bug' will never happen.

    I do hope this is a troll.

    Seeing as humans would still have to make the programs that make the programs, there would still be mistakes. Anyway, many of the issues that will occur are based on decisions that were made in order to handle problems like two-digit years; how could a computer have done a better job of that? And the only mistakes computers don't make are things like forgetting semicolons, and humans use computers to catch such errors anyway.

    It's not like every computer in the world will say "parse error, computer over" at midnight on New Year's Eve 2037.


  • Registered Users Posts: 4,142 ✭✭✭TempestSabre


    ...Why did you bring up this question anyway?

    Would you believe a client mentioned in passing that an old project I worked on is being tested for it? It uses Windows and Office, so I expect those aren't even compliant. I'd never even heard of it before, but it seems to be common knowledge here. I knew Software Validation was a little nuts, but Y10K seems a little extreme even for them. Personally I think it's a self-perpetuating industry.


  • Registered Users Posts: 7,468 ✭✭✭Evil Phil


    Ok, when people say plan for the future, I don't think they meant 8000 years into the future.

    The way things will work in 8000 years will be nothing like today.

    Example: in the past 60 years we've gone from needing a giant room full of vacuum tubes to calculate things to the same things being done in the palm of our hands.

    Why did you bring up this question anyway?

    I think the OP is talking about the year 2010, not 10000 but I may be wrong.


  • Registered Users Posts: 21,264 ✭✭✭✭Hobbes


    Looks like 10,000 to me. Is he building a time machine?


  • Moderators, Politics Moderators Posts: 38,975 Mod ✭✭✭✭Seth Brundle


    SolarNexus wrote:
    heres my revolutionary prophecy: no human will be programming in Y10K

    why? well its obvious, computers can be programmed to learn (its been done, successfully already) and as such that means they can be programmed to self-program (yet again, already done successfully) therefore you have a thinking computer, and the human element has been taken out of the problem; and as anyone knows, you take the human element out of an equasion, and more than likely, you take with it all the errors and mistakes.

    Basically, I'm saying in Y10K, computers will program themselvs so such a degree that human error such as the idiotic Y2K 'bug' will never happen. Then again, by Y10K computers may not even exist - given that amount of time, I'm sure we would have blown ourselves up - or at least given a good whack at it ;)
    OMG - will nobody heed the warnings from the Terminator films? :D


  • Registered Users Posts: 2,758 ✭✭✭Peace


    Would you believe a client mentioned in passing that an old project I worked on is being tested for it? It uses Windows and Office, so I expect those aren't even compliant. I'd never even heard of it before, but it seems to be common knowledge here. I knew Software Validation was a little nuts, but Y10K seems a little extreme even for them. Personally I think it's a self-perpetuating industry.

    Sounds like someone trying to find fault just to justify their own salary. Seriously, can they produce a scenario where the software you've written will still be in use in 8000 years?


  • Moderators, Politics Moderators Posts: 38,975 Mod ✭✭✭✭Seth Brundle


    Peace wrote:
    Sounds like someone is trying to find fault just to justify their own salaries. Seriously, can they produce a scenario where the software you have written will be in use in 8000 years?
    Given the amount spent on public IT projects, the governments may want to keep the software that long to get value for money.
    I wonder will the complete penalty points system be working by then?


  • Registered Users Posts: 919 ✭✭✭timeout


    What about Y3K? If people today aren't bothered with the "2", then it will happen in 995 years rather than 8000.


  • Registered Users Posts: 4,142 ✭✭✭TempestSabre


    Evil Phil wrote:
    I think the OP is talking about the year 2010, not 10000 but I may be wrong.

    Searching on the net suggested it's 10,000 to me, as I never saw 2010 mentioned. I'll look into that.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,827 Mod ✭✭✭✭Capt'n Midnight


    timeout wrote:
    What about Y3K? If people today aren't bothered with the "2", then it will happen in 995 years rather than 8000.
    You obviously didn't see that episode of Futurama
    Patchcord Adams "Did you hear why they're using Windows 3000 as a prison guard?"
    Fry: "No, why?"
    Patchcord Adams: "Cause it always locks up. ...


  • Registered Users Posts: 15,443 ✭✭✭✭bonkey


    timeout wrote:
    What about Y3K? If people today aren't bothered with the "2", then it will happen in 995 years rather than 8000.

    And when, exactly, did you see someone "bother" to write the second-most-significant digit in the year but "not bother" to write the most significant?

    People write dates with either two-digit or four-digit years.

    But seriously... who cares? This sounds like someone looking for a problem.

    They must have a solution to sell ;)

    jc


  • Registered Users Posts: 919 ✭✭✭timeout


    You obviously didn't see that episode of Futurama

    No missed that one :D
    bonkey wrote:
    People write dates in two-digit or four-digit
    That's true, so if they only bothered with the "2" then at the end of 2099 it would revert to 2000, so it'll happen in 95 years, not 995 or 8000. But as you said, it's rubbish. If there's still someone using a piece of software 100 or 8000 years down the line, imagine the royalties for maintenance :D


  • Registered Users Posts: 6,762 ✭✭✭WizZard


    Haha, from the advice on that page. Emphasis mine
    To resolve this problem, do not use a system date that is earlier than January 18, 2038.
    So we're currently all wrong now then... ;)


  • Closed Accounts Posts: 210 ✭✭HomunQlus


    I don't think there will be a Y10K bug or even a Y3K bug, or a Y2K1 (2100) bug.

    Why? Simple.

    All 32-bit processors do indeed have a limitation on time, which will cause all 32-bit operated computers to suddenly crash and stop on January 18, 2038.

    However, we're already, if slowly, migrating to 64-bit processors. These no longer have that limitation - or rather, they still have a limit on time measurement, but it's somewhere in the hundreds of billions of years.

    But here's the real thing: ten years ago we were using 486 processors, maybe at 100 MHz. Now, ten years later, we have Pentium 4s with Hyper-Threading and dual cores, clocks ticking at 3.8 GHz. That's quite a leap in 10 years.

    So I think that by the 2030s we'll have more like 128-bit processors, which can handle even more time.

    In the end, I don't think there's anything to worry about.

    Besides, I don't think that a society remotely like ours will exist in the year 10,000.

    Does anybody know the song "In The Year 2525" by Zager & Evans?
    [...]
    In the year 9595
    I'm kinda wonderin if man is gonna be alive.
    He's taken everything this old Earth can give.
    And he ain't put back nothing. Whoa-oh

    Now it's been ten thousand years
    Man has cried a billion tears.
    For what he never knew,
    now man's reign is through.

    But through eternal night.
    The twinkling of starlight.
    So very far away.
    Maybe it's only yesterday.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,827 Mod ✭✭✭✭Capt'n Midnight


    HomunQlus wrote:
    So I think, that maybe in the 2030's, we'll have more like 128-bit processors, which can handle even more time.
    A lot of Y2K was century-related - as in 00, not 2000. Even an 8-bit processor with a signed number can count past 100 - so the problem lay elsewhere: in the SOFTWARE. And if you recompile 8080 code to run on a P4, a byte-sized integer will still be a byte-sized integer!

    A processor in BCD mode might have problems with 99 + 1 if it uses a single byte, but even the 486 provides a 17-digit signed decimal (BCD) integer.
    Go back to the 8088 and the AAA/AAM instructions carry the bit into AH.

    Go back to the 6502 and there is a carry flag, if the software bothers to look at it, to handle numbers > 99 in BCD mode - though there's some undocumented funny stuff there: http://www.6502.org/tutorials/vflag.html - see Appendix B in case you ever decide to program for one.
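    A sketch of the single-byte packed-BCD case described above; `bcd_add_byte` is an illustrative helper, roughly mimicking what a 6502 in decimal mode does with its carry flag:

    ```python
    def bcd_add_byte(a, b):
        """Add two packed-BCD bytes (two decimal digits each) and
        return (result_byte, carry). The carry plays the role of the
        6502's carry flag: 99 + 1 wraps the byte to 00 and sets it."""
        total = (10 * (a >> 4) + (a & 0x0F)) + (10 * (b >> 4) + (b & 0x0F))
        carry = total >= 100
        total %= 100
        return ((total // 10) << 4) | (total % 10), carry

    # 99 + 1: the single byte wraps to 00 and the carry is set -
    # software that ignores the carry sees the century roll over.
    print(bcd_add_byte(0x99, 0x01))  # (0, True)
    ```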


  • Registered Users Posts: 7,468 ✭✭✭Evil Phil


    Dev meeting 10,000 A.D.

    1730_4439_2.jpg


  • Moderators, Society & Culture Moderators Posts: 9,689 Mod ✭✭✭✭stevenmu


    If I haven't evolved into a hyper-intelligent being with no need for computers by the year 10,000, I'll be severely disappointed.


  • Registered Users Posts: 4,142 ✭✭✭TempestSabre


    Love the pic LOL :D

    I don't think the problem is 10,000, well it is, but the real question is what other date/time issues might be a problem.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,827 Mod ✭✭✭✭Capt'n Midnight


    Love the pic LOL :D

    I don't think the problem is 10,000, well it is, but the real question is what other date/time issues might be a problem.
    well there is a leap second at the end of this year

    http://www.npl.co.uk/time/leap_second.html
    2005 December 31 23h 59m 58s

    2005 December 31 23h 59m 59s

    2005 December 31 23h 59m 60s

    2006 January 01 00h 00m 00s

    2006 January 01 00h 00m 01s

    "There are 61 seconds in a minute containing a positive leap second. A leap second occurs at the same instant throughout the world, when the familiar `six pips' radio time signal gains an extra pip before the long pip marking the hour, to become a `seven pip' signal."

    So any equipment connected to internet/radio clocks that can't handle 23h 59m 60s or a seventh 'pip' will be a little upset. Might be a good time to make two withdrawals from an ATM ;)

    18 days to go..
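    The 23h 59m 60s problem shows up even in everyday libraries: in Python, for instance, one parser tolerates the leap second while another rejects it outright.

    ```python
    import datetime
    import time

    # time.strptime tolerates leap seconds (%S accepts 0-61)...
    t = time.strptime("2005-12-31 23:59:60", "%Y-%m-%d %H:%M:%S")
    print(t.tm_sec)  # 60

    # ...but the datetime constructor does not: seconds must be 0-59.
    try:
        datetime.datetime(2005, 12, 31, 23, 59, 60)
    except ValueError as e:
        print("rejected:", e)
    ```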


  • Registered Users Posts: 2,758 ✭✭✭Peace


    I would imagine that time will start moving backwards and the human race will de-evolve until the year -10000, where everything will cease to exist.

    Actually, I think in my office right now there's some sort of weird time pocket where the rest of my colleagues seem to be de-evolving 8000 years early.
    Evil Phil wrote:
    Dev meeting 10,000 A.D.

    1730_4439_2.jpg

    ROFL... fukin class.


  • Registered Users Posts: 15,443 ✭✭✭✭bonkey


    HomunQlus wrote:
    All 32-bit processors do indeed have a limitation on time, which will cause all 32-bit operated computers to suddenly crash and stop on January 18, 2038.

    Where did you make that up from?

    The problem (if problem is the right word) is not the processor. The problem is code using 32-bit numbers, which is entirely independent of the architecture it runs on. You can have 32-bit numbers on an 8-bit architecture as well as on a 128-bit one.
    In the end, I don't think there's anything to worry about.
    Of course there's something to worry about.

    The worry is about software implemented with finite limits, where the lifetime of the software in question is capable of reaching those limits. It's easy to say "oh, we'll have that solved by then", but that's ignoring one key fact: someone has to solve it before it's no longer a problem. The solution may be widespread abandonment of something, or a redesign.

    Who remembers the stories of Windows 95 crashing after the tick-count rollover? Same thing... no-one (or, more correctly, not everyone) bothered to cater for the tick counter reverting to 0 after almost 50 days.

    Counter rollover is nothing new. It's something any industry-level programmer is likely to run across at some point. In my experience, it's something you overlook at most once... unless you're incompetent. Similarly, anyone developing software that is time-dependent down to the second will only forget leap seconds once. Like they did with GPS :)

    jc
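    The tick-count case is a textbook counter rollover: a 32-bit millisecond counter wraps after about 49.7 days, and the standard fix is modular subtraction. A sketch (`elapsed_ms` is an illustrative helper):

    ```python
    # A 32-bit millisecond tick counter wraps after 2**32 ms,
    # which is 2**32 / 1000 / 86400 ~= 49.71 days.

    def elapsed_ms(start, now):
        """Wrap-safe elapsed time between two 32-bit tick readings.
        Modular subtraction gives the right answer even if `now`
        wrapped past zero after `start` was taken."""
        return (now - start) % 2**32

    # Naive subtraction goes negative across the wrap; modular doesn't:
    start, now = 2**32 - 100, 50
    print(now - start)             # -4294967146 (naive, wrong)
    print(elapsed_ms(start, now))  # 150
    ```

    The code that overlooks this is exactly the kind that works fine in testing and only fails after seven weeks of uptime.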


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,827 Mod ✭✭✭✭Capt'n Midnight


    bonkey wrote:
    Who remembers stories Windows 95 crashing after the Tickcount rollover? Same thing...no-one (or, more correctly, not everyone) bothered to cater for the situation that the tick-counter would revert to 0 after almost 50 days.
    IIRC it was about 49.7 days, from the 32-bit millisecond tick counter. Then again, even though this was known about back in '94 with the beta, the patch was only issued a LOT later, for Windows 98. Thing is, very few people ever got Win9x to stay up long enough to see the error. You were supposed to buy NT instead ;) Compare that to VAXen with uptimes of 15 years, and Novell servers left powered on while people moved office.


  • Registered Users Posts: 5,141 ✭✭✭Yakuza


    This problem has already been faced (of sorts) on the New York Stock Exchange a few years ago. They had to upgrade all sorts of systems before the Dow Jones reached 10,000.

    Try googling "D10K Problem". It had the potential to disrupt the markets there, which would not have been pleasant for the rest of the world either.

    http://www.findarticles.com/p/articles/mi_m0NEW/is_1998_May_11/ai_20590098
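    The D10K issue was essentially a fixed-width display field overflowing, not a date bug. An illustrative sketch (`format_index` is a made-up helper, not anything from NYSE's actual systems):

    ```python
    def format_index(value, width=8):
        """Render an index level in a fixed-width ticker field with
        two decimals. A field sized only for 4-digit values breaks
        the day the index crosses 10,000."""
        s = f"{value:.2f}"
        if len(s) > width:
            return "#" * width  # field overflow, spreadsheet-style
        return s.rjust(width)

    print(format_index(9999.99))            # ' 9999.99'
    print(format_index(10000.00, width=7))  # '#######'
    ```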

