
Coding Horror


Comments

  • Registered Users Posts: 6,190 ✭✭✭The Continental Op


    I've just found an interesting bug in APC's PowerChute Personal Edition software. I thought something was going desperately wrong until I did a reboot and the error went away.

    I should have taken screenshots, but according to PowerChute Personal Edition my input (mains) voltage was varying between 230 volts and roughly 197,000,000 volts.

    Oh well, what do I expect of free software.

    Wake me up when it's all over.



  • Closed Accounts Posts: 22,651 ✭✭✭✭beauf


    That dept info could be interpreted in a number of ways: one is that it chooses a different calculation (unlikely); the other is that it changes some value that the calculation uses.

    Imagine you were in a bad class for JC but did well and got moved up to better classes for LC.


  • Registered Users Posts: 5,552 ✭✭✭Slutmonkey57b


    It doesn't make any sense to normalise results on an aggregate basis. The whole point of individual achievement is that it's particular to the individual.


  • Banned (with Prison Access) Posts: 3,316 ✭✭✭nthclare


    My son's started studying coding, bless him; his dad's a horticulturist.
    My brother is a coding guy, though, and he says it's tough going.


  • Closed Accounts Posts: 22,651 ✭✭✭✭beauf


    It doesn't make any sense to normalise results on an aggregate basis. The whole point of individual achievement is that it's particular to the individual.

    Seems odd, doesn't it.


  • Registered Users Posts: 14,715 ✭✭✭✭Earthhorse


    Especially considering the make-up of a class can change over time. In fact, I would say the end of the Junior Cycle is when most transfers between schools take place. But if there's a correlation, maybe it's right for them to include it in the algorithm? I don't really know enough about the area to say.


  • Registered Users Posts: 4,555 ✭✭✭Treppen


    It doesn't make any sense to normalise results on an aggregate basis. The whole point of individual achievement is that it's particular to the individual.

    Ya, it's all a bit odd. Consider that you could have maybe 140 students in some schools, all doing various combinations of 10+ subjects in the JC, then moving to various combinations of 7 subjects in the LC (with dropouts, moves to other schools, exemptions in Irish...). Often those LC subjects had little connection to the JC, e.g. JC science to LC physics or applied maths.

    And then, on top of that, they say the error only impacted one LC subject!

    Anyway, the code is going to be released, apparently.


  • Registered Users Posts: 3,337 ✭✭✭Wombatman


    Awarding lower grades than should have been given is only half the problem.
    Almost 8,000 Leaving Cert grades issued this year were higher than they should have been due to coding errors in the calculated grades process, according to new figures.

    In addition to Saturday’s announcement that 6,100 students received lower grades than they should have, the Department of Education confirmed on Sunday evening that about 7,943 grades were higher than they were supposed to be.

    While lower grades will be corrected upwards, higher grades awarded in error will not be corrected.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,695 Mod ✭✭✭✭Capt'n Midnight


    https://www.theguardian.com/politics/2020/oct/05/how-excel-may-have-caused-loss-of-16000-covid-tests-in-england
    In this case, the Guardian understands, one lab had sent its daily test report to PHE in the form of a CSV file – the simplest possible database format, just a list of values separated by commas. That report was then loaded into Microsoft Excel, and the new tests at the bottom were added to the main database.


  • Registered Users Posts: 6,236 ✭✭✭Idleater



    I saw a tweet about that yesterday; the "fix" was to start using multiple spreadsheets.


  • Posts: 5,917 ✭✭✭ [Deleted User]


    Wombatman wrote: »
    Awarding lower grades than should have been given is only half the problem.



    While lower grades will be corrected upwards, higher grades awarded in error will not be corrected.

    A bit harder to do, given that those who received them have most likely already started their college courses?


  • Registered Users Posts: 1,029 ✭✭✭John_C


    I found an entertaining bug today. It's a function that takes in a duration in seconds and returns a text string for display.
    For example, getDisplayDuration(123) should return 02m 03s and getDisplayDuration(3723) should return 01h 02m 03s.
    Instead of working out the mathematics, the function converts the duration into a Date object, i.e. what time it was this number of seconds past midnight on 1st January 1970.
    It then reads the hours, minutes and seconds from that object.
    The end result is that (assuming you live in Ireland) the function produces the correct output during winter time but is off by an hour during daylight saving time.

    function getDisplayDuration(duration) {
      if (duration >= 60 * 60) {
        // Treats the duration as a Unix timestamp in milliseconds...
        var d = new Date(duration * 1000);
        // ...then reads *local* time, so the result shifts by an hour
        // whenever daylight saving time is in effect.
        return (
          addZero(d.getHours()) +
          "h " +
          addZero(d.getMinutes()) +
          "m " +
          addZero(d.getSeconds()) +
          "s"
        );
      } else {
        var d = new Date(duration * 1000);
        return addZero(d.getMinutes()) + "m " + addZero(d.getSeconds()) + "s";
      }
      // Pads single-digit values with a leading zero.
      function addZero(i) {
        if (i < 10) {
          i = "0" + i;
        }
        return i;
      }
    }
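
    For comparison, a version that avoids the Date object entirely (just a sketch using plain integer arithmetic, not the original author's fix) has no timezone to trip over:

    // Sketch only: derive hours/minutes/seconds directly from the number of
    // seconds, so local time and daylight saving never enter the picture.
    function getDisplayDurationFixed(duration) {
      var hours = Math.floor(duration / 3600);
      var minutes = Math.floor((duration % 3600) / 60);
      var seconds = duration % 60;

      function addZero(i) {
        return i < 10 ? "0" + i : "" + i;
      }

      var text = addZero(minutes) + "m " + addZero(seconds) + "s";
      return hours > 0 ? addZero(hours) + "h " + text : text;
    }

    // getDisplayDurationFixed(123)  -> "02m 03s"
    // getDisplayDurationFixed(3723) -> "01h 02m 03s"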
    


  • Posts: 0 [Deleted User]


    John_C wrote: »
    I found an entertaining bug today. It's a function that takes in a duration in seconds and returns a text string for display.
    For example, getDisplayDuration(123) should return 02m 03s and getDisplayDuration(3723) should return 01h 02m 03s.
    Instead of working out the mathematics, the function converts the duration into a Date object, i.e. what time it was this number of seconds past midnight on 1st January 1970.
    It then reads the hours, minutes and seconds from that object.
    The end result is that (assuming you live in Ireland) the function produces the correct output during winter time but is off by an hour during daylight saving time.

    Reminds me of this...

    [image: Me04jVB.jpg]


  • Registered Users Posts: 6,236 ✭✭✭Idleater


    John_C wrote: »
    The end result is that (assuming you live in Ireland) the function produces the correct output during winter time but is off by an hour during daylight saving time.

    I've seen a publishing schedule done by time of day, but specified by an arbitrary date. Basically "11:30 daily", but specified as "11:30 22 October 1950 ET".

    I don't recall the specifics, but the rigmarole of calculating "today" based on where the user is (say GMT, or more fun, Asia)... :mad:


  • Registered Users Posts: 3,337 ✭✭✭Wombatman


    From the kind of developers that brought you "We fooked up importing the German test results" and "We made a balls of the Leaving Cert standardisation process"...

    https://arstechnica.com/tech-policy/2020/10/excel-glitch-may-have-caused-uk-to-underreport-covid-19-cases-by-15841/


  • Registered Users Posts: 2,145 ✭✭✭dazberry


    Idleater wrote: »
    I've seen a publishing schedule done by time of day, but specified by an arbitrary date. Basically "11:30 daily", but specified as "11:30 22 October 1950 ET".

    This reminds me of a place I worked where the empty/null/not supplied date for a customer dob was some arbitrary date in 1966. Caused a problem a few times :D


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,695 Mod ✭✭✭✭Capt'n Midnight


    dazberry wrote: »
    This reminds me of a place I worked where the empty/null/not supplied date for a customer dob was some arbitrary date in 1966. Caused a problem a few times :D
    In the US a few lifers were released on the 9th of September back in 1999


  • Registered Users Posts: 5,552 ✭✭✭Slutmonkey57b


    Wombatman wrote: »
    From the kind of developers that brought you "We fooked up importing the German test results" and "We made a balls of the Leaving Cert standardisation process"...

    https://arstechnica.com/tech-policy/2020/10/excel-glitch-may-have-caused-uk-to-underreport-covid-19-cases-by-15841/

    Over a million rows in Excel. I bet they went off and made dinner waiting for the thing to open.


  • Moderators, Politics Moderators Posts: 38,843 Mod ✭✭✭✭Seth Brundle


    Over a million rows in Excel. I bet they went off and made dinner waiting for the thing to open.
    You're assuming that they're using the newer Excel formats available in Excel 2007 onwards.
    It wouldn't surprise me to see them using the *.xls format, which is limited to 65,536 rows.
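
    Not from the article, but a rough sketch of the kind of sanity check that would have caught it: count the rows in the incoming CSV before relying on an .xls import (the file name and function here are illustrative).

    // Rough sketch (Node.js): warn if a CSV has more rows than the old .xls
    // format can hold, since anything past that limit gets cut off when the
    // data is squeezed into .xls.
    const fs = require("fs");

    const XLS_MAX_ROWS = 65536;

    function checkCsvRowCount(path) {
      const rows = fs
        .readFileSync(path, "utf8")
        .split("\n")
        .filter(function (line) { return line.trim() !== ""; }).length;
      if (rows > XLS_MAX_ROWS) {
        console.warn(path + ": " + rows + " rows - too many for the .xls format");
      }
      return rows;
    }

    checkCsvRowCount("daily_test_report.csv"); // hypothetical file name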


  • Registered Users Posts: 14,715 ✭✭✭✭Earthhorse


    That's what the issue was according to the article.


  • Registered Users Posts: 5 sqlmonkey


    ah yes - the old Excel Max Rows gotcha!


  • Moderators, Politics Moderators Posts: 38,843 Mod ✭✭✭✭Seth Brundle


    sqlmonkey wrote: »
    ah yes - the old Excel Max Rows gotcha!
    If you're familiar with Excel macros then it's a basic check. I'm surprised it happened, but then again, if they were using Excel instead of an actual database (even MS Access) then they deserve all the criticism they get!


  • Registered Users Posts: 6,250 ✭✭✭Buford T Justice


    Access is (was) limited to 50MB


  • Moderators, Politics Moderators Posts: 38,843 Mod ✭✭✭✭Seth Brundle


    Access is (was) limited to 50MB
    I'm fairly sure that the max size for an Access 97 database is 1GB. Access 2000 onwards is 2GB.
    I can recall seeing desktop applications using Access macros which were several hundred megs.
    I'm not saying it's perfect, but it's not limited to 50MB.


  • Closed Accounts Posts: 22,651 ✭✭✭✭beauf


    I used to run a job that maxed out an Access database. I think it was 2GB. Every month I had to compact it, which would free up enough space to let me run the job. It took two weeks to run, and if it failed, I barely had enough time to rerun it before month end. They wouldn't let me rewrite it or port it to SQL. Through lateral thinking I eventually got it to run in 24hrs.

    Usually when they get that big you throw the data into SQL and keep the front end in Access via linked tables. Because there's usually a mountain of VBA connected to it that you don't want to touch.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 90,695 Mod ✭✭✭✭Capt'n Midnight


    beauf wrote: »
    Because there's usually a mountain of VBA connected to it that you don't want to touch.
    *shudders*

    Printout worked on a 386 but not on the 486 that replaced it :confused:

    There was a delay loop in the code. I increased the counter and left notes.
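
    The usual shape of that kind of delay, sketched in JavaScript terms only (the original was presumably VBA or similar): the "delay" is really a count of loop iterations, so it shrinks every time the hardware gets faster.

    // Illustrative only: a busy-wait delay calibrated as a bare counter.
    // On a faster CPU the same counter value finishes sooner, so a timing
    // assumption that held on a 386 quietly breaks on a 486.
    function busyWaitDelay(iterations) {
      for (var i = 0; i < iterations; i++) {
        // do nothing; the pause is just however long this loop takes
      }
    }

    busyWaitDelay(500000); // "long enough" on the old machine, not on the new one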


  • Registered Users Posts: 2,145 ✭✭✭dazberry


    *shudders*

    Printout worked on a 386 but not on the 486 that replaced it :confused:

    There was a delay loop in the code. I increased the counter and left notes.

    Reminds me of the Turbo/Borland Pascal CRT unit timing bug.


  • Registered Users Posts: 6,434 ✭✭✭jhegarty


    I'm fairly sure that the max size for an Access 97 database is 1GB. Access 2000 onwards is 2GB.
    I can recall seeing desktop applications using Access macros which were several hundred megs.
    I'm not saying it's perfect, but it's not limited to 50MB.

    I seem to remember the free, local-only version of SQL Server was limited to 50MB. That might be what the poster is thinking of.


  • Posts: 0 [Deleted User]


    beauf wrote: »
    Through lateral thinking I eventually got it to run in 24hrs.

    You can't just leave us hanging. :pac:

    What did you do?


  • Closed Accounts Posts: 22,651 ✭✭✭✭beauf


    You can't just leave us hanging. :pac:

    What did you do?

    The simplistic answer is I split the data into smaller chunks, ran them on multiple PCs concurrently, then re-joined the results at the end. That got it down from about two weeks to about 3-4 days, but it was still a PITA. Eventually I lost the use of the extra machines, so I replaced them all with a single overclocked early i3 at 4GHz, which chewed through it in about 24 hours. Access, being single-threaded, ran much faster on high clock speeds than on our dual-CPU Xeons of the day.
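
    The general shape of that chunk-and-merge approach, sketched here in JavaScript rather than the Access job itself (function and names are illustrative):

    // Sketch of the chunk-and-merge idea: split the input, process the chunks
    // concurrently, then rejoin the results. In the story above the "chunks"
    // ran on separate PCs; here they're just concurrent promises.
    async function runInChunks(records, chunkSize, processChunk) {
      const chunks = [];
      for (let i = 0; i < records.length; i += chunkSize) {
        chunks.push(records.slice(i, i + chunkSize));
      }
      const results = await Promise.all(chunks.map(processChunk));
      return results.flat(); // re-join into a single result set
    }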

    In another place I saw a similar solution. Big bloated Oracle system that ran like a dog. One of the customers got fed up waiting for the developers to optimize it, bought some crazy server with 16+ CPUs (a lot at that time), and it ran perfectly on that.

