
The deterioration of IT


Comments

  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997



    Some people are just bad. They're just better at hiding it, and at talking their way into good jobs.



  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997



    Your problem is with MonkeyButter, not me.

    I just gave my experience and, being curious, looked up the prices.

    Perhaps I never noticed them because I had no interest at the time.



  • Registered Users, Registered Users 2 Posts: 6,315 ✭✭✭CalamariFritti


    As I said above, I don't think it's necessarily the IT people (it can be too) but more the way IT people are made to work. The current fashion with IT management is to get stuff out fast and worry about it being right later (or never).



  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    It really stems from management. It used to be that the technical people set the standards and insisted they were followed. As the industry developed, management decided they needed "humans" to manage the technical people. Like in The IT Crowd, where Jen didn't know anything yet managed them. Discussions went from "this is how it is done" to "managers" telling the technical people to do what they wanted, regardless of what is standard or possible.

    The number of managers dealing with technical staff who think those staff are barely human is very high. A lot of that old management are gone, but they trained up the current lot, who should know better.

    This vid is a pretty accurate depiction of a discussion where the people in charge of a project have no understanding of what they are saying.




  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    No, you also said it was not possible that so many people had computers back then, and that I was either making it up or extremely rich. You claimed I had all the toys a few comments ago.



  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997


    Where did I use "not possible" or "extremely rich" exactly?



  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997



    So true.

    On one project I was dragged into, I eventually refused to meet or have any dealings with the vendor PM because they didn't know what they were talking about. They were an idiot like the one in that video. I would only talk to the technical staff on the project. Very common.



  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997



    Often valid issues are ignored with the excuse that it needs to be done quickly. But in reality it's just a means of bypassing doing it properly, or of avoiding having their opinion questioned.

    Then afterwards, when it becomes a nightmare, they avoid any criticism with the excuse that it was the only way at the time and they didn't know about anything else. Then they move on to their next disaster.



  • Registered Users, Registered Users 2 Posts: 2,862 ✭✭✭Glaceon


    Nah, I believe Windows 2000 or XP was the peak of Microsoft UX. Windows 7 was just Malibu Stacy with a new hat. Lipstick on a pig, the pig being Vista.



  • Registered Users, Registered Users 2 Posts: 2,542 ✭✭✭JMcL


    OP, I'd agree to a certain extent, but you might be looking at the past through slightly rose-tinted glasses.

    I'd be of a similar vintage to yourself by the sounds of it and would have had a similar career trajectory, so while I'd agree there's an awful lot of sloppiness out there, you have to frame this against the Microsoft-driven horrorshow that persisted up until they finally started to get their act together in the late noughties: Windows OS and application code full of buffer overruns that made it trivial to inject code and root a system through malicious ActiveX (shudder) controls. And not just Windows, I've been using primarily Linux and other Unix-like OSes since the mid 90s and they've all had their fair share of vulnerabilities over the years. Apple I'd imagine are similar, but I don't track them, and they tend to fix stuff quietly from what I can see.

    Core platform quality is therefore way better than it was 20-25 years ago. Pre mid-90s it didn't really matter all that much, as hardware was rarely connected to the Internet, and when it was, it was generally intermittent (metered dial-up).

    You are correct though in pointing the finger at wide ranging poor standards. If development of a service is contracted out, it'll be entirely cost driven, so testing and QA is the first thing that's going to suffer. How much may depend on how savvy the commissioning entity is WRT validating that it handles various situations and edge cases (such as the 4 digit GPRN mentioned earlier). All this stuff can almost always be put in automated test suites which will prevent breakage down the line, but this can take as much time as writing the actual code itself, so it's not as widespread a practice as it should be.
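
    To make the automated-test point concrete, here's a minimal sketch (purely illustrative: the validate_gprn function and the exact GPRN format rules are assumptions for the example, not anything from a real project):

        import re
        import unittest

        def validate_gprn(gprn: str) -> bool:
            """Hypothetical validator: accept a GPRN of 1 to 7 digits, nothing else."""
            return bool(re.fullmatch(r"\d{1,7}", gprn))

        class GprnEdgeCases(unittest.TestCase):
            def test_accepts_short_gprn(self):
                # The 4-digit case mentioned earlier in the thread
                self.assertTrue(validate_gprn("1234"))

            def test_accepts_full_length_gprn(self):
                self.assertTrue(validate_gprn("1234567"))

            def test_rejects_non_numeric(self):
                self.assertFalse(validate_gprn("12A4567"))

        if __name__ == "__main__":
            unittest.main()

    Cheap to run on every build, and exactly the sort of edge case that gets skipped when testing is the first casualty of the budget.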

    The real worry these days is now what happens on the backend to all that data you give them, but that's another story and hopefully not a matter for the Data Protection Commissioner.



  • Advertisement
  • Registered Users, Registered Users 2 Posts: 4,145 ✭✭✭monkeybutter


    no they are middle class you are mixing everything up


    no one is mixing up timelines as you didn't give any

    in 1994 one person in a class of 30 had a computer and used it on a project



  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    So true. I worked with a BA who could only do one thing, and that was call a meeting. Most of the time we just had to explain everything to her, which she would forget by the next meeting. She wasn't even good with meetings, as she never took notes or assigned actions. Once "something" was delivered she felt her job was done. The assumption was everything had bugs so it didn't matter.

    There are also the hidden issues caused by not using standards when the software has to be added to or changed. Worse again when there are a lot of contractors who will be gone by the time that happens. Agile development has definitely seen this increase dramatically, and made it easier to hide.



  • Registered Users, Registered Users 2 Posts: 4,145 ✭✭✭monkeybutter


    50% of people in working class dooblin had computers in 1986

    hahahhahaha



  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    I think you are obsessing over PCs, not the many other computers about. Given 1994 is your reference for a school class, you weren't old enough to remember the early 80s.



  • Registered Users, Registered Users 2 Posts: 4,145 ✭✭✭monkeybutter


    obsessed with PCs? that's what you had

    there weren't more computers around in 1981 than 1994

    I don't know what else to say



  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    You weren't there and aren't old enough. Maybe you lived in a worse area than me, but you are simply wrong in the statements you have made about my knowledge of those families. Don't believe me that a factory worker with 5 kids had a computer in '86 if you like, but I know it is true. I won't be engaging with you anymore.



  • Registered Users, Registered Users 2 Posts: 4,145 ✭✭✭monkeybutter


    factory workers were the worst hit by the 1980s, lad



  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    Microcomputers weren't/aren't PCs, so you really don't know what you are talking about. There were different types of computer about in the 80s; I never said more than PCs.

    https://en.wikipedia.org/wiki/Microcomputer




  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997


    1980s was a decade of recession, unemployment and emigration.

    Irish Interest Rates

    1980	14.15%
    1981	16.25%
    1982	16.25%
    1983	13%
    1984	11.75%
    1985	13%
    1986	12.5%
    1987	12.5%
    1988	9.25%
    1989	11.4%
    1990	12.37%
    

    But computers certainly started to appear. I wasn't interested enough to notice many, and I can't find any stats on it. I only really got interested in them at the start of the 1990s.



  • Registered Users, Registered Users 2 Posts: 8 ryan0159


    I should be over it by now but I'm baffled almost daily by how bad software is these days. I haven't been in the software development/adjacent areas for that long, less than a decade, but there are some widespread industry practices that I think are contributing to how bad everything seems to be.

    Too much horsepower

    It's been mentioned in the thread already, but hardware has been getting faster and software getting slower. One of the reasons programs written in older languages were performant is that they had to be. If the program you wrote in BASIC in 1987 consumed 100x the resources it needed, you didn't have a program. These days languages like Python can take 100x the time it would take C to do something and nobody bats an eye. The problem isn't necessarily that something like Python isn't performant, that can be fine. Sometimes it's beneficial to sacrifice performance for a more expressive and easy-to-use language. The problem is that most developers don't even know that the language they're using is slow. I've worked at companies where it took 12 minutes for a Javascript app to generate an excel spreadsheet, and it wasn't flagged as a critical issue. Any time it was brought up you'd get "haha, yeah that takes a while". Software performs terribly because it can afford to. This idea has been around for ages. What Andy giveth, Bill taketh away.
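
    As a rough illustration of that kind of gap (a minimal sketch only; the exact ratio varies by machine and workload, and this compares a pure-Python loop against Python's own C-implemented built-in rather than against a separate C program):

        import timeit

        N = 1_000_000

        def sum_python(n):
            # Interpreted loop: every iteration pays interpreter overhead
            total = 0
            for i in range(n):
                total += i
            return total

        def sum_builtin(n):
            # sum() over a range object runs in C inside the interpreter
            return sum(range(n))

        loop_time = timeit.timeit(lambda: sum_python(N), number=10)
        builtin_time = timeit.timeit(lambda: sum_builtin(N), number=10)

        print(f"pure-Python loop: {loop_time:.3f}s")
        print(f"C-backed sum():   {builtin_time:.3f}s")
        print(f"ratio: ~{loop_time / builtin_time:.0f}x")

    The point isn't that the slow version is wrong; it's that nothing forces anyone to notice the difference until it's a 12-minute spreadsheet export.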

    Fast iteration cycles

    Most software/computery companies use scrum, kanban, or a spinoff/flavour of one or the other. These systems have their advantages. They do allow for quick development of new features, frequent updates for users, a good feedback cycle for developers etc.

    In the wrong hands, they also cause corner-cutting and the ignoring of 'smaller' problems. When you've got a BA asking you why it's taking more than 3 days to add a new feature to your application, you start cutting corners. Instead of taking the time to make sure that the email address input box on your web page can accept any valid email address, you cobble together the first solution that works from whatever pre-baked solution you can find on Stack Overflow. You don't have time to make sure newer TLDs are supported, or that it supports more than the 5 most common email services. They asked you to make a box that accepts email addresses, you give them a box that accepts email addresses.
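
    For a concrete (and purely illustrative) picture of that corner-cutting, assume the two regexes below; they're the kind of thing that gets copy-pasted, not anything from a real project:

        import re

        # The "first thing that works from Stack Overflow" approach:
        # hard-codes a short TLD list and a narrow character set.
        NAIVE = re.compile(r"^[A-Za-z0-9._]+@[A-Za-z0-9]+\.(com|net|org|ie)$")

        # A deliberately permissive check: one "@", something either side,
        # a dot in the domain part. Let the mail server be the real judge.
        PERMISSIVE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

        for addr in ["pat@example.com",
                     "pat+tag@example.dev",   # newer TLD, plus-addressing
                     "siobhán@example.ie"]:   # non-ASCII local part
            print(addr, bool(NAIVE.match(addr)), bool(PERMISSIVE.match(addr)))

    The naive version works fine against the handful of addresses someone tried by hand, then quietly rejects real customers for years.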

    This also sets up an adversarial relationship between the developers and QA, if you've even got QA. The developers are under pressure to get things done and each time QA finds a fault, they're making you look bad. It shouldn't be that way, you're both working for the same company. But that leads to my next point.

    Skimping on QA

    I've worked with good QA people and they're a godsend. Developers aren't usually focused on making something a pain-free experience for the end user, they're usually interested in implementing the requested feature as quickly as possible. QA is like the barriers in a bowling alley. They stop you from completely whiffing when you have a bad day and keep you on track if you start to stray off target. From what I've seen in the last few years, QA is often very understaffed or missing completely. I've been at companies where we have 1 QA person for the entire software division and I've been at companies where there's no QA. At some of those companies, when I suggested it might be good to hire a QA person for a team that was doing full-stack web development, the manager told me "Well I was a developer for years and I never had QA". Developing without QA is the software equivalent of driving without a seatbelt. Yeah, you'll get away with it most of the time, until you don't.

    Company growth & diffusion of responsibility

    I think as companies get bigger, they tend to get less effective. Case in point, sometimes the wallpaper on my Windows PC randomly disappears. It happens sometimes when I minimize a window. We're talking about one of the wealthiest companies in the world, with one of the largest software development teams in the world, and sometimes it fails to display a jpeg. And it's not a new issue, it's been happening for years. So why isn't it fixed?

    Well, Microsoft is a big company. There are lots of engineers with lots of things to do because lots of product owners, BAs and third-parties are requesting lots of new stuff all the time. And so nobody cares if there's a minor UI bug. You don't get this sort of behaviour from a company of 3 people, because at some point one of the developers gets annoyed/embarrassed about it and fixes it. When you've got 2500 engineers, and a company structure that prevents 2490 of them from even understanding where the problem comes from, nobody is ever going to fix that.

    I once found a bug in a Microsoft app that caused files to balloon out of control. I eventually found a thread complaining about the issue where a Microsoft engineer said he had a fix for it, but it would likely be 2 years minimum before it could be released.

    This sort of thing isn't limited to huge companies anymore though, it has seeped into web development thanks to the proliferation of package managers. There are bugs that show up in multiple different applications because those applications are built using a javascript-to-desktop-app conversion framework. Those UI bugs aren't fixed because the people who make the apps can't fix them. The bug isn't in their app, it's in a package buried so far down in a node_modules folder that trying to grep for its name will set your cpu on fire.




  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997


    Couldn't agree more.

    I worked as a tester in MS a long time ago. The QA systems and processes they had then, I've never seen since.

    Recently I've had to try to advocate for lots of basic things that were 101 back in the day. Gave up though, no one's listening. Feels like we're back in the stone age with a lot of common development practices, at least where I am.



  • Registered Users, Registered Users 2 Posts: 4,145 ✭✭✭monkeybutter


    i did

    1960s to present, history of PCs, even includes your machine



  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    Try looking at computer sales and the price drops of the machines as time went on. The Commodore 64 halved its price in '83, for example. You admit you weren't paying attention back then. The gaming crash of the 80s saw these computers and game consoles drop dramatically in price; the Atari 2600 went to a quarter of its initial launch price and included games. By '86 they were within reach of many people, so looking at launch prices and interest rates doesn't tell you much.



  • Registered Users, Registered Users 2 Posts: 8,382 ✭✭✭petes


    Working class in the 80s meant it was a struggle to buy a new pair of shoes, never mind a computer. I would say rose-tinted glasses, but it's more delusional to think you were working class, or to put yourself in that category.



  • Registered Users, Registered Users 2 Posts: 4,145 ✭✭✭monkeybutter


    yes try looking at computer sales 1982 to 1990



  • Registered Users, Registered Users 2 Posts: 14,463 ✭✭✭✭Flinty997


    Game consoles and gaming are an entirely different thing.

    In 1982 the Atari 2600 was sold for an average of $125 (equivalent to $350 in 2021).

    We don't have to remember back then if we can google it today. Looking at economic conditions certainly gives a very different context to what you're suggesting.



  • Registered Users, Registered Users 2 Posts: 4,145 ✭✭✭monkeybutter


    imagine trying to do a sole trader's home accounts on a Vic 20

    and no colour TV

    Ray, get off the TV and stop doing the accounts, which I'm going to give to my accountant after anyway so he can throw the disk in the bin, because he doesn't even have a computer himself



  • Registered Users, Registered Users 2 Posts: 8,979 ✭✭✭Ray Palmer


    The problem is many people consider working class people to be those in poverty and on social welfare. People could afford them, and did, as the price dropped, just like games consoles after 5 years don't cost the same as launch prices.


