
90 Hour Weeks?

  • 09-04-2014 2:17pm
    #1
    Registered Users, Registered Users 2 Posts: 586 ✭✭✭Aswerty


    So I've been following the 'Software Development - A Dead End Career?' thread and I see people referring to younger devs willing to do 90 hour weeks. In a lot of other forums (albeit with a heavy American bias) there are also lots of references to developers having to put in very long hours (60-80). This seems to be especially true for the younger ones. I'm a developer in my mid-20s and to me the idea of doing that number of hours a week is beyond absurd. In my last company I ended up on a project for 5 weeks where I was doing 60 hours a week. I kicked up a stink at the start of the project since I thought what they expected was unrealistic, and it was definitely one of the reasons why I left that employer. For that project, if I broke my pay down by the hour, I was working for under minimum wage - and yes, the wages were pretty terrible to begin with (it was my first role out of college).
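
    To see how quickly that happens, here's a rough back-of-the-envelope version (a minimal sketch - the salary figure is a hypothetical graduate number rather than my actual wage, and €8.65 was the Irish minimum wage at the time):

        # Effective hourly rate during crunch - back-of-the-envelope.
        # ANNUAL_SALARY is a hypothetical graduate figure, not a real wage;
        # EUR 8.65/hour was the Irish minimum wage in 2014.
        ANNUAL_SALARY = 22_000
        MIN_WAGE = 8.65

        weekly_pay = ANNUAL_SALARY / 52
        for hours in (39, 60, 90):
            rate = weekly_pay / hours
            flag = "  <- below minimum wage" if rate < MIN_WAGE else ""
            print(f"{hours}h week: EUR {rate:.2f}/hour{flag}")

    At 39 hours that hypothetical salary works out around EUR 10.85/hour; at 60 hours it drops to about EUR 7.05, already under the minimum, and at 90 it's EUR 4.70.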

    In general I'll go beyond my contracted hours by working one or two weekday evenings if there is a need. Under exceptional (and unavoidable) circumstances, working a Saturday/Sunday wouldn't be out of the question either, but only if a critical system was down or a critical deadline wouldn't otherwise be met. As far as I'm concerned this is sufficient to cover the clause in my contract that says I can be asked to work beyond my standard hours if the company deems it necessary.

    What sort of hours/overtime do the rest of you deem acceptable? And have you worked in an environment where you had to put in long hours? If so, why did you do it? And would you do it again?


Comments

  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    Aswerty wrote: »
    What sort of hours/overtime do the rest of you deem acceptable?
    I don't really have a hard number for that. Most of mainland Europe seems to have 39 hours as a line in the sand, though that's an average over a period, not a cut-off like airline pilots have.
    And have you worked in an environment where you had to put in long hours?
    Yes, I've done the 90-hour week thing.
    If so, why did you do it?
    Deadlines for a major demo in a startup where I'd been promised equity and was the tech lead. So it was my design and a chunk of the company was mine (or so I thought at the time).
    And would you do it again?
    Immediately after the demo, the promise of equity went the way of all verbal contracts. More fool me, but the lesson was learned. Would I do it again? Would I ****. It's a stupid practice, caused by deadlines that people shouldn't have committed to in the first place, and it causes more problems than it solves.

    Yes, there's the flexibility thing - if something catches fire, putting it out is kind of the job description - but that's why the EU regs on working hours per week are based on an average over a period; there's more than enough wiggle room there.
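
    (For the curious, that averaging rule is easy to sanity-check. A minimal sketch in Python, assuming the Directive's 48-hour weekly average and a 17-week reference period - the exact period varies by member state and sector:

        # EU Working Time Directive-style check: average weekly hours must stay
        # at or under 48, averaged over a reference period (17 weeks assumed).
        def worst_rolling_average(weeks, reference_period=17):
            """Worst average weekly hours over any reference-period window."""
            if len(weeks) < reference_period:
                return sum(weeks) / len(weeks)
            return max(
                sum(weeks[i:i + reference_period]) / reference_period
                for i in range(len(weeks) - reference_period + 1)
            )

        log = [39] * 14 + [90, 90, 90]             # three crunch weeks at the end
        print(worst_rolling_average(log))          # 48.0 - right at the cap
        print(worst_rolling_average(log) <= 48.0)  # True, with zero headroom

    Three 90-hour weeks on top of fourteen 39-hour ones land exactly on the cap - the wiggle room is real, but it runs out faster than you'd think.)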


  • Registered Users, Registered Users 2 Posts: 7,157 ✭✭✭srsly78


    This all falls under "working conditions" - remuneration and hours worked both come into it. If working conditions are bad then people will quit. Most people have no problem working extra hours so long as they are being looked after.

    But as Sparks has said, some employers are awful dicks. I have also had employers renege on their obligations, so I walked.


  • Closed Accounts Posts: 2,930 ✭✭✭COYW


    I have just over 8 years' experience as a .NET developer and I can say that I have never put in an 80 hour week. I would refuse to do so and I wouldn't let any junior developer under me put in an 80 hour week. I have no problem putting in extra hours or working a weekend, but 80 hours is unacceptable.


  • Registered Users, Registered Users 2 Posts: 16,413 ✭✭✭✭Trojan


    I'm self-employed and often work very long weeks by regular folks' standards. (Of course, I own 100% of the business.)

    I was just looking at how many hours I did in what was a fairly extreme week for me recently: 72-74 hours. That was 12 hours each weekday, plus 8 hours on Saturday and 4-6 on Sunday. I definitely could not sustain that effort for more than 2 weeks.

    An 84 hour week is 12 hours per day for 7 days.

    A 91 hour week is 13 hours per day for 7 days.

    It is quite easy to do a 60 hour week (12 hours/day for 5 out of 7 days) and feel like you've worked a 90 hour week, but I think actual 90 hour weeks are quite rare.
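
    Tabulating those is trivial (a throwaway sketch; the Sunday figure takes 5 as the midpoint of my 4-6 hours):

        # Weekly totals for the schedules mentioned above.
        schedules = {
            "extreme week":  [12] * 5 + [8, 5],  # weekdays + Sat + Sun
            "84-hour week":  [12] * 7,           # 12 hours, 7 days
            "91-hour week":  [13] * 7,           # 13 hours, 7 days
            "feels-like-90": [12] * 5,           # 12 hours, 5 of 7 days
        }
        for name, days in schedules.items():
            print(f"{name}: {sum(days)} hours")

    which prints 73, 84, 91 and 60 respectively.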


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    Trojan wrote: »
    I think actual 90 hour weeks are quite rare.
    I'd hope so. The last time I did one (that demo I mentioned), I was doing back-to-back stints of between 20 and 30 hours for about a calendar week. It was stupid, it was unpleasant, and I never want to go through that again.

    Do I sound like I'm a little bitter about the entire experience? Rats, because I was trying to sound as bitter as a bag of old limes sucked through a tennis player's jockstrap.

    I *really* think that practice is amongst the worst problems we have as an industry.


  • Registered Users, Registered Users 2 Posts: 16,413 ✭✭✭✭Trojan


    One of my lecturers gave me this book a long time ago:

    [Book cover: Death March by Edward Yourdon]

    Essential reading for anyone in this kind of situation, and interesting for anyone in the technology business.


  • Closed Accounts Posts: 8,015 ✭✭✭CreepingDeath


    The 60-90 hour weeks are usually a sign of a start-up company or one chasing growth.
    More mature software development companies will have proper estimates and project management.

    There'll still be a few crises where you have to put the extra effort in, but you shouldn't have to do it on a regular basis.

    It's been around 3 years since I had to work a weekend, and again that was to save a project just before go-live.

    A former CEO followed the "Crossing the Chasm" book on how to grow an I.T. business, and there was a clear period in the company's growth where you sacrifice customer satisfaction for growth/market share. So things get very busy and chaotic in that growth period, and it's usually developers fire-fighting the problems then.


  • Registered Users, Registered Users 2 Posts: 7,468 ✭✭✭Evil Phil


    I've worked 60-hour weeks, and on one contract I did actually work 90-hour weeks. We were getting paid overtime, but everyone burnt out within a month and people left, including myself.

    I don't have the citations, but the reason we work a 40-hour week is that research has shown that beyond that, employees are actually less productive. I have limited myself to a 10-hour day maximum (9am - 7pm, for example), but even with that I'm noticeably less capable the following day, so I try to avoid it.

    It seems to be a badge of honour in development, and elsewhere, to be working crazy hours. It's pointless. I guess it happens in the startup environment, and some people thrive on it, but even then they'll burn out. Being self-employed is a different thing, because you've a much greater investment in the work.


  • Registered Users, Registered Users 2 Posts: 586 ✭✭✭Aswerty


    Sparks wrote: »
    I'd hope so. The last time I did one (that demo I mentioned), I was doing back-to-back stints of between 20 and 30 hours for about a calendar week. It was stupid, it was unpleasant, and I never want to go through that again.

    Do I sound like I'm a little bitter about the entire experience? Rats, because I was trying to sound as bitter as a bag of old limes sucked through a tennis player's jockstrap.

    I *really* think that practice is amongst the worst problems we have as an industry.
    I can appreciate the bitterness. When I did the few 60 hour weeks on the trot, as I mentioned in my first post, I was fairly bitter towards my employer. It wasn't so much the extra work that made me bitter as the fact that they were happy to put me in a situation where I pretty much had the choice of agreeing to be put under a huge amount of pressure or putting in my notice there and then. And I can tell you I came close to putting in my notice on the spot. The initial meeting for that project was one of the most uncomfortable experiences of my professional life.

    I think the term "death march" would have described the project pretty well. It wasn't working the long hours that made it so tough, it was the constant pressure, the juggling of important tasks, and the dread of hitting unforeseen problems I had no time to deal with.


  • Registered Users, Registered Users 2 Posts: 8,449 ✭✭✭Call Me Jimmy


    A very interesting discussion, as I have seen the attitude in some people (not directly) where it is a badge of honour to be willing to do, or to have done, these hours.

    I would doubt their prevalence; as Phil said, most employers know that though someone may be there working for 90 hours, you are not getting 90 hours of maximised productivity.

    In cases of 'crunch time', if I had a large stake in the product, I would do it, since the time management/mismanagement would have been, partially at least, my responsibility.

    It is a matter of principle that I wouldn't do it as an employee where the necessity for the crazy hours is based on the mistakes and failures of management/team leaders to correct things earlier.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    The harder we crowd business for time, the more efficient it becomes. The more well-paid leisure workmen get, the greater become their wants. These wants soon become needs. Well-managed business pays high wages and sells at low prices. Its workmen have the leisure to enjoy life and the wherewithal with which to finance that enjoyment.

    The industry of this country could not long exist if factories generally went back to the ten hour day, because the people would not have the time to consume the goods produced.

    Just as the eight hour day opened our way to prosperity, so the five day week will open our way to a still greater prosperity.

    In the old days, before we had management and power, a man had to work through a long day in order to get a bare living. Now the long day would retard both production and consumption.

    It is high time to rid ourselves of the notion that leisure for workmen is either 'lost time' or a class privilege.

    The five day week is not the ultimate, and neither is the eight hour day. It is enough to manage what we are equipped to manage and to let the future take care of itself. It will anyway. That is its habit. But probably the next move will be in the direction of shortening the day rather than the week.

    That long-forgotten pinko commie hippie was the highly unsuccessful businessman Henry Ford, who introduced both the 8-hour day and the 5-day week, speaking in October 1926. You'd imagine we'd have moved forward since then, wouldn't you? (And we may well need to, as Charles Stross was writing about recently).


  • Closed Accounts Posts: 2,930 ✭✭✭COYW


    Call Me Jimmy wrote: »
    A very interesting discussion, as I have seen the attitude in some people (not directly) where it is a badge of honour to be willing to do, or to have done, these hours.

    Some consider it a "rite of passage" for graduates. I really don't understand what it proves, myself. The consultancy firms pride themselves on it. You get these grads who are trying to learn about the industry after their week-long crash course in Java, .NET or whatever, and they are too tired to learn after working these silly shifts.


  • Registered Users, Registered Users 2 Posts: 8,449 ✭✭✭Call Me Jimmy


    It doesn't apply as much to exclusively physical work, but for a software developer, who is ONLY really living in thought processes all day, the work day should not be 8.5/9 hours. It's my opinion that that is wasteful, because the day ends up being as long as any other work day but the brain just can't concentrate on abstractions and concepts for hours on end; there will be a lot of waste in each hour, and the waste-to-productive ratio goes up as each hour passes. I think 5 hours max should be the work day of a developer.


  • Registered Users, Registered Users 2 Posts: 14,148 ✭✭✭✭Lemming


    COYW wrote: »
    I have just over 8 years' experience as a .NET developer and I can say that I have never put in an 80 hour week. I would refuse to do so and I wouldn't let any junior developer under me put in an 80 hour week. I have no problem putting in extra hours or working a weekend, but 80 hours is unacceptable.

    +1 on that.

    I've done stupid-hours weeks - "13/14/15/whatever hours a day" etc. - where I've just stopped counting so don't remember. In most cases it's the sign of something very, very wrong at a fundamental level, something beyond simply throwing dev resources at a problem until you start getting diminishing returns. It also usually results in lots of very stupid & costly mistakes - that could otherwise be easily avoided - because the devs are so overworked/over-pressured/tired and burning out that concentration lapses.


  • Registered Users, Registered Users 2 Posts: 9,159 ✭✭✭Royale with Cheese


    I did 60-odd-hour weeks for a period of about 2 months once, for one of the bigger tech companies in Dublin that are notorious for long hours. At the time you sort of get into a groove and it becomes natural; the thought of doing it again, though, makes me want to poke my eyes out. We did a smaller release after that with more normal hours, but motivation was at a low by then. Also, I definitely felt I had a sense of entitlement to put my feet up a little, which is far from healthy as the job wasn't done yet.

    That said, we got paid an hourly rate for overtime, unlike at a lot of smaller companies, and since I hadn't been working in Ireland for a few months that year I had built up a load of tax credits. I ended up earning a fair chunk of cash from doing it.


  • Registered Users, Registered Users 2 Posts: 14,148 ✭✭✭✭Lemming


    If you're being paid big money because your employers are trying to pull the rabbit out of a burning hat, the rewards can be very good. But it's not sustainable. Like you said, Royale, the thought of doing it over is not thrilling.


  • Closed Accounts Posts: 559 ✭✭✭Joe Doe


    Agreed - 5hrs in the morning with an optional 3hrs in the evening (remote) is the perfect scenario for a dev. Human/mammal sleep cycles historically show that a day's sleep should consist of two separate periods (hence siestas etc.); this only changed once the industrial revolution began in northern Europe, to increase productivity. To make matters worse, caffeine is drip-fed to maintain alertness, yet it reduces alpha brain waves, thus reducing complex creative thought processes.

    Once did a 22hr shift (in the old days of pre-press publishing), but it was only once per month and normal practice, to secure a good pre-flight and handover, so not too bad.

    Another time I witnessed people faint and walk into doors during an all-weekender in London. Mostly we were all ltd-company contractor placements (employers prefer this), so there was no real legal remit over working conditions. I pulled out early after realising the company was pre-crash and they wouldn't sign my own basic but watertight 1-page freelance contract. Needless to say, I was the only person out of 15 that even got paid (80% anyway), thanks to calling in a heavy factoring company and using earlier verbal/email communications. The recruitment agency provided no help whatsoever, their own 20-page contracts were worthless, and they even called me a couple of weeks later to ask for advice on how they could help secure pay for their other unpaid placements!

    Overtime is quite simply a result of poor management: scheduling, forecasting, planning and asset acquisition. Bad practitioners also quickly get found out these days, particularly by the more experienced workers.


  • Registered Users, Registered Users 2 Posts: 11,986 ✭✭✭✭Giblet


    It's a good way to burn out a good developer. I've worked 60 hours, but any more and I'd probably just let the project fail, as it should.


  • Registered Users, Registered Users 2 Posts: 5,246 ✭✭✭conor.hogan.2


    90 hours for any period of time is beyond ridiculous. If you are on a contract and being paid to work 35-37.5 hours on average, then you are essentially working a second job for free, with none of the "benefits" of that.


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    First of all, when you see people post about "90 hour weeks", I wouldn't take it too literally. What's being referred to is excessive overtime as standard, which is all too common in IT.

    That's not to say that the 90 hour week doesn't happen - I've done it and I suspect most who've been in the business for over five years have had that experience at least once in their career. It was naturally a case of an unrealistic deadline, where to keep us at our desks pizzas would be shipped in every day (eating nothing but pizza is not good for you after a while) and where even the PM eventually had a mental break (we found her one morning outside of the building crying and refusing to come in) and so we ended up having to PM ourselves (it was an improvement, TBH).

    I distinctly remember, at the end of a 19 hour day, reaching a point where I could no longer focus and it took me 40 minutes to write a single line of code, thinking that was it for the day. I lived ten minutes away, so I got home, slept fully clothed and ended up back at my desk the next day at nine to start another 16+ hour day.

    However, such extreme cases are the exception rather than the rule, as they are physically unsustainable. What is more common is where you are pressured into working every day until 9pm, work through your lunch and also come in for at least one day on weekends - effectively a 60 to 70 hour week. And the thing is that when this becomes the expected standard, people don't even realize they're doing it, because everyone else is.

    [Dilbert comic: Google 20% time]

    This is far more commonplace, especially in consultancies and particularly in the more 'prestigious' firms, and in the context of the other thread what you find is that older, more experienced developers will be far less likely to work these hours and more likely to leave at 17:30 - 18:00 on the dot, because after a few years they realize that this is ultimately exploitation (often driven by unrealistic targets, cost cutting or incompetent planning) and that were they to do these hours they would, as someone already pointed out, be earning less than the minimum wage per hour.

    One of the reasons that more senior developers can do this is that ultimately they don't need the job - that is, this isn't your first job, and even if it is a prestigious company, it won't be your first, so you can afford to move on. Young developers can't fall back on the other places they've worked on their CV if they choose to move on, especially if they're moving on with less than a year or two of commercial experience.

    My view is that sometimes you do have to pull out all the stops for the greater good. If it's because someone else screwed up, then you can deal with that in the post-mortem. But when it starts becoming the standard, when de facto your working week becomes 50, 60 or 70 hours rather than 40, then it's time to reassess whether you want to stay.

    But if you're a graduate developer and you find yourself in such a role, then you're kind of stuck and that's where it becomes a rite of passage.


  • Registered Users, Registered Users 2 Posts: 11,264 ✭✭✭✭jester77


    I haven't worked more than 45 hours in at least 7-8 years. Long hours are a symptom of bad management, planning and estimating. Sprints are usually 2-3 weeks, and it is not too difficult to estimate 2-3 weeks' worth of development work. There are daily standups to weed out problems quickly. If the team end up doing 50+ hour weeks then something is seriously wrong.
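
    To make that concrete, here's a toy capacity sketch (every figure is an illustrative assumption, not a number from any real team):

        # Toy sprint-capacity estimate: dev-hours available for planned work.
        # Focus factor and ceremony overhead are illustrative assumptions.
        def sprint_capacity(devs, sprint_days=10, hours_per_day=8,
                            focus_factor=0.7, ceremony_hours=6):
            """Dev-hours realistically available in one 2-week sprint."""
            raw = devs * sprint_days * hours_per_day
            return raw * focus_factor - devs * ceremony_hours

        print(sprint_capacity(devs=5))  # 250.0 plannable hours, not 400

    If the committed work only fits by pretending the focus factor is 1.0, the plan is broken before the sprint starts - and the gap gets paid for in overtime.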


  • Registered Users, Registered Users 2 Posts: 8,449 ✭✭✭Call Me Jimmy


    The Corinthian wrote: »
    [...] But if you're a graduate developer and you find yourself in such a role, then you're kind of stuck and that's where it becomes a rite of passage.

    Pretty much agree with everything except the last line. I understand what you're saying, but I would be confident as a graduate developer that I would leave such a consistently exploitative role and look elsewhere even without the experience. I don't like the idea of anyone feeling they are trapped in a role. A good developer will get hired again. A smart company would accept and agree with a developer who left for these reasons imo.

    I know it sounds very idyllic and luxurious, but I think *possibly*, based on all the literature and the fascination in IT with software processes etc., management are now beginning to cop on to the bigger picture.


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Call Me Jimmy wrote: »
    Pretty much agree with everything except the last line. I understand what you're saying, but I would be confident as a graduate developer that I would leave such a consistently exploitative role and look elsewhere even without the experience.
    Fair enough, but not everyone would share your confidence. After all, landing your first job in IT without commercial experience isn't the easiest thing in the world - college courses are oblivious to commercial realities and thus don't teach them, and without that experience graduate developers tend to need an inordinate amount of monitoring for their first six months.
    I don't like the idea of anyone feeling they are trapped in a role. A good developer will get hired again. A smart company would accept and agree with a developer who left for these reasons imo.
    If they're telling the truth (People Lie in Interviews Shocker!). I've been told this story more than once in interviews, only to discover that in reality they jumped, or were pushed, from their first/last job because they were basically crap. Or flakes. Or even had mental health issues (I kid you not).
    I know it sounds very idyllic and luxurious, but I think *possibly*, based on all the literature and fascination in IT with software processes etc. management now are beginning to cop on to the bigger picture.
    By all means stick by your ideals. All I'd say is you are taking a risk, and like all risks it can pay off or blow up in your face.


  • Registered Users, Registered Users 2 Posts: 7,498 ✭✭✭BrokenArrows


    My contract is for 39.5 hours a week. I'll occasionally put in an extra hour if I'm in the zone come 5.30 and I'm enjoying the project, but as a rule I'll leave on time.

    My company has a terrible habit of promising the impossible, so I learned a long time ago to just do my hours.

    I've occasionally been asked to do a Saturday if we are under pressure, but that's always with the understanding that I'll take another day off at some point.

    I think some junior devs can feel pressured into putting in huge hours because they may not be experienced enough to see that the reason they are behind schedule is that the schedule was unrealistic.


  • Moderators, Society & Culture Moderators Posts: 9,735 Mod ✭✭✭✭Manach


    A lot of it is scheduling problems, where the times do look realistic on Gantt charts but life and coding are never that simple. I've on occasion done the 60hr work week, but that got rarer as I learned to give realistic feedback on expectations and had the documentation to back this up.
    However, it is not only small/emerging firms that have this culture. On the last major project I was on, in a large firm, the project switched to wind-down mode: staff were moved onto other projects while the core team were expected to multi-task development efforts at the same level with a minimum of people employed - i.e. wearing different hats (DBA, sysadmin, scripting, customer T1/T2/T3 support etc.) - to get the last of the legacy code into support mode. That dragged on ever so slightly.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    The Corinthian wrote: »
    college courses are oblivious to commercial realities and thus don't teach them
    Yeah, that's not actually true - at least not in good courses. Not because they teach them, but because they're not oblivious to them: they deliberately ignore them, because commercial requirements are bull**** as far as things-to-base-your-syllabus-on go for undergraduate courses (if you're offering an industrial course on tool X, that's another thing entirely).
    You want to run an undergrad course in CS/CENG as a general course, not some industrial prep thing, because by the time your first graduate has emerged from four years of study, commercial requirements can have changed utterly. If my course had used commercial requirements as a basis, I'd have emerged knowing how to make WAP pages and code in Delphi. Instead, I learned how stuff works from the IDE down to the silicon, and I've learnt every new trend since in days or weeks.
    All I'd say is you are taking a risk, and like all risks it can pay off or blow up in your face.
    This is true; but it's equally true that nothing in life is safe and risk-free. If the "safe" option involves 60+ hours a week for entry-level wages, I'd take the risk.


  • Registered Users, Registered Users 2 Posts: 2,021 ✭✭✭ChRoMe


    Sparks wrote: »
    Yeah, that's not actually true - at least not in good courses. Not because they teach them, but because they're not oblivious to them: they deliberately ignore them, because commercial requirements are bull**** as far as things-to-base-your-syllabus-on go for undergraduate courses (if you're offering an industrial course on tool X, that's another thing entirely).
    You want to run an undergrad course in CS/CENG as a general course, not some industrial prep thing, because by the time your first graduate has emerged from four years of study, commercial requirements can have changed utterly. If my course had used commercial requirements as a basis, I'd have emerged knowing how to make WAP pages and code in Delphi. Instead, I learned how stuff works from the IDE down to the silicon, and I've learnt every new trend since in days or weeks.


    This is true; but it's equally true that nothing in life is safe and risk-free. If the "safe" option involves 60+ hours a week for entry-level wages, I'd take the risk.

    Still wouldn't ****ing hurt for a uni to even introduce the concept of source control though.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    ChRoMe wrote: »
    Still wouldn't ****ing hurt for a uni to even introduce the concept of source control though.
    The courses I taught did...

    Granted, the courses I took didn't, but to be fair, at the time RCS was the norm (and I still have projects from back then in RCS archives so it's not like it was a secret tool), CVS was still experimental really (and without the web, it hadn't spread as far as us - Navigator was only released half-way through my first year in college and we didn't think it was going to be able to compete with gopher except on the iMacs and they weren't real computers :D ), and if you needed source control for a 40-line program, then really you'd get better results by spending a little time learning to touch-type (120wpm on my off days thanks to the nuns, bee-atch)...


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Sparks wrote: »
    Yeah, that's not actually true. At least not in good courses.
    I can't speak for every course out there, but I have seen the common end result, which is a developer who can be very talented and knowledgeable on an academic level, but still needs a lot of hand-holding in a commercial environment for the first few months.
    If my course had used commercial requirements as a basis, I'd have emerged knowing how to make WAP pages and code in Delphi.
    Not really what I'm talking about, although about fifteen years ago many courses were still teaching CGI-Perl when the industry had long since moved on to frameworks/languages such as ASP and PHP*. By commercial, I don't even mean source control, but even basic stuff like unit testing, or what a staging environment is.
    This is true; but it's equally true that nothing in life is safe and risk-free. If the "safe" option involves 60+ hours a week for entry-level wages, I'd take the risk.
    Absolutely, all I've said is that not everyone will take the risk and will instead suck it up in their first job.

    That risk also really depends on the market; I entered the market during the Dotcom period, so I could afford to move around; I managed to move twice (one day in the first job, a month in the second) before I was in a role I was comfortable with. But that was a boom period for IT and getting a job was easy.

    Had I entered the market in 2002, there's no way I would have been able to do that and I would probably have had difficulty getting that first job to begin with.

    * Not to say they shouldn't; modern Web languages are so high-level that good old fashioned Perl-CGI is far more instructive. The point is that this was the most modern Web language they were teaching.


  • Registered Users, Registered Users 2 Posts: 2,213 ✭✭✭MajesticDonkey


    The Corinthian wrote: »
    I can't speak for every course out there, but I have seen the common end result, which is a developer who can be very talented and knowledgeable on an academic level, but still needs a lot of hand-holding in a commercial environment for the first few months.
    As a student currently over 3 months (out of 8) into a work placement as a developer/server administrator, I can vouch for this statement. I can't speak for other courses, but my course really didn't give me any insight into working in a commercial environment, as mentioned. The sheer fact that breaking something can affect thousands of users is terrifying! :D


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    The Corinthian wrote: »
    the common end result, which is a developer who can be very talented and knowledgeable on an academic level, but still needs a lot of hand-holding in a commercial environment for the first few months.
    It's tempting to say "yes, and?" :D
    Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.

    Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).

    The point of an undergrad degree is to learn the basics, the fundamentals - we expect new grads to need some hand-holding (a lot actually) in their first role or first few years.
    If you come out of that 4-year degree course knowing APIs, the college has failed you. Come out of it knowing what an API is and what it should and shouldn't do, and that's the desired outcome.

    That said, yes, some things should be taught more widely: source control, staging, alternatives to OOP (sorry folks, OOP is useful, but it turns out the research that said it was the best way was wrong and flawed), and ideas like unit tests and how things like code review are actually better than them in some ways and why we still need them and so on (as opposed to learning how to use xUnit libraries).
    By commercial, I don't even mean source control, but even basic stuff like unit testing, or what a staging environment was.
    Yes, but you know as well as I do that you can't just teach unit testing or staging; you have to teach what unit tests are really used for (hint: not TDD), what they're bad at, what they're good at, and so on.
    (And yes, that really does need to be more widely taught).


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Sparks wrote: »
    It's tempting to say "yes, and?" :D
    Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.
    That's a different debate, TBH. The main point in relation to graduates is that regardless of whether they should be taught in a more vocational manner or not, they're still less attractive to employ than a developer with one or two years' experience, hence more likely to end up in a situation where they have to 'pay their dues' in their first job and work silly hours.
    Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).
    Indeed, and when they start their professional careers as apprentices, devils or interns, they end up 'paying their dues' too - actually, graduate developers are probably lucky in comparison, as they get paid; something that apprentice solicitors often do not and devilling barristers almost never do.


  • Registered Users, Registered Users 2 Posts: 8,219 ✭✭✭Calina


    Sparks wrote: »
    It's tempting to say "yes, and?" :D
    Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.

    It's a wider discussion, but the role of education is blurry at the moment. There is an increasing cohort of people who feel education should be tailored towards employers' needs more than anything. I think it should be tailored towards equipping people to be adaptable, because an employer can go bust or decamp. This is not specific to computer science or information technology; it also afflicts second level.
    Sparks wrote: »
    Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).

    One of the things I think compsci has done is take on board some of the hierarchical labels from other areas - currently scientist/engineer are two - without some of the rigors applied to those labels elsewhere.

    Against that, there seems to be a move to devalue the skills inherent in development, in programming. When you have the PR head for the new computing curriculum in the UK suggesting you can pick up coding in two days... that does give a sense of "it's easy, therefore not worth as much money as [what I do, regardless of what that is]".

    This is a wider debate on whether the industry, and the tasks it fulfills, should rely on people working 80-90 hours sufficiently frequently that it's perceived as a feature of the job in question. My view is that it shouldn't, and that the reason it happens, again, is that the effort is not adequately valued. In certain respects, it almost comes across as a hazing ritual.
    Sparks wrote: »
    The point of an undergrad degree is to learn the basics, the fundamentals - we expect new grads to need some hand-holding (a lot actually) in their first role or first few years.

    In my view, people need hand-holding to some extent in every new job; it's just that the type of hand-holding changes over time. The other issue is that the IT industry is sufficiently vast that no undergrad degree is going to cover all of it in any major depth any more; plus, if we teach people programming languages rather than problem solving, adaptability can be an issue.
    Sparks wrote: »
    That said, yes, some things should be taught more widely: source control, staging, alternatives to OOP (sorry folks, OOP is useful, but it turns out the research that said it was the best way was wrong and flawed), and ideas like unit tests and how things like code review are actually better than them in some ways and why we still need them and so on (as opposed to learning how to use xUnit libraries).

    Related: if you've a citation for the comments on OOP I'd be interested.
    The Corinthian wrote: »
    That's a different debate, TBH. The main point in relation to graduates is that regardless of whether they should be taught in a more vocational manner or not, they're still less attractive to employ than a developer with one or two years' experience, hence more likely to end up in a situation where they have to 'pay their dues' in their first job and work silly hours.

    Yes, but at the other end, a lot of companies don't necessarily want to pay for long-term acquired expertise either. Or, more to the point, they want neither the pain of training someone up to be commercially useful, nor the cost of paying them properly when they are seriously commercially experienced.

    Cake and eat it springs to mind.
    The Corinthian wrote: »
    Indeed, and when they start their professional careers as apprentices, devils or interns, they end up 'paying their dues' too - actually, graduate developers are probably lucky in comparison, as they get paid; something that apprentice solicitors often do not and devilling barristers almost never do.

    Actually, I don't see it that way and I don't agree with it. The point is, if people are working, they should be paid, regardless of whether they are an intern or not. The unpaid apprenticeships, particularly in areas like law and the media, tend to militate against diversity and social mobility.

    Suggesting devs should feel lucky is the wrong approach; the correct approach is that we should be paying the apprentices in the law and media industries. Anything else is a race to the bottom and hardly aspirational. An argument of "that's the way it's always been" or "well it never did me any harm" is not an argument.


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Calina wrote: »
    Suggesting devs should feel lucky is the wrong approach; the correct approach is that we should be paying the apprentices in the law and media industries. Anything else is a race to the bottom and hardly aspirational. An argument of "that's the way it's always been" or "well it never did me any harm" is not an argument.
    I've not really suggested devs should feel lucky about anything, really; maybe luckier than some at worst. But ultimately, in every post I've made, I've repeatedly been more stoic than anything - it's how the system currently is, the market unfortunately does support a 'cake and eat it' approach by firms, and as much as we can talk about how it shouldn't be like that, that doesn't change the fact that it is.

    Don't confuse that with supporting the current system.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    TC wrote:
    That's a different debate, TBH.
    True, we may need to split this out, but it's kind of amorphous, so let's leave it run for a little while till we get an idea of where to cut.
    Calina wrote: »
    There is an increasing cohort of people who feel education should be tailored towards employers' needs more than anything.
    Yes, but those people - at least the vocal ones - are predominantly... well, let's be polite and say they demonstrate a depraved indifference to the good of the students who are paying for the courses. Quite a few of them seem to be employers who are putting their companies' bottom lines over the good of the students, predominantly in the US, where employer-employee relations seem to be trying to model themselves on the mugger-muggee relationship (seriously, at-will contracts? Feck right off, mate). Okay, that's their fiduciary duty in a reductionist view, but:
    (a) a longer-term view would say that they're not doing themselves any favours, because they're driving down their worth in the eyes of the very people they'd most want to hire (just as an example, think of the list of companies you, the reader, have in your mind under "will never ever work for, ever" and ask how many of them this would be true of); and
    (b) it's their opinion, not ground truth, and it's deeply and often unacknowledgedly biased, and that's a good reason to mostly ignore it.
    without some of the rigors applied to those labels elsewhere.
    I think in large part that's because we don't have those rigors in computer science or computer engineering yet, though we're (thankfully) moving that way.

    Seriously - one of the more telling questions I've heard recently was "Do we have any empirical evidence to say version control is a better way to do things?", and the answers were "er, it's obvious, duh!", "here's an explanation of what version control is", and "did we need to prove that?". Disturbingly, it seems that there's no actual data to prove it - and this question dates from December 2012, not 1972. There's a whole other debate in there, but it boils down to this: maths can prove 1+1=2 (for various reasons, yes, that had to be proven), doctors can point to studies showing that not stabbing yourself in the head is a good thing; in short, every field but ours can prove the basic fundamentals it was based on, the ones the rest of us would call "common sense" (though notably those proofs often pointed out things we wouldn't have thought of at all, because of that common-sense blind spot, and their fields improved as a result).

    We can't do that yet... though we're getting better, slowly.
    When you have the PR head for the new computing curriculum in the UK suggesting you can pick up coding in two days... that does give a sense of "it's easy, therefore not worth as much money as [what I do, regardless of what that is]".
    Was that not the same interview where she said she couldn't code herself? :D
    Related: if you've a citation for the comments on OOP I'd be interested.
    This is a good intro to the main paper; and this quote sums up the problem:
    It turned out that code size accounted for all of the significant variation: in other words, the object-oriented metrics they looked at didn't have any actual predictive power once they normalized for the number of lines of code.
    And if you haven't seen Greg Wilson's talk on this entire area yet, go watch it now, and if you have, he talks about this at the 40 minute mark.

    And when you're done, go read http://neverworkintheory.org because it has a lot of odd but proven results in it. For example, the proof that code review (the deeply stodgy, almost-waterfall, IBM method from the 1970s) works better than TDD (at least within set scenarios). If you're interested in this aspect of software engineering, that site's a good week or two's worth of fun reading, and if you're interested in writing better code faster, it's where you find some of the proven ways of doing that.
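
    (If you want to see what that normalisation means mechanically, here's a toy sketch - entirely synthetic data, invented to illustrate the confound the paper describes, not the paper's own analysis:

        # Toy illustration: an "OO metric" looks predictive of defects until
        # you control for lines of code. Synthetic data, illustration only.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        loc = rng.uniform(100, 5000, n)               # module size in LOC
        oo_metric = 0.01 * loc + rng.normal(0, 5, n)  # metric that tracks size
        defects = 0.002 * loc + rng.normal(0, 1, n)   # driven by size alone

        def r_squared(columns, y):
            """R^2 of a least-squares fit with an intercept term."""
            X = np.column_stack([np.ones(len(y))] + columns)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return 1 - (y - X @ beta).var() / y.var()

        print(r_squared([oo_metric], defects))       # high: looks predictive
        print(r_squared([loc], defects))             # higher still
        print(r_squared([loc, oo_metric], defects))  # no gain over LOC alone

    The metric "predicts" defects only because it tracks code size; control for LOC and it adds nothing, which is exactly the result described above.)
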
    Cake and eat it springs to mind.
    Indeed, and while it's not something you can generally say to an employer while an employee, my thought would be that "eat it" is a pretty succinct and expressive response to that kind of proposal, on multiple levels :D


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    The Corinthian wrote: »
    I've repeatedly been more stoic than anything - it's how the system currently is, the market unfortunately does support a 'cake and eat it' approach by firms, and as much as we can talk about how it shouldn't be like that, that doesn't change the fact that it is.
    You and I and everyone who earns their living in this field have to be stoics just to get through the working week, more often than not - I get that. And a decade of experience in another area has taught me (mainly by stomping repeatedly on my face until I got it) that realism and pragmatism are the most effective tools for changing something...

    ...but it's also taught me that you can change the status quo, and it doesn't require superhuman effort or massive armies of people working on it, but it does start with talking about what things should be, while - damn straight - honestly talking about where things are.


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Sparks wrote: »
    True, we may need to split this out, but it's kind of amorphous, so let's leave it run for a little while till we get an idea of where to cut.
    Well, I'd probably be of much the same opinion on the subject as you; such courses should be about the fundamentals, especially as you don't really learn about those in commercial environments, but there's room for improvement nonetheless.

    In fairness, I think third-level courses have improved since the late nineties, as universities began to realize that they had to update them more aggressively than you see in other disciplines.
    Sparks wrote: »
    ...but it's also taught me that you can change the status quo, and it doesn't require superhuman effort or massive armies of people working on it, but it does start with talking about what things should be, while - damn straight - honestly talking about where things are.
    Believe it or not, something like ten years ago I tried getting the ball rolling on a professional association for programmers. Employers from SMEs immediately tried to hijack it, and certain existing professional associations also got stuck in, looking to absorb anything that came of it, as they were afraid they were going to have competition.

    As to programmers themselves? Herding cats would have been easier.

