
90 Hour Weeks?


Comments

  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    the common end result, which is a developer who can be very talented and knowledgeable on an academic level, but still needs a lot of hand-holding in a commercial environment for the first few months.
    It's tempting to say "yes, and?" :D
    Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.

    Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).

    The point of an undergrad degree is to learn the basics, the fundamentals - we expect new grads to need some hand-holding (a lot actually) in their first role or first few years.
    If you come out of that 4-year degree course knowing APIs, the college has failed you. If you come out of it knowing what an API is and what it should and shouldn't do, that's the desired outcome.
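
    Just to make that distinction concrete, here's a minimal and entirely hypothetical sketch (the RateLimiter class and its names are mine, not from any course or from this thread): the point isn't memorising this particular API, it's recognising what a good one does - expose a small, stable contract and hide the details a caller shouldn't depend on.

        class RateLimiter:
            """Allow at most `limit` calls per `window_seconds`."""

            def __init__(self, limit: int, window_seconds: float):
                self._limit = limit
                self._window = window_seconds
                self._timestamps: list[float] = []  # internal detail, deliberately not exposed

            def allow(self, now: float) -> bool:
                """The entire public contract: may a call proceed at time `now`?"""
                # Drop timestamps that have fallen outside the window.
                self._timestamps = [t for t in self._timestamps if now - t < self._window]
                if len(self._timestamps) < self._limit:
                    self._timestamps.append(now)
                    return True
                return False

    Callers only ever see allow(); how the window is tracked can change completely without breaking them, and that's the "should and shouldn't do" part.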

    That said, yes, some things should be taught more widely: source control, staging, alternatives to OOP (sorry folks, OOP is useful, but it turns out the research that said it was the best way was wrong and flawed), and ideas like unit tests - what code review actually does better than them in some ways, why we still need them anyway, and so on - as opposed to just learning how to use xUnit libraries.
    By commercial, I don't even mean source control, but even basic stuff like unit testing, or what a staging environment was.
    Yes, but you know as well as I do that you can't just teach unit testing or staging; you have to teach what unit tests are really used for (hint: not TDD), what they're bad at, what they're good at, and so on.
    (And yes, that really does need to be more widely taught).
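
    As a purely illustrative sketch of that distinction (hypothetical function and test, not from any curriculum): the xUnit mechanics below are trivial; the lesson worth teaching is what the test is for. It pins down one small, deterministic contract so that a later change which breaks it fails loudly - that's what unit tests are good at. It says nothing about how the code behaves once wired into a larger system - that's what they're bad at, and where review, staging and integration testing come in.

        import unittest

        def parse_price(text: str) -> int:
            """Parse a price string like '€12.50' into cents."""
            cleaned = text.strip().lstrip("€")
            euros, _, cents = cleaned.partition(".")
            return int(euros) * 100 + int(cents or 0)

        class ParsePriceTest(unittest.TestCase):
            # Each test states one piece of the contract in executable form.
            def test_whole_euro(self):
                self.assertEqual(parse_price("€12"), 1200)

            def test_euro_and_cents(self):
                self.assertEqual(parse_price(" €12.50 "), 1250)

        if __name__ == "__main__":
            unittest.main()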


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Sparks wrote: »
    It's tempting to say "yes, and?" :D
    Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.
    That's a different debate, TBH. The main point in relation to graduates is that regardless of whether they should be taught in a more vocational manner or not, they're still less attractive to employ than a developer with one or two years' experience, hence more likely to end up in a situation where they have to 'pay their dues' in their first job and work silly hours.
    Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).
    Indeed, and when they start their professional careers as apprentices, devils or interns, they too end up 'paying their dues' - actually, graduate developers are probably lucky in comparison, as they get paid; something that apprentice solicitors often do not and devilling barristers almost never do.


  • Registered Users Posts: 8,219 ✭✭✭Calina


    Sparks wrote: »
    It's tempting to say "yes, and?" :D
    Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.

    It's a wider discussion, but the role of education is blurry at the moment. There is an increasing cohort of people who feel education should be tailored towards employers' needs more than anything else. I think it should be tailored towards equipping people to be adaptable, because an employer can go bust or decamp. This is not specific to computer science or information technology; it also afflicts second-level education.
    Sparks wrote: »
    Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).

    One of the things I think compsci has done is to take on board some of the hierarchical labels from other areas - scientist and engineer are two current examples - without some of the rigors applied to those labels elsewhere.

    Against that, there seems to be a move to devalue the skills inherent in development, in programming. When you have the PR head for the new computing curriculum in the UK suggesting you can pick up coding in two days... that does give a sense of 'it's easy, therefore not worth as much money as [what I do, regardless of what that is]'.

    This is a wider debate on whether the industry, and the tasks it fulfills, should rely on people working 80-90 hour weeks so frequently that it's perceived as a feature of the job in question. My view is that it shouldn't, and the reason it happens, again, is that the effort is not adequately valued. In certain respects, it almost comes across as a hazing ritual.
    Sparks wrote: »
    The point of an undergrad degree is to learn the basics, the fundamentals - we expect new grads to need some hand-holding (a lot actually) in their first role or first few years.

    In my view, people need hand-holding to some extent in every new job; it's just that the type of hand-holding changes over time. The other issue is that the IT industry is sufficiently vast that no undergrad degree is going to cover all of it in any major depth any more; plus, if we teach people programming languages rather than problem solving, adaptability can be an issue.
    Sparks wrote: »
    That said, yes, some things should be taught more widely: source control, staging, alternatives to OOP (sorry folks, OOP is useful, but it turns out the research that said it was the best way was wrong and flawed), and ideas like unit tests - what code review actually does better than them in some ways, why we still need them anyway, and so on - as opposed to just learning how to use xUnit libraries.

    Related: if you've a citation for the comments on OOP I'd be interested.
    The Corinthian wrote: »
    That's a different debate, TBH. The main point in relation to graduates is that regardless of whether they should be taught in a more vocational manner or not, they're still less attractive to employ than a developer with one or two years' experience, hence more likely to end up in a situation where they have to 'pay their dues' in their first job and work silly hours.

    Yes, but at the other end, a lot of companies don't necessarily want to pay for long-term acquired expertise either. Or, more to the point, they want neither the pain of training someone up to be commercially useful nor the cost of paying them properly when they are seriously commercially experienced.

    'Cake and eat it' springs to mind.
    The Corinthian wrote: »
    Indeed, and when they start their professional careers as apprentices, devils or interns, they too end up 'paying their dues' - actually, graduate developers are probably lucky in comparison, as they get paid; something that apprentice solicitors often do not and devilling barristers almost never do.

    Actually, I don't see it that way and I don't agree with it. The point is, if people are working, they should be paid, regardless of whether they are an intern or not. Unpaid apprenticeships, particularly in areas like law and the media, tend to militate against diversity and social mobility.

    Suggesting devs should feel lucky is the wrong approach; the correct approach is that we should be paying the apprentices in the law and media industries. Anything else is a race to the bottom and hardly aspirational. An argument of "that's the way it's always been" or "well it never did me any harm" is not an argument.


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Calina wrote: »
    Suggesting devs should feel lucky is the wrong approach; the correct approach is that we should be paying the apprentices in the law and media industries. Anything else is a race to the bottom and hardly aspirational. An argument of "that's the way it's always been" or "well it never did me any harm" is not an argument.
    I've not really suggested devs should feel lucky about anything; luckier than some at worst, perhaps. Ultimately, in every post I've made I've been more stoic than anything else: this is how the system currently is, the market unfortunately does support a 'cake and eat it' approach by firms, and as much as we can talk about how it shouldn't be like that, talking doesn't change the fact that it is.

    Don't confuse that with supporting the current system.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    TC wrote:
    That's a different debate, TBH.
    True, we may need to split this out, but it's kind of amorphous, so let's leave it run for a little while till we get an idea of where to cut.
    Calina wrote: »
    There is an increasing cohort of people who feel education should be tailored towards employers' needs more than anything.
    Yes, but those people - at least the vocal ones - are predominantly... well, let's be polite and say they demonstrate a depraved indifference to the good of the students who are paying for the courses. Quite a few of them seem to be employers putting their companies' bottom lines over the good of those students, predominantly in the US, where employer-employee relations seem to be modelling themselves on the mugger-muggee relationship (seriously, at-will contracts? Feck right off, mate). Okay, that's their fiduciary duty in a reductionist view, but:
    (a) a longer-term view would say that they're not doing themselves any favours, because they're driving down their worth in the eyes of the very people they'd most want to hire (just as an example, think of the list of companies you, the reader, have in your mind under "will never, ever work for" and ask how many of them this would be true of); and
    (b) it's their opinion, not ground truth, and it's deeply - and often unacknowledgedly - biased, which is a good reason to mostly ignore it.
    Calina wrote: »
    without some of the rigors applied to those labels elsewhere.
    I think in large part that's because we don't have those rigors in computer science or computer engineering yet, though we're (thankfully) moving that way.

    Seriously - one of the more telling questions I've heard recently was "Do we have any empirical evidence to say version control is a better way to do things?", and the answers were "er, it's obvious, duh!", "here's an explanation of what version control is", and "did we need to prove that?". Disturbingly, it seems there's no actual data to prove it - and this question dates from December 2012, not 1972. There's a whole other debate in there, but it boils down to this: maths can prove that 1+1=2 (and for various reasons, yes, that had to be proven); doctors can point to studies showing that not stabbing yourself in the head is a good thing; in short, every field but ours can prove the basic fundamentals it was built on, the ones the rest of us would call "common sense" (and notably, those proofs often pointed out things nobody would have thought of precisely because of that common-sense blind spot, and the fields improved as a result).

    We can't do that yet... though we're getting better, slowly.
    Calina wrote: »
    When you have the PR head for the new computing curriculum in the UK suggesting you can pick up coding in two days... that does give a sense of 'it's easy, therefore not worth as much money as [what I do, regardless of what that is]'.
    Was that not the same interview where she said she couldn't code herself? :D
    Calina wrote: »
    Related: if you've a citation for the comments on OOP I'd be interested.
    This is a good intro to the main paper; and this quote sums up the problem:
    It turned out that code size accounted for all of the significant variation: in other words, the object-oriented metrics they looked at didn't have any actual predictive power once they normalized for the number of lines of code.
    And if you haven't seen Greg Wilson's talk on this entire area yet, go watch it now; if you have, he talks about this at the 40-minute mark.
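
    Purely to make the statistical point in that quote concrete, here's a toy sketch over synthetic data (nothing to do with the paper's real dataset; the numbers and the crude divide-by-LOC normalisation are my own illustration): when both an "OO metric" and defect counts mostly track module size, the metric looks predictive on its own, but the apparent signal largely vanishes once you control for lines of code.

        import random
        from statistics import correlation  # Python 3.10+

        random.seed(1)
        locs, couplings, defects = [], [], []
        for _ in range(500):
            loc = random.randint(100, 5000)                      # module size in lines of code
            locs.append(loc)
            couplings.append(loc * 0.01 + random.gauss(0, 5))    # "OO metric" driven mostly by size
            defects.append(loc * 0.002 + random.gauss(0, 2))     # defect count also driven by size

        # Raw correlation looks impressive...
        print("coupling vs defects:", round(correlation(couplings, defects), 2))

        # ...but once crudely normalised for size, most of the apparent signal goes away.
        coupling_per_loc = [c / l for c, l in zip(couplings, locs)]
        defects_per_loc = [d / l for d, l in zip(defects, locs)]
        print("per-LOC coupling vs per-LOC defects:", round(correlation(coupling_per_loc, defects_per_loc), 2))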

    And when you're done, go read http://neverworkintheory.org because it has a lot of odd but proven results in it. For example, the proof that code review (the deeply stodgy, almost-waterfall, IBM method from the 1970s) works better than TDD (at least within set scenarios). If you're interested in this aspect of software engineering, that site's a good week or two's worth of fun reading, and if you're interested in writing better code faster, it's where you find some of the proven ways of doing that.
    Calina wrote: »
    'Cake and eat it' springs to mind.
    Indeed, and while it's not something you can generally say to an employer while you're an employee, my thoughts would be that "eat it" is a pretty succinct and expressive response to that kind of proposal, on multiple levels :D


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    The Corinthian wrote: »
    I've been more stoic than anything else: this is how the system currently is, the market unfortunately does support a 'cake and eat it' approach by firms, and as much as we can talk about how it shouldn't be like that, talking doesn't change the fact that it is.
    You and I and everyone else who earns their living in this field have to be stoics just to get through the working week more often than not - I get that. And a decade of experience in another area has taught me (mainly by stomping repeatedly on my face until I got it) that realism and pragmatism are the most effective tools for changing something...

    ...but it's also taught me that you can change the status quo, and it doesn't require superhuman effort or massive armies of people working on it, but it does start with talking about what things should be, while - damn straight - honestly talking about where things are.


  • Closed Accounts Posts: 19,777 ✭✭✭✭The Corinthian


    Sparks wrote: »
    True, we may need to split this out, but it's kind of amorphous, so let's leave it run for a little while till we get an idea of where to cut.
    Well, I'd probably be of much the same opinion on the subject as you: such courses should be about the fundamentals, especially as you don't really learn those in commercial environments, but there's room for improvement nonetheless.

    In fairness, I think third-level courses have improved since the late nineties, as universities began to realize that they had to update them more aggressively than you see in other disciplines.
    Sparks wrote: »
    ...but it's also taught me that you can change the status quo, and it doesn't require superhuman effort or massive armies of people working on it, but it does start with talking about what things should be, while - damn straight - honestly talking about where things are.
    Believe it or not, something like ten years ago I tried getting the ball rolling on a professional association for programmers. Employers from SMEs immediately tried to hijack it, and certain existing professional associations also got stuck in, looking to absorb anything that came of it because they were afraid they were going to have competition.

    As to programmers themselves? Herding cats would have been easier.

