90 Hour Weeks?
Comments
-
The Corinthian wrote: »the common end result, which is a developer who can be very talented and knowledgeable on an academic level, but still needs a lot of hand-holding in a commercial environment for the first few months.
Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.
Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).
The point of an undergrad degree is to learn the basics, the fundamentals - we expect new grads to need some hand-holding (a lot actually) in their first role or first few years.
If you come out of that four-year degree course knowing APIs, the college has failed you. If you come out of it knowing what an API is and what it should and shouldn't do, that's the desired outcome.
That said, yes, some things should be taught more widely: source control, staging, and alternatives to OOP (sorry folks, OOP is useful, but it turns out the research that said it was the best way was wrong and flawed). Likewise ideas like unit testing, how things like code review are actually better than unit tests in some ways, and why we still need both (as opposed to learning how to use xUnit libraries).
The Corinthian wrote: »By commercial, I don't even mean source control, but even basic stuff like unit testing, or what a staging environment was.
(And yes, that really does need to be more widely taught.)
-
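To illustrate the distinction drawn above, between learning an xUnit library and understanding what a unit test is for, here is a minimal sketch using Python's built-in unittest. The `normalise_email` helper is entirely hypothetical, invented for this example; the point is that each test documents one behaviour the function promises, rather than its implementation.

```python
import unittest

def normalise_email(addr: str) -> str:
    """Hypothetical helper: trim whitespace and lowercase the domain part."""
    local, _, domain = addr.strip().partition("@")
    return f"{local}@{domain.lower()}"

class NormaliseEmailTest(unittest.TestCase):
    # Each test pins down one promised behaviour, not the implementation.
    def test_lowercases_domain(self):
        self.assertEqual(normalise_email("bob@Example.COM"), "bob@example.com")

    def test_preserves_local_part_case(self):
        # The local part of an address is case-sensitive per the RFCs,
        # so normalisation must leave it alone.
        self.assertEqual(normalise_email("Bob@example.com"), "Bob@example.com")

    def test_strips_surrounding_whitespace(self):
        self.assertEqual(normalise_email("  bob@example.com "), "bob@example.com")
```

Run with `python -m unittest <file>`. Knowing the `assertEqual` API is the easy part; deciding which three behaviours were worth pinning down is the part a degree should teach.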
It's tempting to say "yes, and?"
Everyone seems to think a computing undergrad course should produce someone who can then be given a professional role running a project and be able to hit the ground running on day one.
It's a wider discussion, but the role of education is blurry at the moment. There is an increasing cohort of people who feel education should be tailored towards employers' needs more than anything. I think it should be tailored towards equipping people to be adaptable, because an employer can go bust or decamp. It is not specific to computer science or information technology but also afflicts second level.
Why, exactly? We don't do that with solicitors, barristers, doctors, or any other kind of engineer (seriously, no civil engineering grad is ever going to build a bridge solo - apart from the obvious, it'd be illegal because you need to be chartered to sign off on things like that and we don't charter engineers until after they've been working in the field for a few years).
One of the things I think compsci has done is take on board some of the hierarchical labels from other areas - scientist and engineer are two current examples - without some of the rigour applied to those labels elsewhere.
Against that, there seems to be a move to devalue the skills inherent in development, in programming. When you have the PR head for the new computing curriculum in the UK suggesting you can pick up coding in two days... that does give a sense that it's easy, and therefore not worth as much money as [what I do, regardless of what that is].
This is a wider debate on whether the industry, and the tasks it fulfils, should operate on a basis of relying on people working 80-90 hours sufficiently frequently that it's perceived as a feature of the job in question. My view is that it shouldn't, and the reason it happens, again, is that the effort is not adequately valued. In certain respects, it almost comes across as a hazing ritual.
The point of an undergrad degree is to learn the basics, the fundamentals - we expect new grads to need some hand-holding (a lot, actually) in their first role or first few years.
In my view, people need hand-holding to some extent in every new job; it's just that the type of hand-holding changes over time. The other issue is that the IT industry is sufficiently vast that no undergrad degree is going to cover all of it in any major depth any more; plus, if we teach people programming languages rather than problem solving, adaptability can be an issue.
That said, yes, some things should be taught more widely (like source control, like staging, like alternatives to OOP, and like ideas such as unit tests and code review, as opposed to learning how to use xUnit libraries).
Related: if you've a citation for the comments on OOP, I'd be interested.
The Corinthian wrote: »That's a different debate, TBH. The main point in relation to graduates is that regardless of whether they should be taught in a more vocational manner or not, they're still less attractive to employ than a developer with one or two years' experience, hence more likely to end up in a situation where they have to 'pay their dues' in their first job and work silly hours.
Yes, but at the other end, a lot of companies don't necessarily want to pay for long-term acquired expertise either. Or, more to the point, they want neither the pain of training someone up to be commercially useful, nor the cost of paying them properly when they are seriously commercially experienced.
Cake and eat it springs to mind.
The Corinthian wrote: »Indeed, and when they start their professional careers as apprentices, devils or interns, they end up 'paying their dues' too - actually, graduate developers are probably lucky in comparison, as they get paid; something that apprentice solicitors often do not and devilling barristers almost never do.
Actually, I don't see it that way and I don't agree with it. The point is that if people are working, they should be paid, regardless of whether they are an intern or not. Unpaid apprenticeships, particularly in areas like law and the media, militate against diversity and social mobility.
Suggesting devs should feel lucky is the wrong approach; the correct approach is that we should be paying the apprentices in the law and media industries. Anything else is a race to the bottom and hardly aspirational. An argument of "that's the way it's always been" or "well it never did me any harm" is not an argument.
-
Suggesting devs should feel lucky is the wrong approach; the correct approach is that we should be paying the apprentices in the law and media industries. Anything else is a race to the bottom and hardly aspirational. An argument of "that's the way it's always been" or "well it never did me any harm" is not an argument.
Don't confuse that with supporting the current system.
-
TC wrote: »That's a different debate, TBH.
There is an increasing cohort of people who feel education should be tailored towards employers' needs more than anything.
(a) a longer-term view would say that they're not doing themselves any favours because they're driving down their worth in the eyes of the very people they'd most want to hire (just as an example, think of the list of companies you the reader have in your mind under "will never ever work for ever" and ask how many of them this would be true of); and
(b) it's their opinion, not ground truth, and it's deeply and often unacknowledgedly biased, and that's a good reason to mostly ignore it.
without some of the rigors applied to those labels elsewhere.
Seriously - one of the more telling questions I've heard recently was "do we have any empirical evidence to say version control is a better way to do things?", and the answers were "er, it's obvious, duh!", "here's an explanation of what version control is", and "did we need to prove that?". Disturbingly, it seems that there's no actual data to prove it - and this question dates from December 2012, not 1972. There's a whole other debate in there, but it boils down to this: maths can prove 1+1=2 (for various reasons, yes, that had to be proven), and doctors can point to studies showing that not stabbing yourself in the head is a good thing. In short, every field but ours can prove the basic fundamentals it was built on, the things the rest of us would call "common sense" (though notably, those proofs often pointed out things we wouldn't have thought of at all because of that common-sense blind spot, and their fields improved as a result).
We can't do that yet... though we're getting better, slowly.
When you have the pr head for the new computing curriculum in the UK suggesting you can pick up coding in two days...that does give a sense of it's easy, therefore not worth as much money as [what I do regardless of what that is].
Related: if you've a citation for the comments on OOP I'd be interested.
It turned out that code size accounted for all of the significant variation: in other words, the object-oriented metrics they looked at didn't have any actual predictive power once they normalized for the number of lines of code.
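The effect just quoted, an apparently predictive metric whose power vanishes once you control for code size, is easy to demonstrate on synthetic data. This is a toy sketch in plain Python, not the study itself: the data, the 200-module sample, and the coefficients are all invented for illustration; both the "OO metric" and the defect count are generated to scale with lines of code.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic modules: both the metric (say, class count) and the defect
# count are driven by module size (LOC) plus independent noise.
loc     = [random.randint(100, 5000) for _ in range(200)]
metric  = [l * 0.01 + random.gauss(0, 5) for l in loc]
defects = [l * 0.002 + random.gauss(0, 2) for l in loc]

# The raw correlation looks like the metric predicts defects...
raw = pearson(metric, defects)
print("raw correlation:", raw)

# ...but normalising both per line of code makes it largely vanish,
# because size was doing all the work.
m_norm = [m / l for m, l in zip(metric, loc)]
d_norm = [d / l for d, l in zip(defects, loc)]
norm = pearson(m_norm, d_norm)
print("size-normalised correlation:", norm)
```

The raw correlation is strong while the size-normalised one sits near zero, which is the shape of the finding: the confound (size) explains the variation, not the metric.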
And when you're done, go read http://neverworkintheory.org because it has a lot of odd but proven results in it. For example, the proof that code review (the deeply stodgy, almost-waterfall IBM method from the 1970s) works better than TDD (at least within set scenarios). If you're interested in this aspect of software engineering, that site's a good week or two's worth of fun reading, and if you're interested in writing better code faster, it's where you find some of the proven ways of doing that.
Cake and eat it springs to mind
-
-
The Corinthian wrote: »I've repeatedly been more stoic than anything - it's how the system currently is, the market unfortunately does support a 'cake and eat it' approach by firms and as much as we can talk about how it shouldn't be like that, doesn't change the fact that it is.
...but it's also taught me that you can change the status quo, and it doesn't require superhuman effort or massive armies of people working on it, but it does start with talking about what things should be, while - damn straight - honestly talking about where things are.
-
True, we may need to split this out, but it's kind of amorphous, so let's leave it run for a little while till we get an idea of where to cut.
In fairness, I think third-level courses have improved since the late nineties, as universities began to realize that they had to update them more aggressively than you see in other disciplines.
...but it's also taught me that you can change the status quo, and it doesn't require superhuman effort or massive armies of people working on it, but it does start with talking about what things should be, while - damn straight - honestly talking about where things are.
As to programmers themselves? Herding cats would have been easier.