
Test Driven Development

  • 25-09-2012 9:39am
    #1
    Closed Accounts Posts: 6,075 ✭✭✭


    Hi all,

    The project I am working on does not utilise TDD. I want to learn how to do TDD by myself.

    Do any of you know how I could learn this skill? Most of the work I do on my current project is about amending existing methods (we are in the bug-fixing phase). Is there a way I could incorporate TDD myself on my work project, or should I do some TDD at home?

    Basically I want to add TDD to my CV and claim I've used it commercially.

    Is this possible?

    Walrus


Comments

  • Registered Users, Registered Users 2 Posts: 42 discodowney


    TDD is just making automated tests that test any code you make. If you are making an API it's pretty easy to implement. What is the project? What language are you using?


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    TDD is just making automated tests that test any code you make. If you are making an API it's pretty easy to implement. What is the project? What language are you using?

    Java. I think it's more than just writing a test then writing code to make it pass. I've researched it and have done some tutorials via YouTube and I can see the value but the issue is putting it into practice on a real-life project where you are rarely writing from scratch and are working under pressure.

    The project is a website built with Spring and Hibernate, plus web services.


  • Registered Users, Registered Users 2 Posts: 42 discodowney


    I've only used NUnit, but JUnit is the Java equivalent that is used for TDD (I've not used Spring/Hibernate, but I think JUnit is still used there). You write the test as it's meant to pass and then run the code against it. So any changes you make along the way, the test will need to be updated to reflect them.

    I do TDD every day (in C++/C#). The idea is that you have a suite of tests that have been developed from the start. Starting midway through will be a bit of a pain alright, and take time. But the idea is that changes you make to the code base don't have unforeseen consequences that affect other functions. So if you have 50 tests and you make a change to the code that seems to work fine, but it causes one of the 50 to fail, you know there's some problem with the code you just changed. The tests should be run every day/every check-in (depending on how you are working). How large is the code base you are working with?
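
    For what it's worth, a JUnit test is just a class like this (PriceCalculator is an invented name, purely for illustration). Once it's checked in it runs with the rest of the suite on every build, so a later change that breaks this behaviour shows up straight away:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Hypothetical example: tests pinning down the behaviour of an existing class.
    public class PriceCalculatorTest {

        @Test
        public void ordersOverTwoHundredEuroGetTenPercentOff() {
            PriceCalculator calc = new PriceCalculator();
            assertEquals(180.0, calc.discountedPrice(200.0), 0.001);
        }

        @Test
        public void smallOrdersAreNotDiscounted() {
            PriceCalculator calc = new PriceCalculator();
            assertEquals(50.0, calc.discountedPrice(50.0), 0.001);
        }
    }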


  • Closed Accounts Posts: 5,064 ✭✭✭Gurgle


    The premise is fairly straightforward: you should have a complete spec for each and every method/class/function before you begin to write it. You write a test first, then write your method/class/function, then test it.
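
    In JUnit terms, that first step looks something like this (all the names are invented, purely for illustration):

    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class DiscountRuleTest {

        // Written before DiscountRule exists, so it doesn't even compile -
        // that's the failing ("red") step. You then write just enough of
        // DiscountRule to make it pass ("green"), refactor, and repeat.
        @Test
        public void goldCustomersQualifyForTheDiscount() {
            DiscountRule rule = new DiscountRule();
            assertTrue(rule.appliesTo("GOLD"));
        }
    }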

    In most cases, it achieves nothing. The focus is on testing, but what's missing from practically every project definition is a proper detailed spec. If the functionality and API is fully specified, writing the code to implement it is pretty much trivial. Writing tests for your own code is pointless in the real world; any pitfalls you miss in the implementation you'll miss in the testing.

    But as it's a commercial buzz phrase at the moment, it's worth getting on your CV. No harm doing it retroactively; prepare a detailed spec for some existing code, then write tests to confirm that it works as you intended.


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    The idea is that you have a suite of tests that have been developed from the start.

    So for anyone joining a project, someone will have already written this test suite, so the suite is basically used to validate new code? Is this what happens in the real world?

    I need to know how TDD works on real projects. Even if I do tutorials (I understand the idea and its value) I still won't know how a large-scale project implements it and will be found out quickly in an interview.


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    My current project is an Agile project. Using TDD, do I pick up a story, design what changes are needed then start writing tests, followed by code to pass the test? Is this really how it works or is that the 'ideal' but in practice it's much different?


  • Registered Users, Registered Users 2 Posts: 42 discodowney


    And new functions/code will cause more tests to be added to the suite. But any time a check-in occurs the whole suite is run to ensure any new code/bug fixes/whatever haven't affected other functions.

    So if you make a change to a common function (a function used by different classes) you run the whole test suite to ensure that no other function is negatively affected by this change. Like Gurgle says, it isn't a 100% accurate system as it depends on who is writing the tests, but it does find a lot of errors before they get released.
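
    As a rough sketch (all names invented), that's the case where tests from different parts of the codebase both pin down the behaviour of the same shared helper:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // MoneyFormat is a hypothetical helper used by both invoicing and reporting code.
    public class MoneyFormatTest {

        @Test
        public void invoiceTotalsAlwaysShowTwoDecimalPlaces() {
            assertEquals("12.50", MoneyFormat.format(12.5));
        }

        @Test
        public void reportTotalsNeverUseThousandsSeparators() {
            assertEquals("10000.00", MoneyFormat.format(10000.0));
        }
    }

    // Change format() to add a thousands separator for the invoice screen and the
    // second test goes red the next time the suite runs, before the change ships.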


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    Gurgle wrote: »
    proper detailed spec.

    I've never coded from a spec - it's just details written in a bug tracker, e.g. JIRA.


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    It sounds like something that requires project experience to get the hang of and I would be found out if I just coded from online tutorials.


  • Registered Users, Registered Users 2 Posts: 42 discodowney


    My current project is an Agile project. Using TDD, do I pick up a story, design what changes are needed then start writing tests, followed by code to pass the test? Is this really how it works or is that the 'ideal' but in practice it's much different?

    It depends. A lot of companies have specialised Test Programmers whose job it is to write these tests and ensure they all pass. If not, they analyse the failure and talk to whoever implemented the changes.

    In some places the developer does the testing as well. In this case usually a user story will involve tasks to implement the code and write the tests. In practice, I've found most people do the implementation first.


  • Registered Users, Registered Users 2 Posts: 42 discodowney


    I'd be surprised if any job was expecting a student (I'm assuming you are one) to have been involved in TDD for a college project. The time just isn't there for that. It's generally manual testing for them.


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    In practice, I've found most people do the implementation first.

    Is this true? Do managers accept this?

    I'm not a student. I'm working on a real project without TDD. I want to put TDD on my CV but don't want to put it on there if I genuinely won't be able to bluff it.

    I'm also afraid of moving onto an Agile, TDD project that does pair programming. If I'm pairing, will it be obvious to my 'pair' that I don't have commercial experience in TDD?


  • Registered Users, Registered Users 2 Posts: 2,040 ✭✭✭Colonel Panic


    Are there unit tests for the existing functionality? If not, you will need to add them - as in, do you understand the domain, or a subset of it, well enough, and could you do it (probably in your free time) in a realistic timescale?

    Writing failing tests first then adding functionality is all well and good, but without coverage of the rest of the system you are modifying, you have no way of knowing what else you've broken.

    For what it's worth on C++ projects here in work we don't as a rule do unit tests or TDD, but on projects I'm responsible for, I've taken chunks of business logic and other subsystems and applied unit tests to them. Now new features get a test written first. Doing this took a long time for even a modestly sized application.

    I didn't learn how to do this on the job, I applied TDD at the same time as I was learning ASP.Net MVC.


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    Are there unit tests for the existing functionality? If not, you will need to add them - as in, do you understand the domain, or a subset of it, well enough, and could you do it (probably in your free time) in a realistic timescale?

    Writing failing tests first then adding functionality is all well and good, but without coverage of the rest of the system you are modifying, you have no way of knowing what else you've broken.

    For what it's worth on C++ projects here in work we don't as a rule do unit tests or TDD, but on projects I'm responsible for, I've taken chunks of business logic and other subsystems and applied unit tests to them. Now new features get a test written first. Doing this took a long time for even a modestly sized application.

    I didn't learn how to do this on the job, I applied TDD at the same time as I was learning ASP.Net MVC.

    Yes, we do 'test after development'. I write my code, then unit test it to prove it works. This goes against the TDD ethos. The tutorials I have followed use a very specific pattern (write a failing test, write code to make it pass, refactor and repeat). To me it sounds very time-consuming and idealistic. So I'm wondering how it is actually implemented on commercial projects.

    Maybe I'm obsessing too much on TDD. Maybe it's something that is casually done in projects and my current knowledge is enough to claim I've got commercial experience of it.


  • Registered Users, Registered Users 2 Posts: 2,040 ✭✭✭Colonel Panic


    I'm open to being wrong, but I don't think you can do pure TDD for the reasons you mention. It's a time consuming endeavour and it's hard to account for that time with management. I'm always wary of TDD because I think there's a cargo cult aspect to it but like everything, it's got a time and a place.

    One suggestion I would make about the time-consuming aspect is to write code in your tests even if that code doesn't exist yet. It's annoying because your autocomplete tools will start to break down (there's a keystroke in Visual Studio to disable this), but once you've written the non-compiling, and therefore failing, test, you can use refactoring tools to fill in the stubs of the functionality you want to write. I consider this a major advantage of TDD because it helps focus on the public interface you are dealing with first, and get down to implementation after.
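
    The Java/JUnit equivalent looks something like this (the class under test is invented and deliberately doesn't exist yet):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class ReportGeneratorTest {

        // ReportGenerator doesn't exist yet, so this won't compile. Writing the
        // call you *want* to make first pins down the public interface; the IDE's
        // "create class" / "create method" quick-fixes then generate the stubs.
        @Test
        public void anEmptyDayIsSummarisedAsZeroOrders() {
            ReportGenerator generator = new ReportGenerator();
            assertEquals(0, generator.orderCountFor("2012-09-25"));
        }
    }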

    Do you consider yourself good at writing unit tests? Not the test code itself, but what gets tested? If so, you could just focus on that if it came up in an interview and put down writing unit tests on your CV.


  • Registered Users, Registered Users 2 Posts: 9,391 ✭✭✭markpb


    I'm open to being wrong, but I don't think you can do pure TDD for the reasons you mention. It's a time consuming endeavour and it's hard to account for that time with management.

    Unless you're writing very simple software, I don't think that TDD will take any more time to write good code than TAD or any other method. Will it take more time than basic testing or no testing? Absolutely, but skip the testing and you'll pay in the long run.

    If you work for a company where management are getting involved (negatively) in how you do your testing or refusing to schedule enough time to test properly, you need to either argue with them or go work for a different company.

    I've used TDD on a few of our projects here, mostly where I knew that specific methods or components would be tricky and have a lot of different flows (decoding protocols, etc) and it was invaluable. I wrote some very simple tests, then wrote the code. As I discovered new edge cases that weren't handled, I was able to add tests and update the code, safe in the knowledge that all my other edge cases weren't being broken.
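
    Something along these lines (the decoder and its wire format are invented for the example):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertNull;

    public class FrameDecoderTest {

        // The first, simple test was written before the decoder itself.
        @Test
        public void decodesAWellFormedFrame() {
            assertEquals("PING", FrameDecoder.decode("LEN=4;PING"));
        }

        // Added later, when a malformed frame turned up; the test above
        // guarantees the original behaviour survives the fix.
        @Test
        public void returnsNullWhenTheLengthFieldIsWrong() {
            assertNull(FrameDecoder.decode("LEN=99;PING"));
        }
    }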


  • Registered Users, Registered Users 2 Posts: 2,040 ✭✭✭Colonel Panic


    Yeah, perhaps. Maybe my next attempt at TDD will take less time if I can change my thinking and resist the urge to bang out a new feature in a system I know well and then create a unit test. "Simple" is a subjective term though. Simple in terms of the number of features to write tests for, or the functionality you are testing?

    Management usually get involved when it comes to deadlines and features but that also includes the time taken to overhaul development processes, set up new build systems, switch to TDD, switch to git or Mercurial, and for them it's a balance between commercial demands and the technical side of things. It's not just about leaving if that end of things doesn't work out, because you can't go anywhere where it isn't an issue.


  • Registered Users, Registered Users 2 Posts: 9,391 ✭✭✭markpb


    Management usually get involved when it comes to deadlines and features but that also includes the time taken to overhaul development processes, set up new build systems, switch to TDD, switch to git or Mercurial

    There's no need to do any of that. TDD is just about writing tests, writing code and then running the tests. Nothing more. It has no relationship to the build system or the repository system. You can make your build system automatically run the tests on check-in but that's not a requirement, it's a nicety. There's no need to switch from anything to git or Mercurial; they're nothing to do with TDD.


  • Registered Users, Registered Users 2 Posts: 2,040 ✭✭✭Colonel Panic


    I'm talking about management getting involved with development processes in general there.


  • Registered Users, Registered Users 2 Posts: 1,082 ✭✭✭Feathers


    Yes, we do 'test after development'. I write my code, then unit test it to prove it works. This goes against the TDD ethos. The tutorials I have followed use a very specific pattern (write a failing test, write code to make it pass, refactor and repeat). To me it sounds very time-consuming and idealistic. So I'm wondering how it is actually implemented on commercial projects.

    Why do you think it's more time-consuming to write tests first? The only aspect I'd consider time-consuming is when I have to question the spec/ make design decisions, but doing this upfront saves on debugging down the line & bugs are cheaper the earlier you catch them.


  • Moderators, Sports Moderators, Regional Abroad Moderators Posts: 2,666 Mod ✭✭✭✭TrueDub


    Feathers wrote: »
    Why do you think it's more time-consuming to write tests first? The only aspect I'd consider time-consuming is when I have to question the spec/ make design decisions, but doing this upfront saves on debugging down the line & bugs are cheaper the earlier you catch them.

    The bit I've bolded in the quote above is the essence of TDD and ATDD (acceptance TDD). If you find a bug early, it's much, much cheaper to fix. Also, if you examine the requirements ahead of time, you write tighter, cleaner code.

    TDD is a mindset: once you're into it, it's almost impossible to get out, and that's a good thing. A full suite of unit tests provides a really good comfort blanket for future changes.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    Feathers wrote: »
    doing this upfront saves on debugging down the line & bugs are cheaper the earlier you catch them.
    It's a wonderful argument, and one that you can usually get people to listen to for a new project.

    Trying to bolt TDD (or even just plain unit testing) onto a legacy project with that argument leads inevitably to the line from sales of "you're spending N man-years worth €X; what feature does this result in that we can use to generate more revenue? None? Door's over there." Arguing that you can reduce costs in a quarter a year or so from now, when quite a few of those who signed off on the decision to do so could have moved on (funny how these sales types have such a higher churn rate than the people actually building the things they're selling), doesn't seem to work as well as any argument that increases this quarter's numbers, even if it screws over a quarter a year or so from now - after all, that'll be Some Other Guy's Problem...

    Depending on the size of the project, I've come to think that TDD isn't an idiom that can be used in industry without a management team that believe you can make no profit for a year to pay down technical debt (if you see such a management team, let someone know that you've spotted a highly endangered species, would you?).


  • Registered Users, Registered Users 2 Posts: 9,391 ✭✭✭markpb


    Sparks wrote: »
    Trying to bolt TDD (or even just plain unit testing) onto a legacy project with that argument leads inevitably to the line from sales of "you're spending N man-years worth €X; what feature does this result in that we can use to generate more revenue? None? Door's over there."

    I can't argue with that but I think developers need to stand up for themselves. If a dev manager asks for a time estimate for a piece of work, that estimate should include a reasonable amount of testing built in. Unless they're a complete micro-manager (and I know there are some out there), there should be no problem. In some cases, I wouldn't even bother explaining that I'm including unit testing in the estimate.

    Of course if you go overboard and want to add unit testing for every part of an existing project when you're asked to make a small change, you're just bringing trouble on yourself.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    markpb wrote: »
    I can't argue with that but I think developers need to stand up for themselves.
    Whoa, whoa, stall that digger there a moment.
    Stand up for ourselves? What?

    We work for companies that sell products. We give our professional opinion on how to do the job, and if management choose to do something that sends it to the wall despite that? Not. Our. Problem. We quit and go work elsewhere (and in our industry, usually with an accompanying salary hike). That's the gig. Doing anything else is getting emotionally invested in a computer product, which (a) is kind of creepy, that's what your family is for; and (b) is absolutely guaranteed to mess you up in the long run.

    What I'm saying is that with a legacy system, trying to make the argument to management for TDD or even plain old Unit Testing is a long, long, long way from easy or simple because the fact that it's a better way to build something in the long run is irrelevant to Sales or Marketing, who don't care about the internals, but only about what externally visible things they can use to sell product, and who operate on infuriating short timescales.


  • Registered Users, Registered Users 2 Posts: 2,040 ✭✭✭Colonel Panic


    Sparks wrote: »
    We work for companies that sell products. We give our professional opinion on how to do the job, and if management choose to do something that sends it to the wall despite that? Not. Our. Problem.

    Well put. This is something it took me a few years to come to terms with, I have to admit. You do the best you can with what you get. If the cons outweigh the pros, move on, but you certainly can't expect any environment to be some sort of computer science Shangri-La!


  • Registered Users, Registered Users 2 Posts: 1,082 ✭✭✭Feathers


    Sparks wrote: »
    Feathers wrote: »
    doing this upfront saves on debugging down the line & bugs are cheaper the earlier you catch them.
    It's a wonderful argument, and one that you can usually get people to listen to for a new project.

    Trying to bolt TDD (or even just plain unit testing) onto a legacy project with that argument leads inevitably to the line from sales of "you're spending N man-years worth €X; what feature does this result in that we can use to generate more revenue? None? Door's over there." Arguing that you can reduce costs in a quarter a year or so from now, when quite a few of those who signed off on the decision to do so could have moved on (funny how these sales types have such a higher churn rate than the people actually building the things they're selling), doesn't seem to work as well as any argument that increases this quarter's numbers, even if it screws over a quarter a year or so from now - after all, that'll be Some Other Guy's Problem...

    Depending on the size of the project, I've come to think that TDD isn't an idiom that can be used in industry without a management team that believe you can make no profit for a year to pay down technical debt (if you see such a management team, let someone know that you've spotted a highly endangered species, would you?).

    I agree the ideal is having full test coverage, but there's no reason you can't do TDD for the individual method that you need to refactor, provided it's properly OO.

    If you're going into legacy code, you have either a bug or a feature request. If you've a feature request, you're going to have to regression test areas that interact with the code you're touching. TDD is getting some of that effort in up front.

    If you've a bug, just write a test that proves the bug exists, then fix the code to make the test pass - you now have one more regression test against that area for the next change. You'll rarely be working on legacy code to change one single bug :) So it ends up benefiting the rest of your work now, not just in a year's time.
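
    e.g. something like this (the bug and the class are invented for illustration):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class VatCalculatorTest {

        // Written to reproduce a reported bug (say, zero-value orders being
        // charged VAT). It fails against the current code, passes once the fix
        // goes in, and then stays in the suite as a regression test.
        @Test
        public void zeroValueOrdersAttractNoVat() {
            assertEquals(0.0, new VatCalculator().vatOn(0.0), 0.001);
        }
    }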

    I don't think it needs to be all or nothing.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    This is something it took me a few years to come to terms with, I have to admit.
    I've found two things help enormously though. There's doing your own stuff as a hobby (though not everyone can find the time); and there's this:

    [comic strip attachment]

    :D


  • Closed Accounts Posts: 6,075 ✭✭✭IamtheWalrus


    Thanks all for taking part in this debate. I've been too busy to read all the posts, but I will this week.

    Thanks again.


  • Registered Users, Registered Users 2 Posts: 9,391 ✭✭✭markpb


    Sparks wrote: »
    We work for companies that sell products. We give our professional opinion on how to do the job, and if management choose to do something that sends it to the wall despite that? Not. Our. Problem.

    I disagree. Almost every company will want their developers to get the job done as quickly as possible, in the same way that companies want buildings, ships, cars or aircraft built quickly. It's up to the professionals involved to do the best job they can with what's available. If they succumb to the pressure to deliver as quickly as possible and at any cost, they're only doing themselves and their profession a disservice.

    Would you say the same about builders throwing up buildings as quickly as possible because the developer wanted it? What respect is there for builders in Ireland now?


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    markpb wrote: »
    I disagree. Almost every company will want their developers to get the job done as quickly as possible, in the same way that companies want buildings, ships, cars or aircraft built quickly.
    Exactly.
    And since the tests can't be seen by the customer, Sales tends to think that they're a waste of expensive engineering time. And it wouldn't be unusual to hear an argument that boils down to "why are we paying for engineers to write code that the customer never sees and whose only role is to spot the engineers fouling up?". To you or me or the rest of the posters here, that's a dumb-ass argument for them to make; to every salesperson out there it's perfectly logical, so long as your top priority is this quarter's figures.
    If they succumb to the pressure to deliver as quickly as possible and at any cost, they're only doing themselves and their profession a disservice.
    That's asinine. You give your professional opinion as to how best to do the job, yes; but if management (or the client) says "fine, now do it this way instead", then you have precisely two options: accept your salary and do the job they want, or quit on principle.

    The latter doesn't pay many bills. That might not be as important when you don't have a family to support I suppose; but then again, the latter is often red-flagged by some companies when hiring.

    tl;dr : Management pays the piper and gets to call the tune; that's a big part of what being a professional means.
    Would you say the same about builders throwing up buildings as quickly as possible because the developer wanted it? What respect is there for builders in Ireland now?
    Lots. Your average bricklayer isn't being held up as the cause of the economic collapse; the 100 or so developers who gave them their orders, on the other hand, are the ones being hauled into court over debacles like Priory Hall.


  • Registered Users, Registered Users 2 Posts: 9,391 ✭✭✭markpb


    Sparks wrote: »
    That's asinine. You give your professional opinion as to how best to do the job, yes; but if management (or the client) says "fine, now do it this way instead", then you have precisely two options: accept your salary and do the job they want, or quit on principle.

    Nope, you give an opinion about how much the job will cost, not how you'll do it. No-one should come to you asking you how you plan to do it, that's why they hired you - because they assume you know how to do your job. If you find yourself explaining unit testing to a sales person, you're doing it wrong. It's none of their business how software is developed, just that it is. If they come back and say that's too long/expensive, then certainly there's room for compromise but the default position shouldn't be: I'll definitely do the development but I'll only test it properly if they let me.

    No-one asks a builder if he's going to make gas lines safe in a building; they just assume that he'll do it as part of his job. No-one asks a shipbuilder if he'll check that the ship is watertight; they just assume he'll do it. The same should be said about developers. You might assume that it doesn't affect you because you can move on and let someone else deal with the dog's dinner you've left behind, but it does -- it affects all of us if developers get a reputation for throwing any old code into production.


  • Subscribers Posts: 4,076 ✭✭✭IRLConor


    If you're tight on time and under pressure from management to not write tests then you should take a broader interpretation of the middle D in TDD. Your development can be driven by testing while writing little or no tests. The object of the game is to write testable code. For every piece of code you write think "How can I test this thoroughly?"

    Once you start thinking along those lines then your code quality will improve. Testable code isn't necessarily good, but code that's hard to test is definitely bad. Think about the things that improve testability (reduced coupling, less global state, fewer singletons, not violating the Law of Demeter, no classes/methods with multiple responsibilities, etc), aren't they just general good coding practices?
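
    A small sketch of the difference (all the types are invented, and it assumes plain constructor injection rather than any particular framework):

    // Hypothetical collaborator interface.
    interface OrderStore {
        void save(String order);
    }

    // Hypothetical singleton-backed implementation talking to the real database.
    class JdbcOrderStore implements OrderStore {
        private static final JdbcOrderStore INSTANCE = new JdbcOrderStore();
        static JdbcOrderStore getInstance() { return INSTANCE; }
        public void save(String order) { /* writes to the real database */ }
    }

    // Hard to unit test: the collaborator is global state grabbed inside the method.
    class HardToTestOrderService {
        void place(String order) {
            JdbcOrderStore.getInstance().save(order);
        }
    }

    // Easier to unit test: the collaborator is passed in, so a test can hand it a fake.
    class TestableOrderService {
        private final OrderStore store;

        TestableOrderService(OrderStore store) {
            this.store = store;
        }

        void place(String order) {
            store.save(order);
        }
    }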

    If you can spare the time, by all means write the tests for the riskiest parts of your codebase but there's no point getting into a willy waving contest with your manager over methodology. If you can prove that bad practices are costing money you may be able to have a reasoned discussion about it, but "preventative medicine" is not something most companies buy into.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    markpb wrote: »
    Nope, you give an opinion about how much the job will cost, not how you'll do it. No-one should come to you asking you how you plan to do it, that's why they hired you - because they assume you know how to do your job. If you find yourself explaining unit testing to a sales person, you're doing it wrong. It's none of their business how software is developed, just that it is. If they come back and say that's too long/expensive, then certainly there's room for compromise but the default position shouldn't be: I'll definitely do the development but I'll only test it properly if they let me.

    This is an argument that falls down when there's more than one person involved in the development. I'm not on the biggest team in the world, but right now I'm working with about a dozen developers on just one smallish component of the overall product, and even at that point you pretty much have to have more invasive management than you're implying; at least on large projects. So an individual developer can't get told "do task X" and tell the boss it'll take two years - one day for task X and the remainder to shove the entire project into a unit testing framework so he can test his code. That's just not how it works; and on large legacy projects, you can't just add unit testing piecemeal, it has to be an actual directed effort. Depending on the size of the legacy projects, the cost could be enormous - as in, seven figures enormous without breaking a sweat, and possibly eight if things run over; and when the outcome is not a positive revenue gain, but the promise of a lower business cost from engineering in the future.... well, the argument becomes a bit short and one-sided in places where this quarter's figures drive thinking.


  • Moderators, Sports Moderators, Regional Abroad Moderators Posts: 2,666 Mod ✭✭✭✭TrueDub


    I'm with Sparks on this one - twenty years (:eek:) in IT has shown me that in general, as a professional, you have to ultimately do what you're told. You can provide input, or feedback, to a decision, but if the decision does not tally with what you think you have a simple choice.

    As regards TDD, it's definitely easier to start it with a greenfield project. You can do it with an existing project, but it usually requires some re-design, and the knock-on effects can be huge. I'd be lost without unit tests now, and tools like Mockito, and it's plain that the use of these tools & techniques reduces bugs, but for an existing project it can be cheaper to simply fix bugs as they arise than to re-engineer the entire system to be testable.

    There is a middle ground - make things a little cleaner every time you touch the code. Tidy it up, add a test or two, think about testability when making changes. These little things all add up.


  • Subscribers Posts: 4,076 ✭✭✭IRLConor


    TrueDub wrote: »
    There is a middle ground - make things a little cleaner every time you touch the code. Tidy it up, add a test or two, think about testability when making changes. These little things all add up.

    Yup, and take a leaf from the medical profession's book: primum non nocere (first, do no harm).

    There are a few circumstances where I'm a little more firm about writing tests:
    • When fixing a bug. You can't honestly fix a bug without reproducing it. That reproduction is really a new test. You have to write it anyway, so you may as well check it in and let others get the benefit of it.
    • When touching code I don't understand. If I don't understand the inputs and outputs of a piece of code I often write some test code to exercise it as a way of understanding its behaviour at a "black box" level (see the sketch after this list).
    • When touching code that could get me fired if I break it badly enough e.g. billing, new user signup, input validation, etc.
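
    For the second case, the sort of "black box" test I mean looks roughly like this (the legacy method is invented; the expected values just record whatever it currently returns):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class LegacyRoundingTest {

        // I don't know what legacyRound() is *supposed* to do, so these tests
        // pin down what it *currently* does before I go near it.
        @Test
        public void recordsCurrentBehaviourForHalfCentAmounts() {
            assertEquals(10.01, LegacyBilling.legacyRound(10.005), 0.0001);
        }

        @Test
        public void recordsCurrentBehaviourForNegativeAmounts() {
            assertEquals(-10.0, LegacyBilling.legacyRound(-10.004), 0.0001);
        }
    }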


  • Registered Users, Registered Users 2 Posts: 27,370 ✭✭✭✭GreeBo


    Sparks wrote: »
    Lots. Your average bricklayer isn't being held up as the cause of the economic collapse; the 100 or so developers who gave them their orders, on the other hand, are the ones being hauled into court over debacles like Priory Hall.


    There was a similar defence for just following orders back in the mid-40s. History didn't think too much of their excuses either.

    You shouldn't be approaching management giving them an option about your code including tests or not including tests. Tests are part of the thing that you deliver, along with comments/documentation/release notes/configuration etc.

    If you don't have good unit test code coverage and good feature coverage (Cucumber etc.) then your initial costs are lower but maintenance costs and bugs rocket. As soon as someone refactors your code to add some new feature they have no idea if they have broken the existing functionality.

    If you work for a company where you are just told what to do and you (as the expert) don't control how you write your own code then you are pretty much acting like a cheap, off-shore commodity resource imo.

    If this doesn't bother you then stay where you are; if it does, and you have a desire to be the best programmer you can be, then I'd advise moving.

    Personally I use BDD: we write acceptance tests first, then the code, and then the unit tests for those individually testable parts of the code.
    If you are working on legacy code then unit tests are a great help before refactoring, depending on the coverage of the tests you can completely

    I fully agree that it's a mind shift. Using BDD really means that you know the exact requirements before you ever start writing your code, helping greatly with design decisions.
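
    Roughly, the acceptance-test-first part looks like this (assuming a recent cucumber-jvm; the feature and all the domain classes are invented for the example):

    // src/test/resources/discounts.feature, written before any production code:
    //
    //   Scenario: gold customers get a discount
    //     Given a customer on the GOLD plan
    //     When they place an order for 100 euro
    //     Then the order total is 90 euro

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.When;
    import io.cucumber.java.en.Then;
    import static org.junit.Assert.assertEquals;

    public class DiscountSteps {

        private Customer customer;   // invented domain classes
        private Order order;

        @Given("a customer on the GOLD plan")
        public void aGoldCustomer() {
            customer = new Customer("GOLD");
        }

        @When("they place an order for 100 euro")
        public void theyPlaceAnOrder() {
            order = new Order(customer, 100.0);
        }

        @Then("the order total is 90 euro")
        public void theTotalIsDiscounted() {
            assertEquals(90.0, order.total(), 0.001);
        }
    }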


  • Registered Users, Registered Users 2 Posts: 9,391 ✭✭✭markpb


    TrueDub wrote: »
    There is a middle ground - make things a little cleaner every time you touch the code. Tidy it up, add a test or two, think about testability when making changes. These little things all add up.

    +1

    This is exactly what developers should be doing. No-one expects anyone to write a whole suite of unit tests for legacy code but at the very least, the code you're affecting should have unit tests added before you start.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    There was a similar defence for just following orders back in the mid-40s.
    I'm pretty sure I've not been ordered to carry out a genocide.
    I mean, I missed a scrum yesterday thanks to a root canal, but I'm pretty sure someone would have mentioned it if killing an entire ethnic group had been added to my backlog...
    If you work for a company where you are just told what to do and you (as the expert) don't control how you write your own code then you are pretty much acting like a cheap, off-shore commodity resource imo.
    If this doesn't bother you then stay where you are; if it does, and you have a desire to be the best programmer you can be, then I'd advise moving.
    Do you have an argument that isn't based on the ego? Because "the desire to be the best programmer I can be" isn't a good business case for a chunk of work that could tie up an entire dev team for a year or two, cost up to seven or eight figures for a large enough legacy project, and not produce a single salable externally visible feature.
    If you are working on legacy code then unit tests are a great help before refactoring, depending on the coverage of the tests you can completely
    Do you think this thread has descended into an argument against unit tests?
    It has not.
    We're just pointing out that while unit testing makes life a lot easier for some tasks - and with legacy code, damn near everything gets easier with unit tests - it may not be possible to use them or to get management to sign off on adopting them; and when that happens you just have to grit your teeth and get on with the job, instead of sitting about wishing you had them.


  • Registered Users, Registered Users 2 Posts: 27,370 ✭✭✭✭GreeBo


    Sparks wrote: »
    I'm pretty sure I've not been ordered to carry out a genocide.
    I mean, I missed a scrum yesterday thanks to a root canal, but I'm pretty sure someone would have mentioned it if killing an entire ethnic group had been added to my backlog...
    No, but that's just a scale issue. You are basically saying that you will accept being told to do what you know is wrong because someone is ordering you to do it.
    Sparks wrote: »
    Do you have an argument that isn't based on the ego? Because "the desire to be the best programmer I can be" isn't a good business case for a chunk of work that could tie up an entire dev team for a year or two, cost up to seven or eight figures for a large enough legacy project, and not produce a single salable externally visible feature.
    I'm pretty sure I had a number of points in my post that could be construed as an argument...<checks>...yep.
    "Ego" is not part of my argument. If you equate ego with a desire to do your job properly then that's a different issue (which I'm not qualified to help you with). No one is saying put an entire scrum team on writing unit tests. But throwing your hands up and whining "they won't let me" is a cop-out imo.
    (If you are not able to create a use case for writing tests then that is something that I am qualified to help you with)

    I'm guessing by your argument that you and your team don't have any NFRs, since these are not sellable features either, right? I mean no customer is going to know if you use pooled DB connections or if you retry service calls, encrypt sensitive data that's stored in a DB, check for null, catch exceptions, comment your code, etc, etc. But for some reason you (I hope) wouldn't consider allowing management to tell you to write code that didn't include the above basics, yet you have no problem making the maintainability of your code optional. Why you draw this line here makes no sense to me.
    Sparks wrote: »
    Do you think this thread has descended into an argument against unit tests?
    It has not.
    We're just pointing out that while unit testing makes life a lot easier for some tasks - and with legacy code, damn near everything gets easier with unit tests - it may not be possible to use them or to get management to sign off on adopting them; and when that happens you just have to grit your teeth and get on with the job, instead of sitting about wishing you had them.

    And I disagree. It's always possible to use tests of some description. It may not be worth the effort in all parts of the code, but as said above, you should strive to make any code you touch better, even if the bit you have been (t)asked to do is for something else. That's part of your job.

    For me delivering code without inbuilt tests is the equivalent of delivering code with no documentation or no exception handling. Tests are an integral part of the development of software. I think the idea of testing as some optional overhead that you can choose to leave out is frankly archaic and is what led to us having a proliferation of crappy untested legacy code that people are afraid to refactor or even touch, because they don't understand it and are afraid of breaking some unknown element of it.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    No, but thats just a scale issue. You are basically saying that you will accept being told to do what you know is wrong because someone is ordering you to do it.
    I think the day that you can compare methodologies in software development the same way you compare committing genocide with not committing genocide is quite probably the day we should wrap up the entire field and bury it somewhere and just go back to counting on our fingers...
    I'm pretty sure I had a number of points in my post that could be construed as an argument.
    Indeed, but not good ones.
    For example, "do your job properly" sounds easy enough, but then you ran headfirst into the problem of what "properly" means (or, in your original post, what "best" means). You don't have a metric here. Is "best" the same as "shortest development time" or "has the lowest number of bugs" or is it "lowest development cost" or "lowest maintenance costs" (which will be different) or "meets clients needs most accurately" or "meets clients' stated specifications most fully" (again, not the same thing) or "didn't have to be fired by HR for being unworkable-with", or "could program in a bunch of cool languages we don't actually use"?

    Besides which, "do it this way or you're just not being a good programmer" - that's a fairly extreme statement. Even "goto considered harmful" has caveats and exceptions.
    Why you draw this line here makes no sense to me.
    (a) The phrase "non disclosure agreement" might be something to keep in mind here; and
    (b) When you work on legacy projects, you're almost never the designer. And you can almost never make major changes in an unplanned manner - and when I say unplanned, I'm talking about plans for migration that tend to be measured in years if not decades.

    There are reasons for the line and its location; they seem obvious to me, they may not to you, and that may be down to a different set of experiences.

    I don't particularly subscribe to the notion that whether or not those reasons seem obvious to someone is a good indicator of professional worth.
    It's always possible to use tests of some description.
    Indeed; however, those tests are not always unit tests, and those tests may or may not be of use for TDD.
    For me delivering code without inbuilt tests is the equivalent of delivering code with no documentation or no exception handling.
    Perhaps so; I know that for me, putting unit tests and unit test frameworks into some of the code I deliver would be a firing offence for good reason (for example, legal's concerns over library licencing issues; test's concerns over introducing regressions; and so on).
    Tests are an integral part of the development of software. I think the idea of testing as some optional overhead that you can choose to leave out is frankly archaic and is what led to us having a proliferation of crappy untested legacy code that people are afraid to refactor or even touch, because they don't understand it and are afraid of breaking some unknown element of it.
    And I think you are confusing Tests and Unit Tests and seem to have equated one software methodology with good professional conduct; I don't agree with either of those assertions, and the latter is bordering on being impolite.


  • Registered Users, Registered Users 2 Posts: 27,370 ✭✭✭✭GreeBo


    Sparks wrote: »
    I think the day that you can compare methodologies in software development the same way you compare committing genocide with not committing genocide is quite probably the day we should wrap up the entire field and bury it somewhere and just go back to counting on our fingers...
    I think you are (deliberately) missing the point here. I am comparing two occasions of people using "I was just following orders" as an excuse for doing what they know is wrong. Sure there is a vast difference in the fallout of both but the underlying cause is the same.

    Sparks wrote: »
    Indeed, but not good ones.
    For example, "do your job properly" sounds easy enough, but then you ran headfirst into the problem of what "properly" means (or, in your original post, what "best" means). You don't have a metric here. Is "best" the same as "shortest development time" or "has the lowest number of bugs" or is it "lowest development cost" or "lowest maintenance costs" (which will be different) or "meets clients needs most accurately" or "meets clients' stated specifications most fully" (again, not the same thing) or "didn't have to be fired by HR for being unworkable-with", or "could program in a bunch of cool languages we don't actually use"?
    Without getting into a long argument about definitions, *I* believe that testability and unit tests are fundamental to delivering code.
    I think your metric point is really just a straw-man argument. Why are all your options mutually exclusive?
    Do you have a Definition Of Done?
    Sparks wrote: »
    Besides which, "do it this way or you're just not being a good programmer" - that's a fairly extreme statement. Even "goto considered harmful" has caveats and exceptions.
    I'm not prescribing any "way" of providing testing as part of a deliverable, I'm just saying that it, in my opinion, is a fundamental part of delivering code.
    Sparks wrote: »
    (a) The phrase "non disclosure agreement" might be something to keep in mind here; and
    (b) When you work on legacy projects, you're almost never the designer. And you can almost never make major changes in an unplanned manner - and when I say unplanned, I'm talking about plans for migration that tend to be measured in years if not decades.
    a) I can't see how any NDA is going to prevent you from writing tests.
    b) I've worked on several products that are multi million lines of code and have been live for 10+ years.
    Adding Unit Tests doesn't involve making "major unplanned" changes. If so, perhaps you are not doing it correctly? (Serious question btw)
    If the legacy product is not going to be sunset for several years (typically the case as "they" figure out if they want a rewrite, refactor or something else) then that's all the more reason to add unit tests as you continue maintenance of that product. When/If you get to rewrite it then you know exactly what those parts are supposed to do. That's gold for a re-write/refactor project.

    Sparks wrote: »
    There are reasons for the line and its location; they seem obvious to me, they may not to you, and that may be down to a different set of experiences.
    Again, I think that's just an "old-school" acceptance of "I'm just a coder, I do what I'm told". I believe development has moved on from this and the onus is on the Snr Developers to push back if you are being told not to write tests.
    Sparks wrote: »
    I don't particularly subscribe to the notion that whether or not those reasons seem obvious to someone is a good indicator of professional worth.
    I'm saying that delivering code without tests, to me, is not fully doing your job. In my experience I have never come across an occasion where it's not possible to improve things by adding some tests.
    Sparks wrote: »
    Indeed; however, those tests are not always unit tests, and those tests may or may not be of use for TDD.
    Agreed, they may be acceptance tests, integration tests or unit tests. But I'm sure you agree that the more levels of tests you have the more you are covered and the earlier you are going to find issues. I have already stated that I don't "believe" in TDD when it comes to unit tests; however I strongly believe in the existence of unit tests (aiming for 80-85% branch coverage on a greenfield project. With legacy you can only do so much, but you should still be trying)
    Sparks wrote: »
    Perhaps so; I know that for me, putting unit tests and unit test frameworks into some of the code I deliver would be a firing offence for good reason (for example, legal's concerns over library licencing issues; test's concerns over introducing regressions; and so on).

    Maybe I'm missing your point here, but I can't see how you could have licencing issues by writing unit tests that are only ever run during a build. It's not like they go to production etc? (Again, perhaps I am missing what you are getting at here...)
    I don't follow what you mean by "introducing regressions"? Surely that's the point of automated tests, to find regression bugs without the need for a bunch of manual QA?
    Sparks wrote: »
    And I think you are confusing Tests and Unit Tests and seem to have equated one software methodology with good professional conduct; I don't agree with either of those assertions, and the latter is bordering on being impolite.
    Again, I'm not at all saying "If you don't use TDD you are a bad developer".
    I don't even use TDD (I use BDD).
    What I am saying is that if you are delivering code (written using whatever methodology) and you don't write the corresponding tests (of all useful types/levels) then I don't think you are being as good a developer as you could/should be.
    (I'm not in any way attempting to be impolite or imply anything about anyone here. I just honestly don't think there is any reason not to deliver tests with your code, irrespective of when the tests are written, first or years later)

    It's indeed an interesting discussion, but perhaps we have strayed too far off TDD? (at least for this thread) :o


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    I think you are (deliberately) missing the point here. I am comparing two occasions of people using "I was just following orders" as an excuse for doing what they know is wrong. Sure there is a vast difference in the fallout of both but the underlying cause is the same.
    I don't believe that you really can't tell the difference between fulfilling a contract of employment in the software industry and committing genocide.
    I'm starting, however, to think that you can't see that the two don't have the same ethical roots. Which is a bit of a headwrecker.
    I think your metric point is really just a straw-man argument.
    It's not. It's central to your whole "You have to do TDD to be the best programmer you can be" thesis, because you've no definition of what "best" means; you haven't even said from whose vantage point that "best" should be measured.
    I'm not prescribing any "way" of providing testing as part of a deliverable, I'm just saying that it, in my opinion, is a fundamental part of delivering code.
    And I'm saying that that's not fundamental, it's a modern methodology; and modern here does not necessarily imply "better", just "modern". In my project, we have one group of developers and they don't do testing; test is done by five separate groups with different aims and mandates. And you might think your way's better, but you'd have to prove that point and you haven't even postulated a metric to make that determination yet.
    a) I cant see how any NDA is going to prevent you from writing tests.
    No, but it might affect how specific I can be on how it's done in the places I've worked in; as it should for you as well...
    b) I've worked on several products that are multi million lines of code and have been live for 10+ years.
    A measuring contest? :D
    We have tools here that beat those numbers, let alone released products.
    Adding Unit Tests doesnt involve making "major unplanned" changes. If so perhaps you are not doing it correctly? (Serious question btw)
    Adding Unit Tests may not; adding Unit Testing might.
    If the legacy product is not going to be sunset for several years (typically the case as "they" figure out if they want a rewrite, refactor or something else) then that's all the more reason to add unit tests as you continue maintenance of that product. When/If you get to rewrite it then you know exactly what those parts are supposed to do. That's gold for a re-write/refactor project.
    You just proposed a project that - for this product - would cost between seven and eight figures, take up at least half the development team (ie. several hundred people) as well as quite a lot of other people for at least a year, and produce no salable features at the end of that. That's a lot for a company to eat. And since there's an existing test framework in place (just not one you'd recognise unless you've been in the industry for a few decades, but whose effectiveness is proven), you need a better argument than "it's better to do it this way" or "we might save money down the line".

    Ideology doesn't last long when it costs eight figures to hold to it.
    Again, I think that's just an "old-school" acceptance of "I'm just a coder, I do what I'm told".
    Try "I'm a professional, I fulfill my contracts". And don't forget that you do have a choice - you can always quit (and it's hardly hardship to do so).
    I believe development has moved on from this and the onus is on the Snr Developers to push back if you are being told not to write tests.
    You can't say things like that without more context, because while that might make sense in a small team in an SME/startup (and I've been in places where it did make sense and was done), it doesn't make sense in every place. You've got a dose of tunnel vision here.
    I'm saying that delivering code without tests, to me, is not fully doing your job. In my experience I have never come across an occasion where it's not possible to improve things by adding some tests.
    I have. I'm living in one (at least with the kind of testing you're talking about). It mightn't happen at the web development level or the small app development level, but with large and very large systems, it does. We don't know how to do very large development projects at a fundamental level as an industry; every study of project success/failure rates and causes shows this. Saying we do and pointing to methodologies developed for and tested on much much much (ie. two to three orders of magnitude) smaller projects as proof is either not bothering to think it through or not understanding the situation.
    Maybe I'm missing your point here, but I can't see how you could have licencing issues by writing unit tests that are only ever run during a build.
    You can put damn near anything into a licence agreement. Look at the JSON licence sometime and try to figure out how a company with contracts for any military could comply with its "don't be evil" clause. There are entire legal departments whose job this is, and a software engineer saying they know better is a fairly good example of hubris; I might not like lawyers but I don't think they're faking their workloads...
    I don't follow what you mean by "introducing regressions"? Surely that's the point of automated tests, to find regression bugs without the need for a bunch of manual QA?
    And if the tests you introduce conflict with existing tests maintained by a different team, that you have no ability to test for because to run those tests takes more hardware than you have access to?
    What I am saying is that if you are delivering code (written using whatever methodology) and you don't write the corresponding tests (of all useful types/levels) then I don't think you are being as good a developer as you could/should be.
    And I'm trying to explain that that's just not true except in the small scale where you have a small development team.


  • Subscribers Posts: 4,076 ✭✭✭IRLConor


    Sparks wrote: »
    And I'm trying to explain that that's just not true except in the small scale where you have a small development team.

    In fairness, your company is an unusually large one. The vast majority of people work in much smaller companies - even the people who work for pretty big companies work for smaller companies than you do. Hell, your company employs more lawyers than the number of people who work with me (based on some estimates I've seen possibly even by an order of magnitude).


  • Registered Users, Registered Users 2 Posts: 27,370 ✭✭✭✭GreeBo


    Sparks wrote: »
    I don't believe that you really can't tell the difference between fulfilling a contract of employment in the software industry and committing genocide.
    I'm starting, however, to think that you can't see that the two don't have the same ethical roots. Which is a bit of a headwrecker.
    Again, you are stuck comparing the results. I'm talking about the problems that are caused by people shirking responsibility for doing the wrong thing just because someone "above" tells them to. You clearly don't want to accept this point so I suggest we drop it.
    Sparks wrote: »
    It's not. It's central to your whole "You have to do TDD to be the best programmer you can be" thesis, because you've no definition of what "best" means; you haven't even said from whose vantage point that "best" should be measured.
    Ah, and you see there's the rub. I *never* said you had to do TDD to be the best programmer you can be. In fact if you read my first post you can see where I say that I don't even do TDD as I don't believe in it.
    My whole point is that you should deliver some sort of testing with your code.
    Sparks wrote: »
    And I'm saying that that's not fundamental, it's a modern methodology; and modern here does not necessarily imply "better", just "modern". In my project, we have one group of developers and they don't do testing; test is done by five separate groups with different aims and mandates. And you might think your way's better, but you'd have to prove that point and you haven't even postulated a metric to make that determination yet.
    Again you are arguing against a point I never made. Im not arguing about when you write unit tests, Im arguing that they should be written. I thought Id made this pretty clear tbh.
    Sparks wrote: »
    A measuring contest? :D
    We have tools here that beat those numbers, let alone released products.
    Not at all, I was merely responding to your "when you work on legacy..." point, which implied that I hadn't.
    Sparks wrote: »
    Adding Unit Tests may not; adding Unit Testing might.
    So testing the individual logical blocks of the code you write would require major unplanned changes? That's a flashing warning of a potential design issue for me.
    Sparks wrote: »
    You just proposed a project that - for this product - would cost between seven and eight figures, take up at least half the development team (i.e. several hundred people) as well as quite a lot of other people for at least a year, and produce no salable features at the end of it. That's a lot for a company to eat. And since there's an existing test framework in place (just not one you'd recognise unless you've been in the industry for a few decades, but whose effectiveness is proven), you need a better argument than "it's better to do it this way" or "we might save money down the line".
    I'm not sure when I proposed any project; I'm almost certain I said that when working on legacy systems you should incrementally improve them as you touch bits of them.
    Sparks wrote: »
    Ideology doesn't last long when it costs eight figures to hold to it.
    Ignoring testing doesn't last long either when it costs eight figures to maintain the code and deliver nothing. Without specific examples we can throw these weighty statements around all day and they mean nothing.
    Sparks wrote: »
    Try "I'm a professional, I fulfill my contracts". And don't forget that you do have a choice - you can always quit (and it's hardly hardship to do so).
    Is this directed at me? You haven't provided a definition of "fulfill". To me, fulfilling my contract is delivering code with the associated levels of testing.
    If I was in an environment where I was threatened with dismissal for "wasting time" writing tests, then I wouldn't last there too long.
    Sparks wrote: »
    You can't say things like that without more context, because while that might make sense in a small team in an SME/startup (and I've been in places where it did make sense and was done), it doesn't make sense in every place. You've got a dose of tunnel vision here.
    You seem to be pigeonholing me as only ever having worked in SMEs, without any knowledge of what I've worked on. *I'm* not trying to turn this into a pissing contest but...

    How can it ever make sense for senior devs not to push back on being told not to write tests for their code?
    Sparks wrote: »
    I have. I'm living in one (at least with the kind of testing you're talking about). It mightn't happen at the web development level or the small app development level, but with large and very large systems, it does. We don't know how to do very large development projects at a fundamental level as an industry; every study of project success/failure rates and causes shows this. Saying we do, and pointing to methodologies developed for and tested on much, much smaller projects (i.e. two to three orders of magnitude smaller) as proof, is either not bothering to think it through or not understanding the situation.
    Again, I'm not pointing at any methodology. I think it's been pretty well shown that the sooner you find bugs the cheaper they are to fix, and that running automated regression/unit/acceptance tests as part of a build is the quickest and cheapest way to find them.
    The larger a project is and the more integration points it has, the more important it is to be able to quickly identify what has broken and what caused it.
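    To make that concrete, here's a minimal sketch of the kind of test I mean, assuming JUnit 4 and a made-up DiscountCalculator class (the names and numbers are purely illustrative):

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical class under test: applies a percentage discount to an amount held in cents.
// (In a real project this would live in its own file under src/main/java.)
class DiscountCalculator {
    long apply(long amountInCents, int discountPercent) {
        return amountInCents - (amountInCents * discountPercent / 100);
    }
}

public class DiscountCalculatorTest {

    // A regression test: once a pricing bug is found and fixed, this keeps it fixed.
    @Test
    public void tenPercentOffOneEuroLeavesNinetyCents() {
        assertEquals(90, new DiscountCalculator().apply(100, 10));
    }

    @Test
    public void zeroDiscountLeavesTheAmountUnchanged() {
        assertEquals(100, new DiscountCalculator().apply(100, 0));
    }
}
```

    Build tools such as Maven (via the Surefire plugin) or Gradle pick up classes like this automatically during the build, so a change that silently breaks the discount logic fails the build straight away instead of waiting for manual QA to stumble across it.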
    Sparks wrote: »
    You can put damn near anything into a licence agreement. Look at the JSON licence sometime and try to figure out how a company with contracts for any military could comply with its "don't be evil" clause. There are entire legal departments whose job this is, and a software engineer saying they know better is a fairly good example of hubris; I might not like lawyers, but I don't think they're faking their workloads...
    I'm not sure where I said lawyers were faking their workload...
    Anyway, I can't see why you'd be using JSON (or any third-party library) in your tests and not in your production code.
    Sparks wrote: »
    And what if the tests you introduce conflict with existing tests maintained by a different team, a conflict you have no way to check for because running those tests takes more hardware than you have access to?
    Define conflict? If your tests are breaking something, then you need to learn how to write better tests. If your test results disagree with someone else's results, then someone doesn't understand the requirements.
    If hardware is the issue, you are probably talking about either performance testing or integration testing. Hardware should be pretty irrelevant for unit tests; you are trying to test the logical parts of your code.
    You don't need to bring down an actual data centre and fail over to another one to test the individual steps of the process.
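    To illustrate the point, a minimal sketch, assuming JUnit 4 on Java 8+; DataCentreClient and FailoverPlanner are invented names, and a hand-written stub stands in for whatever the real infrastructure client is:

```java
import java.util.Arrays;
import java.util.List;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical interface the production code uses to talk to the real infrastructure.
interface DataCentreClient {
    boolean isHealthy(String dataCentreId);
}

// Hypothetical logic under test: pick the first healthy standby data centre.
class FailoverPlanner {
    private final DataCentreClient client;

    FailoverPlanner(DataCentreClient client) {
        this.client = client;
    }

    String chooseTarget(List<String> standbys) {
        for (String dc : standbys) {
            if (client.isHealthy(dc)) {
                return dc;
            }
        }
        throw new IllegalStateException("no healthy standby available");
    }
}

public class FailoverPlannerTest {

    @Test
    public void skipsUnhealthyStandbys() {
        // A stub with canned answers: no network, no hardware, just the decision logic.
        DataCentreClient stub = dc -> dc.equals("dub-2");
        FailoverPlanner planner = new FailoverPlanner(stub);

        assertEquals("dub-2", planner.chooseTarget(Arrays.asList("dub-1", "dub-2")));
    }
}
```

    The full failover itself still needs the real kit and belongs to integration/performance testing; a unit test like the one above only buys fast feedback on the decision logic, which is exactly the part you can check on a laptop.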
    Sparks wrote: »
    And I'm trying to explain that that's just not true except at small scale, where you have a small development team.

    First define "small".
    You work for IBM. Great. That doesn't make anyone smaller than IBM automatically "small" and suddenly invalidate my arguments, which apply to the majority of companies. Also, I'm almost certain IBM's half a million employees are not all working on developing integrated (in an EIP sense) software projects. If that argument worked I could mention the holding company I work for and "beat" you. We are talking about development projects, not the size of companies.

    Then explain to me in simple English (because I'm clearly missing something) why you can't write code to test other code you have written because of the size of your development team. Honestly, it just sounds like an excuse to me.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    IRLConor wrote: »
    In fairness, your company is an unusually large one. The vast majority of people work in much smaller companies - even the people who work for pretty big companies work for smaller companies than you do. Hell, your company employs more lawyers than the number of people who work with me (based on some estimates I've seen possibly even by an order of magnitude).
    ...that's a pretty fair point alright!


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    Again, you are stuck comparing the results. I'm talking about the problems caused by people shirking responsibility for doing the wrong thing just because someone "above" tells them to. You clearly don't want to accept this point, so I suggest we drop it.
    I don't want to accept it because you're wrong about it being the wrong thing and about abdicating responsibility.
    (1) You've not proven it's the wrong thing, you just think it is; and
    (2) You don't abdicate responsibility; you either do the job or quit - the point is that this third option you're pushing doesn't really exist.
    So testing the individual logical blocks of the code you write would require major unplanned changes? That's a flashing warning of a potential design issue for me.
    And now we're in NDA territory.
    I can say that you're incorrect or correct depending on the size of the logical block you're considering, but I don't think I can go much further than that.
    You haven't provided a definition of "fulfill". To me, fulfilling my contract is delivering code with the associated levels of testing.
    To me, it means complying with the actual written job contract that I signed.
    If I was in an environment where I was threatened with dismissal for "wasting time" writing tests, then I wouldn't last there too long.
    Risking nine or ten figures' worth of revenue in a licensing lawsuit because you used a library or framework that wasn't cleared by legal isn't seen as being reasonable here, and given the litigious nature of the industry these days, from patent trolls to Samsung and Apple, it's hard to completely dismiss their concerns.
    How can it ever make sense for senior devs not to push back on being told not to write tests for their code?
    When there are five other groups with more people in them writing tests for the code, to give my own example.
    I think it's been pretty well shown that the sooner you find bugs the cheaper they are to fix, and that running automated regression/unit/acceptance tests as part of a build is the quickest and cheapest way to find them.
    Not at this scale it hasn't been (the latter I mean, not the former). It's been tried and it's been shown not to be the fastest way to work with a codebase this large. Experimentally. In anger.
    Anyway, I can't see why you'd be using JSON (or any third-party library) in your tests and not in your production code.
    The point wasn't JSON; it was JSON's licence clause, as an example of how licences can have weird clauses. Nothing says you can't have a licence that requires payment for use even if the code's not deployed in production.
    Define conflict?
    NDA again, but if I bolt on code for testing that breaks legacy testing (which is possible), then the problem isn't someone else's.
    If hardware is the issue, you are probably talking about either performance testing or integration testing. Hardware should be pretty irrelevant for unit tests; you are trying to test the logical parts of your code.
    You don't need to bring down an actual data centre and fail over to another one to test the individual steps of the process.
    You do for some of our tests.
    And that happens to be the area I'm working in, so I know that some of the tests just can't be done by me.
    First define "small".
    I don't have a solid line, but let's say that large, for me, equates to several tens of MLOC (millions of lines of code), several decades' worth of work, and hundreds of developers across multiple sites, outnumbered a few times over by testers; and small would be where you can know everyone's name on the product team and hold the codebase's general block diagram in your head with less than a year's work.
    Then explain to me in simple English (because I'm clearly missing something) why you can't write code to test other code you have written because of the size of your development team. Honestly, it just sounds like an excuse to me.
    In simple English, because it's not my job. I'm paid to do other things; other people are paid to write tests. At the risk of monumental oversimplification in order to avoid NDA violation: there's a spec; I write to it, they test to it.

    If I don't want to do it that way, I can leave - transfer to another product team where they do it a different way, or just quit altogether. If I don't want to leave, I do the job the way they want it done. There is no third option where I dictate to men and women with thirty years of experience in the product how to do their job and restructure over a thousand devs and testers and managers spread all over the world, and it would be deeply unprofessional, not to mention deeply arrogant, to try to do so.


  • Registered Users, Registered Users 2 Posts: 27,370 ✭✭✭✭GreeBo


    Sparks wrote: »
    To me, it means complying with the actual written job contract that I signed.
    So, simply, if they don't ask for it you don't deliver it. Does that extend to documentation, naming conventions, error/exception handling and all the other non-functional requirements that are typically associated with software development?

    To me that's acting like a cheap offshore contractor that you have to spell out every pathetic detail for, otherwise you get junk back.

    In my mind you are using your current situation (working for one of, if not the, biggest software houses in the world) to colour your view. Using your definition of "large", how many large companies are there? What percentage of companies do you think falls into "large"?
    I think your "tunnel vision" jibe was spot on; you just aimed it incorrectly.

    I think delivering tests is a fundamental part of writing code and something every developer should strive for (irrespective of methodology); you don't.

    I think we will have to agree to disagree on this one tbh.
    Enjoyed the conversation, thx but I'm out.


  • Moderators, Sports Moderators, Regional Abroad Moderators Posts: 2,666 Mod ✭✭✭✭TrueDub


    I've worked in situations similar to both Sparks's and GreeBo's, so here's my take:

    Situations similar to Sparks's: you do what you're asked. This is driven by project setup (everyone has a specific task, don't cut across each other), managerial fiat and convention - if that's how it's done here, it's how we do it. I've had 4 serious jobs (18 months plus each) and the first 3 were like this. We coded, the testers tested and we all moved on. It worked (these were BIG companies who are all still around), and the job got done.

    Situations similar to GreeBo's: you've more freedom to do things the way you want. My current job is very similar to GreeBo's, and we do things the way we feel gets the job done best. This is BDD, with lots of automated tests, and it pays off. We do this because we're able to, though, and because we've proved that it works for the company. Other teams & projects are not so lucky and don't get to do things this way.
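    For anyone unfamiliar with the style, here's a minimal sketch of what a BDD-flavoured unit test can look like, assuming JUnit 4 and an invented Basket class (nothing from our actual codebase):

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Invented class under test, purely for illustration.
class Basket {
    private long totalInCents = 0;

    void add(long priceInCents) {
        totalInCents += priceInCents;
    }

    long total() {
        return totalInCents;
    }
}

public class BasketTest {

    // BDD-flavoured naming: the test name reads as a behaviour, not an implementation detail.
    @Test
    public void givenAnEmptyBasket_whenAnItemIsAdded_thenTheTotalIsTheItemPrice() {
        Basket basket = new Basket();      // given
        basket.add(250);                   // when
        assertEquals(250, basket.total()); // then
    }
}
```

    Tools like Cucumber or JBehave take this further with plain-text scenarios that non-developers can read, but the given/when/then shape is the same.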

    My ultimate point: people will do the job as best they can given the circumstances of the job. If the culture says do this, you do it. Even in my current role, where we've a lot of freedom, there's lots I'd do differently, but the restrictions are there and we get on with it.

    I can't say one way is "right" and one "wrong", because from job to job, project to project, the "right" way varies. Part of being professional is trying to improve what you can, by example or by suggestion, and another part is accepting that certain restrictions won't change anytime soon and finding a way to work around them.

    My 2 cents: I'd be very reluctant to go back to an approach where there's no BDD and no automated unit test suites. However, if it was the difference between eating and not eating, I'd probably even go back to COBOL. :mad:

